• 0 Posts
  • 36 Comments
Joined 1 year ago
Cake day: July 1st, 2023

  • I’ve heard that’s similar to why Adobe Creative Cloud was so easy to pirate for years (maybe it still is, idk, I switched to Affinity forever ago). Adobe is the industry standard because everyone uses it -> person wants to learn photo/video editing, digital illustration, etc., but can’t afford it -> pirates Adobe instead of trying cheaper/free alternatives, because it’s the industry standard -> person with Adobe skills gets hired by a business that pays for Adobe legitimately, because that’s what most people know -> Adobe is the industry standard because everyone uses it, and the cycle goes on…







  • You’re right, it’s very much context dependent, and I appreciate your insight into how this clash between psychology and computer science muddies the terms. As a CS guy myself who’s just dipping my toes into NNs, I lean toward the psychology definition, where intelligence is measured by behavior.

    In an artificial neural network, the algorithms that wrangle data and build a model aren’t really what makes the decisions; they just build out the “body” (model, generator functions) and “environment” (data format), so to speak. If anything, that code is more comparable to DNA than to any state of mind. Training on data is where the knowledge comes from, and by making connections the model can “reason” out a good answer from the correlations it found. Those processes are vague enough that I don’t feel comfortable calling them algorithms, though. It’s pretty divorced from cold, hard code.
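    A minimal sketch of that split (my own toy example, with made-up data, not anything from the thread): the perceptron training loop below is the fixed “DNA”, but the weights that actually drive behavior come entirely from the data, so the exact same code learns different “knowledge” from different datasets:

```python
def train(data, epochs=50, lr=0.1):
    # The "body": a single neuron with two inputs and a bias.
    # Nothing here encodes AND or OR -- only the data does.
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1.0 if w[0] * x1 + w[1] * x2 + b > 0 else 0.0
            err = target - out
            # Perceptron update rule: nudge the weights toward the data.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Same code, different data -> different learned behavior.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
or_data  = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w_and, b_and = train(and_data)
w_or, b_or = train(or_data)
print([predict(w_and, b_and, x1, x2) for (x1, x2), _ in and_data])  # [0, 0, 0, 1]
print([predict(w_or, b_or, x1, x2) for (x1, x2), _ in or_data])     # [0, 1, 1, 1]
```

    The training loop never changes between the two runs; everything the model “knows” lives in `w` and `b`, which are shaped only by what it was shown.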


  • > AI knows nothing and are just dumb correlation engines

    Here’s a thought exercise: how do you “know”? How do you know your pet? LLMs like GPT can “know” about a dog in terms of words, because that’s what they “sense”; that’s how they interact with their “environment”. They understand words and how they relate to other words; basically, words are their entire environment.

    Now, can you describe how you know your dog without your senses, or anything derived from your senses? Remember, chemical receptors are “senses” too.

    I remember reading about this a while back, but I don’t have the link on me: did you know that people who were born blind but have their vision repaired years later don’t immediately know what “pointy” looks like? They never formed that correlation between the feeling of pointy and the visual of pointy the way that they could with the feeling and the word.

    My point is, we’re correlation machines too.
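    A toy sketch of that idea (my own illustrative example with a made-up corpus, nothing like a real LLM): a program that only ever “senses” words can still pick up that dog is more like cat than like car, purely from which words co-occur with which:

```python
from collections import Counter
from math import sqrt

# Tiny made-up "environment": the model only ever sees these sentences.
corpus = [
    "the dog chased the ball",
    "my dog loves to play",
    "the cat chased the mouse",
    "my cat loves to nap",
    "the car needs fuel",
    "a car drove down the road",
]

vocab = sorted({w for s in corpus for w in s.split()})

def cooccurrence(word):
    # Represent a word by the other words it shares sentences with.
    counts = Counter()
    for s in corpus:
        toks = s.split()
        if word in toks:
            counts.update(t for t in toks if t != word)
    return [counts[v] for v in vocab]

def cosine(a, b):
    # Similarity between two co-occurrence vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

dog, cat, car = (cooccurrence(w) for w in ("dog", "cat", "car"))
print(round(cosine(dog, cat), 2), round(cosine(dog, car), 2))  # prints 0.8 0.4
```

    The program has no senses beyond the text, yet “dog” ends up closer to “cat” than to “car” because they chase, love, and belong to “my” in the same way, which is a crude version of words being the model’s entire environment.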


  • This is what it comes down to. Until we agree on a testable definition of “intelligence” (or sentience, sapience, consciousness, or just about any other descriptor of human thought), it’s not really science. Even in nature, what we might consider intelligence manifests in different organisms in different ways.

    We could assume that when people say intelligence, they mean human-like intelligence. That might be narrow enough to test, but you’d probably still end up failing some humans and passing some trained models.