submitted 4 days ago* (last edited 4 days ago) by FlyingSquid@lemmy.world to c/technology@lemmy.world

Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[-] Buffalox@lemmy.world 1 points 4 days ago

Oh no, this is not annoying; this is a very interesting question.
I suppose the crow doesn't need to understand the volume of the water and the rocks displacing it; it may merely have a more basic understanding that adding rocks raises the water, or perhaps even just that it makes the food easier to get at.
So I suppose we can agree that there are multiple levels of understanding.
But still, the crow must have observed this, unless it actually figured it out. And some thought process must have led it to believe that dropping stones in the water might have the desired effect.
Even if the crow had observed another crow doing this and seen it demonstrated, it must have had a thought process concluding that it could try this too, and that it might work.

But there are other situations that are more challenging IMO, and that's with LLMs: how do we define thought and understanding for those?
An LLM is extremely stupid and clever at the same time. There are loads of examples of them not understanding the simplest things, like how many R's are in "strawberry", with the AI stubbornly answering that there are only 2! But on the other hand, the same model can spell the word out, count the letters, and then realize that there are indeed 3, which it previously denied.
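For contrast, the letter-counting task itself is trivial for ordinary code, which is part of what makes the LLM's confident wrong answer so striking. A minimal sketch (plain string counting, nothing LLM-specific):

```python
# Counting letters directly, the way an LLM famously struggles to:
word = "strawberry"
r_count = word.count("r")
print(r_count)  # 3
```

The usual explanation is that LLMs see tokens rather than individual characters, so "spell it out first" effectively converts the problem into one the model can handle.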

IMO animal studies are crucial to understand our own intelligence, because the principle is the same, but animals are a simpler "model" of it so to speak.
It seems to me that thought is a requirement to understanding. You think about something before you understand it.
Without the thought process it would have to be instinctive. But I don't think it can be argued that crows dropping rocks in water is instinctive.
But even instinctive understanding is a kind of understanding, it's just not by our consciousness, but by certain behavior traits having an evolutionary advantage, causing that behavior to become more common.

So you are absolutely right that thought is not always required for some sort of "understanding", which is a good point.
But what I meant was conscious understanding, as in really understanding a concept and, for humans, understanding abstract terms; for that type of understanding, thought is definitely a requirement.

[-] tabular@lemmy.world 2 points 2 days ago

The crows were shown how to get the food, IIRC.

My understanding is that LLMs contain artificial neural networks: a simplification of real ones, with a number of weights comparable to that of small animals. A simpler model ought to make investigation clearer 😅

Neural networks are "trained" by adjusting the weights on "neurons". I assume real brains train themselves on every input, while an LLM is limited to sessions with training data. Do you suspect there could be a thought process when it's working out how many letters are in "strawberry"? What about when its weights are adjusted during training?
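The "adjusting the weights" idea can be sketched in a few lines. This is a deliberately tiny, illustrative example (a single weight learning y = 2x by gradient descent), not how any real LLM is trained; all names and constants here are made up for illustration:

```python
# Minimal sketch of "training by adjusting weights":
# one artificial neuron with a single weight w, learning y = 2*x
# by gradient descent on squared error.
w = 0.0    # the weight being adjusted, starts untrained
lr = 0.1   # learning rate: how big each adjustment is
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # toy training data

for _ in range(100):                      # repeated training passes
    for x, target in data:
        pred = w * x                      # forward pass: the neuron's guess
        grad = 2 * (pred - target) * x    # gradient of squared error w.r.t. w
        w -= lr * grad                    # adjust the weight downhill

print(round(w, 3))  # converges to 2.0
```

Real networks do the same thing with billions of weights at once, which is why the training phase is so expensive compared to simply running the model.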

[-] Buffalox@lemmy.world 1 points 2 days ago

I think whether you call it a thought process or not comes down to how you define the term. It's definitely intelligence, and there is definitely a process.
So I wouldn't have a problem calling it a thought process. But it's not self-consciousness yet, though we may not be very far from it.
The progress achieved over the past decade is amazing.
When I predicted 2035 as a point where we could possibly achieve strong AI, it was after two or three decades of very little progress. But I've always been certain that the human brain is a 100% natural phenomenon, and that its function can be copied, just like everything else in nature. And when that is achieved, there will still be room for improvement.
As a natural process, our brain is built on the physical properties of atoms, so IMO it's only a matter of time before we have an artificial intelligence that is just as valid to call self-conscious as ourselves.

this post was submitted on 26 Dec 2024
69 points (71.4% liked)
