submitted 1 day ago* (last edited 1 day ago) by FlyingSquid@lemmy.world to c/technology@lemmy.world

Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234


We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.

[-] nelly_man@lemmy.world 19 points 14 hours ago* (last edited 5 hours ago)

Bit in this context refers to the [Shannon](https://en.wikipedia.org/wiki/Shannon_(unit)) from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small of a number.
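To make the unit concrete: the self-information of an event with probability p is -log2(p) shannons. A minimal sketch in Python (the probabilities are chosen purely for illustration):

```python
import math

def self_information_bits(p: float) -> float:
    """Shannon self-information (in bits) of an event with probability p."""
    return -math.log2(p)

print(self_information_bits(0.5))       # 1.0 bit: a fair coin flip
print(self_information_bits(1 / 1024))  # exactly 10.0 bits: a ~0.1% event
```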

[-] GamingChairModel@lemmy.world 9 points 7 hours ago* (last edited 5 hours ago)

The paper gives specific numbers for specific contexts, too. It's a helpful illustration for these concepts:

A 3x3 Rubik's cube has about 2^65 possible permutations (roughly 4.3 × 10^19), so the configuration of a Rubik's cube carries about 65 bits of information. In the world record for blind solving, where the solver examines the cube, puts on a blindfold, and solves it blindfolded, the solver examined the cube for 5.5 seconds, so the 65 bits were acquired at a rate of about 11.8 bits/s.

Another memory contest has people memorize strings of binary digits for 5 minutes and then recall them. The world record is 1467 digits, which is exactly 1467 bits; dividing by 5 minutes (300 seconds) gives a rate of about 4.9 bits/s.
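For what it's worth, both rates fall straight out of those numbers; here's a quick back-of-the-envelope check in Python (the permutation count is the standard figure for a 3x3 cube):

```python
import math

# 3x3 Rubik's cube: number of reachable configurations (~4.3e19, i.e. ~2**65)
CUBE_STATES = 43_252_003_274_489_856_000
cube_bits = math.log2(CUBE_STATES)  # ~65.2 bits to specify one configuration
print(cube_bits / 5.5)              # ~11.9 bits/s over the 5.5 s inspection

# Binary-digit memorization: 1467 digits = 1467 bits recalled after 5 minutes
print(1467 / 300)                   # ~4.9 bits/s
```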

The paper doesn't address how the human brain is more optimized for some tasks than others. I definitely believe that the brain's capacity for visual processing (probably assisted by the preprocessing that happens subconsciously) and for direct perception of visual information is far more efficient and capable than plain memorization. So I'm still skeptical of a blanket 10 bits/s rate for all types of thinking, but I can see how they got the number.

Their model seems to be heavily focused on visual observation and conscious problem solving, which ignores all the other things the brain is doing at the same time: keeping the body alive, processing emotions, maintaining homeostasis for several systems, etc.

These all require interpreting and sending information from/to other organs, and most of it is subconscious.

[-] piecat@lemmy.world 2 points 52 minutes ago

It's a fair metric IMO.

We typically judge supercomputers in FLOPS: floating-point operations per second.

We don't take into account any of the compute power required to keep it powered, keep it cool, or operate its peripherals, even though all of that is happening in the background. Heck, FLOPS doesn't even really capture memory, storage, power, core count, clock speed, architecture, or any other useful attribute of a computer.

This is just one metric.

[-] w3dd1e@lemm.ee 3 points 14 hours ago

Also, supposing it did, I’m quite sure that everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? It seems like there’s a lot missing here.

[-] sugar_in_your_tea@sh.itjust.works 3 points 3 hours ago

It's an average. The difference between two humans will be much less than the difference between humans and machines.

[-] FooBarrington@lemmy.world 3 points 22 hours ago

I also don't have 10 fingers. That doesn't make any sense - my hands are not numbers!

Ooooor "bits" has a meaning beyond what you assume, but it's probably just science that's stupid.

I can tell you’re trying to make a point, but I have no idea what it is.

[-] FooBarrington@lemmy.world 11 points 21 hours ago* (last edited 21 hours ago)

You say "we don't think in bits because our brains function nothing like computers", but bits aren't strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.

To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That's because the concept of "10" is applicable both to math and topics that math can describe, just like "bits" are applicable both to information theory and topics that information theory can describe.

For the record: I didn't downvote you, it was a fair question to ask.

I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.

Bits are binary digits used for mechanical computers. Human brains are constantly changing chemical systems that don’t “process” binary bits of information, so it makes no sense as a metric.

> imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.

It’s not about how you measure it; it’s about using a unit system that doesn’t apply. It’s more like trying to calculate how much a star costs in USD.

[-] scratchee@feddit.uk 9 points 12 hours ago

Bits are also a unit of information from information theory. In that context, they're relevant for anything that processes information, regardless of methodology; you can convert analogue signals into bits just fine.
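As a toy illustration of that last point, here's a sketch in Python (the sample rate and quantization depth are arbitrary choices) that turns an analogue signal into a stream of bits:

```python
import math

def quantize(samples, levels=16):
    """Map samples in [-1, 1] to integer codes of log2(levels) bits each."""
    return [min(int((s + 1) / 2 * levels), levels - 1) for s in samples]

# Sample one period of a sine wave at 8 samples per period
samples = [math.sin(2 * math.pi * t / 8) for t in range(8)]
codes = quantize(samples)                        # each code is a 4-bit symbol
print(codes)                                     # [8, 13, 15, 13, 8, 2, 0, 2]
print(len(codes) * math.log2(16), "bits total")  # 32.0 bits total
```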

[-] FooBarrington@lemmy.world 6 points 11 hours ago* (last edited 11 hours ago)

Maybe try looking into the topic instead of confidently repeating your wrong assertions? You're literally pulling a "my hand is not a number!" right now.

Just because you have a limited understanding of a unit doesn't mean that unit is only applicable to what you know. That's literally the star example I brought up.

I already did before I formed my conclusion. It’s clear you have not, and that you’re just looking for someone with whom to argue.

Goodbye.

[-] FooBarrington@lemmy.world 1 points 4 hours ago* (last edited 4 hours ago)

Ah, so you just choose to ignore information you don't already know? What a rational thing to do. You're not anti-intellectual at all.

Or are you seriously trying to gaslight everyone into believing Shannon entropy doesn't exist?
