ITT: A bunch of people who have never heard of information theory suddenly have very strong feelings about it.
If they had heard of it, we'd probably get statements like: "It's just statistics." or "It's not information. It's just a probability."
We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.
Bit in this context refers to the [Shannon](https://en.wikipedia.org/wiki/Shannon_(unit)) from information theory. 1 bit of information (that is, 1 shannon) is the amount of information you receive from observing an event with a 50% chance of occurring. 10 bits would be equivalent to the amount of information learned from observing an event with about a 0.1% chance of occurring. So 10 bits in this context is actually not that small of a number.
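If it helps to see the arithmetic, here's a tiny sketch of the self-information formula, I(p) = -log2(p). The function name is mine, not anything from the paper:

```python
from math import log2

def information_bits(p: float) -> float:
    """Shannon information content (in bits) of observing an event with probability p."""
    return -log2(p)

print(information_bits(0.5))       # 1.0  -> a fair coin flip carries 1 bit
print(information_bits(1 / 1024))  # 10.0 -> a ~0.1% chance event carries 10 bits
```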
The paper gives specific numbers for specific contexts, too. It's a helpful illustration for these concepts:
A 3x3 Rubik's cube has about 2^65 possible permutations, so the configuration of a Rubik's cube carries about 65 bits of information. The world record for blind solving, where the solver examines the cube, puts on a blindfold, and then solves it, involved examining the cube for just 5.5 seconds, so those 65 bits were acquired at a rate of about 11.8 bits/s.
Another memory contest has people memorizing strings of binary digits for 5 minutes and then recalling them. The world record is 1467 digits, which is exactly 1467 bits; dividing by 5 minutes (300 seconds) gives a rate of about 4.9 bits/s.
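Redoing that arithmetic in code, if anyone wants to check it (the exact cube permutation count is my addition; the text above just rounds it to ~65 bits):

```python
from math import log2

# 3x3 Rubik's cube: ~4.33e19 reachable configurations, examined for 5.5 s
cube_states = 43_252_003_274_489_856_000
print(log2(cube_states) / 5.5)  # ~11.9 bits/s (~11.8 if you round the cube to 65 bits)

# Binary-digit memorization: 1467 bits memorized over 5 minutes
print(1467 / 300)               # ~4.9 bits/s
```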
The paper doesn't talk about how the human brain is better optimized for some tasks than others, and I definitely believe that the brain's capacity for visual processing, probably assisted by subconscious preprocessing, or for direct perception of visual information, is much more efficient and capable than plain memorization. So I'm still skeptical of the blanket 10-bit rate for all types of thinking, but I can see how they got the number.
Their model seems to be heavily focused on visual observation and conscious problem solving, which ignores all the other things the brain is doing at the same time: keeping the body alive, processing emotions, maintaining homeostasis for several systems, etc.
These all require interpreting and sending information from/to other organs, and most of it is subconscious.
It's a fair metric IMO.
We typically judge supercomputers in FLOPS (floating-point operations per second).
We don't take into account any of the compute power required to keep it powered, keep it cool, operate peripherals, etc., even if that is happening in the background. Heck, FLOPS doesn't even really measure memory, storage, power, number of cores, clock speed, architecture, or any other useful attributes of a computer.
This is just one metric.
Also, supposing it did, I’m quite sure that everyone’s brain would function at a different rate. And how do you even measure people who don’t have an internal monologue? Seems like there is a lot missing here.
It's an average. The difference between two humans will be much less than the difference between humans and machines.
I also don't have 10 fingers. That doesn't make any sense - my hands are not numbers!
Ooooor "bits" has a meaning beyond what you assume, but it's probably just science that's stupid.
I can tell you’re trying to make a point, but I have no idea what it is.
You say "we don't think in bits because our brains function nothing like computers", but bits aren't strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.
To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That's because the concept of "10" is applicable both to math and topics that math can describe, just like "bits" are applicable both to information theory and topics that information theory can describe.
For the record: I didn't downvote you, it was a fair question to ask.
I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.
Because it’s a Techspot article, of course they deliberately confuse you as to what “bit” means to get views. https://en.wikipedia.org/wiki/Entropy_(information_theory) seems like a good introduction to what “bit” actually means.
> Base 2 gives the unit of bits

Which is exactly what a bit means.

> base 10 gives units of "dits"

Which is not bits, but the equivalent unit for base 10: one decimal digit.
I have no idea how you think this changes anything about what a bit is?
The data-storage bit and the shannon are both called bits precisely because they’re both base 2. That does not mean they’re the same thing. As the article explains it, a shannon is like a question from 20 Questions.
Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second.
This piece is garbage.
> Speaking, which is conveying thought, also far exceeds 10 bits per second.
There was a study in 2019 that analyzed 17 different spoken languages and found that languages with a lower information density (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages, at roughly 39 bits per second.
Of course, it could be that the actual ideas and information in that speech are inefficiently encoded, so that the actual bits of entropy are being communicated at less than 39 per second. I'm curious to know what the underlying Caltech paper linked says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik's cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?
EDIT: I read the preprint, available here. It purports to measure the externally measurable output of human behavior. That's an important limitation: it's not trying to measure the internal richness of unobserved thought.
So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
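That conversion is easy to sanity-check (5 bits/word is the assumed entropy described above; the helper name is mine):

```python
def words_to_bits_per_second(words_per_minute: float, bits_per_word: float = 5.0) -> float:
    """Convert a word rate into an information rate, given an assumed entropy per word."""
    return words_per_minute * bits_per_word / 60

print(words_to_bits_per_second(120))  # 10.0 bits/s for 120 wpm typing
print(words_to_bits_per_second(160))  # ~13.3 bits/s for 160 wpm speech
```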
The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik's cube solving, memory contests).
It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings 10^9 bits of sensory perception down 9 orders of magnitude. If it turns out to be 8.5 orders of magnitude, that doesn't really change the result.
There's also a whole section addressing criticisms of the 10 bit/s number. It argues that claims of photographic memory tend to actually break down into longer periods of study (e.g., a 45-minute flyover of Rome to recognize and recreate 1000 buildings in 1000 architectural styles translates into about 4 bits/s of memorization). And it argues that the human brain tends to trick itself into perceiving a much higher complexity than it is actually processing (known as "subjective inflation"), implicitly arguing that a lot of that is actually lossy compression that fills in fake details from what it assumes is consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly account for the bits of entropy involved in the less accurate shortcuts taken by the brain.
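My own reconstruction of how that flyover figure falls out, assuming each building is an independent pick from ~1000 styles (that assumption is mine, not a claim about the paper's exact method):

```python
from math import log2

buildings = 1000
styles = 1000
seconds = 45 * 60  # 45-minute flyover

total_bits = buildings * log2(styles)  # ~9966 bits to memorize
print(total_bits / seconds)            # ~3.7 bits/s, roughly the ~4 bits/s cited
```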
I still think visual processing seems to be faster than 10, but I'm now persuaded that it's within an order of magnitude.
Thanks for the link and breakdown.
It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity/capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than a conditional one, going so far as to say a neural-computer interface would be limited to this rate.
"Thinking speed" is also a poor description for input/output measurement, akin to calling a monitor's bitrate the computer's FLOPS.
Visual processing is multi-faceted. I definitely don't think all of vision can be reduced to 50 bps, but maybe the serial part can, after the parallel machinery has done stuff like detecting lines, arcs, textures, areas of contrast, etc.
You may be misunderstanding the bit measure here. It’s not ten bits of computer data, basically a single byte. It’s ten binary yes/no decisions, equivalent to evaluating 1024 distinct possibilities.
The measure comes from information theory but it is easy to confuse it with other uses of ‘bits’.
Right? They do nothing to expand upon what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck they feel like packaging them.
Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub
It doesn't look like these "bits" are binary, but "pieces of information" (which I find a bit misleading):
> “Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about 2^20 ≈ 1 million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:
> To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.
So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course this is still miserably badly defined.
Here's a link to Caltech's press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior
Here's a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0
Here's a link to a preprint: https://arxiv.org/abs/2408.10234