submitted 18 hours ago* (last edited 17 hours ago) by FlyingSquid@lemmy.world to c/technology@lemmy.world

Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[-] VoterFrog@lemmy.world 12 points 7 hours ago

ITT: A bunch of people who have never heard of information theory suddenly have very strong feelings about it.

[-] General_Effort@lemmy.world 2 points 6 hours ago

If they had heard of it, we'd probably get statements like: "It's just statistics." or "It's not information. It's just a probability."

We don’t think in “bits” at all because our brain functions nothing like a computer. This entire premise is stupid.

[-] FooBarrington@lemmy.world 0 points 6 hours ago

I also don't have 10 fingers. That doesn't make any sense - my hands are not numbers!

Ooooor "bits" has a meaning beyond what you assume, but it's probably just science that's stupid.

I can tell you’re trying to make a point, but I have no idea what it is.

[-] FooBarrington@lemmy.world 5 points 6 hours ago* (last edited 6 hours ago)

You say "we don't think in bits because our brains function nothing like computers", but bits aren't strictly related to computers. Bits are about information. And since our brains are machines that process information, bits are also applicable to those processes.

To show this, I chose an analogy. We say that people have 10 fingers, yet our hands have nothing to do with numbers. That's because the concept of "10" is applicable both to math and topics that math can describe, just like "bits" are applicable both to information theory and topics that information theory can describe.

For the record: I didn't downvote you, it was a fair question to ask.

I also thought about a better analogy - imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.

Bits are binary digits used for mechanical computers. Human brains are constantly changing chemical systems that don’t “process” binary bits of information so it makes no sense as a metric.

imagine someone tells you they measured the temperature of a distant star, and you say "that's stupid, you can't get a thermometer to a star and read the measurement, you'd die", just because you don't know how one could measure it.

It’s not about how you measure it; it’s about using a unit system that doesn’t apply. It’s more like trying to calculate how much a star costs in USD.

[-] Aatube@kbin.melroy.org 28 points 16 hours ago

Because it’s a Techspot article, of course they deliberately confuse you as to what “bit” means to get views. https://en.wikipedia.org/wiki/Entropy_(information_theory) seems like a good introduction to what “bit” actually means.

[-] Buffalox@lemmy.world 6 points 15 hours ago

Base 2 gives the unit of bits

Which is exactly what bit means.

base 10 gives units of "dits"

Which is not bits, but the base-10 equivalent: 1 digit in base 10.

I have no idea how you think this changes anything about what a bit is?

[-] Aatube@kbin.melroy.org 5 points 15 hours ago

The external storage data and shannon are both called bits, exactly because they’re both base 2. That does not mean they’re the same. As the article explains it, a shannon is like a question from 20 questions.

[-] General_Effort@lemmy.world 1 points 9 hours ago

Wrong. They are called the same because they are fundamentally the same. That's how you measure information.

In some contexts, one wants to make a difference between the theoretical information content and what is actually stored on a technical device. But that's a fairly subtle thing.

[-] typeswithpenis@lemmynsfw.com 3 points 6 hours ago

A bit in the data sense is just an element of the set of booleans. A bit in the entropy sense is the amount of information revealed by an observation with two equally probable outcomes. These are not the same thing because the amount of information contained in a bit is not always equal to one bit of entropy. For example, if a boolean is known to be 0, then the amount of information it contains is 0 bits. If it is known that the boolean is equally 0 or 1, then the information content is 1 bit. It depends on the prior probability distribution.
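To make that concrete, here is a minimal sketch using the standard Shannon formula (nothing specific to the paper, just the textbook definition):

```python
import math

def entropy_bits(p: float) -> float:
    """Shannon entropy of a binary variable that equals 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0  # outcome already known: the stored bit carries no information
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(entropy_bits(0.5))  # 1.0   -> fair coin: one stored bit carries a full bit of entropy
print(entropy_bits(0.9))  # ~0.47 -> biased: the same stored bit carries less than half a bit
print(entropy_bits(1.0))  # 0.0   -> known in advance: zero bits of information
```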

[-] Aatube@kbin.melroy.org 2 points 8 hours ago

I don't see how that can be a subtle difference. How is a bit of external storage data only subtly different from the information content of an event whose probability of occurring is ½?

[-] General_Effort@lemmy.world 1 points 8 hours ago

It's a bit like asking what is the difference between the letter "A" and ink on a page in the shape of the letter "A". Of course, first one would have to explain how they are usually not different at all.

BTW, I don't know what you mean by "external storage data". The expression doesn't make sense.

[-] Buffalox@lemmy.world 43 points 18 hours ago

Bullshit. Just reading this and comprehending it, which is thought, far exceeds 10 bits per second.
Speaking, which is conveying thought, also far exceeds 10 bits per second.

This piece is garbage.

[-] GamingChairModel@lemmy.world 11 points 13 hours ago* (last edited 11 hours ago)

Speaking, which is conveying thought, also far exceeds 10 bits per second.

There was a study in 2019 that analyzed 17 different spoken languages and found that languages with a lower complexity rate (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages, at roughly 39 bits per second.

Of course, it could be that the actual ideas and information in that speech are inefficiently encoded so that the actual bits of entropy are being communicated slower than 39 per second. I'm curious to know what the underlying Caltech paper linked says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik's cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?

EDIT: I read the preprint, available here. It purports to measure externally measurable output of human behavior. That's an important limitation in that it's not trying to measure internal richness in unobserved thought.

So it analyzes people performing external tasks, including typing and speech with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
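For anyone who wants to check that arithmetic, it's just a unit conversion on the assumed 5 bits/word (a quick sketch, not anything taken from the paper itself):

```python
BITS_PER_WORD = 5  # entropy per English word assumed in the preprint

def bits_per_second(words_per_minute: float) -> float:
    # words/min * bits/word / 60 s/min = bits/s
    return words_per_minute * BITS_PER_WORD / 60

print(bits_per_second(120))  # 10.0  bits/s for 120 wpm typing
print(bits_per_second(160))  # ~13.3 bits/s for 160 wpm speech
```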

The calculated bits of information are especially interesting for the other tasks (blindfolded Rubik's cube solving, memory contests).

It also explicitly cited the 39 bits/s study that I linked as being within the general range, because the actual meat of the paper is analyzing how the human brain brings ~10^9 bits/s of sensory perception down roughly 8 orders of magnitude, to ~10 bits/s of output. If it turns out to be 7.5 orders of magnitude instead, that doesn't really change the result.

There's also a whole section addressing criticisms of the 10 bit/s number. It argues that claims of photographic memory tend to actually break down into longer periods of study (e.g., a 45-minute flyover of Rome to recognize and recreate 1000 buildings of 1000 architectural styles translates into 4 bits/s of memorization). And it argues that the human brain tends to trick itself into perceiving a much higher complexity than it is actually processing (known as "subjective inflation"), implicitly arguing that a lot of that is actually lossy compression that fills in fake details from what it assumes is consistent with the portions actually perceived, and that the observed bitrate from other experiments might not properly categorize the bits of entropy involved in less accurate shortcuts taken by the brain.
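If I'm reading that example right, the ~4 bits/s falls out of a simple back-of-the-envelope calculation (my reconstruction, not the paper's exact derivation):

```python
import math

buildings = 1000         # buildings recognized and recreated
styles = 1000            # possible architectural styles to choose from per building
study_time_s = 45 * 60   # 45-minute flyover

bits_per_building = math.log2(styles)       # ~10 bits to pick one style out of 1000
total_bits = buildings * bits_per_building  # ~10,000 bits committed to memory
print(total_bits / study_time_s)            # ~3.7 bits/s, i.e. roughly 4 bits/s
```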

I still think visual processing seems to be faster than 10, but I'm now persuaded that it's within an order of magnitude.

[-] RustyEarthfire@lemmy.world 1 points 9 hours ago

Thanks for the link and breakdown.

It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity/capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less, than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than conditional, going as far as saying a neural-computer interface would be limited to this rate.

"Thinking speed" is also a poor description for input/output measurement, akin to calling a monitor's bitrate the computer's FLOPS.

Visual processing is multi-faceted. I definitely don't think all of vision can be reduced to 50bps, but maybe the serial part after the parallel bits have done stuff like detecting lines, arcs, textures, areas of contrast, etc.

[-] Buffalox@lemmy.world 1 points 10 hours ago

with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.

The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information. Normally a bit can only have 2 values; here they are talking about very different types of bits, which AFAIK is not a specific quantity.

the human brain tends to trick itself into perceiving a much higher complexity than it is actually processing

This is of course a thing.

[-] GamingChairModel@lemmy.world 2 points 8 hours ago

The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information

here they are talking about very different types of bits

I think everyone agrees on the definition of a bit (a binary two-value variable), but the active area of debate is which pieces of information actually matter. If information can be losslessly compressed into smaller representations of that same information, then the smaller compressed size represents the informational complexity in bits.

The paper itself describes the information that can be recorded but ultimately discarded as not relevant: for typing, the forcefulness of each key press or duration of each key press don't matter (but that exact same data might matter for analyzing someone playing the piano). So in terms of complexity theory, they've settled on 5 bits per English word and just refer to other prior papers that have attempted to quantify the information complexity of English.
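A rough illustration of that compression framing, using zlib as a crude stand-in for a real entropy model of English (so the numbers below are only ballpark):

```python
import zlib

text = ("the quick brown fox jumps over the lazy dog " * 50).encode()

raw_bits = len(text) * 8
compressed_bits = len(zlib.compress(text, 9)) * 8
words = len(text.split())

print(raw_bits, compressed_bits)  # highly redundant text compresses dramatically
print(compressed_bits / words, "bits per word (crude estimate for this sample)")
```

Published estimates of English entropy use much more careful models than zlib; presumably that's the ballpark the 5 bits/word figure comes from.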

[-] scarabic@lemmy.world 2 points 9 hours ago

Right? They do nothing to expand upon what this plainly wrong claim is supposed to actually mean. Goddamn scientists need a marketing department of their own, because the media sphere in general sells their goods however the fuck they feel like packaging them.

[-] meyotch@slrpnk.net 5 points 16 hours ago

You may be misunderstanding the bit measure here. It’s not ten bits of information in the storage sense, basically a single byte. It’s ten binary yes/no decisions, equivalent to evaluating 1024 distinct possibilities.

The measure comes from information theory but it is easy to confuse it with other uses of ‘bits’.
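To put numbers on it (a trivial sketch of the point above):

```python
import math

yes_no_decisions = 10
possibilities = 2 ** yes_no_decisions
print(possibilities)             # 1024 distinct outcomes distinguished per second
print(math.log2(possibilities))  # 10.0 -- the same quantity expressed in bits
```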

[-] Buffalox@lemmy.world 4 points 16 hours ago

What? This is the perfectly normal meaning of bits. 2^10 = 1024.

[-] meyotch@slrpnk.net 4 points 12 hours ago

Only when you are framing it in terms of information entropy. I think many of those misunderstanding the study are thinking of bits as part of a standard byte. It’s a subtle distinction but that’s where I think the disconnect is

[-] Buffalox@lemmy.world 1 points 10 hours ago* (last edited 10 hours ago)

Yes, the study is probably fine; it's the article that fails to clarify, before using the term, that they are not talking about bits the way bits are normally understood.

[-] credo@lemmy.world 3 points 14 hours ago* (last edited 14 hours ago)

I think we understand a computer can read this text far faster than any of us. That is not the same as conscious thought, though; it’s simply following an algorithm of yes/no decisions.

I’m not arguing with anything here, just pointing out the difference in what CPUs do and what human brains do.

[-] Australis13@fedia.io 14 points 17 hours ago

Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub

It doesn't look like these "bits" are binary, but "pieces of information" (which I find a bit misleading):

“Quick, think of a thing… Now I’ll guess that thing by asking you yes/no questions.” The game “Twenty Questions” has been popular for centuries as a thinking challenge. If the questions are properly designed, each will reveal 1 bit of information about the mystery thing. If the guesser wins routinely, this suggests that the thinker can access about a million possible items in the few seconds allotted. Therefore, the speed of thinking—with no constraints imposed—corresponds to 20 bits of information over a few seconds: a rate of 10 bits/s or less.
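The arithmetic in that passage works out like this (the 2-second figure below is my own illustrative choice; the paper only says "a few seconds"):

```python
questions = 20                       # well-designed yes/no questions
items_distinguishable = 2 ** questions
seconds_allotted = 2.0               # assumption for illustration only
print(items_distinguishable)         # 1048576 -- about a million possible "things"
print(questions / seconds_allotted)  # 10.0 bits/s, or less if it takes longer
```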

The authors do draw a distinction between the sensory processing and cognition/decision-making, at least:

To reiterate: human behaviors, including motor function, perception, and cognition, operate at a speed limit of 10 bit/s. At the same time, single neurons can transmit information at that same rate or faster. Furthermore, some portions of our brain, such as the peripheral sensory regions, clearly process information dramatically faster.

[-] scarabic@lemmy.world 2 points 9 hours ago* (last edited 9 hours ago)

So ten concepts per second? Ten ideas per second? This sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course this is still miserably badly defined.

[-] terminhell@lemmy.dbzer0.com 3 points 12 hours ago

Crazy how a biological analog lump is capable of even a fraction of what a brain can do.

[-] leaky_shower_thought@feddit.nl 6 points 17 hours ago

I can agree to some extent with why it could be at 10 bits/sec.

The brain is known to take some shortcuts when parsing/speed reading, but it slows down when we try to extract details from written works. It is also more tiring to scrutinize details than to just read articles.

I was surprised that they were able to measure the speed at all.

[-] GamingChairModel@lemmy.world 2 points 13 hours ago

The Caltech release says they derived it from "a vast amount of scientific literature" including studies of how people read and write. I think the key is going to be how they derived that number from existing studies.

this post was submitted on 26 Dec 2024
43 points (68.7% liked)
