Human thought crawls at 10 bits per second, Caltech study finds
(www.techspot.com)
Some parts of the paper are available here: https://www.sciencedirect.com/science/article/abs/pii/S0896627324008080?via%3Dihub
It doesn't look like these "bits" are binary bits, but rather "pieces of information" (which I find a bit misleading):
The authors do draw a distinction between sensory processing and cognition/decision-making, at least:
So ten concepts per second? Ten ideas per second? That sounds a little more reasonable. I guess you have to read the word “bit” like you’re British, and it just means “part.” Of course, this is still miserably ill-defined.
But our brains are not digital, so they cannot be measured in binary bits.
There is no other definition of bit that is valid in a scientific context. Bit literally means "binary digit".
Information theory, using bits, is applied to the workings of the brain all the time.
How do you know there is no other definition of bit that is valid in a scientific context? Are you saying a word can't have a different meaning in a different field of science?
Because actual neuroscientists understand and use information theory.
Actual neuroscientists define their terms in their papers. Like the one you refuse to read because you've already decided it's wrong.
Actual neuroscientists do not create false definitions for well defined terms. And they absolutely do not need to define basic, unambiguous terminology to be able to use it.
Please define 'bit' in neuroscientific terms.
Binary digit, or the minimum additional information needed to distinguish between two different equally likely states/messages/etc.
It's the same usage as in information theory, because information theory applies to, and is directly used by, virtually every relevant field of science that touches information in any way.
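To make that concrete, here's a tiny Python sketch (my own illustration, not anything from the paper): the number of bits needed to single out one of N equally likely alternatives is just log2(N), and nothing about that calculation requires the underlying system to store anything as 0s and 1s.

```python
import math

def bits_to_distinguish(n_alternatives: int) -> float:
    """Bits of information needed to single out one of n equally likely alternatives."""
    return math.log2(n_alternatives)

print(bits_to_distinguish(2))     # 1.0  -> two equally likely states is exactly 1 bit, by definition
print(bits_to_distinguish(26))    # ~4.7 -> picking one of 26 equally likely letters
print(bits_to_distinguish(1000))  # ~10  -> picking one of 1000 equally likely "concepts"
```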
Brains are not binary. I asked you to define it in neuroscientific terms.
Information is information. Everything can be described in binary terms.
Binary digit is how actual brain scientists understand bit, because that's what it means.
But "brains aren't binary" is also flawed. At any given point, a neuron is either firing or not firing. That's based on a buildup of potentials based on the input of other neurons, but it ultimately either fires or it doesn't, and that "fire/don't fire" dichotomy is critical to a bunch of processes. Information may be encoded other ways, eg fire rate, but if you dive down to the core levels, the threshold of whether a neuron hits the action potential is what defines the activity of the brain.
And yet you were already shown by someone else that the paper that you refuse to read is using its terms correctly.
I think what you really mean is that brains are not numeric. It's the "digit" part that is objectionable, not the "binary" part, which, as an adjective for "digit", just specifies one way of encoding a portion of a number.
But in the end it's a semantic argument that really doesn't have a lot to do with the thesis.
Indeed not. So using language specific to binary systems - e.g. bits per second - is not appropriate in this context.
All information can be stored in a digital form, and all information can be measured in base 2 units (of bits).
But it isn't stored that way and it isn't processed that way. The preprint appears to give an equation (beyond my ability to understand) which explains how they came up with it.
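For what it's worth, the usual flavor of such estimates (and this is just the generic Shannon-style calculation, not necessarily the paper's actual equation) is: information rate = bits per action times actions per second. A back-of-the-envelope version in Python, with illustrative numbers:

```python
# Generic information-rate estimate: bits/second = bits per symbol * symbols per second.
# The numbers below are illustrative, not figures taken from the paper.

bits_per_character = 1.0   # roughly Shannon's classic estimate for printed English
words_per_minute = 120     # a fast typist
chars_per_word = 5         # conventional "word" length in typing tests

chars_per_second = words_per_minute * chars_per_word / 60
info_rate = bits_per_character * chars_per_second
print(f"{info_rate:.1f} bits per second")   # 10.0
```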
Your initial claim was that they couldn't be measured that way. You're right that they aren't stored as bits, but it's irrelevant to whether you can measure them using bits as the unit of information size.
Think of it like this: in the 1980s there were breathless articles about CD-ROM technology, and how, in the future, "the entire Encyclopaedia Britannica could be stored on one disc". How was it possible to know that? Encyclopedias were not digitally stored! You can't measure them in bits!
It's possible because you could define a hypothetical analog-to-digital encoder and then quantify how many bits coming off that encoder would be needed to store the entire corpus.
This is the same thing. You can ADC anything, and the spec of your ADC defines the bitrate you need to store the stream coming off it... in bits (per second).
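Concretely (my own toy numbers, nothing from the paper): pick a sample rate and a bit depth for your hypothetical encoder and the required bitrate falls out immediately.

```python
# Bitrate of a hypothetical analog-to-digital encoder:
# samples per second * bits per sample * number of channels.

sample_rate_hz = 44_100    # CD-quality sampling rate
bits_per_sample = 16       # CD-quality bit depth
channels = 2               # stereo

bitrate = sample_rate_hz * bits_per_sample * channels
print(f"{bitrate} bits per second")                           # 1411200
print(f"{bitrate / 8 / 1_000_000:.2f} megabytes per second")  # 0.18
```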
As has been shown elsewhere in this thread by Aatube a couple of times, they are not defining 'bit' the way you are defining it, but still in a valid way.