submitted 1 day ago* (last edited 1 day ago) by FlyingSquid@lemmy.world to c/technology@lemmy.world

Thanks to @General_Effort@lemmy.world for the links!

Here’s a link to Caltech’s press release: https://www.caltech.edu/about/news/thinking-slowly-the-paradoxical-slowness-of-human-behavior

Here’s a link to the actual paper (paywall): https://www.cell.com/neuron/abstract/S0896-6273(24)00808-0

Here’s a link to a preprint: https://arxiv.org/abs/2408.10234

[-] Aatube@kbin.melroy.org 7 points 1 day ago

External storage data and shannons are both called bits precisely because both are base 2. That does not mean they're the same thing. As the article explains it, a shannon is like one question in a game of 20 Questions.

[-] General_Effort@lemmy.world 1 points 1 day ago

Wrong. They are called the same because they are fundamentally the same. That's how you measure information.

In some contexts, one wants to distinguish between the theoretical information content and what is actually stored on a technical device. But that's a fairly subtle distinction.

[-] typeswithpenis@lemmynsfw.com 5 points 22 hours ago

A bit in the data sense is just an element of the set of booleans. A bit in the entropy sense is the amount of information revealed by an observation with two equally probable outcomes. These are not the same thing because the amount of information contained in a bit is not always equal to one bit of entropy. For example, if a boolean is known to be 0, then the amount of information it contains is 0 bits. If the boolean is equally likely to be 0 or 1, then the information content is 1 bit. It depends on the prior probability distribution.
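That dependence on the prior distribution can be sketched numerically (a minimal illustration; the function name is mine, not something from the thread):

```python
import math

def entropy_bits(p: float) -> float:
    """Shannon entropy (in bits/shannons) of a boolean with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # the outcome is certain: observing it reveals nothing
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

print(entropy_bits(0.5))  # 1.0   -- a fair coin carries a full shannon
print(entropy_bits(1.0))  # 0.0   -- a known value carries no information
print(entropy_bits(0.9))  # ~0.469 -- a biased bit carries less than one shannon
```

So one stored boolean always occupies one bit of storage, but its entropy ranges anywhere from 0 to 1 shannon depending on the prior.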

[-] General_Effort@lemmy.world 1 points 7 hours ago

In some contexts, a bit can refer to a boolean variable, a flag. In other contexts, it may refer to the voltage at a certain point, or any number of other things. But when you are talking about bits/s then it's a measure of information.

> These are not the same thing because the amount of information contained in a bit is not always equal to one bit of entropy.

Yes, but as you know, this implies that the information is already available. You can use that knowledge to create a compression algorithm, or to define a less redundant file format. That's very practical.
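The compression point can be sketched with Python's standard-library zlib (a toy illustration of my own, not anything from the paper): a megabyte of a boolean that is "known to be 0" has almost no information content, and a general-purpose compressor can exploit that redundancy, while bytes with no usable structure barely compress at all.

```python
import os
import zlib

# 1,000,000 bytes of storage, but near-zero entropy: the value is always 0.
predictable = b"\x00" * 1_000_000
# 100,000 bytes with no structure a compressor can exploit.
unpredictable = os.urandom(100_000)

print(len(zlib.compress(predictable)))    # on the order of 1 KB
print(len(zlib.compress(unpredictable)))  # roughly 100 KB, i.e. barely smaller
```

The storage size is fixed by the format; the compressed size tracks the actual information content, which is exactly the gap the thread is arguing about.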

We can also be a bit philosophical and ask: How much information does a backup contain? The answer could be: By definition, 0 bits. That's not a useful answer, which implies a problem with the application of the definition.

A more interesting question might be: How much information is contained in a file that stores the first 1 million digits of the number π?
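One way to make that question concrete: a program of a few fixed lines can emit arbitrarily many digits of π, so the file's algorithmic information content is far smaller than its raw size. A sketch using Gibbons' unbounded spigot algorithm (my choice of algorithm; the thread doesn't name one):

```python
def pi_digits(n: int) -> list[int]:
    """First n decimal digits of pi via Gibbons' unbounded spigot algorithm."""
    q, r, t, k, m, x = 1, 0, 1, 1, 3, 3
    out = []
    while len(out) < n:
        if 4 * q + r - t < m * t:
            # The next digit is now pinned down; emit it and rescale.
            out.append(m)
            q, r, m = 10 * q, 10 * (r - m * t), (10 * (3 * q + r)) // t - 10 * m
        else:
            # Consume the next term of the series to narrow the interval.
            q, r, t, k, m, x = (q * k, (2 * q + r) * x, t * x, k + 1,
                                (q * (7 * k + 2) + r * x) // (t * x), x + 2)
    return out

print(pi_digits(10))  # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
```

A file holding 1 million digits occupies megabytes, yet this generator plus the number 1,000,000 reproduces it exactly, which is why "information content" and "file size" come apart here.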

[-] Aatube@kbin.melroy.org 3 points 1 day ago

I don't see how that can be a subtle difference. How is a bit of external storage data only subtly different from an information content that says the probability of an event occurring is ½?

[-] General_Effort@lemmy.world 0 points 1 day ago

It's a bit like asking for the difference between the letter "A" and ink on a page in the shape of the letter "A". Of course, one would first have to explain how they are usually not different at all.

BTW, I don't know what you mean by "external storage data". The expression doesn't make sense.

this post was submitted on 26 Dec 2024