
Music is just layered simple patterns, and our brains LOVE IT.

Sound is pressure waves; musical notes are specific patterns of pressure waves. Melodies are repeated musical notes. Songs are repeated melodies following a standard structure.
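
If you want to see that idea as code, here's a rough sketch (assuming you have NumPy installed; the sample rate, frequencies, and motif are just made-up examples): each note is a sine wave at one frequency, and the "melody" is just those patterns strung together.

```python
# Minimal sketch: a note as a pressure-wave pattern, a melody as repeated notes.
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def note(freq_hz, duration_s=0.25):
    """One note: a sine wave at a single frequency."""
    t = np.linspace(0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    return np.sin(2 * np.pi * freq_hz * t)

# A tiny repeated motif (C5, E5, G5), played twice.
motif = [523.25, 659.25, 783.99]
melody = np.concatenate([note(f) for f in motif] * 2)

print(melody.shape)  # the whole "song" is one long pressure-wave pattern
```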

Our brains love trying to decode and parse all these overlapping patterns.

Maybe not really a shower thought and more wild speculation.

[-] ascense@lemm.ee 5 points 5 days ago* (last edited 5 days ago)

I strongly believe that our brains are fundamentally just prediction machines. We strive for a specific level of controlled novelty, but for the most part 'understanding' (i.e. being able to predict) the world around us is the goal. Boredom pushes us beyond getting too comfortable and simply sitting in the already familiar, and one of the biggest pleasures in life is the 'aha' moment when understanding finally clicks into place and we feel we can predict something novel.

I feel this is also why LLMs (ChatGPT etc.) can be so effective at working with language, and why they occasionally seem to behave so humanlike -- the fundamental mechanism is essentially the same as in our brains, if massively more limited. Animal brains continuously adapt to predict sensory input (and, to an extent, their own output), while LLMs learn to predict a sequence of text tokens during a restricted training period.
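
To make "predict a sequence of text tokens" concrete, here's a toy sketch in Python. It uses plain bigram counts instead of a neural network, so it only illustrates the training target (predict what comes next), not how LLMs are actually built; the corpus and function names are invented for the example.

```python
from collections import defaultdict, Counter

corpus = "the cat sat on the mat the cat ran".split()

# "Training": count which token tends to follow each token.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(token):
    """Return the most likely next token seen during training."""
    following = counts[token]
    return following.most_common(1)[0][0] if following else None

print(predict_next("the"))  # -> "cat"
```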

It also seems to me that the strongest example of this kind of prediction in animals is noticing (and becoming wary) when something feels 'off' about the environment around us. We can easily sense specific kinds of small changes to our surroundings that signify potential danger, even in seemingly unpredictable natural environments. From an evolutionary perspective this also seems like the most immediately beneficial aspect of this kind of predictive capability. Interestingly, this kind of prediction seems to happen even at the level of individual neurons. As predictive capability improves, it also necessitates an increasingly deep ability to model the world around us, leading to deeper cognition.

[-] Yondoza@sh.itjust.works 1 point 4 days ago* (last edited 4 days ago)

I agree, LLMs have the amazingly human ability to bumble into the right answer even if they don't know why.

It seems to me that a good analogy for our experience is a whole bunch of LLMs, each optimized for a different task, with some other LLM acting as a scheduler/administrator over the lower-level models, and that scheduler being consciousness. It might be more layers deep than that, but that's my guess with no neurological or machine learning background.
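
Purely to illustrate that guess, here's a hypothetical sketch of the "scheduler over specialists" idea; the specialist functions and the routing rule are made up for the example and don't correspond to any real architecture.

```python
# Hypothetical specialists, each "optimized" for a different task.
def vision_model(task):
    return f"[vision] processed: {task}"

def language_model(task):
    return f"[language] processed: {task}"

def motor_model(task):
    return f"[motor] processed: {task}"

SPECIALISTS = {
    "see": vision_model,
    "say": language_model,
    "move": motor_model,
}

def scheduler(task_type, task):
    """The 'conscious' layer: pick which lower-level specialist handles the task."""
    handler = SPECIALISTS.get(task_type, language_model)  # fall back to language
    return handler(task)

print(scheduler("see", "red ball"))
```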
