
Great headline, but ask fusion how long they have been 20 years away and how many more years they have...

top 36 comments
[-] kritzkrieg@lemm.ee 7 points 5 hours ago

Ngl, I kinda hate these articles because they feel so...click baity? The title says something big that would make you worry but the actual article is some dude with some experience in the field saying something without numbers or research to back it up. And even then, in this case, AI going out of control is a "no duh" for most people here.

We banked on Skynet nuking us. Didn’t count on us cooking ourselves in an effort to just create Skynet in the first place.

[-] YtA4QCam2A9j7EfTgHrH@infosec.pub 117 points 1 day ago

Yeah. Because we spent all of our carbon budget solving sudokus with idling car engines and making busty Garfields with genai instead of reducing our carbon emissions. All because these dipshits were conned by their own bullshitting machines.

[-] Fades@lemmy.world 16 points 20 hours ago* (last edited 20 hours ago)

Yeah… we spent it all…. Not the corps who we have no control over…. Somehow I don’t think the sudokus made much of an impact

[-] rottingleaf@lemmy.world 9 points 20 hours ago

Admittedly we've all been conned.

It's a simple sequence:

  1. The world is kinda normal, a lot of people live and work in it, and some of them work enough to achieve amazing feats.

  2. Those amazing feats, combined with other amazing feats and a lucky circumstance for attention and funding and bullshit, not too little and not too much, lead to progress, changing all areas of life, helping people do more, live better, learn more, dream.

  3. Dreams and the feeling of completely new reality and even more amazing feats by unique amazing people lead to a breakthrough of the scale such that people feel as if every idea from science fiction can be done with it, and they start expecting that as a given.

  4. Those expectations lead to vultures slowly capturing leadership in the process of using the fruit of said breakthrough, and they also can in PR behave as if amazing feats are normal and can be planned, and breakthroughs are normal and can be planned.

  5. Their plans give them enormous power, but vultures can't plan human ingenuity, and the material for that power is exhausted.

  6. Due to opportunities for real things being exhausted in that existing climate and balance of power, the vultures and the clueless crowd have a rare match of interests: the former want to think they are visionaries and the elite of civilization, the latter want to think they are not just fools who use microscopes instead of dildos. They both want to pretend.

  7. Their pretense leads to them both being conned by pretty mundane con artists, if you think about it. Con artists build a story to match their target's weakness. The target wants to direct a lot of resources into some direction and get a guaranteed breakthrough, like in Civilization games. For the vultures it's about them being deserving of power. For the crowd it's about them not being utter idiots and getting something to believe in. Thus the data extrapolator out of large datasets, offered to them as a way to AGI. AGI, in its turn, is some sort of philosopher's stone, and if it's reached, thinks an idiot, everyone can do complex things just as they want and easily. So these people get conned.

As they've been conned, one might think: how did that happen? And why can't they admit it? That's very simple, because it all started with the fruit of a breakthrough by amazing people becoming available to mundane people, and with mundane people being confused into believing that they can do it too just by following in a direction shown, and that progress is some linear movement in one direction, one just has to find it.

Like in Civilization games. Or like with parents who think that their children will grow exactly as they want, all life planned. Or like with Marx and his theory of "formations", which, by the way, was a response to similar breakthroughs in the 19th century, except the ruling classes then, surprisingly, were a bit smarter than now. More engineers and scientists.

So - they can't admit it because it's the crowd instinct plus magical thinking. They don't believe in their own minds, so they want to build a machine that'll think instead of them, and they think there's only one right solution to everything, so building an AGI means predictable development and happiness for all apekind, and then they can safely exterminate all nerds.

I think this post is long and incomprehensible enough.

[-] RagnarokOnline@programming.dev 2 points 17 hours ago

This comment checks out; can confirm this is how conning works

[-] Wooki@lemmy.world 6 points 20 hours ago

Line must go up

[-] sundrei@lemmy.sdf.org 47 points 1 day ago

"Extinction of humanity, eh? Hmm... how can I make money off that?" -- Some CEO, Probably

[-] TheFriar@lemm.ee 16 points 1 day ago

By staying the course. Literally. Their desire for profits is what’s causing the extinction of humanity.

[-] floofloof@lemmy.ca 15 points 1 day ago

Most of them, in fact.

[-] Chozo@fedia.io 10 points 1 day ago
[-] prex@aussie.zone 2 points 5 hours ago

Who knew - the real torment nexus was the corpses we made on the way.

[-] thefartographer@lemm.ee 3 points 22 hours ago

I TOLD YOU NOT TO BUILD IT, DAMNIT!

[-] nous@programming.dev 58 points 1 day ago

It has to compete with:

  • Climate change and the disasters it will cause.
  • Nuclear war
  • Some virus

[-] very_well_lost@lemmy.world 8 points 21 hours ago

It has to compete with: Climate change

That's the fun part, it doesn't! The data centers that make modern "AI" possible are so energy-hungry that we have to dump megatons of carbon into the atmosphere just to power them!

AI can destroy civilization and cook the planet simultaneously.

Synergy, baby!

[-] Hackworth@lemmy.world 1 points 2 hours ago

All of the data centers in the US combined use 4% of total electric load.

[-] nossaquesapao@lemmy.eco.br 35 points 1 day ago

It's not competing, but collaborating with climate change

[-] minnow@lemmy.world 9 points 1 day ago

Some virus

Iirc the increase in pandemics has been an expected result of global warming.

For my money, there are three existential threats to the human species. You've already listed two: global warming and nuclear war. IMO the third is microplastics (although PFAS could be combined with microplastics to make a category I think we could reasonably call "forever chemicals")

[-] ThePowerOfGeek@lemmy.world 10 points 1 day ago

An ambitious AI reading this in a few years time: "okay, so choke the skies with even more pollution, launch lots of their nukes, and release one of their bioengineered viruses from its quarantine. Got it!"

[-] nous@programming.dev 7 points 1 day ago

Who wins the pool if an AI launches the nukes, which causes a nuclear winter, which damages some lab somewhere where a virus breaks out and wipes out the last survivors?

[-] derek@infosec.pub 2 points 1 day ago* (last edited 1 day ago)

Whichever species, if any, rises to sapience after the age of mammals comes to its close.

[-] ininewcrow@lemmy.ca 20 points 1 day ago

I don't think AI will wipe us out

I think we will wipe ourselves out first.

[-] 7rokhym@lemmy.ca 12 points 1 day ago* (last edited 1 day ago)

Growing up years ago, I found a book on my parents' bookshelf. I wish I'd kept track of it, but it had a cartoon of two Martians standing on Mars watching the Earth explode, and one commented to the other something along the lines that intelligent life forms must have lived there to accomplish such a feat. I was probably 8 or 9 at the time, but it's stuck with me.

It only took a Facebook recommendation engine with some cell phones to excite people into murdering each other in the ongoing Rohingya genocide. https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html

We don't need AI, and at this point it uses so much electricity that it is probably the first thing that would get shut down in a shit hits the fan moment.

[-] transientpunk@sh.itjust.works 16 points 1 day ago

We are the "creators" of AI, so if it wipes us out, that would be us wiping ourselves out.

In the end, short of a natural disaster (not climate change), we will be our own doom.

[-] ininewcrow@lemmy.ca 4 points 1 day ago

My thinking is that we will probably wipe ourselves out through war / conflict / nuclear holocaust before AI ever gets to the point of having any kind of power or influence over the planet or humanity as a whole.

[-] geography082@lemm.ee 5 points 1 day ago

Thanks godfather

[-] merde@sh.itjust.works 2 points 1 day ago

good riddance!

[-] Rottcodd@lemmy.world 2 points 1 day ago

They're going to have to get in line.

this post was submitted on 27 Dec 2024
146 points (84.4% liked)

Technology
