submitted 1 month ago by misk@sopuli.xyz to c/technology@lemmy.world
[-] MaggiWuerze@feddit.org 79 points 1 month ago

In contrast to stuff like AI training or crypto, chips at least fulfill an actually useful function, so I don't see the issue with their manufacturing consuming a lot of energy. Or should we make the same comparison for cars or medicine?

[-] CosmoNova@lemmy.world 37 points 1 month ago

Right? I was just thinking that entire countries run on chips so it sort of sounds about right at least.

[-] skye@lemmy.world 4 points 1 month ago

AI Training, compared to crypto, has at least been used in medicine to:

Create novel proteins based on specific requirements (useful for developing medicine): https://www.cell.com/chem/fulltext/S2451-9294(23)00139-0?_returnURL=https%3A%2F%2Flinkinghub.elsevier.com%2Fretrieve%2Fpii%2FS2451929423001390%3Fshowall%3Dtrue

Detect possible cancer: https://hms.harvard.edu/news/new-artificial-intelligence-tool-cancer#:~:text=CHIEF%20achieved%20nearly%2094%20percent,datasets%20containing%2011%20cancer%20types.

And there are many more uses you can easily find if you look into it. Don't assume LLM slop is all there is to AI. Even LLMs probably have their uses in things actually relating to language, such as translation.

[-] shalafi@lemmy.world 60 points 1 month ago

entire countries

Not going to even read this horseshit. Which countries? Brazil or Vatican City?

Fuck these headlines. If they have valid points to make, I'll never see them. Grow the fuck up and be journalists or I don't have time.

[-] TimeSquirrel@kbin.melroy.org 40 points 1 month ago

Okay. What are we supposed to do, not use chips? They're kind of a main character of the 21st century.

This would be a great application of those nuke plants fuckin' Google and Amazon want to build.

[-] ech@lemm.ee 16 points 1 month ago

What are we supposed to do[...]?

All of these articles treat energy usage like a massive crime, but miss/ignore that the world's energy use needs to go up as we increasingly turn to electric alternatives. The problem truly lies in how we generate electricity, not how we use it.

So the actual answer to your question is intense and rapid investment in sustainable, non-carbon energy production. An infrastructure revamp to rival any other in history. It would've been far better to do so decades ago, but that's no longer an option. Anything else is just half measures we can't afford.

[-] leisesprecher@feddit.org 9 points 1 month ago

We could start by not requiring new chips every few years.

For 90% of users, there hasn't been any actual gain within the last 5-10 years. Older computers work perfectly fine, but artificial slowdowns and bad software cause laptops to feel sluggish for most users.

Phones haven't really advanced either, but apps and OSes are too bloated and the hardware is impossible to repair, so a new phone it is.

Every device nowadays needs WiFi and AI for some reason, so of course a new dishwasher has more computing power than an early Cray, even though none of it is ever used.

[-] Bassman1805@lemmy.world 7 points 1 month ago

Tech companies are terrified of becoming commodities, even though a good chunk of them basically are at this point.

Intel would probably be in a better spot if they'd just leaned into that rather than trying to regain the market dominance they once had.

Same for cars.

Why do we need a new model every year?

Automotive design has been functionally complete for decades.

[-] HubertManne@moist.catsweat.com 1 points 1 month ago

Yeah sure and we might as well sell fans that can go on for decades while we are at it /s

[-] sunzu2@thebrainbin.org 0 points 1 month ago
[-] HubertManne@moist.catsweat.com 1 points 1 month ago

The old metal fans that seemed like they would cut your finger off, like many older appliances, would last decades if you did not take care of them, and a lifetime or more if they were maintained, which mostly meant being cleaned and lubricated.

[-] sugar_in_your_tea@sh.itjust.works 2 points 1 month ago

Simple, we should all become Mentats!

[-] lnxtx@feddit.nl 9 points 1 month ago

Are 7 nm chips more energy intensive to manufacture than older 100 nm ones?
Or is it just scale: more chips to manufacture, more energy needed?

[-] n3m37h@sh.itjust.works 15 points 1 month ago

Cutting-edge chips consume more electricity to manufacture, as there are a crapload more steps than in older processes. All chips are made on the same size silicon wafers regardless of the fabrication process.

Gamers Nexus has some good videos about chip manufacturing if you are interested

[-] Kidplayer_666@lemm.ee 6 points 1 month ago

“Thanks Steve”

[-] sugar_in_your_tea@sh.itjust.works 2 points 1 month ago

I'd be interested in a "payback period" for modern chips, as in, how long the power savings of a modern chip take to pay back its manufacturing cost. Basically, calculate performance per watt with some benchmark and compare that to manufacturing cost (perhaps excluding R&D to simplify things).

[-] n3m37h@sh.itjust.works 2 points 1 month ago

Honestly, if you go through all the node changes you could do the math and figure it out. For example, N3 to N2 is a 15-20% performance gain at the same power usage.

It wouldn't be exact, but I doubt any company will tell you how much power is used in the creation of a single wafer.
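The payback idea above can be sketched in a few lines of Python. Every number here is a made-up placeholder (embodied energy and chip wattages are illustrative assumptions, not real fab or product data), since, as noted, nobody publishes the per-wafer figures:

```python
# Back-of-envelope "energy payback" for upgrading to a newer-node chip.
# All figures are illustrative placeholders, not real fab data.

def energy_payback_hours(embodied_kwh, old_watts, new_watts):
    """Hours of operation until the energy saved by the more efficient
    chip equals the energy spent manufacturing it."""
    savings_watts = old_watts - new_watts
    if savings_watts <= 0:
        raise ValueError("newer chip must draw less power for the same work")
    return embodied_kwh * 1000 / savings_watts

# Hypothetical: 1500 kWh embodied energy; the old chip draws 100 W and the
# new one does the same work at 80 W (a ~20% efficiency gain, roughly in
# line with the N3 -> N2 figure mentioned in this thread).
hours = energy_payback_hours(1500, 100, 80)
print(f"{hours:.0f} hours (~{hours / 24 / 365:.1f} years)")
# -> 75000 hours (~8.6 years)
```

With these particular placeholder numbers the chip would need to run for years before the efficiency gain pays back the manufacturing energy, which is why the real embodied-energy figure matters so much for the comparison.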

[-] Valmond@lemmy.world 7 points 1 month ago

Older chips definitely consume more watts per unit of processing power, and newer ones are usually faster on top of that too.

Talking about usage, not construction.

[-] eleitl@lemm.ee 3 points 1 month ago

Megawatt is a unit of power, not energy.
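To illustrate the distinction with hypothetical numbers: power is a rate, and you multiply it by time to get energy.

```python
# Power is a rate (MW); energy is power integrated over time (MWh).
power_mw = 100                 # hypothetical fab drawing 100 MW continuously
hours_per_day = 24
energy_mwh = power_mw * hours_per_day
print(energy_mwh)              # -> 2400 (MWh consumed per day)
```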

[-] partial_accumen@lemmy.world 2 points 1 month ago

Are the most power-hungry steps of semiconductor production happening 24/7? Could we simply align manufacturing times with useful solar production times? Then there'd be no need to store all the solar power, since most of it would be consumed immediately for manufacturing. Then pass a rule that semiconductor fabs have to build out their own solar arrays to cover most of their power consumption.

[-] Kidplayer_666@lemm.ee 3 points 1 month ago

Chances are yes. The machines take such an astronomical amount of money and energy to build that even if electricity during dead times costs a bunch more (which for businesses it probably does), it's probably still worth it just to keep them running at maximum capacity.

this post was submitted on 30 Oct 2024
114 points (88.0% liked)
