
Hard to believe it's been 24 years since Y2K (2000). It feels like we've come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking bleak in several ways.

I'm a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, because people are either too poor or don't want to play yet another live-service AAA disaster like nearly every one released lately. Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out either fails outright or undersells. So many gaming studios have been shuttered or are being shuttered, and Microsoft is basically one member of an oligopoly alongside Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting on the brakes for developing their next line of GPUs; we're not going to see huge gains in performance anymore because AMD hasn't caught up yet, so Nvidia has no reason to innovate. They're just going to sell the top cards of their next line for $1,500 a pop, with a 10% increase in performance rather than the 50 or 60% we really need. We still don't have the capability to play games in full native 4K at 144 Hz; that's at least a decade away.

Virtual reality is on the verge of collapse because Meta is basically the only real player in that space; between them and the Valve Index it's practically a monopoly. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has it, so use isn't very widespread. We're again a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It's so clowny, ridiculous, and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted material, supposedly for the public good. Now they're just going to snap their fingers and morph into a for-profit company. So they can basically steal anything copyrighted, claim it's for the public good, and then swap to a for-profit model whenever they like. It doesn't make any sense, and it just looks like they're going to be a vessel for widespread economic poverty...

It just seems like there are a lot of bubbles about to burst all at the same time. I don't see how things can possibly get better for a while now.

[-] Telorand@reddthat.com 206 points 3 months ago

I'm a PC gamer, and it looks like things are stagnating massively in our space.

I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

Overall, I don't see things the way you see them. I recommend taking a break from social media, go for a walk, play games you like, and fuck the trajectory of tech companies.

Live your life, and take a break from the doomsaying.

[-] lvxferre@mander.xyz 55 points 3 months ago

I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

Amen.

Indie games might not be flashy, but they're often made with love and a genuine concern for giving you a fun experience. They also lack all the abusive DRM and intrusive anti-cheat systems that A³ games often have.

[-] rbos@lemmy.ca 25 points 3 months ago

They also tend to have linux support. Where the AAA companies want to eat the entire mammoth and scorn the scraps, small companies can thrive off of small prey and the offal. :)

[-] 9point6@lemmy.world 11 points 3 months ago

Equating Linux enthusiasts to offal is a bold move on this site

[-] lvxferre@mander.xyz 10 points 3 months ago

It's a great analogy though - Linux users aren't deemed profitable by the A³ companies, just like offal is unjustly* deemed yucky by your typical person.

*I do love offal though. And writing this comment left me craving chicken livers with garlic and rosemary over sourdough bread. Damn.

[-] Telorand@reddthat.com 16 points 3 months ago

And I'll add on to that, even if every GPU company stops innovating, we'll still have older cards and hardware to choose from, and the games industry isn't going to target hardware nobody is buying (effectively pricing themselves out of the market). Indie devs especially tend to have lower hardware requirements for their games, so it's not like anyone will run out of games to play.

[-] dinckelman@lemmy.world 40 points 3 months ago

Genuinely wish more people understood this. I've mostly only been playing indie games for the past few years, and it's by far the best fun I've had in gaming. There are a ton of unbelievably creative, unique games out there. Not to mention that 99% of them are a single-purchase experience instead of a cash treadmill.

[-] Telorand@reddthat.com 9 points 3 months ago

cash treadmill

Borrowing this turn of phrase

[-] GBU_28@lemm.ee 20 points 3 months ago* (last edited 3 months ago)

Hello indie gamer, it's me, you, from the future.

I'd like to introduce you to PATIENT indie gaming.

The only games I play are small team, longer running, well documented, developers are passionate, mods exist, can play on a potato or a steam deck, etc

Because I'm patient, I don't ever get preorder, Kickstarter, prealpha disappointed.

I know exactly what I'm getting, I pay once, and boom, I own a great game forever. (You can more often fully DL indie games.)

[-] Telorand@reddthat.com 8 points 3 months ago

Bruh, what do you mean "future?" That's me right now!

[-] GBU_28@lemm.ee 7 points 3 months ago

Bro I'm from the future you can't ask me stuff like that, be patient, you'll figure it out

[-] EnderMB@lemmy.world 8 points 3 months ago* (last edited 3 months ago)

My only fear with the indie gaming industry is that many of them are starting to embrace the churn culture that has led AAA gaming down a dark path.

I would love an app like Blind that allows developers on a game to anonymously call out the grinding culture of game development, alongside practices like firing before launch and removing workers from the credits. Review games solely on how the devs treated their workers, and we might see some cool correlations between good games and good culture.

[-] Telorand@reddthat.com 8 points 3 months ago

There's certainly room to grow with regard to workers' rights. I think you could probably solve at least a few of them if they were covered by a union, and publishers who hire them would have to bargain for good development contract terms.

[-] frezik@midwest.social 54 points 3 months ago

. . . with 10% increase in performance rather than 50 or 60% like we really need

Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore's Law has lately been more on the economic side than on actually packing transistors in.

We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away

Sure you can, today, and this is why:

So many gaming companies are incapable of putting out a successful AAA title because . . .

Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with microtransaction nonsense. None of them have innovated anything in years; that's all been done at the indie level. Which is where the real party is at.

Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games from 50 years of development. We're not even close to innovating new types of games that can run on that. Planet X2 is a recent RTS game that runs on a Commodore 64. The genre didn't really exist at the time, and the control scheme is a bit wonky, but it's playable. If you can essentially backport a genre to the C64, what could we do with PS4 level hardware that we just haven't thought of yet?

Yeah, there will be worse graphics because of this. Meh. You'll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they're trying to do.

I want smaller games with worse graphics made by people who are paid more to work less, and I'm not kidding.

[-] sugar_in_your_tea@sh.itjust.works 15 points 3 months ago

None of them have innovated anything in years

Well, they've innovated new ways to take up disk space...

There's a reason I don't play new release AAA games, and it's because they're simply not worth the price. They're buggy at launch, take up tons of disk space (with lots of updates the first few months), and honestly aren't even that fun even when the bugs are fixed. Indie games, on the other hand, seem to release in a better state, tend to be fairly small, and usually add something innovative to the gameplay.

The only reason to go AAA IMO is for fancy graphics (I honestly don't care) and big media franchises (i.e. if you want Spiderman, you have to go to the license holder), and for me, those lose their novelty pretty quickly. The only games I buy near release time anymore are Nintendo titles and indie games from devs I like. AAA just isn't worth thinking about, except the one or two each year that are actually decent (i.e. Baldur's Gate 3).

[-] Dkarma@lemmy.world 14 points 3 months ago* (last edited 3 months ago)

This post really nails my take on the issue. Give me original CS-level graphics or even AQ2 graphics, a decent story, more levels, and a few new little gimmicks (Rocket Arena grappling hook, anyone?!), and you don't need 4K blah blah bullshit.

The #1 game for kids is literally Minecraft or Roblox... 8-bit-level gfx outselling your horse-armor hi-res bullshit.

The last game I bought was 2 days ago: Mohaa Airborne for PC, for $5 at a pawn shop. Give me 100 games of that quality instead of anything PS5 ever made.

[-] solomon42069@lemmy.world 13 points 3 months ago* (last edited 3 months ago)

Here are the hours I've spent on indie games vs. AAA titles, according to my Steam library:

  • Indie - Valheim - 435 hours
  • Indie - Space Haven - 332 hours
  • Indie - Satisfactory - 215 hours
  • Indie - Dyson Sphere Program - 203 hours
  • AAA - Skyrim - 98 hours
  • AAA - Control - 47 hours
  • AAA - Far Cry 6 - 29 hours
  • AAA - Max Payne 3 - 43 minutes

If we're talking about value - the amount of playtime I've gotten out of games with simpler graphics and unique ideas blows the billions spent by the industry out of the water.

[-] BananaTrifleViolin@lemmy.world 53 points 3 months ago

As others have said, gaming is thriving. AAA and bloated incumbents are not doing well, but the indie sector is thriving.

VR is not on the verge of collapse, but it is growing slowly, as we still have not reached the right price point for a mobile, high-powered headset. Apple made a big play for the future of VR with its Apple Vision Pro, but that was not a short-term play; it was laying the groundwork for trying to control or shape a market that is still probably at least 5 if not 10 years away from something that will provide high-quality VR, untethered from a PC.

AI, meanwhile, is a bubble. We are not in an age of AI; we are in an age of algorithms. They are and will be useful, but they will not meet the hype or hyperbole being bandied about. Expect that market to pop, probably with spectacular damage to some companies.

Other computing hardware is not really stagnating; we are going through a generational transition period. AMD is pushing Zen 5, Intel its 14th gen, and all the chip makers are desperately trying to get on the AI bandwagon. People are not upgrading because they don't see the need: there aren't compelling software reasons to upgrade yet (AI is certainly not compelling consumers to buy new systems). They will emerge eventually.

The lack of any landmark PC AAA games is likely holding back demand for consumer graphics cards, and we're seeing similar issues with consoles. The games industry has certainly been here many times before. There is no Cyberpunk 2077 coming up; instead we've had flops like Star Wars Outlaws, or underperformers like Starfield. But look at the biggest game of last year: Baldur's Gate 3 came from a small studio and was a megahit.

I don't see doom and gloom, just the usual ups and downs of the tech industry. We happen to be in a transition period, and also being distracted by the AI bubble and people realising it is a crock of shit. But technology continues to progress.

[-] sugar_in_your_tea@sh.itjust.works 13 points 3 months ago

VR

Yeah, I think it's ripe for an explosion, provided it gets more accessible. Right now, your options are:

  • pay out the nose for a great experience
  • buy into Meta's ecosystem for a mediocre experience

I'm unwilling to do either, so I'm sitting on the sidelines. If I can get a headset for <$500 that works well on my platform (Linux), I'll get VR. In fact, I might buy 4 so I can play with my SO and kids. However, I'm not going to spend $2k just for myself. I'm guessing a lot of other people are the same way. If Microsoft or Sony makes VR accessible for console, we'll probably see more interest on PC as well.

People are not upgrading because they don’t see the need

Exactly. I have a Ryzen 5600 and an RX 6650, and it basically plays anything I want to play. I also have a Steam Deck, and that's still doing a great job. Yeah, I could upgrade things and get a little better everything, but I can play basically everything I care about (hint: not many recent AAA games in there) on reasonable settings on my 1440p display. My SO has basically the same setup, but with an RX 6700 XT.

I'll upgrade when either the hardware fails or I want to play a game that needs better hardware. But I don't see that happening until the next round of consoles comes out.

[-] realitista@lemm.ee 5 points 3 months ago

Yeah, Sony was my hope here, but despite a few great experiences, they have dropped the ball overall. I'm bored of the cartoony Quest stuff, so I'll probably not buy another headset for a good 5-10 years, until there's something with a good library and something equivalent to a high-end PC experience today.

[-] madjo@feddit.nl 39 points 3 months ago

We still don't have the capability to play games in full native 4K 144 Hertz.

And we really don't need that. Gameplay is still more important than game resolution. Most gamers don't even have hardware that would allow that type of resolution.

[-] 13esq@lemmy.world 25 points 3 months ago* (last edited 3 months ago)

I remember when running counter strike at 30fps on a 480p monitor meant you had a good computer.

Modern graphics are amazing, but they're simply not required to have a good gaming experience.

[-] j4k3@lemmy.world 37 points 3 months ago

Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

Nvidia is just playing conservative because it was massively overvalued by the market. The GPU use for AI is a stopover hack until hardware can be developed from scratch. The real life cycle of hardware is 10 years from initial idea to first consumer availability. The issue with the CPU in AI is quite simple. It will be solved in a future iteration, and this means the GPU will get relegated back to graphics or it might even become redundant entirely. Once upon a time the CPU needed a math coprocessor to handle floating point precision. That experiment failed. It proved that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task. The CPU must be restructured for a wider bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most likely solution in the long term. Solving this issue is likely to accompany more threading parallelism and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.

Human persistence of vision is not capable of matching higher speeds that are ultimately only marketing. The hardware will likely never support this stuff because no billionaire is putting up the funding to back up the marketing with tangible hardware investments. .. IMO.

Neo Feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game manufacturers. I do not make compromises in ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don't care about what everyone else does. I am not for sale and I will not sell myself for anyone's legalise nonsense or pay ownership costs to rent from some neo feudal overlord.

[-] Chocrates@lemmy.world 19 points 3 months ago

Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

I'm a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we were theorizing that Windows would get out of the OS business and just be a shell over a Unix kernel, and that never went anywhere.

[-] tias@discuss.tchncs.de 6 points 3 months ago

AI still needs a lot of parallelism but has low latency requirements. That makes it ideal for a large expansion card instead of putting it directly on the CPU die.

[-] j4k3@lemmy.world 8 points 3 months ago

Multithreading is parallelism and is poised to scale by a similar factor; the primary issue is simply getting tensors in and out of the ALU. Good enough is the engineering game. Having massive chunks of silicon lying around unused is a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the load. The AVX instruction set is capable of loading 512-bit-wide words in a single instruction; the problem is just getting them in and out in larger volume.

I speculate that the only reason this has not been done already is the marketability of single-thread speeds. Present thread speeds are insane, well into the radio realm of black-magic bearded-wizard sorcery. I don't think it is possible to make these bus widths wider and maintain the thread speeds, because it has too many LCR consequences. I mean, at around 5 GHz the concept of wires as connections and gaps as insulators is a fallacy, when capacitive coupling can make connections across any small gap.

Personally, I think this is a problem that will take a whole new architectural solution. It is anyone's game, unlike any other time since the late 1970s. It will likely be the beginning of the real RISC-V age and the death of x86. We are presently in the age of the 20+ thread CPU. If a redesign can make a 50-500 logical-core CPU that is slower in single-thread speed but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.

[-] Toes@ani.social 28 points 3 months ago* (last edited 3 months ago)

Wait till the Y2K38 event occurs.

[-] Telorand@reddthat.com 20 points 3 months ago

If only we had some way of working with a bigger integer...maybe we'd call it something like BigInteger...
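For anyone unfamiliar, the 2038 problem is just signed 32-bit `time_t` overflow: the Unix timestamp hits its maximum value on 2038-01-19 and wraps negative. A minimal Python sketch of the wraparound:

```python
# The "Y2K38" problem: Unix time stored as a signed 32-bit integer
# runs out on 2038-01-19 and wraps around to 1901.
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
INT32_MAX = 2**31 - 1  # 2147483647, the last representable second

def wrap32(n):
    """Simulate signed 32-bit integer overflow."""
    return (n + 2**31) % 2**32 - 2**31

last_moment = EPOCH + timedelta(seconds=INT32_MAX)
print(last_moment)  # 2038-01-19 03:14:07+00:00

# One tick later, the counter wraps to INT32_MIN, i.e. back to 1901:
wrapped = wrap32(INT32_MAX + 1)
print(EPOCH + timedelta(seconds=wrapped))  # 1901-12-13 20:45:52+00:00
```

In practice the fix is just widening to 64-bit time, as modern Linux and most languages already have; no BigInteger needed, since 64 bits of seconds lasts roughly 292 billion years.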

[-] Blackmist@feddit.uk 27 points 3 months ago

COVID also inflated a lot of tech stock massively, as everybody suddenly had to rely a lot more on it to get anything done, and the only thing you could do for entertainment was gaming, streaming movies, or industrial quantities of drugs.

Then that ended, and they all wanted to hold onto that "value".

It is a bubble, but whether it pops massively like in 2000, or just evens off to the point where everything else catches up, remains to be seen.

"The markets can remain irrational longer than you can remain solvent" are wise words for anyone thinking of shorting this kind of thing.

[-] schizo@forum.uncomfortable.business 26 points 3 months ago

Well, that's the doomer take.

The rumors are that the 80-series card is 10% faster than the 90-series card from last gen. That's not a "10%" improvement: assuming the prices stay the same, that's more like a 40% gen-over-gen improvement for the 80-class tier. I think a LOT of people don't realize how much slower the 4080 was compared to the 4090, and are vastly mis-valuing that rumor.
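As a rough sketch of that arithmetic (the ~30% gap between last gen's 80- and 90-class cards is my own ballpark from 4K benchmarks, not a figure from the rumor):

```python
# Rumor: the new 80-class card is ~10% faster than last gen's 90-class.
gap_within_last_gen = 1.30   # assumed: last-gen 90-class vs last-gen 80-class
rumored_gain = 1.10          # rumored: new 80-class vs last-gen 90-class

# Gen-over-gen gain within the 80-class tier, assuming the price holds:
gen_over_gen = gap_within_last_gen * rumored_gain
print(f"{(gen_over_gen - 1):.0%}")  # 43%, i.e. "more like 40%"
```

The point being: a "10% faster" headline compares across tiers, so at a fixed tier and price the jump compounds.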

I'd also argue the "GAMES MUST BE ULTRA AT 4K144 OR DON'T BOTHER" take is wrong. My gaming has moved almost entirely to my ROG Ally, and you know what? Shit is just as fun and way more convenient than the 7700X/3080 12GB desktop, even if it's 1080p low and not 1440p120. If the only thing a game has going for it is "ooh, it's pretty," then it's unlikely to be one of those games people care about in six months.

And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher budget stuff like BG 3 is, well, probably the best RPG since FO:NV (fight me!).

And yes, VR is in a shitty place because nobody gives a crap about it. I've got a Rift, Rift S, Quest, and a Quest 2, and you know what? It's not interesting. It's a fun toy, but it has zero sticking power, and that's frankly due to two things:

  1. It's not a social experience at all.
  2. There's no budget for the kind of games that would drive adoption, because there's no adoption to justify spending money on a VR version.

If you could justify spending the kind of money that would lead to having a cool VR experience, then yeah, it might be more compelling but that's been tried and nobody bought anything. Will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.

And AI is this year's crypto, which was last year's whatever; it's bubbles and VC scams all the way down, and pretty much always has been. Tech hops from thing to thing to go all in on, because they can hype it and cash out. Good for them; be skeptical of shit, but if it sticks it sticks, and if it doesn't it doesn't.

[-] sukotai@lemmy.world 20 points 3 months ago

It's time for you to play PAC-MAN, as I did when I was young 😂
No AI, no GPU, no shitcoin: you just have to eat ghosts, which is actually very strange when you think about it 🤪

[-] emax_gomax@lemmy.world 13 points 3 months ago

Correction: the ghosts are AI, and based on how many times they killed me, clearly a step above anything mainstream today (º ロ º๑).

[-] LordCrom@lemmy.world 16 points 3 months ago

I would love to have a VR headset that didn't require a damn account with a third party just to use it. I don't need an account for my monitor or my mouse. Plus, when I bought the thing it was just Oculus; then Meta bought it and promised nothing would change, before requiring a Meta account to use the fucking thing.

[-] magic_lobster_party@fedia.io 16 points 3 months ago

What’s happening is that support from VC money is drying up. Tech companies have for a long time survived on the promise that they will eventually be much more profitable in the future. It doesn’t matter if it’s not profitable today. They will be in the future.

Now we’re in a period where there’s more pressure on tech companies to be profitable today. That’s why they’re going for such anti consumer behaviors. They want to make more with less.

I’m not sure if there’s a bubble bursting. It could just be a plateau.

[-] 13esq@lemmy.world 6 points 3 months ago

I agree. Smartphones, for example, have hardly changed at all over the last ten years, but you don't see Apple and Samsung going out of business.

[-] solomon42069@lemmy.world 15 points 3 months ago* (last edited 3 months ago)

My biggest gripe with big tech is how the governments of the world encourage its worst behaviours. Governments and businesses have failed to maintain their own level of expertise and understanding of technology.

Today everything relies on tech, but all the solutions are outsourced and rely on "guidance" and free handouts from vendors like Microsoft. This has led to situations where billions are poured into digital transformation efforts with fuck-all to show for it but administrative headaches, ballooning costs, and security breaches.

I'm so tired of Silicon Valley frat boys being the leaders of our industry. We need to go back to an engineer- and ideas-led industry, focused on solving problems and making lives better, not building bullshit, unsustainable business monopolies atop a huge pile of money. Right now big tech is the embodiment of all of capitalism's worst qualities.

P.S. Apologies if my comment is a bit simplistic and vague; I didn't want to write a 10-page rant but still wanted to say my 2c about the state of things.

[-] lvxferre@mander.xyz 12 points 3 months ago* (last edited 3 months ago)

It's interesting how interconnected those points are.

Generative A"I" drives GPU prices up. NVidia now cares more about it than about graphics. AMD feels no pressure to improve GPUs.

Stagnant hardware means that game studios, which used to rely on assumptions like "our game currently runs like shit, but future hardware will handle it," get wrecked. And gen A"I" hits them directly, due to FOMO plus corporates buying trends without understanding how the underlying tech works, wasting talent by firing people in the hope that A"I" can replace them.

Large game companies are also suffering due to their investment in the mobile market. A good example is Ishihara; sure, Nintendo simply ignored his views on phones replacing consoles, but how many game company CEOs thought the same and rolled with it?

I'm predicting that everything will go down once it becomes common knowledge that LLMs and diffusion models are 20% actual usage, 80% bubble.

[-] bad_news@lemmy.billiam.net 5 points 3 months ago

The backlash to this is going to be fun. Having lived through the .com boom/bust (which wasn't a scam; the web actually was the future, and undersold if anything), no one with the stink of computer on them outside of a tiny elite could get decent full-time work for like 5 years. AI is a scam, full stop. It has virtually no non-fraud real world applications that don't reflect the underlying uselessness of the activity it can do. People are going to go full Butlerian Jihad from Dune when this blows up the economy, and it's going to suck so much more for everyone in tech, scammer or no...

[-] lvxferre@mander.xyz 6 points 3 months ago

The backlash to this is going to be fun.

In some cases it's already happening, since the bubble forces AI-invested corporations to shove it in everywhere. Cue Microsoft Recall, and the outrage against it.

It has virtually no non-fraud real world applications that don’t reflect the underlying uselessness of the activity it can do.

It is not completely useless, but it's oversold as fuck. It's like selling you a bicycle with the claim that you can ride it to the Moon, plus a "trust me (i.e. be gullible), eventually bikes will reach Mars!" A bike is still useful, even if they're building a scam around it.

Here are three practical examples:

  1. I use ChatGPT as a translation aid, mostly to list potential translations for a specific word or as a conjugation/declension table, and also as a second layer of spell-proofing. I can't use it to translate full texts without it shitting its own virtual pants: it inserts extraneous info, repeats sentences, removes key details from the text, butchers the tone, etc.
  2. I was looking for papers on a very specific topic, and got a huge pile (~150) of them. Too much text to read on my own. So I used the titles to pre-select a few into a "must check" pile, then asked Gemini to provide three-paragraph summaries of the rest. A few of them were useful; without Gemini I'd probably have missed them.
  3. [Note: reported use.] I've seen programmers claiming that they do something similar to #1, with code instead: basically asking Copilot how a function works, or to write extremely simple code (if you ask it to generate complex code, it starts lying/assuming/making up non-existent libraries).

None of those activities is inherently useless, but they have some common ground: they don't require you to trust the output of the bot at all. It's either something you wouldn't do otherwise (#2) or something where you can reliably say "yup, that's bullshit" (#1, #3).

[-] tee9000@lemmy.world 12 points 3 months ago* (last edited 3 months ago)

I really, truly suggest diversifying to news feeds without comment sections, like Techmeme, for a bit.

Increasing complexity is overwhelming, and there's plenty of bad shit going on, but a lot is overblown in your post.

Sorry for the long edit: I personally felt an improvement in my mental health when I did this for 6 months or so. Because seriously, whatever disinformation is happening in American news is so exhausting. We need to think whatever we want and then engage with each other once our thoughts are more individualized. Don't be afraid to ask questions that might seem like you are questioning some holy established Lemmy/Reddit consensus. If you are being honest about your opinions and aren't afraid to look dumb, then you are doing the internet a HUGE service. We need more dumb questions and vulnerability to combat the obsession with appearing as an expert. So thank you for making a genuine post of concern.

[-] Dead_or_Alive@lemmy.world 12 points 3 months ago

The pace of technological change and innovation was always going to slow down this decade, but Covid, Ukraine, and the decoupling from Russia/China have slowed it further.

You need three things in abundance to create tech. First, an advanced economy, which rules out most of the world. Second, lots of capital to burn while you make said advances. Finally, lots of twenty- and thirty-somethings who will invent and develop the tech.

For the last 20 years we've had all of those conditions in the Western world. Boomers were at the height of their earning potential, and their kids were leaving home in droves, letting them pour money into investments. Low interest rates abounded because capital was looking for places to be utilized. China was the workshop of the world, building low- to mid-range stuff, allowing the West to focus its excess Millennial-age workforce on value-added and tech work.

Now in the USA, boomers are retiring and there aren't enough Gen X to make up the difference. Millennials are finally getting to household creation, with their oldest cohorts (Xennials) just now entering their mid-40s and starting to move up in their careers, but they probably still have kids to support. So it will be some time before capital becomes plentiful again. Gen Z is large, but they aren't enough to backfill the loss of Millennials.

Oh, and I should highlight that this is a US demographic phenomenon. Europe and Japan do not have large Millennial or Gen Z populations to replace their aging boomers. We have no modern economic model to map out what will happen to them.

China is going through a demographic collapse worse than what you see in Europe or Japan, only they aren't rich enough to compensate. Add in the fact that they decided to antagonize their largest trading partners in the West, causing the decoupling we are now seeing.

The loss of their labor means the West has to reshore or find alternative low-wage markets for production, and expend a lot of capital to build out the plants in those markets to do so.

Add the geopolitical instability of Ukraine on top, and you have a recipe for slower tech growth.

[-] tibi@lemmy.world 10 points 3 months ago

Also, the movie industry is struggling for many reasons. Movies are getting too expensive, the safe formulas big studios relied on aren't working anymore, and customer habits are changing, with people going to movie theaters less often.

At the same time, just like with video games, the indie world is in a golden age. You can get amazing cameras and equipment on quite a small budget. What free software like Blender can achieve is amazing. And learning is easier than ever; there are so many free educational resources online.

[-] Jolteon@lemmy.zip 9 points 3 months ago

I agree with you on the GPU hardware and AI bubbles, but I'm not sure I would consider VR/AR to be a bubble right now. The hype has mostly died down by now, and I think it's stabilized to the point where it will remain until we have new advances in hardware.

[-] Wooki@lemmy.world 7 points 3 months ago* (last edited 3 months ago)

I had a similar feeling. The trapped part of Windows and the enshittification eroding privacy and ownership tipped me. I stepped back to Linux and wow! It's so much better now. A friend nagged me into Nix, and I have been so impressed at just how much better Linux is now compared to Windows. It used to be the polar opposite, with Windows being easy and shiny; now so many desktop environments are just so far ahead of Windows, it's really impressive. And that's before I even consider how novel Nix is. As my friend would nag: I use Nix, btw ❤️

[-] Resol@lemmy.world 7 points 3 months ago

Guys, I'm actually getting nostalgic for the messy-but-still-kinda-fun 2010s. Everything was just so much more exciting back then, and even if something was absolute garbage, it was still fun to make fun of it (cough cough, 2013 Mac Pro: garbage, quite literally).

Yeah, it was no "sunshine and lollipops" timeline, but compared to the literal boring hell of the 2020s, it was LEAGUES better.

[-] asdfasdfasdf@lemmy.world 6 points 3 months ago

I agree. But also add in the movie industry that's been complete trash for a while now. Not to mention books. I'm not sure if we'll ever see another Harry Potter level book again, at least in our lifetimes.

My take is we've already left the golden ages of movies, music, and books and probably won't get another for an extremely long time.

Video games are going through the same downfall that streaming services brought to movies. Physical media left the movie scene as a standard a while ago, but video games took longer. Now it's going to be all streaming and subscriptions, where you can never own anything.

Once that happens, enshittification will peak, companies won't be incentivized to make games good anymore, standards will tank, and people will forget how good things once were.

[-] hedgehog@ttrpg.network 12 points 3 months ago

> Not to mention books. I'm not sure if we'll ever see another Harry Potter level book again, at least in our lifetimes.

Are you talking quality or popularity? Because there are many, many books that are just as good or better than Harry Potter.


> movie industry that’s been complete trash for a while now.

This is not a callout of you in particular so don't get offended, but that's really only true if you look at the trash coming out of Hollywood.

There's some spectacularly good shit coming out of like France and South Korea (depending on what genres you're a fan of, anyways), as well as like, everywhere else.

Shitty movies that are just shitty sequels to something that wasn't very good (or yet another fucking Marvel movie) are a self-inflicted wound, and not really a sign that you can't possibly do better.

[-] werefreeatlast@lemmy.world 6 points 3 months ago

We are sorry. So sorry indeed, man! We are sorry that because of a pandemic, many people in the industry had to move to safe locations and realized how much better those places were, so they're not going back. We're sorry to have inconvenienced your gameplay. But we're working hard to get you to pay another salary's worth on the next Tomb Raider! We promised so many more transistors that the boob wobble will be endless! Through AI, anything is possible!

[-] szczuroarturo@programming.dev 6 points 3 months ago

OK. So first of all, while NVIDIA is absolutely a scummy company, the reason they are able to be this scummy is that they do generally deliver unreasonable performance improvements (at an unreasonable price, though), and this time is unlikely to be any different; the 50xx series is expected to be monstrous as far as performance goes. So far they haven't made the same mistake Intel did with CPUs.

Second: you can't collapse something that hasn't risen. Virtual reality never gained enough traction for it to collapse. I personally blame PlayStation for this. If there is anyone that could make a difference, it would be them.

Third: if that's true, that's actually fucked up. Although, to be fair, OpenAI is a very strange company, and a very closed one for something supposedly called "OpenAI". Also, I don't think going from non-profit to for-profit changes much, since it requires a thing they don't have: profit.

this post was submitted on 15 Sep 2024
263 points (89.7% liked)

Technology
