138 points · submitted 26 Dec 2024 by ampersandrew@lemmy.world to c/games@lemmy.world
[-] WereCat@lemmy.world 9 points 2 hours ago

Sound design > Graphics

[-] rimjob_rainer@discuss.tchncs.de 7 points 3 hours ago

My favourite games don't look nearly as good as they do in my memory. Graphics don't matter; they might even hurt, because they leave less to the imagination.

[-] UnderpantsWeevil@lemmy.world 1 points 58 seconds ago

I'd say it's less about imagination than gameplay. I'm reminded of old action figures. Some of them were articulated at the knees, elbows, feet, wrists, and head. Very posable, but you could see all the joints. Then you had the bigger and more detailed figures, but they were barely more than statues. Looked great but you couldn't really do anything with them.

And then you had themed Lego sets. Only a vague passing resemblance to the IP, but your imagination is the limit on what you do with them.

[-] elucubra@sopuli.xyz 3 points 2 hours ago* (last edited 2 hours ago)

Gifted my kids, both of them already young adults, one of those retro gaming sticks. An absolute bang-for-buck wonder, full of retro emulators and ROMs. Christmas Day at grandma's was a retro fest, with even grandma playing: Pac-Man, Frogger, Space Invaders, Galaga, Donkey Kong, early console games… Retro gaming has amazing games, where gameplay and concepts had to make do with limited resources.

My son has a Steam deck, but he had a blast with the rest.

[-] echodot@feddit.uk 12 points 5 hours ago

The game of the year was a cutesy cartoon game about a robot. I don't think there's a problem here.

[-] Neon@lemmy.world -1 points 5 hours ago
[-] echodot@feddit.uk 5 points 4 hours ago

Yeah, I did read the article. That's why I know what it's about: he's complaining that graphical fidelity in games isn't paying off financially. AAA studios clearly aren't all having this issue because, like I said, the winner of the Game Awards this year was a cartoony game, so they're well aware that graphics aren't everything.

[-] elucubra@sopuli.xyz 2 points 3 hours ago

Didn’t he tell you to read the article??

[-] echodot@feddit.uk 8 points 5 hours ago

Is there a way to actually read the article without being exposed to whatever drug-fueled hellscape that website is?

[-] brown567@sh.itjust.works 3 points 2 hours ago* (last edited 2 hours ago)

I use Firefox's "reader mode"

Edit: NYT managed to enshittify even that. Will wonders never cease.

[-] drasglaf@sh.itjust.works 11 points 5 hours ago

One way to understand the video game industry’s current crisis is by looking closely at Spider-Man’s spandex.

For decades, companies like Sony and Microsoft have bet that realistic graphics were the key to attracting bigger audiences. By investing in technology, they have elevated flat pixelated worlds into experiences that often feel like stepping into a movie.

Designers of last year’s Marvel’s Spider-Man 2 used the processing power of the PlayStation 5 so Peter Parker’s outfits would be rendered with realistic textures and skyscraper windows could reflect rays of sunlight.

That level of detail did not come cheap.

Insomniac Games, which is owned by Sony, spent about $300 million to develop Spider-Man 2, according to leaked documents, more than triple the budget of the first game in the series, which was released five years earlier. Chasing Hollywood realism requires Hollywood budgets, and even though Spider-Man 2 sold more than 11 million copies, several members of Insomniac lost their jobs when Sony announced 900 layoffs in February.

Cinematic games are getting so expensive and time-consuming to make that the video game industry has started to acknowledge that investing in graphics is providing diminished financial returns.

“It’s very clear that high-fidelity visuals are only moving the needle for a vocal class of gamers in their 40s and 50s,” said Jacob Navok, a former executive at Square Enix who left that studio, known for the Final Fantasy series, in 2016 to start his own media company. “But what does my 7-year-old son play? Minecraft. Roblox. Fortnite.”

Joost van Dreunen, a market analyst and professor at New York University, said it was clear what younger generations value in their video games: “Playing is an excuse for hanging out with other people.”

When millions are happy to play old games with outdated graphics — including Roblox (2006), Minecraft (2009) and Fortnite (2017) — it creates challenges for studios that make blockbuster single-player titles. The industry’s audience has slightly shrunk for the first time in decades. Studios are rapidly closing and sweeping layoffs have affected more than 20,000 employees in the past two years, including more than 2,500 Microsoft workers.

Many video game developers built their careers during an era that glorified graphical fidelity. They marveled at a scene from The Last of Us: Part II in which Ellie, the protagonist, pulls a shirt over her head to reveal bruises and scrapes on her back without any technical glitches.

But a few years later, costly graphical upgrades are often barely noticeable.

When the studio Naughty Dog released a remastered version of The Last of Us: Part II this year, light bounced off lakes and puddles with a more realistic shimmer. In a November ad for the PlayStation 5 Pro, an enhanced version of the Sony console that retails for almost $700, the billboards in Spider-Man 2’s Manhattan featured crisper letters.

Optimizing cinematic games for a narrow group of consumers who have spent hundreds of dollars on a console or computer may no longer make financial sense. Studios are increasingly prioritizing games with basic graphics that can be played on the smartphones already in everyone’s pocket.

“They essentially run on toasters,” said Matthew Ball, an entrepreneur and video game analyst, talking about games like Roblox and League of Legends. “The developers aren’t chasing graphics but the social connections that players have built over time.”

Going Hollywood

Developers had long taught players to equate realism with excellence, but this new toaster generation of gamers is upsetting industry orthodoxies. The developer behind Animal Well, which received extensive praise this year, said the game’s file size was smaller than many of the screenshots used to promote it.

A company like Nintendo was once the exception that proved the rule, telling its audiences over the past 40 years that graphics were not a priority.

That strategy had shown weaknesses through the 1990s and 2000s, when the Nintendo 64 and GameCube had weaker visuals and sold fewer copies than Sony consoles. But now the tables have turned. Industry figures joke about how a cartoony game like Luigi’s Mansion 3 on the Nintendo Switch considerably outsells gorgeous cinematic narratives on the PlayStation 5 like Final Fantasy VII Rebirth.

There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases. Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.

Another theory is that major studios have spent recent years reshaping themselves in Hollywood’s image, pursuing crossover deals that have given audiences “The Super Mario Bros. Movie” and “The Last of Us” on HBO. Not only have companies like Ubisoft opened divisions to produce films, but their games include an astonishing number of scenes where players simply watch the story unfold.

In 2007, the first Assassin’s Creed provided more than 2.5 hours of footage for a fan edit of the game’s narrative. As the series progressed, so did Ubisoft’s taste for cinema. Like many studios, it increasingly leaned on motion-capture animators who could create scenes using human actors on soundstages. A fan edit of Assassin’s Creed: Valhalla, which was released in 2020, lasted about 23 hours — longer than two seasons of “Game of Thrones.”

Gamers and journalists began talking about how the franchise’s entries had gotten too bloated and expensive. Ubisoft developers advertised last year’s Assassin’s Creed Mirage, which had about five hours of cut scenes, as “more intimate.”

The immersive graphics of virtual reality can also be prohibitive for gamers; the Meta Quest Pro sells for $1,000 and the Apple Vision Pro for $3,500. This year, the chief executive of Ubisoft, Yves Guillemot, told the company’s investors that because the virtual reality version of Assassin’s Creed did not meet sales expectations, the company was not increasing its investment in the technology.

[Image: A person plays a video game on a tablet. Live service games that are playable on mobile devices, like Genshin Impact, can generate large amounts of revenue. Credit: Ina Fassbender/Agence France-Presse — Getty Images]

Many studios have instead turned to the live service model, where graphics are less important than a regular drip of new content that keeps players engaged. Genshin Impact, by the studio Hoyoverse, makes roughly $2 billion every year on mobile platforms alone, according to the data tracker Sensor Tower.

Going Broke?

It was clear this year, however, that the live service strategy carries its own risks. Warner Bros. Discovery took a $200 million loss on Suicide Squad: Kill the Justice League, according to Bloomberg. Sony closed the studio behind Concord, its attempt to compete with team-based shooters like Overwatch and Apex Legends, one month after the game released to a minuscule player base.

“We have a market that has been in growth mode for decades,” Ball said. “Now we are in a mature market where instead of making bets on growth, companies need to try and steal shares from each other.”

Some industry professionals believe there is a path for superb-looking games to survive the cost crunch.

“I used to be a high-fidelity guy; I would log into games and if it didn’t look hyperrealistic, then it was not so interesting,” said David Reitman, a managing director at PricewaterhouseCoopers, where he leads the consulting firm’s games division. “There was a race to hyperrealism, and it’s tough to pivot away. You have set expectations.”

Reitman sees a future where most of the heavy costs associated with cutting-edge graphics are handled by artificial intelligence. He said that manufacturers were working on creating A.I. chips for consoles that would facilitate those changes, and that some game studios were already using smart algorithms to improve graphics further than anything previously seen.

He expects that sports games will be the first genre to see considerable improvements because developers have access to hundreds of hours of game footage. “They can take feeds from leagues and transpose them into graphical renderings,” Reitman said, “leveraging language models to generate the incremental movements and facial expressions of players.”

Some independent developers are less convinced. “The idea that there will be content from A.I. before we figure out how it works and where it will source data from is really hard,” said Rami Ismail, a game developer in the Netherlands.

Ismail is worried that major studios are in a tight spot where traditional games have become too expensive but live service games have become too risky. He pointed to recent games that had both jaw-dropping realism — Avatar: Frontiers of Pandora (individual pebbles of gravel cast shadows) and Senua’s Saga: Hellblade II (rays of sunlight flicker through the trees) — and lackluster sales.

He recalled a question that emerged early in the coronavirus pandemic and has become something of an unofficial motto in the video game industry.

“How can we as an industry make shorter games, with worse graphics, made by people who are paid more to work less?”

[-] Nexy@lemmy.sdf.org 15 points 8 hours ago

Unpopular opinion, but I'd rather a game's graphics be absolute trash and the OST be awesome. I can easily forget how many individual hairs are on a 3D model, but a good OST will live in my mind and heart forever.

And of course gameplay comes first.

[-] elucubra@sopuli.xyz 4 points 3 hours ago

The Wii was a fantastic example of this. Less capable hardware used in very imaginative ways, with the capacity to bring older people into games.

[-] Sanctus@lemmy.world 19 points 10 hours ago

This is my current addiction. No need graphix.

[-] noxypaws@pawb.social 4 points 8 hours ago
[-] SkyeStarfall@lemmy.blahaj.zone 4 points 8 hours ago

Your thirst is mine, my water is yours

[-] noxypaws@pawb.social 1 points 7 hours ago

HA! Now you have to come adventure with me if I can afford the rep loss!

I hope you like hauling bags of warm static~

[-] SkyeStarfall@lemmy.blahaj.zone 1 points 7 hours ago
[-] noxypaws@pawb.social 1 points 7 hours ago

Seriously though, AWESOME game. I must have 500+ hours in it at this point.

[-] random_character_a@lemmy.world 1 points 4 hours ago* (last edited 4 hours ago)

Tried it about 10+ times, but I suck at it too much.

[-] Bad_oatmeal@lemmy.world 2 points 9 hours ago
[-] gregdaynes@lemmy.ca 5 points 8 hours ago

Caves of Qud

[-] p03locke@lemmy.dbzer0.com 22 points 11 hours ago* (last edited 11 hours ago)

This author has no fucking clue that the indie gaming industry exists.

[Balatro screenshot]

Like Balatro... you know, the fucking Indie Game of the Year, that was also nominated for Best Game of the Year at the Game Awards.

Localthunk was able to build this in Lua... WITH A BOX OF SCRAPS!

[-] xavier666@lemm.ee 1 points 3 hours ago

I'm sorry sir, but I'm not an indie dev. I need to show the investors that my game will earn $100 million otherwise it's a failure.

[-] ampersandrew@lemmy.world 13 points 11 hours ago

This article wasn't about indie games.

[-] anakin78z@lemmy.world 12 points 13 hours ago

I just played Dragon Age: The Veilguard, and I'm now playing Dragon Age: Origins, which was released 15 years ago. The difference in graphics and animation is startling, and it has a big effect on my enjoyment of the game. Origins is considered by many to be the best in the series, and I can see that they poured a ton into story options and such. But it doesn't feel nearly as good to play as Veilguard.

Amazing graphics might not make or break a game, but the minimum level of what's acceptable is always rising. Couple that with higher resolutions and other hardware advances, and art budgets are going to keep going up.

[-] squid_slime@lemm.ee 3 points 9 hours ago* (last edited 9 hours ago)

GSC, in my opinion, ruined Stalker 2 in the chase for "next gen" graphics. And modern graphics are now so dependent on upscaling and frame gen. Sad to see, but trailers sell.

[-] DarkThoughts@fedia.io 19 points 15 hours ago

It's not that I don't like realistic graphics. But I'm not gonna pay 100 bucks per game, plus microtransactions and/or live service shenanigans, to get them. Nowadays it's not even that hard to make good-looking games, thanks to all the work that went into modern engines. Obviously cutting-edge graphics still need talented artists to create all the textures and high-poly models, but at some point the fidelity gained becomes minuscule compared to the effort put in (and the performance it eats, since this bleeds into the absurd GPU topic too).

There are also plenty of creative stylization options to explore beyond the typical WoW cartoon look that everyone goes for nowadays. Hell, I still love pixel art games too, and they're often considered the bottom end of graphical quality (which I'd strongly disagree with, but that's another topic).

What gamers want are good games that don't feel like they're constantly being milked or that prioritize graphics over gameplay or story.

[-] caut_R@lemmy.world 22 points 17 hours ago

And I don't think games have to look that good either… I'm currently playing MGSV, and that game is 8 years old, runs at 60 fps on the Deck, and looks amazing. It feels like hundreds of millions are being burned on diminishing returns nowadays…

[-] SupraMario@lemmy.world 14 points 13 hours ago

It's bullshit accounting: they're not spending it on the devs or the games, they're spending it on advertising and C-level paydays. There are a ton of really good-looking games that had what would be considered shoestring budgets, but the companies bitching about costs aren't actually in it for the games anymore; it's just for the money.

[-] SnotFlickerman@lemmy.blahaj.zone 77 points 21 hours ago* (last edited 20 hours ago)

There are a number of theories why gamers have turned their backs on realism. One hypothesis is that players got tired of seeing the same artistic style in major releases.

Whoosh.

We learned all the way back in the Team Fortress 2 and Psychonauts days that hyper-realistic graphics will always age poorly, whereas stylized art ages well. (Psychonauts aged so well that its sequel, 16 years later, kept and refined the style, which went from a workaround for hardware limitations to straight-up muppets.)

There's a reason Overwatch followed the stylized art path that TF2 had already trodden: the art style will age well as technology progresses.

Anyway, I thought this phenomenon was well known. Working within the limitations of the technology you have available can push you toward brilliant design. It's like when Twitter first appeared: I had comedy-writing friends who used the 140-character limit as a tool for writing tighter jokes.

Working within your limitations can actually make your art better, which just complements the fact that stylized art lasts longer before it looks ugly.

Others speculate that cinematic graphics require so much time and money to develop that gameplay suffers, leaving customers with a hollow experience.

Also, as others have pointed out, it's capitalism and the desire for endless shareholder value increase year after year.

Cyberpunk 2077 is a perfect example. A technical achievement that is stunningly beautiful, where they had to cut tons of planned content (like wall-running) because they simply couldn't get it working before investors demanded the game be put out. As people saw with Phantom Liberty, given enough time, Cyberpunk 2077 could have been a masterpiece on release, but the investors simply didn't give CD Projekt Red enough time before they cut the purse strings and said "we want our money back... now." It's a choice to release too early.

...but on the other hand it's also a choice to release too late after languishing in development hell a la Duke Nukem Forever.

[-] sp3tr4l@lemmy.zip 11 points 14 hours ago

Just wanna throw Wind Waker into the examples of highly stylized games that aged great.

[-] Ashtear@lemm.ee 15 points 16 hours ago

Unfortunately, Cyberpunk is exactly the kind of product that is going to keep driving the realistic approach. It's four years later now and the game's visuals are still state-of-the-art in many areas. Even after earning as much backlash on release as any game in recent memory, it was a massively profitable project in the end.

This is why Sony, Microsoft, and the big third parties like Ubisoft keep taking shots in this realm.

[-] scrubbles@poptalk.scrubbles.tech 41 points 20 hours ago

How hard is it for them to realize this? Graphics are a nice-to-have. They're great, but they do not hold up an entire game. Star Wars Outlaws looked great, but the story was boring. If they gave writers just a fraction of the money they spend on realism, and then let the writers do their job freely without getting in their way, they could make some truly great games.

[-] SnotFlickerman@lemmy.blahaj.zone 39 points 18 hours ago

Look, I'm gonna be real with you, the pool of writers who are exceptionally good at specifically writing for games is really damn small.

Everyone is trained on novels and movies, and so many games try to hamfist in a three-act arc because they haven't figured out that this is an entirely different medium and needs its own set of rules for how art plays out.

Traditional filmmaking ideas include stuff like the direction a character moves across the screen affecting what the scene "means." That's basically impossible to cultivate in, say, a first- or third-person game where you can't be sure which direction characters will be seen moving. Thus, games need their own narrative rules.

I think the first person to really crack those rules was Yoko Taro, that guy knows how to write for a game specifically.

[-] Kolanaki@yiffit.net 11 points 17 hours ago* (last edited 17 hours ago)

You know the budget is spent almost entirely on the art when you actually pay attention to the credits and you see names for like 250 artists, but only 3-5 programmers.
