When I was a youngling, graphics improved quickly, and that made older games look worse every year. What initially looked amazing and “life-like” turned dated in just a couple of years. However, it seems graphics have plateaued in the last ten-ish years.
I would say that we reached the point of diminishing returns around next-gen GTA V / MGS V. Not that you can’t see the difference between those and modern games, but it’s nowhere near the leap from San Andreas to GTA IV.
I recently started playing MGS V again and I’m shocked at how good it still looks; compare it to RDR 2 and you would probably need a side-by-side comparison to see the extra detail.
We have finally reached a point where old games don’t get less immersive because our baseline for visual fidelity keeps increasing, or if they do, it happens a lot more slowly.
I think it’s better when games don’t look realistic.
From a game design perspective, realistic graphics are almost never the right way to go.
Players need big, bold, exaggerated, unambiguous visual cues.
Yeah, we need to see a big red glowing guy to know he’s the bad dude. We need gigantic, oversized, conspicuously obvious doors and portals. We need obvious paths stamped into the ground so we know where we’re allowed to go and what’s worth exploring. We need the special thing we’re supposed to pick up to have a giant glowing pink halo. The bad guys are all supposed to wear the same obvious purple uniform.
Realistic graphics are the crutch of the clueless game designer.
I think the most interesting example is Red Dead Redemption 2. I don’t like its gameplay, but…
It still looks better than most recent AAA games, mainly due to art direction and attention to detail.
But I disagree that graphical improvements made older games look worse. Most PS2 games looked bad from the beginning because they used mostly shades of grey and brown trying to look realistic.
Games that were designed with the graphical limitations in mind still look great today, for example The Wind Waker and Kingdom Hearts.
I agree, though I feel like we’re still pushing fidelity way too high on modern titles, and performance kind of suffers for it.
The last breakthrough that happened recently, and that I still don’t see very often, is Unreal Engine 5’s cloth and hair physics that prevent interpenetration. Not strictly graphics, but it improves them a lot.
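To make the interpenetration bit concrete: the simplest form of that constraint is just pushing any cloth particle that ends up inside a collider back out to its surface after each simulation step. A toy Python sketch of that idea (this is not UE5’s actual solver, which is far more involved; the function name and shapes here are illustrative assumptions):

```python
import numpy as np

# Toy interpenetration fix: after a simulation step, project any cloth
# particle that ended up inside a sphere collider back onto its surface.
def resolve_sphere_collisions(particles: np.ndarray,
                              center: np.ndarray,
                              radius: float) -> np.ndarray:
    """particles: (N, 3) positions; returns corrected positions."""
    offsets = particles - center
    dists = np.linalg.norm(offsets, axis=1, keepdims=True)
    inside = dists < radius
    # Place offending particles exactly on the sphere surface.
    on_surface = center + offsets / np.maximum(dists, 1e-9) * radius
    return np.where(inside, on_surface, particles)
```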
Yeah, we pretty much reached the plateau, since graphics card companies now have to resort to frame generation as the next gimmick. There is just nothing more to gain in graphical fidelity.
That is completely untrue. We are in a plateau, but it’s definitely not the final one. Frame gen is gimmicky for sure, but hardware has a long history of gimmicks, so it’s not a sign of the end or anything - and the related technology of “AI” upscaling is already very good.
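To illustrate the frame gen idea at its crudest: synthesize an in-between frame from two real ones. A toy Python sketch (real implementations, like motion-vector-based frame generation, are far more sophisticated; plain blending here is just to show the principle):

```python
import numpy as np

# Crudest possible "frame generation": linearly blend two rendered frames
# to synthesize one in between. The in-between frame is inferred, not rendered.
def interpolate_frame(prev_frame: np.ndarray, next_frame: np.ndarray,
                      t: float = 0.5) -> np.ndarray:
    """Blend two (H, W, 3) frames; t=0.5 gives the midpoint frame."""
    blended = ((1 - t) * prev_frame.astype(np.float32)
               + t * next_frame.astype(np.float32))
    return blended.astype(prev_frame.dtype)

# Note: the synthesized frame can only be shown *after* next_frame exists,
# which is why frame generation smooths motion but cannot cut input latency.
```

That last point is most of why it reads as a gimmick: higher frame counts without lower input latency.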
And as much as I don’t like it, if graphics are rendered on a supercomputer and streamed to users, the quality can skyrocket - that is, assuming they can afford it.
Have you compared the quality of streaming services against (physical) media you own? Yeah, it will never skyrocket; too many moving parts.
The difference is massive right now, but it’s not like internet technologies have stagnated. Speeds keep going up and latency keeps going down.
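For a rough sense of those moving parts, here’s a back-of-envelope latency budget for streamed rendering. Every number below is an assumption for illustration, not a measurement:

```python
# Back-of-envelope input-to-photon latency for streamed rendering at 60 fps.
frame_budget_ms = 1000 / 60      # ~16.7 ms between displayed frames

render_ms = 8.0        # server-side render time (assumed)
encode_ms = 4.0        # video encode (assumed)
network_rtt_ms = 20.0  # round trip to the data center (assumed)
decode_ms = 3.0        # client decode + display (assumed)

input_to_photon_ms = render_ms + encode_ms + network_rtt_ms + decode_ms
print(f"Streamed: {input_to_photon_ms:.1f} ms vs. "
      f"local frame budget: {frame_budget_ms:.1f} ms")
# Streamed: 35.0 ms vs. local frame budget: 16.7 ms
```

Even with generous assumptions, the network leg alone can eat more than a whole local frame budget, which is the crux of the disagreement above.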
Don’t think this is an unpopular opinion. I can happily play almost any game made in the last 15 years and be pretty happy, and if the OG game was pretty plain, there’s always modders (like, Morrowind / System Shock 2 or Freespace mods make a huge difference for that age of game). I was just watching GDQ today and was surprised by how well some games over a decade old just look… perfectly fine. I find myself turning RTX on and off a bit to figure out what difference it’s actually making - I believe RTX is a bigger win for dev workload than the quality increase for gamers. (Don’t get me wrong, full glass/water/puddle reflections are incredible, but it’s a lot of $$$ for the visual jump.)
I think the unpopular opinion here is specifically that the graphics plateau is a good thing, though I do agree with you that OP’s justification for it isn’t the best.
I think graphics are better and worse. Higher resolutions and framerates, and higher-resolution textures and enhancements. But also grainy, flickery, and fuzzy on the edges and on certain details like hair, depending on the game and the tech it’s using. For example, Sniper Elite 5 has beautiful levels, but I’ve never been able to get the AA looking decent. Games like Far Cry 6 and Starfield are much clearer.
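For context on the edge shimmer: the gold-standard baseline for AA is supersampling (render big, average down), and temporal techniques like TAA approximate it across frames, which is exactly where the flicker and ghosting tend to come from. A toy sketch of the baseline (random array as a stand-in for a rendered frame; purely illustrative):

```python
import numpy as np

# Supersampling AA baseline: render at k x the target resolution, then
# average each k x k block of pixels down to one output pixel.
def downsample(image: np.ndarray, k: int = 2) -> np.ndarray:
    """Average k x k blocks of an (H*k, W*k, 3) image into (H, W, 3)."""
    h, w, c = image.shape
    return image.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))

hi_res = np.random.rand(8, 8, 3)    # stand-in for a 2x-supersampled frame
aa_frame = downsample(hi_res, k=2)  # (4, 4, 3) anti-aliased result
```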
I think it’s possible to push the boundaries a bit more, but the cost of creating a full world of detailed textures is just too high.
Our own universe isn’t locally real, meaning it renders in real time.
(I.e., on a universal scale: if a tree falls in the universe and no one is there to see it… did it even fall? Quantum physics literally says no. It’s just down when it’s observed. It never fell. It was up when it was last observed, and it is down now.)
So yeah, I mean we’ll get there, but it will all just be local rendering. Only way in the known universe to do it 🤓