this post was submitted on 02 Jun 2025
73 points (91.0% liked)

Upscaling and Frame Generation are disasters meant to conceal GPU makers' unfulfilled promises about 4K gaming, and to cover up the otherwise horrible performance of some modern games, even at 1080p/1440p resolutions.

Upscaling will never, no matter how much AI and overhead you throw at it, create an image that is as good as the same scene rendered at native res.

Frame Generation is a joke, and I am absolutely gobsmacked that people even take it seriously. It is nothing but extra AI frames shoved into your gameplay, worsening latency, response times, and image quality, all so you can artificially inflate a number. Native 30 FPS gaming is, and will always be, infinitely better as an experience than AI frame-doubling that same 30 FPS to 60 FPS.

And because both of these technologies exist, game devs are pushing out poorly optimized to completely unoptimized games that run like absolute dogshit, requiring you to use upscaling and the like even at 1080p just to get reasonable frame rates on GPUs that should run them just fine if they were optimized better. And we know it's optimization, because some of these games do end up getting that optimization pass long after launch, and wouldn't you know it... 9 FPS suddenly becomes 60 FPS.

top 30 comments
[–] real_squids@sopuli.xyz 22 points 4 days ago* (last edited 4 days ago) (2 children)

One important thing - upscaling does help with low-spec/low-power gaming (especially on smaller screens). Obviously it's a double-edged sword (it promotes pushing games out quicker), but it has some really cool uses. Now forced TAA, on the other hand...

Both are tolerable, but only if they're not forced, and for some reason companies have a hard-on for forcing them. Kinda like how a 103° FOV limit somehow became a standard even in fast-paced competitive games.

[–] sp3ctr4l@lemmy.dbzer0.com 4 points 3 days ago* (last edited 3 days ago) (1 children)

Many... most?... games that can do DLSS/FSR also have TAA forced on by default. You can find out by digging through their config files and/or console commands; it's often the case that basically half of the game's graphical pipeline explodes or glitches out if you hack in and force it off.

Because the DLSS/FSR algos hook into the motion vectors that the game's TAA gives them.

TAA is very often the foundation layer that DLSS/FSR is built on top of.

A few games actually let you turn TAA off, but a good deal of them don't, because they never bothered to set up a standalone motion vector layer/component/pipeline that is separate from the AA part of TAA.

It continues to amaze me that people don't just generally know this.

DLSS/FSR are literally built off of TAA, and a lot of game devs are lazy, or are told by management to not bother, whichever.
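
To make "hooking into the motion vectors" concrete, here's a minimal sketch of the temporal accumulation step that TAA and temporal upscalers share (Python/NumPy purely for readability; the function names and blend factor are illustrative, not any vendor's actual implementation):

```python
import numpy as np

def reproject(history: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Fetch each pixel from where it was last frame, using the per-pixel
    motion vectors (dy, dx in pixels) produced by the engine's TAA pass."""
    h, w = history.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(ys - np.round(motion[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(xs - np.round(motion[..., 1]).astype(int), 0, w - 1)
    return history[src_y, src_x]

def temporal_accumulate(current: np.ndarray, history: np.ndarray,
                        motion: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """One accumulation step: mostly reprojected history, plus a little of
    the current (jittered, low-sample) frame."""
    return alpha * current + (1.0 - alpha) * reproject(history, motion)
```

Kill the motion vectors and the history buffer smears and ghosts, which is exactly why forcing TAA off tends to break DLSS/FSR too.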

[–] real_squids@sopuli.xyz 2 points 3 days ago* (last edited 3 days ago) (1 children)

Or I just check PCGW and get disappointed. Oh well, if I can at least change the frame weight it's usually bearable; otherwise I'll just not bother.

edit: or if the game lets me hack it off, otherwise I'd rather skip

edit2: Also, you can use FSR on pretty much any game now, regardless of AA, at least that's what it looks like from my driver. So forced upscaling usually equals forced TAA.

[–] sp3ctr4l@lemmy.dbzer0.com 2 points 3 days ago* (last edited 3 days ago)

You can run FSR natively on many games now, but they don't all support it in the same way, nor the same version, and it also varies by your hardware.

You can also hack various versions of FSR into various games via... well, the FrameGen addon on Decky, for the Steam Deck, is really a collection of a whole bunch of other hacks/mods/implementations for other games, and you can run a lot of those independently on Linux or Windows... but there are a lot of variables when you get this specific.

[–] A_Random_Idiot@lemmy.world 2 points 4 days ago* (last edited 4 days ago) (1 children)

I've had shit GPUs.

Playing at lower res and settings gives a better image and better performance than trying to upscale 540p to 1080p.

[–] real_squids@sopuli.xyz 5 points 3 days ago* (last edited 3 days ago) (1 children)

That's your preference tbh, same as AA for most people. Also, you can upscale from a higher res; it doesn't have to be 50%.
As long as it's not forced - more options are welcome in my opinion.

Personally I'm not a fan of anything other than NativeAA/DLAA, but it's a smaller issue for me than forced TAA or forced software RT.

[–] A_Random_Idiot@lemmy.world 2 points 3 days ago (2 children)

Except useless shit like this is increasing GPU prices.

So it's not just a matter of preference or opinion. It's an economic matter: they are throwing useless bullshit "features" into cards and jacking the prices up to cover for them.

[–] MotoAsh@lemmy.world 2 points 3 days ago

Nah, that's mostly the "AI" machine learning crap. That stuff is, at present, definitely garbage that does nothing but eat electricity and bloat Nvidia's stock price.

[–] real_squids@sopuli.xyz 1 points 3 days ago* (last edited 3 days ago)

People keep buying their cards, why would they not raise prices? If you look at the backlash to the 50-series launch and its "4090 performance" claims, upscaling and framegen can have the opposite effect on perception, yet people keep buying. Even in AMD's case the cards are at 50% over MSRP, yet they sell.

The biggest reason still is rampant consumerism. If people voted with their wallets it wouldn't be this bad imo.

I think if you can't optimise your games enough or deliver on graphics card specs, then faking it till you make it isn't a good band-aid for that. It feels like false advertising. Is this really an unpopular opinion? I thought most people disliked this trend.

[–] the_q@lemm.ee 10 points 4 days ago (1 children)

But how can Nvidia sell more GPUs if they don't invent a solution to a non-problem?

[–] 30p87@feddit.org 4 points 4 days ago (1 children)

AI

As always, the magical abbreviation

[–] the_q@lemm.ee 3 points 4 days ago
[–] esteemedtogami@lemmy.zip 6 points 4 days ago (1 children)

Absolutely true. I never bother to turn these options on if a game offers them because in the best case it doesn't do a whole lot and in the worst case it makes the game look awful. I'd rather just play with real frames even if it means playing at a lower frame rate.

[–] 30p87@feddit.org 4 points 4 days ago

I actively turn them off, if they're on. I actually notice when they're on.

[–] bizarroland@fedia.io 5 points 4 days ago (1 children)

Trust me bro, a few hundred more billion dollars worth of R&D spread out between 37 companies that keep taking turns to buy each other out in hopes of making a trillion dollars on some slop somebody vomited out of the dark recesses of their souls over the next 59 years and it will get better I promise trust me bro

[–] A_Random_Idiot@lemmy.world 6 points 4 days ago

I admire this level of jaded. <3

[–] ShadowRam@fedia.io 3 points 4 days ago (1 children)

Upscaling will never, no matter how much AI and overhead you throw at it, create an image that is as good as the same scene rendered at native res.

That's already been proven false back when DLSS 2.0 released.

[–] A_Random_Idiot@lemmy.world 2 points 4 days ago (2 children)

No it hasn't. You are just regurgitating Nvidia's marketing.

You can't stretch a picture and have it look just as good as natively rendering it at that higher resolution.

You cannot create something from nothing. No matter how much AI guesswork you put into filling in the gaps, it will never be as good as just rendering at the larger res. It will never look as good as the image did at its original resolution, before the AI stretching, either.

[–] moonlight@fedia.io 9 points 4 days ago (1 children)

It's using information from multiple frames, as well as motion vectors, so it's not just blind guesses.

And no, it's not as good as a 'ground truth' image, but that's not what it's competing against. FXAA and SMAA don't look great, and MSAA has a big performance penalty while still not eliminating aliasing. And I think DLSS quality looks pretty damn good. If you want something closer to perfect, there's DLAA, which is comparable to SSAA, without nuking your framerate. DLSS can match or exceed visual fidelity at every level, while offering much better performance.

Frame gen seems like much more of a mixed bag, but I think it's still good to have the option. I haven't tried it personally, but I could see it being nice in single player games to go from 60 -> 240 fps, even if there's some artifacting. I think latency would become an issue at lower framerates, but I don't really consider 30 fps to be playable anyway, at least for first person games.

And yes, it has been used to excuse poor optimization, but so have general hardware improvements. That's an entirely separate issue, and doesn't mean that upscaling is bad.

Also I think Nvidia is a pretty anti-consumer company, but that mostly has to do with business stuff like pricing. Their tech is quite good.

[–] sp3ctr4l@lemmy.dbzer0.com 3 points 3 days ago* (last edited 3 days ago)

Eh... The latest versions of DLSS and FSR are getting much better image quality in stills...

But they still don't match the image quality of actually rendering the same thing natively, at full resolution, which is what the quote you're disputing said.

Further, the cards that can run these latest upscaling techs to reach 4K 60fps or 4K 90fps in very demanding games, without (fake) frame gen?

It's not as bad with AMD, but they also don't yet offer as high-calibre a GPU as Nvidia's top-end stuff (though apparently 9080 XT rumors are starting to float around)...

But like, the pure wattage draw of a 5080 or 5090 is fucking insane. A 5090 draws up to 575 watts, on its own.

You can make a pretty high-powered 1440p system if you use the stupendously efficient (CPU performance per watt) 9745hx or 9745hx3d CPU + mobo combos that Minisforum makes... and the PSU for the entire system shouldn't need to exceed 650 watts.

...A 5090 alone draws nearly as much power as an entire system one resolution step down.

This, to me, is completely absurd.

Whether or not you find the power draw difference between an 'ultra 1440p' build and an 'ultra 4K' build ridiculous... the price difference between those PCs and monitors is somewhere between 2x and 3x, and hopefully we can agree that that in fact is ridiculous, and that 4K, high-fidelity gaming remains far out of reach for the vast majority of PC gamers.

EDIT:

Also, the vast majority of your comment is comparing native + some AA algo to... rendering at 75% to 95% and then upscaling.

For starters, again the original comment was not talking about native + some AA, but just native.

Upscaling introduces artefacts and inaccuracies: smudged textures, weird ghosting that resembles older, crappy motion blur techniques, loss of LOD-style detail on distant objects, and it sometimes gets confused between HUD elements and the 3D rendered scene and warps them together...

Just because intelligent temporal upscaling also produces something that sort of looks like, but isn't actually, AA... doesn't mean it avoids these other costs: it achieves this 'AA' in a relatively sloppy manner that also degrades other elements of the finished render.

It's a tradeoff: an end result at the same res that is worse, to some degree, but rendered faster, to some degree.

Again, the latest versions of intelligent upscalers are getting better at bringing the quality closer to a native render while maintaining a higher fps...

But functionally, what this is is an overall 'quality' slider that sits basically outside of, or on top of, all of a game's other, actual quality settings.

It is a smudge factor bandaid that covers up poor optimization within games.

And that poor optimization is, in almost all cases... real time ray tracing/path tracing of some kind.

A huge chunk of what has driven and enabled the development of higher-fidelity, high-frame-rate rendering in the last 10 or 15 years has been figuring out clever tricks and hacks in your game design, engine design, and rendering pipeline that make it so realtime lighting is only used where it absolutely needs to be used, in a very optimized way.
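
As a toy example of the kind of trick I mean (a minimal sketch with completely made-up names and thresholds, just to illustrate the idea): engines gate expensive realtime shadows behind a cheap per-frame heuristic and bake everything else offline.

```python
from dataclasses import dataclass
import math

@dataclass
class Light:
    position: tuple[float, float, float]
    radius: float
    is_static: bool  # static lights can be baked into lightmaps offline

def wants_realtime_shadows(light: Light, camera: tuple[float, float, float],
                           max_shadow_dist: float = 30.0) -> bool:
    """Cheap heuristic: only nearby, dynamic lights pay for a per-frame
    shadow map; everything else falls back to precomputed (baked) lighting."""
    if light.is_static:
        return False  # handled offline, costs ~nothing at runtime
    return math.dist(light.position, camera) - light.radius < max_shadow_dist
```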

Then, about 5 years ago, most AAA game devs/studios just stopped doing those optimizations and tricks, as a cost cutting measure in development... because 'now the hardware can optimize automagically!'

No, it cannot, not unless you think all PC gamers have a $5,000 rig.

A lot of this is tied to UE 5 being an increasingly popular, but also increasingly shit-optimized, engine.

[–] ShadowRam@fedia.io -3 points 4 days ago (2 children)

You are just regurgitating nvidia's marketing.

No, this is not only general consensus, but it's measurably better when comparing SNR.

You can personally hate it for any reason you want.

But it doesn't change the fact that AI up-scaling produces a more accurate result than native rendering.

[–] sp3ctr4l@lemmy.dbzer0.com 4 points 3 days ago

No upscaling algo produces a 'more accurate' image than native rendering, that's absolute nonsense.

It produces a (slightly to significantly) lower-quality image at the same res, (significantly to slightly) faster than native rendering, but never at 'better quality'.

The SNR of a native image is... 1, 100%, no loss.

The SNR of any upscaler is... some smaller number, there is always lost quality.
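
If you want to actually measure this rather than hand-wave about 'SNR': the standard metric is PSNR against the native render as ground truth. A quick sketch (Python/NumPy, assuming frames as float arrays in [0, 1]) - by definition the native frame scores infinity against itself, and any upscaler scores something finite:

```python
import numpy as np

def psnr(native: np.ndarray, upscaled: np.ndarray, peak: float = 1.0) -> float:
    """Peak signal-to-noise ratio of an upscaled frame, measured against the
    native-resolution render as ground truth. Higher means closer to native;
    the native frame itself scores infinity (zero error)."""
    mse = np.mean((native.astype(np.float64) - upscaled.astype(np.float64)) ** 2)
    return float("inf") if mse == 0.0 else 10.0 * np.log10(peak ** 2 / mse)
```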

If you mean to say that intelligent temporal upscaling produces only slightly lower quality images a good deal faster than native, enabling a higher fps... then yes, I don't think anyone disputes that...

But the cost, and the literal wattage power draw, of cards capable of doing that with modern high-fidelity AAA games at 4K... such GPUs are essentially as expensive on their own as an entire, well cost-performance-optimized 1440p PC.

The whole point of this tech, as originally marketed, was to enable high-quality, high-speed gaming at 4K by essentially branching the proverbial tech tree... and it hasn't worked: such GPUs are still massively expensive and only attainable for a tiny percentage of significantly wealthy people.

[–] gazter@aussie.zone 5 points 3 days ago (2 children)

I don't understand. This isn't really a subject I care much about, so forgive my ignorance.

Are you saying that an AI-generated frame would be closer to the actual rendered image than if the image were rendered natively? Isn't that an oxymoron? How can a guess at what the frame will be be more 'accurate' than what the frame would actually be?

[–] Krudler@lemmy.world 3 points 3 days ago

"more accurate" comment shows he's talking out his ass.

[–] sp3ctr4l@lemmy.dbzer0.com 2 points 3 days ago* (last edited 3 days ago)

They did in fact say that, and that is in fact nonsense, verifiable in many ways.

Perhaps they misspoke, perhaps they are misinformed but uh...

Yeah, it is fundamentally impossible to do what he actually described.

Intelligent temporal frame upscaling is getting better and better at producing a frame that is almost as high quality as a natively rendered frame, for less rendering time, i.e. higher fps... but it's never going to be 'better' quality than an actual native render.

[–] 30p87@feddit.org 1 points 4 days ago

To make this concrete:

  • Upscaling with ML may make the image acceptable if you don't look at anything the devs don't want you to look at. You're supposed to look at yourself, your objective/enemy, and maybe a partner. Everything else, especially foliage, hair, etc., looks like shit: flickery, changing with distance and perspective. Lighting is weird. Thing is, if we aren't supposed to pay attention, we could just go back to HL-level graphics. Even HL 1. However, this would break the aspect of stuff looking good, of you being able to enjoy looking around and not only feeling immersed, but like you are seeing something that you will never actually see in real life - not because it looks unrealistically artificial, but because it's too beautiful, too crazy and too dreamy to be real. However, when I get literally sick just from looking around, because stuff changes abstractly, not like my brain expects it to, that takes away the only actual visual advantage over GoldSrc.
  • Frame Gen does not make sense for literally any group of people:
    • You have 20 FPS? Enjoy even worse input lag (because in order to generate frame B between A and C, the generation algorithm actually needs to know frame C, leading to another 1/20-second delay + the time to actually generate it - the math is spelled out in the sketch after this list) and therefore nice 80 FPS gameplay for your eyes, while your brain throws up because the input feels like <= 10 FPS.
    • You have 40 FPS and want 80-160 FPS? Well, that might be enjoyable to anyone watching, because they only see smoother gameplay; meanwhile you, again, have a worse experience than at 40 FPS. I can play story games at 40 FPS, no problem, but input lag that has literally more than doubled? Fuck no. I see so many YouTubers being like: "Next we're gonna play the latest and greatest test, Monkey Cock Dong Wu II. And of course it has 50% upscaling and MFG! Look at that smooth gameplay!" - THE ONLY THING THAT'S SMOOTH IS YOUR BRAIN YOU MONKEY, I CAN LITERALLY SEE THE DESYNC BETWEEN YOUR MOUSE MOVEMENTS AND INGAME. WHICH WAS NOT THERE IN OTHER BENCHMARKS, GAMES OR THE DESKTOP. I'M NOT FUCKING BLIND. And, of course, worse foliage than RDR2. Not because of the foliage itself, but the flickering in between.
    • You have 60 FPS? And you know that you can actually see a difference between 60 and 144, 240 Hz, etc.? Well, that's wrong; you only notice the difference in input lag. So it's going to be worse than smooth now. Because, again, the input lag gets even worse with FG, and much worse with MFG.
    • You have 80 FPS but are a high FPS gamer needing quick reaction times and stuff? Well yeah, it's only getting worse with FG. Obviously.
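
As referenced above, here's the delay math spelled out (Python; the generation cost is an assumed figure, but the one-native-frame wait is inherent to interpolation, since frame C has to exist before the in-between frame can be shown):

```python
def added_latency_ms(native_fps: float, gen_cost_ms: float = 3.0) -> float:
    """Lower bound on the extra input latency from interpolation-based frame
    generation: the displayed stream must wait for the *next* native frame
    (one full native frame interval) plus the time to generate the fake one."""
    return 1000.0 / native_fps + gen_cost_ms

for fps in (20, 40, 60, 80):
    print(f"{fps:>2} native FPS -> at least {added_latency_ms(fps):.1f} ms added latency")
```

At 20 native FPS that's at least ~53 ms of extra latency; even at 80 native FPS it's still ~15 ms on top of whatever the game already had.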

And about 4K gaming... play old games with new GPUs. RDR2 runs quite well on my 7800 XT. More than 60 FPS, which is plenty if you aren't a speedrunner or similar. No FSR, of course.

[–] sunzu2@thebrainbin.org -2 points 4 days ago (1 children)

Ray tracing is still prolly a decade away for mainstream too.

Tech has been fake promises at least since covid

Nothing has really changed practically for gaming since the peak 2015 period.

Just more MTX and scamming

[–] A_Random_Idiot@lemmy.world 2 points 4 days ago

Oh, believe me, I agree... I agree so hard that it's worthy of a whole different post, lol.

It's capable of making pretty screenshots. But ultimately it's a pointless tax that serves no real purpose besides artificially increasing the price of GPUs... because what better way to increase the price of a GPU than to start tacking other features onto it, right Nvidia?