Upscaling and Frame Generation are disasters meant to conceal GPU makers' unfulfilled promises about 4K gaming, and to cover up the otherwise horrible performance of some modern games, even at 1080p/1440p resolutions.

Upscaling will never, no matter how much AI and overhead you throw at it, create an image that is as good as the same scene rendered at native res.

Frame Generation is a joke, and I am absolutely gobsmacked that people even take it seriously. It is nothing but extra AI frames shoved into your gameplay, worsening latency, response times, and image quality, all so you can artificially inflate a number. Native 30fps gaming is, and will always be, an infinitely better experience than AI frame-doubling that same 30fps to 60fps.
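
To put rough numbers on the latency point: interpolation-style frame gen has to hold back the newest real frame so the generated in-between frame can be shown first, so the math only moves in one direction. This is a back-of-envelope sketch; the generation overhead below is an assumption for illustration, not a measurement:

```python
# Back-of-envelope latency sketch for interpolation-based frame generation.
# Assumption: the newest real frame is delayed by half an output frame so the
# generated in-between frame can be shown first, plus a few ms to generate it.

base_fps = 30
real_frame_ms = 1000 / base_fps                  # ~33.3 ms between real frames

# Native 30fps: a real frame goes straight to the screen once it's rendered.
native_delay_ms = real_frame_ms

# Frame gen to "60fps": the real frame is held back half an output-frame period
# (~16.7 ms) while the interpolated frame is displayed, plus an assumed ~3 ms
# of generation overhead.
gen_overhead_ms = 3
framegen_delay_ms = real_frame_ms + (real_frame_ms / 2) + gen_overhead_ms

print(f"native 30fps      : ~{native_delay_ms:.0f} ms from input to screen")
print(f"frame-gen '60fps' : ~{framegen_delay_ms:.0f} ms from input to screen")
# The fps counter doubles, but the game still samples input 30 times a second,
# and every real frame now reaches the screen later than it did at plain 30fps.
```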

And because both of these technologies exist, game devs are pushing out less optimized to completely unoptimized games that run like absolute dogshit, requiring you to use upscaling and shit even at 1080p just to get reasonable frame rates on GPUs that should run them just fine if they were optimized better (and we know it's optimization, because some of these games do end up getting that optimization pass long after launch, and wouldn't you know… 9fps suddenly became 60fps).

  • sp3ctr4l@lemmy.dbzer0.com · 5 days ago (edited)

    Eh… The latest versions of DLSS and FSR are producing much better image quality in stills…

    But they still don't match the image quality of actually rendering the same thing natively, at full resolution, which is what the quote you're disputing was about.

    Further, what about the cards that can run these latest upscaling techs to reach 4K 60fps or 4K 90fps in very demanding games, without (fake) frame gen?

    It's not as bad with AMD, but they also don't yet offer as high-calibre a GPU as Nvidia's top-end stuff (though apparently 9080 XT rumors are starting to float around)…

    But like, the pure wattage draw of a 5080 or 5090 is fucking insane. A 5090 draws up to 575 watts, on its own.

    You can build a pretty high-powered 1440p system around the stupendously high performance-per-watt 9745HX or 9745HX3D CPU + mobo combos that Minisforum makes… and the PSU for the entire system shouldn't need to exceed 650 watts.

    … A 5090 alone draws nearly as much power as that entire system, one resolution step down.

    This, to me, is completely absurd.
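
    To put that in numbers, using only the two figures above (the 5090's 575 W rated board power and the 650 W PSU ceiling for that whole 1440p build); everything here is back-of-envelope:

    ```python
    # 575 W is the 5090's rated board power; 650 W is the PSU budget for the
    # entire 1440p system described above. Back-of-envelope only.

    rtx_5090_board_power_w = 575   # one GPU, by itself
    full_1440p_system_psu_w = 650  # whole system: CPU + mobo combo, GPU, RAM, SSD, fans

    ratio = rtx_5090_board_power_w / full_1440p_system_psu_w
    print(f"5090 alone           : {rtx_5090_board_power_w} W")
    print(f"entire 1440p system  : <= {full_1440p_system_psu_w} W (PSU ceiling)")
    print(f"one card / one system: {ratio:.0%}")
    # ~88%: a single card eats nearly the entire power budget of a complete
    # system one resolution step down, before you add the rest of the 4K build.
    ```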

    Whether or not you find the power draw difference between an 'ultra 1440p' build and an 'ultra 4K' build ridiculous… the price difference between those PCs and monitors is somewhere between 2x and 3x, and hopefully we can agree that that, in fact, is ridiculous, and that 4K, high-fidelity gaming remains far out of reach for the vast majority of PC gamers.

    EDIT:

    Also, the vast majority of your comment is comparing native + some AA algo to… rendering at 75% to 95% of native resolution and then upscaling.
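
    For scale, here is what those render fractions actually mean in pixel terms at 4K; the 75%–95% figures are the per-axis scales from your comment, and the 0.667 entry is the commonly cited 'Quality' preset scale, included here as an assumption:

    ```python
    # Pixel-count arithmetic for "render at X% and upscale to 4K".
    # Scale factors are per-axis; the 0.667 entry is the commonly cited
    # 'Quality' preset scale (an assumption, not something from the thread).

    native_w, native_h = 3840, 2160
    native_pixels = native_w * native_h

    for scale in (0.95, 0.85, 0.75, 0.667):
        w, h = int(native_w * scale), int(native_h * scale)
        share = (w * h) / native_pixels
        print(f"{scale:.0%} per axis -> {w}x{h}, ~{share:.0%} of native pixel work")

    # Even a "95%" render only does ~90% of the pixel work, and a 75% render
    # does barely more than half -- the upscaler has to invent the rest, which
    # is where the artefacts described below come from.
    ```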

    For starters, again the original comment was not talking about native + some AA, but just native.

    Upscaling introduces artefacts and inaccuracies: smudged textures, weird ghosting that resembles older, crappy motion blur techniques, loss of LOD-style detail on distant objects, and it sometimes gets confused between HUD elements and the 3D rendered scene and warps them together…

    Just because intelligent temporal upscaling also produces something that sort of looks like, but isn't actually, AA… doesn't mean it avoids these other costs; it achieves this 'AA' in a relatively sloppy manner that also degrades other elements of the finished render.

    It's a tradeoff: an end result at the same res that is worse, to some degree, but rendered faster, to some degree.

    Again, the latest versions of intelligent upscalers are getting better at pushing the quality closer to a native render while maintaining higher fps…

    But functionally, what this is is an overall 'quality' slider that sits outside of, or on top of, all of a game's other, actual quality settings.

    It is a smudge factor bandaid that covers up poor optimization within games.

    And the culprit behind that poor optimization is, in almost all cases… real-time ray tracing/path tracing of some kind.

    A huge chunk of what has driven and enabled higher-fidelity, high-frame-rate rendering over the last 10 or 15 years has been figuring out clever tricks and hacks in game design, engine design, and the rendering pipeline (baked lightmaps, light probes, screen-space approximations) so that realtime lighting is only used where it absolutely needs to be, in a very optimized way.

    Then, about 5 years ago, most AAA game devs/studios just stopped doing those optimizations and tricks, as a cost cutting measure in development… because ‘now the hardware can optimize automagically!’

    No, it cannot, not unless you think all PC gamers have a $5,000 rig.

    A lot of this is tied to UE5 being an increasingly popular, but also increasingly shit-optimized, engine.