Edge of Eternity developer shares thoughts on FSR, DLSS

by Mark Tyson on 6 July 2021, 12:11

Tags: AMD (NYSE:AMD), NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qaeqta


We have talked in quite some depth about the progress of Nvidia DLSS and the emergence of AMD FidelityFX Super Resolution (FSR). The latter caused such a splash that the editor tested it when it was box fresh, talking through the technology, the modes on offer, and the observed performance implications on both AMD and Nvidia GPUs across various games and screen resolutions. You can read his conclusions on the tech as it worked on day one, here.

Several other tech sites, tech forums, and YouTubers have spent considerable time weighing up the respective merits of DLSS and FSR. The problem, so far, is that no shipping game supports both DLSS and FSR, so it has not been possible to compare the technologies directly in a practical, like-for-like example. Instead, reviewers usually compare the various DLSS/FSR quality levels against native rendering, and sometimes against a game's built-in resolution scaling settings too.

Two games on the list of upcoming AMD FSR titles that also support Nvidia DLSS are Edge of Eternity and Necromunda: Hired Gun. The developers of the former have just completed their FSR implementation and shared some thoughts with WCCF Tech on how the two upscaling technologies compare.

One of the first questions from WCCF Tech's Alessio Palumbo concerned how the two technologies compared to implement in the Unity-engine game. The devs responded that "implementing DLSS was quite complex to integrate into Unity for a small studio". In contrast, adding FSR was "very easy… it only took me a few hours". It was noted, however, that FSR worked much better after swapping Unity's built-in TAA for the open-source AMD Cauldron TAA. Thus, while FSR is very easy to implement, it does require some thought to get the most out of it, something that likely goes beyond user-level tweaking such as the recent GTA 5 FSR mod.
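For context on why that TAA swap matters: FSR 1.0 is a purely spatial upscaler that runs as two shader passes, an edge-adaptive upscale (EASU) followed by a sharpening pass (RCAS), after anti-aliasing and tone mapping but before the UI. Below is a minimal C++-style sketch of that pass ordering under stated assumptions: the engine-side helpers are hypothetical stubs, and only the FsrEasuCon/FsrRcasCon constant-setup calls come from AMD's public FSR 1.0 headers, so treat the exact macro and signature details as something to verify against the headers you actually ship.

```cpp
// A minimal sketch (not the studio's actual code) of where an FSR 1.0 pass
// typically sits in a frame. The engine-side helpers below are hypothetical
// stubs; only FsrEasuCon/FsrRcasCon come from AMD's public FSR 1.0 headers.
#define A_CPU 1
#include "ffx_a.h"     // FidelityFX portability header (defines AU1, AF1, ...)
#include "ffx_fsr1.h"  // EASU + RCAS constant-setup helpers

// Hypothetical engine hooks -- stand-ins for whatever the renderer provides.
static void RenderScene(float /*w*/, float /*h*/) {}
static void ResolveTAA() {}
static void DispatchEasu(const AU1*, const AU1*, const AU1*, const AU1*) {}
static void DispatchRcas(const AU1*) {}
static void DrawUI(float /*w*/, float /*h*/) {}

void RenderFrame(float renderW, float renderH, float displayW, float displayH)
{
    // 1. Render the scene at the reduced internal resolution.
    RenderScene(renderW, renderH);

    // 2. Anti-alias BEFORE upscaling. This is the step the Edge of Eternity
    //    devs changed, swapping Unity's built-in TAA for AMD's open-source
    //    Cauldron TAA, since EASU amplifies whatever aliasing it is fed.
    ResolveTAA();

    // 3. EASU: edge-adaptive spatial upscale to the display resolution.
    AU1 con0[4], con1[4], con2[4], con3[4];
    FsrEasuCon(con0, con1, con2, con3,
               renderW, renderH,     // viewport actually rendered
               renderW, renderH,     // size of the input texture
               displayW, displayH);  // output (display) size
    DispatchEasu(con0, con1, con2, con3);

    // 4. RCAS: contrast-adaptive sharpening on the upscaled image.
    AU1 rcasCon[4];
    FsrRcasCon(rcasCon, 0.25f);      // sharpness in stops, 0 = sharpest
    DispatchRcas(rcasCon);

    // 5. UI, film grain, etc. render at native resolution, after FSR.
    DrawUI(displayW, displayH);
}
```

The point the sketch illustrates is that FSR slots in as a single post-process stage near the end of the frame, which is consistent with the developer adding it in a few hours, while the quality of everything upstream of it, especially the TAA, directly determines what EASU has to work with.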

The second question from WCCF Tech concerned the quality difference observed between the two implementations. Importantly, the dev said that "both technologies give amazing results, and I have a hard time seeing differences between them." Digging into preferences, he said he would favour FSR on 4K monitors, while at 1080p or even 720p DLSS "gives a better result". Interestingly, the developers have also tested FSR upscaling to 720p on the GPD Win 3 (with its Intel APU) and claim the result was very similar to native gameplay, but at a smooth 60fps.
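Some quick arithmetic helps explain that 4K-versus-1080p preference. Using AMD's published FSR 1.0 per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x), the standalone snippet below, an illustration rather than anything from the game, prints the internal resolution each preset actually renders at.

```cpp
// Standalone illustration: internal render resolutions implied by AMD's
// published FSR 1.0 quality presets (per-axis scale factors).
#include <cmath>
#include <cstdio>

int main()
{
    const struct { const char* name; float scale; } presets[] = {
        {"Ultra Quality", 1.3f},
        {"Quality",       1.5f},
        {"Balanced",      1.7f},
        {"Performance",   2.0f},
    };
    const struct { const char* name; int w, h; } outputs[] = {
        {"4K",    3840, 2160},
        {"1080p", 1920, 1080},
    };

    for (const auto& out : outputs) {
        std::printf("%s output (%dx%d):\n", out.name, out.w, out.h);
        for (const auto& p : presets) {
            // Internal resolution = output / scale factor (rounded).
            const int rw = static_cast<int>(std::lround(out.w / p.scale));
            const int rh = static_cast<int>(std::lround(out.h / p.scale));
            std::printf("  %-13s renders at roughly %dx%d\n", p.name, rw, rh);
        }
    }
    return 0;
}
```

At 4K Quality the upscaler starts from 2560x1440, whereas at 1080p Quality it has only 1280x720 of detail to work with, which helps explain why the dev finds DLSS the better fit at lower output resolutions. The same arithmetic also suggests why the GPD Win 3 result is plausible: upscaling to a 720p panel means the device renders internally at anywhere from about 985x554 down to 640x360 depending on the preset, a far lighter load for its integrated GPU.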

Going forward, the Edge of Eternity developers reckon that FSR will be more commonly integrated into upcoming games on PC and consoles, while DLSS might find more favour with big studios and AAA titles.



HEXUS Forums :: 12 Comments

This may come down to using the wrong engine for DLSS up to now. Unreal is ahead on support for this IIRC, so maybe his experience is more about Unity than DLSS being difficult. We will have to wait to see what Unreal people say about supporting both first IMHO, because at this point we have too many questions. Unreal was first, and I'd like to hear their devs' opinions about difficulty vs. FSR. There are more devs on Unreal with experience with DLSS, and if there are things that get easier after the learning curve, that would be important to know.

https://www.engadget.com/unity-nvidia-dlss-support-game-engine-164535941.html#:~:text=Unity%20will%20add%20native%20NVIDIA%20DLSS%20support%20to,in%20games%20with%20little%20work%20needed%20for%20developers.
Apr 21, says adding it natively with little work from devs, which kind of says this guy probably was using it half working, or still more difficult to optimize. Maybe his next project would be easier with better NATIVE support from day 1. That article says available to DEVS by the END OF THE YEAR. So, I don't think you can take any info from a UNITY dev working with bad tools as legit DLSS info. The Unity engine devs themselves say native support won't even be ready for months. Of course it is tough to work with crap tools that aren't native in the engine yet. The BF5 devs said it took a few weeks to optimize, though they said they had much more they thought they could wring out over time (and they did get more with patches).

Unity devs shouldn't comment on DLSS until it's NATIVELY supported, which looks like Christmas or so. They say DLSS can be turned on with a few clicks at that point (Unity said this). That sounds easy compared to what is described by the dev in this article today. I wasn't too impressed with 1.0, but I sure hope most of my future gaming has DLSS 2.0 included. It's free perf that lets me hang on to my next card even longer.
This thing is not about FSR or DLSS… this is about setting the standards and going after that; as long as there are no standards we will keep on having these weird solutions for everything.

Enforce standards, so developers know what to go at… all these special features from whatever team are basically a waste of time.
A lot of developers are biased towards Team Red or Team Green - so it's nice hearing these comments from a smaller indie dev team that are likely free from the influence of the GPU vendors.

The comments about the development effort needed are telling - as the majority of studios now use third party engines, Unreal / Unity / ID Tech Engine / etc, rather than starting from scratch.

Looking forward to being able to run some comparisons on my own rig in the future - always good to see the results first-hand.
QuorTek
This thing is not about FSR or DLSS… this is about setting the standards and going after that; as long as there are no standards we will keep on having these weird solutions for everything.

Enforce standards, so developers know what to go at… all these special features from whatever team are basically a waste of time.

That's pretty much what AMD are angling at… offering something similar to NVidia's proprietary DLSS technology, but making it open enough to use on any vendor's GPU (so long as it has enough grunt). Similar to what they did with FreeSync vs NVidia's proprietary (and expensive) GSYNC technology.
KultiVator
A lot of developers are biased towards Team Red or Team Green - so it's nice hearing these comments from a smaller indie dev team that are likely free from the influence of the GPU vendors.

The comments about the development effort needed are telling - as the majority of studios now use third party engines, Unreal / Unity / ID Tech Engine / etc, rather than starting from scratch.

Looking forward to being able to run some comparisons on my own rig in the future - always good to see the results first-hand.

It was quite interesting seeing their comments on 1080p vs 4K about which tech they would rather be utilising at those resolutions.

Hopefully, this will lead to more objective comparisons between the two, because although both are wildly different in execution, they are both attempting to serve the same purpose, which is to get more out of your system without drastically degrading quality.