Codemasters F1 2020 Adds NVIDIA DLSS Support For Increased FPS

https://forums.guru3d.com/data/avatars/m/235/235224.jpg
The more games the better; I hope DLSS makes it into every game.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Fox2232:

Well, they're surely confident enough to post graphs and numbers. But what about posting a native screenshot vs. a DLSS screenshot at settings that result in the same frame rate?
Honestly, DLSS is an amazing bit of tech. Unless you stop moving, zoom into things and really, really look, you aren't going to see the difference between native 4K and DLSS. Especially when you're actually playing, you won't notice it; you're more likely to notice the extra FPS and smoothness than some lost detail. Digital Foundry have done a lot of videos on it and every time it's very impressive. I just wish it could be done on all games, not just some.
https://forums.guru3d.com/data/avatars/m/178/178348.jpg
Add VR support and I'll buy.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Fox2232:

If that 4K screen is 27'', then yes. But 4K is a waste of pixels on anything under 32''. And you are wrong. I have yet to see a 1080p DLSS game that looks better than upscaled 720p. The same applies to 4K DLSS for now: it looks like upscaled 1440p. So the reason people may think it looks good at 4K is that they have a screen size where 4K gives a tiny visual improvement over 1440p anyway, or they sit far enough from the screen for the pixels to be small enough. Anyway, people should enjoy their 8K DLSS. It is their race for fake pixel density; I race for pixel fidelity. It is funny how quickly people forgot they were downsampling to get additional detail, and started upsampling for worse IQ for the sake of performance, because of the "inadequate HW for the job" situation they accepted with RTX cards. I remember when people were asking for AA equivalent to 32x SGSSAA. Now people settle for a two-pixel-wide blur over everything... but a few more FPS on a card that under-performs without it for the price it came at.
I never mentioned screen size, and this isn't 1080p, so it's odd to bring that up. But I massively disagree: no matter what video I look at, the only time I see a difference is when they zoom 100x into something, which I'm not going to do while playing, and with the game in motion you'll notice even less. Native is and always will be better, that's obvious, but what they can do with DLSS is insanely impressive. Now, the games used already looked good and pretty; what I am curious about, and wondering if it will be rolled out, is using this AI upscaling to push older games in terms of resolution. Many old games cap at 1200p or 1080p, some even lower, and it will be interesting to see if they can be forced to run at higher resolutions than were originally possible, since a lot of older games don't allow it or have no INI settings to mess around with. But that can be a debate for another day. Personally, on either my 55-inch or my 27-inch, I can't tell you I see a difference while playing unless I stop what I'm doing and zoom into something, and even then it's normally very minor. I agree with you that pixel fidelity is the bigger thing to go for, but having these mid-steps is nice and doesn't hurt anyone. Same way I don't have an issue with checkerboarding: it would be nice if everyone had 3080 Tis pumping out 240 Hz at 8K, but that is not the case.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
BReal85:

You can usually get the same FPS jump by turning settings from Ultra to High: you won't notice the differences in motion, while getting the same extra FPS.
Mostly agreed, but it depends on which settings, and how different High and Ultra are in a given game. Normally stuff like LOD and pop-in is the most noticeable difference between the two, off the top of my head. But personally I am all for extra features; they might suit some games better than others. The Nvidia control panel is lovely to have for older games, where you can force AA or AF even though the game doesn't support it natively.
data/avatar/default/avatar14.webp
Fox2232:

Videos are very good at removing fine detail. And most comparisons you are ever going to see are, and will be, between low-quality, blurry TAA and DLSS. In that scenario neither is a winner; both are losers. DLSS just has better performance while being blurry. If you want better IQ in an older game at higher resolution, just inject CAS. DLSS is a good upscaling method, but it comes at a cost. If they used it at native resolution to prevent shimmering, as a TAA replacement, it would be good. It would not deliver any extra performance, but it could be the best AA method out there. Enhanced with something like CAS, IQ would touch about the maximum possible for a given game (matching high-quality downsampling in all areas except high-distance LoD).
LOL! You sound like the biggest AMD fanboi who is just super butt-hurt over tech your card can NOT provide. It's so obvious how seething you're coming across... Try harder to sound like you're actually being unbiased next time... It's really too obvious. And btw... Digital Foundry have uncompressed video of their analysis of DLSS 2.0. You're dead wrong on every front of your argument. I'm done here.
data/avatar/default/avatar01.webp
KingK76:

LOL! You sound like the biggest AMD fanboi who is just super butt-hurt over tech your card can NOT provide. It's so obvious how seething you're coming across... Try harder to sound like you're actually being unbiased next time... It's really too obvious. And btw... Digital Foundry have uncompressed video of their analysis of DLSS 2.0. You're dead wrong on every front of your argument. I'm done here.
I'm happy you're done here; those attack posts are nonsense and a waste of time.
data/avatar/default/avatar03.webp
Regarding DLSS, I still think that either I turn on 4x MSAA or I prefer to have AA off; no TXAA or similar for me.
https://forums.guru3d.com/data/avatars/m/108/108341.jpg
Fox2232:

Well, they're surely confident enough to post graphs and numbers. But what about posting a native screenshot vs. a DLSS screenshot at settings that result in the same frame rate?
Not sure if I'm missing something (I read the article on Geforce.com and the article here), but I see no mention of DLSS 2.0, which leads me to believe that a 2020 game somehow ended up with a DLSS 1.x implementation. Could that be the reason they confidently roll out bar charts but lack comparison screenshots? I could just be reading WAY too much into a common press release, but leaving out the very important detail of which DLSS version they are using seems odd to me...
https://forums.guru3d.com/data/avatars/m/263/263710.jpg
Ricepudding:

Honestly, DLSS is an amazing bit of tech. Unless you stop moving, zoom into things and really, really look, you aren't going to see the difference between native 4K and DLSS. Especially when you're actually playing, you won't notice it; you're more likely to notice the extra FPS and smoothness than some lost detail. Digital Foundry have done a lot of videos on it and every time it's very impressive. I just wish it could be done on all games, not just some.
So, is it still 4K? What users want to know is how many FPS they are sacrificing when using DLSS. The real question is: how is Nvidia going to deal with conflicting AI training results at scale? Typically, AI training is an iterative process where you feed the model data, it outputs results, and you adjust parameters according to those results. This is called a training step, and you repeat it until you get desirable results. The problem is that during validation I don't see how Nvidia can test the adjusted parameters against all video games. It also begs the question of how Nvidia will deal with conflicting training results. I've heard that flickering is still an issue; they could most likely train the model to fix that, but it would probably come at the cost of something else, and that's just across two games. Imagine for a second that the trained model runs on a game and causes all geometry to flicker. That's entirely possible unless Nvidia train each iteration of the model against every game.
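For anyone unfamiliar with what a "training step" means there, here is a toy sketch of that feed-data / check-output / adjust-parameters loop in plain Python. It is purely illustrative (made-up numbers and a tiny two-parameter model), and has nothing to do with NVIDIA's actual DLSS training pipeline:

    # Toy illustration of an iterative training loop: feed data in, compare
    # the output to the target, nudge the parameters, repeat. Not DLSS.
    data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]  # (input, target) pairs
    w, b = 0.0, 0.0                              # parameters being learned
    lr = 0.05                                    # learning rate

    for step in range(500):                      # each pass is one "training step"
        grad_w = grad_b = 0.0
        for x, target in data:
            out = w * x + b                      # model output for this input
            err = out - target                   # how far off the output is
            grad_w += 2 * err * x / len(data)    # gradient of squared error w.r.t. w
            grad_b += 2 * err / len(data)        # gradient w.r.t. b
        w -= lr * grad_w                         # adjust parameters toward lower error
        b -= lr * grad_b

    print(round(w, 2), round(b, 2))              # converges toward w=2, b=1

The point of the sketch is the poster's concern: whatever data you validate against during those steps is what the parameters end up tuned for, so results on games outside that set are not guaranteed.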
https://forums.guru3d.com/data/avatars/m/256/256969.jpg
In my dreams, DICE would still be supporting BF5 and would implement the updated DLSS. At last, 4K60 with ray tracing would be possible in this game without turning it into a blurry mess.
data/avatar/default/avatar22.webp
Fox2232:

Hilbert, are you sure you did not put the DLSS and TAA shots in reverse?
https://img.guru3d.com/compare/f1-2020/taa.png
https://img.guru3d.com/compare/f1-2020/dlss.png
Those are 1440p, and even downscaled to my 1080p it was clear that TAA is less blurry. Shown 1:1 (pixel for pixel), the difference is rather big. DLSS should be winning against TAA, as TAA is blurry by nature.
How do you know that DLSS should be winning? Because it did so in a few games? That doesn't mean it's universally better across the board; not every TAA implementation in every game is the same. F1 games tend to have a clean, crisp look, and in such games it's obvious that DLSS is inferior in picture quality.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
ATAA when? Nvidia has had an experimental form of it in the works for a while.
https://forums.guru3d.com/data/avatars/m/189/189438.jpg
If ACC and The Division 2 get DLSS at some point, then I won't need to upgrade my 2080S, unless DLSS 2.1 is limited to the 3xxx series and is not backwards compatible with DLSS 2.0-enabled games. I will be getting F1 2020 in the next few weeks when I finish building my sim rig, which has its own 32" 1440p HDR 75 Hz screen; the extra FPS and higher detail will look great without having to shell out for a new card. https://scontent-lhr8-1.xx.fbcdn.net/v/t1.0-9/119097463_10217320251547011_8719685247839544384_o.jpg?_nc_cat=111&_nc_sid=730e14&_nc_ohc=FrGtCoVEF4wAX9H3k8W&_nc_ht=scontent-lhr8-1.xx&oh=a408b58690edaccf2de001f09e9b22a7&oe=5F7FEE18