NVIDIA: Rainbow Six Siege Players Test NVIDIA Reflex and Two New DLSS Titles

I think Nvidia will regret pushing this DLSS forward due to how well AMD perform at lower screen resolutions. AA always worked so well on AMD hardware there wasn't any need for them to worry.
Ray tracing. This is where DLSS kicks ass. AMD has nothing to compete with the combo, which is why I purchased a 3080 over a 6800 XT. Maybe RDNA3 will compete, but for now AMD is still a generation behind.
Martin5000:

I think Nvidia will regret pushing this DLSS forward due to how well AMD perform at lower screen resolutions. AA always worked so well on AMD hardware there wasn't any need for them to worry.
This makes no sense ... How exactly would Nvidia regret pushing an exclusive, hardware-accelerated feature forward when their competitor has nothing to compete with it yet and requires a new architecture to pull it off at the same magnitude? And how is that related to how well AMD handle traditional AA?
Regret using DLSS? What are you schmoking man? DLSS is the next best thing in gaming! When I see a game that supports DLSS, I know I can count on playable framerates.
cpy2:

Regret using DLSS? What are you schmoking man?
He's definitely been schmoking shomting. I mean, Jesus Christ, if you're gonna drop bombs, drop one at a time; give us time to collect the dead at least.
Martin5000:

I think Nvidia will regret pushing this DLSS forward due to how well AMD perform at lower screen resolutions. AA always worked so well on AMD hardware there wasn't any need for them to worry.
1. https://abload.de/img/down32loadvgjqt.jpg 2. https://abload.de/img/down32loadvgjqt.jpg
I like to read this: "Rainbow Six Siege players can check out NVIDIA Reflex, bla, bla, bla, and a pair of new games are shipping with NVIDIA CARD". Free shipping! o_O
Fox2232:

You can count on playable framerates by having a 1440p screen and rendering at 720p, too. Or by rendering at 1080p on a 4K screen. If anything, they should have implemented DLSS in R6: Siege as an option. Back when I was playing it, the game used toxic levels of blur with TAA, and even DLSS 1.x would have had a shot there at being perceived as improving image quality.
DLSS 2.0 looks way better than rendering 1080p on a 4K screen. I play at 4K and there really is no comparison between the two except in performance. Also, R6 isn't getting DLSS, it's getting Reflex.
cpy2:

Regret using DLSS? What are you schmoking man? DLSS is the next best thing in gaming! When I see a game that supports DLSS, I know I can count on playable framerates.
This is why I laugh so effin hard at DLSS clowns.
Martin5000:

AA always worked so well on AMD hardware there wasn't any need for them to worry.
uh.... eh.. no.
I like DLSS. The way I see it: if you can natively hit the FPS you're targeting at your target resolution, you keep it off; if not, you turn on DLSS before you drop the resolution scale... It is better than dropping the resolution scale, is it not?
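In case it helps anyone reading along, that heuristic reads as a tiny decision rule; the sketch below is only an illustration, and the frame-rate numbers and labels in it are made up, not taken from any game's settings API.

```python
# Hypothetical sketch of the settings heuristic from the post above:
# try DLSS before lowering the render/resolution scale.
def pick_setting(native_fps: float, target_fps: float) -> str:
    """Decide what to change first when chasing a target frame rate."""
    if native_fps >= target_fps:
        return "native rendering, DLSS off"   # target already met
    return "enable DLSS"                      # upscale before dropping render scale

print(pick_setting(native_fps=48.0, target_fps=60.0))  # -> enable DLSS
print(pick_setting(native_fps=75.0, target_fps=60.0))  # -> native rendering, DLSS off
```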
Astyanax:

uh.... eh.. no.
You see warriors! You see what happens when ppl engage in the middle of discussion w/o reading how it all started 🙂
typical NaughtyVidia call center answers as always... just skip forward
Martin5000:

I think Nvidia will regret pushing this DLSS forward due to how well AMD perform at lower screen resolutions. AA always worked so well on AMD hardware there wasn't any need for them to worry.
"NVIDIA will regret pushing an exclusive technology that gives them an almost unfair performance advantage, that AMD has zero answer for". BTW, I didn't believe the DLSS deniers too much before, but after getting a card that supports it, LOL. There is zero reason to play with it closed, and most titles actually have even a setting if you're that "sensitive". You will either get better IQ at better performance, or slightly lower IQ at much much better performance. There is zero reason for games not to have this.
Noisiv:

You see warriors! You see what happens when ppl engage in the middle of discussion w/o reading how it all started 🙂
Except I read from the beginning, and your argument is still just underdog shilling. Out of the two vendors, Nvidia is the only one that pushed the envelope on anti-aliasing techniques whilst AMD settled for... MLAA :| On AMD you got MSAA up to 8x and, when that couldn't be used anymore, MLAA :| On Nvidia you got MSAA, combined modes, SSAA, colour samples, FXAA (and FXAA derived into SMAA), and under OpenGL you got up to 16x (since GeForce FX) via combined 4x(2x2). Not even listing the super-resolution capabilities yet.
Astyanax:

Except I read from the beginning, and your argument is still just underdog shilling. Out of the two vendors, Nvidia is the only one that pushed the envelope on anti-aliasing techniques whilst AMD settled for... MLAA :| On AMD you got MSAA up to 8x and, when that couldn't be used anymore, MLAA :| On Nvidia you got MSAA, combined modes, SSAA, colour samples, FXAA (and FXAA derived into SMAA), and under OpenGL you got up to 16x (since GeForce FX) via combined 4x(2x2). Not even listing the super-resolution capabilities yet.
He's being sarcastic, it's so obvious ... sheesh.
Fox2232:

Sorry, but you have to get your statements corrected. No version of DLSS provides better image quality than native rendering while also providing better performance. It is not designed that way. And you misunderstand how the temporal part of DLSS works.
I understand perfectly fine; you seem to miss how this sort of neural network works. For the fine details, you literally see what the network "guesses" is there, based on the previous frames and its training. There are examples (in Wolfenstein and Control) where DLSS at 1440p is actually more detailed than the native 4K render. From the Eurogamer article:
Temporal ghosting is massively cut back, while break-up on sub-pixel detail and transparent textures are reduced to a minimum. Impressively, we were able to find examples of the new DLSS 2.0 at 1080p delivering improved visual quality over the older version running at 1440p. Just as we saw in Wolfenstein: Youngblood, the new DLSS is also capable of measuring up nicely to native resolution rendering, even if the core image is actually built up from just 25 per cent of the overall pixel count. There are a couple of interesting effects delivered by DLSS. On inner surfaces, textures appear to deliver more detail - sometimes even more than native resolution (and to be fair, sometimes less, though you need extreme magnification side-by-side shots to tell). It's not just a factor of contrast adjustment or sharpening either. On the rock wall ahead of Jesse in the very first playable scene in Control, single pixel detail on reflective elements of the rock wall shine with DLSS when they don't with native rendering. Remember that DLSS is a replacement for the temporal anti-aliasing found in many games - and TAA does tend to add some level of blur that DLSS does not.
From my own eyes: Control at 1440p DLSS looks better than Control at 4K. That's on a huge display (65") at 2160p120 native, with full-range RGB and 12-bit pixel depth, meaning anything that might create weirdness or artifacts is removed. DLSS is not perfect, but 99.9% of people wouldn't even notice it from 1080p to 4K (I have tried with my wife and friends), and from 1440p they all told me it looks "better". It's the same in Cyberpunk between native 4K and DLSS Balanced/Quality: DLSS definitely looks better.
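Since much of the argument in this thread is about what the temporal part of the reconstruction can and cannot recover, here is a minimal NumPy sketch of the underlying idea: jittered low-resolution samples accumulated into a full-resolution history buffer. It is only an illustration of temporal accumulation on a static scene with a fixed blend; the real DLSS replaces the blend with a trained network and uses game-supplied motion vectors, so treat every name and number below as an assumption.

```python
import numpy as np

def render_low_res(scene, jitter_x, jitter_y, scale=2):
    """Sample the full-res 'scene' on a coarse grid offset by a sub-pixel jitter."""
    h, w = scene.shape
    ys = (np.arange(0, h, scale) + jitter_y) % h
    xs = (np.arange(0, w, scale) + jitter_x) % w
    return scene[np.ix_(ys, xs)], ys, xs

def accumulate(history, samples, ys, xs, alpha=0.5):
    """Blend the new coarse samples into the full-res history buffer (simple EMA)."""
    out = history.copy()
    out[np.ix_(ys, xs)] = (1 - alpha) * history[np.ix_(ys, xs)] + alpha * samples
    return out

# Ground-truth "native" image: a horizontal gradient plus single-pixel stripes,
# i.e. detail that one quarter-resolution frame cannot represent.
h, w = 64, 64
scene = np.tile(np.linspace(0.0, 1.0, w), (h, 1))
scene += 0.25 * (np.arange(w) % 2)

history = np.zeros((h, w))
jitters = [(0, 0), (1, 0), (0, 1), (1, 1)]        # 2x2 sub-pixel jitter sequence
for frame in range(32):
    jx, jy = jitters[frame % len(jitters)]
    samples, ys, xs = render_low_res(scene, jx, jy)
    history = accumulate(history, samples, ys, xs)

# Compare against naively upscaling one quarter-resolution frame.
single, _, _ = render_low_res(scene, 0, 0)
upscaled = np.kron(single, np.ones((2, 2)))        # nearest-neighbour 2x upscale
print(f"single-frame upscale error: {np.abs(scene - upscaled).mean():.3f}")
print(f"accumulated history error:  {np.abs(scene - history).mean():.3f}")
```

On a static scene the accumulated buffer recovers the single-pixel stripes that one low-resolution frame misses, which is the sense in which temporal reconstruction can show detail a single frame does not contain; how well that holds up under motion is exactly what the rest of the thread argues about.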
Fox2232:

So you do not know. There is a huge difference between running at 50 fps and at 200 fps, because one has temporal information that's 20 ms old and the other has temporal information that's 5 ms old. Are you still having trouble comprehending? Move forwards and backwards in a 3D game, strafe to the sides, turn around, and then think about how the information changes on the projection plane of the viewport, and which situation causes which particular type of distortion or missing information.
This is all nice in theory (if you think the people making this are morons who didn't think of it), but it's also wrong from what I (and others with the actual thing in our hands) can see with our own eyes. You can also see it in screenshots. EDIT: The really negative thing about this is that both Microsoft and AMD were caught with their pants down and didn't expect it. All GPUs need something like this, and all APIs need standardized methods of using it. Perhaps the AMD/Intel neural network won't be as good, for example, but it needs to be there. NVIDIA are already playing by themselves (there is literally zero reason to buy any other GPU at this point), and this makes it even worse.
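For reference, the 50 fps vs 200 fps numbers quoted above are plain frame-time arithmetic, and the snippet below works them out together with a rough estimate of how far the image shifts per frame while turning. The turn rate, field of view, and output width are illustrative assumptions, not measurements from any game.

```python
# Frame-time arithmetic behind the 50 fps vs 200 fps point quoted above.
# Turn rate, horizontal FOV and output width are illustrative assumptions.
def per_frame_motion(fps, turn_rate_deg_s=90.0, hfov_deg=90.0, width_px=3840):
    frame_time_ms = 1000.0 / fps                         # age of the previous frame's samples
    deg_per_frame = turn_rate_deg_s / fps                # camera rotation between two frames
    px_per_frame = deg_per_frame / hfov_deg * width_px   # rough screen-space shift
    return frame_time_ms, px_per_frame

for fps in (50, 200):
    ms, px = per_frame_motion(fps)
    print(f"{fps:>3} fps: history is {ms:.0f} ms old, image shifts ~{px:.0f} px while turning")
```

At four times the frame rate the history is both fresher and displaced a quarter as far, which is why fast motion is the hard case for any temporal technique; whether the difference is visible in practice is what the two posts disagree about.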
Fox2232:

Yeah, let's throw the whole field of information theory into the garbage. The entire temporal part of DLSS is subject to the same rules as video compression when it comes to motion in space and density of information over time. Not that I expect you to understand why you need a much higher bitrate in the different scenarios described to achieve the same resulting image quality in a video stream.
Speaking of throwing entire sections of science into the garbage, it's clear you don't understand how this type of NN works. The temporal information is only part of the puzzle; in fact, the more frames it has, the better it works. Video streams are also lossy; this processing is not. You are not seeing the result of a video stream; you are seeing something closer to a shader than anything else. Bitrate is completely irrelevant in this scenario, and I honestly wonder how you can participate in this conversation, and why we take you seriously, when you bring up "bitrate" here (in any context). You are also ignoring the reports of people who have actually seen what DLSS does, which means you are disputing every person who has used it, plus the expert reviewers on top. I will post this video in case someone else following this thread wants to learn something, as it is 100% certain you will not watch it, yet you will keep talking as if you had.

[youtube=YWIKzRhYZm4]

Check around 7:18
So neural networks will take our jobs. AI is the future worker, unless you do work for corporations or government. Native images lose to calculated ones. That's the moral of today's lesson.