UL Adds Variable-Rate Shading Benchmark To 3DMark Suite

A short review would be nice - some screenshots and FPS comparisons with it on/off so we can see what it does. It would be good to know if it's something worth turning on when games start to support it.
"NVIDIA Turing-based GPU"? So it only works on 20x0 series and not 10x0 series NVidia cards?? That's not much use then is it.
geogan:

"NVIDIA Turing-based GPU"? So it only works on 20x0 series and not 10x0 series NVidia cards?? That's not much use then is it.
Why is it not much use? Because you can't use it? lol.. It's part of DX specification and every generation of GPUs going forward will have it - what exactly is the problem with that?
Fox2232:

Not impressed by the video or the implementation at all. Their reduction of IQ was not content-based but distance-based, and that's as bad as it can get, because I want extra detail on far-away objects. Then they used a 2x2 rendering-resolution reduction instead of a 2x2 shading-rate reduction while keeping textures at 1:1 to the original. That's the second-worst thing they could do, because it practically turns affected objects into half resolution. Applied on a 4K screen, those trees were practically rendered as at 1080p. Who here fancies a 32'' 1080p screen? Nobody wants their large screen to render at 1080p. UL leading 3DMark to...
I thought so at first, but looking at the vid, you can still see the trees in the distance pretty well - and who is going to notice the fine details on the bark from a distance? This isn't your usual DOF setting in games. Still, I would like to see this in actual use in a game before judging. So which game has it other than Wolfenstein Youngblood (which I am not interested in, lol)?
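Editor's note: the distinction being argued here - a coarser shading rate versus a lower rendering resolution - can be made concrete with some back-of-the-envelope arithmetic. The sketch below is purely illustrative and assumes an idealized 2x2 coarse shading rate; it is not based on measurements from the 3DMark test.

```python
# Illustrative comparison: 2x2 variable-rate shading (VRS) vs. rendering
# the scene at half resolution, both starting from a 4K render target.
WIDTH, HEIGHT = 3840, 2160
full_pixels = WIDTH * HEIGHT

# 2x2 VRS: one pixel-shader invocation covers a 2x2 block of pixels,
# but rasterization, depth, and geometry edges stay at full 4K.
vrs_shader_invocations = full_pixels // 4

# Half-resolution rendering: everything (edges, depth, shading) drops
# to 1920x1080 and the result is upscaled afterwards.
halfres_pixels = (WIDTH // 2) * (HEIGHT // 2)

print(full_pixels)             # 8294400
print(vrs_shader_invocations)  # 2073600
print(halfres_pixels)          # 2073600
```

Both approaches shade the same number of fragments, which is where the performance win comes from - but only the VRS path keeps geometry and depth at native 4K, which is why reviewers distinguish it from simple resolution scaling.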
Denial:

Why is it not much use? Because you can't use it? lol.. It's part of DX specification and every generation of GPUs going forward will have it - what exactly is the problem with that?
Nothing, apart from the fact that no game developer will use it for years if it is restricted right now to only the newest top-end cards from one manufacturer.
geogan:

Nothing, apart from the fact that no game developer will use it for years if it is restricted right now to only the newest top-end cards from one manufacturer.
Intel 11th-gen iGPUs support it, and presumably their upcoming discrete GPUs will as well. It's also available across the entire Turing lineup, including the lower-end models. And it's a fairly simple feature to add in the way UL is using it here.