3DMark feature test adds DLSS 3 support

Kind of interesting: I was doing some back and forth with UL tech support (which is excellent, btw) about benchmark tessellation-load errors invalidating my results, and asked them why there's no FSR support in the bench when they support nVidia and, oddly enough, even XeSS from Intel, but nothing from AMD. I asked when they might include it. The reply I got was: "We cannot comment on FSR support." Any insight into that situation would be appreciated!

If you are running the latest Adrenalins (from several versions back, on) and you keep getting 3DMark "Results invalid due to modified tessellation load" errors, but you keep setting the Adrenalins to "Application" thinking that's the "default" setting for the benchmark tessellation, like I had been doing for the longest time, the solution is to set tessellation to "AMD Optimized." With that, all of my benchmark results finish up as valid!

Darnedest thing: I had been uninstalling the drivers using DDU and using the driver's internal reset function, all to no avail. I still got that error in every bench. Then I noticed that the default tessellation setting immediately after a driver install was "AMD Optimized." So apparently, for 3DMark, the Adrenalin driver optimization tells the 3DMark benchmark application that "AMD Optimized" tessellation is actually the default "Application" setting! I do remember, however, that the default tessellation setting for error-free 3DMark results used to be "Application," because that's how I set it.

I can repeat it at will: run the bench with the setting on "Application" and I get the invalid-results-due-to-modified-tessellation error; change the setting to "AMD Optimized" and all of the benches come out as valid. Anyone else notice this? I don't think I'm quite ready to follow Biden into senile bliss... Quite yet, anyway!
Yeah, add fake frames to a 3DMark benchmark. It should be based solely on rasterized performance. These people are idiots.
Nvidia thinks their customers are idiots.
Undying:

Yeah, add fake frames to a 3DMark benchmark. It should be based solely on rasterized performance. These people are idiots.
Haha, exactly. Why are they benchmarking this shit with fake frames? It should count as false advertising: injecting a guesstimated image that looks like shit and then going, "yep, we still added another frame." DLSS 3.0 also adds a lot of latency; you don't want to be using that in competitive games.
Undying:

Yeah, add fake frames to a 3DMark benchmark. It should be based solely on rasterized performance. These people are idiots.
Meathelix1:

Haha, exactly. Why are they benchmarking this crap with fake frames? It should count as false advertising: injecting a guesstimated image that looks like crap and then going, "yep, we still added another frame." DLSS 3.0 also adds a lot of latency; you don't want to be using that in competitive games.
Uhh, it's a feature test. It's a test specifically for testing DLSS, and they have an XeSS version too. It's like asking why ray tracing is enabled in the ray tracing test.
Martin5000:

Nvidia thinks their customers are idiots.
...lol
Denial:

uhh it's a feature test. It's a test specifically for testing DLSS. They have an XeSS version too. It's like asking why raytracing is enabled in the raytracing test.
I don't see how being a feature test validates this. DLSS and XeSS are just advanced ways of minimally lowering visual fidelity to significantly improve performance. If those can be enabled, why not lower other detail settings while you're at it? There are other graphical settings that, depending on what's being rendered, have a minimal visual impact but a substantial performance impact.

I've never used 3DMark, but I assume you can't change any graphical settings other than resolution, so how does it make sense to allow this? Seems like a real double standard to me. If it tracks the scores in separate tables, then I guess that's okay (still seems rather dumb to me), but if these results can be mixed in with systems where DLSS/XeSS are not enabled, then as far as I'm concerned, that's practically cheating. As for ray tracing, that improves visual fidelity, so it's totally fair to force-enable it.

Realistically, 3DMark should be a benchmark, a stress test, and effectively a compliance test. If your GPU isn't compatible, then it shouldn't qualify to enter. If your GPU ranks at the bottom of the list because it wasn't optimized for DXR, sucks for you. Exceptions shouldn't be made, whether that's "boo hoo, my GPU isn't good at this one particular thing" or "b-b-but if I lower just this one detail setting, look how much better it gets!" while ignoring everyone else who doesn't have to do those things and still gets good results.