NVIDIA releases some RTX 2080 performance numbers and some info on DLSS

data/avatar/default/avatar33.webp
Noisiv:

bla bla 🙂
Assuming they run DLSS on top of normal AA, which very well might not be the case, because: "We've seen it in motion and it's really impressive. And the best part is that the AI hardware inside the Turing GPU actually means it can also boost performance, sometimes by over 50%. That's a hell of a double win." https://forums.guru3d.com/threads/the-rtx-2080ti-thread.422587/page-10#post-5576818
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
Meh, I want one, but not at that price. By the time GPUs can do 4K @ 60 comfortably they'll move on to 8K, which is already popping up. Most cards can't do 4K @ 60 without costing over $500, and most streaming/TV services can't even do 1080p properly, let alone 4K feeds.
https://forums.guru3d.com/data/avatars/m/115/115710.jpg
Fox2232:

So there is a certain heavy level of TAA in this scenario to guarantee a performance gain for the 2080 cards, which do TAA faster. And then they replaced TAA with DLSS on the 2080, which delivers close to the same result as not running AA at all. So two questions remain: how does that 1080 run without TAA? How does that 1080 run with DLSS? Because in one of those comparison images floating around the web, DLSS runs quite a bit faster than TAA on Pascal. So theoretically a 1080 with DLSS does around the same as a 2080 with TAA.
Apples to oranges. DLSS is more like AI-assisted upsampling, which allows you to run a higher resolution at much higher speed, because they use AI to generate the higher-resolution image from a lower one.
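For anyone who wants the gist of that in code: here's a minimal, purely illustrative sketch of the "render low, reconstruct high" idea. This is not NVIDIA's pipeline; a naive nearest-neighbour upscale stands in for the trained network that DLSS actually runs on the tensor cores, and the resolutions are just example values.

```python
# Toy illustration of the "render low, reconstruct high" idea behind DLSS.
# This is NOT NVIDIA's algorithm: a naive nearest-neighbour upscale stands in
# for the trained neural network that DLSS runs on the Turing tensor cores.

def render_frame(width, height):
    """Stand-in for the game renderer: a width x height grid of grey pixels."""
    return [[(128, 128, 128)] * width for _ in range(height)]

def naive_upscale(frame, scale):
    """Stand-in for the inference step: build a larger image from a smaller one."""
    upscaled = []
    for row in frame:
        wide_row = [px for px in row for _ in range(scale)]
        for _ in range(scale):
            upscaled.append(list(wide_row))
    return upscaled

low_res = render_frame(1920, 1080)   # shade far fewer pixels than native 4K
output = naive_upscale(low_res, 2)   # present at 3840 x 2160
print(len(output[0]), len(output))   # 3840 2160
```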
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Assuming this is legit, these tensor cores are proving to be a lot more valuable for consumer-grade software than I ever would've expected. However, seeing as these results came straight from Nvidia, I'll take them with a grain of salt. I never take any charts that come directly from the manufacturer seriously. I'm glad this series isn't as boring as everyone predicted. I know AMD is planning on some pretty hefty changes but I'm not quite sure how they're supposed to compete with this. I hope they do though, because the price point of these new GPUs is unattractive to me. I'm not upgrading until 4K-capable GPUs don't cost more than the rest of my PC combined (at MSRP), including the GPU that's already in there...
https://forums.guru3d.com/data/avatars/m/273/273822.jpg
ezodagrom:

His previous rumour had the RTX 2070 with 7GB of VRAM, and had nothing about an RTX 2080 Ti.
Yeah, he likes to contradict himself.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
OrdinaryOregano:

Was it enabled outside of the game's own settings? What I mean to ask is: how have you determined that it's not game-specific? It's AI-based, so I imagine it needs to be trained on ground truth?
It is an algorithm; the setting, in the end, will be available in the NV driver properties with a slider. Currently 2x DLSS equals (roughly) TAA. It's not a 100% perfect supersampling AA technology, but it's pretty good, I must say. Since it runs on the tensor cores, the shader engine is offloaded, so you're rendering the game with the performance of no AA, as DLSS runs on the tensor cores.
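To make that "perf of no AA" point concrete, here's a rough, hypothetical frame-time model of what Hilbert describes: a shader-based AA pass adds to the render time, while DLSS on the tensor cores overlaps with shading. The millisecond figures are invented for illustration, not measurements.

```python
# Rough frame-time model of the point above: a shader-based AA pass adds to
# the render time, while DLSS runs on the tensor cores alongside shading.
# All millisecond values are made-up illustrations, not measurements.

def fps(frame_time_ms):
    return 1000.0 / frame_time_ms

render_ms = 10.0   # hypothetical cost of shading one frame with no AA
taa_ms    = 3.0    # hypothetical cost of a TAA pass on the shader engine
dlss_ms   = 3.0    # hypothetical cost of DLSS on the tensor cores

taa_frame  = render_ms + taa_ms        # AA competes for the shader engine
dlss_frame = max(render_ms, dlss_ms)   # AA overlaps with shading

print(f"TAA : {fps(taa_frame):6.1f} fps")   # ~76.9 fps
print(f"DLSS: {fps(dlss_frame):6.1f} fps")  # 100.0 fps, i.e. the 'perf of no AA'
```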
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
SmootyPoody:

Every AI graphics solution I know of has artifacts
So does every AA methodology. Remember DLSS is AA at pretty much no additional cost in the rendering engine.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
asturur:

So tensor cores are not a Quadro exclusive? We get them too?
That I can confirm, yes. You get the shader cores, RT cores and Tensor cores enabled.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Still waiting on third-party results, but by the looks of it, without DLSS it's similar performance to a 1080 Ti, which isn't that impressive... Though with DLSS it seems to be a lot stronger. The big question, though, is whether DLSS is worth it, and especially at 4K, where I believe AA isn't as important, how much should it be valued? Still, I look forward to seeing how a 2080 Ti performs, but for that price tag I sure hope it doubles the 1080 Ti's performance in at least the majority of games... It would be nice if HH (if you have the time) ran a test using different AA methods and their performance on the 1080 vs the 2080, to see how much AA matters and what the performance hit is between the two. Would be rather interesting to see.
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
Hilbert Hagedoorn:

So does every AA methodology. Remember DLSS is AA at pretty much no additional cost in the rendering engine.
I assume this is because the tensor cores, which aren't being used for rendering, are being used for the DLSS AA instead... Can you then have DLSS and ray tracing at the same time, or will a hit happen then?
https://forums.guru3d.com/data/avatars/m/270/270041.jpg
metagamer:

The 2070 will be on par with the 1080 Ti, give or take 5-10%. The 2080 will crush it and the 2080 Ti will run circles around it.
Well, going by these charts the 2080 will be around the 1080 Ti's level of performance, give or take... I'm guessing they'll go blow for blow a lot of the time. DLSS gives it a slight edge, mind you, though we don't know how many games it will work in right now. This could be quite a shame, when in previous gens it was the xx70 card that was on par; this time that doesn't look to be the case, going by Nvidia's charts.
Edit, additional info. 4K results:
Hitman (DX12): 2080 - 73 fps, 1080 Ti - 78 fps
Battlefield 1 (DX12): 2080 - 84 fps, 1080 Ti - 73 fps
Keep in mind DX11 would normally run a bit better on Pascal; I don't know if this will be the same with the 20xx series.
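For reference, here's a quick script that turns those 4K figures into relative percentages. The numbers are the ones read off NVIDIA's chart as quoted above, not independent benchmarks.

```python
# Relative performance from the 4K figures quoted above. These are the numbers
# read off NVIDIA's own chart, not independent benchmarks.

results = {
    "Hitman (DX12)":        {"RTX 2080": 73, "GTX 1080 Ti": 78},
    "Battlefield 1 (DX12)": {"RTX 2080": 84, "GTX 1080 Ti": 73},
}

for game, fps in results.items():
    delta = (fps["RTX 2080"] / fps["GTX 1080 Ti"] - 1) * 100
    print(f"{game}: RTX 2080 vs GTX 1080 Ti = {delta:+.1f}%")

# Hitman (DX12): RTX 2080 vs GTX 1080 Ti = -6.4%
# Battlefield 1 (DX12): RTX 2080 vs GTX 1080 Ti = +15.1%
```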
data/avatar/default/avatar35.webp
cowie:

Me too :D Pretty sure the community is gonna need help from insiders; that's why the ASUS brand has been getting my money the last few times. But some small voltage/BIOS mods on the reference card would be nice. Being huge, and with the new fans, it might run on the warm side; if so, that's good, since better cooling might help more than on the cool-running, power-anorexic 10xx cards.
Being huge means it will be easier to cool. Heat dissipation ~ total area.
https://forums.guru3d.com/data/avatars/m/273/273822.jpg
Battlefieldprin:

Of all the Nvidia series I have seen, this is a strange one, since the 2080 will be equal to or even slower than the 1080 Ti. In the past, each generation used to bring twice the performance of the previous one!
Sure it has, captain.
Noisiv:

Being huge means it will be easier to cool. Heat dissipation ~ total area.
Kinda, yeah. A larger core can be cooled well enough with a good heatsink/cooler due to the larger contact area between the heatsink and the core.
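A back-of-the-envelope sketch of that area argument. The die sizes and board powers below are approximate public figures, not official thermal specs, and the point is only that a similar power budget spread over more silicon means lower heat flux at the cooler.

```python
# Back-of-the-envelope heat flux: roughly the same board power spread over a
# much bigger die means less heat per mm^2 of contact area with the cooler.
# Die sizes and board powers are approximate public figures, not official
# thermal specs.

def heat_flux(watts, die_area_mm2):
    return watts / die_area_mm2   # W per mm^2

cards = {
    "GTX 1080 (GP104, ~314 mm^2, ~180 W)": heat_flux(180, 314),
    "RTX 2080 (TU104, ~545 mm^2, ~215 W)": heat_flux(215, 545),
}

for name, flux in cards.items():
    print(f"{name}: ~{flux:.2f} W/mm^2")

# GTX 1080: ~0.57 W/mm^2
# RTX 2080: ~0.39 W/mm^2
```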
https://forums.guru3d.com/data/avatars/m/246/246564.jpg
I'll believe it when Hilbert has had a chance to run his own benchmarks. I've been around for way too long to fall for Nvidia marketing. If his benchies confirm their claims, by all means, I'll probably get one (or two).
data/avatar/default/avatar13.webp
Fox2232:

Well, you have seen it; they did not make that information up. You just have to interpret it correctly. The way I see it: they brought in a scenario where frame rendering takes under 3 ms and AA takes over 10 ms. That's because they managed to drop the cost of AA a lot. So if you are running games with high levels of AA and take a considerably big performance hit from it, then you'll see quite a high benefit. But if you are running low AA or post-processing AA (with nearly no performance impact), then you are not going to see the same improvement.
Well, apples to apples the improvement will be 30+%, which is OK, but lower than we are used to. Usually we see an improvement of at least 50% over the same model from the previous gen. Still, what do we have as an alternative? Even if AMD comes up with a GPU that is 50% faster than their previous fastest GPU, it will still be slower than the 2080 Ti :/
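To put rough numbers on the scenario Fox2232 describes above, here's a tiny sketch with hypothetical frame times: when AA is expensive relative to rendering, removing its cost looks like a huge gain; when AA is a cheap post-process pass, the gain mostly disappears. All values are illustrative only.

```python
# Hypothetical frame times mirroring the scenario quoted above: ~3 ms of
# rendering plus a ~10 ms AA pass, versus a cheap post-process AA pass.

render_ms = 3.0

for aa_name, aa_ms in [("heavy AA (~10 ms)", 10.0), ("cheap post-process AA (~0.5 ms)", 0.5)]:
    with_aa    = 1000.0 / (render_ms + aa_ms)
    without_aa = 1000.0 / render_ms          # what you get if the AA cost vanishes
    gain = (without_aa / with_aa - 1) * 100
    print(f"{aa_name}: {with_aa:.0f} fps -> {without_aa:.0f} fps (+{gain:.0f}%)")

# heavy AA (~10 ms): 77 fps -> 333 fps (+333%)
# cheap post-process AA (~0.5 ms): 286 fps -> 333 fps (+17%)
```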
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
Cool, that's a surprise. So that's showing a 40-60% increase in performance going from the GTX 1080 to the RTX 2080, with supposedly like-for-like settings. The DLSS portion of the chart is, I'm supposing, when turning off AA in-game and activating DLSS (which is AI-accelerated anti-aliasing, to my understanding), so that's showing how little the fps cost of DLSS AA is compared to whatever AA technique they had active in the 'control' group. DLSS allows for up to around 2x the performance compared to the GTX 1080, although we don't know the AA technique used in the control, so we don't know how fair a comparison it is. In my first sentence I wrote 40-60% increase in performance because that's supposedly comparing like-for-like game settings, so it shows the raw gaming power increase of Turing vs Pascal. I really wasn't expecting such a big increase; I thought it was only gonna be 25% max! Turing looks like a better buy now. We need to know more about the context of that graph, though, in terms of what settings were used; it's a little vague.
https://forums.guru3d.com/data/avatars/m/105/105985.jpg
We see 50% in Nvidia's slides every past generation, you mean, lol. It falls in line with the past, besides prices being outta line. Use smoke and mirrors and you can get it faster, that's a given, but details next month on how much better it really is. I hope they kiss Croteam's culo and gave them a shit ton of money and Volta workstation cards; 100k enemies on screen at one time is gonna need a bigger boat.
data/avatar/default/avatar33.webp
Paulo Narciso:

So with DLSS I get the same image quality as with Quantum Break?
What?
https://forums.guru3d.com/data/avatars/m/202/202673.jpg
I wonder if a 2070 will be able to run both RTX ray-tracing and DLSS at the same time if it's possible on the 2080Ti/2080, or even possible/necessary to begin with. 78T/60T vs 45T...seems to be rapidly decreasing with prices going to sane levels.
data/avatar/default/avatar07.webp
SmootyPoody:

Every AI graphics solution I know of has artifacts... Am I the only one thinking DLSS will introduce artifacts? I mean, compromises have to be made, as I see it.
Unless it's just BS and it's just regular post-AA blur plus a sharpening filter... which it might very well be, imo.