GeForce RTX 3090 Benchmarks Surface Online

How did this leaker's 3080 numbers compare to the actual tested numbers? We should probably take this with a truckload of salt until Hilbert or other validated sites post the real numbers. Maybe it's real, or maybe Nvidia released a gimped driver to keep their publishing dates and will release the real public driver a few days before the NDA lifts. I would wait before confirming the sky is falling. No matter how expensive Nvidia is, they do deliver on the performance.
If you count MHz and shader count, you get to about 17% extra power between the 3080 and 3090; taking into account that at 4K we see up to an 11.5% perf improvement, I'd say we are on point. Everyone knew from the start that the 3080 was the big deal, and even a 3080 Ti could offer you that extra 10% perf if you have the extra money and you're missing those 5 fps to get to 60 fps. You should be happy, not disappointed, that the reference card for gaming this year is either $500 or $700, instead of complaining that the $1,300 one is not fast enough.
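For anyone who wants to redo that back-of-the-envelope estimate, here is a minimal sketch of the shaders-times-clock arithmetic. The core counts and boost clocks are Nvidia's published figures and are only assumptions about real-world behaviour (sustained clocks differ), so treat the result as a theoretical ceiling; with these numbers it lands closer to 20% than 17%.

```python
# Naive "shader count x clock" scaling estimate between the 3080 and 3090.
# Core counts and boost clocks are Nvidia's published specs, used here purely
# as assumptions; sustained clocks in games will differ.

def fp32_tflops(cores: int, clock_mhz: float) -> float:
    """Theoretical FP32 rate in TFLOPS (2 FMA ops per core per clock)."""
    return cores * clock_mhz * 1e6 * 2 / 1e12

rtx_3080 = fp32_tflops(cores=8704, clock_mhz=1710)    # ~29.8 TFLOPS
rtx_3090 = fp32_tflops(cores=10496, clock_mhz=1695)   # ~35.6 TFLOPS

print(f"3080: {rtx_3080:.1f} TFLOPS, 3090: {rtx_3090:.1f} TFLOPS")
print(f"Theoretical uplift: {(rtx_3090 / rtx_3080 - 1) * 100:.1f}%")  # ~19.5%
```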
I'll wait to see Hilbert's benchmarks before I even consider making my mind up. These benches could be pure fantasy as far as we know. I'll stick with a bencher I trust.
David3k:

Why the hostility? I'm just stating what is apparent.
No hostility, just reality. You're saying that a company which gets to label its products and claim what they are is wrong about its own... products...
With these prices and this performance, how badly would AMD have to fuck up not to score a win here?
David3k:

Actually, the 3090 IS the Ti card, not a Titan replacement as nvidia claims. Not sure if they're still planning on releasing a Titan card, but if they do I suspect that it would be based on GA 100, and that it would be priced at 2k USD. Just an educated guess.
It's not a Ti.
Was not expecting those numbers, though I still think professional reviews will tell the truth. Even more reason to wait and see if Big Navi is going to come close to all the hype.
Fediuld:

If there is not a big performance gap, like 40%, between the 3080 and 3090, there won't be a 3080 Ti, while we already know there are two 3080 models with different VRAM. As for the physical die size, Nvidia cannot chop it as they see fit; it has to be done a certain way to work. Regarding DX12/Vulkan async compute for mGPU, weren't some in here saying it was pointless when Nvidia couldn't scale over 50% while AMD could do 100% scaling? At least until people bothered to run such benchmarks two years ago with a Vega 64 and 1080 Ti. How many games have we seen with async compute all those years? Barely a handful, and none in the last two years (the last one was the Resident Evil remaster). Nvidia just drops the support, like it did with the Nvision kit. And what happened to PhysX/Gameworks? When was the last time we saw a big game supporting it, and not some small studio making a C-rated game just to use the Nvidia grant?
Almost every DX12/Vulkan game uses async compute; there just aren't options to turn it on/off. PhysX is the default physics engine in most engines and just recently got an update (https://news.developer.nvidia.com/announcing-nvidia-physx-sdk-5-0/). Most, if not all, RTX games are using Gameworks libraries. Your posts continue to be rife with misinformation.
geogan:

It's funny that people don't believe it is only 10-20% faster, given the benchmarks and the hardware facts about shader/ROP counts. I don't understand how or why anyone thinks it could be any faster. And there's no way that an extra 14 GB of VRAM costs $850 ($1,500 - $650), so don't anyone say the price is justified because of the extra VRAM.
14 GB doesn't cost that much, but it lets the card punch up next to expensive Quadro units in certain workloads. To avoid cannibalizing those sales, they increase the price.
It'll probably be a lot more once it's overclocked; I reckon it's conservative due to temps and power. The Kingpin 2080 Ti was 30% faster than a standard 2080 Ti because they removed the power limits and ramped up the overclocks on it. A Kingpin 3090 will be a lot faster than this.
Huggi:

So I guess that means little chance of a 3080 Ti/Super variant since there's not enough performance difference to slot in another SKU between the two. My guess would be that the 20GB version of the 3080 is what will occupy the price gap between the standard 3080 and the 3090.
There still might be one, but only with more VRAM (and not necessarily with more memory bandwidth, but maybe) and maybe higher clock speeds. I've mentioned before that there's a large market out there demanding more VRAM (regardless of whether it is actually needed), so if Nvidia is smart, they'll tap into it.
lukas_1987_dion:

More VRAM is not about having more performance, just fewer stutters and smoother gameplay, as the game can use much more VRAM for caching, which is much faster than system RAM.
Yes and no - the stuttering happens when the game needs to unload a buffer/cache and swap in new assets. But if a game actually demands more VRAM than your GPU can supply (in other words, there isn't even room for a buffer), then it will need to mooch off your system memory via the PCIe bus. This can cause a dramatic loss in performance, depending on how much more it needs.
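To put rough numbers on why spilling over the PCIe bus hurts so much, here is a back-of-the-envelope sketch comparing a 3080's local VRAM bandwidth with a PCIe 4.0 x16 link. The bandwidth figures are theoretical peaks and the 512 MB asset chunk is a made-up example, both used purely for illustration.

```python
# Rough comparison: streaming assets from local VRAM vs. spilling over PCIe
# into system RAM. Bandwidth values are theoretical peaks, not sustained rates.

VRAM_BW_GBS   = 760.0   # RTX 3080 GDDR6X: 19 Gbps x 320-bit / 8
PCIE4_X16_GBS = 31.5    # PCIe 4.0 x16, one direction, theoretical peak

asset_mb = 512          # hypothetical chunk of textures that no longer fits in VRAM

vram_ms = asset_mb / 1024 / VRAM_BW_GBS * 1000
pcie_ms = asset_mb / 1024 / PCIE4_X16_GBS * 1000

print(f"Reading {asset_mb} MB from VRAM: ~{vram_ms:.2f} ms")
print(f"Pulling {asset_mb} MB over PCIe: ~{pcie_ms:.1f} ms")  # roughly 24x slower
```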
geogan:

It's funny that people don't believe it is only 10-20% faster, given the benchmarks and the hardware facts about shader/ROP counts. I don't understand how or why anyone thinks it could be any faster.
I expected it to be a little faster (up to 5% faster), but it's obvious that this GPU was not built with 1080p in mind. It seems to fare much better in 4K due to the extra cores and memory bandwidth.
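The memory bandwidth part of that is easy to check from the published bus widths and GDDR6X data rates; a minimal sketch follows, with the caveat that these are peak figures and effective bandwidth depends on the workload.

```python
# Peak memory bandwidth from data rate and bus width, using the published
# GDDR6X specs for both cards; effective bandwidth in games will be lower.

def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

bw_3080 = mem_bandwidth_gbs(19.0, 320)   # 760 GB/s
bw_3090 = mem_bandwidth_gbs(19.5, 384)   # 936 GB/s

print(f"3080: {bw_3080:.0f} GB/s, 3090: {bw_3090:.0f} GB/s "
      f"(+{(bw_3090 / bw_3080 - 1) * 100:.0f}%)")  # about +23%
```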
Basically it would mean that if you're not overclocking the 3090, you could get exactly the same performance by buying a custom factory-overclocked 3080 costing $600 less. Lower-tier 3090, the worst deal ever? Really wondering when the $1,000-1,200 3080 Ti (the naming has been confirmed by the Gigabyte leak, after all) is going to be released/announced. When is the next Nvidia event/conference?
XenthorX:

Basically it would mean that if you're not overclocking the 3090, you could get exactly the same performance by buying a custom factory-overclocked 3080 costing $600 less. Lower-tier 3090, the worst deal ever?
Kinda no different than the Titans honestly. It's more or less a branding problem. Idk, I'm convinced Nvidia pulled a last minute switcharoo with these cards. I think the 3080 was originally supposed to be a 3080Ti, the 3070 was supposed to be the 3080 and the 3090 was supposed to be a Titan priced at $2000-3000. The entire series makes more sense when you think about it this way. Why is a 3080 a GA102 and not GA104? Why does the '3090' even exist? I think Nvidia had these things in production, realized AMD was going to be more competitive than they first imagined, then shifted everything around.
Disappointing if true
The change in how SKU naming maps to consumer product naming is indeed intriguing, with 102 usually referring to higher-tier chips (Ti, Titan and such) and 104 to the -80 cards and below. Alongside the push in power consumption, and therefore the limited overclocking headroom left for 2x PCIe // 380W cards. @Denial In Jen-Hsun Huang's own words: "competition set the price" (33:28) [youtube=Xn1EsFe7snQ]
I think AMD strategically set their reveal events after Nvidia's: the next time Jen-Hsun Huang takes the stage will be 5 October for GTC (5-10 October), and after that it won't be before mid/late November. Was Nvidia's unexpected pre-emptive move toward AMD to leave this insanely wide $700-1,500 product range open? October is gonna be all about AMD, if anyone ever had a doubt.
Denial:

Kinda no different than the Titans honestly. It's more or less a branding problem. Idk, I'm convinced Nvidia pulled a last minute switcharoo with these cards. I think the 3080 was originally supposed to be a 3080Ti, the 3070 was supposed to be the 3080 and the 3090 was supposed to be a Titan priced at $2500-3000. The entire series makes more sense when you think about it this way. Why is a 3080 a GA102 and not 104? Why does the 3090 even exist? I think Nvidia had these things in production, realized AMD was going to be more competitive than they first imagined, then shifted everything around.
That would also explain the low cost of these cards (if you compare them to the 2xxx series release prices). Let's hope AMD has something good; we need competition.
Lol.. I was holding out for the 3090, but a 10% increase in gaming over the 3080? Looks like I need to wait and see what AMD has to offer for sure. If the 3090 was 20-25% faster, then I would get it for sure. But a measly 10% is not worth double the price (from a gaming perspective). If AMD has something that is faster than the 3080, then the GPU battle for the next few months will be really interesting!
TheSissyOfFremont:

I think DLSS is fantastic - but it's certainly not as good as native. It can definitely have its uses though - on my 2070 Super, I've got Death Stranding running at 4K, DLSS quality, output to my 1440p monitor with NV's sharpening filter on. It looks better than 1440p native and is rock solid at 60 fps. I would have thought it would be a massive draw for the consoles - 3-4 years into their life cycle, you start to use DLSS so that you can keep implementing the newer engine tech but offset the performance hit. I'm really interested to see if they can continue to improve on it - is 3.0 going to be as much of an improvement over 2.0 as 2.0 was over 1.0?
Ehm... how are you running 4K output to a 1440p monitor through DLSS? Seems like you are mixing up the terms. 4K output to a 1440p monitor = downsampling. And yes, downsampling looks better than native resolution. DLSS is used to upscale an image, not downsample it, i.e. take a 1440p render and upscale it to 4K, not the other way around.
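To keep the directions straight, here is a minimal sketch of what DLSS "Quality" mode does at a 4K output target. The roughly 2/3 per-axis render scale is the commonly cited value for Quality mode and is an assumption here; sending that 4K frame to a 1440p panel is then a separate downsampling step.

```python
# DLSS upscales: the game renders internally at a lower resolution and the
# output is reconstructed at the target resolution. The Quality-mode scale
# factor of ~2/3 per axis is the commonly cited value, assumed here.

DLSS_QUALITY_SCALE = 2 / 3          # per-axis internal render scale (assumption)

output_w, output_h = 3840, 2160     # 4K frame the game hands to the display pipeline
render_w = round(output_w * DLSS_QUALITY_SCALE)   # 2560
render_h = round(output_h * DLSS_QUALITY_SCALE)   # 1440

print(f"Internal render {render_w}x{render_h} -> DLSS upscale -> {output_w}x{output_h}")
# Fitting that 4K output onto a 1440p monitor (via DSR or the driver) is downsampling.
```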
mattm4:

Lol.. I was holding out for the 3090, but a 10% increase in gaming over the 3080? Looks like I need to wait and see what AMD has to offer for sure. If the 3090 was 20-25% faster, then I would get it for sure. But a measly 10% is not worth double the price (from a gaming perspective). If AMD has something that is faster than the 3080, then the GPU battle for the next few months will be really interesting!
Double the price is simply down to the amount of VRAM. If it had 10 GB it would probably cost $900-1,000.
TheSissyOfFremont:

DSR. Desktop res set to 5K in NVCP --> game is set to 4K (its max) --> DLSS quality mode. Looks great and runs at a solid FPS. Best combination for IQ and performance I've found.
Sigh... then you are downsampling with DSR, not upscaling with DLSS...