GeForce RTX 3090 Benchmarks Surface Online

So I guess that means little chance of a 3080 Ti/Super variant since there's not enough performance difference to slot in another SKU between the two. My guess would be that the 20GB version of the 3080 is what will occupy the price gap between the standard 3080 and the 3090.
Wow, so if these benchmarks are true, that's a 10% improvement for the RTX 3090 over the RTX 3080 for almost double the cost? What a bargain! Seriously though, this is a card for creators, not gamers, due to the amount of VRAM, but it does suggest that if there were an RTX 3080 Ti, its performance uplift over the RTX 3080 would be similarly disappointing. Paying, say, $300 more for a card with just 10% more performance (if the 3080 Ti were priced at $999) is not good value, although it would likely have more VRAM to offset the extra cost, and support SLI for those who still care about it.
Hey everyone! Remember when Nvidia decided to get rid of SLI? This will not end well.
Huggi:

So I guess that means little chance of a 3080 Ti/Super variant since there's not enough performance difference to slot in another SKU between the two. My guess would be that the 20GB version of the 3080 is what will occupy the price gap between the standard 3080 and the 3090.
Actually, the 3090 IS the Ti card, not a Titan replacement as Nvidia claims. Not sure if they're still planning on releasing a Titan card, but if they do, I suspect it would be based on GA100 and priced at around $2,000. Just an educated guess.
10% over the 3080 would be ridiculous. That would be in cherry-picked, overclocked chip territory. Not that it would be relevant for me personally, but in any case I'll wait for a Guru3D review before really believing it. If it's true, the GPU wasn't seriously meant for gamers, like AlexM said. It would basically mean paying a huge premium for the extra memory alone.
In Horizon Zero Dawn I already get 60-75 fps at 4K with my 2080 Ti. I'm not shelling out another 500-600 euros, and certainly not 1,500 euros, for a 15-25 fps increase. They can keep their "new" cards; I was expecting way more from the 3090. I mostly play at 1440p anyway, and the game looks about the same while running at 95-120 fps, mostly around 100 fps. I only tested it at 4K yesterday to compare numbers against the newer cards. Not worth the upgrade. It's the same "upgrade" as every other one, 20-25% like always. F that!
Man, that extra 14GB of memory sure is helping a lot at 4K in demanding titles, like everyone keeps insisting it would... 3070s with 16GB and 3080s with 20GB are so totally needed, and 10GB? Yeah... definitely not enough... https://www.writerscookbook.com/wp-content/uploads/2017/03/how-to-write-sarcasm.png
David3k:

Actually, the 3090 IS the Ti card, not a Titan replacement as nvidia claims.
So... you disbelieve what a company states, when it's said company's job to create and maintain their product stack and call their products what they want? Let me guess, the next Samsung phone won't be a Samsung phone, it'll be whatever you want to call it, because you're the consumer, so you must be right, right?
More VRAM is not about more performance, just fewer stutters and smoother gameplay, since the game can use much more VRAM as a cache, which is much faster than system RAM. That said, 24GB is overkill even for 2021 gaming, I'm sure; 12 to 16GB of VRAM is just right for 4K and mods in future games. Btw, if the 3090 is really just 9% faster than the 3080... no words lol
Aura89:

Man, that extra 14GB of memory sure is helping a lot at 4K in demanding titles, like everyone keeps insisting it would... 3070s with 16GB and 3080s with 20GB are so totally needed, and 10GB? Yeah... definitely not enough... https://www.writerscookbook.com/wp-content/uploads/2017/03/how-to-write-sarcasm.png So... you disbelieve what a company states, when it's said company's job to create and maintain their product stack and call their products what they want? Let me guess, the next Samsung phone won't be a Samsung phone, it'll be whatever you want to call it, because you're the consumer, so you must be right, right?
Why the hostility? I'm just stating what is apparent. And besides, they have a history of designing a GPU for a certain tier of performance within their own product stack and then selling it above that, as an x80-series part. Also, lying to consumers isn't exactly new to Nvidia, let's be honest, as it is with every other company. They're completely within their rights to market and sell whatever they have as whatever part they want to name it. I don't begrudge or hold it against them; I mean nothing negative by my claims, just pointing out that there isn't going to be a later Ti part for the 3080, because we already have it in the 3090. It's a good marketing move. We WILL see a Titan, though, and that will be a monster of a card. EDIT: I understand why you don't know who I am, I rarely actually post. : )
Everyone is like: "Meh, too expensive for 10%." The price is on point, guys; most of the extra you pay is for the 24GB of RAM. Don't you worry, those waiting for the RTX 3080 20GB: it will set you back $1,100-1,200.
jbscotchman:

I don't think it was so much Nvidia, but developers.
Fair comment - but I mention it because Nvidia seem to not only want all of the money in our wallets for a 10% gain in perf... they also want to begin locking out the custom nature of owning a PC. You can use NVLink, but... of course... ONLY if you are using the 3090. A bitter pill to swallow, just because someone has some kind of inferiority complex, needs to buy more leather jackets, and tries oh... so... hard to be regarded the way Steve Jobs was. Remember that speech LJJ gave that was taken literally word-for-word from a speech Steve Jobs once gave? Fun times.
Davud:

Nothing surprising really. The 3090 has 10496 shader units, the 3080 has 8704: that is about 20% more. The 3090 has 112 ROPs, the 3080 has 96: that is 16.7% more. The only meaningful difference is the memory.
I was about to comment the same thing. But I can imagine that in certain workloads the card will be that 20% faster, though most likely not in games. This is the full chip, so it's also the Titan.
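(Side note: those percentage deltas are easy to sanity-check. Below is a minimal sketch in Python; the shader/ROP/VRAM counts are taken from the post above, and the script is purely illustrative, not from the article's test setup.)

# Rough spec-delta check for the figures quoted above (RTX 3090 vs RTX 3080).
# Counts come from the post; a larger spec delta does not guarantee the same
# uplift in games (clocks, power limits and memory bandwidth also matter).
specs_3080 = {"shaders": 8704, "rops": 96, "vram_gb": 10}
specs_3090 = {"shaders": 10496, "rops": 112, "vram_gb": 24}

for key in specs_3080:
    delta = (specs_3090[key] / specs_3080[key] - 1) * 100
    print(f"{key}: +{delta:.1f}%")

# Prints roughly:
#   shaders: +20.6%
#   rops: +16.7%
#   vram_gb: +140.0%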
AMD really have a chance here. If they catch up to the 3080, it seems like the 3090 isn't far off. 60% faster than the 5700 XT is not impossible.
Undying:

AMD really have a chance here. If they catch up to the 3080, it seems like the 3090 isn't far off. 60% faster than the 5700 XT is not impossible.
I'm choosing not to expect anything. That way, if it actually turns out good, I'll be pleasantly surprised. If not, I can buy a 2070 Super for pretty cheap soon. (Currently on a 1070.)
@David3k Just because the 3090 is using the full chip doesn't automatically mean it eliminates a Ti, especially since we don't know how many chips are "defective" (as in not full chips). @Loobyluggs You don't need the connector for SLI, as it's now going to be done in-game, so not sure why you state people have to buy a 3090 to do "SLI".
It's funny that people don't believe it is only 10-20% faster, given the benchmarks and the hardware facts about the numbers of shaders/ROPs. I don't understand how or why anyone thinks it could be any faster. And there's no way an extra 14GB of VRAM costs 850 (1500 - 650), so don't anyone say the price is justified by the extra VRAM.
angelgraves13:

Very few DLSS supported games thus far. We'll see what happens.
DLSS looks like shit anyways, nothing like the native image - very clearly upscaled, with this oversharpened look to try and compensate for the blurry image you get from upscaling... 1440p upscaled to 8K would look like... 1440p with a sharpening filter.
ITGuru:

This gives me more reason not to prostitute my 2080 Ti but calmly wait for the 3080 Ti variant to come out, and then carefully consider an upgrade. Forget the hype... use logic, facts, and critical thinking. I advise existing owners of the 2080 Ti to do the same: remember how much you paid for your cards before you make a decision to sell them and upgrade. Might even snag another 2080 Ti and SLI them, considering that you can't do that with 3080s.
There won't be a 3080 Ti if the 3080-to-3090 gap is less than 20%. Nor a 3080 Super. The die, as we already know, is the full-fat one, cut down for the 3080 because of yield issues (it's physically too big). 2021 brings the new MCM chiplet architectures from both companies.
fry178:

@David3k Just because the 3090 is using the full chip doesn't automatically mean it eliminates a Ti, especially since we don't know how many chips are "defective" (as in not full chips). @Loobyluggs You don't need the connector for SLI, as it's now going to be done in-game, so not sure why you state people have to buy a 3090 to do "SLI".
If there is not a big performance gap, like 40%, between the 3080 and 3090, there won't be a 3080 Ti, while we already know there are two 3080 models with different VRAM. As for the physical die size, Nvidia cannot chop it as they see fit; it has to be done in a certain way to work. Regarding DX12/Vulkan async compute for mGPU, wasn't someone in here saying it was pointless when Nvidia couldn't scale over 50% while AMD could do 100% scaling? At least until people bothered to run such benchmarks two years ago with the Vega 64 and 1080 Ti. How many games have we seen with async compute all those years? Barely a handful, and none in the last two years (the last one was the Resident Evil remaster). Nvidia just drops the support, like it did with the Nvision kit. And what happened to PhysX/GameWorks? When was the last time we saw a big game supporting it, and not some small studio making a C-rated game just to use the Nvidia grant?
How is it 19.8% faster on average when the biggest single gain is 11.5%? Did you mean 9.8%?
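(A quick sketch of why that 19.8% figure can't be an average: the mean of per-game uplifts is bounded by the largest single uplift. The numbers below are made-up placeholders for illustration, not the article's data.)

# Hypothetical per-game uplifts (percent), for illustration only.
uplifts = [9.5, 10.2, 11.5, 8.7]

avg = sum(uplifts) / len(uplifts)
print(f"average: {avg:.1f}%  (best single result: {max(uplifts):.1f}%)")

# The average can never exceed the maximum, so a 19.8% average is impossible
# when the best individual result is 11.5%; 9.8% looks like the intended figure.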