NVIDIA GeForce RTX 5090 and RTX 5080 Specifications Revealed

https://forums.guru3d.com/data/avatars/m/246/246564.jpg
They might, no reason why they wouldn't, considering how stripped down it's going to be compared to a 5090. I still think it'll be priced at the original 4080 MSRP, but maybe even as low as the Super. It certainly has to be attractive enough to take the load off the 5090, because even if they price the 5090 as a workstation accelerator, it'll still fly off the shelves.
data/avatar/default/avatar03.webp
The 5080 is *not* more "power efficient", it just uses less power for less performance. My undervolted and power limited 4090 + 13900K run slightly above stock performance using a 750 W PSU btw, even while running Furmark + Prime95. That's not the norm, of course, but it's also not like everyone *has* to make full use of the 600 W power-limit.
https://forums.guru3d.com/data/avatars/m/169/169351.jpg
In before Nvidia buyers cry "we need competition" just so that Nvidia lowers their prices to match... (where they'll still buy Nvidia rather than buying the competition).
https://forums.guru3d.com/data/avatars/m/242/242134.jpg
Love how ppl think they get to determine what specs/tier products should have, or what price range they should be in. Ignoring for a moment that nobody here has a clue about the full 5xxx lineup yet, I see the gap between the 80 and 90 as room for a Ti to be the top gaming card, if anyone else gets close in perf. Besides, there are ppl with old hw who save money for years to buy a high(er) end tier so they can use it for a couple of years; not everyone buys stuff just to have the latest and greatest. That's what I did with my 2080S WB that cost almost 1k, but it allowed me to use it much longer than expected/planned, as the LC allowed for full boost clocks more or less continuously.
https://forums.guru3d.com/data/avatars/m/275/275921.jpg
Truder:

In before Nvidia buyers cry "we need competition" just so that Nvidia lowers their prices to match... (where they'll still buy Nvidia rather than buying the competition).
AMD hasn't been a better deal for a while now; I don't understand why there's anything wrong with wanting Nvidia gear at lower prices.
https://forums.guru3d.com/data/avatars/m/301/301273.jpg
The RTX 5080 looks like a joke now. With a 256-bit bus and 16GB of VRAM, it's literally HALF of a 5090 with its 512-bit bus and 32GB of VRAM.
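As a back-of-the-envelope check of the "literally half" claim: if you assume (hypothetically, since launch memory clocks weren't confirmed in these leaks) that both cards run GDDR7 at the same data rate, the bus widths alone put the 5080 at exactly half the memory bandwidth:

```python
# Rough memory-bandwidth sketch: bandwidth (GB/s) = (bus width in bits / 8) * data rate in Gbps.
# The 28 Gbps GDDR7 data rate below is an illustrative assumption, not a confirmed spec.
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

rtx_5090 = mem_bandwidth_gbs(512, 28)  # 1792.0 GB/s
rtx_5080 = mem_bandwidth_gbs(256, 28)  # 896.0 GB/s
print(rtx_5090, rtx_5080, rtx_5080 / rtx_5090)  # ratio is exactly 0.5
```

At any shared data rate the ratio stays 0.5, since bandwidth scales linearly with bus width; a higher clock on the 5080 would narrow the gap somewhat.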
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
fantaskarsef:

Yeah, the 4090 also has a 450W TDP and I haven't seen it hit that in maxed-out gaming yet. But I did UV my card a little, I have to admit. I doubt the PCB will be the same; they will deliberately change it, even if they didn't need the extra power-delivery hardware that 600W demands, probably with a 2nd connector on there.
I've had mine very close to 600W; I think well over 450 in CP2077, certainly over 500 in benchmarks.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
GlassGR:

Could you please elaborate on why you think it is too much to ask for?
I don't know the answer to this, but when was the last time a next-gen PCB could use the water blocks from a previous gen? I can't think of any.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
pegasus1:

I've had mine very close to 600W; I think well over 450 in CP2077, certainly over 500 in benchmarks.
Well, I could see that happening when pushing for max frames, not undervolting, no DLSS / FG (which for me massively reduces power consumption when activated). We'll see, as I plan to revisit Night City soon, with my new 4K HDR display and all 😀
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
I think there will be some nice deals on the used market and on the 4080. Maybe I could snag one at $600 or so after the 5080 is released.
https://forums.guru3d.com/data/avatars/m/169/169351.jpg
AuerX:

AMD hasn't been a better deal for a while now; I don't understand why there's anything wrong with wanting Nvidia gear at lower prices.
That may be so, but my point is about the buying practices of a certain segment of, shall we say, loyalists? Not the general sentiment of wanting lower prices (which we all want, really).
https://forums.guru3d.com/data/avatars/m/268/268248.jpg
Wasn't the 1060 6GB about -50% in specs compared to the 1080? Let that sink in. Edit: @wavetrex did the math and proved my point before me, I see!
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
You guys remember the time when the 3060 Ti matched the 2080 Super in performance? That's the x60 card matching the x80 of the previous gen. You think the 5060 Ti will be as fast as the 4080 Super? 😀
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Truder:

(Where they'll still buy Nvidia rather than buying the competition).
Name a GPU manufacturer competing with NVIDIA.
https://forums.guru3d.com/data/avatars/m/248/248994.jpg
Undying:

You guys remember the time when the 3060 Ti matched the 2080 Super in performance? That's the x60 card matching the x80 of the previous gen. You think the 5060 Ti will be as fast as the 4080 Super? 😀
No, that's not going to happen again any time soon. My 4070 can barely match a 3080; that's the new norm. It would take a real technological breakthrough for consumers to be served such sweet development again. Even if AMD could actually compete, I don't believe it would happen just because of that. Maybe in a healthier market the 5070 would match the 4080 Super, but as it is, I have a feeling the 5070 will have a hard time reaching even the regular 4080. The 5060 Ti might be a little more powerful than the 4070.
data/avatar/default/avatar07.webp
Honestly? Kind of underwhelming considering the leaks claimed almost 25k cores, but at least the memory bandwidth will be almost 50% higher, and so will the RT and Tensor cores. Looking forward to the price bomb; I'm expecting nothing less than 2 thousand euros...
data/avatar/default/avatar13.webp
pegasus1:

I've had mine very close to 600W; I think well over 450 in CP2077, certainly over 500 in benchmarks.
At stock settings? Not ridiculously overclocked? Mine delivers stock performance in CP2077 undervolted (+VRAM OC) with a 360W power limit and a 1.0V voltage cap. For 25-35% more power I would get 8-10% more stable (!) performance; that's still below 500W, end of the line.
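Taking the poster's own figures at face value (360W baseline, and using the midpoints of the quoted ranges: +30% power for +9% performance), the efficiency cost of that last bit of performance works out roughly like this:

```python
# Perf-per-watt sketch using the midpoints of the figures quoted above
# (360W undervolted baseline; ~30% more power buys ~9% more performance).
def perf_per_watt(relative_perf: float, watts: float) -> float:
    return relative_perf / watts

base = perf_per_watt(1.00, 360)           # undervolted baseline
pushed = perf_per_watt(1.09, 360 * 1.30)  # ~468W for ~9% more performance
print(f"efficiency loss: {1 - pushed / base:.1%}")  # roughly 16% worse perf/W
```

This is the usual shape of the voltage/frequency curve: the last few percent of performance cost disproportionately more power, which is why undervolting tends to be such a cheap win.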
https://forums.guru3d.com/data/avatars/m/229/229509.jpg
valentyn0:

"RTX 5090 maintains a thermal design power (TDP) of 600 watts, indicating efficient power management relative to its computational power. This balance of high memory bandwidth and efficient power usage underscores NVIDIA's focus on delivering powerful yet manageable GPU solutions." That's the most underwhelming description of the upcoming Blackwell GPU when it comes to efficiency. 600W is not efficient by any stretch of the imagination, people! No matter how much memory they plastered on there and how many cores, there is no way it's efficient to consume twice what normal high-end video cards did a couple of generations ago, and it should be unacceptable to even entertain the idea of owning this product! Why don't Intel or AMD do this? Well, you know why: there is no point in having a single component, ONLY one in the whole PC, consume more than 90% of what all the other components do. That's why Nvidia has stopped being a REASONABLE GPU company, producing an abomination like this.
It was already getting silly by 2016; now we're at double this. To my mind, the max acceptable power draw for even the highest-end GPU should be 300W. CPUs are the same (at least on Intel's side). It's getting nuts; power efficiency thrown out the window to eke out every last drop of performance. I may have a 1500W power supply (have had for years), but I wouldn't dream of buying anything this power hungry! https://www.karlrupp.net/wp-content/uploads/2013/06/tdp.png
https://forums.guru3d.com/data/avatars/m/186/186805.jpg
wavetrex:

If it turns out to be true, the 50"8"0 is the worst joke ever from nVidia. Never in their entire history has the 2nd-place part been cut down to less than 50% of their flagship. With AMD exiting the high end, the next couple of years will be complete, absolute milking from the monopoly company Omni Consumer Products... I'm sorry, from nVidia Corp.
For certain, on paper this does look MASSIVELY cut down from the 5090. I believe this just gives them room to insert a 5080 Super and a 5080 Ti. I expect the 5090 to retail at the $2k mark with AIBs coming in around the $2100-$2500 range, and the 5080 (if these specs are true and it performs how you think) to actually see a price cut from the previous generation, from $1200 to $1000. Nvidia can tout this as a gen-on-gen decrease and talk the 5090 up as so insanely overpowered that it's actually worth the price increase. What this does is leave a huge $1000 gap for a 5080S to be inserted into the $1200-$1500 range and a 5080 Ti into the $1600-$1900 range. It makes sense; if these specs are true, it could turn out like this:

5090: $2000 MSRP, 21K shaders, 512-bit bus, 32GB VRAM
5080 Ti: $1500 MSRP, 15-16K shaders, 384-bit bus, 24GB VRAM (with increases in TMU/ROP and shader count)
5080S: $1200 MSRP, 12-13K shaders, 320-bit bus, 20GB VRAM (with increases in TMU/ROP and shader count)
5080: $1000 MSRP, 10K shaders, 256-bit bus, 16GB VRAM

This also lets them segment the different SKUs more precisely by giving them more control over how each performs: no matter what you do with the 5080 Ti, you will never be able to overclock it anywhere near the 5090, and the same goes for the other SKUs; they can be perfectly segmented. And now Nvidia has a total of 4 SKUs ABOVE the $1000 price tag, further cementing their brand as "premium". This still keeps the 5090 as the halo SKU while capitalising on people complaining about how Nvidia skimps on VRAM, and allows Nvidia to completely dominate the high-end market. Then an 8GB 5060 can be $399 and a 12GB 5070 $699, leaving room for a 12GB 5060S/Ti, and room between the 5070 and 5080 for a 16GB 5070S/Ti.
https://forums.guru3d.com/data/avatars/m/246/246088.jpg
fantaskarsef:

Well, I could see that happening when pushing for max frames, not undervolting, no DLSS / FG (which for me massively reduces power consumption when activated). We'll see, as I plan to revisit Night City soon, with my new 4K HDR display and all 😀
Go to Dogtown, it's pretty impressive as far as the visuals go.