Rumored NVIDIA RTX 5090 Specs: Cores, GDDR7 Memory, and Bandwidth

pegasus1:

Graphics cards seem expensive, until you realise a Starbucks a day is over £1800 a year.
Multiply that by 2 if your spouse is a heavy caffeine drinker too. We decided to go with the Philips LatteGo 4300 this Black Friday, although the 3200 model was on sale for $549 and does the same thing with just 1 profile vs 3. I wonder how much gamers spend on those poisonous Red Bulls annually?
aufkrawall2:

If you pay 5 bucks per day for coffee, you either can afford several 4090s per year, or you got some serious money spending issue.
Starbucks often goes for more; if you get the venti with flavors and an extra shot it can easily add up to $8 before VAT, and then there's a tip if you're generous lol. They do give you Starbucks points though: 1 point for every $1 spent.
Cave Waverider:

At least one can easily make their own high-end coffee with beans and such, but one can't do that with high-end video cards.
True, but if you are eyeing the 5090 you most likely have a high-end card like the 4090, which doesn't become a paperweight and will sell on the second-hand market, offsetting the impact of inflation, in my experience.
i look forward to seeing the prices vs performance gains. if 40xx is any indication, o_O
tsunami231:

i look forward to seeing the prices vs performance gains. if 40xx is any indication, o_O
With talk of next-gen consoles on the rise, and Intel's GPU succession with Battlemage possibly eating into Nvidia's mid-to-low-end margins, I believe Nvidia will attempt to distance itself as far as it can from Intel and the next-gen consoles (which are rumored to have AI capabilities) in terms of performance, to justify sky-is-the-limit premiums. For Nvidia to have a 4090-like success, the 5090 needs a similar performance delta, imo.
The card is little more than a speculative rumour and already people are crying over prices. Keyboard warriors getting upset because they want something they can't have for reasons they refuse to accept.
pegasus1:

The card is little more than a speculative rumour and already people are crying over prices. Keyboard warriors getting upset because they want something they can't have for reasons they refuse to accept.
Hey, there's always RDNA5, rumored for 2025.
Krizby:

5090 + 4K 240hz OLED are gonna be awesome combo for 2024-2025 😎
Yeah, I'm waiting for 2025 to upgrade my 3.5-year-old 48-inch CX OLED, for DP 2.1 4K 240 Hz displays to flood the market in time for Blackwell. I wonder if Nvidia will upgrade the G-Sync module with DP 2.1 by then.
TLD LARS:

RTX 4070: GDDR6X, 192-bit bus, 504 GB/s bandwidth
6900 XT: GDDR6, 256-bit bus, 512 GB/s bandwidth
Vega 64: HBM, 2048-bit bus, 480 GB/s bandwidth
Bus width is not the only determining factor for how fast the memory is, and memory speed is not even the main factor for render speed. Get a Radeon VII if you want a wide bus and lots of memory bandwidth.
I know that, and the bigger L2 cache makes up for the deficit to some degree, but we still went from the 352-bit GTX 1080 Ti and 320-bit RTX 3080 to the 256-bit 4080, and now possibly a 192-bit 5080. Nvidia seems to be knocking them down a tier again. Another example: the 4070 Ti is generally as fast as or very close to the 3090 Ti at 1080p and 1440p, but falls behind at 4K due to the lack of VRAM bandwidth and the 48 MB L2 cache not being quite enough for 4K. Or it could be due to being ROP/TMU-limited, since it only has 80 ROPs and 240 TMUs versus 112 ROPs and 336 TMUs on the 3090 Ti.
TheDigitalJedi:

After reading the projected specs of the 5090, I can't wait to test this card!!! Possibly over 26,000 cores and 32 GB of GDDR7 memory at 1,536 GB/s??? They're possibly sticking with a 384-bit bus? I wonder why the hesitation with 512-bit, if there is any. Nonetheless, this card seems like it is going to deliver huge increases like the 4090 did. These cards delivered in spades and then some; over a year later we are still seeing performance lifts. Although expensive, the 4090 will be one of my favorite cards I've ever owned. I know performance projections are always questionable, but the 4090 projections were on point before release. If the 5090 has 2x the performance of Lovelace, that's going to be insane! Bring it on!
I think even the highest-end Zen 5 and 15th Gen Intel CPUs around that time will likely still hold back the 5090 in many games if you run below 4K. RT can be very taxing on the CPU as well, and a lot of games today don't seem to be coded that efficiently on the CPU side anymore.
People who are non-smokers always tell me, "With the amount of money you throw away on cigarettes you could buy a luxury car in three years," and I always tell them: so why are you driving around in a VW and not a Bentley? If you don't spend money on coffee, you will more than surely find other uses for it, and you still won't be putting away money for an expensive card.
DonMigs85:

I think even the highest-end Zen 5 and 15th Gen Intel CPUs around that time will likely still hold back the 5090 in many games if you run below 4K. RT can be very taxing on the CPU as well, and a lot of games today don't seem to be coded that efficiently on the CPU side anymore.
That's why you have frame generation: to reduce the CPU bottleneck.
pegasus1:

Graphics cards seem expensive, until you realise a Starbucks a day is over £1800 a year.
Ya, but what if you can't afford Starbucks... lol ('ME'). Shoot, I'm hopin' I win that 4070 in the X-mas contest.
Damn, that is going to be a big boy. Thankfully I have a huge case 😀
For that price they should make an HBM version. 512-bit GDDR will probably drain 120 W; HBM would drop the TDP by a lot, but I guess that's enterprise/HPC only now.
DonMigs85:

I know that, and the bigger L2 cache makes up for the deficit to some degree, but we still went from the 352-bit GTX 1080 Ti and 320-bit RTX 3080 to the 256-bit 4080, and now possibly a 192-bit 5080. Nvidia seems to be knocking them down a tier again. Another example: the 4070 Ti is generally as fast as or very close to the 3090 Ti at 1080p and 1440p, but falls behind at 4K due to the lack of VRAM bandwidth and the 48 MB L2 cache not being quite enough for 4K. Or it could be due to being ROP/TMU-limited, since it only has 80 ROPs and 240 TMUs versus 112 ROPs and 336 TMUs on the 3090 Ti.
If memory bandwidth goes from 1000 GB/s on the 4090 to 1500 GB/s on the 5090 with the same bus width, then a 192-bit bus like the 4070's, at 500 GB/s today, will hit around 750 GB/s with GDDR7, about the same as a 4080. The 5080 would then have the same memory bandwidth as the 4080, which is not the end of the world if the chip and cache can take over and deliver the performance increase needed between generations. Just like the 4070 and 4070 Ti have identical memory speeds but different performance, a 4080 and 5080 can easily have the same. Same with the 3090 Ti and 4090 having identical memory speeds but a good difference in total performance. The 4070 falling behind at 4K cannot be fixed with memory bandwidth; multiple games are already at 10 GB of memory usage at that point, so a 4080 is needed if further improvements are wanted.
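The arithmetic in the post above is just bus width times per-pin data rate. A quick sketch; note the 32 Gbps GDDR7 rate and the 192-bit 5080 are assumptions taken from the rumors, not confirmed specs:

```python
# Peak memory bandwidth: (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps.
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbps(384, 21.0))   # RTX 4090, 21 Gbps GDDR6X -> 1008.0 GB/s
print(bandwidth_gbps(256, 22.4))   # RTX 4080, 22.4 Gbps GDDR6X -> 716.8 GB/s
print(bandwidth_gbps(384, 32.0))   # rumored 5090, assuming 32 Gbps GDDR7 -> 1536.0 GB/s
print(bandwidth_gbps(192, 32.0))   # hypothetical 192-bit 5080 -> 768.0 GB/s
```

The same formula shows why a 256-bit GDDR7 card at 32 Gbps would land at 1024 GB/s, roughly the 4090's bandwidth.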
So because of the new memory standard, GDDR7, Nvidia has the right to reduce the bus width every generation? What happens when we get GDDR8/9, will the 6080/7080 be 128-bit GPUs? If they kept 256-bit on the 5080 with GDDR7 it would have the same bandwidth as the 4090, but no, we can't have that.
aufkrawall2:

If you pay 5 bucks per day for coffee, you either can afford several 4090s per year, or you got some serious money spending issue.
OK fine, I will pay you 1825 dollars, now send me several 4090s!
The word behind the scenes is that this will release in 2025; that's what I heard from friends in the industry. Nvidia is going to keep milking the RTX 4xxx series until there is nothing left in the tank lol...
Cave Waverider:

At least one can easily make their own high-end coffee with beans and such, but one can't do that with high-end video cards.
And one can look up 'analogy' in one's thesaurus.