NVIDIA cancels the GeForce RTX 4080 12GB

The 4080/12 is very similar to a 3080/12 in raster performance. It should be a 4060 Ti, not even a 4070. The 4070 should MATCH the performance of a 3080 Ti. The 4080/16 should be $900.
This reminds me again: NaughtyVidia, nobody does it better!
CPC_RedDawn:

All this talk of RDNA3 being weak is seriously weird to me. The 6900XT had 5120 cores and the 3090 had 10496. Sure, the architectures are totally different, but AMD still managed to near match/beat the 3090 in general raster performance with this architecture. The 7900XT is rumoured to have around 12288 cores. This is OVER DOUBLE that of the 6900XT. Now factor in other architecture changes (MCM) and clock speed increases. The 6900XT could already hit 2.6GHz easily, so 3GHz+ isn't out of the question on the 7900XT. Now look at the 4090 with 16k cores, which is nowhere near double that of the 3090. Factor in architecture changes, and more importantly the huge clock speed increases and insane power draw, and it's easy to see where the 40 series gets its performance from. Unless something went majorly wrong in RDNA3 development, given the currently rumoured specs it would be extremely surprising to see them not at least match the 4090 in raster performance. RT performance is another story, but with the MCM design maybe AMD will use more of the die space for RT enhancements. DLSS3 with its frame generation is, I think, where Nvidia could really pull ahead of AMD.
From a logical standpoint, how do you expect an MCM design with a total die area of 500mm2 to beat a 600mm2 monolithic die when both AMD and Nvidia are on TSMC 5nm now? At best I could see Navi31 performing around the 4080 16GB. MCM is not magic; it is supposed to bypass the reticle limit by bunching smaller dies together. It was already an accomplishment that Navi21 doubled Navi10's performance while more than doubling the die size from 250mm2 to 530mm2. With Navi31 being the first MCM GPU, I expect a manufacturing cost benefit, not a performance king. More than likely Navi31 lands around 4080 performance and costs 700-800 USD, and that's why Nvidia had to adjust the "4070" name and pricing.
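The back-of-the-envelope argument in these two posts can be put in numbers with a quick sketch. This uses a naive peak-FP32 model (cores × 2 FLOPs per cycle for FMA × clock); the core counts and clocks are the rumoured/approximate figures quoted in the posts above, not confirmed specs, and real raster performance does not scale linearly with this number (Ampere's dual-issue FP32 cores are exactly why the 6900XT could match a 3090 with half the "cores"):

```python
def peak_tflops(cores: int, clock_ghz: float) -> float:
    """Naive theoretical FP32 TFLOPS: cores * 2 FLOPs/cycle (FMA) * clock."""
    return cores * 2 * clock_ghz / 1000

# Rumoured/approximate figures from the posts above, not confirmed specs.
gpus = {
    "6900 XT":         (5120, 2.25),
    "RTX 3090":        (10496, 1.70),
    "Navi31 (rumour)": (12288, 3.00),
    "RTX 4090":        (16384, 2.52),
}

for name, (cores, clock) in gpus.items():
    print(f"{name:16s} {peak_tflops(cores, clock):5.1f} TFLOPS (paper)")
```

On these assumptions Navi31's paper throughput lands within roughly 10% of the 4090's, which is the first post's argument in numbers; the 6900 XT vs 3090 rows also show why paper TFLOPS alone are a poor predictor of actual raster performance.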
SatsuiNoHado:

This reminds me again: NaughtyVidia, nobody does it better!
This is over a decade old, but still as relevant as ever; the name just needs to be updated to nGreedia:
[attached image: Trollface - nVidia.png]
I think it's a good decision. They rightfully got flak for calling two completely differently specced cards "4080". Now they've corrected that, and that's a move that takes balls. Most companies would need a lot more pressure before even considering a course correction, or would stick to their error no matter what. I also think they'll reintroduce that die as a 4070 or 4070 Ti, which is where it belongs. And current pricing? Strategic, as is the current shortage of 4090s. They want the 3000 series stock gone, so they make the 4000 series as unattractive as they can for now. Still, they had to at least paper launch it to stay ahead of AMD. Come November and the AMD 7000 series, I bet some things will change.
hichamkh:

Usually the "80" card is 320-bit
No. Nvidia puts the 80 wherever it makes sense for the generation: 280/285 - 512-bit, 480/580 - 384, 680 - 256, 780 - 384, 980 - 256, 1080 - 256, 2080 - 256, 3080 - 320, 4080 - 256.
Astyanax:

No. Nvidia puts the 80 wherever it makes sense for the generation: 280/285 - 512-bit, 480/580 - 384, 680 - 256, 780 - 384, 980 - 256, 1080 - 256, 2080 - 256, 3080 - 320, 4080 - 256.
If you go further back, it becomes evident that the shenanigans Nvidia started pulling with Kepler have continued ever since...
For people who can afford it, that's a bad move; for people who can't, now we're talking.
Undying:

Now we ended up with a $1200 4080 and a $1600 4090, so the 40 series is out of reach for most people.
So they're not going to release a 4070, 4060 and 4050 next year?
Dragam1337:

If you go further back, it becomes evident that the shenanigans Nvidia started pulling with Kepler have continued ever since...
Or just how competitive AMD was.
Great. Now give us a $699 RTX 4080 card, as the RTX 3080 was $699 MSRP.
scoter man1:

Imo, it should've been illegal for them to sell two entirely different products both called "4080". It would've been fun to buy one on Amazon (or something) and return it for a misleading description lol.
They have done it before with the GTX 1060 6GB (1280 shaders) vs 3GB (1152 shaders), and the 3080 10GB (8704 shaders) vs 12GB (8960 shaders). Nothing stopped Nvidia from scamming consumers with their naming.
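The naming gaps the post mentions can be put in percentage terms with a quick sketch, using only the shader counts quoted above:

```python
# Percentage shader difference between same-name SKUs, using the
# counts quoted in the post above.
pairs = {
    "GTX 1060 6GB vs 3GB":  (1280, 1152),
    "RTX 3080 12GB vs 10GB": (8960, 8704),
}

for name, (full, cut) in pairs.items():
    pct = 100 * (full - cut) / full
    print(f"{name}: {pct:.1f}% shader gap")
```

By these numbers the 1060 3GB gave up 10% of its shaders under (almost) the same name, while the 3080 12GB added about 3%; the cancelled 4080 12GB was a much larger gap than either.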
cucaulay malkin:

Or just how competitive AMD was.
AMD was very competitive against Kepler. The moment Nvidia started to pull away was with Maxwell, and Pascal was just a bloodbath. But the big takeaway is that we need competition for Nvidia, be it from AMD or Intel. Otherwise, the ones who suffer are consumers.
Horus-Anhur:

AMD was very competitive against Kepler. The moment Nvidia started to pull away was with Maxwell, and Pascal was just a bloodbath. But the big takeaway is that we need competition for Nvidia, be it from AMD or Intel. Otherwise, the ones who suffer are consumers.
No solid leaks on Navi 3x performance for months is worrying.
cucaulay malkin:

No solid leaks on Navi 3x performance for months is worrying.
Maybe. But it could be that AMD has better control over its information. Also remember that the "leaks" about Ada Lovelace were mostly some guys throwing every possible spec at the wall until something stuck.
Wow. Just wow! Did NVIDIA not do ANY market research on the naming of the RTX 4080 12GB model, or did they honestly think they could get away with naming an XX70 product a "4080" and selling it at a much higher cost based on the name alone (a product that is already being sold at a much higher cost than the previous RTX 3080)? I don't know which is worse: the fact that NVIDIA have become so incredibly greedy that they thought they could get away with it without anyone noticing, or that NVIDIA have had to pull this launch, which, without a doubt, must be a huge, huge embarrassment to their shareholders.
Maybe it's just me, but is nobody worried that this brings a worse SKU at a less attractive price instead of the 12GB 4080? Knowing Nvidia, and how they want you to shell out more money for the bigger card? I guess we shall see.
GN says they spoke with some AIBs, who said Nvidia will also be lowering the price. So that's good news. I just hope it's a cut to a reasonable price, like $500-600.
I don't know if anybody posted this, but damn, that just caused a considerable drop in the stock price. Shareholders should be very happy lol.
[attached screenshot: NVIDIA share price chart]
Horus-Anhur:

Maybe. But it could be that AMD has better control over its information. Also remember that the "leaks" about Ada Lovelace were mostly some guys throwing every possible spec at the wall until something stuck.
Not for AD102.