Micron confirms GDDR6X for GeForce RTX 3090 with 12GB and over 1 TB/sec memory bandwidth
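For context, the headline numbers follow from simple arithmetic once you assume the leaked memory configuration: twelve 8Gb GDDR6X chips on a 384-bit bus at 21 Gbps per pin. The chip count and pin speed come from the Micron brief and the rumor mill, not from anything Nvidia has confirmed, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the "over 1 TB/sec" figure, assuming the leaked
# configuration: twelve 8Gb GDDR6X chips (32-bit each) at 21 Gbps per pin.
chips = 12
bus_width_bits = chips * 32          # 12 x 32-bit chip interfaces = 384-bit bus
pin_speed_gbps = 21                  # top of the 19-21 Gbps range in Micron's brief

capacity_gb = chips * 1                               # 8Gb (1GB) per chip -> 12GB total
bandwidth_gbs = bus_width_bits * pin_speed_gbps / 8   # bits -> bytes
print(capacity_gb, "GB,", round(bandwidth_gbs), "GB/s")  # 12 GB, 1008 GB/s
```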

So a 12GB 3090 is confirmed? That makes the 3080 a 10GB card. Come on AMD, 12/16GB Navi cards. :P
Not enough for 8K gaming. This generation will be weak.
Undying:

So a 12GB 3090 is confirmed? That makes the 3080 a 10GB card. Come on AMD, 12/16GB Navi cards. 😛
You'd be daft not to root for AMD. I partly thank them for my new CPU, and without them Nvidia may not have been as quick to launch these cards.
itpro:

Not enough for 8K gaming. This generation will be weak.
Not too weak for CP2077 tho, enjoy your 8k gaming!
itpro:

Not enough for 8K gaming. This generation will be weak.
I'd argue it's not enough going forward for 4K with ever-increasing texture demands and RTX features. Will Nvidia really be so stingy with VRAM... I might have to wait even longer for a new GPU purchase, which will drive me mad tbh.
I'll have to see some performance numbers, but my gut feeling is telling me that Nvidia is holding back. I wouldn't be surprised to see a "3090 Ti" sometime next year with more bells and whistles, including more VRAM.
Undying:

That makes the 3080 a 10GB card.
Is there some specific evidence the 3080 won't have 11 or 12 GB? I haven't been keeping up on all the rumors.
rm082e:

Is there some specific evidence the 3080 won't have 11 or 12 GB? I haven't been keeping up on all the rumors.
Pretty sure the guy who wrote GDDR6X 12GB for the 3090 - who was basically just proven right - also said the 3080 @ 10GB.
You know what happened last time we went with Micron memory, at the 2080 Ti launch...
Supertribble:

I'd argue it's not enough going forward for 4K with ever-increasing texture demands and RTX features. Will Nvidia really be so stingy with VRAM... I might have to wait even longer for a new GPU purchase, which will drive me mad tbh.
Yeah, I'm still holding onto my 2K monitor and GTX 1070. That has been a great GPU for 2K and it still does pretty well today. I've had this card for right at 4 years, which is long for me. Nothing has really come out since that generation that made me think I needed a new GPU. Certainly not AMD's paltry mid-range offerings or Nvidia's overpriced, overhyped RTX GPUs. I expect this to be the generation I upgrade, since we should see some serious performance increases from both camps. I will be waiting to see what AMD has in store before I buy a GPU.
With all the rumors floating around about 20GB & 24GB cards... I wouldn't be surprised at all to see them launch with the lower memory variants and later release models using 2GB memory modules on the board to double up the memory. Granted, I'd imagine the cost of cards with double the memory would be sky high!
Shadowdane:

With all the rumors floating around about 20GB & 24GB cards... I wouldn't be surprised at all to see them launch with the lower memory variants and later release models using 2GB memory modules on the board to double up the memory. Granted, I'd imagine the cost of cards with double the memory would be sky high!
Quadro
I guess it will also be half the price of the model number.
Isn't that table just full of hypothetical examples? It has RX 5700 XT with 12GB of memory. Does such a thing even exist?
Simple logic says that faster memory bandwidth means less VRAM is needed, or not? And more VRAM means a more expensive card with a bigger TDP, power that could otherwise go to the main GPU. Is it really necessary, with such bandwidth, to keep every asset in VRAM? Isn't good asset streaming enough?
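Streaming does help, but the catch is that anything streamed in doesn't arrive over the card's own memory bus; it comes across PCIe, which is an order of magnitude slower. A rough sketch with assumed round numbers (PCIe 4.0 x16 at ~32 GB/s theoretical peak, GDDR6X at ~1 TB/s per the leak) shows why capacity still matters when a scene needs more unique data than fits on the card:

```python
# Rough, assumed round numbers (not measurements) for moving 2 GB of assets.
vram_bandwidth_gbs = 1000    # on-board GDDR6X per the leak, ~1 TB/s
pcie_bandwidth_gbs = 32      # PCIe 4.0 x16 theoretical peak; real transfers are lower
asset_gb = 2.0

print(f"read from VRAM:   {asset_gb / vram_bandwidth_gbs * 1000:.1f} ms")  # ~2 ms
print(f"stream over PCIe: {asset_gb / pcie_bandwidth_gbs * 1000:.1f} ms")  # ~62 ms, about 4 frames at 60 fps
```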
itpro:

Not enough for 8K gaming. This generation will be weak.
Your average mortal can't dream of 4k and you're upset about 8k? What monitor are you running right now?
One could argue that with the much higher memory speed and better optimization, you'd be able to get by with less. I personally would like to see at least 10 GB on the 3070 non-Ti series... or whatever they end up calling it. My old GTX 1080 Ti had 11 GB of GDDR5X, of which ACC made use of 7+ GB with big grids in VR. That's the most intense VRAM usage of any game I own at the moment, so I would be good with the slight overhead.
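For anyone wanting to reproduce a reading like that 7+ GB figure, total VRAM allocation can be polled while a game runs. A minimal sketch using NVML's Python bindings, assuming an NVIDIA card and the nvidia-ml-py package (note it reports whole-GPU usage, all processes included):

```python
import time
import pynvml  # NVIDIA Management Library bindings: pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)          # first GPU in the system
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)   # total/used/free in bytes
        print(f"VRAM in use: {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
        time.sleep(5)                               # poll every 5 seconds while the game runs
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```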
@itpro Might wanna rename your account, as you clearly don't know much for a "pro". Unless the TV is 85 inches or bigger and under 10 ft away, or you're sitting less than 2 ft from a 49-inch or bigger monitor, 8K is irrelevant.
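The size-and-distance argument is just angular resolution: roughly 60 pixels per degree is the usual figure for 20/20 vision, and detail beyond that isn't resolvable. A quick sketch (the 65-inch/10-foot geometry and the ~60 ppd threshold are assumptions for illustration):

```python
import math

def pixels_per_degree(horizontal_res, diag_in, distance_in, aspect=16 / 9):
    """Angular pixel density of a flat 16:9 screen viewed head-on."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)    # diagonal -> screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return horizontal_res / fov_deg

# 65" TV viewed from 10 ft (120 in); ~60 ppd is roughly the 20/20 acuity limit
print(f"4K: {pixels_per_degree(3840, 65, 120):.0f} ppd")  # already well beyond ~60
print(f"8K: {pixels_per_degree(7680, 65, 120):.0f} ppd")  # the extra density isn't resolvable at this distance
```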
itpro:

Not enough for 8K gaming. This generation will be weak.
The difference between 4K and 8K on a normally sized PC monitor is minimal enough that upscalers with sharpening will probably make it look almost identical. It's not DVD upscaled to Blu-ray, where it's all blurry and awful. Honestly, 1080p upscaled to 4K with a good upscaler and some sharpening is already closer to native 4K than anyone would like to admit.

RTX and DLSS were honestly not ready and still are not 100%, but the principles behind them are good. The future is not native 8K for now; the future is ray-traced 4K upscaled with techniques like DLSS. Personally I feel like Nvidia should have kept RTX to the 2080 Ti and kept the rest of the lineup as less expensive GTX cards. RTX was just too expensive for what was in the end a beta test (maybe even an alpha in the case of DLSS, as 1.0 did not perform better than standard sharpening filters).

While I skipped RTX as it was too expensive in Canada, I'm very interested in RTX 3k as long as Nvidia can keep the price down a bit. DLSS has matured beautifully, and games gain more from full-fledged RTX than from native 8K. It will be interesting to see if the "affordable" mid-range RTX 3k cards can at least provide full-fledged ray tracing at 2K with decent fps (or 1080p with high fps). If so, I could live with DLSS personally. But as it stands right now, the RTX 2060 S (the only RTX card under $600 in Canada) can't do that.
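The upscale-plus-sharpen comparison is easy to try offline on a screenshot. A minimal sketch with Pillow; the filenames, the Lanczos filter, and the unsharp-mask settings are illustrative choices, not what any game or driver actually does:

```python
from PIL import Image, ImageFilter  # pip install Pillow

src = Image.open("frame_1080p.png")                      # hypothetical 1920x1080 capture
up = src.resize((3840, 2160), resample=Image.LANCZOS)    # plain spatial upscale to 4K
out = up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))
out.save("frame_4k_upscaled_sharpened.png")              # compare side by side with a native 4K shot
```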
it's amazing how close to hbm2 this gets
MonstroMart:

The difference between 4K and 8K on a normally sized PC monitor is minimal enough that upscalers with sharpening will probably make it look almost identical. It's not DVD upscaled to Blu-ray, where it's all blurry and awful. Honestly, 1080p upscaled to 4K with a good upscaler and some sharpening is already closer to native 4K than anyone would like to admit. RTX and DLSS were honestly not ready and still are not, but the principles behind them are good. The future is not native 8K for now; the future is ray-traced 4K upscaled with techniques like DLSS. Personally I feel like Nvidia should have kept RTX to the 2080 Ti and kept the rest of the lineup as less expensive GTX cards. RTX was just too expensive for what was in the end a beta test (maybe even an alpha). While I skipped RTX as it was too expensive in Canada, I'm very interested in RTX 3k as long as Nvidia can keep the price down a bit. DLSS has matured beautifully, and games gain more from full-fledged RTX than from native 8K. It will be interesting to see if the "affordable" mid-range RTX 3k cards can at least provide full-fledged ray tracing at 2K with decent fps. If so, I could live with DLSS personally.
Agree wholeheartedly with 99.9% of this, but with functioning DLSS 2.0 in Wolfenstein and Control I played both with RTX on, on a mere 2070S @ 1440p, and got a great experience. Even more so considering those two games came as a free bundle. While it's a must for RTX performance to improve, you don't actually need a 2080 Ti to enjoy it. Control ran at 55-65 fps, which is fine for a third-person game on an adaptive sync monitor; Wolfenstein rarely dropped below 100, both with DLSS quality. I feel like with a 2080 Super and DLSS 2.0 you'd have a great RTX experience overall; there'd be no reason to turn it off because of broken game fluidity.
The 3090 is going to have double the memory bandwidth of my 1080 Ti, isn't that good enough? And I'm still waiting for the price of the damn thing...