Nvidia Ampere GA100 GPU would get 8192 cores and a boost clock of up to 2200 MHz

Vegeta, what does the scouter say about the price? It's over 9000! Ghaa!
With a VRAM size that large, this won't be the gaming card... but yes, GA100 with 22GB VRAM. GA104 with 16GB will release first, somewhere between late spring and early summer, and GA100 with 22GB later. But then again, maybe GA100 will be the gaming card after all. 🙂
The die size of the current TU102 is already pretty massive since it's on 12nm; it would be hard to make it bigger without very low yields. The 3xxx move to 7nm is needed.
LEEc337:

A doubling of core count, like early Ryzen did with Intel's core counts, makes me think Nvidia have always been able to release bigger, better GPUs but didn't need to because nobody else was pushing them. Just my humble opinion.
By your logic, we would still be using the 8800 GTS today... if I remember correctly, that was the last time anybody "pushed" them... No, it's not like that. To their credit, they always pushed. Don't compare CPUs and GPUs; they are different beasts and they develop differently. Their pricing policy is questionable at times, but even without a serious competitor for a long time, they kept developing and innovating.
Looking forward to upgrading my tired, overworked GTX 1080 with something that is double the performance without being double the price (preferably even cheaper than what I paid for the 1080; one can dream...). I don't care which company it comes from...
I need to stop looking at this stuff; it might give me the itch to replace my 1070 Ti, and I'm still not over the $500+ I paid for it. Something for $250-300 that's capable of pushing 4K @ 60 fps might make me more willing.
If they're trying to pull another $$$ stunt like with the 2080 Ti, they can keep this ngreedia crap :P
wavetrex:

Looking forward to upgrading my tired, overworked GTX 1080 with something that is double the performance without being double the price (preferably even cheaper than what I paid for the 1080; one can dream...). I don't care which company it comes from...
Ditto with my 1080 Ti; I just worry about the cost. But honestly, even if this came out at $1200 like the 2080 Ti, considering the performance jump I would say it was worth it (of course I would love it cheaper). I reckon the chip must be huge for 8k cores, though something like a 3070/3080 might be more up your alley; looking at it, they might double your performance for a more reasonable price (considering what you see as reasonable).
LEEc337:

A doubling of core count, like early Ryzen did with Intel's core counts, makes me think Nvidia have always been able to release bigger, better GPUs but didn't need to because nobody else was pushing them. Just my humble opinion.
That's just ignorant, to be honest. Do you see the size of the RTX 2080 Ti? Not the card, the actual GPU die? The thing is the largest consumer GPU die ever. It's massively costly to make, and making a larger GPU with more shader cores "just because they could" isn't a feasible or smart business plan. Making a GPU as large as the 2080 Ti was already pushing it. The only reason it's possible now is that they're going from 12(14?)nm to 7nm, something that wasn't possible when the 2080 Ti was released.
*Watches last kidney* "Your hours are numbered"
Aura89:

That's just ignorant, to be honest. Do you see the size of the RTX 2080 Ti? Not the card, the actual GPU die? The thing is the largest consumer GPU die ever. It's massively costly to make, and making a larger GPU with more shader cores "just because they could" isn't a feasible or smart business plan. Making a GPU as large as the 2080 Ti was already pushing it. The only reason it's possible now is that they're going from 12(14?)nm to 7nm, something that wasn't possible when the 2080 Ti was released.
Considering it's meant to double RT cores and almost double the shader cores from last gen, I still see this being a massive die, and I'm wondering if it could be even bigger than the 2080 Ti even with the shrink; would be nice if it is smaller, since that should mean it will be cheaper. But totally agree: if Nvidia really wanted to, with the tech they could make an 8000-core die using Turing, but it would be too stupidly expensive to even consider making.
LEEc337:

Wow, sorry, I wasn't trying to offend. I meant it in a resources-versus-competition way (why put out more than you need to stay ahead?), and now that more people want 4K powerhouse cards, the core counts have doubled. I suspect all three companies will bring mid-range cards aimed at 4K and high-end cards going for high-fps 4K with better 8K support; again, just guesses. It will be a credit to all of them for innovation, and I love watching it. I've watched GPUs develop since DirectX 6 and earlier, and I was in the SNES vs Mega Drive wars of old, so I have nothing but respect for the innovative things in the PC world, just like yourself I'm sure. This is just a forum and these are just my opinions; you may be 100% right, but calm down before you pop, mate ✌
People are allowed to have opinions, of course, but don't expect people not to comment on your opinion if you're going to post it on the internet, especially if you post ignorant ones. Not trying to be mean here; it's the only word I can think of to describe your comment, as it has no basis in reality, and even the tiniest bit of research would have kept it from ever being posted.
Ricepudding:

Considering it's meant to double RT cores
It isn't meant to double RT cores; Nvidia went with scaling SM groups and increasing shader counts in the GPU instead. More SM groups automatically means more RT cores, as there is one per SM.
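For anyone who wants to check that scaling, here's a minimal sketch. It assumes 64 FP32 cores per SM (true for TU102; an assumption for the rumored 8192-core chip) and one RT core per SM:

```python
# One RT core per SM, so RT core count == SM count.
# 64 FP32 CUDA cores per SM holds for TU102; for the rumored
# 8192-core Ampere chip it is an assumption, not a spec.
CORES_PER_SM = 64

def rt_cores(total_cuda_cores: int) -> int:
    """SM count (and thus RT core count) for a given core total."""
    return total_cuda_cores // CORES_PER_SM

print(rt_cores(4352))  # RTX 2080 Ti (TU102): 68 SMs -> 68 RT cores
print(rt_cores(8192))  # rumored Ampere part: 128 SMs -> 128 RT cores
```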
Ricepudding:

wondering if it could be even bigger than the 2080 Ti even with the shrink; would be nice if it is smaller, since that should mean it will be cheaper
TU102 is only larger if you exclude HBM (and you don't):
GA100 = 826 mm²
GV100 = 815 mm²
TU102 = 754 mm²
Ricepudding:

they could make an 8000-core die using Turing, but it would be too stupidly expensive to even consider making
No, they really couldn't.
Astyanax:

It isn't meant to double RT cores; Nvidia went with scaling SM groups and increasing shader counts in the GPU instead. More SM groups automatically means more RT cores, as there is one per SM.
I mean, the OP says they double the tensor cores (rumour, mind you), though by that standard the 2080 Ti has 68 SMs and 68 RT cores, and this would then have less than double, as it's 128 SMs, not 136.
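A quick check of that "less than double" arithmetic (the 128-SM figure comes from the rumored core count, not anything confirmed):

```python
# Doubling the 2080 Ti's 68 SMs (and thus 68 RT cores) would give 136;
# the rumored 8192-core chip works out to 128 SMs instead.
tu102_sms = 68
rumored_sms = 8192 // 64  # = 128, assuming 64 cores per SM
print(rumored_sms / tu102_sms)  # ~1.88x, i.e. less than double
```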
Astyanax:

No, they really couldn't.
Reason why they couldn't? RTX got to 4608 cores (I assume, since they make so many of these, there could still be some imperfections on the die; if I'm wrong about that then my bad). Maybe 8k might be pushing it a bit, but 6k in theory could be possible if they got a perfect wafer with no imperfections on it?
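To put rough numbers on how unlikely a "perfect wafer" is: a back-of-envelope sketch using the classic Poisson yield model, where the share of defect-free dies falls exponentially with die area. The defect density here is an illustrative assumption, not a published foundry figure:

```python
import math

D0 = 0.2  # defects per cm^2, assumed purely for illustration

def poisson_yield(die_area_mm2: float, d0: float = D0) -> float:
    """Estimated fraction of dies with zero defects: Y = exp(-D0 * A)."""
    return math.exp(-d0 * die_area_mm2 / 100.0)  # area converted to cm^2

# Die areas quoted earlier in the thread
for name, area in [("TU102", 754), ("GV100", 815), ("GA100", 826)]:
    print(f"{name} ({area} mm^2): ~{poisson_yield(area):.0%} defect-free")
```

Even at a modest assumed defect density, only about one die in five comes out fully intact at these sizes, which is why chips this big usually ship with some SMs fused off as salvage parts.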
Ricepudding:

Reason why they couldn't?
Far, far too big a chip to deploy in nvmesh configs.
Will they be PCIe 4.0 cards or remain PCIe 3.0, I wonder.
Kaarme:

Will they be PCIe 4.0 cards or remain PCIe 3.0, I wonder.
Maybe no PCIe 4.0 support for the top- or high-end Ampere GPUs, because Intel isn't announcing PCIe 4.0 for desktop this year. 😉
Even if it's "only" $1000 MSRP, it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination. In b4 someone comes in talking about how HBM2 costs more than an ocean of virgin blood, that their R&D costs more than 10 billion pure souls, and then links to nVidia's ballsack's reddit page stating they pay everyone on their staff and the homeless people outside $10K per minute, and therefore they need to charge insulting amounts. 🙄 I'll drop to 30 fps gaming or buy a console before I pay nVidia's mafia monopoly prices. Or better yet, bust out my backlog of old games. Too bad, I was looking forward to Cyberpunk on PC (no, I won't play it on a console). Guess I'll play that in 2026 or so. I can wait.
Neo Cyrus:

it's absolutely hilarious that people here would think that's a reasonable or fair price, by any stretch of the imagination.
Because you're market ignorant.
Expecting 50-70% more performance over a 1080 Ti and 40% more performance over a 2080 Ti? Expecting to reach 2.4GHz with a stable overclock on air/water? Expecting much better RTX performance as well. Will it undervolt stably to decrease the temps? Find out more when Nvidia releases their new 3000 series graphics cards, which will buttkick your wallet to the outer reaches of the solar system.
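Purely as arithmetic on those two expectations (no hardware data involved), they also pin down what the poster is implicitly assuming about the 2080 Ti vs 1080 Ti gap:

```python
# new = (1.5..1.7) * gtx_1080_ti and new = 1.4 * rtx_2080_ti
# together imply rtx_2080_ti = (1.5..1.7) / 1.4 * gtx_1080_ti.
for gain in (1.5, 1.7):
    print(f"+{gain - 1:.0%} over a 1080 Ti implies a 2080 Ti at "
          f"{gain / 1.4:.2f}x a 1080 Ti")
```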