Unusually high failure rates for GeForce RTX 2080 Ti?

Geryboy:

how well are they performing? And somebody always needs to apply the fix to the newest driver, right? Very dependent.....
Well, as always, SLI is PCIe-bandwidth and CPU limited, and quad SLI even more so. But in Frostbite engine games, performance is excellent with 40+ PCIe-lane CPUs. And yeah, the fix needs to be applied with each new driver. Quad SLI died with the 1080 Ti, though, as you can't physically use more than two GPUs with NVLink on the consumer cards.
SamuelL421:

2080 Ti FE owner here, no problems to report so far. I had been waiting on water blocks to arrive before completing a new Threadripper system with the 2080 Ti. In light of the problems people mentioned on the Nvidia forums, I opted to drop the card (temporarily) into my old X58 system to test it. With the stock cooler, I applied a mild OC (+100/200 core/mem) and still haven't seen any problems. FYI, 4K 60 fps is very doable on X58 despite whatever CPU bottleneck is likely happening; that is both awesome and sad (for how far we have come in about 10 years). Here's hoping the 2080 Ti doesn't start showing problems after I go to the trouble of mounting the block and adding it to my loop in a few days.
I wonder if I'll be happy when I finally switch to the 2700X from the i7-990X I'm replacing. Maybe I could have just bought a 1080 Ti to replace the old 2x GTX 770s.
Lol 😀
deusex:

I've had my 2080 Ti FE for about three weeks, no issues so far.
03238xxx?
I'm not gonna drop Nvidia employees' names into the ring here, but there are internal concerns about the initial and replacement failure rates.
.....the 60% increase in die size could be the issue??? More heat? and thus the GDDR6
Hi. My FE card has been working fine; the serial number starts with 0323. One thing I'd like to say to others is that my setup is different: I use mine on a 4K LG OLED TV, so the main purpose of getting this card was the magic 60 fps that's really needed for 4K TV gaming. This card is amazing and well worth the upgrade from the 1080 Ti; that extra 20-30% makes all the difference. Touch wood this card keeps working. It has a 3-year warranty, so hopefully I won't need to use that.
Caesar:

.....the 60% increase in die size could be the issue??? More heat? and thus the GDDR6
It's not the die that's failing; it's generally the VRAM or power-related components.
H1TMANza:

Maybe now nvidia will stop gimping the 10-Series drivers
Nvidia hasn't been gimping anything.
Had my MSI Trio 2080 Ti since September 27th. Still working without issues.
H1TMANza:

Maybe now nvidia will stop gimping the 10-Series drivers
This has already been debunked, stop spreading lies!
emperorsfist:

This has already been debunked, stop spreading lies!
Don't feed the troll, man; certainly not worth the bandwidth... 😉
It just doesn't work, lol 😉
Cue the age-old "My thing works so everyone else is lying, the thing I have is the BEST THING EVA!!" post 😀
Mine's fine; I get a hard lock in Sniper Elite 4, but all the rest works fine. Had it a month or so now.
Man, this must hurt: paying big bucks for something only to have serious problems with it. Hope everyone can get their GPUs fixed or replaced. This also seems to indicate that maybe the GPUs were rushed?
NVIDIA is confirming the issues with Turing-based cards... More info soon!
I hope it doesn't affect the 2070 that I ordered.
Maybe it only works for 10 hours... I joke, of course, lol.
Nvidia's response has been posted online; I'll just let Hilbert break the news.
Fox2232:

Technically speaking, they are not gimping older series. All they need to do is reduce the amount of game-specific optimization for newly introduced games, and that has happened many times before. Then you had a wave of angry owners all over the internet, and a "magical" driver patch suddenly delivered the optimizations. So no, there's no gimping of already-released code. But as they invest less and less in per-game optimizations for older series, those cards fall behind, and that can be seen today on cards like the GTX 780 Ti. AMD, on the other hand, gains very little from game-specific optimizations; they make general architecture improvements that deliver gains across the board, so the losses from AMD ignoring older cards are much smaller (read: FineWine™). The paradox is that it is not so much AMD delivering great improvements to older cards as it is Nvidia stopping new optimizations. AMD had a few notable driver sets that brought very noticeable jumps in performance, but even then their cards were not really much better than Nvidia's once price was considered. So when you see an Nvidia card fall behind an AMD card it previously outperformed in everything, there is really no reason to praise AMD. Blame Nvidia.
Eh, I think the issue with Kepler was mostly due to a shift in game development towards GCN's strengths. AMD's presence in nearly every console required game devs to squeeze every ounce of performance out of that architecture. So if you're writing a game and you say, "well, GCN has a bunch of compute capability, let's leverage that," and Kepler's compute sucks (it does), then the result is that eventually all games coming out will play to AMD's strengths (FineWine) and into Kepler's weaknesses (Downgrade Debacle). I don't really recall any magical driver uplifts either. There were definitely specific issues in games that Nvidia fixed after outcry, but there was never any "downgrade debacle catch-all" driver. Either way, we haven't seen similar results with Maxwell/Pascal, despite most saying we would, mostly because Nvidia's architecture optimizations started following suit with AMD's.