AMD Radeon RX 6900 XT review

AuerX:

So the RTX3080 is still easily the best bang for the buck.
Huh? No it's not. The 3060 Ti is.
Fox2232:

Then you've been out of the loop for a year. But that's the difference between official and background information. Samsung officially states that even now they are only sampling 16Gb GDDR6 at 16 Gbps, and that the only 16Gb GDDR6 they have in mass production runs at 14 Gbps. https://www.guru3d.com/news-story/samsung-starts-gddr6-production-that-offers-bandwidth-of-18gbits.html (source) The problem, as always, is cost versus demand.
I can see a refresh of these cards with faster memory next year - it's so easy to tell that these cards are bandwidth starved...
This card is lightning fast at lower resolutions, but like the 6800 XT it doesn't scale as well at higher resolutions, where the Infinity Cache becomes less effective. Even if it meant making the card even more overpriced than it already is, AMD should definitely have upgraded the memory on the 6900 XT to either GDDR6X or HBM2 to give it the bandwidth it needed to sweep the board at all resolutions.
Fox2232:

Then GDDR6X did not exist until Nvidia released Ampere, right? Chips exist for years; no demand, no product. @Sovsefanden Girl, are you taking your testosterone pills? Do you like posts like that? No? Then don't call people around here "bois". Keep gender slurs out.
How is there no demand for 18 Gbps GDDR6? Literally 3 out of 4 comments here say the 6800 XT and 6900 XT lack bandwidth.
Herem:

Even if it meant making the card even more overpriced than it already is, AMD should definitely have upgraded the memory on the 6900 XT to either GDDR6X or HBM2 to give it the bandwidth it needed to sweep the board at all resolutions.
If they used either GDDR6X or HBM2, not only would the cost go up but availability would be even worse than it is now. Considering how much VRAM it has, I'm sure they could have gotten away with a wider bus.
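As a rough illustration of the numbers being debated here, effective GDDR6 bandwidth is just bus width times per-pin data rate. A minimal sketch, assuming the 6900 XT's 256-bit bus and the 14/16/18 Gbps chip speeds mentioned in the thread:

```python
# Effective memory bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits-per-byte.
# The 256-bit bus is the RX 6900 XT's; the rates are the GDDR6 speeds discussed above.
def gddr6_bandwidth(bus_width_bits: int, pin_rate_gbps: float) -> float:
    """Return effective bandwidth in GB/s."""
    return bus_width_bits * pin_rate_gbps / 8

for rate in (14, 16, 18):
    print(f"{rate} Gbps on a 256-bit bus -> {gddr6_bandwidth(256, rate):.0f} GB/s")
```

So moving from 14 Gbps (448 GB/s) to 18 Gbps (576 GB/s) chips would be roughly a 29% bandwidth uplift without touching the bus width, which is why a faster-memory refresh keeps coming up.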
schmidtbag:

Huh? No it's not. The 3060 Ti is.
Sure, if you don't give a damn about RT and 4K. I should have specified that.
Undying:

The 6900 XT is easily the fastest at 1080p/1440p; only at 4K is it beaten by the 3090. I think AMD will have an excellent lineup with the 6700 XT in January. Just, damn it, make more of them!
Yeah, on average. I've seen it maybe 2% faster than the 3080 on some sites, but on others it's the other way around. Honestly, if you focus on individual games they go about 50/50; you can make either card a few percent faster depending on your selection of games. At 1080p/1440p, I don't know; maybe you calculated averages and the 6900 XT pulls far ahead on G3D (I haven't), but frankly other sites have it going back and forth with the 3080 at lower resolutions. Overall, across 20 games at 4 resolutions (FHD, 1440p, 3440x1440, 4K), this is where current- and last-gen cards stand for traditional raster performance: https://www.pcgameshardware.de/Radeon-RX-6900-XT-Grafikkarte-276950/Tests/RX-6900-XT-oder-RTX-3090-Grafikkarten-Vergleich-1362845/3/ https://i.imgur.com/0nPNcO8.jpg
This is by no means worth $1,000. The 6900 XT only offers 8 more CUs than the 6800 XT, and that is it. There are no other features, benefits, or extra VRAM you get out of the 6900 XT, just those 8 CUs! At best this is a $700 card. AMD has lost their minds charging this much for it. There are no marketing points to reference that would even suggest why someone should pay $400+ for an additional 8 CUs.
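To put the complaint above into numbers, here is a quick sketch of the per-CU premium, assuming the announced launch MSRPs ($999 for the 6900 XT, $649 for the 6800 XT) and CU counts (80 vs. 72) rather than street prices:

```python
# Launch MSRPs and CU counts as announced by AMD (not street prices).
rx_6900_xt = {"price_usd": 999, "cus": 80}
rx_6800_xt = {"price_usd": 649, "cus": 72}

extra_cus = rx_6900_xt["cus"] - rx_6800_xt["cus"]              # 8 extra CUs
premium = rx_6900_xt["price_usd"] - rx_6800_xt["price_usd"]    # $350 MSRP premium

print(f"${premium} buys {extra_cus} extra CUs (~${premium / extra_cus:.2f} per CU)")
print(f"{extra_cus / rx_6800_xt['cus']:.0%} more CUs for "
      f"{premium / rx_6800_xt['price_usd']:.0%} more money")
```

At MSRP that works out to roughly 11% more compute units for about 54% more money, which is the core of the value objection.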
@Hilbert Hagedoorn Small mistake in the graphs: 2560x1440 is QHD; WQHD is 3440x1440.
The RTX 3060 12GB will kill the RX 6700 series, the RTX 3060 6GB will obliterate the RX 6600, and the RTX 3050 laughs hysterically at the RX 6500. Hell, even the 6800 series will soon lose on performance-per-watt and price, and the 3060 Ti already holds the best-overall-GPU-of-2020 title. AMD fans should and must be wise and wait for an RDNA2 refresh or RDNA3. The 6900 XT is a worse offering than the 3090; at least the 3090 offers true 4K performance and real RT.
AuerX:

Sure, it you don't give a damn about RT and 4K. I should have specified that.
The 3060 Ti without RT is sufficient for 4K (especially with DLSS). At 1080p, the 3060 Ti is sufficient with RT. If you want 4K and RT together, then yeah, the 3080 is the best overall choice, though even then, I'm not sure that's actually powerful enough.
I lost all interest in buying the 6800 XT/6900 XT; they are not even listed at the biggest computer stores in Canada. Such a shame.
Well, it's good to see the performance of this card; AMD has come a long way and can finally compete with Nvidia. But I'm still rocking my RTX 3080: it's just more efficient even at 8nm, DLSS is great, and we also get good RT performance. So no, it was not a bad choice to pick the 3080 instead of waiting. 10GB is enough for 4K. What matters is a fast SSD, a well-overclocked CPU, and fast RAM to go with it.
Denial:

I have DLSS 2.0 and I'm claiming otherwise. In still pictures it's good, great even - in motion Death Stranding has a lot of weird issues going on with DLSS, moire effect in various places, smearing in wires/particles, etc - particularly with the water in the game. So yeah, still disagree.
No issues with my RTX 3080 using DLSS; the card is so quiet because of DLSS, and the temp stays around 55°C on air at 1950 MHz. I play on an LG OLED CX.
The 4K drop is quite big, all things considered. The only thing I can think of is that the memory bus becomes the bottleneck for the hardware at that point. But yeah, the price of this card is not amazing. Better than what I paid for a 3090, though, assuming, as HH said, you don't want DLSS or ray tracing. Still, it's good to have some real competition and blows being traded; it hasn't felt like this since the 7970 vs. 680 days.
Fox2232:

I do not believe the words "sufficient" and "DLSS" should be used in the same sentence, as using DLSS is proof that the card has insufficient performance for the given resolution. Otherwise one could say that even a 2060 is sufficient for 4K. And then there are super-poor-IQ modes like 9X, where the "actual 4K" image output is faked from a 1280x720 render. I tend to believe a 2060 is capable of delivering reasonable fps at 720p.
The only thing that matters is the final output. DLSS has already been proven to be fine. Keep putting "actual pixels" ahead of real-world image quality and performance; that's your loss.
Great review @Hilbert Hagedoorn. I have mixed feelings about this card: it performs between the RTX 3080 and RTX 3090, which is great to see, but I feel it is held back by its power limit and drivers. It also seems bandwidth starved at 4K. Overall not good, not bad, just meh... go with the 6800 XT, as it offers a better price-to-performance ratio. Availability: 0, zero, cero, nada, null 🙄, nowhere to be found, pink unicorn of a graphics card...
Fox2232:

I do not believe the words "sufficient" and "DLSS" should be used in the same sentence, as using DLSS is proof that the card has insufficient performance for the given resolution. Otherwise one could say that even a 2060 is sufficient for 4K. And then there are super-poor-IQ modes like 9X, where the "actual 4K" image output is faked from a 1280x720 render. I tend to believe a 2060 is capable of delivering reasonable fps at 720p.
The 3060 Ti can play in 4K, but DLSS can make up for wherever it may fall short. If your goal is to get 60FPS, you could probably upscale 1440p, where DLSS's loss in detail will be minimal. And that's an important point to make, because a 2060 upscaling to 4K will likely have to do so at 720p, where the loss of detail will not be minimal, and therefore insufficient. If you have to upscale more than double your resolution, you won't get desirable results. At that point, you might as well just lower detail level and play at 4K natively. Anyway, I would prefer a GPU that can render 4K natively, without making any sacrifices. I myself would be more interested in DLSS if it could work on anything, but it doesn't.
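The "9X" figure and the 1440p-to-4K comparison in this exchange come down to simple pixel-count ratios. A minimal sketch, using the resolutions mentioned in the thread:

```python
# Pixel-count upscale factor between a render resolution and a target resolution.
# Resolutions are the ones discussed above: 720p, 1440p, and 4K UHD.
def upscale_factor(render: tuple[int, int], target: tuple[int, int]) -> float:
    rw, rh = render
    tw, th = target
    return (tw * th) / (rw * rh)

uhd = (3840, 2160)
print(f"1440p -> 4K: {upscale_factor((2560, 1440), uhd):.2f}x")  # 2.25x
print(f"720p  -> 4K: {upscale_factor((1280, 720), uhd):.2f}x")   # 9.00x
```

Upscaling from 1440p means synthesizing 2.25x the rendered pixels, while 720p-to-4K means synthesizing 9x of them, which is exactly the "9X" mode being criticized and why the two cases produce such different image quality.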
Since AMD greenlit AIBs for custom 6900 XTs, I feel pretty strongly that a well-engineered custom 6900 XT will exceed the performance of the 3090 in almost every case, even in Nvidia-sponsored/biased titles. Ray tracing will still be behind this generation, but it's pretty gimmicky at this point in time. I find it very similar to the days when DX10 finally arrived and Microsoft and Nvidia marketed the heck out of it, yet the visual comparison left people squinting and searching for differences, and the performance hit was quite noticeable (though not as bad as RT's). Fast forward almost 15 years, and DX10 never really brought much more to the table than DX9. I feel ray tracing may be headed down the same path.