AMD Radeon RX 6900 XT review


Yay, review time! :D
The 6900XT is just slightly better than the 6800XT; it feels more like an overclocked 6800XT (especially when looking at the RT performance). Those benchmark results were kind of expected to be honest, because both were introduced with the same TDP, and it turns out the 6900XT draws (slightly) more than the 6800XT 🙄 Since Nvidia basically confirmed that the availability issue is going to remain as it is now for several months, perhaps even longer than half a year, I bet the situation for AMD must be even worse. Especially considering that the 6800XT is still nowhere to be found at all, let alone what will happen with the 6900XT.
I am disappointed by the limited gap in performance between the 6800, 6800XT and the 6900XT. It's almost hard to justify the 6900XT's existence.
A bit underwhelming for a flagship, but a giant leap compared to what AMD offered in the past 6 years. Not sure how their Windows drivers are failing so badly with compute loads, or OpenGL. At least with the 6800XT, the Linux compute performance was crazy fast, whereas in Windows it's downright slow.
Meh. $999 matches or slightly beats the 3080 at 4K, has no DLSS, and poor RT performance for a card that costs this much. It's a good card on its own, but compared to the 3080 at $699 there's no way it's worth $300 more. It's a weird strategy by AMD to push this as a Titan-esque card at $999.
The performance gap between 6800 and 6800XT is 15% on average.
Sure... now what's the performance gap between the 6800 and the 6900XT? I went and looked over some other sites' results, and the 6900XT fared better in their testing - on average beating the 3090 at 1440p. Exceptionally different results.
Well, for the measly price of 600€ more than the 6800XT in Finland, this is a no-go 😀
Fox2232:

If there is an AIB able to equip any 6800/6900 cards with faster memory, those will sell like hot cakes.
How? I've never heard of GDDR6 going beyond 16Gbps unless we're talking OC. This card should absolutely be 12GB on a 384-bit bus, unless AMD likes losing to Nvidia's $699 card in every other title at 4K.
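For context on the 256-bit vs 384-bit argument: peak GDDR6 bandwidth scales linearly with bus width and data rate, so the gap is easy to sketch. The figures below use the publicly quoted 16 Gbps data rate and are illustrative, not benchmark numbers from this review:

```python
# Theoretical peak memory bandwidth: bus width (bits) / 8 * data rate (Gbps) = GB/s.
# Illustrative sketch only - real-world throughput is lower than the theoretical peak.
def gddr_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth in GB/s for a GDDR configuration."""
    return bus_width_bits / 8 * data_rate_gbps

# 256-bit bus at 16 Gbps (the RX 6900 XT's shipping configuration)
print(gddr_bandwidth_gbs(256, 16))  # 512.0 GB/s
# Hypothetical 384-bit bus at the same 16 Gbps
print(gddr_bandwidth_gbs(384, 16))  # 768.0 GB/s
```

A 384-bit bus at the same data rate would raise the theoretical peak by 50%, which is the gap the comment above is pointing at (AMD instead leans on its on-die Infinity Cache to offset the narrower bus).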
cucaulay malkin:

Meh. $999 matches or slightly beats the 3080 at 4K, has no DLSS, and poor RT performance for a card that costs this much. It's a good card on its own, but compared to the 3080 at $699 there's no way it's worth $300 more.
Ray Tracing may become more important as time carries on. I suspect more developers may spend less time programming in dynamic lighting and proper shadows if they can just use RT. At this point the value in the 3080 is paramount if you want RT...I am still impressed by the 6800 and 6800xt
I participated in the "swiss etailer lottery" you might have read about on videocardz (if not: they were honest as usual and told everyone they would only get 35 cards, so they made a lottery rather than frustrate buyers who think everyone else is getting a card except them). Because of the non-availability of everything, I'm leaving my purchase decision to chance. That said, there's a strong chance I'll leave it to the second in line if I win, because what I really want is a >10GB 3080 - I use NVENC and that's a huge plus for me. EDIT 1: nvm, I will for sure leave it to someone else - it's $1293 for a reference card! (The EK ASUS watercooled 3080 10GB is $1058, the ASUS TUF $921.) EDIT 2: I guess I'll wait for the 3080 Ti; as long as it has 11-12GB or more I'll be OK. I've already seen games taking 9+GB with RivaTuner on-screen stats, so 10GB is just a no.
slimmy427:

Ray Tracing may become more important as time carries on. I suspect more developers may spend less time programming in dynamic lighting and proper shadows if they can just use RT. At this point the value in the 3080 is paramount if you want RT...I am still impressed by the 6800 and 6800xt
I don't think the question is whether but when. There's no other way we'll see movie-like games one day. Sadly, sloppy devs are more interested in doing a half-assed job with RT and DLSS and just slapping "RTX On" on the box to sell more. That's why we're seeing RT develop slowly - not enough games like Control, Minecraft, CP2077.
Great review and conclusion Hilbert. 6900XT and 6800XT are great raster GPUs but the price difference for so little more RT performance makes it a huge question mark...
Sovsefanden:

Most AMD bois apparently do not care about RT and AI UPSCALING like DLSS (with a 100% perf boost and even an IQ boost with proper implementation). RT + DLSS used in combination allows for RT without a perf hit and looks insane - wait till people see Cyberpunk in full RT glory. The OFF images look DULL in comparison. AMD should have priced the 6800 at $449, the 6800XT at $599, and this 6900XT at $749 tops. The 3080 at $699 will easily outsell both the 6800XT and 6900XT.
DLSS doesn't give an IQ boost over native - it sometimes clarifies text but everything else is definitely still worse and in motion there are noticeable artifacts. It's a great technology and I hope it improves but it has issues. Also most of the earlier reviews say the RT implementation in Cyberpunk isn't great - the reflections are way too much and performance crashes in a bunch of areas even with DLSS.
From an aesthetic perspective, however, that abundance of reflective surfaces and light sources can be downright overwhelming with RT turned on, and walking through similar scenes with the feature disabled often reveals blended and diffuse lighting—still impressively lit, and arguably less gaudy. Other scenes, particularly some nighttime markets, look remarkably darker with RT disabled—enough so that I'm concerned console players will suffer from a lack of visibility in those zones. Weirdest of all is that CP77 doesn't include self reflection. You'll see allies and enemies reflected in nearby glass, while you'll always be a vampire-like phantom, unless you go into a bathroom and activate its mirror. Doing this with an RTX 3080 resulted in some insane slowdowns to single digit FPS levels, which seems to indicate some serious issues for CDPR in getting self reflections to work at all. Much of the "immersion" of ray tracing goes out the window when you, the star of the show, don't get to be a part of it, so that's a bummer.
Sovsefanden:

3080 at 699 will easily sell out
FTFY 😛 But yeah, with Turing Nvidia knew that RT isn't even remotely possible without DLSS, and now even a 3080 will struggle not to drop a frame below 60 here and there if you don't use DLSS along with RT. I'm kinda like 🙄 when I hear people say AMD is coming up with an answer to DLSS soon to make RT playable, as if they hadn't known you need it for RT since Turing came out.
Sovsefanden:

Yes it does. Death Stranding, for example, looks sharper with DLSS 2.0 enabled most of the time. https://www.pcgamesn.com/death-stranding/dlss-2 Why do people WITHOUT DLSS 2.0 experience always claim otherwise? LOL!
I have DLSS 2.0 and I'm claiming otherwise. In still pictures it's good, great even - in motion Death Stranding has a lot of weird issues going on with DLSS, moire effect in various places, smearing in wires/particles, etc - particularly with the water in the game. So yeah, still disagree.
DLSS has a lot less flickering in motion than native/TAA in games that tend to produce it. A lot less. I've seen it miss a texture here and there or produce some slight motion artifacting, but so does TAA, and to a bigger degree. Watch it thoroughly; it's amazing how it gets rid of flickering where native/TAA frankly look like an absolute mess [youtube=jS4_tUHv5zE]
Fox2232:

Then you've been out of the loop for a year. But that's the difference between official and background information. Samsung officially states that even now they are only sampling 16Gb GDDR6 at 16Gbps, and that they only have 16Gb GDDR6 at 14Gbps in mass production. https://www.guru3d.com/news-story/samsung-starts-gddr6-production-that-offers-bandwidth-of-18gbits.html (source) The problem, as always, is cost vs demand.
I don't think I'm out of the loop, since I've never seen a single card use them.
slimmy427:

I am disappointed by the limited gap in performance between the 6800, 6800XT and the 6900XT. It's almost hard to justify the 6900XT's existence.
On paper, yes. But people would trash AMD if they did not have a 6900XT. 99% of the market doesn't buy halo products but still cares about who is the best. It's not rare to see someone buy a worse mid-range product simply because the company making it has the best high-end product on the market. It's stupid, but it has been this way for a while.
The 6900XT is easily the fastest at 1080p/1440p, only being beaten by the 3090 at 4K. I think AMD will have an excellent lineup with the 6700XT in January - just, damn it, make more of them!
And I thought the difference between the 3080 and 3090 was small - with AMD it's even smaller from the 6800XT to the 6900XT. Still a great card of course, but yeah, I'm a little bit disappointed.
So the RTX3080 is still easily the best bang for the buck.