Review: ASUS ROG Radeon RX 6800 XT STRIX OC Liquid Cooled

Robbo9999:

Definite waste of money at $999 for this liquid-cooled version: 9% more performance (when overclocked) than a $650 reference card (not overclocked), so overclock the $650 reference card and you narrow the 9% lead of the $999 liquid-cooled card even further. Waste of money! You could buy the liquid-cooled card for lower noise, but $999 vs $650... naaaa!
Not only that, but who here wouldn't buy the full-fat 6900 XT at $999 and then put a block on it (or wait to do the same)?
tunejunky:

Not only that, but who here wouldn't buy the full-fat 6900 XT at $999 and then put a block on it (or wait to do the same)?
Well, the stock air-cooled 6900 XT would be better value than the liquid-cooled 6800 XT in this review, but the 6900 XT only has 11% more cores than the 6800 XT and the same memory bandwidth, so really the 6900 XT is poor value: 53% more expensive than the air-cooled 6800 XT while theoretically offering only 11% more performance. So the best bet is to buy an air-cooled 6800 XT for $650.
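The value argument above is just arithmetic, and can be sanity-checked in a few lines of Python. The prices and the 9%/11% deltas are the figures quoted in this thread; treating those percentages as linear performance scaling is an optimistic assumption (the 6900 XT's extra CUs rarely translate 1:1 into fps):

```python
# Rough perf-per-dollar comparison using the figures quoted in the thread.
# Assumption: performance scales linearly with the quoted percentage deltas.

REFERENCE_6800XT = {"price": 650, "perf": 1.00}  # baseline
STRIX_LC_6800XT  = {"price": 999, "perf": 1.09}  # +9% when overclocked
STOCK_6900XT     = {"price": 999, "perf": 1.11}  # +11% CUs, optimistic

def value(card):
    """Relative performance per dollar, normalized to the reference card."""
    ref = REFERENCE_6800XT["perf"] / REFERENCE_6800XT["price"]
    return (card["perf"] / card["price"]) / ref

print(f"STRIX LC 6800 XT: {value(STRIX_LC_6800XT):.2f}x reference value")
print(f"6900 XT (stock):  {value(STOCK_6900XT):.2f}x reference value")
```

Both $999 options land around 0.7x the reference card's performance per dollar, which is exactly the point being made: the $650 reference 6800 XT is the best value of the three.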
cucaulay malkin:

I would really like to see Ampere and RDNA2 compared at 1080p and 1440p on a 3600 or a similar CPU. Performance looks very good, but the tests are all done on a 10900K, and ain't nobody got money for that. I'm all for buying high-end GPUs, drives, cases and stuff; I do own them now and did own them before, but I will never, never pay $400 or more for a gaming processor. I'm already uncomfortable with $350.
I have a 3300X with a 6800 (non-XT). Zero bottleneck in demanding games. In older games, around 60-70% GPU usage, still usually above 200 fps at 1080p. No need for a 10900K or 5600X 😀
data/avatar/default/avatar02.webp
Interested in how well this cooler holds up after a month. Both the Vega and 5700 ASUS ROG models had bad coolers, with the 5700's literally falling off the board. My Vega 64 had bad VRM pads, plus ASUS fan and RGB control that isn't compatible with AMD Wattman, so fan control doesn't work.

The GPU max clock is set to 2400 MHz in the BIOS, so fixing the VRM pads makes the card auto-overclock to 1680 MHz, 90 MHz over stock boost, leading to a 110°C hotspot and instability. Downclocking in Wattman or disabling the top speed step only seems to make the card more unstable. ASUS says it is working as intended; warranty repair denied. The only thing that seems to help is ripping the three fans off and replacing them with two 140 mm fans held in place with cable ties, and keeping GPU load away from 100%.
kapu:

I have a 3300X with a 6800 (non-XT). Zero bottleneck in demanding games. In older games, around 60-70% GPU usage, still usually above 200 fps at 1080p. No need for a 10900K or 5600X 😀
There's something weird going on in CPU-intensive scenes in DX11 games with the 6800 cards; looks like AMD driver overhead, episode 374538975. I dumped my R9 290 in 2014 because of this: https://www.purepc.pl/test-ryzen-7-5800x-i-radeon-rx-6800-xt-w-miejscach-procesorowych
Undying:

Good thing we all moved on from DX11 and 2013.
Did we? Looking at G3D's game performance reviews, games are still mostly DX11.
cucaulay malkin:

There's something weird going on in CPU-intensive scenes in DX11 games with the 6800 cards; looks like AMD driver overhead, episode 374538975. I dumped my R9 290 in 2014 because of this: https://www.purepc.pl/test-ryzen-7-5800x-i-radeon-rx-6800-xt-w-miejscach-procesorowych
There's nothing weird going on; this is classic Radeon driver performance in D3D11 titles. [youtube=nIoZB-cnjc0] You either buy the highest-IPC CPU you can to mitigate it, or use DXVK. [spoiler] [youtube=XoKu0_2ozAc] [youtube=S546TL2LWNY] [/spoiler]
Astyanax:

There's nothing weird going on; this is classic Radeon driver performance in D3D11 titles. You either buy the highest-IPC CPU you can to mitigate it, or use DXVK. [spoiler] [youtube=XoKu0_2ozAc] [youtube=S546TL2LWNY] [/spoiler]
It doesn't get better than a 5800X; isn't that the fastest single-core now? I swear I thought RDNA1 fixed it.
Fox2232:

Seems like this is not the only overpriced card from ASUS. Our shops are starting to list their standard TUF cards, but the price tags are really something: the TUF 6800 (non-XT) is 19.4% more expensive than the reference XT card, the STRIX 6800 (non-XT) 28% more, the TUF 6800 XT 38% more, and the STRIX-LC 6800 XT 55% more. There was a rumor that there is not going to be a 2nd/3rd batch of reference cards, and I think ASUS behaves exactly as if there were nobody to put its prices into perspective. But even if there are no more reference 6800 (XT) cards, there is going to be the full chip in the form of the 6900 XT, and ASUS is matching its price with the STRIX-LC card. Reminds me of the WoW loot system, "Greed or Need", except with a twist: we need those GPUs, and ASUS is a bit too greedy.
That's not just ASUS; ASRock, Gigabyte, they all cost 3500-4500 here for a 6800 XT. I'd understand that if they were available, but no, they're on pre-order. For comparison, the 3080 I ordered was 3200. Of course it hasn't arrived yet.
cucaulay malkin:

Did we? Looking at G3D's game performance reviews, games are still mostly DX11.
That should change here on Guru3d. Most other reviews already dropped them.
Undying:

Most other reviews already dropped them.
What are you talking about, dude? Nobody dropped DX11. Should we drop games like Flight Simulator, Mafia: Definitive Edition, Fallen Order, Anno 1800 and plenty more just because they're DX11? What for? To get another performance drop, as is usually the case with DX12? https://www.purepc.pl/test-wydajnosci-resident-evil-3-remake-pojdzie-na-starym-trupie?page=0,10 Until we see DX12 become more than a botch job, the idea of dropping DX11 from reviews is outrageous, and no one is doing it. Really.
cucaulay malkin:

What are you talking about, dude? Nobody dropped DX11. Should we drop games like Flight Simulator, Mafia: Definitive Edition, Fallen Order, Anno 1800 and plenty more just because they're DX11? What for? To get another performance drop, as is usually the case with DX12? https://www.purepc.pl/test-wydajnosci-resident-evil-3-remake-pojdzie-na-starym-trupie?page=0,10 Until we see DX12 become more than a botch job, the idea of dropping DX11 from reviews is outrageous, and no one is doing it. Really.
It boggles my mind seeing GTA, or some 5-year-old DX11 game, still tested today, and that it should affect my decision on a future GPU upgrade. Makes no sense.
Undying:

It boggles my mind seeing GTA, or some 5-year-old DX11 game, still tested today, and that it should affect my decision on a future GPU upgrade. Makes no sense.
Yeah, GTA 5 is way too old. Well, it's not just the age; Witcher 3 is 5 years old but still holds up. It's just that GTA 5 looks slightly dated IMO, plus it runs on a potato. To run Witcher 3 with max draw distance in those swampy environments you need some real GPU power.
Fox2232:

Witcher 3 is about to get a DXR/RTX ray tracing update. So it will either qualify itself for a few more years of benchmarking, or disqualify itself.
Nice, might replay it (though I have 400 hrs), but my chances of getting a card anytime soon are slim.
cucaulay malkin:

I swear I thought RDNA1 fixed it.
RDNA fixed tessellation so it no longer chokes the GPU; it didn't do anything about the serial command submission in D3D11.
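The serial-submission point boils down to a simple bottleneck model: the frame takes as long as the slower of the GPU's render time and the driver thread's submission time. A toy sketch (all numbers are made up for illustration, not measurements):

```python
# Toy model of why D3D11's mostly single-threaded submission punishes
# slower CPUs: frame time is capped by whichever is longer, GPU render
# time or CPU-side submission time.

def fps(gpu_ms, submit_ms_per_draw, draws, cpu_threads):
    """cpu_threads > 1 approximates D3D12/Vulkan-style parallel recording."""
    cpu_ms = submit_ms_per_draw * draws / cpu_threads
    frame_ms = max(gpu_ms, cpu_ms)
    return 1000.0 / frame_ms

# A CPU-heavy scene: 5000 draw calls, 0.003 ms of driver work each.
print(fps(gpu_ms=8.0, submit_ms_per_draw=0.003, draws=5000, cpu_threads=1))
print(fps(gpu_ms=8.0, submit_ms_per_draw=0.003, draws=5000, cpu_threads=6))
# Single-threaded, 15 ms of submission caps the frame at ~66 fps even
# though the GPU alone could do 125 fps; spread over 6 threads, the GPU
# becomes the limit again.
```

This is why a higher-IPC CPU (or DXVK, which moves submission to Vulkan) helps in D3D11 titles: it shrinks the single-threaded `cpu_ms` term rather than the GPU term.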
Administrator
Undying:

That should change here on Guru3d. Most other reviews already dropped them.
The adoption rate for DX12 has been tremendously slow over the years, ergo most game developers to this date still prefer DX11. That said, our current reviews count eight DX12 games, four DX11 games, and one Vulkan title. Of the DX11 titles, two stem from a 2020 release, one from last year, and one is old (Witcher III), but people really like to keep that one included. Also, for a valid review you need to test all popular APIs, and that includes DX11. So I don't see how your remark is valid, really.
data/avatar/default/avatar29.webp
Edit: to add to Hilbert's comment above, only DX11 or older games really show the CPU impact, because DX11 cannot properly use more than 4 cores and therefore cares a lot about CPU clock speed, unlike DX12. That's the main reason for the Ryzen 5000 improvements, and why Intel was still king in games/apps limited to few cores.

For comparison purposes: 1080 Ti OC (280-320 W), a single 360 mm x 25 mm radiator just for the GPU, Phanteks Glacier waterblock. Idle 28°C, max 45°C with some 46-47°C peaks in benchmarks; it barely sees 40°C, and in light games it's around 35°C. The instant the GPU load drops, the card is back at 28°C, an almost vertical temperature drop, and that stays true after 4-8 hour sessions of GPU-heavy games: you stop playing and the GPU goes cold as if nothing happened (not the backplate though). This LC card is better than an air-cooled card but nowhere near a custom loop; it seems very expensive for what it does.
kakiharaFRS:

Edit: to add to Hilbert's comment above, only DX11 or older games really show the CPU impact, because DX11 cannot properly use more than 4 cores and therefore cares a lot about CPU clock speed, unlike DX12. That's the main reason for the Ryzen 5000 improvements, and why Intel was still king in games/apps limited to few cores.

For comparison purposes: 1080 Ti OC (280-320 W), a single 360 mm x 25 mm radiator just for the GPU, Phanteks Glacier waterblock. Idle 28°C, max 45°C with some 46-47°C peaks in benchmarks; it barely sees 40°C, and in light games it's around 35°C. The instant the GPU load drops, the card is back at 28°C, an almost vertical temperature drop, and that stays true after 4-8 hour sessions of GPU-heavy games: you stop playing and the GPU goes cold as if nothing happened (not the backplate though). This LC card is better than an air-cooled card but nowhere near a custom loop; it seems very expensive for what it does.
A lot of DX11 games handle 8 cores or more just fine: all AnvilNext-based games (Watch Dogs 2, Odyssey, Origins, Wildlands), all Frostbite-based games (BF, Battlefront and what have you), Witcher 3 too, and more. DX11 not using more than 4 cores is a myth. While I do think Vulkan is superior to both, and its adoption rate is way too slow, I'd rather see a good DX11 implementation than hastily ported DX12 crap.
cucaulay malkin:

Did we? Looking at G3D's game performance reviews, games are still mostly DX11.
I don't much care, TBH. I still get 150+ fps in older games. Tested Witcher 3 (heavily modded); I get drops to 100 fps in Novigrad, CPU at 60-70% while the GPU is only at 60%. All new games, including Horizon Zero Dawn, Valhalla and RDR2, are maxing the GPU at 97-100% with my 3300X (why upgrade?). I got a lucky bin though, boosting to 4350 MHz, 4250 all-core at stock. Got it to 4550 MHz at 1.41 V stable. I can do 4.6 GHz in some synthetics, but games crash instantly 🙂
https://forums.guru3d.com/data/avatars/m/282/282473.jpg
You may get away with a low-end CPU for a card like that for a while, but long-term you'll see how severely mismatched they are. For a 60 Hz monitor, sure, you may see very little bottleneck; that 3300X may even last a while.