Sapphire Radeon RX 6800 XT NITRO+ review

Lordhawkwind:

Hey Hilbert, I get this is your day job, but these cards ARE NOT AVAILABLE TO BUY. You need to call this out and stop reviewing vapourware products, otherwise both Nvidia and AMD will keep supplying cards none of us can buy, making you a shill. TBH I don't even read the reviews for these non-existent cards anymore; it's a waste of my time since I can't buy them. You make them a recommended buy when there is no stock availability. Stand up for us PC gamers and say enough is enough.
How do we know what the availability would be if these cards were purchased by gamers instead of scalpers? https://www.ebay.com/itm/Sapphire-Nitro-Radeon-RX-6800xt-se-OC/143862620959?hash=item217ee1631f:g:oIsAAOSw5XVfvrXD
Webhiker:

If this is not a paper launch then I don't know what a paper launch is..... https://i.imgur.com/nCvSpz4.png
Considering the total Ethereum mining hashrate has increased by roughly 90% this year, from 147 TH/s (January) to 280 TH/s, I wouldn't be surprised if cards are being poached before they make it to retail. Same goes for the Nvidia 3000 series.
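The growth figure quoted above is easy to sanity-check; a minimal sketch, using the poster's own (unverified) hashrate numbers:

```python
# Ethereum network hashrate as quoted in the post above (TH/s).
# These are the commenter's figures, not independently verified.
start_ths = 147.0  # January, as quoted
end_ths = 280.0    # at time of posting, as quoted

growth_pct = (end_ths - start_ths) / start_ths * 100
print(f"Hashrate growth: {growth_pct:.0f}%")  # prints "Hashrate growth: 90%"
```

So the "increased by 90%" claim is consistent with the two numbers given.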
Obsidian_Oasis:

Honestly, I'm disappointed by AMD's results overall. Man, I was really looking forward to going all Red soon, but unless the 6900 XT (I'm doubtful) crushes the RTX 3090 in EVERYTHING, I will pass on both these series from AMD and NVIDIA and wait for RDNA 3/Hopper. AMD had two years to work on a DLSS-like alternative before launch. Failed out of the gate. Two years not just to match or come close, but to PASS NVIDIA by a lot when it comes to RT. Failed out of the gate. Sadly, NVIDIA's 30 series is more future-proof than AMD's offerings, especially when it comes to features like Broadcast, NATIVE tensor cores for its ML/AI upsampling, etc. Going forward, the future is ray tracing, not rasterization. Just look at the lineup of future games. Why upgrade to a next-gen RT GPU just to play games that a 5700 XT/1080 Ti can handle easily @ 1440p/120?
Which new game can a 1080 Ti run at 120 fps at 1440p? Hell, let it even be 1080p 🙂. Stop smoking whatever you are smoking. RT may be the future, just not right now, not this current gen.
Great review. I cannot buy ANY 6800 in my region, so thanks AMD and scalpers for helping me save money until the next shipment at RRP.
Obsidian_Oasis:

I didn't say NEW games, smokey. [youtube=oh2u5o5uGY0]
And you buy a new GPU to play old games? Cool story. Go on blowing those bubbles 🙂
I went on Newegg US this morning and EVERY 6800 XT was priced over 1,000 dollars (sold out). Then a couple of hours later they changed them back to normal prices... that is what I call bullshit.
Obsidian_Oasis:

I think you need to slow down a minute. That was my point: why would anyone purchase one of these next-gen GPUs just to play rasterization games? There are a lot of RT haters here who cheer for rasterization games, yet they are looking at these NEXT-GEN CARDS to play their old DX9 to DX11 (some DX12) games. I gamed at 4K before I sold my 2080 Ti NVLink cards.
I'd rather play high-fps 1080p at max graphics than supposedly playable 4K at low fps. It's not about RT hate, it's about what it is in its current form: a gimmick, nothing else, with a huge performance impact. This generation won't have anything meaningful regarding RT; that's why AMD bet on raster performance, because it matters. High-fps gaming matters. Also keep in mind both current consoles use an AMD GPU/CPU combo; do you really think an Nvidia GPU will be that future-proof? :) Especially the 3070 with 8 GB VRAM, that one will sure be future-proof as hell. I'd rather have 100 fps at 1080p than 50 fps at 4K. But I guess everyone is different; for some 50 fps is gaming, for others it is not. https://www.guru3d.com/index.php?ct=articles&action=file&id=66424
I had the option of a 3070 or a 6800, at almost the same price here. But it was a no-brainer for me: at 1080p it destroys it, and it has double the VRAM. It's a matter of preference; RT/DLSS is not a thing for me yet, in 2-3 years maybe, and then I will switch to Nvidia, or to AMD RDNA3 if it's good. This generation we have true competition, and you can't go wrong with either choice imo. But for me the AMD pros are:
+ At 1080p the 6800 often matches the 3080, which is more expensive.
+ Much more VRAM (no need to worry that 16 GB will become obsolete in the foreseeable future).
+ Much better power efficiency.
Those pros beat, in MY book: DLSS, RT.
Obsidian_Oasis:

Double doesn't mean better at higher resolutions, though, especially with the narrow memory bus on AMD's 6800/XT. I'm still optimistic about the performance of the 6900 XT, but it's not looking good for it either, since the only major difference between the 6800 XT and the 6900 XT is 72 CUs vs 80 CUs. Time will tell.
I'm not interested in anything higher than 1080p/1440p (maybe I will switch to a 1440p 27" screen next year). At those resolutions RDNA2 is very good. For 4K the 3070 with 8 GB is a joke, which will soon be shown when next-gen games appear. Overall the card is good, but I couldn't accept 8 GB of VRAM.
@Hilbert Hagedoorn, can you please check whether the Nitro+ HDMI port is 2.1 or 2.0b? If it's 2.0b it would be a bit of a letdown, but it's what some shops are listing.
It should be a single HDMI 2.1 and 3x DisplayPort 1.4 outputs. But yeah, that's just what is listed; not sure if anyone has plugged it into a 2.1 TV (or one of the few computer displays on the market with this already) to actually verify it.
kapu:

has double the vram.
Having double the RAM is a negative for the product, not a positive, as it would perform the same with half the VRAM and would therefore cost less. More VRAM that doesn't help performance and increases cost = bad, no matter which way you look at it.
kapu:

For 4k 3070 with 8gb is a joke , which soon will be shown when next gen games will appear . Overall card is good but i couldnt accept 8gb vram.
In 3+ years? Maybe. But by that point, anyone buying one of these cards will likely be interested in a newer GPU that performs much better than a 6800/XT. More VRAM doesn't magically make a GPU better, and this can be seen in how the 6800/XT fall off at high resolutions, even though, aside from the 3090, they all have more VRAM than Nvidia's offerings. This should be a staple in tech-oriented forums: don't pay attention to the specs, pay attention to how it performs. If you only pay attention to specs, then the idea would be that a 128-core/256-thread system with 256 GB of RAM and an RTX A6000 with 48 GB of memory is completely and totally worth it for gaming; it should be 5-10 times faster than the fastest normal gaming system, right? .....right....
Aura89:

Having double the RAM is a negative for the product, not a positive, as it would perform the same with half the VRAM and would therefore cost less. More VRAM that doesn't help performance and increases cost = bad, no matter which way you look at it.
8 GB is not enough for 4K in some cases; Godfall can go over 12 GB. 11 GB is the absolute minimum I would accept on a current high-end card, and I would not go below 8 GB on lower-midrange cards.
Aura89:

Having double the RAM is a negative for the product, not a positive, as it would perform the same with half the VRAM and would therefore cost less. More VRAM that doesn't help performance and increases cost = bad, no matter which way you look at it. In 3+ years? Maybe. But by that point, anyone buying one of these cards will likely be interested in a newer GPU that performs much better than a 6800/XT. More VRAM doesn't magically make a GPU better, and this can be seen in how the 6800/XT fall off at high resolutions, even though, aside from the 3090, they all have more VRAM than Nvidia's offerings.
I have a 2080 Ti and a WQHD monitor, and in the past I did occasionally run out of video memory, causing games to stutter, freeze or even crash. If that happened with 11 GB at WQHD, it would be much worse at 4K with 8 GB or even 10 GB. It's just a few games today, but RDNA3 is coming around the end of 2022, which is too long to risk not having enough video memory, especially on cards with prices approaching $1k. At $300, then maybe, but I would still rather pay $50 for a few gigabytes more.
kapu:

I had the option of a 3070 or a 6800, at almost the same price here. But it was a no-brainer for me: at 1080p it destroys it, and it has double the VRAM. It's a matter of preference; RT/DLSS is not a thing for me yet, in 2-3 years maybe, and then I will switch to Nvidia, or to AMD RDNA3 if it's good. This generation we have true competition, and you can't go wrong with either choice imo. But for me the AMD pros are:
+ At 1080p the 6800 often matches the 3080, which is more expensive.
+ Much more VRAM (no need to worry that 16 GB will become obsolete in the foreseeable future).
+ Much better power efficiency.
Those pros beat, in MY book: DLSS, RT.
First of all, congrats on your new card. You got yourself a great GPU and I hope it serves you well. But how about putting the efficiency numbers into a grander perspective? According to 4090 individual benchmarks, including Guru3D's, the 3070 is 3.2% more efficient than the 6800 at 1080p, and the 6800 XT is 1.4% more power efficient than the 3080 at 4K (6.8% at 25x16). Anyone desiring efficiency with top performance, but unwilling to get a 3090, might consider getting a 3080 and dialing it down to 6800 XT performance level, ending up with eye-tearing perf/W. Classic rendering, mind you; never mind DLSS, let alone RT. [spoiler] https://abload.de/img/nnh8gj5u.png [/spoiler]
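For readers wondering where percentages like those come from: perf/W comparisons of this kind are typically derived from average fps divided by measured board power. A minimal sketch, using made-up placeholder numbers (not the measured data behind the figures above):

```python
# Sketch of a perf-per-watt comparison between two cards.
# All fps and wattage figures are hypothetical placeholders.
def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames rendered per joule of board power."""
    return avg_fps / board_power_w

card_a = perf_per_watt(avg_fps=100.0, board_power_w=220.0)  # hypothetical card A
card_b = perf_per_watt(avg_fps=140.0, board_power_w=320.0)  # hypothetical card B

advantage_pct = (card_a / card_b - 1) * 100
print(f"Card A is {advantage_pct:+.1f}% more efficient than card B")
# prints "Card A is +3.9% more efficient than card B"
```

Note that the faster card can still lose on efficiency, which is why the single-digit percentage gaps quoted above can flip depending on resolution and test suite.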
I was expecting closer to 1000, better than the numbers I was looking at, with an ETA of 2021 for when anything is stocked too. So yeah: pay over the expected price, even with VAT on top of the MSRP, and then wait four to six months. It's about the same as the stock situation for the 3080s, though hopefully the pricing actually comes down when availability improves. EDIT: Yeah, 9500 SEK plus shipping and such for the one store I've found that lists these, so 930-something EUR. https://www.inet.se/produkt/5411936/sapphire-radeon-rx-6800-xt-16gb-nitro Actually free shipping, though (but not expected until early 2021), but at 3000-4000 SEK above the standard versions, yeah. (Granted, those are pretty much gone and little to no further stock is expected, so it's going to be 7000-8000 as the baseline pricing, then another 1000-2000 for the higher-tier variants.)
Noisiv:

First of all, congrats on your new card. You got yourself a great GPU and I hope it serves you well. But how about putting the efficiency numbers into a grander perspective? According to 4090 individual benchmarks, including Guru3D's, the 3070 is 3.2% more efficient than the 6800 at 1080p, and the 6800 XT is 1.4% more power efficient than the 3080 at 4K (6.8% at 25x16). Anyone desiring efficiency with top performance, but unwilling to get a 3090, might consider getting a 3080 and dialing it down to 6800 XT performance level, ending up with eye-tearing perf/W. Classic rendering, mind you; never mind DLSS, let alone RT. [spoiler] https://abload.de/img/nnh8gj5u.png [/spoiler]
Bad perspective. The 6800 is quite far ahead in new games. I don't care about older games with CPU bottlenecks. Also, performance per watt is much better. If you like, I can post some graphs. The only place the 3070 beats the 6800 is DLSS/RT in some games. Other than that it gets destroyed.
Fox2232:

As for the 3090, it is not efficient in any way. The card needs extra power above the official TBP to properly pull away from the 3080. And as for that same image being pushed around over and over again.
Complete nonsense. The 3090 is equally as efficient as the 6800 XT at 4K (-0.7% difference according to a collection of 4090 benchmarks).
kapu:

Bad perspective. The 6800 is quite far ahead in new games. I don't care about older games with CPU bottlenecks. Also, performance per watt is much better. If you like, I can post some graphs. The only place the 3070 beats the 6800 is DLSS/RT in some games. Other than that it gets destroyed.
I know you can post some benchmarks. I posted them ALL. You did say RT is not for you; somehow I didn't conclude from that that you only care about new games 🙂