Sapphire Radeon RX 6800 XT NITRO+ review

Fox2232:

I did see and read what actual owners of those cards post. That's enough for me. https://www.techpowerup.com/forums/threads/rtx-3080-undervolting-adventures.272936/post-4363787 https://bjorn3d.com/2020/10/undervolting-the-rtx-3080-and-the-rtx3090/2/#split_content
Here we go again with the famous tactic of: Scr3w the multitude of (4090) benchmarks, let me throw some endless garbage from random users at you, which I myself didn't bother to read. Because if you had read it, you'd have seen it says: "Even at 850 mV/1860 MHz we see a big gain in all fronts."
Fox2232:

https://www.reddit.com/r/nvidia/comments/iwt953/rtx_3080_to_undervolt_or_not_to_undervolt_that_is/
Your point, if any? This is overclocking, with undervolting. Plx stay focused: your claim was "3090, it is not efficient in any way." Turns out that, according to the collection of 4090 benchmarks, the 3090 is 0.7% less efficient than the 6800 XT while being 20% faster! Meaning the 3090 smashes the 6800 XT at the same performance level. Time to say "I was wrong"..... maaaybe... try it for once?
Noisiv:

Complete nonsense. The 3090 is as efficient as the 6800 XT at 4K (-0.7% difference according to the collection of 4090 benchmarks). I know you can post some benchmarks. I posted them ALL. You did say RT is not for you. Somehow I didn't conclude from that that you only care about new games 🙂
The 3090 or 3080 is not as efficient as the 6800 or 6800 XT. https://tpucdn.com/review/amd-radeon-rx-6800/images/performance-per-watt_3840-2160.png Actually it's quite far away. Get your information straight. Games that are heavy on the GPU even at low res: SOTTR, DOOM, Horizon Zero Dawn, Valhalla, Red Dead 2. Doesn't matter which, there aren't many games where the 3070 can beat the 6800, and in a couple it gets DESTROYED (there are a few where it beats the 3080/90...). [youtube=braSQtZ3wR8] I don't need to post more proof. The 3070 gets destroyed 90%+ of the time, by a rather large margin. And like I said, I don't care for RT or DLSS this generation.
Fox2232:

Well, what you call garbage is posted by owners of the cards, your fellow enthusiasts.
I called it garbage because it deserves no better. Not when, in a completely expected attempt at "but what about this", a single point is added to divert from the 4090 of them. But that was not enough for you, so you posted no less than three links without anything that proves your claims or disproves mine. Nor was any of this so compelling that it needed to be discussed right now. And you did so for no other reason than to divert from the discussion at hand... to whatever comes across a random area of your brain.
Fox2232:

And what you call gospel is the worst site there is in the PC HW space.
What the... where is this coming from? Stay focused, please - for the sake of the conversation. Don't make claims like this that you can't substantiate.
Fox2232:

If one site should be taken as gospel here, it would be G3D. But I do respect actual owners of the cards and their data, because they have to live with the cards. They'll notice an unstable undervolt and increase the voltage until their card is stable. That's in contrast to those who just run 3DMark and, if there is no crash, declare the UV successful. And with all those UV stories being pushed around, I have a feeling that Hilbert may have an article in the works. Maybe even some comparison between different manufacturing processes.
Guru3d is prominently featured in EVERY 3dcenter Launch Analysis aggregate, including this one. I do get the point about possible UV/OC issues. I am sure everybody here does. Why are you lashing out at this particular case of UV in the wild? Never saw you do that before... hmm, weird.
Fox2232:

Now to the bold part: Yeah. Except you again ignored the reality of "performance per watt per $".
I didn't ignore anything. Remember, this was your claim, not mine:
Fox2232:

As for 3090, it is not efficient in any way.
*******
kapu:

3090 or 3080 is not as efficient as 6800 or 6800 XT.
The aggregate of 4090 benchmarks says otherwise, TPU included. Thanks for posting a single point to add to these benchmarks.
Noisiv:

I called it garbage because it deserves no better. Not when, in a completely expected attempt at "but what about this", a single point is added to divert from the 4090 of them. But that was not enough for you, so you posted no less than three links without anything that proves your claims or disproves mine. Nor was any of this so compelling that it needed to be discussed right now. And you did so for no other reason than to divert from the discussion at hand... to whatever comes across a random area of your brain. What the... where is this coming from? Stay focused, please - for the sake of the conversation. Don't make claims like this that you can't substantiate. Guru3d is prominently featured in EVERY 3dcenter Launch Analysis aggregate, including this one. I do get the point about possible UV/OC issues. I am sure everybody here does. Why are you lashing out at this particular case of UV in the wild? Never saw you do that before... hmm, weird. I didn't ignore anything. Remember, this was your claim, not mine: ******* The aggregate of 4090 benchmarks says otherwise, TPU included. Thanks for posting a single point to add to these benchmarks.
An aggregate of 4000 useless tests, sure. All respectable sources claim the same; I trust TPU, Guru3d, Gamers Nexus. Those say the efficiency is better. Even more, I actually own one of those cards, and MY results are 1:1 the same as guru3d's - same card, same results. You can say white is black, but it won't become black because you say so. We could even say the 3070 has the SAME power efficiency (which it hasn't); it does LOSE in almost ANY game at 1080/1440p (also at 4K, but whatever).
kapu:

An aggregate of 4000 useless tests, sure. All respectable sources claim the same; I trust TPU, Guru3d, Gamers Nexus. Those say the efficiency is better. Even more, I actually own one of those cards, and MY results are 1:1 the same as guru3d's - same card, same results. You can say white is black, but it won't become black because you say so. We could even say the 3070 has the SAME power efficiency (which it hasn't); it does LOSE in almost ANY game at 1080/1440p (also at 4K, but whatever).
Without anything to indicate shadiness, I trust them all. You said you were interested in efficiency and 1080p. The 3070 being only 3.3% behind the 6800 is why I thought this was relevant, so I posted it.
Noisiv:

According to 4090 individual benchmarks, including guru3d, the 3070 is 3.2% more efficient than the 6800 at 1080p. The 6800 XT is 1.4% more power efficient than the 3080 at 4K (6.8% at 25x16).
I didn't present some wild theories. Instead I simply posted numbers. UN-CHERRY-PICKED numbers. Funny that it would cause a ruckus.
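For readers wondering how a "X% more efficient" figure like the ones quoted above is actually derived, the sketch below shows the usual perf-per-watt ratio calculation. The fps and wattage numbers are made up for illustration; they are not the thread's aggregate data.

```python
# Illustrative sketch of how a perf-per-watt comparison is computed.
# The fps and board-power figures below are hypothetical examples,
# NOT the actual aggregate numbers discussed in the thread.

def perf_per_watt(avg_fps: float, avg_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / avg_power_w

def efficiency_delta_pct(card_a: tuple, card_b: tuple) -> float:
    """Percent by which card A is more (+) or less (-) efficient than B."""
    eff_a = perf_per_watt(*card_a)
    eff_b = perf_per_watt(*card_b)
    return (eff_a / eff_b - 1.0) * 100.0

# (average fps in some 1080p suite, average board power in W) -- assumed
rtx_3070 = (150.0, 220.0)
rx_6800  = (160.0, 242.0)

delta = efficiency_delta_pct(rtx_3070, rx_6800)
print(f"3070 vs 6800 perf/W: {delta:+.1f}%")
```

With these assumed inputs the 3070 comes out a few percent ahead despite delivering fewer fps, which is exactly how a slower card can still top a perf/W chart.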
Noisiv:

Without anything to indicate shadiness, I trust them all. You said you were interested in efficiency and 1080p. The 3070 being only 3.3% behind the 6800 is why I thought this was relevant, so I posted it.
It's quite a bit more than 3%, more like 10%. It's also behind in performance. Also depends what you play; for my games right now it's way more than 10%. We'll see how it changes in the future - since the consoles are RDNA2, I look forward to future improvements 🙂
Fox2232:

Once more. You put WTFTech above the entire enthusiast community. That's a pretty big statement of yours.
Any proof of A N Y of this? 1) Which BIG statement did I make? 2) How did I put wcfwfwf above the "entire enthusiast community"? What drugs are you on? 3) Why are you shifting away from the 3090's supposed crap efficiency before even acknowledging you were wrong?
Fox2232:

Because that magical image with 330W going down to 200W with undervolt is from them.
Is there anything magical about Overvolting+Overclocking giving an exponential rise in power? No? Good. By the token of symmetry, Undervolt+Underclock gives you the same dramatically different power figure, only in the other direction. No magic, not even physics or math, just common sense.
Fox2232:

And asking why I have a problem with their data is not even a rhetorical question. You know it. It is simple. Everyone around shows that the real-world UV which people actually use can't replicate it.
Someone dared to UV+underclock a 3080 and you went berserk. I never made any statements regarding accuracy, or reproducibility, or how typical it is, in one direction or another. Nor did I use this single data point to draw any conclusions. The perf/W figures that I quoted are from the BIG aggregate of 4090 results and have NOTHING to do with wfwcwfwqfg. If so, then why are you still harping on this particular case of UV like your life depends on it?
Fox2232:

And yes, you do ignore a lot of things. Because the only thing I ever need to rip apart your arguments is to "step back and look at the bigger picture."
No idea what this is about. Your big picture seems to be: forget about the 4090 benchmarks and look at this guy's forum post. Better yet, let me throw in 3 different ones. Pigs. Pink Floyd LOL
Where did the video card teardown section disappear to? :( Product Teardown
Fox2232:

https://wccftech.com/undervolting-ampere-geforce-rtx-3080-hidden-efficiency-potential/ And you did post a serious question. Only one. I bolded it for you. But hey, are you really so triggered in point 3) by my original statement? Which happens to be true. As for the rest of your outburst: I do not do drugs. I do not need any prescription to modify my mood in these unpleasant times. But I see a pattern of self-harm where you enter AMD threads and can't keep away from throwing poo at products/technologies or claiming nVidia's is better. Expecting that everyone will either nod their head or look away. You can't even congratulate lucky @kapu on managing to get one of those cards without immediately attacking his choice.
This is not congratulating? This is not respectful? You must be HIGH.
Noisiv:

First of all congrats on your new card. You got yourself a great GPU and I hope it serves you well. But how about putting efficiency numbers into grand perspective?
From your random post link:
1905 MHz @ 0.90V
9180 (+3% over stock)
1877 MHz (hitting a few power limits)
Average power reduced by around 30-40W
Your guy OVERCLOCKED his 3080, INCREASING performance, while shaving 40 watts with the undervolt (0.9V). The wtfcfrwfc guy BOTH undervolted and UNDERCLOCKED his card (0.8V), LOSING 7% performance to shave 109W. So either you're incredibly dense for thinking that two UV cases are somehow contradicting each other and can't exist in the same universe, or you didn't bother to read your own link. Which is it? PS feel free to continue NOT staying on point and writing about cats and dogz and peanuts and shifting goalposts. Know that I am not EVEN READING that.
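Both undervolting outcomes argued over above are consistent with the standard first-order CMOS power model, where dynamic power scales roughly with frequency times voltage squared. The sketch below illustrates this; the stock power, effective stock voltage, and clock ratios are assumptions chosen for illustration, not measurements from either linked post.

```python
# First-order dynamic-power model: P ≈ k * f * V^2.
# Baseline numbers are ASSUMED for an RTX 3080-class card; real cards
# deviate (static power, power limits, boost behavior), so treat this
# as a back-of-envelope sanity check, not a prediction.

def scaled_power(p_stock_w: float, f_ratio: float,
                 v_new: float, v_stock: float) -> float:
    """Estimate board power after scaling clock (f_ratio) and voltage."""
    return p_stock_w * f_ratio * (v_new / v_stock) ** 2

P_STOCK = 320.0   # assumed stock board power, W
V_STOCK = 0.96    # assumed effective sustained stock voltage, V

# Case A: undervolt to 0.90 V at roughly unchanged clocks
case_a = scaled_power(P_STOCK, 1.00, 0.90, V_STOCK)

# Case B: undervolt to 0.80 V and drop clocks ~7%
case_b = scaled_power(P_STOCK, 0.93, 0.80, V_STOCK)

print(f"Case A saves ~{P_STOCK - case_a:.0f} W")   # mild undervolt
print(f"Case B saves ~{P_STOCK - case_b:.0f} W")   # deep UV + underclock
```

Under these assumptions the mild undervolt saves on the order of 40 W while the deeper undervolt-plus-underclock saves roughly three times as much, so the two results are different points on the same curve rather than a contradiction.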
Noisiv:

Your guy OVERCLOCKED his 3080, INCREASING performance, while shaving 40 watts with the undervolt (0.9V). The wtfcfrwfc guy BOTH undervolted and UNDERCLOCKED his card (0.8V), LOSING 7% performance to shave 109W. So either you're incredibly dense for thinking that two UV cases are somehow contradicting each other and can't exist in the same universe, or you didn't bother to read your own link. Which is it?
That was not a rhetorical question. Of course you are free to ignore it like you usually do with anything inconveniently on-topic, and continue with whatever pops up in your mind no matter how unrelated. But you will be remembered as the guy who rallied against one set of results, imploring my christian soul not to go against "your fellow enthusiasts" (LOL), yet you yourself linked a guy with, in all likelihood, an even better 3080. According to your link, the wccfffg guy is practically bad-mouthing the 3080, cos yours shaves 40W while increasing performance. How's that for a practical sitcom 😀
Noisiv, you've been really combative lately, you ok?
Astyanax:

Noisiv, you've been really combative lately, you ok?
link me some of your anime in pm and ill tell you
Noisiv:

link me some of your anime in pm and ill tell you
o_O
Astyanax:

o_O
plx some of your waifu anime, before running off to any of the dozen threads where you are fighting the good fight atm
Fox2232:

Your grand perspective was not respecting his point of view and preferences at all (if you even read it). Or maybe it is the best approach: when someone asks for advice and states a list of things he couldn't care less about, simply find a product that has all those low-priority/needless features. Tell them that they should buy that product and use the things they do not want to use. Then write about a different product to support the claim that you know better than someone what he needs and does not need. But it remains true that he made the right choice for his use case. Like it or not 🙂 But for real: A: "DLSS/RT is not for me." B: "Buy this, because it has DLSS and RT." It's practically a sitcom.
Right now I have another problem. I've already had two or three crashes (PC restarts) in RDR2 and Valhalla. I think 1 rail from the PSU might not be enough, 22 amps per rail. Hopefully tomorrow I can find a second PCIe cable to rule that out.
Fox2232:

Pull the power limit down to -20~25%. In case of a power issue, that may "fix" it. With a driver crash, I would expect a GSOD. The 20.11.2 driver has quite some issues with DX-R crashes and black screens. But as long as no GSOD is shown prior to the restart, it can be a power issue.
There is nothing, just a pure classic restart. The PSU is a be quiet! 680W Gold, 2x 12V rails at 22 amps (for PCIe), each on a separate cable. Currently the card is connected to 1 rail.
Fox2232:

I can't recommend putting such a load on that PSU via one rail. If it burns, it can take half the PC with it. If you can't get a 2nd cable, get a new PSU.
I can buy a cable. Costs 15 euros hahah. I think it will be fine with two cables. I'm 99% sure that's the problem here.
Fox2232:

The PSU is rated 55A on +12V, V1~V4 together, and V1/V2 are for the MB/CPU. So do not go over 660W in the sum of MB, CPU and GPU, and a 2nd cable should be fine. There is only one little problem. GPUs draw a certain amount of energy on average (300W default power limit), but within each frame, different parts of the GPU work in each time interval. That means the GPU can be pulling 15A as a minimum and maybe 35A as a maximum in certain milliseconds, yet average 25A over time. So I would not OC in any way. If anything, undervolt a bit.
With 2 cables I should be above that, but 1 cable is tight, very tight. The problem is, it runs RDR2/Valhalla for 30-40 min and then it crashes; it's not instant.
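Fox2232's rail arithmetic above can be sketched as a quick budget check, using the figures quoted in the thread (22 A per +12 V PCIe rail, a 300 W default GPU power limit, and a 35 A millisecond-scale transient example):

```python
# Sanity check of the PSU rail budget discussed above, using the
# thread's figures: two 22 A +12 V rails for PCIe and a GPU with a
# 300 W default power limit.

RAIL_AMPS = 22.0          # per-rail +12 V rating, A
V_RAIL = 12.0             # nominal rail voltage, V

gpu_avg_w = 300.0
gpu_avg_amps = gpu_avg_w / V_RAIL          # sustained average current

one_rail_w = RAIL_AMPS * V_RAIL            # capacity of a single rail
two_rail_w = 2 * RAIL_AMPS * V_RAIL        # capacity with both rails

print(f"GPU average draw: {gpu_avg_amps:.0f} A ({gpu_avg_w:.0f} W)")
print(f"One rail supports: {one_rail_w:.0f} W")
print(f"Two rails support: {two_rail_w:.0f} W")

# Millisecond-scale transients can spike well past the average
# (Fox2232's 35 A example = 420 W), which is why a single 22 A rail
# can trip overcurrent protection even when the average looks close.
transient_w = 35.0 * V_RAIL
assert transient_w > one_rail_w
```

A single rail tops out at 264 W, below the card's 300 W average and far below the transient spikes, while two rails give 528 W of headroom, which matches the advice to split the card across both cables.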