Preview: Starting at $579 - AMD Radeon RX 6800 (XT) and 6900 XT graphics cards

kapu:

RTX 3070 doesn't look so good anymore with that shy 8 GB of RAM. It was worth the wait.
Even the 6700 XT will have 12 GB. I wonder when they will announce it.
kapu:

RTX 3070 doesn't look so good anymore with that shy 8 GB of RAM. It was worth the wait.
And again, and again: 8 GB is enough for 4K. There is not a single game that needs more. Even 10 GB is enough in many games at 8K. Only a few very recent games can push for more at 8K.
Can't wait for the reviews. If AMD's RT performance isn't up to par then that makes this a less compelling offering, especially without a DLSS alternative.
Bit of a blow to all the dicks who used bots to get hold of all those Nvidia cards... must be fun to behold the AMD launch when you're sitting on dozens of 3090s you've got listed on the bay at near $2.5k, or even pushing $4k for a 3090 Strix, with 3080s at over $1.5k... not much respect for the 'entrepreneurial initiative' displayed there... Of course, there's no indication that won't happen to AMD as well... 🙄
pharma:

DLSS alternative
They announced a denoiser, but that's it. They didn't say a word about a DLSS alternative.
tty8k:

For me it's another rip-off. They followed the jackals (Intel & Nvidia) with the same practices and prices. I don't care if they're at the same level or slightly better.
Have you ever thought that AMD has to survive? Do they need to sell to you at a loss and stop their evolution?
Looks good. So instead of using expensive GDDR6X VRAM, they used cheaper, slower GDDR6 with an extra VRAM caching layer, similar to a CPU's cache hierarchy?

There is a difference, though: most CPU code working sets will fit in the small cache. It's not obvious how caching the VRAM helps when a large framebuffer needs to be continually read for every frame. It must help a bit, but there would still be far fewer cache hits and far more misses than in the comparable CPU situation.

Anyway, the 6900 XT looks like a good kick at Nvidia and their $1,500 card for suckers, as I have said previously. Most likely RT performance will be worse... but as we know, future RT console games will be based on AMD hardware, so they won't push much beyond that, and any PC-based AMD GPU will most likely be sufficient. Looking forward to reviews.
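The cache-size intuition in the post above can be sanity-checked with quick arithmetic. A sketch, assuming the 128 MB Infinity Cache figure from AMD's RDNA 2 announcement and 32-bit color; the buffer counts are illustrative:

```python
# Rough check: does a 4K render target fit in a 128 MB on-die cache?
# 128 MB Infinity Cache is from AMD's RDNA 2 announcement; the
# "four buffers" working set below is just an illustrative assumption.

def framebuffer_mb(width, height, bytes_per_pixel=4):
    """Size of one render target in MiB at the given resolution."""
    return width * height * bytes_per_pixel / (1024 ** 2)

cache_mb = 128                        # Infinity Cache
fb_4k = framebuffer_mb(3840, 2160)    # one 4K 32-bit buffer, ~31.6 MB
print(f"4K color buffer: {fb_4k:.1f} MB")

# Even color + depth + a couple of extra render targets fit,
# so the hot per-frame working set can stay on-die even though
# the total VRAM footprint (textures, geometry) is far larger.
print(f"four buffers fit in cache: {4 * fb_4k < cache_mb}")
```

So while textures won't fit, the buffers touched most often every frame can, which is presumably where most of the bandwidth savings come from.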
coth:

And again, and again: 8 GB is enough for 4K. There is not a single game that needs more. Even 10 GB is enough in many games at 8K. Only a few very recent games can push for more at 8K.
This, and the 3070 isn't even aimed at native 4K.
Interesting: having a look at the 6900 XT slides, AMD shows it at 122 fps in Battlefield V, above the RTX 3090. But the review here shows the 3090 at 124 fps, while AMD shows it at around 110 fps, so it's a little interesting why it's so much lower. Again, in Borderlands 3 the 3090 shows 78 fps here and the 6900 XT 73, while AMD's results show otherwise. This is why we shouldn't jump on the hype from what they show. This isn't a dig at AMD, and I'm not a fanboi for Nvidia btw; just showing that results are not always as they seem and need to be taken with a grain of salt. Third-party results should hopefully show the bigger picture.
Just take a second to appreciate this quote from the presentation: *sips coffee*
We purposely put the power connector at the end of the card, facing the case side panel, to provide simple connector accessibility and cable management. Also, there's no need to worry about special adapters to connect to the power supply, especially ones placed in any odd location that can interfere with the aesthetic of your build.
:D
Wake me up when someone finally presents a sub-$300 card... And no, "go buy a 3-year-old model" is not an answer. The tech industry (used to) work like this: "If you want my money, offer me something better for the same money. Otherwise I'll keep my current hardware." And again, no, "old cards are enough for 1080p gaming" is not an answer either. Resolution is only half of the equation; the other half is the graphic detail/amount. The damn consoles have kept that variable static for too long, but next year the bar will rise, a lot (and then be static for another 3 years...).
coth:

And again, and again: 8 GB is enough for 4K. There is not a single game that needs more. Even 10 GB is enough in many games at 8K. Only a few very recent games can push for more at 8K.
Yeah, AMD so stupid giving that 16 gigs for nothing, haha. Obviously their engineers suck at what they are doing and Nvidia is smart 😀
coth:

They announced a denoiser, but that's it. They didn't say a word about a DLSS alternative.
"We're already working with developers to work on a super resolution feature to give gamers the option of more performance when using raytracing." Could be a DLSS alternative, could be something completely different. Keep in mind AMD still doesn't have tensor hardware on its chips.
kapu:

Yeah, AMD so stupid giving that 16 gigs for nothing, haha. Obviously their engineers suck at what they are doing and Nvidia is smart 😀
To be fair AMD is the company that said "4GB of HBM = 10-12GB of GDDR5" and that they could just assign "a couple of engineers" to handle the VRAM problem. Didn't go well.
I think we need a third GPU maker. The pricing is just out of control compared to what things used to be 5-15 years ago.
Something else I'm curious about: the 6900 XT and 6800 XT are both rated at 300 W, despite the 6900 XT having 11% more CUs.
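For reference, the arithmetic behind that 11% figure, using the CU counts from AMD's announcement (80 CUs on the 6900 XT vs 72 on the 6800 XT):

```python
# Quick check of the "11% more CUs" figure at the same rated 300 W board power.
# CU counts per AMD's RDNA 2 announcement: 80 (RX 6900 XT) vs 72 (RX 6800 XT).
cus_6900xt = 80
cus_6800xt = 72
extra_pct = (cus_6900xt - cus_6800xt) / cus_6800xt * 100
print(f"6900 XT has {extra_pct:.1f}% more CUs")  # → 11.1% more CUs
```

Holding board power constant while adding CUs presumably means lower clocks or better binning on the 6900 XT; reviews should tell.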
toyo:

I think we need a third GPU maker. The pricing is just out of control compared to what things used to be 5-15 years ago.
The cost of building a modern architecture is also 2-3x higher. It's not like these companies' margins are increasing by 3x.
I have no doubt Nvidia and AMD planned all of this to raise prices. First a paper launch by Nvidia, and now this. Enjoy your console-priced low-end GPUs soon... "AMD will save us all..." Hell yeah.
I can hardly believe they achieved 3090 performance, this is awesome news!
DannyD:

I can hardly believe they achieved 3090 performance, this is awesome news!
And people thought the 3070 was the best they could do. :D