SAPPHIRE Radeon RX 6800 XT NITRO gets listed at prices of 732 and 839 euros

I was planning to buy a 6800 XT, but that much for a Radeon? No thanks, I'd rather wait a few months and buy an Nvidia 3080.
half_empty_soul:

I was planning to buy a 6800 XT, but that much for a Radeon? No thanks, I'd rather wait a few months and buy an Nvidia 3080.
Hi - why is it "that much for a Radeon"? Is it the same "for a Ryzen"? I mean, that much for a GeForce is also absurd.
half_empty_soul:

I was planning to buy a 6800 XT, but that much for a Radeon? No thanks, I'd rather wait a few months and buy an Nvidia 3080.
The 3080 will be slower and weaker than the 6800 XT. Get a life. Only the 3080 Ti is worth more than the 6800 XT, and even that is up for discussion if you consider the 6900 XT. 🙄
Fox2232:

Then you may as well tell people about the margins shops have on Apple products, because those are laughably low. And computer parts shops are basically forced to sell them unless they want a bad image. Some simply keep prices, some do not. The shop's choice.
There are other things at play, too. For example, a colleague from when I was working in imports used to work for a Sony distributor. They told retailers they had to buy Sony car stereos if they wanted to have PlayStations allocated to them. Sony car stereos wouldn't sell well, so the retailers would drop the car stereos well below MSRP just to get rid of the stock. Apple is another story; they're practically a cult. The products themselves don't have good margins for retailers, but they profit from accessories (this is also true for TVs). By the way, Apple charges manufacturers a substantial amount to allow them to brand their product as "Made for iPhone".
AlmondMan:

Hi - why is it "that much for a Radeon"? Is it the same "for a Ryzen"? I mean, that much for a GeForce is also absurd.
Radeon has less capable drivers; this isn't an opinion, it's a fact. They could have better hardware, but it doesn't matter for shit if you can't even use it in the majority of D3D (11) games available, because core 1 is pegged at 100% and can't feed the API. The saving grace is that you can use DXVK now to avoid their terrible D3D9 and D3D11 capabilities.
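(Context for the DXVK mention above: on Windows, DXVK is usually deployed by copying its D3D-to-Vulkan DLLs from the release archive next to a game's executable. A minimal sketch of that step; both directory paths below are purely hypothetical examples:)

```python
# Minimal sketch of deploying DXVK on Windows: copy its translation DLLs
# next to a game's .exe so D3D9/D3D11 calls get translated to Vulkan.
# Both paths below are hypothetical examples, not real installs.
import shutil
from pathlib import Path

dxvk_x64 = Path(r"C:\tools\dxvk\x64")        # extracted DXVK release archive (assumed location)
game_dir = Path(r"C:\Games\SomeD3D11Game")   # folder containing the game executable (assumed)

# d3d9.dll covers D3D9 titles; d3d11.dll + dxgi.dll cover D3D10/11 titles.
for dll in ("d3d9.dll", "d3d11.dll", "dxgi.dll"):
    src = dxvk_x64 / dll
    if src.exists():
        shutil.copy2(src, game_dir / dll)    # the game now loads DXVK instead of the system DLLs
        print(f"Installed {dll} into {game_dir}")
```

Deleting the copied DLLs from the game folder restores the stock D3D driver path.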
Astyanax:

Radeon has less capable drivers; this isn't an opinion, it's a fact. They could have better hardware, but it doesn't matter for crap if you can't even use it in the majority of D3D (11) games available, because core 1 is pegged at 100% and can't feed the API. The saving grace is that you can use DXVK now to avoid their terrible D3D9 and D3D11 capabilities.
Can you explain? What is less capable in the drivers? You're saying that in the majority of Direct3D games you can't use Radeon cards? How is it that I've never heard of this before? Is it because ... you're one of the 2 people using Linux to play Windows titles?
AlmondMan:

Can you explain? What is less capable in the drivers? You're saying that in the majority of Direct3D games you can't use Radeon cards? How is it that I've never heard of this before? Is it because ... you're one of the 2 people using Linux to play Windows titles?
It would seem you're completely uninformed about how poorly Radeons run in DX11 compared to spec-wise equivalent GeForce parts when less capable CPUs are used.
Astyanax:

It would seem you're completely uninformed about how poorly Radeons run in DX11 compared to spec-wise equivalent GeForce parts when less capable CPUs are used.
Obviously. I've never experienced any problems over the last three-ish years of being back on an AMD GPU. I just see that benchmarks put the cards in the same performance bracket, and I'm getting satisfying performance in everything. Can you give anything but a Reddit thread? Like a G4mersN3xus article (seriously Guru3d, this "we don't allow other peers to be mentioned" rule is ridiculous) or something like it where the issue is tested and benchmarked? Or perhaps the titles where it's a problem? I don't own any Assassin's Creed games, but I do have a lower-spec CPU. If this is so well known and common, I would imagine it would be well documented, other than people talking about it in forums... but I don't experience any issues? I see people are mostly talking about Assassin's Creed games when googling it; I can't really find a lot of titles where it seems to be an issue. But it obviously is an issue in some games, I see that. Just not a big issue, it would seem. It also seems to be fixed by driver updates in some cases for other Assassin's Creed games. But I don't think I own any of them... haven't played any, at least.
AlmondMan:

Hi - why is it "that much for a Radeon"? Is it the same "for a Ryzen"? I mean, that much for a GeForce is also absurd.
History tells us Radeon = pathetic drivers, black screens, crashes, FreeSync not working, weaker RT on Radeon, no DLSS, none of the extra benefits for streamers that Nvidia offers. So yeah, THAT much for a Radeon.
half_empty_soul:

History tells us Radeon = pathetic drivers, black screens, crashes, FreeSync not working, weaker RT on Radeon, no DLSS, none of the extra benefits for streamers that Nvidia offers. So yeah, THAT much for a Radeon.
OK. Not really something I experienced, but sure, I guess some people do. Streamers are the worst, though; they should not be supported or encouraged. None of the other stuff matters, though; it's too expensive to be practical.
AlmondMan:

OK. Not really something I experienced, but sure, I guess some people do. Streamers are the worst, though; they should not be supported or encouraged. None of the other stuff matters, though; it's too expensive to be practical.
RT and DLSS not practical?
half_empty_soul:

RT and DLSS not practical?
RT obviously has practical value. However, the expense makes it impractical at the moment. Needing a 1000€+ GPU to get decent performance at 1440p makes it a no-go. DLSS is fine, but not very useful if you didn't buy a super expensive 4K monitor.
fantaskarsef:

I don't particularly fancy these high prices any more than the next guy, but... what did people expect?

Scenario A: Nvidia keeps its advantage, dictating higher and higher prices. Basically what we had above the 5700 XT, or above €400. Prices increase.

Scenario B: AMD takes pole position or competes, meaning AMD's prices increase because they will orient themselves around the competition's prices (Nvidia's "established" range, or at least something in relation to it). Prices increase.

Scenario C: Prices drop across both lineups. The companies earn less money than the generation before, lowering their stock value. They can't do that, because both Nvidia and AMD have acquired companies lately and need the stock value to keep the balance. But ultimately they make less money, thus missing their business targets, and they will have to earn that money back some way, like increasing prices the next chance they get. Prices increase.

Now we have competition in terms of performance, and the price we get is whatever the companies think we will pay. Unless we don't buy cards, the prices won't come down. And even if we do not buy cards, how do you think that will make the prices work? Do people expect an XX80 Ti of a future generation to magically cost 750€ again? Or an X900 XTX of any future generation to come for 600€? If so, you're very, very, very, very optimistic, or hands-down unrealistic, to put it politely.
AlmondMan:

It would naturally not make sense to deliver a product that performs the same as the competitor's and price it at 60% of the competitor's price. You lose credibility when the competition has priced their solution so high.
AlmondMan:

RT obviously has practical value. However, the expense makes it impractical at the moment. Needing a 1000€+ GPU to get decent performance at 1440p makes it a no-go. DLSS is fine, but not very useful if you didn't buy a super expensive 4K monitor.
I have a UQHD monitor, and both RT and DLSS are in the practical basket.
Astyanax:

Radeon has less capable drivers; this isn't an opinion, it's a fact. They could have better hardware, but it doesn't matter for crap if you can't even use it in the majority of D3D (11) games available, because core 1 is pegged at 100% and can't feed the API. The saving grace is that you can use DXVK now to avoid their terrible D3D9 and D3D11 capabilities.
Assuming the info is correct about a hardware blocker and insufficient cache memory that was only resolved with the Navi 10 GPU and the RDNA changes, that would explain why AMD has dragged out D3D11.1 driver command list functionality. And unfortunately, the performance gains NVIDIA advertised with their D3D11 optimization driver ("up to" 40%, I think it was) fit really well with the gains you get through DXVK, at times even more, which would make a solid case for AMD finally implementing this now. (It would have been nice to have it already, but going by my own experience as an early 5700 XT user, that first year of drivers was spent resolving and stabilizing a lot of problems; there was little else that could be prioritized, and there are still issues that would ideally need to be resolved ASAP.)

EDIT: I mostly used DXVK for the stability and GPU smoothness it provides through how it operates under Vulkan (and D3D12), but the performance benefits have been nice, sometimes even incredibly impressive for what should be a similar performance result (unless constrained, as with the D3D9 API), at the cost of a higher CPU workload from the wrapper. That would have been acceptable too, given how much better and more stable the GPU operated through some of the earlier drivers and issues, though much of that has since been resolved, even if it has taken time and relied on the AMD GPU users and community to help improve and fix some of it.

EDIT: Impressive, in turn, how the GPUs competed and kept up even with GCN and its utilization problems. https://www.sapphirenation.net/the-last-gcn-and-the-age-of-rdna-and-cdna
Just like GCN 1 before, RDNA represents a radical shift in design. With RDNA, AMD decided to address key weaknesses in their GCN-based designs – namely underutilization in most games. To achieve this, the entire GPU was more or less rebuilt from the ground up to work in a different manner. The biggest change here relates to the "wavefront" (basically a group of work), from 64 in Vega, to 32. Now, what was happening on GCN parts was that they rarely managed to fill in their work group fully, which meant that while the GPU was working hard, it was not actually doing work efficiently. Filling up the smaller queue is obviously easier than the larger one. Even extremely optimized, compute-heavy titles like Wolfenstein 2 were barely able to reach 60% saturation on a Vega 64. This happens much less on Navi parts with this one, really big change. This also means that RDNA GPUs are faster in less optimized APIs like OpenGL and DX9 than previous AMD designs.
Kinda explains the "gains" AMD has for Vulkan and D3D12: the driver and GPU hardware is operating as it should, or at least a lot closer to it, with better utilization. Even against this penalty and these problems, AMD managed to compete with some of the mid- to high-end NVIDIA cards, although not enough for the enthusiast model, which is faster overall regardless. Somewhat impressive; yet the performance gap that could have come from supporting the driver and code better, yeah, that's less impressive, to put it mildly. (It has still improved, though; at times it caught up to within a 5% margin or so of the 2070 Ti, so that's a thing.)

EDIT: All that out of the way, it will be interesting to see what RDNA2 can deliver, whether these custom editions or just the stock versions; tomorrow will be fun. (They're probably going to sell out fast, and then these customs planned for launch afterwards will likely go fast too, but eh, seeing the reviews will be very interesting.)
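(A rough illustration of the wavefront point in the quoted passage: work is packed into waves of 64 lanes on GCN versus 32 on RDNA, and the last, partially filled wave wastes lanes. This is a simplified back-of-the-envelope sketch that ignores latency hiding and occupancy effects; the dispatch sizes are arbitrary examples:)

```python
# Simplified sketch of the wave-size point in the quoted article:
# a dispatch of N threads is packed into waves of 64 lanes (GCN-style)
# or 32 lanes (RDNA-style); the final, partially filled wave wastes lanes.
import math

def lane_utilization(threads: int, wave_size: int) -> float:
    waves = math.ceil(threads / wave_size)   # waves needed to cover the dispatch
    return threads / (waves * wave_size)     # fraction of allocated lanes doing real work

for threads in (96, 200, 1000):             # arbitrary example dispatch sizes
    u64 = lane_utilization(threads, 64)     # GCN-style wave64
    u32 = lane_utilization(threads, 32)     # RDNA-style wave32
    print(f"{threads:>5} threads: wave64 {u64:.0%} vs wave32 {u32:.0%}")
```

For 96 threads, wave64 allocates two waves (128 lanes) and wastes a quarter of them, while wave32 fills three waves exactly; the smaller the dispatch, the bigger the narrower wave's advantage.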
half_empty_soul:

I have a UQHD monitor, and both RT and DLSS are in the practical basket.
You get to pay, then.
fantaskarsef:

I don't particularly fancy these high prices any more than the next guy, but... what did people expect?

Scenario A: Nvidia keeps its advantage, dictating higher and higher prices. Basically what we had above the 5700 XT, or above €400. Prices increase.

Scenario B: AMD takes pole position or competes, meaning AMD's prices increase because they will orient themselves around the competition's prices (Nvidia's "established" range, or at least something in relation to it). Prices increase.

Scenario C: Prices drop across both lineups. The companies earn less money than the generation before, lowering their stock value. They can't do that, because both Nvidia and AMD have acquired companies lately and need the stock value to keep the balance. But ultimately they make less money, thus missing their business targets, and they will have to earn that money back some way, like increasing prices the next chance they get. Prices increase.

Now we have competition in terms of performance, and the price we get is whatever the companies think we will pay. Unless we don't buy cards, the prices won't come down. And even if we do not buy cards, how do you think that will make the prices work? Do people expect an XX80 Ti of a future generation to magically cost 750€ again? Or an X900 XTX of any future generation to come for 600€? If so, you're very, very, very, very optimistic, or hands-down unrealistic, to put it politely.
If people refuse to buy GPUs at the current prices, Nvidia and AMD will have no alternative but to drop prices. But we both know that is not going to happen, so they will set their prices as high as possible in order to make more money. Personally, I think current GPUs are expensive as hell, but I have no illusions and I don't expect them to drop in the near future, especially with these shortages...