Radeon Series RX 6000 Raytracing numbers posted by AMD, match RTX 3070
pharma
Yxskaft
Sony developed their own APIs, GNM and GNMX, for the PS4; likewise the PS3 used its own APIs, and presumably they'll do the same for the PS5.
That it uses DX12 instead of Vulkan or another proprietary API matters little in this case.
Long term, I wouldn't be surprised to see RDNA2 GPUs age better than Turing and Ampere, like what happened for GCN vs Kepler.
Mark Cerny said PS5 is RDNA2, Microsoft now claims theirs is the only full RDNA2.
Putting their words against each other is about as reliable as AMD, Nvidia, and Intel each claiming their GPUs are better.
fantaskarsef
At this point I'm intrigued, but will still wait for the official release of the cards, more reviews, and benchmarks in real games (CP2077) comparing all of the RDNA2 cards to Ampere. And then I will have facts to decide on.
But if they can match 3080 performance in raytracing with the 6900XT, or whatever it will be called, I'd be pretty impressed. Still slower, but not bad for a start. Pricing will be the deciding factor then: being slower in RT, they should be cheaper than Nvidia's cards, I feel.
H83
As expected, if the RT performance were great they would be the first ones to shout about it. Personally, I'm not too concerned about RT performance, so no problem.
What bothers me the most are the prices, higher than they should be, both for Nvidia and AMD. If the 6800XT were around 500€, I would probably get one, but at around 700€, I think I'll pass...
Horus-Anhur
Remember when the CEO of AMD stated that the PS5 uses RDNA2?
https://twitter.com/lisasu/status/1260602084669390850?lang=en
Richard Nutman
pharma
Horus-Anhur
Sony will never use DirectX, because it's a proprietary API that belongs to MS.
Sony has to create its own API that calls into the RDNA2 feature set.
Kool64
It's actually better than I guessed, if true. However, at $650 I think it's not quite good enough, unless the 6800XT can brute-force its way past the RT deficiency.
schmidtbag
To me, the best (and really, only) practical use of RT is for secondary lighting effects, which to my understanding (and I may be wrong) is also the least taxing on GPUs. Tertiary reflections are also really nice, but seldom ever come up. Developers have pretty much perfected shiny and semi-glossy surfaces (with or without texture) for years now, where RT really doesn't yield any noteworthy benefit at all, but is hugely computationally expensive.
It's kind of weird to me, how people are willing to sacrifice tens of FPS for something like puddles that don't really look any better than what technology has offered before, but balk at the idea of buying a piece of hardware (whether that be CPU, GPU, RAM, storage, etc) that, despite any other advantages, has an unnoticeable performance drop under specific workloads.
Unlike a lot of people, I do legitimately think RT is absolutely critical to the future of gaming graphics, but it's pretty sad when Minecraft is the only really good example of how it should be used.
Of the screenshots I saw, there are a small handful of situations in Metro Exodus that really stood out as "this is a great example of RT", but in most cases it yielded no significant advantage at all.
Richard Nutman
Ricepudding
Was a bit worried when RT was not mentioned. Although right now it is not a deal breaker, with next gen supporting it we may see more and more games include it, maybe making it a deal breaker for some.
If this is also the 6900XT's performance on it, then that's worrying. Don't get me wrong, it's great for a first gen, but I bet running RT will see performance drop significantly.
The other thing that wasn't mentioned was DLSS or an equivalent, which begs the question: is that also sub-par?
By pure raster, these cards are beasts and competitive with Nvidia, but I think Nvidia has the edge with the extra RT and AI cores that can bring extra performance in games. At the moment that's just in current titles, but I have seen DLSS start to be implemented in far more games, and the FPS jumps from using DLSS have been amazing.
gmavignier
Fender178
AMD is really impressive on paper; however, I am going to wait for actual reviewer benchmark results. After what AMD has pulled in the past with their benchmark results, it is very difficult to trust them.
H83
Denial
schmidtbag
pharma
GodFall and Dirt 5 have raytraced shadows on the AMD RX 6000 series.
https://wccftech.com/amd-helped-god...-traced-shadows-though-its-barely-noticeable/
rl66
DXR is a step forward...
But so far I haven't seen anything that makes me think "wow, I need one of those GPUs". Nvidia was the first to launch some... they were really so-so in RT (not that the cards were bad, but in RT they weren't good).
The new Nvidia cards seem stronger in RT, but again it's not enough on that point, judging from the demos (as they're nowhere in stock).
If the new AMD cards reach 3070 level in RT and around 3080 level without RT, that's good enough, as they're less expensive
(the 6900XT is announced at 999€, which is less than the cheapest 3080 at 1149.95€; both unavailable anyway lol).
But for really nice RT, it's not this gen yet... 🙁
Little by little...