Radeon Series RX 6000 Raytracing numbers posted by AMD, match RTX 3070

Turanis:

How's that? RDNA 2 (PS5/Xbox) doesn't support DX12 Ultimate? I guess it does, including DXR 1.1.
PS5 is not RDNA2 ... no VRS, Sampler Feedback, etc. RT is believed to be different (not sure) ... I believe they have something scheduled shortly with more detail.
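For what it's worth, on the PC side "supports DX12 Ultimate" just means the driver reports a handful of feature caps: DXR 1.1, variable rate shading tier 2, mesh shaders, and sampler feedback. A minimal sketch of how an engine might query them (my own illustration, assuming a recent Windows SDK; the helper name is made up and device creation is omitted):

```cpp
// Hypothetical helper (not from the article): checks the four "DX12 Ultimate"
// caps on an already-created D3D12 device. Device/adapter creation is omitted.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

bool SupportsDX12Ultimate(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};  // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};  // variable rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};  // mesh shaders, sampler feedback

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7))))
        return false;

    const bool dxr11   = o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1;
    const bool vrs2    = o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
    const bool mesh    = o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1;
    const bool sampler = o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;

    std::printf("DXR 1.1: %d  VRS2: %d  Mesh shaders: %d  Sampler feedback: %d\n",
                dxr11, vrs2, mesh, sampler);
    return dxr11 && vrs2 && mesh && sampler;
}
```

The consoles obviously don't run this exact API, which is why the PS5 question is really about the underlying RDNA2 feature set rather than the DX12 Ultimate branding.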
Sony developed their own APIs called GNM and GNMX for the PS4, and likewise the PS3 used its own APIs, so presumably they'll do the same for the PS5. Whether a console uses DX12 instead of Vulkan or another proprietary API matters little in this case. Long term, I wouldn't be surprised to see RDNA2 GPUs age better than Turing and Ampere, like what happened with GCN vs Kepler. Mark Cerny said the PS5 is RDNA2; Microsoft now claims theirs is the only full RDNA2. Putting their words against each other is about as reliable as AMD, Nvidia and Intel claiming whose GPUs are better.
At this point I'm intrigued, but I will still wait for the official release of the cards, more reviews, and benchmarks in real games (CP2077) comparing all of the RDNA2 cards to Ampere. Then I will have facts to decide on. But if they can match 3080 performance in raytracing with the 6900XT, or whatever it will be called, I'd be pretty impressed. Still slower, but not bad for a start. Then pricing will be the question: being slower in RT, they should be cheaper than Nvidia's cards, I feel.
As expected; if the RT performance was great they would be the first ones to shout about it. Personally, I'm not too concerned about RT performance, so no problem. What bothers me the most are the prices, higher than they should be, both for Nvidia and AMD. If the 6800XT was around 500€ I would probably get one, but at around 700€ I think I'll pass...
Maddness:

Yep, Metro Exodus springs straight to mind. That game was totally worth the performance hit enabling Ray Tracing. That's my opinion and you're welcome to disagree, but it's how I feel.
Really? Can you point out where in this video it's totally worth it? Just read the comments below it. No-one seems to agree. [youtube=yPySV5C1i24]
Horus-Anhur:

Remember when the CEO of AMD stated that the PS5 uses RDNA2? https://twitter.com/lisasu/status/1260602084669390850?lang=en
It is possible I could be misinterpreting the scenario. However, the tech/dev sites I frequent are patiently waiting for Sony's announcement regarding DirectX 12 Ultimate compliance, which should come very shortly.
Sony will never use DirectX, because it's a proprietary API that belongs to MS. Sony has to create its own API that calls into the RDNA2 feature set.
It's actually better than I guessed, if true. However, at $650 I think it's not quite as good, unless the 6800XT can brute-force its way past the RT deficiency.
To me, the best (and really, only) practical use of RT is for secondary lighting effects, which to my understanding (and may be wrong) is also the least taxing on GPUs. Tertiary reflections are also really nice, but seldom ever come up. Developers have pretty much perfected shiny and semi-glossy surfaces (with or without texture) for years now, where RT really doesn't yield any noteworthy benefit at all, but is hugely computationally expensive. It's kind of weird to me, how people are willing to sacrifice tens of FPS for something like puddles that don't really look any better than what technology has offered before, but balk at the idea of buying a piece of hardware (whether that be CPU, GPU, RAM, storage, etc) that, despite any other advantages, has an unnoticeable performance drop under specific workloads. Unlike a lot of people, I do legitimately think RT is absolutely critical to the future of gaming graphics, but it's pretty sad when Minecraft is the only really good example of how it should be used.
Maddness:

Yep, Metro Exodus springs straight to mind. That game was totally worth the performance hit enabling Ray Tracing. That's my opinion and you're welcome to disagree, but it's how I feel.
From the screenshots I saw, there were a small handful of situations in Metro Exodus that really stood out as "this is a great example of RT", but in most cases it yielded no significant advantage at all.
Tyrchlis:

I call bullshit on your claim. I play Metro Exodus, Tomb Raider, and Wolfenstein Youngblood ALL with RT on, at 1440p ULTRA settings. Quit acting like you know everybody, you don't. Not even close, clearly. As of today, my RTX 3090 will be playing Watch Dogs Legion a little later, again with FULL RTX on. If the game supports it, I turn it on. RT, DLSS, all of it. Later I will also be playing Mechwarrior 5; though I hear its RT implementation doesn't look as good as others, I will be the judge of that, and it won't be a performance issue causing me to turn it off if it looks bad.
That's great and all, but all I'm asking is for someone to point out where FULL RTX makes the game look significantly better. I'd even take slightly better. There's none in that Metro video. In the AMD presentation, the World of Warcraft Shadowlands demo supposedly showing off ray-tracing effects only resulted in some walls with an ambient glow on them. It looked abysmal, and nothing you couldn't do without raytracing.
I was a bit worried when RT was not mentioned. Right now it's not a deal breaker, but with the next-gen consoles supporting it we may see more and more games include it, maybe making it a deal breaker for some. If this is also the 6900XT's performance, then that's worrying. Don't get me wrong, it's great for a first generation, but I bet turning RT on will see performance drop a lot. The other thing that wasn't mentioned was DLSS or something like it, which also begs the question: is that also sub-par? By pure rasterization these cards are beasts and competitive with Nvidia, but I think Nvidia has the edge with the extra RT and AI cores that can bring about extra performance in games. At the moment that's just in current titles, but I have seen DLSS start to get implemented in far more games, and the jumps in FPS using DLSS have been amazing.
Richard Nutman:

That's great and all, but all I'm asking is for someone to point out where FULL RTX makes the game look significantly better. I'd even take slightly better. There's none in that Metro video. In the AMD presentation, the World of Warcraft Shadowlands demo supposedly showing off ray-tracing effects only resulted in some walls with an ambient glow on them. It looked abysmal, and nothing you couldn't do without raytracing.
You touched on a very controversial point. IMO RT is great, but a future technology. Right now its implementation is just a way to show power and, maybe, gain some performance. True RT is only going to come into play when all lighting is real-time. Meanwhile, it's only for the happy few who can afford a top-of-the-line GPU.
AMD is really impressive on paper; however, I am going to wait for actual reviewer benchmark results. After what AMD has pulled in the past with their benchmark results, it is very difficult to trust them.
schmidtbag:

To me, the best (and really, only) practical use of RT is for secondary lighting effects, which to my understanding (and may be wrong) is also the least taxing on GPUs. Tertiary reflections are also really nice, but seldom ever come up. Developers have pretty much perfected shiny and semi-glossy surfaces (with or without texture) for years now, where RT really doesn't yield any noteworthy benefit at all, but is hugely computationally expensive. It's kind of weird to me, how people are willing to sacrifice tens of FPS for something like puddles that don't really look any better than what technology has offered before, but balk at the idea of buying a piece of hardware (whether that be CPU, GPU, RAM, storage, etc) that, despite any other advantages, has an unnoticeable performance drop under specific workloads. Unlike a lot of people, I do legitimately think RT is absolutely critical to the future of gaming graphics, but it's pretty sad when Minecraft is the only really good example of how it should be used.
I have the same opinion. RT may be amazing and the future of graphics, but I think it's still too early to be really concerned about it. But I can understand those who like RT's early implementations and consider it an important aspect of graphics.
Richard Nutman:

It looked abysmal, and nothing you couldn't do without raytracing.
Look, I don't know about the specific example you're talking about, but the idea that "well, devs have done x, y, z before, so you can mimic that same effect without RT" is not really accurate. The main goal of RT is to simulate how light behaves and reacts physically with the environment. Artists can fake this with raster effects, and those effects can look great - in fact they can look better than RT, because artists are essentially painting a scene the way they want it to look. But just because a scene looks good doesn't mean it's accurate, and it takes an artist a ton of time to really dial in the appearance of a scene like that.

In the other thread, one of the posters talking about RT pointed out the Unreal 5 demo, how it doesn't use RT but looks fantastic, and how we've yet to see games with that level of fidelity. But those demos usually look that good because A. they are on rails and B. dozens of artists spend months working on literally one scene: getting the lighting perfect, removing all the oddities, making sure every shadow is cast and that the reflection maps work correctly, modifying textures so they don't look a little off from an angle, etc. A good example of this was the Crytek RT demo, which looked fantastic; then they took the technology from that demo and shoved it into Crysis Remastered, and that game looks like an abomination.

My point is that, yeah, without RT you could cast some color on a wall, but you're not going to cast it with the same accuracy RT would. That may not matter to most people - it's just a light bleeding color onto a wall - but if you want to close the gap on photorealistic graphics, that accuracy does matter.

People also need to keep in mind that RT is relatively new. It's going to take time for devs to play with it, learn how to master scenes with it, get the looks they want, etc. So while the starting point is lower than where the performance/quality of raster is now, if you graphed its development it would look like a diminishing-returns curve, but the plateau would be significantly higher than raster will ever achieve. Regardless of whether RT came out with Turing or three generations from now, devs would still have to learn how to use it and make things look better with it. It's a tool, and like all other tools it takes time for people to master it.

That also applies to performance - things like DXR 1.1 increase performance just by optimizing the way rays are handled/sampled, and improved denoisers will probably lead to fewer rays needing to be cast, etc. That's also not to mention that RT can be used in a variety of other ways: Nvidia is experimenting with using the RT cores to do faster motion blur, and Microsoft is experimenting with using RT to accelerate and improve sound.
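To put a number on the ray-budget point, here is a toy Monte Carlo estimate (my own illustration, nothing from a real engine) of cosine-weighted sky light over a hemisphere, where the exact answer is pi. The noise only shrinks roughly with the square root of the ray count, which is exactly why better denoisers and the optimizations in DXR 1.1 matter - they let you get a clean image from far fewer rays per pixel.

```cpp
// Toy illustration (not from any engine): Monte Carlo estimate of hemispherical
// irradiance from a uniform sky of radiance 1. The exact answer is pi. Error
// falls off ~1/sqrt(N), so halving the noise costs roughly 4x the rays.
#include <cmath>
#include <cstdio>
#include <random>

int main()
{
    const double kPi = 3.14159265358979323846;
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 1.0);

    const int counts[] = {4, 64, 1024, 16384};
    for (int n : counts) {
        double sum = 0.0;
        for (int i = 0; i < n; ++i) {
            // Uniform hemisphere sampling: cos(theta) is uniform on [0,1],
            // pdf = 1 / (2*pi). Estimator: radiance * cos(theta) / pdf.
            double cosTheta = uni(rng);
            sum += cosTheta * 2.0 * kPi;
        }
        double estimate = sum / n;
        std::printf("%6d rays: estimate = %.4f, error = %+.4f\n",
                    n, estimate, estimate - kPi);
    }
    return 0;
}
```

Going from 1,024 to 16,384 rays only buys about a 4x reduction in noise, which is the kind of cost curve denoising is meant to sidestep.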
H83:

But I can understand those who like RT's early implementations and consider it an important aspect of graphics.
I'm one of those people who considers it an important aspect of graphics. When used appropriately, RT brings graphics from "looks nice but obviously a computer rendering" to questioning reality. The thing is, it isn't being used appropriately. As Denial pointed out, devs are still figuring out how to best use it, and for now, most of them are wasting its potential on things we've already done well (reflections, glossiness, shadows, etc) and butchering performance in the process. As a result, people are rightfully confused over what's so special about it, or worse: undermining its value.
DXR is a step forward... But for now I haven't seen anything that makes me think "wow, I need one of those GPUs". Nvidia was the first to launch some... and it was really so-so in RT (not that the cards were bad, but in RT they weren't good). The new Nvidia cards seem stronger in RT, but again it's not enough on that point, judging from the demos (they're nowhere in stock anyway). If the new AMD cards reach the 3070 level in RT and around the 3080 without RT, that's good enough, as they're less expensive (the 6900XT is announced at 999 Euro, which is less than the cheapest 3080 at 1149.95 Euro, both unavailable anyway lol). But for really nice RT, it's not yet this gen... 🙁 Little by little...
Tyrchlis:

What you are saying here looks to me like you want a Google slave to go find info that is EASILY found with a few taps of the keyboard in your favorite search engine. I am not your Google bitch. The reasons I see a great difference are BROADLY talked about in dozens of game reviews out there. Google is your friend, use it, or learn proper search terms. The data is there, I am not your Google slave.
Google is evil... don't use it 🙂