Radeon Series RX 6000 Raytracing numbers posted by AMD, match RTX 3070

Anyone with more than two brain cells knows the potential of RT, but it's still far from being achieved. I think it's great AMD is back competing on the high end and implementing new features. Personally I prefer AMD's approach of making a smaller GPU without dedicated cores; that way it's more affordable. Some game studios have made RT demos that run without RT hardware, so I see no reason to let Ngreedia dominate the thing with its proprietary hardware. Let's give AMD a chance; it's their first generation of RT hardware, after all. Personally I'm waiting for the 6700/6600, and couldn't care less about RT for the next 3 years.
schmidtbag:

I'm one of those people who considers it an important aspect of graphics. When used appropriately, RT brings graphics from "looks nice but obviously a computer rendering" to questioning reality. The thing is, it isn't being used appropriately. As Denial pointed out, devs are still figuring out how to best use it, and for now, most of them are wasting its potential on things we've already done well (reflections, glossiness, shadows, etc) and butchering performance in the process. As a result, people are rightfully confused over what's so special about it, or worse: undermining its value.
True, but that's down to the reason why RT is being deployed in games in the first place. Right now, RT is not being used because it's better or cheaper than the classical way; it's being used because Nvidia has been pushing it aggressively since the release of the 2000 series in order to sell GPUs. And in many games, the sole reason they are using RT is that Nvidia paid them or helped them implement it, even if it makes little sense. Basically, RT implementations right now exist for marketing reasons, and that marketing is so good that RT is now a buzzword and almost a must-have feature, so consoles are hopping on the hype train and implementing it too, even if they don't need it or it tanks performance severely, so severely that some games are going to run at 30fps... AMD had to follow suit, because otherwise everybody would say their cards are old, crappy tech, but they also added RT to their lineup in a more sensible way, I think. Bottom line: as long as the push for RT is driven by marketing reasons instead of technical ones, RT is going to be a gimmick until it finally becomes essential. But I guess we have to start somewhere, so maybe this situation isn't that bad.
H83:

True, but that's down to the reason why RT is being deployed in games in the first place. Right now, RT is not being used because it's better or cheaper than the classical way; it's being used because Nvidia has been pushing it aggressively since the release of the 2000 series in order to sell GPUs. And in many games, the sole reason they are using RT is that Nvidia paid them or helped them implement it, even if it makes little sense.
I totally agree, and I feel it has really backfired (especially since the whole 2000 series was stupidly overpriced). Nvidia did a poor job of emphasizing where it should be used.
Basically, RT implementations right now exist for marketing reasons, and that marketing is so good that RT is now a buzzword and almost a must-have feature, so consoles are hopping on the hype train and implementing it too, even if they don't need it or it tanks performance severely, so severely that some games are going to run at 30fps...
I agree with all of this too - I'm glad it's finally no longer a gimmick but I've waited years for 4K gaming to be reasonably priced and I'm not about to wait another few years for a feature I can currently live without haha.
AMD had to follow suit, because otherwise everybody would say their cards are old, crappy tech, but they also added RT to their lineup in a more sensible way, I think. Bottom line: as long as the push for RT is driven by marketing reasons instead of technical ones, RT is going to be a gimmick until it finally becomes essential. But I guess we have to start somewhere, so maybe this situation isn't that bad.
It's hard to say whether AMD's approach is better or not. Nvidia's approach is most likely more efficient, but to my understanding, AMD's approach is more dynamic/modular. I'm torn, because efficiency and modularity are the 2 things I appreciate most in engineering. But yeah, currently RT is more gimmicky than essential. Aside from preferring to wait a couple of years for devs to utilize it best and for the interest to die down, I primarily play games on Linux, so I'm probably not going to get access to it for a couple of years anyway. So, I'm waiting whether I prefer to or not haha.
The problem we run into now (at least if this rumor is true) is: who would rather spend an extra $50 (barring scalping and whatnot) on extra RT power vs. 6GB of extra RAM and lower RT performance, or just turn RT off to get "the same"? Though I might still be in the market for a 6800 next year, because I have a FreeSync monitor I got for cheap and it only "sort of" works with my 2070 S.
This article and the post it is about, while somewhat interesting, are complete nonsense. To begin with, that is not a benchmark. The screenshot shown is from a DirectX sample application called Procedural Geometry; you can see this in the title banner of the window in the screenshot. If you want, you can download the DX sample code pack and build it yourself with Visual Studio, which is free. I did, and ran it for myself. My laptop, with a 2070, gets over 150 FPS in debug mode, which is pretty good, I think. To repeat, this is NOT a benchmark, and there are really no conclusions one can draw from the results other than that the performance of the cards is of the same order of magnitude.
I did a bit of fiddling with Wolfenstein Youngblood (thanks, Game Pass) and found that my 2070 S averages 74 FPS with just RT on. By enabling DLSS I average 105. I think DLSS is going to be the one thing that AMD has to combat the most. Perhaps they can figure out a non-HW solution to get the 6000-series cards right where we want them.
Gomez Addams:

This article and the post it is about, while somewhat interesting, are complete nonsense. To begin with, that is not a benchmark. The screenshot shown is from a DirectX sample application called Procedural Geometry; you can see this in the title banner of the window in the screenshot. If you want, you can download the DX sample code pack and build it yourself with Visual Studio, which is free. I did, and ran it for myself. My laptop, with a 2070, gets over 150 FPS in debug mode, which is pretty good, I think. To repeat, this is NOT a benchmark, and there are really no conclusions one can draw from the results other than that the performance of the cards is of the same order of magnitude.
Did you read AMD's footnote? If you want to second-guess them, fine, but they specifically mention HW-based ray tracing as well as the associated FPS benchmark result versus the result using the fallback method. AMD's footnote on the comparative performance is below:
"Measured by AMD engineering labs 8/17/2020 on an AMD RDNA 2 based graphics card, using the Procedural Geometry sample application from Microsoft’s DXR SDK, the AMD RDNA 2 based graphics card gets up to 13.8x speedup (471 FPS) using HW based raytracing vs using the Software DXR fallback layer (34 FPS) at the same clocks. Performance may vary. RX-571"
https://www.amd.com/en/technologies/rdna-2
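For context on the "HW based raytracing vs the Software DXR fallback layer" distinction in that footnote: whether a GPU exposes hardware raytracing to DirectX is reported through a standard D3D12 feature check. A minimal sketch, assuming a D3D12 device has already been created (the helper name is mine, not code from Microsoft's Procedural Geometry sample):

```cpp
#include <d3d12.h>

// Returns true if the driver reports hardware-accelerated DXR support
// (raytracing tier 1.0 or higher); otherwise an app would need a software
// path such as the old DXR fallback layer. 'device' is assumed to be a
// valid, already-created ID3D12Device.
bool SupportsHardwareRaytracing(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
    {
        return false;  // The feature query itself is not supported.
    }
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```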
Kool64:

I did a bit of fiddling with Wolfenstein Youngblood (thanks, Game Pass) and found that my 2070 S averages 74 FPS with just RT on. By enabling DLSS I average 105. I think DLSS is going to be the one thing that AMD has to combat the most. Perhaps they can figure out a non-HW solution to get the 6000-series cards right where we want them.
That is true. AMD seems to have some equivalent tech to DLSS but hasn't come out with more information on it. We'll have to wait and see.
A few things: 1.) These RT benchmarks are somewhat disappointing, especially if this is apples to apples (which I assume) and DLSS is off in them. Having developed for consoles, I figured AMD would have better performance in this area. 2.) As a 2080 owner I can say personally that RT is a gimmick and a waste of frames right now. Yes, it can be great in the future. Right now no card, not even the 3090, can handle a game that uses it fully, in 4K, in a way that makes the difference we are looking for. Right now and for the foreseeable future it's negligible differences in shadows, lighting, etc. You know it's not a game changer yet when comparison videos and pics go up and you can't tell which is which without reading the caption under the image saying RT on / RT off (I'm guilty of this; hell, sometimes the pic I assume is RT isn't!). I'd rather some games focused on fun factor and gameplay. Graphics for me are getting silly, treated as the end-all, be-all, like a big-budget Hollywood action movie that has no story. I love great visuals, but there have been so many games as of late that look amazing and I can't get through half the game because it's just so boring or downright terrible. FFXV was the start of this for me: great looking, but the worst game of the series for me. Division 2 looked great, but was terrible compared to the original for me. And so on. I'm still comfortable buying AMD this round. I don't find RT necessary. I find HDR much more essential than RT. I hate to sound negative, but I really think these cards will be gone as fast as Nvidia's unless some precautions are set up. This bot fiasco is becoming the norm for newly released hardware. It will happen again with the AMD CPUs coming, and when Intel drops their own next year, and so on. It needs to be stopped. But I can't see companies really working too hard on prevention, because at the end of the day the products are selling and their bottom line is being met. What do they care if it goes to a real user or a real loser (a scalper)?
Fender178:

AMD is really impressive on paper; however, I am going to wait for actual reviewer benchmark results. After what AMD has pulled in the past with their benchmark results, it is very difficult to trust them.
Yeah... I have some reservations about IHVs' benchmarks by default ('member Vega, Fury), and then there were some questionable benching choices in AMD's presentation (Smart Access Memory(tm) and the auto-undervolting+overclocking of the 6900XT with Rage Mode), but even so... independent benchmarks should not change anything too dramatically. All in all, RDNA2 looks to be one of the biggest, if not the biggest, GPU performance leaps of the last decade. Kudos to AMD. On the not-so-good side, we now have ray-tracing fragmentation, with devs having to work with two different RT architectures and with AMD more than likely trailing if both paths are optimized. Also, I don't see MS/AMD being able to match DLSS perf*quality without dedicated hardware. But then again, maybe they pull off a miracle - something like FreeSync vs. G-Sync would be phenomenal.
pharma:

PS5 is not RDNA2 ... no VRS, Sampler Feedback, etc., and the RT is believed to be different (not sure) ... I believe they have something scheduled shortly with more detail.
The Xbox Series X is based on Windows; the PS5 isn't, and so doesn't care about DX12 or DXR. So it's logical that it doesn't use AMD's RT implementation as-is, since that implementation is built around those APIs for convenience. But does that mean the PS5 won't use RT? Unless there's a "no, we can't" from Sony, I wouldn't take the easy shortcut this article does. Other OSes exist, even on PC, and you'd be surprised how much hardware made for Windows works fully on them. As with anything, we'll see once it's in shops and reviewed.
Noisiv:

Yeah... I have some reservations about IHVs' benchmarks by default ('member Vega, Fury), and then there were some questionable benching choices in AMD's presentation (Smart Access Memory(tm) and the auto-undervolting+overclocking of the 6900XT with Rage Mode), but even so... independent benchmarks should not change anything too dramatically. All in all, RDNA2 looks to be one of the biggest, if not the biggest, GPU performance leaps of the last decade. Kudos to AMD. On the not-so-good side, we now have ray-tracing fragmentation, with devs having to work with two different RT architectures and with AMD more than likely trailing if both paths are optimized. Also, I don't see MS/AMD being able to match DLSS perf*quality without dedicated hardware. But then again, maybe they pull off a miracle - something like FreeSync vs. G-Sync would be phenomenal.
Yeah, the same goes for their 8-core laptop CPUs; they did some questionable benching with those as well.
Turanis:

Future console games for PS5/Xbox will support everything because of RDNA2 and Ryzen 3.
Having the same CPU and GPU doesn't make things compatible and easy to port... it just makes it less hard. Have you seen how memory is handled on those consoles? And the file system? And the OS? An example: a Chromebook is hard to get working with Windows despite sharing nearly everything with a regular PC... the "nearly" is important there; that's what causes headaches for anyone who tries to do it.
I'm gonna wait for official reviews, but I think I'll get the 6900 XT. Ray tracing performance shouldn't be bad, considering that all titles will be optimized for this GPU architecture because of the new consoles. I just bought a 1440p 240 Hz monitor, and I care more about high fps than raytracing. I mostly play FPS games as well, so I would turn off ray tracing anyway. If the 6800 XT and 6900 XT turn out to be like AMD's benchmarks, it's a big win for me; the monitor and the 6900 XT will cost me what the 3090 alone would have.
rl66:

Having the same CPU and GPU doesn't make things compatible and easy to port... it just makes it less hard. Have you seen how memory is handled on those consoles? And the file system? And the OS? An example: a Chromebook is hard to get working with Windows despite sharing nearly everything with a regular PC... the "nearly" is important there; that's what causes headaches for anyone who tries to do it.
Yeah. Someone else already said it in another thread I've forgotten, but the PS4 and Xbox One have already been sporting AMD CPU+GPU combos, yet on PC, games nearly always run better on Nvidia cards. It seems to be some kind of mantra to claim AMD video cards will benefit from the console ecosystem, but somehow that's never the case in practice. Nvidia has a huge machinery running to help game studios optimise games for Nvidia video cards, and it shows. If a game runs (relatively) better on an AMD video card, you usually find AMD's direct involvement in the background, not merely the status of being a console port.
Kaarme:

Yeah. Someone else already said it in another thread I've forgotten, but the PS4 and Xbox One have already been sporting AMD CPU+GPU combos, yet on PC, games nearly always run better on Nvidia cards. It seems to be some kind of mantra to claim AMD video cards will benefit from the console ecosystem, but somehow that's never the case in practice. Nvidia has a huge machinery running to help game studios optimise games for Nvidia video cards, and it shows. If a game runs (relatively) better on an AMD video card, you usually find AMD's direct involvement in the background, not merely the status of being a console port.
But this console gen is basically PCs; if I'm not mistaken, games are even developed on PC and ported to these consoles, so it might be different this time around. Time will tell. If I go with an AMD GPU, it'll be my first since the All-In-Wonder 9700 Pro.
Fox2232:

I have an unpleasant feeling that while AMD claims huge effective bandwidth thanks to the Infinity Cache, the actual story with high-resolution textures will be different. The maximum theoretical bandwidth of that 16GB of memory is 512GB/s without OC, and that covers read and write operations together, as GDDR6 can't do both at the same time through the same line. That leaves us with a usable ~2GB of data per frame if we want to reach 240fps, and that's not much. We will see the effect of the Infinity Cache, but I think it will not do much for high-fps targets, while it will do a lot for the bottom line.
Yep. Anyway, there is nothing for us to do but wait for reviews; it's not like we can buy a 3080 or 3090 anyway 🙂 These are practically nonexistent in Canada.
Fox2232:

I have an unpleasant feeling that while AMD claims huge effective bandwidth thanks to the Infinity Cache, the actual story with high-resolution textures will be different. The maximum theoretical bandwidth of that 16GB of memory is 512GB/s without OC, and that covers read and write operations together, as GDDR6 can't do both at the same time through the same line. That leaves us with a usable ~2GB of data per frame if we want to reach 240fps, and that's not much. We will see the effect of the Infinity Cache, but I think it will not do much for high-fps targets, while it will do a lot for the bottom line.
I wouldn't worry too much. If you look at the RTX 3070 and its paltry 448GB/s vs. the 2080 Ti's 616GB/s, or at the 3080, the extra bandwidth doesn't seem to matter as much when it comes to typical graphics workloads (compute is a different story, however). I suspect 512GB/s is more than enough, and that the cache mainly decreases access latency, which keeps the CUs better fed instead of stalling like older cards did.
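To put rough numbers on the per-frame bandwidth budget being debated above, here is a back-of-the-envelope sketch (my own illustration; only the 512GB/s raw-bandwidth figure comes from the posts, and it deliberately ignores Infinity Cache hits, which are exactly what AMD is counting on):

```cpp
#include <cstdio>

int main()
{
    // 256-bit GDDR6 at 16 Gbps gives the RX 6800/6900 series 512 GB/s of raw
    // VRAM bandwidth, shared between reads and writes.
    const double bandwidth_gb_per_s = 512.0;
    const double frame_rates[] = {60.0, 144.0, 240.0};

    for (double fps : frame_rates)
    {
        // Upper bound on data moved to/from VRAM per frame at this frame rate.
        printf("%3.0f fps -> %.2f GB of VRAM traffic per frame\n",
               fps, bandwidth_gb_per_s / fps);
    }
    return 0;
}
// Roughly: 60 fps -> 8.53 GB, 144 fps -> 3.56 GB, 240 fps -> 2.13 GB per frame,
// which is where the ~2GB-per-frame figure at 240fps comes from.
```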
Richard Nutman:

That's great and all, but all I'm asking is for someone to point out where FULL RTX makes the game look significantly better.
Minecraft and Quake 2 via RTX ray-tracing. Anything else?
Definitely not wanting to poo-poo ray tracing - but looking at videos on YT, comparing RTX on and off - it's a real credit to the designers and artists just how good looking they've made a lot of these games (without it).