Red Dead Redemption 2: PC graphics benchmark review (revisited)


data/avatar/default/avatar15.webp
Nice work, thank you. Seems like previous gen is a surprise: Vega 64 is just a few fps lower than the 1080 Ti, and the GTX 1080 is a bit lower than expected.
https://forums.guru3d.com/data/avatars/m/282/282392.jpg
Looking forwards to this, thanks!
https://forums.guru3d.com/data/avatars/m/198/198862.jpg
The_Amazing_X:

Nice work, thank you. Seems like previous gen is a surprise: Vega 64 is just a few fps lower than the 1080 Ti, and the GTX 1080 is a bit lower than expected.
The game favors AMD cards. The 5700 XT is doing mighty fine for a $400 GPU.
https://forums.guru3d.com/data/avatars/m/249/249528.jpg
haste:

Maximum quality tests are pretty much pointless in RDR2. Most of the ultra settings are the same shaders with unjustified oversampling, a huge performance impact and minimal visual improvement. The only ultra setting (except textures, of course) I'd definitely keep is lighting quality: it pushes the light range limits much further, also allows multiple shadowing, and doesn't limit moon/sun light in larger towns etc. It's very demanding at night, but it's worth it.
So what exactly are you saying? That Hilbert's tests were straight-up pointless because he followed the most common benchmark review format (just max it out)? And that he should use custom settings, so that people get even more confused because they think he maxed it out when he actually didn't, because he followed his visuals-to-performance instincts?
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
The reason for max quality tests being 'standard' can be discussed, sure. Yet I've explained in the article that we'll be using this benchmark with future generation cards in mind, ergo we opt for the best quality settings as opposed to the mixed settings in the previous revision of the article. It's a terrific Vulkan benchmark for future products. Whether or not you choose max quality is up to you, but that's why I also included some tests with different settings, as the game certainly can be tweaked in many, many ways. And it's just that: the popularity of this game is huge, as it's such a terrific game, and I wanted it included again, adding a Vulkan-compatible title to the graphics card test suite. Hopefully, future drivers and patches will no longer affect performance.
https://forums.guru3d.com/data/avatars/m/252/252732.jpg
Thanks for the update Hilbert. Vulkan MultiGPU also works nicely in this game 🙂
https://forums.guru3d.com/data/avatars/m/267/267153.jpg
I really miss the comparison with the old results, so I can see how the game has developed. Without it, the article somehow misses its point. Still, I'm very glad to see a test like this one, comparing the game over time. Thank you!
https://forums.guru3d.com/data/avatars/m/156/156133.jpg
Moderator
Man, Pascal is really showing its limitations in both DX12 and Vulkan in modern titles. Wild to see a card like the 1080, once able to keep up with the RTX series, now fall so short, and then turn around and see cards like the Vega 56 and 64 both hold their own in these APIs.
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
HybOj:

I really miss the comparison with old results, so I can see how the game has developed.
Though the differences between the old test and the new test mostly show up at 1920x1080, I could not objectively compare and insert them. The first revision of this benchmark overview was done in-game; for the new one, we moved to the internal benchmark. Secondly, the first article focused on DX12, this one on Vulkan. But as jbscotchman mentions, the differences are to be found at that resolution, jumping up 25%~30%. Update: I added a comparison of old vs new results on the 1080p page. But again, the results remain subjective from how I look at it, so that is the big disclaimer I am adding here.
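As a quick aside on what a 25%~30% jump means in practice, the uplift is simply (new − old) / old. A tiny sketch in C, using the 46 fps to low-60s figures jbscotchman mentions further down this thread; the numbers are illustrative only, not the article's measured data:

```c
#include <stdio.h>

/* Percentage uplift between two average framerates: (new - old) / old * 100. */
static double uplift_pct(double old_fps, double new_fps)
{
    return (new_fps - old_fps) / old_fps * 100.0;
}

int main(void)
{
    /* Illustrative numbers from this thread, not the article's results. */
    double old_avg = 46.0;
    double new_avg = 62.0;

    printf("%.0f -> %.0f fps is a %.1f%% uplift\n",
           old_avg, new_avg, uplift_pct(old_avg, new_avg));
    /* Prints: 46 -> 62 fps is a 34.8% uplift */
    return 0;
}
```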
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
vbetts:

Man, Pascal is really showing its limitations in both DX12 and Vulkan in modern titles. Wild to see a card like the 1080, once able to keep up with the RTX series, now fall so short, and then turn around and see cards like the Vega 56 and 64 both hold their own in these APIs.
Is it limitations, or is it people expecting too much? There's always going to be something new a card can do that will be taken advantage of in a new game's settings.
https://forums.guru3d.com/data/avatars/m/282/282392.jpg
jbscotchman:

Agreed, but I will give you my personal experience, as I have many hundreds of hours in this game since release. Using the same settings at 1080p my original average fps was 46, but now it's in the low 60s. Like I mentioned, I did do a CPU upgrade, from an AMD FX 8350 to a Zen+ at 4 GHz, but only my minimum fps went up, by a lot actually. However, your average fps is entirely based on your GPU, which is why this game should not even be considered for CPU benchmarks.
I'm planning on a 1660S to tide me over till the 3000 series, so that's very positive news. I'm sticking with my 1080p telly till then too, and then switching back to a 1440p monitor. This game I've yet to play, probably the one I'm most hyped for; I'll need to get it soon then.
https://forums.guru3d.com/data/avatars/m/224/224796.jpg
anxious_f0x:

Thanks for the update Hilbert. Vulkan MultiGPU also works nicely in this game 🙂
Is MultiGPU only working with Nvidia GPUs, or is it working on AMD now as well?
https://forums.guru3d.com/data/avatars/m/16/16662.jpg
Administrator
jbscotchman:

I am jbscotchman, not scortman lol. It's a language barrier thing, I get it. 😛 But just to add further input on the topic: when I got a 1440p monitor a couple of months ago, there were two games in my mind that I knew I would have to run at 1080p if I wanted acceptable framerates, and those were Control and Red Dead Redemption 2. No matter what CPU, RAM amount/speed, etc., your GPU will determine your framerate.
LOL no clue as to why or how I typed it like that. Corrected, sorry bro 🙂
https://forums.guru3d.com/data/avatars/m/267/267153.jpg
Hilbert Hagedoorn:

Though the differences between the old test and the new test mostly show up at 1920x1080, I could not objectively compare and insert them. The first revision of this benchmark overview was done in-game; for the new one, we moved to the internal benchmark. Secondly, the first article focused on DX12, this one on Vulkan. But as jbscotchman mentions, the differences are to be found at that resolution, jumping up 25%~30%. Update: I added a comparison of old vs new results on the 1080p page. But again, the results remain subjective from how I look at it, so that is the big disclaimer I am adding here.
Thanks a lot, I REALLY appreciate that you did that, and the way you interact with the community, this is very special, respect! PS: even if it's just indicative, it can be seen that performance has gotten more in check; for example the 1080 Ti, which behaved sub-par, is now more at where it should be, etc. I think the comparison is valuable and shows the progress made on the game and drivers. Good to see!
data/avatar/default/avatar36.webp
Yeah, seems like the 1000 series cards are starting to show their age. Even the 1080 Ti, which was a great card at the beginning of the RTX life cycle, has started to struggle. Man, I need that upgrade; a 3060/3070 should do nicely. It is understandable that the 1000 series cards would struggle, considering DX12 and Vulkan are not their strong suits.
https://forums.guru3d.com/data/avatars/m/252/252732.jpg
Elder III:

Is MultiGPU only working with Nvidia GPUs, or is it working on AMD now as well?
Doesn’t look like it from reading around the internet 🙁
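For readers who want to check what their own system reports: Vulkan 1.1 exposes explicit multi-GPU through device groups, and a group containing more than one physical device is what a linked-GPU (mGPU) path builds on. A minimal sketch, assuming a VkInstance created for Vulkan 1.1+ already exists and skipping error handling; the helper name is ours, not from any game or driver:

```c
#include <stdio.h>
#include <vulkan/vulkan.h>

/* Lists Vulkan device groups; a group with more than one physical device is
   how an explicit multi-GPU (linked adapter) setup shows up to applications.
   Assumes `instance` was created for Vulkan 1.1 or newer. */
static void list_device_groups(VkInstance instance)
{
    uint32_t count = 0;
    vkEnumeratePhysicalDeviceGroups(instance, &count, NULL);

    VkPhysicalDeviceGroupProperties groups[16];
    for (uint32_t i = 0; i < 16; ++i) {
        groups[i].sType = VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_GROUP_PROPERTIES;
        groups[i].pNext = NULL;
    }
    if (count > 16) count = 16;
    vkEnumeratePhysicalDeviceGroups(instance, &count, groups);

    for (uint32_t i = 0; i < count; ++i) {
        printf("group %u: %u physical device(s)%s\n",
               i, groups[i].physicalDeviceCount,
               groups[i].physicalDeviceCount > 1 ? "  <-- mGPU capable" : "");
    }
}
```

A game then creates one logical device over the whole group (via VkDeviceGroupDeviceCreateInfo); whether two cards actually get grouped together is a vendor driver decision, which is what the question above really comes down to.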
data/avatar/default/avatar10.webp
How about benchmarking the settings from Digital Foundry instead of Ultra? That would give a much better idea of how the game performs. The DF settings have the best quality-to-performance ratio and look very close to Ultra, with much better performance.
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
Ultra probably works well for later GPUs as well, even if some of it has the usual minor difference in visual quality but a not-so-minor difference in performance, due to all sorts of scaling and changes, and what the GPU is good with or has problems with, especially when some things like shaders scale with resolution and show a notably higher performance cost at 2560x1440 compared to 1920x1080.

Would be neat if NVIDIA could take their DLSS solution one step further for the next version since it already uses minimal data and no per-game training model. Their tech of course, but it would get a lot of use in upcoming games, where scaling the resolution can make for some massive performance changes. Maybe less so for benchmark purposes, but for playing the game it has some really interesting uses and potential gains, coupled with the upscaling to minimize image quality loss.

Comparably, some shaders and things like volumetric effects and the upcoming dabbling in ray tracing will be costly. Assassin's Creed Odyssey and its cloud quality setting hitting a near 60% framerate decrease at ultra might just be a bit of a starter, though new GPU hardware might help a bit, or just do as usual and brute force the performance. 😛 (I think it's still something like 40% on Very High, and then it gets a bit more reasonable although still costly from High or Medium, plus it has some pretty extreme image quality / performance ratios where it seems to barely change a thing above medium or high quality, ha ha.)

Red Dead here, I suppose, is down to the massive view distance and additional effects, even if Vulkan and D3D12 prevent another GTA IV bottleneck situation. Console max view distance was something like 10-20% of that slider, showing again that sometimes the PC version gets a hefty increase in settings and scalability, although the newer console generation and the upcoming one might change that around a bit. (Well, sorta; it's been some years since games like Crysis or Half-Life 2, where the low settings actually looked a generation back and the higher settings were very future-proofed, much as that gets called out as unoptimized when it's attempted.)

EDIT: But yeah, this game might work pretty well as a benchmark suite until, I don't know, Horizon Zero Dawn, which has some benchmark mode coming but might not be using anything too fancy. GTA VI after this, I suppose, as far as looking at what is next from Rockstar, and then I suppose they are still doing that whole year-on-console-first thing. Character models and texture detailing aside as a bit of a lower point, though still good-looking, it'll be interesting to see what the next-gen version of this RAGE engine can pull off when no longer limited by the PS4 or Xbox One hardware base.
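On the shader cost at 2560x1440 versus 1920x1080: a large part of it is raw pixel count, since most per-pixel shading work scales roughly with the number of pixels shaded. A back-of-the-envelope illustration in C; the ratios are exact, while the "cost scales with pixels" part is only an approximation. It also shows why rendering at a quarter of the output resolution, as discussed below for DLSS, is such a large win:

```c
#include <stdio.h>

/* Pixel-count ratio between two resolutions; per-pixel shader work scales
   roughly (not exactly) with this number. */
static double pixel_ratio(int w1, int h1, int w2, int h2)
{
    return (double)(w2 * h2) / (double)(w1 * h1);
}

int main(void)
{
    printf("1920x1080 -> 2560x1440: %.2fx the pixels\n",
           pixel_ratio(1920, 1080, 2560, 1440));          /* ~1.78x */
    printf("1920x1080 -> 3840x2160: %.2fx the pixels\n",
           pixel_ratio(1920, 1080, 3840, 2160));          /* 4.00x  */
    printf("quarter-res internal render for 4K: %.0f%% of the pixels\n",
           100.0 / pixel_ratio(1920, 1080, 3840, 2160));  /* 25%    */
    return 0;
}
```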
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
JonasBeckman:

Would be neat if NVIDIA could take their DLSS solution one step further for the next version since it already uses minimal data and no per-game training model
This is incorrect, and appears to be based on a misinterpretation of the following:

"One Network For All Games - The original DLSS required training the AI network for each new game. DLSS 2.0 trains using non-game-specific content, delivering a generalized network that works across games. This means faster game integrations, and ultimately more DLSS games."

But further on it states:

"Using our Neural Graphics Framework, NGX, the DLSS deep neural network is trained on a NVIDIA DGX-powered supercomputer. DLSS 2.0 has two primary inputs into the AI network:
- Low resolution, aliased images rendered by the game engine
- Low resolution, motion vectors from the same images -- also generated by the game engine
Motion vectors tell us which direction objects in the scene are moving from frame to frame. We can apply these vectors to the previous high resolution output to estimate what the next frame will look like. We refer to this process as 'temporal feedback,' as it uses history to inform the future."

https://www.nvidia.com/en-us/geforce/news/nvidia-dlss-2-0-a-big-leap-in-ai-rendering/

I realise the snippet from the DLSS page also says "While the original DLSS required per-game training, DLSS 2.0 offers a generalized AI network that removes the need to train for each specific game." But this isn't the case if you want the content to look correct.
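To make the quoted "temporal feedback" step a little more concrete, below is a heavily simplified sketch of the reprojection-and-blend idea: warp last frame's high-resolution result along the motion vectors, then combine it with the current frame. Everything here (the names, the fixed blend factor, the nearest-pixel lookup) is illustrative only; the actual DLSS 2.0 network decides how to weight and combine these inputs, and real implementations also handle upscaling, disocclusion and history clamping, which this skips.

```c
/* Heavily simplified temporal-feedback step, per output pixel:
   warp last frame's high-res result along the motion vector, then blend it
   with the current (already upsampled) sample. DLSS 2.0 replaces the fixed
   `alpha` with weights produced by its neural network; this is only the skeleton. */
typedef struct { float r, g, b; } Color;

Color resolve_pixel(int x, int y, int width, int height,
                    const Color *history,   /* previous high-res output        */
                    const Color *current,   /* current frame, naively upscaled */
                    const float *motion_x,  /* motion vectors, in pixels       */
                    const float *motion_y,
                    float alpha)            /* history weight, e.g. 0.9        */
{
    int idx = y * width + x;

    /* Reproject: where was this pixel in the previous frame? */
    int px = x - (int)motion_x[idx];
    int py = y - (int)motion_y[idx];

    Color cur = current[idx];
    if (px < 0 || px >= width || py < 0 || py >= height)
        return cur;                          /* no usable history at the edge  */

    Color prev = history[py * width + px];

    /* Temporal blend: history informs the new frame. */
    Color out;
    out.r = alpha * prev.r + (1.0f - alpha) * cur.r;
    out.g = alpha * prev.g + (1.0f - alpha) * cur.g;
    out.b = alpha * prev.b + (1.0f - alpha) * cur.b;
    return out;
}
```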
https://forums.guru3d.com/data/avatars/m/209/209146.jpg
Ah, that clears it up then. If NVIDIA had gotten the hurdle of per-game training down to requiring only a small amount of data, they would have been close to an almost generic implementation, or a control panel setting, that maybe wouldn't be as detailed but could still leverage the DLSS 2.0 and newer improvements without game-specific implementations being needed, so it could just work on pretty much everything.

Upscaling from a resolution that could be just a quarter of the final output, with good enough results to be well worth the performance gains: assuming that had been the case, it's a bit of an immediate win against the competition, as outside of direct comparisons users would just toggle it and that'd be it. Big performance gains with few drawbacks, and it'd probably improve further over time. Scaling of geometry data and of course the pixels for the target resolution and what it's scaling up from, plus shader performance scaling at different resolutions, along with finer detail preservation, TAA and maybe some sharpening, and you'd have a pretty strong advantage there, and not one I think AMD could easily match without making their own solution from scratch, which would take time, resources and manpower.

Game-wise, assuming it would have worked, that's overall a pretty hefty boost to performance, and from what's been seen of it so far it would even be usable for downsampling resolutions at a lower performance cost, or just staying at say 1920x1080 or 2560x1440 but with much less demand on the GPU.

Well, good to hear that cleared up; the wording there is a bit confusing, but it makes sense after reading your explanation. I don't think it'd be possible to get away with even a game-engine-generic version of this implementation (yet?), as not everything on Unity or Unreal, to use those popular ones as an example, is going to be similar. Though if that could be done, it'd be a pretty huge thing, at least from my view of how this works and how performance would change, to where there would be no real competition, as one GPU vendor would just have a way to make things significantly faster, with continual improvements that could be implemented over time as well.

The upcoming generation of games and the demand on shader and geometry performance will be a big thing, although eventually newer hardware capable of D3D12 Ultimate and the Vulkan equivalents will be required for anyone not on a Turing-type card. Or already, for that matter, what with games pushing shader effects that can halve framerate or more, plus it scales with the user's preferred resolution; even if it might not be 1:1, it will still incur a noticeably higher performance hit above 1920x1080 🙂

EDIT: Optimistically that's 30-50% extra performance just like that, which is kinda hard to match. Realistically it can't be quite that easy, though it sounded like NVIDIA had cleared one of the major obstacles towards this. Well, that's probably enough on that, and back to the actual benchmark. Who knows, if NVIDIA has Ampere launching early, maybe it won't be too long until the results are updated with what those cards can do. 😀