Assassin's Creed: Valhalla graphics performance benchmark review

Great review, HH. It's interesting to see what "financially backing" a game engine can do for graphics performance.
I got confused by the page title > "Assassins Creed: Valhalla graphics perf benchmark review - RTX - DLSS 2.0 Perf - Quality" :D Does the game support DLSS? :P
lol that 5700 XT is faster than it has any right to be. 😀
Curious too, as this shouldn't be utilizing the full range of RDNA features the way Horizon was, but the implemented AMD extensions could still cover more than FreeSync 2 HDR support.
Administrator
SpajdrEX:

I got confused by the page title > "Assassins Creed: Valhalla graphics perf benchmark review - RTX - DLSS 2.0 Perf - Quality" :D Does the game support DLSS? 😛
Ah, a template residual, fixed. It could certainly use DLSS support though 🙂
A solid upscaling option wouldn't hurt. I think the game has two modes, but I'm not 100% sure it kept the same setup as the two prior Assassin's Creed games on this engine: adaptive resolution scaling as its own setting, and then, below "High", the anti-aliasing quality rendering at sub-native resolution. That makes the AA setting look like it has a higher-than-average performance impact when it's really two separate components, the scaling and then the TAA itself. Since this game changes things up, I'd like to see actual confirmation of how it implements it.
EDIT: Of course, since it's TAA, just checking whether the image looks soft doesn't really work; TAA softens the image anyway and can't really be disabled. 😀 (Heavy softening from upscaling the final image should still stand out, I'd imagine.)
EDIT: Actually, with scaling as a separate option, and both above and below 100% render resolution available, I would think TAA on Low or Medium is now entirely decoupled from modifying the back-buffer resolution or how it's scaled back up.
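For anyone curious how a decoupled setup like that usually works, here is a minimal sketch of a frame-time-driven resolution scaler that sits entirely apart from the TAA pass. The names, thresholds and step sizes are my own assumptions for illustration, not the Anvil engine's actual implementation:

```python
# Hypothetical sketch: adaptive resolution scaling decoupled from TAA.
# All constants and names are illustrative assumptions only.

TARGET_FRAME_MS = 16.7           # e.g. aiming for 60 fps
MIN_SCALE, MAX_SCALE = 0.50, 1.00

def update_render_scale(scale, last_frame_ms):
    """Nudge the render scale up or down based on the last frame time."""
    if last_frame_ms > TARGET_FRAME_MS * 1.05:     # too slow -> lower resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_FRAME_MS * 0.90:   # headroom -> raise resolution
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

def frame(scale, native=(2560, 1440)):
    render_res = (int(native[0] * scale), int(native[1] * scale))
    # 1) render the scene at render_res
    # 2) apply TAA at render_res (its quality is a separate, independent setting)
    # 3) scale the anti-aliased image to native resolution for display
    return render_res
```

With that split, turning TAA down to Low or Medium would only change the anti-aliasing pass, while the render-scale slider alone decides whether the back buffer is above or below native resolution.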
2560x1440, all ultra settings and 100% res scale: I get 75 fps average on a 2080 Ti, with drops to 60 fps in cities. Surprising to see how well the 5700 XT is doing here; I wonder how well the 6800 XT will perform then..
SpajdrEX:

I got confused by the page title > "Assassins Creed: Valhalla graphics perf benchmark review - RTX - DLSS 2.0 Perf - Quality" :D Does the game support DLSS? 😛
No, and good riddance.
Undying:

lol that 5700 XT is faster than it has any right to be. 😀
"We should state that Assassin's Creed: Valhalla is an AMD sponsored title. Overall we say that AMD benefits from the game the most, as it should, as AMD financially backs the game. With this title, we'll also move towards a new test platform, based on the AMD Ryzen 9 5950X, currently the fastest gaming processor your money can get you." ;) It starts :) gg GURU3D
lukas_1987_dion:

2560x1440, all ultra settings and 100% res scale: I get 75 fps average on a 2080 Ti, with drops to 60 fps in cities. Surprising to see how well the 5700 XT is doing here; I wonder how well the 6800 XT will perform then..
The 6800 XT will perform worse the higher the resolution, due to the 256-bit bus and "only" having regular GDDR6. So it will knock it out of the park at 1080p, and likely be somewhat slower than the 3090, and possibly the 3080, at 4K.
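For what it's worth, the raw bandwidth gap being described here can be worked out from the public bus width and per-pin data rate figures. This back-of-the-envelope sketch is my own arithmetic and deliberately ignores the 6800 XT's 128 MB Infinity Cache, which AMD positions as the compensation for the narrower bus:

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 16.0))   # RX 6800 XT, GDDR6  ->  512.0 GB/s
print(bandwidth_gbs(320, 19.0))   # RTX 3080, GDDR6X   ->  760.0 GB/s
print(bandwidth_gbs(384, 19.5))   # RTX 3090, GDDR6X   ->  936.0 GB/s
```

Whether the cache closes that raw gap at 4K is exactly what the upcoming reviews will show.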
Administrator
Dragam1337:

The 6800 XT will perform worse the higher the resolution, due to the 256-bit bus and "only" having regular GDDR6. So it will knock it out of the park at 1080p, and likely be somewhat slower than the 3090, and possibly the 3080, at 4K.
:) Not saying a thing here, but sometimes I am so proud of the Guru3D community.
Dragam1337:

The 6800 XT will perform worse the higher the resolution, due to the 256-bit bus and "only" having regular GDDR6. So it will knock it out of the park at 1080p, and likely be somewhat slower than the 3090, and possibly the 3080, at 4K.
Wonder what the Infinity Fabric and a 320-bit or wider bus could do. Sure, there's full 512-bit, but then it all gets complicated due to pricing and whatnot, although combining the full set of these together would have been interesting to see. A 384-bit or even 448-bit bus might have been the top end if AMD was going that far for the enthusiast model, as I don't believe AMD has gone 512-bit since the attempt with the ring bus and the 290 GPU. Add an HBM2E type of memory while we're dreaming about things that won't happen; best case, AMD maybe uses that for the professional GPU lineup, which I think is the rumored CDNA architecture replacing GCN, eventually. A bit under a week until reviews for these, though, and some good, more demanding titles for testing them too. 😀
EDIT: I suppose GDDR6 and even GDDR6X come in a range of speeds, plus improvements since their initial use on GPUs, but I don't think AMD would be using the top-end chips, same as NVIDIA likely isn't using the fastest GDDR6X modules, at least for now. It just comes back to what the benchmark results will be, and then overclocking results, if there's any headroom for long-term stability when pushing the stock speeds higher. It was a bit iffy for the 5000 series due to the VRAM chips, a mix between Samsung and Hynix I think it was, plus the memory controller and how the GPU handled all of that. (Not well at all until 19.8.1, from what I recall, and various issues since then with how sensitive these are.)
Hilbert Hagedoorn:

:) Not saying a thing here, but sometimes I am so proud of the Guru3D community.
This almost seems like a hint for the upcoming review of the 6000 series... 😱
AMD has sponsored Ubisoft titles in the past with no obvious indication that their cards performed abnormally better than Nvidia's, but the sponsorship seems to have, belatedly, paid off. Nvidia cards are struggling here, and the performance the 5700 XT is putting out bodes well for the 6000 series.
JonasBeckman:

Wonder what the Infinity Fabric and a 320-bit or wider bus could do. Sure, there's full 512-bit, but then it all gets complicated due to pricing and whatnot, although combining the full set of these together would have been interesting to see. A 384-bit or even 448-bit bus might have been the top end if AMD was going that far for the enthusiast model, as I don't believe AMD has gone 512-bit since the attempt with the ring bus and the 290 GPU. Add an HBM2E type of memory while we're dreaming about things that won't happen; best case, AMD maybe uses that for the professional GPU lineup, which I think is the rumored CDNA architecture replacing GCN, eventually. A bit under a week until reviews for these, though, and some good, more demanding titles for testing them too. 😀
Well, the only option if they want to maintain the 16 GB VRAM configuration is 512-bit. It would be more expensive, sure... but not THAT much more expensive. So I would personally have expected at least the 6900 XT to get a 512-bit bus, given its 1k USD price point and use of much cheaper GDDR6 memory. A 6900 XT with a 512-bit bus would likely have destroyed the 3090 at 4K, rather than possibly being a bit behind it. AMD haven't done a GPU with a 512-bit bus since the 390, but I don't see any reason why they couldn't. HBM2 usually fares worse than GDDR6 in games due to higher latency; latency is king in games, which is also seen with Intel vs AMD, where Intel has traditionally had substantially lower memory latency.
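Plugging a hypothetical 512-bit GDDR6 configuration into the same back-of-the-envelope maths shows why it looks attractive on paper; this is purely illustrative arithmetic, not a claim about any real SKU:

```python
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(512, 16.0))   # hypothetical 512-bit GDDR6       -> 1024.0 GB/s
print(bandwidth_gbs(256, 16.0))   # actual 6800 XT / 6900 XT bus     ->  512.0 GB/s
print(bandwidth_gbs(384, 19.5))   # RTX 3090 GDDR6X, for comparison  ->  936.0 GB/s
```

On raw numbers that would double the bandwidth of the shipping cards, though it says nothing about board cost, power, or how much the Infinity Cache already recovers in practice.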
Hilbert, could you check if turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it can work miracles, going from 35 to 45 fps (one step down) to 60 fps (two or three steps down) on my old 290X at FHD; it is most visible on weaker cards.
When I bought the 2080 Ti I thought it would be 4K capable, which it really wasn't, but I had high hopes for the 3080 being a true 4K card. Apparently it's not, at least not with Ubi games. W_D Legion would run at 4K but with reduced settings, and I could never get it locked at 60 fps. I was sure that Valhalla would run like Odyssey, which after a few patches ran beautifully at 4K/60 with close to max settings on a 2080 Ti.
Sylwester Zarębski:

Hilbert, could you check if turning down the Volumetric Clouds setting a notch or two gives a massive performance improvement? In previous AC games (Odyssey and Origins) it can work miracles, going from 35 to 45 fps (one step down) to 60 fps (two or three steps down) on my old 290X at FHD; it is most visible on weaker cards.
I checked out another review where they said that clouds have very little impact in this iteration. There are one or two settings that are much heavier on the GPU.
willgart:

I love the guys claiming that GDDR6X or huge bandwidth is required. The drop in performance for the 3090, 3080 and 3070 is about the same between 1440p and 4K; specifically, the drop for the 3080 is 31% and for the 3070 it's 33%. Same quantity of RAM, not the same amount of CUs, GDDR6X vs GDDR6... 2% of difference... for sure the speed of the RAM has no impact, which is expected. AMD made the right move going GDDR6 instead of GDDR6X, so we can expect to see the 6900 XT at the 3090 performance level, as expected.
Memory speed and bandwidth do make a difference at high resolution. But if you're below 4K, like most people, it doesn't matter. Might try this game some day, see the sights and give the GPU a good workout. Most likely not. Who has time to play games anymore?
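For reference, the "drop in performance" percentages quoted above are just the relative fps loss when moving from 1440p to 4K. A quick sketch with placeholder numbers, since the review's exact per-card averages aren't repeated here:

```python
# Relative performance drop between two resolutions, as a percentage.
def drop_pct(fps_1440p, fps_2160p):
    return (1 - fps_2160p / fps_1440p) * 100

# Placeholder figures only, to illustrate the calculation.
print(round(drop_pct(100.0, 69.0), 1))   # -> 31.0 (% drop, e.g. the 3080 figure quoted)
```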
The 3070 seems to perform as well as the 2080 Ti at 4K despite having lower memory bandwidth. There is a bit more to GPU architecture than bandwidth, tbh.
Did Nvidia pull a Pascal again? I had a 1070, and when the RTX 2000 series was released its performance "went down the toilet". Before, it was on par with a Vega 56, and after the 2000 series launched it was often like 10% behind in newer titles. I mean, the 5700 XT is almost on par with the 2080 Super at 2K...