VRAM usage and Conclusion
Conclusion
Genuinely, I had braced myself for a possible bitch-fest with this Ubisoft title; historically speaking, we always ran into weird stuff or DRM limitations. This round: nil, nothing, nada. Neither did I anticipate the game being this good, because, guys, it genuinely is. The gameplay is admirable, and the scenery has been rendered elegantly. You could perhaps argue that the game does not have any raytracing support, as ray-traced reflections would have made this title really shine. But even without it, this game shows what the good old rasterizer engine can do, and that's a whole lot! Maybe the one gripe would be the rendering quality of the cut-scene characters, which is a little 'meh,' but other than that, this game looks great.
Performance-wise, the game is HARD on the GPU. And guess what, we like that. Why? A demanding game usually indicates one of two things: either it's a poor console port with sloppy coding, or the GPU is extremely hard at work creating beautiful scenery, and here it is that last option for sure. I know that is subjective, but for me, PC gaming is all about playing at the best possible render quality. So I am fine with included graphics options that are taxing, as, hey, you can always turn down image quality and gain back performance to your liking.
In-game testing or the benchmark?
Preferably, we always use an in-game benchmark, as that eliminates the random variables that could otherwise influence your framerates. The ground rule, however, is that the internal benchmark must be representative of overall gameplay FPS. This game has an in-game benchmark that is pretty decent at matching your actual framerates. Again, framerates vary per scene big time; it's all about the average in a somewhat GPU-stringent situation. However, we stepped away from the internal benchmark, as at high resolutions it started to stutter quite a bit. So the alternative was to find a spot in the game that is harsh on the GPU and record a 30-second average. In the end, the in-game results are still very close to the internal benchmark, so I'd say you can still compare them at home. About the stutters: these can be found in the game as well, shortly after a scene starts. But once textures etc. are cached, the game is pretty much stutter-free.
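For readers who want to replicate this at home, a 30-second run like the one described above boils down to capturing per-frame render times (with a tool such as PresentMon) and reducing them to an average FPS figure. A minimal sketch of that reduction, assuming frame times in milliseconds and a hypothetical capture (this is not our actual tooling, just an illustration):

```python
# Illustrative sketch: reducing a frame-time capture (milliseconds per
# frame) from a ~30-second manual benchmark run to FPS figures.

def average_fps(frame_times_ms):
    """Average FPS over the capture: total frames / total elapsed time."""
    total_ms = sum(frame_times_ms)
    return 1000.0 * len(frame_times_ms) / total_ms

def one_percent_low_fps(frame_times_ms):
    """1% low FPS: the FPS implied by the slowest 1% of frames,
    a common way to quantify stutter."""
    worst = sorted(frame_times_ms, reverse=True)
    slowest = worst[:max(1, len(worst) // 100)]
    return 1000.0 / (sum(slowest) / len(slowest))

# Hypothetical capture: mostly ~16.7 ms frames (about 60 FPS) plus a few
# 50 ms stutter spikes like those seen shortly after a scene starts.
capture = [16.7] * 1800 + [50.0] * 5

print(f"Average FPS: {average_fps(capture):.1f}")
print(f"1% low FPS:  {one_percent_low_fps(capture):.1f}")
```

Note how a handful of stutter spikes barely moves the average but drags the 1% lows down noticeably, which is exactly why a scene-start stutter shows up in perceived smoothness more than in the average figure.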
Both AMD and NVIDIA have Valhalla-optimized drivers. Overall, we'd say AMD benefits the most from the game, as it should, since AMD financially backs it. Especially the Vega 56/64 and the RDNA-based Radeon RX 5600/5700 (XT) perform extremely well in Full HD, while the GeForce graphics cards start catching up at WQHD. By the way, it's not as if AMD is hiding its sponsorship; when the game starts, you'll notice quite some AMD lovin' going on in the form of Ryzen adverts etc. Hey, it's all in the game; in the end, NVIDIA does the same.
That Ultra quality mode, as stated, is harsh in Ultra HD. Only the new RTX cards can deal with it; the 6800 series from AMD will likely land in roughly that same spot as well, we just cannot show you those results yet. That said, it is an original and, above all, intriguing game. In the end, AC Valhalla is a game that will be appreciated by many; there are some very cool graphics quality settings to fiddle and play around with. Just don't expect 60 FPS Ultra HD performance with all the eye candy on your sub-500 USD graphics card. However, if you do have that expensive card, you're in for a treat. If you are into the Assassin's Creed genre and type of gameplay, this is by far one of the better releases to date.
And remember, you're playing a Viking, so raid!
- Download the latest NVIDIA GeForce graphics game ready driver (download).
- Download the latest AMD Radeon graphics Adrenalin driver (download).
- Sign up to receive a notification when we publish a new article
- Or go back to Guru3D's front page.