Review: Total War WARHAMMER DirectX 12 PC GFX & CPU scaling performance

It feels a bit unfair because MSAA was purposely disabled/removed, even though it would have been the preferred anti-aliasing option. Hence AMD knows it is performing better here, and probably would do worse with MSAA. I could be wrong, but it certainly raised my eyebrows. Hey, like I stated in the conclusion, this is how things go in 2016, it is what it is. AMD has the upper hand with clever optimizations in this game; good for them and for everybody with Radeon cards.
So how come you've used FXAA in several of your reviews, but haven't mentioned anything about it being an "unfair advantage for NVIDIA", even though you think "MLAA for AMD is what FXAA is for Nvidia", and you see MLAA here as a bit of an unfair option? I didn't do a thorough sweep of all the reviews, obviously, but in at least one case there was even an MSAA option available (Crysis 3), yet you still went with FXAA.
So how come you've used FXAA in several of your reviews, but haven't mentioned anything about it being an "unfair advantage for NVIDIA", even though you think "MLAA for AMD is what FXAA is for Nvidia", and you see MLAA here as a bit of an unfair option? I didn't do a thorough sweep of all the reviews, obviously, but in at least one case there was even an MSAA option available (Crysis 3), yet you still went with FXAA.
FXAA's impact on performance is equal on both AMD and Nvidia. The article makes it sound like MLAA negatively impacts performance on Nvidia more than AMD and since there is no other option for AA he wanted to put a note in there regarding it. I don't really see the issue. He disables Hairworks and stuff for the same reason in other benchmarks specifically because it favors Nvidia. Also, if he had called out FXAA in Crysis 3 the same way he did MLAA here, would you have posted the same thing about not calling out MLAA in previous benchmarks?
FXAA's impact on performance is equal on both AMD and Nvidia. The article makes it sound like MLAA negatively impacts performance on Nvidia more than AMD and since there is no other option for AA he wanted to put a note in there regarding it.
The last time I saw a game with MLAA (Deus Ex), NVIDIA actually took less of a hit from it than AMD.
I don't really see the issue. He disables Hairworks and stuff for the same reason in other benchmarks specifically because it favors Nvidia.
Yet he doesn't disable FXAA even when he thinks it's the same for NVIDIA as MLAA is for AMD, and calls MLAA unfair.
Also, if he had called out FXAA in Crysis 3 the same way he did MLAA here, would you have posted the same thing about not calling out MLAA in previous benchmarks?
If he had called FXAA an unfair advantage in a similar way in his previous reviews, as he now did with MLAA, then no, I wouldn't have said anything about it. The point being: he thinks MLAA is for AMD what FXAA is for NVIDIA, yet only MLAA is being called an unfair advantage for AMD, while FXAA isn't called the same for NVIDIA. That just doesn't add up.
Administrator
The last time I saw a game with MLAA (Deus Ex), NVIDIA actually took less of a hit from it than AMD. Yet he doesn't disable FXAA even when he thinks it's the same for NVIDIA as MLAA is for AMD, and calls MLAA unfair. If he had called FXAA an unfair advantage in a similar way in his previous reviews, as he now did with MLAA, then no, I wouldn't have said anything about it. The point being: he thinks MLAA is for AMD what FXAA is for NVIDIA, yet only MLAA is being called an unfair advantage for AMD, while FXAA isn't called the same for NVIDIA. That just doesn't add up.
No, you are nitpicking and zooming in on an argument just to win that argument. Feel free to think and express what you want. FXAA is optimized for both sides and both sides take an equal performance hit these days. MLAA, however... not so much. Just google MLAA and AMD a bit. The fact that MSAA was disabled, thus enforcing MLAA, is imho a little suspect, and as such I mention it in the article. MLAA itself works great, so kudos, top notch, thumbs up and hail mary there. But the fact remains that MSAA is disabled to enforce MLAA, and that is suspect, period. Anyway, I've said what I wanted to say on the topic. Everybody can think whatever they want about MLAA, I honestly do not care. I'm very much done getting sucked into these weird discussions on the forums where somebody agitated has to prove his/her point.
Any idea how this game runs on an APU? I haven't seen many tests of APUs in DX12.
Any idea how this game runs on an APU? I haven't seen many tests of APUs in DX12.
Yeah, it's weird -- Microsoft made a huge deal about how great DX12 was going to be for integrated graphics, in terms of both power savings and performance. Yet after the initial Star Swarm benchmark, we haven't heard a peep about it. I guess the issue is that APUs aren't exactly the target demographic for Guru3D. I was hoping someone like Anandtech would do a DX12 review of them, but they can't seem to get anything out anymore.
Meh... I'm still waiting for all those magic next-gen UE4 games and whatnot; so far only RoTR and Doom came a bit close to the word "next-gen", at least visually.
Meh... I'm still waiting for all those magic next-gen UE4 games and whatnot; so far only RoTR and Doom came a bit close to the word "next-gen", at least visually.
And Uncharted 4, Until Dawn, and The Order: 1886, but they are console games and we shall not speak their names in here.
No we can't, it's an AMD fanboy's wet dream. To get back on topic: an overclocked 980 (which I have, and no, there's no SLI support) running at about 1500 MHz gets you below 60 fps on the campaign map, with dips to below 10 during the end turn (when the AI takes its turns). On average it shows 59 when I have the ShadowPlay overlay on. But the game runs fine, I have a G-Sync monitor. What I see is that I would actually downgrade my fps by going to DX12. So... it's not a thing to do, as simple as that. DX12 offers nothing in better graphics or performance... no gain, so it's a fail for a well-built machine that happens to have an Nvidia graphics card. Getting 59 or 60 fps with an overclocked card, it doesn't make any difference between DX11 and DX12. By the way, I'm not surprised this game runs better on AMD, it's in AMD's partner program... 🙄
As Aelders correctly says all the time, AMD doesn't have an advantage under DX12, they have a disadvantage under DX11. The thing is that cards are really sold with DX11 performance in mind. So your 980 is overpriced compared to the hardware it has vs a 390X/Nano, and when that hardware is fully utilized under DX12, the AMD cards pull ahead (which they SHOULD have done anyway under DX11, but they don't). It's a matter of pricing and expectations more than actual performance.
As Aelders correctly says all the time, AMD doesn't have an advantage under DX12, they have a disadvantage under DX11. The thing is that cards are really sold with DX11 performance in mind. So your 980 is overpriced compared to the hardware it has vs a 390X/Nano, and when that hardware is fully utilized under DX12, the AMD cards pull ahead (which they SHOULD have done anyway under DX11, but they don't). It's a matter of pricing and expectations more than actual performance.
My personal opinion is that all this is hot air. DX12 is too new and un-optimized to even begin a discussion like this. I agree AMD was behind in DX11 performance, and yet they still managed to compete. You only need to look back at DX9 through 9.0c and you'll understand that DirectX is evolving software. At the moment we are at the very start of a new API and in no way can we make a judgement call on how it will end up. I have a feeling that DX12 will go through many a stage in its life, and for all we know Nvidia and AMD will both get wins and losses along the way. I highly doubt Nvidia will sit on its hands while AMD takes an API lead, never going to happen, which is a good thing for us I suppose. Right now the discussion is about DX12, but who knows what percentage of games will even use this API. For all we know maybe only 10% of future games will run on DX. I have a feeling that most devs will not want to work on an MS API and will instead go with an underdog like Vulkan.
Nvidia don't want to promote DirectX 12 because they are not prepared for this API. Until Volta comes, these single-thread old farts (Maxwell & Maxwell 2/Pascal) will only push in DX11 games. Sadly, Nvidia is not a pioneer in the gaming industry. Well, I want to see how they perform in Battlefield 1 (DX12). 🙂
Everything about this post is wrong.
I agree that NVIDIA doesn't really have a "true" DX12 architecture yet, but I don't agree with ****ting on their products so much. If you don't really expect a tremendous uplift with DX12, their stuff is OK. The problem they have is that they sell with "DX11 pricing", which doesn't seem to correspond to the DX12 performance against similar Radeons (look at the Nano vs the 980, for example).
I'm not even sure I know what the definition of a "true" DX12 architecture is. Unless you just define it as GCN. Which I guess is basically what it's going to come down to, since DX12 is essentially built off of GCN's capabilities. It just bothers me when I read posts like "Maxwell and Pascal are single threaded", which is just complete nonsense. If you're talking about actual threads, it's super wrong, but even if you translate it into what he's trying to say, that it can't mix compute/graphics, it's still wrong. Maxwell can mix compute/graphics.
I'm not even sure I know what the definition of a "true" DX12 architecture is.
Properly capable of multi-engine stuff. Less serialized. Please let's not start this again.
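For readers wondering what "multi-engine" actually refers to: under D3D12 an application can create separate command queues for graphics and compute work and let them run side by side, which is the capability usually being argued about in these async-compute threads. Below is a minimal, hypothetical C++ sketch (not from the review or the game's code) showing the two queues being created on one device; device/adapter setup and error handling are omitted.

```cpp
// Minimal sketch of D3D12 "multi-engine" usage: one DIRECT (graphics) queue
// and one COMPUTE queue on the same device, so compute work can overlap
// rendering. Device creation and error handling are omitted for brevity.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CreateEngineQueues(ID3D12Device* device,
                        ComPtr<ID3D12CommandQueue>& graphicsQueue,
                        ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // Graphics ("direct") engine: accepts draw, copy and compute commands.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // Compute engine: accepts compute and copy commands only. Work submitted
    // here may execute concurrently with the direct queue if the hardware and
    // driver schedule it that way -- this is the "async compute" case.
    D3D12_COMMAND_QUEUE_DESC compDesc = {};
    compDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&compDesc, IID_PPV_ARGS(&computeQueue));
}
```

Whether that concurrency actually yields a speed-up is exactly where GCN and Maxwell differ in practice, which is what the argument above is really about.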
I agree that NVIDIA doesn't really have a "true" DX12 architecture yet, but I don't agree with ****ting on their products so much. If you don't really expect a tremendous uplift with DX12, their stuff is OK. The problem they have is that they sell with "DX11 pricing", which doesn't seem to correspond to the DX12 performance against similar Radeons (look at the Nano vs the 980, for example).
You must understand that Nvidia can do pretty much what they want with pricing vs AMD because they have a MASSIVE market share in the GPU area. From a business point of view they must feel that they can do nothing wrong, which was more than evident with the 970 debacle: basically releasing a product that was slightly misleading, yet it still went on to become the best-selling card of all time. Either Nvidia has built up a massive fanboy base or they are just that good. Opinion on that statement makes no difference; the truth is that AMD is like a flea on a cow's behind compared to Nvidia in the GPU market.
You must understand that Nvidia can do pretty much what they want with pricing vs AMD because they have a MASSIVE market share in the GPU area. From a business point of view they must feel that they can do nothing wrong, which was more than evident with the 970 debacle: basically releasing a product that was slightly misleading, yet it still went on to become the best-selling card of all time. Either Nvidia has built up a massive fanboy base or they are just that good. Opinion on that statement makes no difference; the truth is that AMD is like a flea on a cow's behind compared to Nvidia in the GPU market.
They have 56.64% of the gaming PC market according to the Steam Hardware Survey; AMD has 25.46%. That's more than double, but GCN is the target platform for Apple and all the consoles on top of that. They are big, but on the PC, and the PC is not the end of all things. All AAA titles have engines tweaked for GCN, otherwise it would be commercial suicide.
I'll be very disappointed if the final public release of the DX12 patch doesn't support feature level 11_0/11_1.
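For context, feature level 11_0/11_1 support is what would let the DX12 renderer run on older DX11-class GPUs rather than requiring feature level 12_0 hardware. A hypothetical check (my own sketch, not anything from the patch) would look roughly like the C++ below, where the device is created against D3D_FEATURE_LEVEL_11_0 as the minimum; adapter enumeration is simplified.

```cpp
// Rough sketch: test whether a DX12 device can be created at feature level
// 11_0 on any adapter in the system, which is roughly what a renderer would
// need to do to support older DX11-class GPUs under DX12.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

bool SupportsDx12AtFeatureLevel11_0()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return false;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        // Try to create a real device with 11_0 as the minimum feature level.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            return true;   // at least one adapter can run DX12 at FL 11_0
    }
    return false;
}
```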
The point, as usual, is moot. NVIDIA cannot be ignored because (as already stated) putting out a PC game that only runs well for a quarter of users is pointless. And even if NVIDIA were tossed aside by devs thanks to the prominence of AMD, and things like GameWorks on GitHub for UE4 didn't work out... it's not like they would lie down and give up. Same as AMD, these companies are too big (or important) to fail. It's the dynamic we have had for a long time now, like it or not: Intel makes CPUs, NVIDIA makes GPUs, and AMD serves as the only competition to both. If any of the three companies is on the verge of falling, it will be saved. I'm just tired of people spelling doom for NVIDIA or AMD (obviously no one spells doom for Intel, lol). It's so melodramatic. Maybe I'm reading into this too much, but the way this all comes across is what I've been seeing more and more of for a while now. And it seems no hardware, software, or video game quells the doomsayers.
The biggest problem for AMD, at least in recent years, is that they're always one step behind Nvidia release-wise. Their GPUs are obviously pretty good, and the drivers are also good. The problem is they're a few months behind when it comes to releasing cards to challenge Nvidia. They need to change that, they need to be a step ahead. There are plenty of people who are often looking to upgrade to the newest architecture, but those sales go into Nvidia's bank account. I'm sure if AMD had 1080 and 1070 competitors out right now, Nvidia would lose out on quite a few sales. They simply let Nvidia do whatever they want.
I would actually argue that their SW is behind, not their hardware. Look at Hawaii/Maxwell comparative performance over time.
If anything, AMD's drivers are better than Nvidia's, with pretty nice boosts in performance even for older cards, something you just don't see with Nvidia.