Battlefield Hardline VGA graphics performance review

I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add it in the future?
Administrator
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add it in the future?
Oh buddy, can you please just read the text in the articles as well?
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add it in the future?
He's locked out of the game.
My 280X still sits between 770 and 780, not bad. Let's just hope Witcher 3 will be playable on it.
Moderator
I was hoping to see the AMD 290X in 3840x2160 - Ultra - 4xMSAA. Maybe you can add it in the future?
So the first thing that came to your mind is that this Hilbert dude is an Nvidia fanboy, eh? Well, remember the HardwareID DRM thingy? I am locked out of the game at this point and have not been able to test the AMD Radeon cards in Ultra HD. And that kind of blows, as we even have a Radeon R9 290X with 8 GB of graphics memory in the house that I wanted to pit against the GTX 980 and Titan X in Ultra HD. AMD is doing surprisingly well in this game with much cheaper hardware. BTW, look at the performance CRUMBLE in Ultra HD; these cards will all need to drop down from 4xMSAA to 2xMSAA. But again, I can't show you that due to the lockout. The GTX 780 Ti crumbles; it is a 3GB VRAM card, and this resolution combined with our settings eats close to 4 GB of graphics memory. We'll show you that on the next page. Anyway, I owe you the Radeon Ultra HD results as well as the FCAT frametime results, as my EA accounts are currently dead in the water due to the DRM HardwareID lockouts.
Here you go.
Oh buddy, can you please just read the text in the articles as well?
I did; that's why I said maybe in the future, when the EA crap gets fixed, you can add them to the review.
The 290/290x seem to be getting better with age. 🙂
Does this game have Mantle compatibility? Sorry for my English.
Does this game have Mantle compatibility? Sorry for my English.
Yes, Mantle is supported by Hardline. ...Buying a 290X was the best decision I made in 2013!
Thanks for the review, Hilbert! *EA stinks like a strip club after Mardi Gras.
The 290/290x seem to be getting better with age. 🙂
I'm very happy to see how the 290 did. Though I only purchased my card late last year, it's already proven to still be quite strong. I wanted to switch sides around this time; my last two setups were AMD, and I usually alternate between both sides every couple of cards. But there was no way in hell I was spending $300 on a GTX 780 3GB when I got my R9 290 4GB OC for $242 with shipping.
The 290/290x seem to be getting better with age. 🙂
As are all AMD GPUs. AMD/ATi has always been at a disadvantage at release because their drivers always start out very poor. By the time the GPUs perform the way the hardware was intended, they're already obsolete. I would argue that AMD usually releases better hardware than Nvidia, but Nvidia's drivers are definitely superior. On release day (of either the hardware or the games), drivers are what matter most, and that's why Nvidia tends to rank better. I think they just need to rewrite their drivers from the ground up. Anyway, I'll be interested to see Hardline's results if/when they release a Mantle version. I have no intention of getting the game, but it still interests me.
Thanks for the review, Hilbert! *EA stinks like a strip club after Mardi Gras.
I don't know that smell...
Thanks for the benchmarks, Hilbert! Do you think it still makes sense to use MSAA at UHD resolution? I was expecting better 970 and 980 performance compared to the 290(X), and it turns out the AMD cards can handle this game nicely. Also, this DRM sucks. I'm glad I didn't preorder this game. Now I can vote with my wallet.
Administrator
Thanks for the benchmarks, Hilbert! Do you think it still makes sense to use MSAA at UHD resolution?
No. I wanted to test some other AA possibilities, e.g. a lower MSAA value (2x) and, for Nvidia, MFAA. However... I can't start up the game and test/continue with this article until the DRM protection gives me permission to access the game again.
No. I wanted to test some other AA possibilities, e.g. a lower MSAA value (2x) and, for Nvidia, MFAA. However... I can't start up the game and test/continue with this article until the DRM protection gives me permission to access the game again.
Thanks again. I really get your frustration and I hope they'll resolve this issue.
This game seems to use a lot of screen-space reflections. Does anyone know if the SSR covers dynamic objects as well as static ones? I want to play the SP campaign, but I don't want to pay full price for the game as I have no interest in MP.
The 290/290x seem to be getting better with age. 🙂
I'm certain it's Hawaii's compute performance that makes it do so well in this game. Kepler was relatively much weaker in compute, whilst Maxwell improved compute performance for Nvidia by a large margin. The Frostbite 3 engine uses lots of compute shaders for all sorts of things to speed up performance.
Why? The GTX 970 is way faster than the 290X 🙂
That guy has no idea what he's doing; look at the frametimes for the 290X. There's something wrong with his system. He states himself that frametimes are actually better for AMD in multiplayer and worse for Nvidia. I'll wait for Hilbert's frametime results.
Come on, this benchmark is way better: http://www.pcgameshardware.de/Battlefield-Hardline-PC-258805/Specials/Benchmark-Test-1154059/ Non-reference cards, minimum framerates, etc. 🙂 Take a look at the frametimes (singleplayer) 🙂 GTX 970: http://www.pcgameshardware.de/screenshots/original/2015/03/Frametimes_SP_GTX_970-pcgh.png 290X: http://www.pcgameshardware.de/screenshots/original/2015/03/Frametimes_SP_290X_DX_20150325105837-pcgh.png AMD doesn't even have a chance.
Did you even listen to what I said? Frametimes seem to be completely broken and stuttery for AMD in singleplayer, and the opposite in multiplayer.