Battlefield V Open Beta: PC performance benchmarks

Interesting, the Vega performance seems lacking at 1080p & 1440p, yet takes a jump at 4K. Thank you HH! 🙂
Administrator
Yeah, I know. In its current state it's just so difficult to measure. I've placed many disclaimers in the article and, in fact, doubted whether to post it at all, as the numbers really are all over the place depending on where you are on a map and, indeed, on the resolution. Better something than nothing, I figured. Once the final game is out I can streamline a good path/scene to measure in precisely. Consider these results a preliminary and indicative classification, not a precise one.
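Hilbert's point about run-to-run variance is the core problem with beta benchmarks. As a rough illustration (the frame-time numbers below are made up, not taken from the article), this is how a single benchmark pass is typically condensed into average-fps and 1%-low figures, and how two passes over different parts of a map can disagree:

```python
# Sketch: summarizing one benchmark pass from per-frame times (milliseconds).
# All frame-time values here are hypothetical, purely for illustration.

def summarize(frame_times_ms):
    """Return (average fps, 1%-low fps) for one benchmark pass."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # "1% lows": the fps implied by the slowest 1% of frames (at least one).
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# Two passes over different parts of a map can disagree badly,
# which is why a fixed, repeatable measurement path matters.
run_a = [11.0, 12.0, 11.5, 12.5, 30.0]   # contains one stutter spike
run_b = [12.0, 12.0, 12.0, 12.0, 12.0]   # smooth section of the map
print(summarize(run_a))
print(summarize(run_b))
```

The averages already differ by over 20% here, and the 1% lows diverge far more, which matches how scattered the beta numbers look between sites.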
I wonder what it is about Frostbite that makes DX12 such a mess. You'd think the team that requested/helped AMD build an entire low-level API would be the best at DX12, and yet they are somehow one of the worst. I guess we'll see when the final build ships and the numbers are updated, but the performance/stuttering mentioned doesn't look any better than their previous DX12 implementations.
I don't know about DX12 with the Frostbite engine... I tend to wonder the same as you, @Denial, especially as they are actively working together with Nvidia to implement RTX, which is DX12. But to what use, if the DX12 adoption itself is lacking? Maybe they should first fix their DX12 "container" itself before thinking about RTX performance numbers, since those are crippled by, well, being DX12 on Nvidia and Frostbite in the first place.
1080 Ti looking OK at 4K. I am assuming that's stock, not overclocked (I only need 60 fps).
Alan Stables:

Your Vega numbers are way off from every other test and my own testing
DX12 vs DX11, a 4.2GHz X99 processor vs a 5GHz Z370 processor. Not to mention a quick Google search shows multiple sites with results within ±15% of one another, on top of Hilbert literally saying the performance is all over the place in this game, on top of multiple user reports complaining about how the performance is a mixed bag.
lol, DX12 fails again... Where is the promised boost? BTW, this RAM usage is kinda weird too; AMD's cards eat a lot less of it.
GREGIX:

lol, DX12 fails again... Where is the promised boost? BTW, this RAM usage is kinda weird too; AMD's cards eat a lot less of it.
Both cards are AMD. The failure of DX12 is on the developer, not the API.
Embra:

Interesting, the Vega performance seems lacking at 1080p & 1440p, yet takes a jump at 4K.
Pretty typical - Vega's overall performance isn't that impressive, but it scales proportionately better than Nvidia's hardware as resolution increases.
GREGIX:

lol, DX12 fails again... Where is the promised boost?
DX12 and Vulkan reduce CPU and PCIe overhead. If the GPUs are the bottleneck, there isn't going to be [much of] an improvement.
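That overhead argument can be made concrete with a toy frame-time model (all numbers below are hypothetical): a frame costs roughly the larger of its CPU time and GPU time, so cutting CPU-side API overhead only raises fps while the CPU is the bottleneck.

```python
# Toy model: frame time ~ max(cpu_ms, gpu_ms). Numbers are made up.

def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work overlap each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound case (low resolution): halving API/driver overhead helps a lot.
print(fps(cpu_ms=14.0, gpu_ms=8.0))   # ~71 fps
print(fps(cpu_ms=7.0, gpu_ms=8.0))    # 125 fps

# GPU-bound case (4K): the same CPU-side saving changes nothing.
print(fps(cpu_ms=14.0, gpu_ms=25.0))  # 40 fps
print(fps(cpu_ms=7.0, gpu_ms=25.0))   # still 40 fps
```

This is also consistent with the benchmark pattern in the article: relative standings shift at 4K, where everything is GPU-bound and API overhead stops mattering.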
Silly question, do DX12 and Raytracing go hand in hand ?
Administrator
Zooke:

Silly question, do DX12 and Raytracing go hand in hand ?
It is not even remotely a silly question. Yes, I started DX12 testing with the knowledge in the back of my mind that DXR, or DirectX Raytracing, is an extension of the DX12 API.
fOrTy_7:

Thanks for the benchmarks. The game itself sucks really bad, though.
No, it's just you...
Denial:

I wonder what it is about Frostbite that makes DX12 such a mess. You'd think the team that requested/helped AMD build an entire low level API would be the best at DX12 and yet they are somehow one of the worst. I guess we'll see when the final build ships and numbers updated but the performance/stuttering mentioned doesn't look any better than their previous DX12 implementations.
I think by now it's safe to conclude that doing DX12 right is hard, and that everyone is still learning the ins and outs.
I don't think MS makes it any easier, either.
schmidtbag:

Pretty typical - Vega's overall performance isn't that impressive but its performance proportionately scales up better than Nvidia's hardware.
Or it's getting hard to fully utilize all those GCN cores - something we already saw with Fury.
What a terrible game. The performance is pretty decent, around 120-160 fps at Ultra 1440p using the BF1 SLI profile, with GPU usage around 70-95%. However, the game is just boring, the maps are way too big even for 64 players, and the graphics look like quite a step down from previous BF games. It's littered with post-process effects that add nothing but smears and blurs to textures and the overall image; spotting people is basically a task in itself. I am VERY disappointed with the game graphically, although performance can only get better, so that is a plus.

I do like the time to kill, which has been dramatically decreased - by far the best thing they have done with this game. I cannot tell you how many times in previous BF games I emptied an entire clip into a guy, with every bullet getting a hit marker, only for him to shrug it off and run away - beyond infuriating. This game has solved that issue 100%: now people drop within 5-6 bullets, and 1-2 to the head.

But I just feel like this is going to be a very short-lived BF game. People are already sick of the WW1/2 setting again, and all they want is modern combat again... guess you can't please everyone.
Well, this performance doesn't look too bad in this review: 118 fps for the 1070 Ti with DX11 at 1080p, all maxed out. The 1070 Ti scores about the same (2% more) as my overclocked GTX 1070 in the Time Spy graphics score, so I should be OK with my GTX 1070 for close to 144 fps. I don't even run BF1 maxed out (AA off, Lighting Quality on High), so if I run the same slightly reduced settings in BF V I reckon I'll see close to 144 fps, although I'll be increasing the viewable angle to 90 degrees horizontal, which will reduce fps a bit. I reckon I'll get a good experience in this game with a GTX 1070, not far off BF1 in framerate; I might have to reduce one more setting to match the framerate I'm seeing in BF1, though - a good result! Thanks for the review Hilbert, looking forward to a review post launch, with RTX etc. EDIT: or I could just download the open beta & test it myself... downloading now!
Running at 7720x1440 100Hz Eyefinity with 3 R9 390s at 50% resolution scale. Getting 60-70 fps in game, on the 17.12.2 drivers as well. Pretty happy with how it's playing currently.
screwtech02:

Running at 7720x1440 100Hz Eyefinity with 3 R9 390s at 50% resolution scale. Getting 60-70 fps in game, on the 17.12.2 drivers as well. Pretty happy with how it's playing currently.
That's pretty impressive performance. What settings did you use?
I still get the feeling DICE uses a DX12 wrapper around their DX11 render path and just adds some eye candy on top of that. If in 2018 your "DX12" game still performs worse than DX11, it's likely that it is not a native DX12 title to begin with.