Review: Battlefield 1 DirectX 11 and 12 PC graphics performance

Thanks for the time and effort to do this Hilbert!
Thanks for the review, Hilbert! Looks like a fun game, well optimised since day one. It's nice to see cards like the 290/390 and 290X/390X doing so well at 1080p/1440p and surprisingly well at 2160p. On the other hand, like you've just said, the 980 Ti results are weird indeed; hopefully the next update for BF1 or the next set of drivers for green cards is going to solve the problem.
These guys also seem to show the 980 Ti getting beaten by the 390X and trailing the 1070 and Fury, so it's not just Guru3D's result that appears to be out of place. Thermal throttling? I take it they are all reference designs, and the guys posting higher framerates may have locked their fan speeds higher. http://*************/battlefield-1-directx-12-benchmarks-amd-nvidia/
Oh god, the price for it is just too much! When did games become so expensive?
I just wonder how they can make basically the same game over and over again and still manage to require more processor power every time. New physics engine? Bots with advanced AI? Did they give no attention to CPU overhead under DX11?
Isn't it a completely different engine than BF4? I don't think Frostbite was ready when BF4 was released.
Isn't it a completely different engine than BF4? I don't think Frostbite was ready when BF4 was released.
Same engine, FB3, but they've made obvious improvements to material shaders and the like since then. I imagine a lot of iteration goes on in between the various releases of games, similar to how UE4 has had multiple new or overhauled systems since it launched. Honestly, given the level of integration Frostbite now has in EA games, they probably have a dedicated team just for maintaining and improving it. The various studios that use it probably just pick a release and maintain that fork for the duration of that specific game.
I'd really like to see a DX11 vs DX12 comparison on a low-spec CPU. DX12 is supposed to be about efficiency improvements, so it would be nice to see if that shows up with, say, an i3 vs an AMD quad core. Assuming it's anything like Mantle, there should be some significant gains over DX11 on low-end CPUs. So far I have not seen anyone test this.
Is anyone else having this issue? The game ran fine on my system when I was playing the open beta. Sometimes I can get 100 fps synced using Ultra settings at 1440p with DX11. If I close the game and restart it later with the same settings, I'm barely getting 60-70 fps. I then delete the BF1 folder in my Documents and restart the game; sometimes this fixes the issue, but it reverts back to the crap frame rate after closing down and restarting.
That looks like some shader cache conflict. Try turning shader cache off in the BF1 driver profile and see if it makes any difference.
It is actually 'their', but that's beside the point. This game is not crappy; it will probably be one of the best games this year. I have over 600 hours in BF4 alone. Honestly, this DRM will affect almost no one. Other than this particular benchmarking case, I can't see how this DRM will adversely affect anyone.
OK, you win a cookie. It is when it has spyware DRM. Keep in mind that the beta most likely didn't.
With the cost of the game, looks like I'll be waiting a while to pick it up.
No SLI when DX12 is enabled. I get 70-75 FPS on Ultra + 125% resolution scale on my 1080 @ 2126 MHz. But on DX11 I get a consistent 100 FPS thanks to SLI.
Are you sure? Did you check via monitoring?
I'm surprised by the lack of performance gain with DX12 in the benchmark. In SP I tested the same spot with DX11 and DX12, and my FPS went from 76 to 85 (approximately). I'll try that spot again when the full game releases; this was done on the 10h demo. GPU: AMD 390. Drivers: 16.10.1 (not WHQL). With that said, DX12 slightly stuttered for me. I mentioned this elsewhere on the forums. It seems to go away after a long period of play, but I do get small stutters after launch.
I'm surprised too. With BF1 being such a major title, it will be interesting to see if this game shows any real benefit from DX12 in any situation at all. And if not, does that mean this game is just really well coded for DX11, or that DX12 is a gimmick?
Awesome game. Awesome graphics. Yo
Administrator
Updated (19/10): More cards added - GTX 980 Ti retested.
It's a very optimized game; at 1440p with a 1070 I can max it out easily, and this is on DX11. I always knew that Frostbite would become a very good-looking engine, and BF1 is more proof of that.
You can't say it's optimized, bro; you own a state-of-the-art graphics card. I would be surprised if a 1070 wouldn't run it nicely. What worries me is the CPU utilization, as it was mentioned that even high-end CPUs are being used up, which is totally out of proportion...
Same engine, FB3 - but they've made obvious improvements to material shaders and stuff since then. I imagine a lot of iteration goes on in-between the various releases of games, similar to how UE4 has multiple new/overhauled systems in it since it launched. Honestly, I would think with the level of integration Frostbite now has in EA games, they probably have a dedicated team just for maintaining/improving it. The various studios that use it probably just pick a release and maintain that fork for the duration that specific game.
It is definitely a heavily modified BF4 Frostbite engine, and the game looks absolutely sick.
You can't say it's optimized, bro; you own a state-of-the-art graphics card. I would be surprised if a 1070 wouldn't run it nicely. What worries me is the CPU utilization, as it was mentioned that even high-end CPUs are being used up, which is totally out of proportion...
Every high-end card from the last 4 years can play this at max settings at 4K and average 30 fps; how is this NOT optimized? Would it be better if the game only used 1 core, for a more cinematic experience?
The R9 Fury is doing its job very well, and it's cheaper than the RX 480 (8 GB) and GTX 1060 (6 GB) in France (€279).