Unreal Engine 4 Elemental DX12 Tech Demo Download

Benchmarks & Demos, updated by Hilbert Hagedoorn

Ironically, it runs much better under DX11 than DX12 for me. I guess the technology is still immature.
Some textures look horrible. Like Xbox 360 textures.
Anyone looked at CPU utilization to see if DX12 is actually working properly?
Unreal Engine 4 is certainly capable of much better than what the Elemental demo shows, that's for sure; it's a few years old now, after all. Epic did mention that DX12 would be available from version 4.9 of UE4, which is still being worked on. The new Unreal Tournament will be using 4.9/DX12 as soon as it's available.
Quote: "Anyone looked at CPU utilization to see if DX12 is actually working properly?"
Cut and pasted from my post in the other thread (the part relevant to your question is the bit about CPU utilisation):

I think you can run the benchmark in DX11 (if you run the main application icon) or in DX12 (if you run the link marked "DX12"). It runs OK on my laptop: I got an average of 40 fps as recorded by FRAPS using DX11, with minimum fps in the high 20s. The DX12 version seemed to run at the same fps (FRAPS wouldn't run with DX12), except overall power draw was on average about 5 W less as measured by my Kill-A-Watt. I think this was because I was getting a constant 99% GPU utilisation in DX11 versus a slightly lower constant 97% in DX12. Strangely, CPU utilisation seemed similar between the two versions, although DX12's CPU utilisation did look very slightly more evenly distributed between the cores. VRAM usage in DX12 was 700 MB higher than in DX11.

Both the DX11 and DX12 versions looked the same, so I don't think they're running different settings. All a bit meh, but I'm happy the DX12 demo ran well on my laptop regardless!
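For anyone who wants to check the core-distribution point themselves, here's a minimal sketch (assuming Python 3 with the third-party psutil package installed) that logs per-core load once a second while the demo runs; run it once against the DX11 icon and once against the DX12 link and compare the spread:

```python
# Per-core CPU load logger: a sketch, assuming Python 3 with
# psutil installed (pip install psutil). Start the Elemental
# demo first, then run this alongside it.
import psutil

SAMPLES = 30  # about 30 seconds of one-second samples

for _ in range(SAMPLES):
    # percpu=True returns one utilisation figure per logical core,
    # averaged over the one-second sampling interval.
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(per_core)))
```

A flatter spread across cores in the DX12 run is what you'd expect if the lower-overhead API is really distributing driver work.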
Quote: "Unreal Engine 4 is certainly capable of much better than what the Elemental demo shows, that's for sure; it's a few years old now, after all. Epic did mention that DX12 would be available from version 4.9 of UE4, which is still being worked on. The new Unreal Tournament will be using 4.9/DX12 as soon as it's available."
Maybe it will be possible to combine "A Boy and His Kite" with UE4.9 once it's out of testing, which should be a better test. 😀 (And it will allow UE4.9 to mature a bit more and possibly improve its D3D12 implementation too.) The downside is that the showcase version of that particular demo is a full 50 GB. (But it will allow the user to combine the showcase with the latest engine version and test new features that way.)
It worked pretty well on my aging 680/920 combo, except for the rift formation at the end, where it dropped below 30 fps. To be honest, game visual quality seems (at least to me) to be mostly about these features, in this order:
1. Animation smoothness
2. How realistic the animation is, e.g. natural running
3. Textures
4. Lights
5. Extra effects
And most of the performance bottlenecks are related to the extensive use of extra shiny effects, where it would be enough to just ship better textures for clearly visible objects and make them behave naturally.
Was expecting a lot more from DX12. Thanks Unreal.
DX11 was smooth all the way through. CPU utilization was around 20-22%, mostly on a single core, but there were glare issues where the textures were corrupt.

DX12 ran OK but was a little slower, going from around 55 fps average down to 50, and used 700 MB more memory. The strange thing with DX12 was the odd stutter, where fps dropped from 50 down to 10 for just a second before returning to normal; this happened in 5 parts of the demo. MSI Afterburner showed utilization dropping along with the clock speeds, and it's not overheating either. CPU utilization averaged 8% total, spread mostly across 4 cores, with the extra threads only being used a little.

There's not a massive difference between the two. The glare problems weren't present in DX12, and there was steam off the **** with heat waves and significantly more smoke and debris; textures remained the same. This was at 2560x1080. DX12 had slightly lower GPU utilization at around 97% with GPU temps of 83C; DX11 sat at 99% and temps went up to 89C. Fan speed increased a lot more in DX11, indicating the GPU was under more pressure given its higher utilization, and its clock speed remained stable since there were no stutters; in the parts of the demo where DX12 stuttered, the clock speeds dropped for a second.

Latest release of Catalyst drivers for W10, MSI Gamer 3 R9 290 (no overclock, can't seem to unlock voltage under W10) and i7 4790K @ 4.8GHz.
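Those one-second dips are easy to pin down from a frametime log, if you can capture one (FRAPS writes a frametimes CSV for DX11 runs; for DX12 you'd need a different capture tool). A sketch, assuming a FRAPS-style CSV where the second column is the cumulative present time in milliseconds; the file name is hypothetical:

```python
# Flag stutter spikes in a FRAPS-style frametimes CSV. Column 2 holds
# the cumulative time (ms) at which each frame was presented, so the
# per-frame cost is the difference between consecutive timestamps.
# "frametimes.csv" is a placeholder: use whatever your capture wrote.
import csv

with open("frametimes.csv", newline="") as f:
    rows = csv.reader(f)
    next(rows)  # skip the header row
    stamps = [float(row[1]) for row in rows]

deltas = [b - a for a, b in zip(stamps, stamps[1:])]

for i, ms in enumerate(deltas):
    if ms > 50.0:  # over 50 ms per frame is a dip below 20 fps
        print(f"frame {i + 1}: {ms:.1f} ms (~{1000.0 / ms:.0f} fps)")
```

That would catch every 50-to-10 fps hitch and tell you exactly where in the run it happened.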
Quote: "Was expecting a lot more from DX12. Thanks Unreal."
It's just some Guru3D user who ported it to DX12 with a few minor tweaks; overall it's still the same old DX11 Elemental demo. 🙂 And I see he used too much bokeh DOF in some parts; not really a fan of that, because it glitches the main character's helmet (horns) when he stands up.
Quote: "It's just some Guru3D user who ported it to DX12 with a few minor tweaks; overall it's still the same old DX11 Elemental demo. 🙂 And I see he used too much bokeh DOF in some parts; not really a fan of that, because it glitches the main character's helmet (horns) when he stands up."
RTSS is capping FPS in the "DX12 version". I didn't notice that before.
Crossfire profile = 0 because of borderless; when I'm in fullscreen, same problem. I want to test DX12 with CFX, but...
Quote: "Crossfire profile = 0 because of borderless; when I'm in fullscreen, same problem. I want to test DX12 with CFX, but..."
Try the Dying Light Crossfire profile. I used the Dying Light SLI profile for the UE4 demos and it works for SLI. Even this one worked.
I'll try, thanks.
it looks terrible
Quote: "it looks terrible"
Well, it's a two-year-old tech demo that was ported to DX12 and isn't finished code. Also, back then the textures were lower resolution, with a lot of shaders to hide that fact. So it looks even worse now than it did then.
Glad to hear this is a couple of years old, because hopefully that means Sweeney has had time to whip the UE4 engine into something nice for the PC. I like the content of this demo; it's fun to watch, but the quality is very low, as others have already said. It looks just like console fare to me, and as that is all Sweeney has been doing for many years now (ever since Epic had the brilliant idea to dump PC development), I hope by now he'll have whipped up some nice PC-level quality in the engine. I couldn't see anything requiring D3D12 in the demo, really. Besides, DXDIAG is still telling me that the highest D3D feature level my R9 380 supports in hardware is 11.1, unless Microsoft hasn't properly updated dxdiag.exe to report D3D hardware support higher than that; but I think it reports just what the driver tells it.
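On the feature-level point: dxdiag only echoes what the driver reports, but you can ask the D3D12 runtime directly by calling D3D12CreateDevice with a null output pointer, which probes support without creating a device (it returns S_FALSE when the level is available). A sketch, assuming Windows 10 and Python 3; the GUID is the documented IID of ID3D12Device:

```python
# Probe D3D12 feature-level support on the default adapter: a sketch
# using ctypes against d3d12.dll (Windows 10 only).
import ctypes

class GUID(ctypes.Structure):
    _fields_ = [("Data1", ctypes.c_uint32),
                ("Data2", ctypes.c_uint16),
                ("Data3", ctypes.c_uint16),
                ("Data4", ctypes.c_ubyte * 8)]

# IID of ID3D12Device: {189819F1-1DB6-4B57-BE54-1821339B85F7}
IID_ID3D12Device = GUID(0x189819F1, 0x1DB6, 0x4B57,
                        (ctypes.c_ubyte * 8)(0xBE, 0x54, 0x18, 0x21,
                                             0x33, 0x9B, 0x85, 0xF7))

d3d12 = ctypes.WinDLL("d3d12")

# D3D_FEATURE_LEVEL values from d3dcommon.h
levels = {"11_0": 0xB000, "11_1": 0xB100, "12_0": 0xC000, "12_1": 0xC100}

for name, fl in levels.items():
    # ppDevice = None means no device is created; the call just checks
    # whether the adapter supports at least this minimum feature level.
    # S_OK (0) or S_FALSE (1) both mean "supported".
    hr = d3d12.D3D12CreateDevice(None, fl, ctypes.byref(IID_ID3D12Device), None)
    print(f"feature level {name}: {'supported' if hr in (0, 1) else 'not supported'}")
```

If the driver exposes D3D12 at all, the 11_0 probe should succeed even on cards whose maximum hardware feature level is 11.1.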
Quote: "Nvidia's failed attempt to try to derail DX12 low-level API performance so we stay with the green camp, because they can't compete with AMD's asynchronous shaders. How stupid do they think we are!!! Nvidia will need a new GPU design because they can't compute the same way AMD does. The Fury's performance is going to surprise many people very soon, and then the defence of their purchase will hit the fan... It never changes."
Wtf are you talking about? In the DX12 benchmark the Fury X just matches the 980 Ti, which it already does in some titles. How is Nvidia not competing? It's literally neck and neck.
Quote: "Nvidia's failed attempt to try to derail DX12 low-level API performance so we stay with the green camp, because they can't compete with AMD's asynchronous shaders. How stupid do they think we are!!! Nvidia will need a new GPU design because they can't compute the same way AMD does. The Fury's performance is going to surprise many people very soon, and then the defence of their purchase will hit the fan... It never changes."
Oxide Games released a patch to artificially gimp DX results back in Star Swarm, when NV's low-overhead driver was putting their cards above AMD's Mantle. I wouldn't put all your eggs in one basket, let alone one from OG. The results also aren't anywhere near as drastic as you claim them to be.
It's too soon to get worked up about lack of optimizations or performance.