The Division 2: PC graphics performance benchmark review
Undying
The 1660 is slower than the 590, and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards.
Overall the game seems to run well on decent hardware.
kilyan
The DX12 issues I encountered in the beta are still present: occasional micro-freezes here and there, and the lighting going crazy. At least they fixed the random crashes.
SpajdrEX
I think disabling the Vignette effect could fix most of the lighting issues.
HybOj
Radeon VII matches the 1080 Ti at 1080p and beats it at anything above. Not bad, considering the 1080 Ti was the last decent nVidia card in terms of performance and perf-per-buck ratio.
Too bad you can't buy either the VII or the 1080 Ti now... LOL!
Too expensive for me anyway.
But to me the VII looks better than the 1080 Ti (16 GB HBM2). That's not THAT bad, and not THAT old news.
Also, it's only just been released and will work better over time.
Game runs "ok"...
Embra
Define "destroyed"?
2-4 fps at 1440p & 4K?
Denial
The hyperbole in these threads is hilarious.
vbetts
Moderator
This depends on the game completely, CS:GO for example.
CS:GO
https://prosettings.net/cs-go-best-settings-options-guide/
Fortnite
https://prosettings.net/best-fortnite-settings-options-guide/
Overwatch
https://prosettings.net/overwatch-best-settings-options-guide/
Look at the settings in those guides though: they're typically lower, and both cards would have no problem at those settings.
Rich_Guy
DX11 lookin' good! :)
metagamer
I wish we had 1% and 0.1% low charts too. The average alone isn't really enough, because some cards will tank harder than others.
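For context, a sketch of my own (not from the review) of how "1% low" figures are typically derived from a capture of per-frame times; exact definitions vary a bit between capture tools.

```python
def fps_metrics(frametimes_ms):
    """Return (average FPS, 1% low FPS) for a list of frame times in ms.

    "1% low" here means the average FPS over the slowest 1% of frames,
    which is a common convention, though tools differ in the details.
    """
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # Take the slowest 1% of frames (at least one frame).
    slowest = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    low_1_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1_fps

# Example: 99 smooth frames at 10 ms plus one 50 ms stutter.
avg, low = fps_metrics([10.0] * 99 + [50.0])
# The single stutter barely moves the average but dominates the 1% low,
# which is exactly why averages alone hide frame-time spikes.
```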
JonasBeckman
Is Apex still running on the Source engine like Titanfall 2? If that's the case, chances are it's using the same configuration, and the texture setting is a cache value defined in megabytes, where ultra is 8192. Lowering it a bit doesn't really do that much. :)
(It keeps more textures in VRAM, so there's less swapping around.)
For this game there are a few additional settings in the config file, like the neutral lighting from the first game ("full bright", though not quite that drastic), and it's possible to disable a few things like temporal anti-aliasing, plus some minor tweaking, though a bit less than in the first game. Similar glitches too, such as upping reflection quality to ultra (3 in the config file) breaking several reflective surfaces, most notably on the character models.
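Roughly the kind of thing that's meant, as an illustrative sketch only: the key names below are hypothetical, not taken from the actual Division 2 config file; only the ideas (integer quality levels with 3 = ultra, a TAA toggle, a neutral-lighting flag) come from the comment above.

```ini
; Hypothetical key names for illustration -- not the real file.
[Render]
ReflectionQuality = 2   ; 3 (ultra) reportedly breaks some reflective surfaces
TemporalAA = 0          ; disable temporal anti-aliasing
NeutralLighting = 1     ; "full bright"-style lighting from the first game
```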
Well, time to give this a more thorough read. Always fun to see a performance comparison, and now there's D3D11 and D3D12 too, and it's not just a CPU thing either, since even higher-end systems reportedly see a 10-15 percent gain.
I think Turing, and also Pascal, handle async fairly well, so AMD probably sees some competition here, since the 1080 Ti and the upper-end NVIDIA cards outperform Vega. The VII probably does alright, but the 2080 and 2080 Ti should be faster overall.
No NVIDIA- or AMD-specific effects, and many things mentioned in the tech feature video also applied to the first game, but I think CPU usage has improved further, probably with a bigger focus on async compute as well.
(I wonder if that's for D3D11 and D3D12 or just D3D12. Crackdown 3 has a toggle for it under DX11, though I expect Division 2 to focus heavily on DX12. Some users are reporting crashes, so it might be sensitive to third-party software, overclocks, and of course the display driver itself.)
Might try it, but I like using ReShade, so a 10% perf hit isn't that bad a compromise for staying with D3D11, though I should probably compare them at some point.
(RAM and CPU are hitting their limits though. Ryzen 3000 and DDR4 next, perhaps, but eh, who knows. :) )
EDIT: Realistic visuals like this make me wonder what ray tracing can eventually accomplish once it's developed further and more common. That's probably beyond my lifetime, but I am curious to see the natural limits of screen-space effects and rasterization, and of what's approximated with global illumination, lighting, shadows and all that. Still, it'll be decades before rasterization can be phased out in full.
(And at that point it's all about streaming, renting, and the power of some cloud somewhere. :) Well, perhaps not quite, but that's also been quite a focus recently, though still quite a ways off.)
The game isn't bad looking, but it's going to be a balance of visual quality and performance, and of course feature parity between console and PC and the whole thing, so I'm kinda curious to see what happens over the next generation or two, although it will take time. Oh well, this isn't bad either.
(AI? Physics? NO! Shiny visuals better. Ha ha! Well maybe that too can catch up eventually. Maybe.)
Truder
Bloody hell, the 970 has really fallen behind... That card used to be equivalent to the 290/390/480/580 in performance, and now it's looking very weak...
I'm wondering how the 4GB versions of the 470/480/570/580 compare, and whether it's a VRAM issue (the Fury typically also falls behind in this regard, though it's not too bad in this game at least), but the 970's performance is just woeful in comparison.
Is the 970 suffering from the Kepler effect? Maybe the 980 too?