The Division 2: PC graphics performance benchmark review


The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
The DX12 issues I encountered in the beta are still present: occasional micro-freezes here and there, and the lighting going crazy. At least they fixed the random crashes.
I think that disabling the Vignette effect could fix most of the lighting issues.
The Radeon VII is the same as the 1080 Ti at 1080p and better at anything above. Not bad, considering the 1080 Ti was the last OK card from Nvidia in terms of performance and perf/buck ratio. Too bad you can't buy either the VII or the 1080 Ti now... LOL! Too expensive for me anyway. But to me the VII looks better than the 1080 Ti (16 GB HBM2); that's not THAT bad and not THAT old news. Also, it's barely released and will work better in time. The game runs "ok"...
Define "destroyed"? 2-4 fps at 1440p & 4k?
spajdrik:

I think that disabling the Vignette effect could fix most of the lighting issues.
I can try that; my issue is the light becomes so bright that I get blinded.
warlord:

And the Radeon VII is destroyed by the 2080. Vega is so old news.
It's strange, because it's an AMD game and you'd expect the RVII to perform better, especially as the RVII and 2080 Ti are listed as the 4K60 cards for this game and the 2080 is listed as the 1440p card.
Undying:

The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
You're looking at it completely wrong, but that doesn't surprise me in the slightest. The 1660 Ti beats the 590 and the 1660 beats the 580, in an AMD-sponsored game.
The hyperbole in these threads is hilarious.
Moderator
warlord:

You do realize there are competitive gamers playing at 1080p 144 Hz+? That 10 fps difference in an AMD-sponsored title is quite laughable.
This depends completely on the game. CSGO, for example: https://prosettings.net/cs-go-best-settings-options-guide/
Fortnite: https://prosettings.net/best-fortnite-settings-options-guide/
Overwatch: https://prosettings.net/overwatch-best-settings-options-guide/
Look at the settings used in those games, though; they're typically lower, and both cards would have no problem hitting that.
Dx11 lookin good! πŸ˜€
I wish we had 1% and 0.1% low charts too; the average alone isn't really enough, because some cards will tank more than others.
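For anyone curious what a "1% low" chart would actually plot: one common definition is to average the slowest 1% (or 0.1%) of frame times in a run and convert that back to FPS. A minimal sketch (the function name and the example frametime numbers are mine, purely illustrative):

```python
def percentile_low(frametimes_ms, pct):
    """Average FPS over the slowest `pct` fraction of frames.

    One common definition of the "1% low": sort frame times from
    slowest to fastest, average the worst pct of them, and convert
    that average frame time back to FPS.
    """
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, int(len(worst) * pct))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Example: a mostly-smooth 60 fps run with five big hitches.
frametimes = [16.7] * 995 + [50.0] * 5   # ms per frame
avg_fps = 1000.0 / (sum(frametimes) / len(frametimes))
print(round(avg_fps))                         # average FPS -> 59
print(round(percentile_low(frametimes, 0.01)))   # 1% low -> 30
print(round(percentile_low(frametimes, 0.001)))  # 0.1% low -> 20
```

The point of the example: the average barely notices five hitches, while the 1%/0.1% lows drop hard, which is exactly why a card that "tanks" shows up in low charts but not in the average.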
metagamer:

I wish we had 1% and 0.1% low charts too; the average alone isn't really enough, because some cards will tank more than others.
Especially those with a lower amount of VRAM. Try playing Apex with the 8 GB "insane" texture setting and a 6 GB card will choke and stutter.
Is Apex still running on the Source engine like Titanfall 2? If so, chances are it uses the same configuration: the texture setting is a cache value defined in megabytes, where ultra is 8192, and lowering it a bit doesn't really do much. 🙂 (It keeps more textures in VRAM, so there's less swapping around.)

For this game there are a few additional settings in the config file, like the neutral lighting from the first game ("full bright", though not quite that drastic), and it's possible to disable a few things like temporal anti-aliasing, plus some minor tweaking, though a bit less than in the first game. Similar glitches too, such as upping reflection quality to ultra (3 in the config file) breaking several reflective surfaces, most notably on the character models.

Well, time to give this a more thorough read; it's always fun to see a performance comparison, and now there's D3D11 and D3D12 too. It's not just the CPU either, since even higher-end systems reportedly see a 10-15 percent gain. I think Turing and also Pascal handle async fairly well, so AMD probably sees some competition here, since the 1080 Ti and the upper-end Nvidia cards outperform Vega, although the VII probably does alright; the 2080 and 2080 Ti should be faster overall.

No Nvidia- or AMD-specific effects, and many things mentioned in the tech feature video also applied to the first game, but I think CPU usage has improved further, and there's probably a bigger focus on async compute as well. (I wonder if that's D3D11 and D3D12 or just D3D12; Crackdown 3 has a toggle for it in DX11, though I expect Division 2 to focus heavily on DX12. Some users are reporting crashes, so it might be sensitive to third-party software, overclocks, and of course the display driver itself.) I might try it, but I like using ReShade, so a 10% perf hit isn't that bad of a compromise for going with D3D11, though I should probably compare them at some point.

(RAM and CPU are hitting their limits though; Ryzen 3000 and DDR4 next, perhaps, but eh, who knows. 😀 )

EDIT: With realistic visuals like this, I wonder what ray tracing can eventually accomplish once it's developed further and more common. That's probably beyond my lifetime, but I am curious to see the natural limits of screen-space effects and rasterization, and of what is approximated with global illumination, light, shadows and all that stuff; still, it'll be decades until that can be phased out in full. (And at that point it's all about streaming and renting and the power of some cloud somewhere. 😛 Well, perhaps not quite, but that's also been quite in focus recently, and it's also quite a ways off yet.)

The game isn't bad looking, but it's going to be a balance of visual quality and performance, and of course feature parity between console and PC and the whole thing, so I'm kinda curious to see what happens over the next gen or two, although it will take time. Oh well, this isn't bad either. (AI? Physics? NO! Shiny visuals better. Ha ha! Well, maybe those can catch up eventually. Maybe.)
JonasBeckman:

Is Apex still running on the Source engine like Titanfall 2? If so, chances are it uses the same configuration: the texture setting is a cache value defined in megabytes, where ultra is 8192, and lowering it a bit doesn't really do much. 🙂 (It keeps more textures in VRAM, so there's less swapping around.)
So that means less texture pop-in and stutter. That translates into a better experience, even if you have a slower GPU like the RX 580 compared to the 1660.
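The "bigger cache budget, less swapping" effect being described can be sketched with a toy LRU texture cache. Everything here is illustrative (class name, numbers, the 100 MB textures) and not how any real engine implements it, but it shows why a budget smaller than the working set causes constant re-uploads, i.e. pop-in and stutter:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache. A bigger budget (in MB, like the 8192
    'ultra' value mentioned above) means fewer evictions, so fewer
    re-uploads -- the source of pop-in and stutter."""
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.cache = OrderedDict()   # texture id -> size in MB
        self.reloads = 0             # requests that missed the cache

    def request(self, tex_id, size_mb):
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)   # mark as recently used
            return
        self.reloads += 1                    # miss: must re-upload
        while self.used + size_mb > self.budget and self.cache:
            _, evicted = self.cache.popitem(last=False)  # evict LRU
            self.used -= evicted
        self.cache[tex_id] = size_mb
        self.used += size_mb

# Same access pattern, two budgets: the smaller cache thrashes.
pattern = [i % 8 for i in range(100)]        # cycle over 8 textures
small, large = TextureCache(600), TextureCache(900)
for t in pattern:
    small.request(t, 100)                    # 100 MB per texture
    large.request(t, 100)
print(small.reloads, large.reloads)          # -> 100 8
```

With a 900 MB budget all eight textures fit, so only the first touch of each one misses; with 600 MB the cyclic access pattern evicts exactly the texture needed next, and every single request becomes a reload.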
Bloody hell, the 970 has really fallen behind... That card was equivalent to the 290/390/480/580 in performance and now it's looking very weak... I'm wondering how the 4 GB versions of the 470/480/570/580 compare, in case it's a VRAM issue (the Fury typically also falls behind in this regard, but it's not too bad in this game at least), but the 970's performance is just woeful in comparison. Is the 970 suffering from the Kepler effect? Maybe the 980 too?
Undying:

So that means less texture pop-in and stutter. That translates into a better experience, even if you have a slower GPU like the RX 580 compared to the 1660.
Yeah, that too; stuttering and dips in an online game, especially a competitive one, can be quite a problem. Hitching during critical moments is a real hassle, so no wonder it's so common to dial settings down to near or below min-spec, among the other advantages that can provide.

Things changed pretty quickly after hovering around 3-4 GB too: 5, then nearly 6, then up to 8 GB or even higher, and it's not just cache either but actual data being stored. Newer games push even above 8 GB, particularly when combined with high-res textures or texture-pack add-ons, plus higher display resolutions such as ultrawide 3440x1440 becoming more supported and popular, and 3840x2160 itself or higher, though now we're pretty much requiring a high-end GPU to drive that. (Not helped by the lower number of titles with SLI and Crossfire support.)
Only Intruder:

Bloody hell, the 970 has really fallen behind... That card was equivalent to the 290/390/480/580 in performance and now it's looking very weak... I'm wondering how the 4 GB versions of the 470/480/570/580 compare, in case it's a VRAM issue (the Fury typically also falls behind in this regard, but it's not too bad in this game at least), but the 970's performance is just woeful in comparison. Is the 970 suffering from the Kepler effect? Maybe the 980 too?
The 980 Ti is also slower than the RX 590 in this game. Maxwell is getting the Kepler treatment.
kilyan:

The DX12 issues I encountered in the beta are still present: occasional micro-freezes here and there, and the lighting going crazy. At least they fixed the random crashes.
Well, the same problems you describe are present in DX11 as well. And as for the random crashes being fixed: I had three and my mate had two in the last hour, during a single mission.
Undying:

The 1660 is slower than the 590 and the 1660 Ti is slower than the 1070. Great showing for the new GTX Turing cards. Overall the game seems to run well on decent hardware.
That about sums it up. I thought the 1660 Ti would handily beat the 590 in this title.
r3nt5ch3r:

What's the point in eating up all the VRAM? Any visual differences? Any differences in frametimes? Or is it just filling up empty space?
Just fill up all the video card's things... because that's what memory is for, amirite? /s