Dragon Age: Inquisition VGA graphics performance review

Poor Hilbert has to test these crappy games, lol. Good job.

Administrator:
Quote: "Poor Hilbert has to test these crappy games, lol. Good job."
For the RPG gamer/aficionado, this is a good title, my man.

Nice article, HH. So this is the second game where we see this mixed-up performance: the 280X is as fast as the 780, and the 290X is faster than the 780 Ti, while the poor 680/770 are left behind. What's happening with Kepler cards and their drivers? Did they do that on purpose, lol?

Quote: "So this is the second game where we see this mixed-up performance [...] What's happening with Kepler cards and their drivers?"
AMD Gaming Evolved or bad drivers, you never know.

I'm surprised to see that Mantle is actually slower than DX11, at least at the moment. But also, 980 SLI for 60 fps at 1440p? Wow 😀

Quote: "I'm surprised to see that Mantle is actually slower than DX11, at least at the moment. But also, 980 SLI for 60 fps at 1440p? Wow 😀"
Is there any clue as to whether Split-Frame Rendering (SFR) was used for multiple GPUs instead of the standard Alternate-Frame Rendering (AFR)? This was the case with Civilization: Beyond Earth, and even though SFR didn't scale FPS as well as AFR, it did provide noticeably lower latency, which translated into smoother and more responsive gameplay.
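
For anyone unfamiliar with the tradeoff being described, here is a toy sketch of why AFR scales throughput while SFR cuts latency. All numbers and the split-overhead factor are invented for illustration; this is not how Frostbite or any real driver schedules frames:

```python
# Toy model of dual-GPU frame scheduling. All timings are made up
# for illustration; real drivers are far more complex.

RENDER_TIME = 30.0  # ms for one GPU to draw one full frame

def afr_stats(num_gpus: int):
    # AFR: each GPU renders a whole frame, and frames alternate between
    # GPUs. Throughput scales with GPU count, but each frame still takes
    # a full RENDER_TIME, so input-to-display latency is unchanged.
    frame_interval = RENDER_TIME / num_gpus  # a frame finishes this often
    latency = RENDER_TIME                    # one frame = one full render
    return frame_interval, latency

def sfr_stats(num_gpus: int, split_overhead: float = 1.2):
    # SFR: every GPU renders a slice of the SAME frame, so the frame is
    # done sooner (lower latency), but splitting/compositing overhead
    # means throughput scales worse than AFR.
    latency = (RENDER_TIME / num_gpus) * split_overhead
    frame_interval = latency  # next frame starts when this one is done
    return frame_interval, latency

for name, fn in (("AFR", afr_stats), ("SFR", sfr_stats)):
    interval, latency = fn(2)
    print(f"{name}: {1000/interval:.0f} fps, {latency:.0f} ms latency")
```

Under this toy model, two GPUs in AFR give roughly 67 fps at 30 ms latency, while SFR gives around 56 fps at 18 ms latency: fewer frames, but each one reaches the screen sooner, which matches what the Civilization: Beyond Earth SFR mode showed.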

Quote: "For the RPG gamer/aficionado, this is a good title, my man."
I will take your word for it. Don't take offence, RPG'ers; I have equal hate for just about all the new games.

Wait, what? A GTX 970 at Full HD, Ultra quality, only gets 50 fps? I don't get it. What's going on with games these days? I just ordered a GTX 970 and I'm already disappointed; this is quite a high-end card. My 560 ran Deus Ex: Human Revolution at 60 fps, and that was a 250-dollar card. I bought this rig exactly when DXHR came out. I was expecting at least 70 fps at Full HD in current games from cards around 970 performance. The 970 also barely runs AC: Unity at 40 fps. Seriously, do I have mental issues, or are games starting to trash single-card setups?

Quote: "Wait, what? A GTX 970 at Full HD, Ultra quality, only gets 50 fps? [...] Seriously, do I have mental issues, or are games starting to trash single-card setups?"
Bad coding and bad optimization, that's all; lowering some settings will gain you good fps. Most games have settings that cost fps and do barely anything. The 2005-2006 era is gone, when a high-end GTX could reach 100 fps in all games maxed out, even the most demanding. I reckon the money we pay for all GPUs is overpriced, way overpriced.

Very enjoyable game! And the amount of content is staggering; one of the better titles of this year for sure. Did I mention the near-bugless state of the game? 😀

Quote: "For the RPG gamer/aficionado, this is a good title, my man."
Good look at the game, too, HH... especially from one whose cup of tea is not RPGs, apparently... ;) I prefer this kind of game to anything else, actually. But really well-made, involving RPGs unfortunately come along only every few years or so, and when one is released it is usually to a lot of pent-up demand, as you noticed in the game reviews you've seen.

Quote: "[...] The 2005-2006 era is gone, when a high-end GTX could reach 100 fps in all games maxed out [...] I reckon the money we pay for all GPUs is overpriced, way overpriced."
Hell yeah, I agree. I remember when I got my 6600GT; boy, did that card absolutely destroy games for the 200-dollar price tag.

Quote: "Hell yeah, I agree. I remember when I got my 6600GT; boy, did that card absolutely destroy games for the 200-dollar price tag."
Yep, I remember my first good ATI card, a Radeon 9550. It was the era when NVIDIA failed hard with the FX (ex-3dfx employees created the FX series to get revenge on NVIDIA 😀). The card cost me nothing and played F.E.A.R. at medium settings, some at high, with good fps, even with a junk 1.7 GHz Pentium 4. The 7600GT was awesome as well; it ate F.E.A.R. like nothing, and all the other games too. Other than cards being overpriced, or more accurately, most games are not optimized at all; consoles play a big part in that.

Great job, Hilbert; the benchmarks are pretty much right. On my rig, the minimum FPS I had was around 45, but I can keep 60 with adaptive vsync most of the time, maxed out, 1440p, 4x MSAA. The game is great fun for me, and I don't regret upgrading my GPUs for it (plus TW3, GTA5 and some others); while it is indeed demanding, it looks accordingly good, imo.

Seeing as Mantle is designed to reduce CPU overhead and isn't necessarily targeted toward improving GPU performance, I can't say the results are that surprising. Run this on a crappy CPU like an A6 and I'm sure you'll see some good results. Anyway, I likely won't be getting this game. Dragon Age: Origins was very disappointing; I'm glad I got it free on Origin. It basically felt like a severely dumbed-down version of Neverwinter Nights and pretended to have a much more interesting story than it really had. I never even bothered to complete the game because it felt like a chore.
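
The first paragraph's reasoning reduces to a back-of-the-envelope model: a frame can't finish until both the CPU (submitting draw calls) and the GPU (rendering) are done, so cutting CPU overhead only helps when the CPU is the bottleneck. A toy sketch, with all timings invented for illustration:

```python
# Toy frame-time model: the slower of the CPU and GPU stages
# dictates the frame rate. All numbers are made up.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

GPU_MS = 16.0              # fast GPU: ~16 ms of render work per frame
MANTLE_OVERHEAD_CUT = 0.5  # assume the API halves CPU submission cost

for cpu_name, cpu_ms in (("fast i7", 8.0), ("slow A6", 30.0)):
    dx11 = fps(cpu_ms, GPU_MS)
    mantle = fps(cpu_ms * MANTLE_OVERHEAD_CUT, GPU_MS)
    print(f"{cpu_name}: DX11 {dx11:.0f} fps -> Mantle {mantle:.0f} fps")
```

On the GPU-bound rig the frame rate doesn't move at all (and any extra overhead elsewhere can even make Mantle come out slightly behind DX11, as the review found), while on the CPU-bound rig the gain is large.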

Yep, those results look spot on. Actually been wondering about performance myself, as I've been noticing the odd dip to 45 fps or so... Might try lowering from Ultra... (needs another 970).

Quote: "Wait, what? A GTX 970 at Full HD, Ultra quality, only gets 50 fps? [...] Seriously, do I have mental issues, or are games starting to trash single-card setups?"
Mini-rant follows... ;) Your hardware is fine, but what you're seeing is what I call "console-rationalization" on the part of software developers these days... or, better yet, "consolitis"... ;) Some developers and publishers have really come down with a bad case of it. Everywhere we're seeing artificial 30 fps/60 fps limits set in the game software, so that even if you want to see what sort of frame rates your spiffy new card is capable of, you *can't*. At least, you can't without a bit of investigation and some configuration work (and then maybe you can).

Blame it on the consoles: whether it's the lousy Xbox One's GPU, which struggles for 1080p in 2014, or the PS4, which is still slower than my 1 GHz 2 GB HD 7850 and hobbled by a shared system/VRAM memory bus, that's the problem. Years ago, only Carmack at id liked to throw in artificial frame-rate limiters to cover over some poor engine design or other (including but not limited to vsync issues), but with these newer consoles things are really getting out of hand.

What you are seeing is, imo, a deliberate attempt on the part of developers and publishers to blur the lines as much as possible between wide-open, free and unfettered PC gaming software that can make use of the hardware you throw at it, and these bottom-feeding consoles. The console makers (mainly Microsoft and Sony) are *paying money* (just my guess) to get new releases *dumbed down*, or hobbled, or whatever you want to call it, just so the public won't immediately see how much more gaming power they can buy in a computer over either console for just a wee bit more cash up front. Now that both consoles are literally low-end x86 gaming PCs, only closed off and proprietary too, it has become especially important to Microsoft and Sony to keep the general market as ignorant of these things for as long as possible. Scream loud and long enough at your favorite developers and I think they will listen... eventually... ;) Meantime, fire up some games developed from a time when frame rate was king and watch those fps meters spin!

Edit: One of the first things anyone should do with a new 3D card that is purportedly much more powerful than his old one is (a) crank up the resolution, if possible, and (b) lay on the FSAA and eye candy. That's where the power of a new GPU will really shine through. If you happen to be running a CPU-limited game, leave the eye candy where it was on the old card, run at the same resolution, and expect a dramatic frame-rate improvement, you are most likely going to be really disappointed. The new card should let you run smoothly at resolutions, and with layers of eye candy, that your old card *choked* on; that's really how you will see a big difference, imo.
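
For what it's worth, the kind of hard fps cap being complained about here is trivial to implement, which is partly why it shows up in so many console ports. A generic sketch, not taken from any actual engine:

```python
import time

FPS_CAP = 30                  # the sort of hard-coded console cap at issue
FRAME_BUDGET = 1.0 / FPS_CAP  # seconds allotted per frame

def run_game_loop(update, render, frames: int) -> None:
    # A generic capped loop: after doing a frame's work, sleep away
    # whatever is left of the frame budget. No matter how fast the
    # GPU is, the loop never exceeds FPS_CAP.
    for _ in range(frames):
        start = time.perf_counter()
        update()
        render()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)

if __name__ == "__main__":
    # Dummy work: runs ~3 seconds at a steady 30 fps regardless of hardware.
    run_game_loop(update=lambda: None, render=lambda: None, frames=90)
```

The "investigation and configuration work" mentioned above usually amounts to finding where a game exposes the equivalent of FPS_CAP (a config file or console variable) and raising it or skipping the sleep entirely.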

Quote: "[...] Dragon Age: Origins was very disappointing; I'm glad I got it free on Origin. It basically felt like a severely dumbed-down version of Neverwinter Nights [...]"
And it has the best story and atmosphere out of the DA games, so yeah 😀

Quote: "[...] The 2005-2006 era is gone, when a high-end GTX could reach 100 fps in all games maxed out, even the most demanding."
Yeah, they sure did, at a resolution of 800x600. Nope, they weren't that fast when gaming at 1600x1200+; it was around the same as now, really, we are just gaming at way higher resolutions and image quality. While the 7900 GTX and ATI X1950 XTX were good cards, they did not play nearly everything at 60 fps. A 7800 GTX barely managed 60 fps on Doom 3 with AA and AF, and Doom 3 had been out for a year. They were nowhere near 60 fps at 1920x1200 in the demanding games that came out in 2006; in some you barely broke 30. So nope, never heard of such an era 😀 Nostalgia is an interesting thing.

Even the "Super" era of the 8800 GTX/Ultra could not max out the original Crysis, so yeah, I don't remember it either.