Very good review. Obviously the card is very energy efficient. But it's using well over 200 Watts, and for me that's a deal breaker. I am somewhat dumbfounded that it does not draw 150/170 Watts like the 1070. As crazy as it sounds, I wonder how much energy could be saved with a small underclock.
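As a rough back-of-envelope sketch of what a small underclock/undervolt might save: assuming dynamic power scales roughly with frequency × voltage² (a CMOS approximation, only loosely true for a whole GPU board), and using made-up clock/voltage figures rather than measured ones:

```python
# Back-of-envelope estimate: dynamic power ~ f * V^2 (rough CMOS approximation).
# All numbers below are illustrative assumptions, not measured 3070 values.

def scaled_power(base_watts, base_mhz, base_mv, new_mhz, new_mv):
    """Estimate power after a clock/voltage change, assuming P is
    proportional to frequency times voltage squared."""
    return base_watts * (new_mhz / base_mhz) * (new_mv / base_mv) ** 2

# Hypothetical figures: 220 W at 1900 MHz / 1000 mV,
# underclocked to 1800 MHz and undervolted to 950 mV.
est = scaled_power(220, 1900, 1000, 1800, 950)
print(f"Estimated draw: {est:.0f} W")  # roughly 188 W under these assumptions
```

Even a ~5% clock drop paired with a small undervolt lands around 30 W lower in this toy model, which is why undervolting is popular on Ampere.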
I wonder why The Witcher is such a strong title for the 2080 Ti compared to every other title in the lineup. Strange 🙂
Well-performing card - hope to actually see it on sale, and at a price close to MSRP.
Witcher 3 is most likely a driver issue on Ampere; the 3080 should pull clear ahead of the 2080 Ti, but it's quite impaired. Considering it's not a CPU limitation, based on the 3090 also being only slightly faster, I'd say an architecture-specific optimization might be required.
TPU bench is where I expect performance to be, so this looks unique to the Guru3D benchmark machine.
Could it be the installed Windows CU?
Very good review. Obviously the card is very energy efficient. But it's using well over 200 Watts, and for me that's a deal breaker. I am somewhat dumbfounded that it does not draw 150/170 Watts like the 1070. As crazy as it sounds, I wonder how much energy could be saved with a small underclock.
If by "well over 200Watts" you mean eight (8) Watts, then I AGREE
It's using well over 200Watts
[spoiler]
LOL
https://abload.de/img/lkapozgjcv.png [/spoiler]
If ultra textures are used the 3070 is faster, and if nightmare textures are used the 2080 Ti pulls ahead. The framerate doesn't matter; this shows where the limitation on the GPU is. It's a 1440p card, but we already knew that.
The FE card seems pretty good, and the performance is very much like the 2080 Ti, so that makes it OK for 4K, but 8GB is going to be a problem going forward at that resolution. I suppose this is an opportunity for the 6700XT, which is supposed to come with 12GB of VRAM. We'll see how that plays out, but, the 3090 aside, Nvidia has been stingy with VRAM on their Ampere cards.
RTX-wise it's not that much faster than a 2080 Ti, which is interesting... that's especially where I would have expected an "easy" win over Turing. Good card in a good spot, but price and availability will kill or seal the deal for many.
Just link me to that article so I can see for myself.
Just watch the Hardware Unboxed review of the 3070.
edit
There you can also see how bogus Nvidia's claims of improved Gen 2 RT and Tensor cores are, at least with hybrid rendering.
I hope not for AMD's sake.
It's pretty crazy that five years after its release their fastest card can barely break 50 fps at 4K.
If AMD gets ray-tracing, imagine running RT on top of THAT, but with next GEN visuals.
The review release date is the day before AMD's 6000 series announcement...
The card itself looks like a good midrange contender. I still find the amount of VRAM a bit low, and with all the rumours of refreshes coming soon, I'll wait to see what's on the horizon first.
The price, power draw and performance of the 3090 and 3080 didn't impress me, as it was obvious they didn't scale close to linearly with more CUDA cores. It's even more obvious now, as the 3070 looks to be a sweet spot of efficiency. You basically get a 200 W 2080 Ti for $500, which is rather impressive. I have a feeling a 3070 Ti with a few more CUDA cores and more VRAM might be the peak of the architecture from an efficiency standpoint. I will wait to see what AMD launches, but the 3070 is on my short list.
Oh I get it. I just thought it was funny how @Undying said 8GB just "isn't enough" when it still clearly is. 😉
For now, yes it is, for the most part. There are some titles, relatively few, where 8GB isn't enough. Doom Eternal isn't the only game where 8GB becomes a problem at 4K, which is where the Radeon VII I have comes in handy. 😀
Things are about to change, though; there certainly won't be relief for those of us who play at 4K with 8GB. 1440p high refresh is more comfortable overall; 4K, for all its shine, is more of a headache.
TPU bench is where I expect performance to be, so this looks unique to the Guru3D benchmark machine.
Could it be the installed Windows CU?
One benchmark being slightly off is hardly an indicator of anything system-wide. It could very well be the scene choice, because Guru3D's results traditionally fall nicely in line with other reviews.
Take 3090 aggregate performance for example.
One benchmark being slightly off is hardly an indicator of anything system-wide. It could very well be the scene choice, because Guru3D's results traditionally fall nicely in line with other reviews.
Take 3090 aggregate performance for example.
In no scene would a 2080 Ti and a 3080 bench the same.
Been waiting on this. Great review as always. Hopefully I can get one of these when they go on sale; I've saved long enough and been on this GTX 970 long enough, lol.