Review: NVIDIA GeForce RTX 3070 Founders Edition

So... is it safe to upgrade now??? Sure, yet another review for a card that could be in stores ...or not....from a company that could be in trouble from competition...or not on this day when outside my window the sky is cloudy and could rain...or not... That's how much I started to care about these new Nvidia card releases...or not.
Still waiting for: Just get a beat-up 2080 Ti with 73.5 hours of warranty left from one of those nice miner fellow gamers instead. Then pat yourself on the back if you can find a difference in DOOM / W:YB between Ultra-Nightmare textures and Ultra-Insane-Mein-Leben-Nightmare textures.
Astyanax:

in no scene would a 2080ti and 3080 bench the same.
1080p looking at the wall? I'll be honest: it's one freaking game, at 1080p settings. Hilbert could be benching it on a toaster for all I care 🙂
Well, I feel a bit better about having a preorder in for a TUF 3080 now that I've seen this review and the prices here in the UK for the 3070. The cheapest 3070 delivered is about £540, and I'm going to get my 3080 for £700, which is almost exactly 30% more expensive, but you get 30% more performance and another 2GB of VRAM... yeah, OK, I got a pretty good deal then choosing a 3080 here in the UK!
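For what it's worth, the percentages quoted above do check out. A quick sanity check (a sketch in Python, using the UK street prices as quoted; the exact figures will obviously vary by retailer):

```python
# UK prices as quoted above, in GBP (assumed round figures)
price_3070 = 540.0  # cheapest 3070 delivered
price_3080 = 700.0  # TUF 3080 preorder

# how much more expensive the 3080 is, as a percentage of the 3070 price
premium_pct = (price_3080 - price_3070) / price_3070 * 100
print(f"3080 premium over 3070: {premium_pct:.1f}%")  # about 29.6%
```

So "exactly 30%" is really 29.6%, close enough that the price/performance trade genuinely is near-linear between the two cards at these prices.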
AlmondMan:

Just odd if the Witcher 3 is the only title that is memory bandwidth sensitive. Let's not make those 8 vs 10 vs 20GB conspiracies the norm. How does that explain the lack of such massive gaps between other cards with lower amounts of VRAM? Further, Gamers Nexus made this https://www.Disney.net/game-bench/1949-witcher-3-texture-quality-comparison-vram-and-fps where you see no impact on performance based on texture quality. It would make some amount of sense that there would be a difference if it were performance-impactful, certainly speed-wise.
Not that weird, because despite what a lot of people think, games aren't that VRAM intensive. Remember: just because your VRAM may be shown to be 100% filled, that doesn't mean it's actually full. A lot of it is just a buffer, which helps minimize load times and stuttering. Although Hilbert used an i9, a system with PCIe 4.0 would help dramatically alleviate any game that demands more VRAM. I still think Nvidia should've given the 3070 10GB and the 3080 12GB, but 8GB is "sufficient", only barely, and probably not for long.
H83:

Performance-wise, this card is good, basically the same as the 2080 Ti, but the price is too high. A 70 series card should be around 350€, 400€ max; more than 500€ for this is simply too much. Nvidia is pricing some of these GPUs too high. I just hope AMD can deliver with big Navi at reasonable prices.
Considering what they did with the 2000 series, I don't think we should be complaining. The 3070 MSRP is still less than half of a 2080 Ti while overall being faster. People in these forums are already salty about how much money they spent on something that was so easily obsoleted by something so much cheaper. Think of it like Intel back in 2017-2018, where you were getting basically half the offering of AMD for a higher price. If Intel tried to compete against that directly, they would just really piss off their customers who would be like "wait so why was the thing I bought last year so expensive when you could have easily dropped the price so much and still turn a profit?". Companies can't make it too obvious how much they're gouging you for your money.
AlmondMan:

Just odd if the Witcher 3 is the only title that is memory bandwidth sensitive.
Every tested DX12 title is faster. Every DX11 title is a bit slower, probably down to the scheduler. Doesn't matter; they are all old games. But to everyone saying 8 GB isn't enough for 4K games: it's more than enough. 3DNews tested the RTX 3090, RTX 3080 and RTX Titan in 8K. 10 GB is even enough in 8K in many games, though not in every game.
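A rough back-of-the-envelope calculation shows why resolution alone doesn't dominate VRAM use (a sketch only; real engines keep multiple render targets, depth buffers, and above all textures on top of this):

```python
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Size of a single RGBA8 color buffer in MiB (4 bytes per pixel)."""
    return width * height * bytes_per_pixel / 2**20

# A single 4K color buffer is only ~32 MiB; even 8K is ~127 MiB,
# a small slice of an 8-10 GB card. Textures and caching fill the rest.
print(f"4K: {framebuffer_mib(3840, 2160):.1f} MiB")  # ~31.6 MiB
print(f"8K: {framebuffer_mib(7680, 4320):.1f} MiB")  # ~126.6 MiB
```

Even triple-buffered with depth and a few intermediate targets, the resolution-dependent part stays well under a gigabyte, which is consistent with 10 GB surviving 8K in many titles.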
AlmondMan:

Just odd if the Witcher 3 is the only title that is memory bandwidth sensitive. Let's not make those 8 vs 10 vs 20GB conspiracies the norm. How does that explain the lack of such massive gaps between other cards with lower amounts of VRAM? Further, Gamers Nexus made this https://www.Disney.net/game-bench/1949-witcher-3-texture-quality-comparison-vram-and-fps where you see no impact on performance based on texture quality. It would make some amount of sense that there would be a difference if it were performance-impactful, certainly speed-wise.
I am not making any conspiracies here. I am just saying that VRAM also plays a part on top of speed. There are a lot of variables that could cause such a performance gap.
Nictron:

Witcher 3 must be a VRAM thing. Look how close the 1080 Ti with its 11GB is. That is very strange!
Yes, The Witcher 3 is very sensitive to VRAM. If you want to see whether your GPU has a stable memory OC, launch this baby and look for any texture issues. 🙂
Memory bandwidth: 2080 Ti = 616 GB/s, 3070 = 448 GB/s. That will impact FPS more than the VRAM buffer. So when we see titles where the 3070 slips behind the 2080 Ti at 4K, keep that in mind. An insufficient VRAM buffer will more likely result in very choppy gameplay and poor frame times. If FPS is lower but gameplay is smooth, it is not an insufficient VRAM buffer issue.
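Those bandwidth figures follow directly from the memory specs: both cards use 14 Gbps GDDR6, but the 2080 Ti has a 352-bit bus versus the 3070's 256-bit bus. A quick check:

```python
def mem_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth: per-pin data rate times bus width, 8 bits per byte."""
    return data_rate_gbps * bus_width_bits / 8

print(mem_bandwidth_gb_s(14, 352))  # RTX 2080 Ti: 616.0 GB/s
print(mem_bandwidth_gb_s(14, 256))  # RTX 3070:    448.0 GB/s
```

So the 2080 Ti holds a 37.5% raw bandwidth advantage at identical memory speed, purely from the wider bus, which is why bandwidth-bound 4K scenes can favor it despite the equal 14 Gbps chips.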
Nvidia left quite a considerable gap between the 3070 and 3080. Room for a 3070 Ti, maybe?
Just a note for those of us who forgot: Witcher 3 is a 2015 game, which a GTX 970 with its 3.5GB of VRAM can max out, and the VRAM will not even be full. So please erase that "it might be running out of VRAM" argument from your heads, really. It's something else: either some driver issue, bandwidth, or aliens, but not VRAM. VRAM is an issue in DOOM Eternal, though, so you can see that 8GB is something that will sooner rather than later stop you from maxing out your games.
schmidtbag:

Considering what they did with the 2000 series, I don't think we should be complaining. The 3070 MSRP is still less than half of a 2080 Ti while overall being faster. People in these forums are already salty about how much money they spent on something that was so easily obsoleted by something so much cheaper. Think of it like Intel back in 2017-2018, where you were getting basically half the offering of AMD for a higher price. If Intel tried to compete against that directly, they would just really piss off their customers who would be like "wait so why was the thing I bought last year so expensive when you could have easily dropped the price so much and still turn a profit?". Companies can't make it too obvious how much they're gouging you for your money.
For me, the direct comparison with the 2080 Ti makes no sense because its price was just too absurd; so absurd that everything seems like great value against it, even a flying turd. The 2080 Ti was a 700/800€ card sold for 1200€ because of the lack of competition... I compare the 3070 against the old 70 series cards, and the steady rise of prices over the last generations is clear. For example, I bought a Palit a few years ago for 270€, then a 970 for 330€, and then a 1070 for 430€ on a sale!!! And now the 3070 is going to cost more than 500€! This is supposed to be a mid-range card, and it's now priced almost like a high-end one! For me this is just too much.
still waiting for tomorrow
People seem surprised, but the xx70 series has always stacked up well against the previous xx80 Ti cards. This seems on par.
H83:

For me, the direct comparison with the 2080 Ti makes no sense because its price was just too absurd; so absurd that everything seems like great value against it, even a flying turd. The 2080 Ti was a 700/800€ card sold for 1200€ because of the lack of competition... I compare the 3070 against the old 70 series cards, and the steady rise of prices over the last generations is clear. For example, I bought a Palit a few years ago for 270€, then a 970 for 330€, and then a 1070 for 430€ on a sale!!! And now the 3070 is going to cost more than 500€! This is supposed to be a mid-range card, and it's now priced almost like a high-end one! For me this is just too much.
The whole 2000 series was outrageously overpriced. The RT/tensor cores have something to do with that, but Nvidia overestimated how much people give a crap about shiny puddles. You said you bought the 1070 for 430€ on sale; that's not a big price difference compared to the 3070 MSRP. The MSRP is supposed to be $500. However, you're not wrong that street prices will probably be higher than that, and that I do feel is worth griping about. Anyway, aside from the RT cores, prices have gone up because demand has too. Just about everyone who wants a 1080p-ready gaming PC has one already, which is why 1080p hardware is pretty cheap nowadays. A lot of us have been waiting for something good for VR or 4K, and let's face it, $500 for 4K at max detail is the best price we've seen. We just have to hope AMD releases something competitive enough to control these prices. I suspect a lot of people intend to get a new GPU this year, and the market might die down a bit next year.
Honestly, this 3070 launch is like deja vu from two years ago. It's very much the 2070 vs 1080 Ti situation: slower, with 3GB less VRAM, offering only a cinematic ray-traced experience, if you're into that.
One needs to remember that Pascal was marked up as well, and to some extent Maxwell before it too. The last generation from Nvidia that kept prices somewhat constant was the 600 series (Kepler), as it had strong competition from the AMD/ATI 7000 series (and that amazing 7970 GHz Edition, which was cheaper, had 50% more memory, and was also more powerful than the GeForce 680). Then it all went to hell with the introduction of the first "Titan" in 2013, at a price that was considered absolutely RIDICULOUS by everyone back then: a GPU costing $999, while the previous generation's top single-GPU card was only $500. Yet enough people bought it, then even more people bought the 780 Ti after it (as it was faster indeed), and that triggered a cascade effect that kept creeping prices up across the entire range until the present day. The fact that AMD was stuck with Rebrandeon for several "generations" didn't help at all... Turing was the peak of the price ridiculousness, and we're finally seeing some scaling back, but there is still a long, LONG way to go until these tiny little square rocks that make pretty pictures come back to a "reasonable" price. In some parts of the world people buy fully functional cars for $1000 or less (yes, old, but still functional): huge metal things that take you from place A to place B, not a toy for playing video games...
Now we're talking! This card is extremely capable for a 1440p @75Hz monitor. Almost like the 1660 Super for 1080p @75Hz.
It is clear this is where Nvidia was saving their magic, as I've been only moderately impressed until now. This price point is the money maker that enables the entire line-up. My only real question is "inside baseball": how does the die size of this GPU scale against full Ampere? That's relevant because of yield vs. demand.
k0vasz:

It could have been a really good offer, but now that AMD will probably offer a card with the same performance but more RAM, and with an upgraded 3070(S) version on TSMC 7nm already on the horizon, it doesn't look too tempting... Also, the same amount of memory as on the four-year-old 1070 feels cheap. It might be enough today, but it's definitely not helping make this card futureproof. I'm really curious about tomorrow's announcements!
That's why, up to 1440p, the 3060 Ti is IMO recommended even more than the 3070. Because in case the 3060 Ti retains the same perf as the 3060 Super, as happened with the 16 series, it's still one of the best offers in value/money/perf.
I wouldn't click that link. It's likely to be infected.