Review: NVIDIA GeForce RTX 3070 Founders Edition

Undying:

The 2080 Ti does perform better than the 3070 where you need more than 8GB of VRAM, like that example in Doom Eternal. If you reduce the texture quality, the 3070 pulls ahead.
Does it? https://www.purepc.pl/test-kart-graficznych-geforce-rtx-3070-vs-geforce-rtx-2080-ti?page=0,7 https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/17.html Buying a used 2080 Ti was one of my options, but after analyzing a lot of 3070 reviews I'd prefer the 3070 even though it's 8GB. The only thing I found against it is PCGH running out of VRAM with RT enabled and the image streaming option manually set beyond ultra. I wouldn't trade that for a 50W reduction in power draw and better noise and thermals. Well, to each his own. At $500 and 2080 Ti performance, 8GB is a compromise I'm willing to take. This card is goddamn sleek too; I may sell my 3080 once the 3070 FE is in stock. https://cdn.staticneo.com/a/nvidia-geforce-rtx-3070-fe/13.jpg
wavetrex:

"Godfall needs 12GB memory for 4K gameplay with UltraHD textures" https://videocardz.com/newz/godfall-requires-12gb-of-vram-for-ultrahd-textures-at-4k-resolution Oh boy, this card will not age well...
An AMD-partnered game optimized to let AMD milk their extra VRAM advantage for marketing purposes, like what Nvidia did with the extra tessellation to cripple AMD performance. Both approaches are disingenuous. Pretty sure Godfall could easily have been designed for 10GB of VRAM or less with no impact on visuals or performance, but AMD just wanted to fill their extra VRAM to gain an advantage over Nvidia. Some of the best-looking games ever (e.g., Metro Exodus, AC Odyssey) easily do with less than 8GB at 4K. This is just one side trying to screw the other by unnecessarily leveraging their hardware's "moar is better" components.
Astyanax:

ultra hd is not for playing.
Where does it say that?
theoneofgod:

Where does it say that?
common sense.
Astyanax:

ultra hd is not for playing.
Astyanax:

common sense.
No, it isn't. What isn't common sense is getting a top-end GPU in 2021 that can't even match console settings due to VRAM restrictions. Also, the 3090 runs games very nicely at 4K, and some of us even have 4K120 displays. Imagine that. Why would you make excuses for Nvidia?
PrMinisterGR:

No, it isn't. What isn't common sense is getting a top-end GPU in 2021 that can't even match console settings due to VRAM restrictions. Also, the 3090 runs games very nicely at 4K, and some of us even have 4K120 displays. Imagine that. Why would you make excuses for Nvidia?
All you need to remember is that H.266 and GDDR6X aren't a thing.
theoneofgod:

All you need to remember is that H.266 and GDDR6X aren't a thing.
I don't understand what you said/joked about :P 😀
PrMinisterGR:

And here I am, yet again whining about VRAM.
I own a 3070 and I've found that in several modern titles I have to scale back VRAM-heavy settings like texture quality and shadows to get them to run smoothly. The RTSS overlay will make it look like the games aren't using up all the available VRAM, but what actually seems to be happening is that the driver manages the allocation (so the VRAM never shows as maxed out) and assets spill over into system memory. That, or some games need a certain amount of VRAM headroom to stream in new, large assets; I'm not sure, just speculating. But I have noticed a discernible reduction in stuttering at 1440p in titles like Hunt Showdown when I drop textures and shadows from High to Medium. VRAM usage goes down, and if I set the game's VRAM target to 90% with those settings on High, the game stutters horribly (which doesn't happen with those values pared back).

The 3000 series cards were mostly great imo, but the 3070 absolutely should've had 10GB, and the 3080 should've launched with 12GB (it seems like they eventually rectified that on the 3080 side). It's just crazy to me that this sort of issue crops up even at 1440p in some modern games, and with RT on especially, VRAM usage can be very high in some titles. A lot of reviewers seem to miss this in their testing because overlays no longer report the VRAM as maxed out once the driver takes over managing the allocation (but really, yes, the VRAM budget is all used up and there are side effects). If my 3070 had 10GB of VRAM I think it would be sufficient; 8GB just isn't enough in my experience, and it kinda sucks having to pare back VRAM-heavy settings in modern titles considering how recent the card is and how much it cost.
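If anyone wants to sanity-check the budget-vs-usage distinction I'm talking about, the OS exposes both numbers per process through DXGI. Rough sketch below (my own, first adapter only, local VRAM segment; nothing from the review): when CurrentUsage creeps up toward Budget, the driver starts demoting resources to system RAM, which is roughly when the stutter kicks in even though an overlay still looks like it has headroom.

```cpp
// Minimal sketch: query the per-process local VRAM budget vs. current usage via DXGI.
// Assumes Windows 10+ and the first enumerated adapter; link against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first GPU only

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;                 // need IDXGIAdapter3 for memory info

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget: how much VRAM the OS thinks this process can use right now.
    // CurrentUsage: how much it has actually committed.
    std::printf("Budget:       %llu MiB\n",
                (unsigned long long)(info.Budget / (1024ull * 1024ull)));
    std::printf("CurrentUsage: %llu MiB\n",
                (unsigned long long)(info.CurrentUsage / (1024ull * 1024ull)));
    return 0;
}
```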
BlindBison:

I own a 3070 and I've found that in several modern titles I have to scale back VRAM-heavy settings like texture quality and shadows to get them to run smoothly. [...]
The 10GB is basically the amount of "fast" memory on the Series X. Despite Microsoft insisting it is "transparent", in practice games seem to keep GPU assets within that amount and use the other ~4GB as CPU memory.
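Rough back-of-the-envelope on that split, going off Microsoft's public architecture talks rather than anything in this review (figures approximate):

```cpp
// Series X memory split as Microsoft has publicly described it; my own summary,
// approximate figures only, not taken from this thread or the review.
constexpr double total_gb       = 16.0; // GDDR6 on a 320-bit bus
constexpr double gpu_optimal_gb = 10.0; // ~560 GB/s "GPU optimal" pool
constexpr double standard_gb    =  6.0; // ~336 GB/s "standard" pool
constexpr double os_reserved_gb =  2.5; // carved out of the standard pool for the OS
constexpr double game_total_gb  = total_gb - os_reserved_gb;    // 13.5 GB available to games
constexpr double game_cpu_gb    = standard_gb - os_reserved_gb; // ~3.5 GB of slower memory for CPU-side data
static_assert(gpu_optimal_gb + standard_gb == total_gb, "pools should sum to 16 GB");
```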