GeForce RTX 2070 in November with 2304 shader cores

Fox2232:

Today I learned that one of my friends wants an RTX 2080 for the new raytracing. I was like, OK, that should work. But then I learned that he wants to move from his 1080p 144Hz screen to a 1440p 144Hz screen. That no longer works very well with raytracing. It seems people believe that raytracing features are not as demanding as they actually are.
Yeah, I see what you mean. I don't want to "blame" anybody here (especially not your friend), but as with everything you buy, people need to educate themselves... one can only blame marketing for so much; there have to be people blindly believing it too for marketing to work miracles. That's the personal responsibility of every buyer in a capitalist system: be sure about what you do with your money. But yes, I'm not sure he knows how hard that hit can be... even though DICE, for instance, already said they're toning down ray tracing / reflections for BFV once the game is released (post launch) to improve performance... not much fun buying a new expensive card and a new monitor just to be crippled by bad ray tracing performance... even if we'd know why, it's still a saddening experience.

I personally am waiting for reviews first and foremost, and honestly, if the new cards (for me personally the 2080 Ti) perform well even without ray tracing, it might still be interesting to get them. It all depends on the price they ask and what DX11 performance looks like. If your friend "only" wants faster DX11 performance, it's just a matter of checking simple benchmarks between the 1080 Ti and the 2080. I'm fairly sure you were wise enough to tell him to wait another week for benchmarks 😀
From that block diagram, TU106 looks like a reject TU102 chopped in half. I'm confident non-RT performance should be good enough, but I predict hardware RT is going to be a novelty with a small market, since it will be lacking the Tensor cores to do the job correctly. It already takes a 2080 Ti to barely get 60 FPS @ 1080p with RT enabled. Definitely gonna wait it out for AMD to offer their solution before considering any upgrades.
Nvidia stated more features are coming via NGX to Nvidia Experience that will only run on the RTX cards. So buying a 2070 isn't just about RT in games; the AI video/image upscaling is coming as well, among other things. I don't know if all those features are worth the cost increase, but I feel like people are forgetting about them.
Valken:

From that block diagram, TU106 looks like a reject TU102 chopped in half. I'm confident non-RT performance should be good enough, but I predict hardware RT is going to be a novelty with a small market, since it will be lacking the Tensor cores to do the job correctly. It already takes a 2080 Ti to barely get 60 FPS @ 1080p with RT enabled. Definitely gonna wait it out for AMD to offer their solution before considering any upgrades.
Is it lacking the Tensor cores? The DICE implementation doesn't even use Tensor cores. Also, why does everyone keep quoting the 1080p@60 RT figure from DICE but ignore that they said they found a direct 30% performance increase after that benchmark, said they could potentially find more performance by running the RT operations concurrently (which they weren't doing), and have supposedly lowered the ray count since then? Not to mention that, as I said, they aren't using Tensor for denoising, and the RT resolution scales independently of the raster resolution, so predictions for resolutions beyond 1080p are out the window.
Sylencer:

Not even sure if that could be considered an upgrade from my 1070...
I'm thinking this is a "skip it" generation of cards for people who already have a 10 series.
Fox2232:

Right, nobody needs it today. But you are paying a premium for it in the 2070. And by the time 2070 owners have games which use RTX, the 2070 will likely not be enough.
It does depend on your quality settings. By that I mean we know that an RTX 2080 Ti with raytracing is supposed to do around 60 fps (maybe better with the final game release and better drivers, we'll see), but at what quality is everything else? How will lowering the other graphical settings on a 2070, while keeping raytracing enabled, work out? Will there be a point where the 2070 can do raytracing at 60 fps? We'll have to see in the end, of course, but I'm certain there will be people willing to turn down other settings in a game to get what ray tracing offers.
Denial:

Also, why does everyone keep quoting the 1080p@60 RT figure from DICE but ignore that they said they found a direct 30% performance increase after that benchmark, said they could potentially find more performance by running the RT operations concurrently (which they weren't doing), and have supposedly lowered the ray count since then? Not to mention that, as I said, they aren't using Tensor for denoising, and the RT resolution scales independently of the raster resolution, so predictions for resolutions beyond 1080p are out the window.
This needs more likes, and more people need to read it. Plus, during that earlier, less optimized benchmark they also had a test at 1440p, and though it wasn't 60 fps, it was 45-60 if I recall correctly, which is playable (though many have no desire to play at that framerate, and I understand that). So everything they are doing to increase performance and optimize will also make it more likely that 1440p reaches 60 fps+.
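To put rough numbers on that, here is a minimal sketch, assuming the 45-60 fps range recalled above and that the ~30% uplift DICE mentioned carries over linearly to 1440p; these are back-of-envelope figures, not anything DICE has published:

```python
# Back-of-envelope check: apply the ~30% uplift mentioned in the thread to the
# recalled 45-60 fps 1440p range, naively assuming the gain scales linearly.
low_fps, high_fps = 45, 60   # 1440p RT range recalled in the post above
uplift = 1.30                # ~30% improvement DICE said they found later

print(f"rough 1440p estimate: {low_fps * uplift:.1f}-{high_fps * uplift:.1f} fps")
# -> rough 1440p estimate: 58.5-78.0 fps
```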
Actually, you are correct. I meant RT cores, not Tensor. I stand corrected. In the video, DICE did say they found more performance, but they also said they will cut back and tone down the effects; they said they tried to max everything out for demo purposes. That was with a 2080 Ti. So 60 FPS + 30% = 78 FPS. If TU106 really is nearly half the spec of TU102, it would push ~39 FPS at the same clocks at 1080p with RT on. A lot of game engine FPS scales linearly across Nvidia architectures; AMD is quite different in that 4K differs less from 1440p. We still need more market saturation of hardware RT to make it viable for developers to use. RT is the new VR, as was 4K, as was AA... some things everyone can use, some things only a few use, or it dies...
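The same back-of-envelope math written out, under the post's own assumptions (RT performance scaling linearly with the amount of RT hardware, equal clocks, and the ~60 fps 2080 Ti demo figure being representative); a rough sketch rather than a real prediction:

```python
# The estimate above, spelled out. Naive assumptions: RT performance scales
# linearly with RT hardware, clocks are equal, and the ~60 fps RTX 2080 Ti
# demo figure at 1080p is representative of final performance.
demo_fps_2080ti = 60      # BFV RT demo figure quoted in the thread
uplift = 1.30             # ~30% gain DICE reported finding afterwards
tu106_vs_tu102 = 0.5      # TU106 is roughly half of a TU102

fps_2080ti = demo_fps_2080ti * uplift      # 60 * 1.3 = 78 fps
fps_tu106 = fps_2080ti * tu106_vs_tu102    # 78 * 0.5 = 39 fps
print(f"2080 Ti estimate: {fps_2080ti:.0f} fps, TU106 estimate: {fps_tu106:.0f} fps")
# -> 2080 Ti estimate: 78 fps, TU106 estimate: 39 fps
```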