Editorial: GeForce RTX 2080 and 2080 Ti - An Overview Thus Far

This is a great summary of everything we know so far. Should also point out that Nvidia hinted at the tensor cores being used for more than just raytrace acceleration - at Siggraph they mentioned a few new AA methods, ATAA and DLAA, and they also mentioned AI upscaling. No idea if any of that is coming to consumer cards - but I figure if they have the hardware they may as well do value-add features with it. Especially because new rumors are pointing to the GTX 2060 being as fast as a 1080, I feel like they are going to need some nifty features for the RTX series to drive sales.
great job keeping this concise. afaik, tensor cores will be disabled on the consumer line - at least, that's what Nvidia hinted at during the 2nd quarter 2017 earnings call, mainly so AI and deep learning shops have to buy Quadros and Titans. as it is, AI, deep learning, and big data have been cheating with 1080 Tis on a massive scale (i.e. Google has more of them than the nations of Denmark, Belgium, and the Netherlands combined).
tunejunky:

great job keeping this concise. afaik, tensor cores will be disabled on the consumer line - at least, that's what Nvidia hinted at during the 2nd quarter 2017 earnings call, mainly so AI and deep learning shops have to buy Quadros and Titans. as it is, AI, deep learning, and big data have been cheating with 1080 Tis on a massive scale (i.e. Google has more of them than the nations of Denmark, Belgium, and the Netherlands combined).
The RTX acceleration is based on the tensor cores. They will probably be crippled for training ops but definitely not for inferencing. Also, I don't know why you think they are "cheating" with 1080 Tis - the 1080 Ti doesn't support TCC mode.
Well done!! 🙂 I wonder if we will actually see prices close to those.
mai
Denial:

The RTX acceleration is based on the tensor cores. They will probably be crippled for training ops but definitely not for inferencing. Also, I don't know why you think they are "cheating" with 1080 Tis - the 1080 Ti doesn't support TCC mode.
only because i've seen them in use at server farms
tunejunky:

mai only because i've seen them in use at server farms
Idk - it doesn't really make much sense to use 1080 Tis. TCC is completely disabled on them, and even if you could find a way to enable it, the SLI connector isn't going to give you any kind of scalability. Now if you said Titans, that would be a different story, because those have TCC enabled - but even there, on the newer Titans, the cluster bandwidth is gimped (NVLink disabled) on purpose, so you're not going to get any kind of good scaling out of them. That's not to mention that GP100/GV100 have other features more geared towards compute workloads: http://images.anandtech.com/doci/10325/PascalVenn.png?_ga=1.83280852.372099904.1468967622 GV adds independent scheduling and more. I definitely think Nvidia is going to gimp some features to keep the data center guys on their beefier, more expensive hardware - but AFAIK the tensor cores power RTX, the denoising runs on them. So the tensors should 100% be there and at least somewhat functional for that.
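If anyone wants to check the TCC/WDDM point on their own machine, here's a minimal sketch - it just shells out to nvidia-smi, so it assumes Python plus the driver tools on the PATH, and the driver_model field is only reported on Windows (Linux shows "N/A"):

```python
# Query which driver model each GPU is running in (TCC vs WDDM).
# Assumes nvidia-smi is installed and on the PATH; driver_model is Windows-only.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_model.current", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)  # GeForce boards like the 1080 Ti are expected to report WDDM, not TCC
```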
Nice preview. Although I'm a little surprised at Nvidia releasing a new card with tensor and ray tracing cores at the same time... seems expensive... Anyway, now all I have to do is wait for the next Christmas giveaway...
Interesting - seems like the 2000 series is definitely more than just a refresh. The 2080Ti is more impressive than I was expecting.
Thanks for the article, an informative overview of what we could expect. I can see myself skipping this new architecture & going with the next one. With the inclusion of the new Ray Tracing Cores as well as the Tensor Cores, I'm thinking these may need time to mature, so the release after this one looks like a better bet than jumping on the first release - and we also don't know yet how important or unimportant these new cores will be, so it's more of a risk to buy in while games haven't been developed for them. I'm likely to extract a bit more value from my GTX 1070, and will only upgrade if I can't maintain 1080p/144Hz at decent settings in upcoming online shooters (like BF V). I'm excited to read the reviews for these cards though, to learn about them & see the performance breakdowns!
Robbo9999:

Thanks for the article, an informative overview of what we could expect. I can see myself skipping this new architecture & going with the next one. With the inclusion of the new Ray Tracing Cores as well as the Tensor Cores, I'm thinking these may need time to mature, so the release after this one looks like a better bet than jumping on the first release - and we also don't know yet how important or unimportant these new cores will be, so it's more of a risk to buy in while games haven't been developed for them. I'm likely to extract a bit more value from my GTX 1070, and will only upgrade if I can't maintain 1080p/144Hz at decent settings in upcoming online shooters (like BF V). I'm excited to read the reviews for these cards though, to learn about them & see the performance breakdowns!
I agree somewhat - it'll be a few generations until we see what these new features bring us and how well they're implemented.

The problem I see with the ray-tracing processing being leveraged (I would argue it's simplified ray tracing: a low pass with AI processing on top) is that it's exclusive to nVidia through their Gameworks platform, and if the rumours are true there will be further segmentation within nVidia's own hardware as well (GTX vs RTX), so only the very top-end cards will have this capability. As we've seen with anything that has Gameworks integration, only a few games get the enhanced graphics, and fairly often - particularly when games are ported from consoles (which really shouldn't be difficult anymore, given the current-gen consoles are x86) - the results are crippling for hardware from both of the main manufacturers, with only high-end NV cards just about able to show them.

Let's also be honest here: the majority of games available today are made for the consoles; the only difference is that PC gaming allows for various configurations of quality (usually for the better, of course). So what will be the point of this new technology if it's going to be inaccessible to the majority? Perhaps I'm being cynical, but I can't help fearing we're going to see more of the same - PhysX and Gameworks being locked away, and once again the ray tracing package being part of Gameworks, stifling the innovation that nVidia are actually providing. It makes me think of the quote "the left hand doesn't know what the right hand is doing" - in this case the left hand of nVidia is the innovation, making great hardware and technology, while the right hand is the greed and monopoly, keeping it locked away and not giving the innovation what it deserves.

So what do I mean by all this? While I can appreciate that people will be expected to pay high prices for the new features - that's understandable - the problem is that we still won't see the innovations for many, many years. Not because of the prices themselves, but because of the exclusivity of the features, the resulting platform availability and ecosystems, and the rather small market share in the grand scheme of computer gaming. Unless nVidia have a hand during game development, we're just not going to see the ray tracing used unless the next-gen consoles also get this hardware.

Tl;dr: I just really, really hope nVidia's innovation in ray tracing won't follow the same history as PhysX.
a lot of "humans" on the internet still cannot convince me that they are humans.
A 2080 Ti for $800 bucks sounds like a damn good deal! I still couldn't tell you what that ray tracing is even if it slapped me in the face, haha. The graphics looked really nice in the video of course, but again, in plain English, what am I looking for? lol
gx-x:

a lot of "humans" on the internet still cannot convince me that they are humans.
The RTX 2070 will easily be able to simulate AI that is more nuanced than 3/4 of internet humans. PS: if coins that utilize RT cores take off, we're fked. Good thing is they won't appear right away.
nz3777:

A 2080 Ti for $800 bucks sounds like a damn good deal! I still couldn't tell you what that ray tracing is even if it slapped me in the face, haha. The graphics looked really nice in the video of course, but again, in plain English, what am I looking for? lol
Ray tracing is a simulation of physically correct light paths, with the resulting image being the solution of that simulation. That's unlike traditional rendering, which either starts from a known image and just draws it, or calculates certain parts of the scene, but nowhere near as strictly or as physically correctly as ray tracing. Think of RT as a theoretical physicist's answer to the question: what does this scene look like?
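To make that concrete, here's a toy sketch of the idea in Python (nothing to do with Nvidia's actual RTX pipeline): fire one ray per pixel from a pinhole camera, test it against a single sphere, and mark the pixels that hit. Real renderers add bounces, shadow rays and proper shading, but the core loop is this:

```python
import math

def ray_sphere_t(origin, direction, center, radius):
    """Distance along a unit-length ray to the first sphere hit (camera outside), or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c          # direction is unit length, so a == 1
    if disc < 0:
        return None               # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

WIDTH, HEIGHT = 40, 20
sphere_center, sphere_radius = (0.0, 0.0, -3.0), 1.0

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to a direction through a simple pinhole camera at the origin.
        dx = (x / WIDTH - 0.5) * 2
        dy = (0.5 - y / HEIGHT) * 2
        length = math.sqrt(dx * dx + dy * dy + 1)
        direction = (dx / length, dy / length, -1 / length)
        t = ray_sphere_t((0, 0, 0), direction, sphere_center, sphere_radius)
        row += "#" if t is not None else "."
    print(row)
```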
If this stuff is true then this will be the first time an XX80 card doesn't beat the previous XX80 Ti card - in fact, by TFLOPS it will actually be slightly weaker - and the new XX80 Ti has less than 50% more TFLOPS, compared to the over-double jump we saw from the 980 Ti to the 1080 Ti. Interesting to see if the tensor cores or the architecture are used to make up for this. So far it seems more like a 1080 Ti with extra cores and a few added features.
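For anyone who wants to sanity-check the TFLOPS point, the usual back-of-the-envelope formula is 2 FLOPs (one FMA) per CUDA core per clock. Plugging in the commonly quoted core counts and reference boost clocks (actual boards, especially Founders Editions, will clock differently) gives roughly this:

```python
# Rough FP32 throughput: TFLOPS ~= 2 FLOPs/core/clock * cores * boost clock.
# Core counts and reference boost clocks are the commonly listed figures and
# should be treated as ballpark numbers only.

def fp32_tflops(cuda_cores, boost_mhz):
    """Peak FP32 TFLOPS assuming one FMA (2 FLOPs) per core per clock."""
    return 2 * cuda_cores * boost_mhz * 1e6 / 1e12

cards = {
    "GTX 1080 Ti": (3584, 1582),
    "RTX 2080":    (2944, 1710),
    "RTX 2080 Ti": (4352, 1545),
}

for name, (cores, mhz) in cards.items():
    print(f"{name}: ~{fp32_tflops(cores, mhz):.1f} TFLOPS")
```

That lands around 11.3, 10.1 and 13.4 TFLOPS respectively, which is where the "the 2080 is slightly weaker on paper" observation comes from.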
@nz3777 Ray tracing, in layman's terms, is like using a laser pointer to draw something on a wall while holding a camera shutter open until you're done drawing (i.e., moving the laser ray all over the place). Any objects in the path of the light will create shadows, any reflective ones will bounce the ray in another direction, and any translucent ones will make it refract through them and also change direction. But instead of a laser, this is done via mathematics, for EVERY PIXEL on the screen, and real-time ray tracing means it has to be done 30-60 or more times every second. This is extremely difficult to achieve, because the mathematics involved are very complex.
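To put the "every pixel, 30-60+ times a second" part into numbers, here's a quick count of primary rays alone - before any bounces, shadow rays, or extra samples per pixel, each of which multiplies the total:

```python
# One primary ray per pixel per frame, nothing else counted.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
fps = 60

for name, (w, h) in resolutions.items():
    primary_rays = w * h * fps
    print(f"{name} @ {fps} fps: {primary_rays / 1e6:.0f} million primary rays/s")
```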
This is making me very happy I pulled the trigger on a 1080 Ti after deciding the 2080 looked underwhelming from the initial leaks - that's looking more and more like accurate info. Minus a few new performance-killing effects like Nvidia's wish-it-were-real ray tracing, there doesn't seem to be much difference. I am still curious about benchmarks though; optimizations and drivers can help the 2080 improve. I still feel the 1080 Ti will be enough for me until the next refresh, hopefully. I said it before: with PC games being console ports for the most part, and consoles not being upgraded until 2020, a 1080 Ti should suffice until the next consoles hit and more power is required. The 10xx series seems like it was meant for this current gen, and anything additional is just extra gravy.
Two of these puppies are so very much WANT, but very little NEED. Then there's the matter of CAN AFFORD. Well, I'm going to wait and see the reviews and benchmarks, then decide.
I think I'm more excited about NVLink than anything else so far. 🙂