PNY GeForce RTX 2080 and 2080 Ti Product Data Sheet Slips Out

Paulo Narciso:

Honestly I'm very sceptical about these "new" technologies that vendors try to sell. Over the years they have tried to sell new hardware, announcing wonders about technologies like PhysX, tessellation, etc., and to this day they are hardly used. I think Nvidia is trying to justify their tensor cores in a gaming card, instead of giving more CUDA cores, which would give a more tangible performance increase.
Tessellation is used in nearly every single game? PhysX (GPU accelerated) was hardware-locked to Nvidia, which is why it never saw adoption. That, plus physics systems themselves are a nightmare to implement for multiplayer and thus often not used anyway. The problem with just adding more CUDA cores is that game developers are running into a visual plateau with rasterization. The amount of work you need to put in to fake a shader into looking like a physically based representation is getting exponentially higher. Raytracing is a paradigm shift that automatically gives us better results with less work (but less performance as well).

Should also point out that the "RT core" is a misnomer: the "raytracing engine" is built into the SM and ALUs, and the processing occurs entirely in the SM. The CUDA cores on Volta/Turing are larger due to a larger cache, which should give a general IPC uplift and would have happened regardless of raytracing. I keep seeing people thinking RT/Tensor cores are discrete cores; they aren't. Think of it more like an ISA extension that utilizes the ALU concurrency featured in Volta/Turing.
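For what it's worth, here's a minimal toy sketch of that "better results with less work" point, just a made-up scene in plain Python; it says nothing about how DXR or Turing's SM-resident ray tracing actually works. The same intersection test that finds the visible surface also answers the shadow query, so shadows fall out of the algorithm instead of being faked with shadow maps:

```python
# Toy ray tracer: one sphere over a ground plane, one point light, ASCII output.
# Everything here (scene, light position, resolution) is invented for illustration.
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return [x - y for x, y in zip(a, b)]
def norm(v):
    length = math.sqrt(dot(v, v))
    return [x / length for x in v]

def hit_sphere(orig, dirn, center, radius):
    """Nearest positive hit distance along a normalized ray, or None."""
    oc = sub(orig, center)
    b = 2.0 * dot(dirn, oc)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def hit_plane(orig, dirn, y_plane=-1.0):
    """Hit distance against the ground plane y = y_plane, or None."""
    if abs(dirn[1]) < 1e-6:
        return None
    t = (y_plane - orig[1]) / dirn[1]
    return t if t > 1e-4 else None

SPHERE_C, SPHERE_R = [0.0, 0.0, -3.0], 1.0
LIGHT = [5.0, 5.0, -3.0]
W = H = 40
RAMP = ".:-=+*#%@"   # dark .. bright

for py in range(H):
    row = ""
    for px in range(W):
        # Primary ray from a camera at the origin through this pixel.
        d = norm([(px + 0.5) / W - 0.5, 0.5 - (py + 0.5) / H, -1.0])
        ts = hit_sphere([0.0, 0.0, 0.0], d, SPHERE_C, SPHERE_R)
        tp = hit_plane([0.0, 0.0, 0.0], d)
        if ts is None and tp is None:
            row += " "                                   # background
            continue
        if tp is None or (ts is not None and ts < tp):   # sphere is the nearest hit
            p = [ts * di for di in d]
            n = norm(sub(p, SPHERE_C))
        else:                                            # ground plane
            p = [tp * di for di in d]
            n = [0.0, 1.0, 0.0]
        # Shadow ray: the same intersection test answers "can this point see the light?"
        to_light = sub(LIGHT, p)
        dist_l = math.sqrt(dot(to_light, to_light))
        l_dir = [x / dist_l for x in to_light]
        origin = [pi + 1e-3 * ni for pi, ni in zip(p, n)]
        t_occ = hit_sphere(origin, l_dir, SPHERE_C, SPHERE_R)
        lit = t_occ is None or t_occ < 0 or t_occ > dist_l
        shade = max(dot(n, l_dir), 0.0) if lit else 0.0
        row += RAMP[min(int(shade * len(RAMP)), len(RAMP) - 1)]
    print(row)
```

The flip side, as the post says, is that every one of those visibility queries has to be paid per pixel, per light and per bounce, which is exactly where the performance cost comes from.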
Well, having listened to that John Carmack talk he gave back in 2016, shaders are being thrown out for many things. Even the first Rage used a lot of light sampling instead of baked shadows, as did the Doom reboot. What he said though was that material properties have to be updated if you want things to look real. Otherwise, it doesn't matter whether you ray trace it or not. Updating materials today (or rather back then, in 2015/2016) required a lot of work, involved laser readings, etc. I assume companies like Epic and others that make engines will do the heavy lifting, but still... I mean, looking back at the first Crysis game, you can see how much benefit there is when you just do the materials more correctly. It will never look outdated like, say, Quake 2. 🙂
gx-x:

Well, having listened to that John Carmack talk he gave back in 2016, shaders are being thrown out for many things. [...]
Well, in terms of photorealistic materials, the direction everyone is headed is photogrammetry. They are getting better at de-lighting the photogrammetry samples (this was the hardest, most labor-intensive part previously, but algorithms/software are helping to speed this up) and at handling the texture requirements better. Eventually most photorealistic games will use Quixel-quality stuff everywhere. They'll probably also start using machine learning to create new textures with similar properties to those, as it's hard to get to planets like Mars to sample rocks/soil for the next Doom 😛 It's a combination of everything that brings higher visual quality; the industry is tackling every side. Lighting is the most challenging from a compute standpoint, which is why this DXR acceleration stuff is deemed so "revolutionary". We're obviously still far away from 100% scene path tracing, but this is a great start.
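To make "de-lighting" a bit more concrete, here's a toy sketch with made-up numbers and the crudest possible approach (real tools from Quixel, Epic and others are far more sophisticated and increasingly ML-assisted): estimate the low-frequency shading baked into a capture and divide it out, leaving something closer to a flat albedo.

```python
# Toy de-lighting on a synthetic 1-D "scanline": fine albedo detail multiplied
# by a smooth lighting gradient, the way a photograph bakes shading into a texture.
# All numbers are invented for illustration.

def box_blur(values, radius):
    """Crude low-pass filter: sliding-window average."""
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - radius), min(len(values), i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

albedo   = [0.6 + 0.2 * ((i // 4) % 2) for i in range(32)]  # blocky surface detail
shading  = [0.3 + 0.7 * i / 31 for i in range(32)]          # smooth light falloff
captured = [a * s for a, s in zip(albedo, shading)]

# De-light: divide the capture by a blurred (low-frequency) estimate of itself.
# The smooth gradient cancels out; the relative surface detail survives.
estimate = box_blur(captured, radius=6)
delit    = [c / e for c, e in zip(captured, estimate)]

print("captured:", " ".join(f"{v:.2f}" for v in captured[::4]))  # rising gradient
print("de-lit:  ", " ".join(f"{v:.2f}" for v in delit[::4]))     # flat, detail only
```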
Well, big studios get 99% of everything correct, at 30 minutes per frame of render time... (watch The Adventures of Tintin if you haven't already), so that is not the issue. The issue is how to do it at 60 fps, at least, without throwing an unimaginable number of transistors at it 😀
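For a sense of scale, using the figures mentioned in the thread (my arithmetic, nothing measured): 30 minutes per frame against a 60 fps budget is a gap of roughly 100,000x in per-frame time.

```python
# Back-of-the-envelope gap between offline film rendering and real time,
# using the rounded figures from the thread.
offline_s_per_frame = 30 * 60      # "30 mins per frame", in seconds
realtime_s_per_frame = 1 / 60      # 60 fps budget, in seconds per frame
print(f"{offline_s_per_frame / realtime_s_per_frame:,.0f}x")  # ~108,000x
```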
What games are expected to use RT over the next few months? Is Metro Exodus the only one announced?
AFAIK, yes, and even that is not for sure; it could just be a stunt.
Someone open the windows... too much smoke with this ray tracing shit lol WATCH OUT!!!! The 2070 is six times faster than the 1080 Ti in ray tracing ops... but speaking of fps, 20% behind.
nz3777:

Oh, I almost forgot... Nvidia = fps! These guys are a monster giant of a company; they don't even need to try to sell products. If they fart, people will flock to come and get a smell. Yes, when you corner the market like that and have no competition, that's what happens! I wish I was the CEO, sigh! Lol.
I think people flock to Nvidia more because they sell cards with outstanding performance and cutting-edge features, and don't feed consumers bull$hit performance claims which never turn out to be true, like their competition has done for generations.
They have been leading the industry since I got onto the scene, which was the GTX 500 series (a long time ago). I'm sure some of you remember even older-gen cards, but I follow it really closely. AMD sometimes (almost) tries to catch up, but I'm afraid to say Nvidia is just so far ahead at this point that there's no catching up to them anymore! Been a loyal fan since my first GTX 580 and never thought about getting another AMD card.
The GeForce 500 series, no offense, is recent history.
^ And it was the worst Nvidia gen I can remember lol