GeForce RTX 2080 and 2080 Ti - An Overview Thus Far


wavetrex:

@nz3777 Ray tracing in layman's terms is like using a laser pointer to draw something on a wall while holding a camera shutter open until you're done drawing (i.e., moving the laser beam all over the place). Any objects in the path of the light will cast shadows, any reflective ones will bounce the ray in another direction, and any translucent one will make it refract and change direction as it passes through. But instead of a laser, this is done via mathematics, for EVERY PIXEL on the screen, and real-time ray tracing means it has to be done 30-60 or more times every second. This is extremely difficult to achieve, because the mathematics involved are very complex.
So by the time a few games use it, we could be on the heels of Turing's successor, which will improve upon it further. So ray tracing on Turing may not be anything for gamers to be excited about.
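To make wavetrex's laser-pointer analogy concrete, here's a minimal sketch of that per-pixel math in Python: one sphere, one point light, one primary ray per pixel, and a simple diffuse shade at each hit. All scene values are made up for illustration; reflective or translucent materials would spawn further rays recursively from each hit point, which is where the cost explodes.

[code]
# Minimal ray-casting sketch: one sphere, one point light, ASCII output.
# Everything here is illustrative; a real renderer traces millions of rays
# per frame against full scene geometry.
import math

WIDTH, HEIGHT = 80, 40                      # tiny "screen"
SPHERE_C, SPHERE_R = (0.0, 0.0, 3.0), 1.0   # sphere centre and radius
LIGHT = (2.0, 2.0, 0.0)                     # point light position

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def norm(v):
    length = math.sqrt(dot(v, v))
    return (v[0]/length, v[1]/length, v[2]/length)

def hit_sphere(origin, direction):
    """Distance along the ray to the sphere, or None on a miss."""
    oc = sub(origin, SPHERE_C)
    b = 2.0 * dot(oc, direction)            # direction is unit length, so a = 1
    c = dot(oc, oc) - SPHERE_R * SPHERE_R
    disc = b*b - 4*c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

# One ray per pixel; real-time means redoing all of this 30-60+ times a second.
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        x = (i / WIDTH) * 2 - 1             # map pixel to [-1, 1]
        y = 1 - (j / HEIGHT) * 2
        ray = norm((x, y, 1.0))             # camera at origin, looking down +z
        t = hit_sphere((0.0, 0.0, 0.0), ray)
        if t is None:
            row += " "
            continue
        p = (ray[0]*t, ray[1]*t, ray[2]*t)  # hit point
        n = norm(sub(p, SPHERE_C))          # surface normal
        l = norm(sub(LIGHT, p))             # direction to the light
        shade = max(0.0, dot(n, l))         # Lambertian diffuse term
        row += ".:-=+*#%@"[int(shade * 8.999)]
    print(row)
[/code]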
There are likely going to be one or two Gameworks games that use ray tracing in the near future. I really don't see this becoming a popular feature for 4-5 years. Also, I wouldn't be surprised if Intel is very competitive with ray tracing when they finally release their GPU. I guess what I'm getting at is this whole ray-tracing-in-hardware thing is mostly a gimmick at this point, unless you are a game developer; and if you are a game developer, you have some new toys.
According to some posts we should stop R&D because games are not gonna use features coming with each new gen. Guess what: no new stuff hardware-wise means no developers will introduce it into games... And so far, most people are looking to improve performance, not new features. So not buying a new product just because you can't fully use it? Ok. Maybe someone needs to explain that to people buying 100-room mansions, or cars that can go 65+ mph (as most countries have speed limits). 😀 As long as no card below the Ti has more than 8 GB, I will skip this, as Siege will need more than 8 GB of VRAM to go beyond 1440p, and I won't pay more than 800; doubt I'll get an LC Ti for that...
All sounds really exciting, but I won't be upgrading until I start seeing options in games that my current 1070 can't do. It's like when PhysX came out and I started seeing all these new games (well, mainly Mafia 2 at the time), but I had to get a card that could do it because I felt like I was missing out. I'll most likely skip this next generation myself, but it all depends how quick the devs are to implement all this new technology into their games. If this DXR takes off then I may have to dig into my wallet, but even then I'm not gonna upgrade until there's a nice watercooled version available with a pre-applied block like my Seahawk; I don't want to be messing around with waterblocks again and voiding my warranty.
fry178:

According to some posts we should stop R&D because games are not gonna use features coming with each new gen. Guess what: no new stuff hardware-wise means no developers will introduce it into games... And so far, most people are looking to improve performance, not new features. So not buying a new product just because you can't fully use it? Ok...
Not so much that; it's the massive RT marketing blitz Nvidia is unleashing upon us, hoping to reel in unsuspecting gamers by the millions who may be led to believe it will transform all their games instantly. Most buyers will likely not know wtf RT is but will be swept up by all the hype regardless.
I really hope crypto mining doesn't take off again when the new cards are released... still waiting for prices to go back to normal *__*
[youtube=yVflNdzmKjg] Thought this was relevant...
Now I totally understand what to look for with the ray tracing. Thanks guys. This in theory should bring games more to life, very sweet! But of course we are still keeping PhysX, right? This is just more icing on the cake. Also, AMD has to respond to this? Nvidia cannot be touched at the moment, and in the future as well. Too damn strong!
@DW75 And? When has the bus been a bottleneck on xx60 and up? So far I have seen "horsepower" and VRAM amount being the issue, not the bus. Just because it's slower on paper doesn't mean it will perform less. Go back to previous-gen Tis and the like: why haven't they been bottlenecked, even though bus width hasn't gone past 256/384-bit for years now? As long as they keep improving other things like compression, it's not an issue. And similar with VRAM: a xx60/70 will run out of power before running into VRAM issues. I switched between xx50/Ti/60/70/80 (upgrades; new builds for others), and in all games from 2000 till R6 Siege (the newest game for me), running 1080p/1440p, VRAM wasn't the issue, but performance was...
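A quick back-of-the-envelope supports that: raw memory bandwidth is just bus width times per-pin data rate, so a narrower bus with faster memory can keep up with a wider one, and delta-colour compression stretches whatever raw figure you have further still. A rough sketch in Python (the bus widths and data rates below are the published Pascal specs):

[code]
# Theoretical memory bandwidth, before any compression savings.
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """GB/s = (bus width in bits / 8 bits per byte) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("GTX 1060 (192-bit GDDR5 @ 8 Gbps)",      192,  8.0),  # 192 GB/s
    ("GTX 1080 (256-bit GDDR5X @ 10 Gbps)",    256, 10.0),  # 320 GB/s
    ("GTX 1080 Ti (352-bit GDDR5X @ 11 Gbps)", 352, 11.0),  # 484 GB/s
]
for name, bus, rate in cards:
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
[/code]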
Nice editorial, got me up to speed nicely. 🙂 Hangin' for that 2080 Ti upgrade... hope it's released with, or soon after, the 2080.
Is there any leaked mining info? I sold 65% of my mining farms, and I am ready if those cards prove their worth; just better performance while saving 50%+ on power will save me a few MWh per month.
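For what it's worth, "a few MWh per month" checks out arithmetically. A sanity check with made-up numbers (the farm size and per-card savings below are assumptions, not leaked Turing specs):

[code]
# Hypothetical figures for illustration only -- not leaked specs.
cards = 100                  # assumed farm size
watts_saved_per_card = 90    # e.g. 180 W cut to 90 W at the same hash rate
hours_per_month = 24 * 30    # mining runs around the clock

wh_saved = cards * watts_saved_per_card * hours_per_month
print(f"~{wh_saved / 1e6:.1f} MWh saved per month")  # prints ~6.5 MWh
[/code]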
Wasn't planning to buy this gen, but if they release the Ti this soon, I'll find it very hard not to upgrade.
tunejunky:

mai only because I've seen them in use at server farms
There are other GPUs for this duty... it's a waste of money.
schmidtbag:

Interesting - seems like the 2000 series is definitely more than just a refresh. The 2080 Ti is more impressive than I was expecting.
We don't know yet. Don't forget that the RX 480 was supposed to be at the GTX 1080's level on paper (lol 😛), and the GTX 480 was supposed to be more economical than the previous gen (lol x2 😛, and that one's so I don't come across as a green fanboy)... In a preview you can show any data you want to make the hype bigger... the only facts you will get are when it is tested in real conditions.
If NVLink works in the sense that it makes the GPUs act as one GPU, then I am buying 2x 2080 Tis on release!
The final price will depend on chip sizes... if they don't cut them down, they will be expensive, at least based on the Quadro price list. And Nvidia loves software- and hardware-crippled performance based on segmentation, so I can expect the usual anti-consumer tactics (PhysX, Gameworks, GTX 970, ...). But it won't be a problem for them; they will keep their throne because two games use the magic ray-traced light effects, and they will sell 2080 Tis in pairs. But nice technology indeed 😛
Only Intruder:

I agree somewhat; it'll be a few generations until we see what these new features will bring us and how well they'll be implemented. The problem I see with the ray-tracing processing being leveraged (I would argue that it's simplified ray tracing, a low pass with AI processing on top) is that it's exclusive to nVidia through their Gameworks platform, and if the rumours are true, there will be further segmentation within nVidia hardware as well (GTX vs RTX), so only the very top-end cards will see this capability.

The problem this brings, as we've seen with anything with Gameworks integration, is that we see only a few games with the enhanced graphics, and fairly often with crippling results (for both of the main hardware manufacturers, and again, with only high-end NV cards only just able to view the results), particularly if games are ported from consoles (which really shouldn't be difficult to port anymore, given the current-gen consoles are x86).

Let's also be honest here: the majority of games available today are made for the consoles; the only difference is PC gaming allows for various configurations of quality (for the better, usually, of course). But what will be the point of this new technology if it's going to be inaccessible to the majority? Perhaps I'm being cynical, but I can't help but fear we're going to see more of the same: PhysX and Gameworks being locked away, and once again, with the ray-tracing package being part of Gameworks, stifling the innovation that nVidia are actually providing. It makes me think of the quote "the left hand doesn't know what the right hand is doing" - in this case, the left hand of nVidia is innovation, making great hardware and technology, but the right hand is the greed and monopoly, keeping it locked away, not giving the innovation what it deserves.

So what do I mean by all this? While I can appreciate that yes, people will be expected to pay high prices for the new features (it's understandable), the problem is we still won't see the innovations for many, many years, not because of the prices themselves but because of the exclusivity of the features, the resultant platform availability and ecosystems, and the rather small market share in the grand scheme of computer gaming. Unless nVidia have a hand during game development, we're just not going to see the ray tracing, unless the next-gen consoles also leverage this hardware.

Tl;dr: I just really, really hope nVidia's innovation in ray tracing won't follow the same history as PhysX.
Yes, I think a lot of it will come down to when and if consoles receive ray-tracing cores and tensor cores; then it's sure to take off and become a major part of a gaming GPU. Until that point I think it could be quite niche and not used much, which is why I'll probably skip this next generation until I see what's happening.
alanm:

So by the time a few games use it, we could be on the heels of Turing's successor, which will improve upon it further. So ray tracing on Turing may not be anything for gamers to be excited about.
Yes, my thoughts too. (ties in with my paragraph above too).
Pascal: 471 mm². Turing: 754 mm². I wish they'd stop making the die bigger and bigger to get more performance; I'm not even dreaming I can afford that... Also, I thought Tensor Cores were not ideal for gaming? I could be wrong.
[youtube=iRpRr4oehgY] ...and this