PNY GeForce RTX 2080 and 2080 Ti Product Data Sheet Slips Out
Texter
@airbud7: yeah but even nVidia were surprised by the Titan's success, often selling them by the couple at $1k apiece... The 2080 Ti should be interesting to behold though. 1080 Tis are often hitting CPU limitations already. Turing desperately needs next-gen features that kill Pascal performance.
fry178
@Pimpiklem
Complaining about a product that hasn't been released, that you haven't seen perform, with games not yet written.
Lol, ok.
nz3777
Reading you guys' posts cracks me up. Hell, I'm still on the 900 series, a 980 to be exact. I see no need to upgrade unless you're doing 4K max settings. But also, I'm 41 and my gaming days are now down to like once a week (from 24/7); my daughter took over my position lol.
As far as performance goes, from what I saw of you guys comparing the GTX 1080, a 1080 is 2x the performance of a GTX 980, so I might just get a used 1080 when my Ryzen build is finished. A 1080 Ti would be super sweet for me!
The 2080 needs to be at least 25-35% faster than the 1080, and the Ti version 40-50%, in my humble opinion. Later, gurus. Nz.
nz3777
Oh, I almost forgot... Nvidia = fps! These guys are a monster giant of a company; they don't even need to try to sell products. If they fart, people will flock to come and get a smell. Yes, when you corner the market like that and have no competition, that's what happens!
I wish I was the CEO, sigh! Lol.
gx-x
The RTX 2070 will be faster than or similar to the 1080 Ti. Just look at the history. Doesn't matter what "leaks" show, or don't. My claim has a ~90% chance of being true, based on the past 10 years of products.
XP-200
What will be even more interesting is the second-hand market for 1070/1080 cards: what will be the going price for cards that might have had the guts run out of them mining 24/7? Should be very interesting to watch.
gx-x
I am not going to write down something you can see for yourself by looking through GPU reviews on guru3d.
PS. I talked about performance per performance, not performance per dollar, watt, duck, dog, amd, sick days, etc.
It's not an assumption, it's a fact.
Start with the 4xx series and work your way up.
gx-x
Honestly, we don't know that yet. This article and what is going around now is speculation. Might be true, might not be. It's a "wait and see" game at this point.
On the other hand, there are far, far more 1060 cards sold than 1070s, let alone 1080s. They made much more money on the low end.
Andrew LB
https://hexus.net/media/uploaded/2017/3/1c9a8251-8039-4dc6-9e84-40f92178c220.png
https://hexus.net/media/uploaded/2017/3/8a5fb095-af03-4a73-8016-a95830dedfda.png
I guess you missed the part where he said those prices look to be placeholders.
And even if you don't factor in inflation, the top cards today are still cheaper than they were in 2007 (if memory serves, the 8800 Ultra launched at $829, which is roughly $1,000 in today's dollars).
Darren Hodgson
I'm interested in the reviews of the RTX 2080 Ti, but I'm not convinced I need one, not when the GTX 1080 Ti is still a beast, offering 60+ fps at 2560x1440 (and often at 4K too using DSR) in most games (well, those that are well optimised anyway... We Happy Few, yes, I'm looking at you!!!).
There are no new consoles due this year, so all games will still be aimed at the base PS4 and Xbox One specs (with "up to" 4K enhanced versions for the Pro and X variants). That means I would be better off waiting until next year for the second-generation Turing cards, which will likely have a die shrink, meaning better power efficiency and higher clocks/performance. By then the PS5 and the Xbox One's successor will be on the horizon, if not released, which means the extra power may be needed for multiplatform games... or maybe not, since I don't intend to upgrade to a 4K display for a few years yet. As I stated previously, maxed-out settings and high framerates are far more important to me than compromised settings at 4K, and 2560x1440 offers the perfect "sweet spot" IMO on a 27" 165 Hz display.
Denial
That's why they continue to develop/support DX11 alongside DX12.
Aside from the CPU overhead reduction, all DX12 does is give you deeper access to the hardware. The developer has to be the one who uses that level of access to improve performance and essentially out-optimize the driver developers at AMD/Nvidia. Thinking that was going to happen in any reasonable timeframe, or to any real extent, was only pushed by delusional forum-going gamers with no understanding of how difficult that level of software development is - it certainly wasn't said by Microsoft.
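To make "deeper access" concrete, here's a minimal D3D12 sketch of explicit GPU/CPU fence synchronization - bookkeeping the DX11 driver handles for you automatically. It's illustrative only, not taken from any particular engine:

#include <windows.h>
#include <d3d12.h>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device on the default adapter.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // DX12 makes the command queue an explicit, app-owned object;
    // under DX11 the driver manages submission behind the scenes.
    D3D12_COMMAND_QUEUE_DESC queueDesc = {};
    queueDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ID3D12CommandQueue* queue = nullptr;
    device->CreateCommandQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // The app, not the driver, decides when the CPU may reuse
    // resources the GPU might still be reading: that's the fence.
    ID3D12Fence* fence = nullptr;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE fenceEvent = CreateEvent(nullptr, FALSE, FALSE, nullptr);

    const UINT64 fenceValue = 1;
    queue->Signal(fence, fenceValue);  // GPU writes the value when it gets here
    if (fence->GetCompletedValue() < fenceValue) {
        fence->SetEventOnCompletion(fenceValue, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);  // CPU blocks until GPU catches up
    }

    CloseHandle(fenceEvent);
    fence->Release();
    queue->Release();
    device->Release();
    return 0;
}

Get that ordering wrong and you corrupt memory or stall the pipeline - exactly the kind of work people assumed driver teams would keep doing for them for free.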
Well yeah, obviously if you're expecting 100% adoption in the first few years you're going to be disappointed, but I think it's promising enough that devs will utilize it at a higher rate than previous technologies. In the professional rendering industry, nearly every first- and third-party renderer has pledged support for it (mostly through OptiX). It will eventually save artists a ton of time, and thus money, for games, as well as bringing overall visual improvements. I don't think it matters that it's mixed with rasterization, as it's alleviating the most difficult lighting tasks - ones that are often hard to replicate and performance-intensive anyway.
No they weren't, lol. Like, show me where Microsoft was selling DX12 as the fix for everything? All the articles that came out with DX12 said the exact opposite: that DX12 would be hard to adopt and would take a ton of time to come to fruition - only the larger, more experienced developers would even gain capability from it.