GeForce RTX 2080 TimeSpy Result Set Leaks - Titan Xp performance

EL1TE:

Sigh, I hate times like these. Where's AMD at? If AMD wasn't this slow, I'm sure even NVIDIA would release something much better. We'll know for sure how good these cards are in about two weeks, I guess. :/ I'm getting triggered that there's nothing worth upgrading to considering the prices of the 2080 Ti, because the GTX 1080 is getting on my nerves due to being subpar for 3440x1440 ;_;
I think the prices would be different, but I doubt they'd release something much better. There's not much more to be done on the current node, and 7nm isn't ready yet for chips anywhere near this big.
People here sure are pessimistic... Despite having fewer CUDA cores than a 1080Ti while mostly being the same architecture (excluding the raytracing stuff), it performs better. Sure, its price is terrible, but people are still pre-ordering it, so unfortunately the price is justified.
should have called it the 1180, lmao
schmidtbag:

People here sure are pessimistic... Despite having fewer CUDA cores than a 1080Ti while mostly being the same architecture (excluding the raytracing stuff), it performs better. Sure, its price is terrible, but people are still pre-ordering it, so unfortunately the price is justified.
I don't think they are pessimistic; they are just "furious" with Nvidia's pricing scheme, and I can't blame them. The good thing about gaming is that it's a very affordable hobby compared to others, but companies like Nvidia are changing that...
schmidtbag:

so unfortunately the price is justified.
By that logic a 2080 Ti should have cost, umm... 12 million dollars? Because it is probably 100,000 times faster than a Riva TNT ( https://www.tomshardware.com/reviews/comparison-graphics-cards-nvidia,96-13.html ). The point of technical progress is to push the PERFORMANCE up while reducing price ( https://en.wikipedia.org/wiki/Ephemeralization ). The bigger and bigger prices have nothing to do with the advancement in performance, but everything to do with "economics", "capitalism" and (lack of) "competition". And also NV being greedy assholes, and Jensen needing moar'n'moar leather jackets.
schmidtbag:

People here sure are pessimistic... Despite having fewer CUDA cores than a 1080Ti while mostly being the same architecture (excluding the raytracing stuff), it performs better. Sure, its price is terrible, but people are still pre-ordering it, so unfortunately the price is justified.
Eh, it's really not even close to the same architecture. The cache is double the size, double the bandwidth and unified. It has a new memory controller and a new video decoder. It regained hardware scheduling to support independent warp execution. It completely lost all its ILP ability (the architecture is no longer superscalar), but each partition within the SM now has its own dispatch/warp scheduler. INT/FP are completely separated within the SM (but cannot be issued instructions simultaneously). It supports INT8/INT4 and FP16 (RPM). Each SM partition comes with Tensor cores as well. The architecture also supports variable rate shading in hardware. Overall Pascal -> Turing is probably a larger change than Kepler -> Pascal.
wavetrex:

By that logic a 2080 Ti should have cost, umm... 12 million dollars? Because it is probably 100,000 times faster than a Riva TNT ( https://www.tomshardware.com/reviews/comparison-graphics-cards-nvidia,96-13.html ). The point of technical progress is to push the PERFORMANCE up while reducing price ( https://en.wikipedia.org/wiki/Ephemeralization ). The bigger and bigger prices have nothing to do with the advancement in performance, but everything to do with "economics", "capitalism" and (lack of) "competition". And also NV being greedy assholes, and Jensen needing moar'n'moar leather jackets.
I mean, the prices mostly have to do with the fact that the chip is ~65% larger, which leads to fewer good dies per wafer, and with the ~50% increase in transistors at a time when the manufacturing cost per transistor has been stagnating for some time.
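A rough way to see why the die-size jump matters financially is the standard dies-per-wafer approximation combined with a simple Poisson yield model. The sketch below is illustrative only: the wafer price and defect density are assumptions, and the die areas are just the approximate published GP104 (GTX 1080) and TU104 (RTX 2080) sizes, not anything from NVIDIA's own costing.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic dies-per-wafer approximation, including edge loss."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2, defects_per_cm2=0.1):
    """Simple Poisson yield model: y = exp(-D0 * A)."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

WAFER_COST = 8000  # assumed 16/12nm wafer price in USD -- purely illustrative

for name, area in [("GP104 (~314 mm^2)", 314), ("TU104 (~545 mm^2)", 545)]:
    good_dies = dies_per_wafer(area) * yield_fraction(area)
    print(f"{name}: {good_dies:5.1f} good dies/wafer -> "
          f"${WAFER_COST / good_dies:6.2f} per good die")
```

Under those assumed numbers the bigger die roughly halves the count of good dies per wafer and more than doubles the cost per good die, before memory, board and margin even enter the picture.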
EL1TE:

Sigh, I hate times like these. Where's AMD at? If AMD wasn't this slow, I'm sure even NVIDIA would release something much better. We'll know for sure how good these cards are in about two weeks, I guess. :/ I'm getting triggered that there's nothing worth upgrading to considering the prices of the 2080 Ti, because the GTX 1080 is getting on my nerves due to being subpar for 3440x1440 ;_;
AMD doesn't have a 1080 Ti competitor, but it still hangs in there pretty well and surpasses the 1080 at 4K at over 70 fps. Strange Brigade at 4K with async compute: https://static.techspot.com/articles-info/1685/bench/4K-.png
EL1TE:

Sigh, I hate times like these. Where's AMD at? If AMD wasn't this slow, I'm sure even NVIDIA would release something much better. We'll know for sure how good these cards are in about two weeks, I guess. :/ I'm getting triggered that there's nothing worth upgrading to considering the prices of the 2080 Ti, because the GTX 1080 is getting on my nerves due to being subpar for 3440x1440 ;_;
Then get a 1080 Ti if you find the RTX prices hard to swallow - maybe even a second card (second hand) if the rig can take it. I've run a 1080 Ti since they arrived and have had no issues running 4K Ultra. Everyone whines about the prices of the RTX generation... these cards have more CUDA cores than the previous gen, perform better per core AND come with loads of the tech that enables ray tracing... The price is pretty much where it needs to be. Anyone who wants more performance than they had should go with a single 1080 Ti, or SLI if they want it cheap - seeing as prices are dropping a lot now, especially second hand. For those who cherish technological advancement and understand how pricing works, the 20x0 lineup is a very nice range of cards 🙂
Denial:

Eh, it's really not even close to the same architecture. The cache is double the size, double the bandwidth and unified. It has a new memory controller and a new video decoder. It regained hardware scheduling to support independent warp execution. It completely lost all its ILP ability (the architecture is no longer superscalar), but each partition within the SM now has its own dispatch/warp scheduler. INT/FP are completely separated within the SM (but cannot be issued instructions simultaneously). It supports INT8/INT4 and FP16 (RPM). Each SM partition comes with Tensor cores as well. The architecture also supports variable rate shading in hardware. Overall Pascal -> Turing is probably a larger change than Kepler -> Pascal.
But sadly the changes aren't made with gaming in mind, regardless of what Nvidia might say... the RT/tensor cores make a huge difference for Quadro and Tesla use, but they're a gimmick on GeForce cards... GeForce cards would have benefited a lot more from scrapping the RT/tensor cores and getting more shader cores instead. (No, it wouldn't be better for ray tracing, but who is actually going to use that anyway... we want more performance in 100% of games, rather than better graphics in 0.001% of games.)
I'm pretty convinced this is the 2070, not the 2080.
H83:

I don't think they are pessimistic; they are just "furious" with Nvidia's pricing scheme, and I can't blame them. The good thing about gaming is that it's a very affordable hobby compared to others, but companies like Nvidia are changing that...
Nobody is forcing you (not you, specifically) to buy a 2080+. With the exception of the mining craze, there have always been affordable GPUs to give hobbyists an adequate gaming experience. I'm sure the 2060 will be priced reasonably and will probably offer modest 2K gaming performance. That's ideal for hobbyists. Nvidia's series that end in 80 or higher aren't for hobbyists, they're for enthusiasts. If you're enthusiastic enough about their products, the price doesn't matter, and clearly that seems to be the case judging by the preorders.
wavetrex:

By that logic a 2080 Ti should have cost, umm... 12 million dollars? Because it is probably 100,000 times faster than a Riva TNT ( https://www.tomshardware.com/reviews/comparison-graphics-cards-nvidia,96-13.html )
Um... no? Because if nobody is willing to pay for it then the price isn't justified... Value is determined entirely by those who are willing to pay for it. People are pre-ordering a GPU without any official benchmarks and are willing to pay the exorbitant pricing. Therefore, whether you like it or not, the price is in fact justified.
The point of technical progress is to push the PERFORMANCE up while reducing price ( https://en.wikipedia.org/wiki/Ephemeralization ).
I don't disagree with that, but that's beside the point.
The bigger and bigger prices have nothing to do with the advancement in performance, but everything to do with "economics", "capitalism" and (lack of) "competition". And also NV being greedy assholes, and Jensen needing moar'n'moar leather jackets.
I also agree there.
Denial:

Eh, it's really not even close to the same architecture. The cache is double the size, double the bandwidth and unified. It has a new memory controller and a new video decoder. It regained hardware scheduling to support independent warp execution. It completely lost all its ILP ability (the architecture is no longer superscalar), but each partition within the SM now has its own dispatch/warp scheduler. INT/FP are completely separated within the SM (but cannot be issued instructions simultaneously). It supports INT8/INT4 and FP16 (RPM). Each SM partition comes with Tensor cores as well. The architecture also supports variable rate shading in hardware. Overall Pascal -> Turing is probably a larger change than Kepler -> Pascal.
Ah, I wasn't aware there were that many differences (I knew of many of those, but not all of them). For the record, I was mostly focusing on the architectural differences that were specific to gaming (excluding the tensor cores), but I guess considering everything you said, there are still enough differences.
HardwareCaps:

2025 MHz?! The boost clock is 1800 on the FE edition, which is overclocked already. Haha, the most retarded generation EVER!
The advertised boost clock is not the maximum possible boost.
EL1TE:

Sigh, I hate times like these. Where's AMD at? If AMD wasn't this slow, I'm sure even NVIDIA would release something much better. We'll know for sure how good these cards are in about two weeks, I guess. :/ I'm getting triggered that there's nothing worth upgrading to considering the prices of the 2080 Ti, because the GTX 1080 is getting on my nerves due to being subpar for 3440x1440 ;_;
You know Nvidia doesn't even compete with AMD? It never competed with ATI either; it's always been competing with itself.
Dragam1337:

But sadly the changes aren't made with gaming in mind, regardless of what Nvidia might say... the RT/tensor cores make a huge difference for Quadro and Tesla use, but they're a gimmick on GeForce cards... GeForce cards would have benefited a lot more from scrapping the RT/tensor cores and getting more shader cores instead. (No, it wouldn't be better for ray tracing, but who is actually going to use that anyway... we want more performance in 100% of games, rather than better graphics in 0.001% of games.)
Nvidia was already playing with the SM layout in Pascal (4:1 vs 5:1 in Maxwell) in an attempt to extract more ILP out of modern games. You can't infinitely scale core count and expect game performance to scale with it. They had to make a change at some point - maybe a little premature, but it was definitely coming.

Few other things: RT "cores" are a misnomer - they're not discrete cores, just a function of INT/ALU changes within the SM. Tensors are discrete but relatively small within the SM. The biggest die-size change is from the cache doubling. RPM, which is now a feature of both Vega and Turing, should encourage developers to use much faster, lower-precision shaders where appropriate (the next generation of consoles should be getting this as well). FC5 is the only current game that utilizes it AFAIK, but I think more will in the future now that both vendors support it in consumer parts. The TLP changes should allow DX12 developers to extract more performance out of the architecture in mixed compute/graphics scenarios. We've heard relatively little about Variable Rate Shading and how it's going to be utilized. DLSS can be pretty huge going forward as it's relatively easy to implement (very similar to TAA) and Nvidia seems intent on training more games.

I think overall the release is too expensive and honestly probably a little too late in the 16/12nm lifespan, but I think the changes we are seeing here were coming one way or another. I also think the features that will be added to this card through NGX will add value over its lifecycle - all of which will be based on the deep learning additions to the hardware. That being said, I'm probably not buying it unless they drastically cut the MSRP. The 2080 Ti needs to be $800 or less or it's a no-go for me. A 1080 Ti more than covers my QHD needs for now and the foreseeable future, and I can't afford both a $1200 video card and a $2000 4K 144Hz monitor. One, if not both, of those prices needs to drop for me to consider the package.
If this is an overclocked RTX 2080, then it's only 16% faster than a Guru3D-overclocked GTX 1080: https://www.guru3d.com/articles_pages/asus_geforce_gtx_1080_strix_oc_11_gbps_review,38.html The maths: 10030 / 8665 = 1.16, i.e. ~16% faster. That 16% figure is kinda what people were expecting when looking at the raw specs of the RTX 2080. I'm not impressed yet; I need to see proper reviews, as there are conflicting ideas on potential performance from different sources currently.
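For anyone who wants to redo that kind of comparison with other results, the arithmetic is just a ratio; a minimal sketch (the two numbers are simply the graphics scores quoted above, nothing more):

```python
def uplift_percent(new_score: float, old_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score / old_score - 1) * 100

# Leaked RTX 2080 graphics score vs. Guru3D's overclocked GTX 1080 score from the post above.
print(f"{uplift_percent(10030, 8665):.1f}% faster")  # ~15.8%, i.e. roughly the 16% quoted
```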
There is a deal on a Vega 64 Sapphire Nitro+ LE for $520. I think I am getting it. I play at 1440p and it will be more than enough for me. I was hoping the 2080 would have been more interesting.
The real issue (imho) is the performance of the RTX 2070. That is the sweet spot in the price/performance ratio for Nvidia... the xx70 always is. Only we hardcore geeks give a crap about the RTX 2080/Ti, especially at that price. The budget gamer could build an entire rig for the price of the high-end cards. The average enthusiast is the market for the $300-400 cards, and as we all know, the higher the price, the fewer the buyers. Nobody (in the consumer realm) needs a flagship product to game, even competitively. Just match your monitor to your card if you want to squeeze out the most FPS. That way you do not need to turn off the eye candy, and frankly you can get an excellent G-Sync monitor (at 1440p) for less than the RTX 2080/Ti.
If AMD could put out cards that compete with the 2060/2070 at much lower prices, they would rake in $$$.
This should've been the 2070, based on the generational performance increases of past cards. If this is really the 2080, then price/performance-wise, this new generation of cards falls way behind the previous one.
Embra:

There is a deal on a Vega 64 Sapphire Nitro+ LE for $520. I think I am getting it. I play at 1440p and it will be more than enough for me. I was hoping the 2080 would have been more interesting.
Spot on! Add in the lower cost of FreeSync and you get a fantastic experience for hundreds of dollars less. I'm an Nvidia guy, but my mind is open... especially if I need to replace my rig or build one for someone else (Xmas time every year).