NVIDIA GTX 1170 Alleged Benchmark Leaked - beats 1080 Ti

Yeah.... IMO there is no way the bean counters would allow the 1170 to ship with 12 gigs of RAM, let alone GDDR6! The 1180, maybe.
Initially they perform a little below or on par with the Tis, but tend to beat them later on as drivers mature. As for these results, I don't buy it: 16 GB of VRAM just seems like too much to me, and wouldn't this be slow for GDDR6, since it's only at 10 Gbps?
EL1TE:

Except it has been like this for years? The 1070 beat the 980 Ti too, and the 970 beat the 780 Ti (there was no 800 series for desktops); nothing new here. I don't know where the people calling this BS have been. New to PC hardware?
You are absolutely right; I wouldn't expect it any other way. I don't know why some are calling BS. Maybe they are 1080 Ti owners in denial 😀
More hype BS.
If this article and this one: http://www.guru3d.com/news-story/photo-of-a-nvidia-gpu-engineering-board-fitted-with-gddr6-spotted.html both pan out, Turing with GDDR6? But if that IS the case, wouldn't we run into the next supply crisis, since GDDR6 chips won't be broadly available at first? Still, wow, an 1170 beating a 1080 Ti? This sounds very nice. If prices land in a reasonable range (not some overhyped $1500 thing), say ~$400 to $500 for the 1170, then: http://channeleye.co.uk/wp-content/uploads/2013/02/shut-up-and-take-my-money.png Regardless of whether it has GDDR6, HBM2, GDDR5X, or GDDR5, I don't care about the specific tech as long as it is fast enough, which could be achieved with any of those. The same goes for the chip itself. EDIT: In the first screenshot in the article, the 16,384 MB... is that RAM or GDDR? System memory or graphics memory? It's memory on the card, right? 16 gigs... I come from times when this was 1 to 16 megs! Please tell me I am NOT wrong! 😀
The odds of that being true are less than zero.
Extraordinary:

Would 3DMark even know it was a GTX 1170? I have vague memories of unreleased cards showing up as some generic placeholder rather than listing the correct model. I could be remembering wrongly.
It's feasible to assume that whoever got their hands on a pre-release GPU also got a driver to go with it.
icedman:

Initially they perform a little below or on par with the Tis, but tend to beat them later on as drivers mature. As for these results, I don't buy it: 16 GB of VRAM just seems like too much to me, and wouldn't this be slow for GDDR6, since it's only at 10 Gbps?
Yup. This would equate to the same memory bandwidth as the 1070/1080, since GDDR5X also runs at 10 Gbps... There is no way the same memory bandwidth is going to be able to feed enough shaders to hit 1080 Ti speeds.
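The bandwidth arithmetic behind that objection is easy to check. A quick sketch, assuming the rumored 10 Gbps rate on a conventional 256-bit bus (the bus width is an assumption here; the 1080/1080 Ti figures are their public specs):

```python
# Back-of-the-envelope peak memory bandwidth: bus width (bits) x per-pin
# data rate (Gbps) / 8 bits per byte, giving GB/s.
def bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

# Rumored card on an assumed 256-bit bus at 10 Gbps:
print(bandwidth_gbps(256, 10))   # 320.0 GB/s, same as GTX 1080 (GDDR5X @ 10 Gbps)
# For comparison, the GTX 1080 Ti (352-bit, 11 Gbps GDDR5X):
print(bandwidth_gbps(352, 11))   # 484.0 GB/s
```

So at 10 Gbps on a 256-bit bus, the leaked card would have roughly two-thirds of a 1080 Ti's bandwidth, which is the crux of the skepticism above.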
Glottiz:

You are absolutely right; I wouldn't expect it any other way. I don't know why some are calling BS. Maybe they are 1080 Ti owners in denial 😀
IIRC the 970 was at most equal to a 780 Ti, same with the 1070 and 980 Ti... The 980 and 1080 were ~20% better than the 780 Ti and 980 Ti respectively. But hey, maybe my memory deceives me; I did own both 970s and a 1070. Either way, I'd be happy for more horsepower, even on the "lower-end" upcoming cards, but I have a feeling we're still a while away from launch, as Nvidia will probably want to clear current stock first. Especially since they're basically competing with themselves at this point.
The next batch of AMD cards is on the way; I wouldn't say there'll be no competition by the time these products are released.
HWgeek:

BS. NV has no need to unleash such a performance boost while there is still no competition for the 1080 Ti. I think a 20%~30% performance boost is more reasonable (1170 vs 1070, 1180 vs 1080, etc.).
If you consider that Nvidia is pushing their 4K 144 Hz display tech, however, it would make sense to have video cards that can actually feed such monitors 144 frames per second at high detail levels. If their upcoming top video card (xx80 Ti or whatever) were indeed capable of doing so at ultra settings, then it would make sense for the xx70 tier to sit around the 1080 Ti's level of performance, driving 4K on ultra detail at around 60 frames per second (with an xx80 at around 100-120 FPS). I'm not saying this will be the case, but it would make sense to me, if they were indeed capable of pulling it off. They might very well be, depending on how much potential they had been holding back with Pascal and how well the optimization process has gone since. Such a boost in performance from one series to the next certainly wouldn't be a novelty either.

Another thing to consider is pricing, and how much more a new xx70 would cost compared to a 1070 at release. If it costs considerably more (and if Nvidia's recent pricing strategy is any indication, it likely will), performance must be considerably better as well. That said, it might all just be fabricated, and actual performance might be worse or better. At this point it's all speculation; we'll know more once the card is released. 😉
To anyone who believes this is real: let me remind you that the BIOS on GPUs can be amended to display whatever card name they want, yes? As long as the drivers are compatible with the product code, the string value can be changed to display even "Vega 64" or "Panos". This is a GTX 1080 Ti with a different BIOS header value; it takes five minutes to replace the name. Also, the way 3DMark works, it should have flagged the benchmark as invalid, since the card's product code would have been new and unknown at the time of the run. It does so on new drivers all the time, for heaven's sake. Don't be that gullible.
Seems too good to be true. And with the lack of any real competition, I really don't see Nvidia pushing the envelope this hard. In case this is legit, I can't wait for the next Guru3D Christmas contest!
HardwareCaps:

The question is when... AMD doesn't seem to be close to a major new release; nothing that could threaten Nvidia, at least.
Yup. The only thing to come from AMD GPUs this year should be a 12nm refresh of the Polaris GPUs. Early next year they might have something to show, and I suspect they're keeping some things very quiet in case they need them around Xmas time 😉
rock1m1:

This has always been the case. For example, the current gen goes as follows: 1060 >= 980, 1070 >= 980 Ti, 1080 > 1070, 1080 Ti > 1080.
Well, tbh the 980 Ti will beat the 1070 when both get a decent OC. The reason the 1070 is ahead at stock is the clock difference. Unless they have a similar clock difference or way more CUDA cores, I can't see the same repeating with the 1170. Unless they pulled some magic out of their ass and gained CUDA core efficiency.
What's next, 24GB x80 🙄?
EL1TE:

Except it has been like this for years? The 1070 beat the 980 Ti too, and the 970 beat the 780 Ti (there was no 800 series for desktops); nothing new here. I don't know where the people calling this BS have been. New to PC hardware?
Actually, the 1070 is slower than the 980 Ti when both are clocked at max; same story for the 980 and 1060. An 1170 won't beat a 2100 MHz 1080 Ti unless the architecture is radically different.
EL1TE:

Except it has been like this for years? I don't know where the people calling this BS have been. New to PC hardware?
I see where you're coming from, but:
HardwareCaps:

For Pascal we saw a huge node improvement, from 28nm to 16nm, and Maxwell was a huge jump with a new architecture design; I just don't see that happening for the next-gen cards. All indicators point to a refreshed Pascal with some extra features and an improved "12nm" node (still the same tech). Also, I'd say the next-gen cards are still far away; Nvidia has no reason to release cards while Pascal stock is high and there's no competition.
I think it is very reasonable to expect the 1170 to beat (or at least come close to) the 1080, but I don't think it's likely to beat the 1080 Ti. EDIT: If the 1170 does beat a 1080 Ti, I'd be curious how much more expensive it'd be vs. the 1070.
The only way the 1170 beats the 1080 Ti is if it ships at ~500-600 mm², which would put the 1180 at ~800 mm². 12nm, even on the 6.5T library, doesn't offer much in the way of power/speed improvements... 10% at most.
Why is everyone so riled up about this? Even if it's fake, the 1170 beating, or at least matching, the 1080 Ti is what one would expect. Actually, people would raise eyebrows if this weren't the case. What's so weird about it? I don't get it :-/