GeForce GTX 2070 and 2080 Could Launch Summer 2018
Glidefan
Moderator
Hm. If it's Pascal, then that means no Tensor cores, so perhaps no raytracing capabilities? Unless that DX12 feature doesn't need tensor or tensor-like units.
I would expect them to ship a card with that feature, since they showed it working and engines are trying to get ready for it.
AMD said they are also working on that, if I remember correctly.
Hilbert Hagedoorn
Administrator
DX12 raytracing is supported, and the RTX library as well. However, some features can be accelerated by Tensor cores. So no, Tensor cores are not mandatory for DX raytracing.
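For context, here is a minimal sketch (not from the article; the helper name is hypothetical) of how an engine would ask D3D12 whether a device supports DXR. The point is that it is a plain feature query on the device; nothing in the API asks about Tensor cores or any other specific hardware unit:

```cpp
#include <windows.h>
#include <d3d12.h>

// Hypothetical helper: returns true if the device reports DXR tier 1.0+.
// How the driver implements raytracing (Tensor cores or not) never
// enters the API surface.
bool SupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;  // older runtime: no raytracing support reported
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```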
ttnuagmada
I was under the impression that the ability to use AI to approximate a large chunk of the raytracing was the only thing that would make it possible to even do in realtime. Without the tensor cores and FP64, I didn't think Volta was really very different from Pascal apart from being on a newer fab process.
Noisiv
Volta's SMs are ~50% more efficient than Pascal's. And Volta itself is almost a year old(!)
So even if we assume they have been doing nothing but twiddling their thumbs, the new GTX series will be at least as efficient as Volta.
And that (50-60%) is pretty much the efficiency improvement necessary for the little-big core (GTX 1180) to win against the last-gen BIG core (1080 Ti); a rough sanity check is sketched below.
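A back-of-envelope check of that claim, assuming the new x80-class die keeps Pascal's SM split (GP104 in the GTX 1080 has 20 SMs, GP102 in the 1080 Ti has 28 active SMs) and roughly equal clocks. The numbers are illustrative, not leaked specs:

```cpp
#include <cstdio>

int main() {
    // Pascal's die split, used here as a stand-in for the next generation:
    const double little_sms = 20.0;  // x80-class die (GP104: GTX 1080)
    const double big_sms    = 28.0;  // big die (GP102: GTX 1080 Ti)

    // Per-SM throughput uplift the small die needs to match the big one,
    // assuming equal clocks:
    const double needed = (big_sms / little_sms - 1.0) * 100.0;
    printf("Required per-SM uplift: ~%.0f%%\n", needed);  // ~40%

    // Volta's quoted ~50% per-SM efficiency gain clears that bar,
    // which is the gist of the 50-60% argument above.
    return 0;
}
```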
fantaskarsef
1180Ti / 2080Ti / 1185Ti - 70-80% performance gain over a 1080Ti? Where's my wallet...
Ryrynz
Missed the /s there on Gb/s
Silva
I wish they would stop making bigger dies and make GPUs more affordable.
I know it won't happen unless someone releases a more powerful card for less money.
What about AMD? They should be working on their own GDDR6 controller; maybe another Polaris refresh?
Noisiv
16/12: 12nm is just an improvement of 16FF+.
You won't get anywhere near 50% better efficiency from improvements made on the same node. ~20-25% tops.
Same as from slightly lower clocks (only for Titan vs. Titan; the mezzanine cards are similarly clocked).
And even then, who cares about clocks if the performance and efficiency are there? It can run at 1 MHz for all I care.
Oh, and... almost forgot... scratch that clock advantage completely, because Titan V comes with 1/2-rate FP64 and with Tensor cores, which certainly weighs more in terms of efficiency than a measly 120 MHz.
Nah... 12nm is built on the same node as 16FF+.
Kaarme
Nvidia could probably afford to charge more for the new architecture. Ethereum ASICs are apparently appearing, so miner demand for GPUs should be losing steam somewhat. If the market is flooded with cheap second-hand cards, it seems like selling new stuff would be harder. And if it's harder, they might as well sell fewer units for more profit. That should be possible if the new generation beats the old handily in performance, at least towards the upper end. Plus, a new architecture is always more exciting.
Who knows what's going to happen to AMD. If they haven't got anything new to offer, it's hard to see how they'd sell much of anything. HBM2 probably still isn't making things easy for them. Perhaps they will just weather a near-extinction of their GPU side until they are ready to make a sudden comeback, like they did with Ryzen in the CPU market.
nz3777
Hilbert AngerDoom keeps us all up to date with PC info; the least we can do is spell his name correctly.
Is that Irish? Scottish?
Denial
Idk, everyone said the 'GTX 1080' wouldn't exist either because consumers might get confused with the 1080p resolution, but it does, so...
Irenicus
Umm, sorry, but "Ampere" is where we get the word "amps"; it comes from André-Marie Ampère, the father of electrodynamics, not some obscure website. I suppose you think Tesla comes from the car name? LOL
Reddoguk
It's because we want a card that finally gives unbeatable bang for the buck: an 1160 with 8 GB of GDDR6 beating a 1080 for a third of the price, and third-party 1160s, factory OC'd, closing in on a 1080 Ti.
Xmas release, gg, thanks nVidia.
The 1170 will come in two flavors, an 8 GB and a 16 GB version; the 1180 will be 16 GB only.
Aura89
Waiting on GDDR6 for new graphics cards to release?
Yawn~
If GDDR6 weren't such a disappointment I could see waiting, but currently it's just so boring and barely even better (and in some cases not better at all).
-Tj-
That cook picture would be ideal then 😀
Edit: nvm, I read the article again and saw it at the end 🙂