Nvidia talks Pascal: 16 GB memory at 1 TB/s bandwidth

Very much looking forward to Volta, both for architectural changes and performance gains.
> Very much looking forward to Volta, both for architectural changes and performance gains.
Already looking for Volta? I'd say Pascal looks promising enough, and as a 4K gamer it looks like a must-have to me.
Let's hope AMD brings some more competition to the table. Otherwise we'll have to shell out 1200+ bucks again for the new Titan.
> Already looking for Volta? I'd say Pascal looks promising enough, and as a 4K gamer it looks like a must-have to me.
Yes, already looking for Volta, since I don't see much sense in upgrading from Maxwell to Pascal. Sadly, I wouldn't say you'll be fine with everything just by getting yourself a single 'big' Pascal at 4K... that's why I hope Volta will bring a constant 60 fps at 4K with everything looking nice. On the other hand, I am mostly hoping for architectural changes or advances, particularly in how the work queues are treated and how the driver will work with DX12 (and I expect Pascal to still show some weaknesses there at first).
For sure Big Pascal will play stuff at 4K @ 60 fps, just like the 980 Ti does now in some titles. Volta will do the same, but far more consistently. Next year should be interesting with Pascal, new AMD GPUs, and Zen.
I personally won't be considering any new hardware of any kind until Nvidia and/or AMD figure out a better way to handle 4K screens. You can keep throwing more memory (and memory bandwidth) and tighter transistor designs at these GPUs, but it doesn't seem to be helping enough. 4K is hurting performance on today's hardware the same way AA did 10 years ago. Not that AA's performance hit was ever properly addressed either... In the meantime, I'm fine with 1080p.
> Let's hope AMD brings some more competition to the table. Otherwise we'll have to shell out 1200+ bucks again for the new Titan.
I hope you're trolling...
> I personally won't be considering any new hardware of any kind until Nvidia and/or AMD figure out a better way to handle 4K screens. You can keep throwing more memory (and memory bandwidth) and tighter transistor designs at these GPUs, but it doesn't seem to be helping enough. 4K is hurting performance on today's hardware the same way AA did 10 years ago. Not that AA's performance hit was ever properly addressed either... In the meantime, I'm fine with 1080p. I hope you're trolling...
Maxwell and Fury are both built on the same transistor size as the previous generation, and both significantly improve performance at 4K over it. If you increase the number of pixels by 4x, you need roughly 4x the GPU power to hit the same FPS. The same thing was the case with 1080p, and it will be the same again with 8K. It just takes time for GPU power to catch up. I'm with you; I'm also sticking with 1080p for the time being. I value framerate over anything else.
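To put numbers on that, here's a quick back-of-the-envelope sketch in Python. The resolutions are the standard figures, and the linear pixels-to-throughput scaling is the post's own simplification, not a measured fact:

```python
# Rough sketch of the pixel arithmetic above: quadrupling the pixel count
# roughly quadruples the per-frame shading work, so (all else equal) you
# need ~4x the GPU throughput for the same FPS.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base = RESOLUTIONS["1080p"][0] * RESOLUTIONS["1080p"][1]

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:>10,} pixels  (~{pixels / base:.1f}x 1080p)")
```

Running it shows 4K at exactly 4.0x the pixels of 1080p, and 8K at 16.0x, which is the "same again" jump the post mentions.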
I'm ready! Can't wait to see my performance skyrocket with Pascal.
> Maxwell and Fury are both built on the same transistor size as the previous generation, and both significantly improve performance at 4K over it. If you increase the number of pixels by 4x, you need roughly 4x the GPU power to hit the same FPS. The same thing was the case with 1080p, and it will be the same again with 8K. It just takes time for GPU power to catch up. I'm with you; I'm also sticking with 1080p for the time being. I value framerate over anything else.
I agree, but they still fall short. I don't remember hardware struggling as much when 1080p started to get popular. The thing is, we're starting to reach the limits of Moore's law. Back when 1080p went mainstream, we had a long way to go in terms of electrical refinement and overall architectural design. It's going to be hard to fine-tune something that's already near its limits. Companies can keep adding more cores or frequency, but efficiency is going to be an issue.
> Pascal should allow 4K gaming with one GPU. It has to, or it won't sell like the 980/980 Tis did.
According to the Steam Hardware Survey for October 2015, less than one tenth of one percent of users are running 4K monitors:
Primary Display Resolution:
1366 x 768  = 26.43%
1600 x 900  =  7.14%
1920 x 1080 = 34.90%
2560 x 1440 =  1.27%
2560 x 1600 =  0.08%
3440 x 1440 =  0.06%
3840 x 2160 =  0.07%
There are 18 times as many people running QHD displays as there are running 4K. The idea that 4K is relevant to the actual market, rather than just the marketing, is a myth. The 900 series has sold well for Nvidia because it's a good product: they went up in VRAM, which everyone wants more of; the cards run quiet and cool; they get great fps-per-dollar; the power consumption is a bonus; and there's very little competition from AMD. If they just improve performance and VRAM again, Pascal will sell well regardless of 4K performance. Obviously 4K performance will be better than Maxwell's, but how much better is all but irrelevant to their sales figures.
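For anyone who wants to check the arithmetic, a quick sketch using only the survey percentages quoted above:

```python
# Sanity-checking the "18 times as many" claim with the October 2015
# Steam Hardware Survey percentages quoted in the post.
survey = {
    "1366 x 768":  26.43,
    "1600 x 900":   7.14,
    "1920 x 1080": 34.90,
    "2560 x 1440":  1.27,
    "2560 x 1600":  0.08,
    "3440 x 1440":  0.06,
    "3840 x 2160":  0.07,
}

ratio = survey["2560 x 1440"] / survey["3840 x 2160"]
print(f"QHD (2560 x 1440) vs 4K share: ~{ratio:.0f}x")  # ~18x
print(f"4K share: {survey['3840 x 2160']}%")            # under a tenth of a percent
```

Here "QHD" is taken as 2560 x 1440 only; 1.27 / 0.07 comes out to about 18.1, matching the claim.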
> I agree, but they still fall short. I don't remember hardware struggling as much when 1080p started to get popular. The thing is, we're starting to reach the limits of Moore's law. Back when 1080p went mainstream, we had a long way to go in terms of electrical refinement and overall architectural design. It's going to be hard to fine-tune something that's already near its limits. Companies can keep adding more cores or frequency, but efficiency is going to be an issue.
Going to 1080p from 1280x1024 or 1024x768 is only a 1.6x or 2.6x increase in pixels. The jump to 4K is big! They keep saying we're reaching the limit of Moore's law, and yet it is still being met every year; if it hasn't been in the desktop space, it's only because Nvidia and Intel are drip-feeding technology due to lack of competition.
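A quick sketch to verify those ratios (standard resolution figures assumed):

```python
# Checking the jump sizes mentioned above.
def pixel_ratio(src, dst):
    """Ratio of total pixels between two (width, height) resolutions."""
    return (dst[0] * dst[1]) / (src[0] * src[1])

print(f"{pixel_ratio((1280, 1024), (1920, 1080)):.1f}x")  # 1280x1024 -> 1080p: ~1.6x
print(f"{pixel_ratio((1024, 768), (1920, 1080)):.1f}x")   # 1024x768  -> 1080p: ~2.6x
print(f"{pixel_ratio((1920, 1080), (3840, 2160)):.1f}x")  # 1080p     -> 4K:     4.0x
```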
> For sure Big Pascal will play stuff at 4K @ 60 fps, just like the 980 Ti does now in some titles. Volta will do the same, but far more consistently. Next year should be interesting with Pascal, new AMD GPUs, and Zen.
I am looking at Volta as a possible next upgrade; I will skip Pascal.
> Going to 1080p from 1280x1024 or 1024x768 is only a 1.6x or 2.6x increase in pixels. The jump to 4K is big!
That may be so, but hardware performance has dramatically increased too. IGPs in modern PCs and tablets (if you include Tegra) can outperform some of the best GPUs from back when 1080p went mainstream.
> They keep saying we're reaching the limit of Moore's law, and yet it is still being met every year; if it hasn't been in the desktop space, it's only because Nvidia and Intel are drip-feeding technology due to lack of competition.
Reaching the limit doesn't mean we have reached it. There is obviously room for improvement, but not much. Intel may be drip-feeding their CPUs, but not their GPUs, and Nvidia is still actively working on improving their hardware. AMD is plenty competitive against Nvidia; the only cards Nvidia has real leverage with are the 750 Ti and 980 Ti, and the 980 Ti is too expensive for most people.
Moore's law (if you can even call it a law) is about transistor count, not performance. It's also been adjusted from a year to 18 months since its inception. As for fabrication, it definitely will slow down. Intel's only real way of competing in ARM's space is with smaller processes, and they've been hiring the best and brightest engineers on the planet to do just that. But classical physics can no longer be used to govern the properties of silicon (or any material, really) past 14nm. So for the first time they are applying quantum theories that haven't been fully tested to actual physical applications. On the math end, it was easy to simulate and predict how things would behave up until this point. Now they have to hire physicists to solve decades-old physics problems in order to continue. Unless there is some major unifying breakthrough in physics, I don't see any semblance of Moore's law continuing. In fact, with Intel's 14nm delay, it's already dead for the most part in the consumer space.
> According to the Steam Hardware Survey for October 2015, less than one tenth of one percent of users are running 4K monitors. There are 18 times as many people running QHD displays as there are running 4K. The idea that 4K is relevant to the actual market, rather than just the marketing, is a myth. The 900 series has sold well for Nvidia because it's a good product: they went up in VRAM, which everyone wants more of; the cards run quiet and cool; they get great fps-per-dollar; the power consumption is a bonus; and there's very little competition from AMD. If they just improve performance and VRAM again, Pascal will sell well regardless of 4K performance. Obviously 4K performance will be better than Maxwell's, but how much better is all but irrelevant to their sales figures.
Wow, I would have thought 720p would be the most common, not 1080p. My Steam rig streams at 720p for latency purposes. I say bring it on; I'm finally starting to dive into 4K gaming and I'm loving it so far.
I'm more than likely going to buy one of these next year. I really need to replace my two 7970s; I want one single awesome GPU now!
> Moore's law (if you can even call it a law) is about transistor count, not performance. It's also been adjusted from a year to 18 months since its inception.
Transistor count is often (though not always) directly correlated with performance: if you can fit more transistors into a smaller package under the same architecture, you will generally get more performance. But suppose we were to reach the physical limits of silicon transistors; an architecture will then have reached its maximum speed. If you want more performance out of the same design, you'll have to either:
* Increase the frequency (which draws more power)
* Add more cores (which also draws more power)
* Add more cores but lower the frequency (which won't always help performance)
* Or optimize the software (very undependable)
Otherwise, you'll have to design a new architecture. But honestly, there's only so much you can do to get the perfect ratio of performance, TDP, and cost. There's still a long way until we reach that limit, but performance improvements have been noticeably slowing down. We're going to have to move on from silicon eventually. I think there's plenty of room for silicon to get us decent performance at 4K, but it's going to require some re-thinking of GPU architectures; I don't think smaller fabrication nodes, tri-gate transistors, or HBM2 memory are enough.
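As an aside, here is a toy model of that frequency-versus-cores trade-off, built on the standard CMOS dynamic-power relation P ~ C * V^2 * f. Every constant below is invented purely for illustration; only the shape of the results matters, not the absolute numbers:

```python
# Toy model: why raising clocks costs more power than adding cores.
def dynamic_power(freq_ghz, volts, cap=1.0):
    """Dynamic power, proportional to capacitance * voltage^2 * frequency."""
    return cap * volts ** 2 * freq_ghz

def volts_for(freq_ghz):
    """Assumed (made-up) voltage curve: higher clocks need more voltage."""
    return 0.8 + 0.2 * freq_ghz

for cores, freq in [(1, 2.0), (1, 3.0), (2, 2.0)]:
    perf = cores * freq  # idealized: performance scales with cores * clock
    power = cores * dynamic_power(freq, volts_for(freq))
    print(f"{cores} core(s) @ {freq} GHz: perf={perf:.1f}, "
          f"power={power:.2f}, perf/W={perf / power:.2f}")
```

Under these made-up numbers, a 50% clock bump roughly doubles power (perf/W drops), while doubling the core count doubles power and performance together, which is why "more cores at lower clocks" is usually the more efficient lever.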
> That may be so, but hardware performance has dramatically increased too. IGPs in modern PCs and tablets (if you include Tegra) can outperform some of the best GPUs from back when 1080p went mainstream.
Possibly, but in-game graphics have also improved; Battlefield 4 or GTA V would be unplayable (with high graphics settings) at 1080p on an 8800 GTX. Since 4K has been generally available (Guru3D only started doing 4K reviews with the release of the 780 Ti), the best GPUs have improved by about 50%. That's nowhere near the roughly 4x needed, and 4x won't be seen for quite a long time.
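Taking those figures at face value, a quick sketch of how long the catch-up would take, assuming (purely for illustration) that each generation repeats a similar ~50% jump:

```python
# How long until single-GPU 4K catches up, under the post's own numbers:
# ~50% improvement per generation (an assumption, not a measured trend)
# against the ~4x total needed over the 1080p baseline.
import math

gain_per_generation = 1.5  # +50% per generation (assumed)
target = 4.0               # 4x the pixels -> roughly 4x the throughput

generations = math.log(target) / math.log(gain_per_generation)
print(f"~{generations:.1f} generations at +50% each")  # ~3.4 generations
```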
> On tech forums there is too much "hype" for 4K and 1440p, while fewer than 2% of people use 1440p and fewer than 0.3% use 4K. HD and Full HD are dominant: HD because laptops still use 1366x768, and most desktops are 1080p.
Too much hype!? Not sure what you're talking about here. I game at 1440p and have to say I much prefer the 27" screen size and the higher res; I have no interest in going back to 1080p. No doubt I'll feel the same if I go 4K. Why would I want to go back? It's the same with 120Hz, 144Hz, and 165Hz: once you go higher, damn near no one wants to go back. It's just better, lol. It's just not mainstream yet, mostly because it requires a lot of money, especially 4K.
> I game at 1440p and have to say I much prefer the 27" screen size and the higher res; I have no interest in going back to 1080p. No doubt I'll feel the same if I go 4K. Why would I want to go back? It's the same with 120Hz, 144Hz, and 165Hz: once you go higher, damn near no one wants to go back. It's just better, lol.
I see what you mean, but you'd be surprised how quickly you can adapt. Try playing an N64 on an old CRT TV and you're probably going to think "holy hell, this is an ugly, awful experience," but three hours later it's suddenly not so bad anymore. Our brains are great at filling in gaps and refining detail that isn't there; it just takes a moment. Many people don't see a reason to upgrade because 1080p is usually "good enough". 4K looks incomparably better than 1080p, but until you see it for yourself you'll never believe it, because your brain does such a good job of making 1080p look better than it really is. This is one of those cases where "ignorance is bliss" is very true.