NVIDIA To Launch GeForce GTX 1060 with 5GB Graphics Memory

Fender, they're probably retiring the fucking 3GB because, TBH, we all know the real minimum is 4GB now. This makes perfect sense to do. Also, maybe they have some chips that aren't 100% and this makes them viable. Regardless, I think this is a good move.
xIcarus:

I agree with you, but there are unfortunately WAY too many tech-illiterate people. For example, when I was little I wasn't so literate when it came to stuff like this, so I bought a 5700 LE instead of a regular 5700. I naively thought it was an improved/revised version of the 5700 at a better price. Had the naming scheme been proper, it would have been an FX 5650 or even lower than that; something I wouldn't have considered.
But whose fault was that? Yours, for not doing any kind of research and making sure you're getting what you want, or NVIDIA's, for making more than one product that you didn't understand? People need to take responsibility for themselves. Being naive or technologically illiterate but making decisions anyway is not an excuse. Either you know what you're buying, or you don't. You don't go to a car dealership and expect to get everything you saw in the commercial on the lowest-cost version of the car. You ask, do your research, find out what features you want, don't want, what is possible, and the price differences, and then you make your decision. Buying electronics, especially specific computer components, is no different.

I do not feel sorry for people who go out and buy a premade PC for $600 that was labeled "gaming" and then complain that it doesn't miraculously perform as well as a graphics card worth $600 by itself, which I have heard people say they expected many times, because the computer had "gaming" on the label.

Bottom line: people will always be confused by PC component naming, no matter how "simplified" they do or do not try to make it. On at least three separate occasions I have been told some version of this about the GT 1030: "I have a GTX 960, but that GT 1030 is a really good price! And I can sell my GTX 960 for more than the GT 1030 and get extra performance! I'm going to get it!" Obviously I asked why they thought a GT 1030 would outperform a GTX 960, and their reasoning was: "Because it's a next-gen graphics card! Mine is a 9 (or insert other number here), and that's a 10!" So again, with how easy it is to figure this stuff out, with that website I posted for one, it's just not excusable to say "I didn't know what I was getting was so much worse than another product."
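For what it's worth, even a quick back-of-the-envelope comparison shows why that "mine is a 9, that's a 10" logic falls apart. The numbers below are the reference-spec figures as I remember them, so treat them as approximate:

# Rough comparison of the two cards mentioned above (reference specs from memory, approximate).
cards = {
    #           (CUDA cores, boost clock in GHz, memory bandwidth in GB/s)
    "GT 1030": (384, 1.47, 48),
    "GTX 960": (1024, 1.18, 112),
}

for name, (cores, clock_ghz, bandwidth) in cards.items():
    # Peak FP32 throughput = cores * 2 ops per clock (FMA) * clock
    tflops = cores * 2 * clock_ghz / 1000
    print(f"{name}: ~{tflops:.1f} TFLOPS FP32, {bandwidth} GB/s memory bandwidth")

The GTX 960 comes out at roughly twice the raw throughput, generation number notwithstanding.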
xIcarus:

I think they should make more use of the extra numbering space they have. They could have easily named the 3GB GTX 1060 a GTX 1055 in this case. Or 1070 and 1075 instead of the 1070 Ti, and 1085 instead of the 1080 Ti. Not touching 1090, since those should historically be dual-GPU cards.
I'm not saying I disagree with this. In my opinion, their original release should have been a GTX 1060 with 3GB and a GTX 1060 Ti with 6GB. The problem is that if they decide to add a product later, like they are doing now, they would have had to think ahead of time about where they were going. I'm certain this 5GB version of the card was never even considered when they originally released the 1060s. Again, if they had known, it could have been a 1060 with 3GB, a 1060 Ti with 5GB, and a 1065 with 6GB. But the simple fact of the matter is that they don't have that luxury now, because they went with a plain 1060 3GB and 1060 6GB at release (which, again, I think should have been 1060 and 1060 Ti, but even that would not fix the problem of adding a 1060 5GB later).

As for a rename, that would be a bad idea; quite frankly, with how much better a 1060 3GB is than a 1050 or 1050 Ti, that card has no business being in the 105x line of cards. That would be super, super confusing.

I'm not really sure what you have against the Ti naming, though. As with any product, if I see that there is a "number here" and a "same number + Ti", no matter the product, I'm going to check what the differences are. That is my responsibility, no one else's. And it's not specific to computer components: cars, phones, random household products, etc. all have different model numbers with letters attached. My car comes in S, SL, SE and EL trims, and each of those has different features that can be added or not. Again, it's my responsibility to find out all that information, no one else's. With how common it is for virtually every product to come in multiple models with similar names denoting slight and sometimes major differences, I just don't understand how people are NOT doing their due diligence.
schmidtbag:

Riiiight.... So you're telling me that you legitimately can't pay the difference between the 3GB and 6GB model? If you're on that much of a budget, how about don't get a 1060? There are plenty of really good but older GPUs out there for the same (or lower) price. This 5GB model will do nothing but close a price gap that already wasn't that big to begin with. So if this is what you are so adamant about disagreeing over, I can't imagine what other petty things you will also argue about.
I'm not sure he's saying he can't spend over $200; he's saying there's a limit on importing things above $200 into whatever country he is in. That being said, I don't really understand this, and after looking it up, I can't really find information on it either. What I can find is that in some places there may be a fee for items above $200, whereas there is no fee below $200, but that's not a "limit". So I don't really understand. If there are actually countries out there that cap imports at $200 in value, that is pretty sad, but at the same time, I can't imagine someone living in a country where they have no access at all to cards above $200; maybe at a crazy spike in price, but still.
Aura89:

But whose fault was that? Yours, for not doing any kind of research and making sure you're getting what you want, or NVIDIA's, for making more than one product that you didn't understand? People need to take responsibility for themselves. Being naive or technologically illiterate but making decisions anyway is not an excuse. Either you know what you're buying, or you don't. You don't go to a car dealership and expect to get everything you saw in the commercial on the lowest-cost version of the car. You ask, do your research, find out what features you want, don't want, what is possible, and the price differences, and then you make your decision. Buying electronics, especially specific computer components, is no different. I do not feel sorry for people who go out and buy a premade PC for $600 that was labeled "gaming" and then complain that it doesn't miraculously perform as well as a graphics card worth $600 by itself, which I have heard people say they expected many times, because the computer had "gaming" on the label. Bottom line: people will always be confused by PC component naming, no matter how "simplified" they do or do not try to make it. On at least three separate occasions I have been told some version of this about the GT 1030: "I have a GTX 960, but that GT 1030 is a really good price! And I can sell my GTX 960 for more than the GT 1030 and get extra performance! I'm going to get it!" Obviously I asked why they thought a GT 1030 would outperform a GTX 960, and their reasoning was: "Because it's a next-gen graphics card! Mine is a 9 (or insert other number here), and that's a 10!" So again, with how easy it is to figure this stuff out, with that website I posted for one, it's just not excusable to say "I didn't know what I was getting was so much worse than another product."
Not denying that it was my fault; it obviously was, and people should always do their own research. However, the companies could be more forthcoming with their naming conventions. Right now we have a 3GB, a 5GB and a 6GB 1060. Many stores won't even discriminate between them, and you'll get all three versions when you filter for a 1060, so you have to look carefully for that too. A 1055 would have made a ton more sense than a 3GB 1060.

Intel is even worse at this; their mobile naming scheme is an even more extreme example of where things went wrong. You have dual-core i3s, dual-core i3s with hyperthreading, dual-core i5s, dual-core i5s with hyperthreading, dual-core i7s with hyperthreading and quad-core i7s with hyperthreading. As if that's not enough, some freaking Pentiums have hyperthreading too now. What the sh1t, seriously. To make matters worse, the new 8th-gen low-power chips are quad cores, but there are no mobile 6-cores in the lineup like there are on desktop.

Look, all I'm saying is that I should instantly be able to figure out a product's characteristics from its name. I expect a 3GB 1060 to be a 1060 with less RAM and that's it. Instead I'm getting a dumbed-down 1060 with less RAM. I also expect a bloody i5 mobile chip to NOT have hyperthreading, as has been customary until now. Well, no longer: 8th-gen i5 chips have hyperthreading, so that's one less distinguishing feature compared to an i7. Can I figure all this out by researching? Yes. But why do I have to look up every damn chip to see its major features instead of having them encoded in the naming scheme? Why don't HT processors have a different suffix? Why aren't they at least distinguished by a core count that is consistent across the different series (i3, i5, i7)? Why does a dumbed-down chip (the 1060 3GB) have the same name as the full chip? This is simply naming sodomy to me, no matter how I try to look at it. At least we don't have that retarded 'LE' suffix anymore. Now we have U processors, which are basically majorly castrated chips, but they bear the same numbers as the full chips. What.
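On the hyperthreading point: if you just want to know whether a given chip actually has it, rather than trying to decode the model name, a couple of lines of Python will tell you. A rough sketch, assuming the third-party psutil package is installed:

# Compare physical cores to hardware threads; more threads than cores means SMT/Hyper-Threading.
import psutil

physical = psutil.cpu_count(logical=False)  # physical cores (may be None if undetectable)
logical = psutil.cpu_count(logical=True)    # hardware threads

print(f"{physical} cores / {logical} threads")
if physical and logical and logical > physical:
    print("SMT/Hyper-Threading is enabled")
else:
    print("No SMT/Hyper-Threading detected (or it is disabled)")

Not a fix for the naming mess, but it saves digging through spec pages for every chip.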
Ryrynz:

Fender, they're probably retiring the fucking 3GB because, TBH, we all know the real minimum is 4GB now. This makes perfect sense to do. Also, maybe they have some chips that aren't 100% and this makes them viable. Regardless, I think this is a good move.
You're probably right about that. But it makes me wonder whether the 5GB version will be more powerful than the 3GB version, given that it only has a 160-bit bus good for about 160 GB/s versus the 192 GB/s the 6GB and 3GB cards get, while it has the same number of CUDA cores as the 6GB version.
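For reference, those bandwidth numbers fall straight out of bus width times memory data rate. A quick sketch, with the assumption that the 5GB card keeps the same 8 Gbps effective GDDR5 as the 3GB and 6GB models:

# Peak memory bandwidth in GB/s = (bus width in bits / 8) * per-pin data rate in Gbps.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gbs(192, 8.0))  # GTX 1060 6GB / 3GB: 192.0 GB/s
print(bandwidth_gbs(160, 8.0))  # GTX 1060 5GB: 160.0 GB/s (memory speed assumed, not confirmed)

So the 5GB card would trade roughly 17% of the memory bandwidth for the full core count of the 6GB model, and which of those matters more will depend on the game.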