NVIDIA Very Likely To Reveal GeForce GTX 1180 on August 20

H83:

I understand perfectly that for your specific case a 15% improvement is not worth buying a new card, it's normal. But I think guys like you who want better cards to power 4K screens are the minority, so Nvidia is probably going to neglect them in this next generation... Especially because they have little to no competition... Personally I'm going to skip this gen unless Nvidia releases something really great and unexpected.
Yes, I understand your reasoning. Although I just wanted to say: I'm gaming at 1440p here, so I could wait even longer for an upgrade than someone at 4K. But... there's the itch... 😀
If it's a weak performance gain, then maybe in another 6-12 months we'll see the real beef @ 7nm, timed in accordance with what AMD is doing in that time frame.
Yeah... Duh... 😀.
H83:

I understand perfectly that for your specific case a 15% improvement is not worth buying a new card, it's normal. But I think guys like you who want better cards to power 4K screens are the minority, so Nvidia is probably going to neglect them in this next generation... Especially because they have little to no competition... Personally I'm going to skip this gen unless Nvidia releases something really great and unexpected.
Nvidia should stop pushing 4K TVs and monitors then. They basically removed my ability to buy an HDR 1440p monitor, because the only ones available are FreeSync, which they don't support - but then if the performance gain is only 15% on the new cards, they also removed my ability to play games at 4K 60Hz on their new monitors. Even at the Titan V's 30-35% gain, 4K 60 just becomes playable in most newer titles; beyond 60Hz it's still relatively unplayable. So they'd basically need 50%+ performance at minimum to cross that magic "threshold" where it becomes worth it. Honestly the G-Sync lock-in is really starting to get frustrating... the selection of G-Sync monitors is pathetic compared to FreeSync, and the advantages G-Sync had are gone now unless you're willing to shell out $2000 for a monitor no card can drive in newer, graphics-heavy titles.
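To put rough numbers on that threshold argument, here's a minimal back-of-the-envelope sketch in Python; the 45 fps baseline is purely an assumed figure for a demanding title at 4K, not a measured benchmark result:

```python
# Back-of-the-envelope check of the "50%+ threshold" argument.
# baseline_fps is a hypothetical 1080 Ti average at 4K in a demanding
# title - an assumption for illustration, not a benchmark result.
baseline_fps = 45.0

for gain in (0.15, 0.35, 0.50):
    projected = baseline_fps * (1 + gain)
    verdict = "clears 4K 60" if projected >= 60 else "still under 60"
    print(f"+{gain:.0%}: {projected:.0f} fps ({verdict})")
```

Under that assumed baseline, +15% lands around 52 fps (still short of 60), +35% just barely clears 60, and only +50% gives any real headroom, which lines up with the "just becomes playable" framing above.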
Hence why Nvidia will be selling old cards as new; they have tons of stock and no one is buying... I won't hold my breath.
Denial:

Nvidia should stop pushing 4K TVs and monitors then. They basically removed my ability to buy an HDR 1440p monitor, because the only ones available are FreeSync, which they don't support - but then if the performance gain is only 15% on the new cards, they also removed my ability to play games at 4K 60Hz on their new monitors. Even at the Titan V's 30-35% gain, 4K 60 just becomes playable in most newer titles; beyond 60Hz it's still relatively unplayable. So they'd basically need 50%+ performance at minimum to cross that magic "threshold" where it becomes worth it. Honestly the G-Sync lock-in is really starting to get frustrating... the selection of G-Sync monitors is pathetic compared to FreeSync, and the advantages G-Sync had are gone now unless you're willing to shell out $2000 for a monitor no card can drive in newer, graphics-heavy titles.
Well, you can play games on a 4K monitor with a 1080 Ti as long as you make some compromises with the graphics/visual options. It's not like it's mandatory to max out every graphical setting, especially because lots of those settings offer insignificant improvements while demanding a big performance hit... As for G-Sync, let's just say I hate proprietary stuff in PC gaming and I'm waiting for it to die as soon as possible. Nvidia has to adopt FreeSync!
H83:

Well, you can play games on a 4K monitor with a 1080 Ti as long as you make some compromises with the graphics/visual options. It's not like it's mandatory to max out every graphical setting, especially because lots of those settings offer insignificant improvements while demanding a big performance hit...
True - I'd really like to see more reviewers try games at 4K with AA off. AA often butchers performance, and at 4K the visual difference is likely too small to make the heavy performance impact worth it. I'd much rather have AA off than have my frame rate dip below 45 FPS.
As for G-Sync, let's just say I hate proprietary stuff in PC gaming and I'm waiting for it to die as soon as possible. Nvidia has to adopt FreeSync!
Probably never going to happen. Nvidia would rather see a technology abandoned than imply that AMD (or Intel, or Microsoft, or VESA, or Khronos, etc. - doesn't matter who) had a better way of going about something. There are only 2 reasons why I avoid Nvidia's products, and one of them is their arrogance (the other being they're a bit too expensive for my taste). To be fair, Nvidia has valid reasons for their arrogance, but it really gets in the way of progress. Just as a side note, Adaptive-Sync is the VESA standard that Nvidia should support; FreeSync is AMD's branding of it, so Nvidia wouldn't use that name no matter what.
fantaskarsef:

While this may be true, users of Maxwell cards ("generation 900") could have switched over to Pascal ("generation 1000"), if not for the still-high prices. There are still users here (talking about you @-Tj- 😀) who are waiting for an upgrade, but not at more than MSRP. You are absolutely right, another 15% on top of Pascal could make an upgrade at MSRP more interesting for them; I didn't think about that. And Pascal upgrades to what comes next ("generation 1100") might not be that useful (especially if they're paired with a G-Sync monitor already), that's also true, but most of all, 4K gamers will definitely want to upgrade their 1080s / 1080 Tis, just to keep those 60fps without turning graphics settings down. That's basically what I was thinking about when typing my second post in this thread. I have a G-Sync monitor, I don't care if I game at 142 fps or 130... that's why I was asking out loud if it really is a worthy upgrade to gain 15%, and for me, it most likely won't be worth the money. I would just have one more reason to be happy with my monitor; it cost as much as a high-end card by itself, so now I could actually justify that investment 😀
Well, I don't play as much lately, but if the price is right I might buy it. I'm still thinking it will be $699 as was said at first, otherwise they won't make any real money out of it...
-Tj-:

I'm still thinking it will be $699 as was said at first, otherwise they won't make any real money out of it...
I have a really hard time understanding how you came to that conclusion... The vast majority of the expense you're paying for is the engineering, not the physical product itself. Seeing as this is a rebranded product that is nearing 3 years old, their engineering costs have long been paid for. Let's not forget Nvidia's huge success in the server market (where profit margins are even higher) and in cryptocurrency mining. At $700, Nvidia is making some crazy profits, and I think their multi-billion dollar net revenue is sufficient proof of that. So yes, they will be making plenty of real money out of it, and the reason it's so expensive is the lack of competition.
That was an initial leak saying $399 for the 1170.
schmidtbag:

The vast majority of the expense you're paying for is the engineering, not the physical product itself. Seeing as this is a rebranded product that is nearing 3 years old, their engineering costs have long been paid for.
Eh - it's not like Nvidia hires more people during a GPU architecture project and lays them off afterward. They added 1,625 employees to their R&D team over the last three years, and their reported R&D costs have increased by 31% since last year - compared to 30% for the six years prior. The increased spending was for upgrades to facilities in their new HQ. So yeah, Pascal's R&D costs are probably paid for, but Nvidia is increasing spending significantly... what all those people are working on is anyone's guess.
It's bizarre that these cards are finally coming out. I remember doing my build this time last year, and people were saying to hold off on getting a GPU because the new cards were right around the corner. I can't justify getting a new card, or losing FreeSync for it, but I hope for others' sake the 11x0 cards have a nice performance bump and reasonable pricing (though I'm not that optimistic on either front).
Meh, I don't want to know, I just spent too much on this 1070 Ti.
EL1TE:

NVIDIA doesn't make good GPUs to make us happy; they make them to be the best so that people buy their products, which makes investors happy,
I always get a kick out of people who believe making customers happy and making investors happy are mutually exclusive, while implying moral superiority based on the company they align themselves with. I bet you also believe the rich becoming richer results in everyone else becoming poorer. Every company is in business to make money, and anyone claiming otherwise is insane. The idea is to provide the best product with the least expense in order to make a profit, and to drive the industry forward through reinvestment in new products - which in turn makes consumers happy, investors happy, and employees happy. This never-ending nonsense of people claiming how bad Intel/nVidia is and how virtuous AMD is (and those who buy their products) is just idiotic, and always has been.