Nvidia not going to support VESA Adaptive Sync


https://forums.guru3d.com/data/avatars/m/142/142454.jpg
The big question is whether either provides a superior gaming experience, as Nvidia implies G-Sync does. If there is no effective difference between the two solutions, G-Sync will die one day. It might take a long time, but I don't think even Nvidia can steamroller VESA in the long run. I just saw that AMD is part of VESA but Nvidia is not.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
As the Emperor has foreseen it.
https://forums.guru3d.com/data/avatars/m/39/39210.jpg
Why support something that doesn't make you extra $$? Typical
https://forums.guru3d.com/data/avatars/m/68/68055.jpg
Money must flow!
https://forums.guru3d.com/data/avatars/m/248/248627.jpg
Goodbye Nvidia and hello AMD. All this proprietary crap from Nvidia is getting really annoying. AMD actually seems like they care, even if it was Nvidia that pushed them to.
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Why support something that doesn't make you extra $$? Typical
Fact is G-Sync is here today. Where is Free Sync?
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
They should take a different approach and have some sort of adapter between the monitor and video card. But I'm not sure if that's possible.
https://forums.guru3d.com/data/avatars/m/231/231799.jpg
The big question is whether either provides a superior gaming experience, as Nvidia implies G-Sync does. If there is no effective difference between the two solutions, G-Sync will die one day. It might take a long time, but I don't think even Nvidia can steamroller VESA in the long run. I just saw that AMD is part of VESA but Nvidia is not.
Are there any comparison tests, or is it still too soon for them?
data/avatar/default/avatar04.webp
Yeah, AMD is kinda misleading on the FreeSync front, IMO. The full support from the 7000 series up to the R7/R9 only covers video playback and power saving. The only AMD cards that support FreeSync as an equal to G-Sync are the newer GCN cards (R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260); the rest don't support gaming FreeSync, whereas G-Sync actually supports the GTX 6xx, 7xx, 8xx and 9xx cards. Strange that they'd make it a standard when there are only six models of graphics card currently on the planet that support it, but I guess it's free.
https://forums.guru3d.com/data/avatars/m/142/142454.jpg
Are there any comparison tests, or is it still too soon for them?
I'm sure that if you look into the details of how both technologies work, you could more or less say exactly what the differences between them would be. As for a real-world test, I'm not sure there are any standalone displays that support Adaptive-Sync yet. I guess to have a truly fair test, you would need a display that supports both G-Sync and Adaptive-Sync. I wonder if that will ever happen?
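[Editor's note] To make the mechanical difference concrete, here is a toy simulation, purely illustrative and not either vendor's actual implementation. With a fixed-refresh panel, a finished frame waits for the next scheduled vblank; with an adaptive-refresh panel, the display can refresh as soon as the frame is ready, subject only to a minimum refresh interval. The refresh rates and frame times below are made-up numbers chosen for illustration.

```python
# Toy model: when do frames reach the screen under fixed-interval vsync
# vs. an adaptive-sync panel that can refresh any time after a minimum
# interval? (Real panels also have a maximum interval, below which they
# repeat the previous frame; that case is omitted here for simplicity.)

FIXED_INTERVAL = 1000 / 60   # fixed 60 Hz panel: vblank every ~16.7 ms
MIN_INTERVAL = 1000 / 144    # adaptive panel: 144 Hz upper refresh bound

def present_times(render_ms, mode):
    """Return the on-screen time (ms) of each frame, given per-frame render times."""
    t, last_refresh, out = 0.0, 0.0, []
    for r in render_ms:
        t += r                                  # frame finishes rendering at t
        if mode == "fixed":
            # wait for the next fixed vblank after the frame is ready
            n = int(t // FIXED_INTERVAL) + 1
            shown = n * FIXED_INTERVAL
        else:
            # adaptive: show as soon as ready, but no sooner than min interval
            shown = max(t, last_refresh + MIN_INTERVAL)
        out.append(shown)
        last_refresh = shown
    return out

renders = [14.0, 22.0, 18.0, 25.0]              # uneven frame times around 60 fps
fixed = present_times(renders, "fixed")
adaptive = present_times(renders, "adaptive")

# In this model, adaptive sync never shows a frame later than fixed vsync:
for f, a in zip(fixed, adaptive):
    assert a <= f
```

The gap between the two lists is the extra wait imposed by the fixed vblank grid, which is exactly the stutter/latency both G-Sync and Adaptive-Sync aim to remove; the remaining question the poster raises (how the two implementations differ in practice) can't be answered by a model this simple.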
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
Yeah, AMD is kinda misleading on the FreeSync front, IMO. The full support from the 7000 series up to the R7/R9 only covers video playback and power saving. The only AMD cards that support FreeSync as an equal to G-Sync are the newer GCN cards (R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260); the rest don't support gaming FreeSync, whereas G-Sync actually supports the GTX 6xx, 7xx, 8xx and 9xx cards. Strange that they'd make it a standard when there are only six models of graphics card currently on the planet that support it, but I guess it's free.
AMD has more than just graphics cards that are compatible...
AMD APUs codenamed "Kaveri," "Kabini," "Temash," "Beema" and "Mullins" also feature the necessary hardware capabilities to enable dynamic refresh rates for video playback, gaming and power-saving purposes.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
The way I see it, G-Sync will never be supported by AMD, and FreeSync will never be supported by Nvidia. Actually, both systems put the customer in the same situation: you have to match your monitor with your GPU.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
The way I see it, G-Sync will never be supported by AMD, and FreeSync will never be supported by Nvidia. Actually, both systems put the customer in the same situation: you have to match your monitor with your GPU.
Yeah, except Nvidia has the option AMD doesn't. Terrible move by Nvidia.
https://forums.guru3d.com/data/avatars/m/258/258664.jpg
Yeah, except Nvidia has the option AMD doesn't. Terrible move by Nvidia.
Well, I wouldn't have expected anything else. But as that technology is basically open to any GPU manufacturer, I guess it's implemented through the driver, correct? As Nvidia has the possibility to adopt it, they won't do so until they've gotten their profit out of G-Sync. The question, and it's a hard one to answer, is whether they'll ever change course and support both G-Sync and FreeSync once they've made millions with G-Sync.
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
Well, I wouldn't have expected anything else. But as that technology is basically open to any GPU manufacturer, I guess it's implemented through the driver, correct? As Nvidia has the possibility to adopt it, they won't do so until they've gotten their profit out of G-Sync. The question, and it's a hard one to answer, is whether they'll ever change course and support both G-Sync and FreeSync once they've made millions with G-Sync.
Unless their hand is forced, it's unlikely. NVidia looks for ways to lock people into their hardware. Proprietary "features" are the easiest way to do that. If they "develop" something, and make it a proprietary "feature" then people who want it are stuck with their hardware. If they make it open, then people have more choice. While choice is great for consumers, it's bad for businesses. The more choice people have, the more businesses have to be worried about. Of course, I do find it quite funny that nobody has an issue with NVidia pulling crap like this, but if AMD were to have done it, this thread would have gone into full blown AMD flaming by now.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
The AMD Radeon™ R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming. No other graphics card does, and who in their right mind would use an APU for gaming? It won't be nearly enough for 1080p, so there's no point at this time.

As for why Nvidia won't support it: they need to make back their investment in G-Sync, and until that happens there's no way in hell they'll support FreeSync. And the big thing, as mentioned many times, is that G-Sync is here now, while FreeSync is still on the drawing board (late stages, but still no market implementation).

As for the cost, G-Sync already replaces the scaler in those displays; the extra cost is a premium from the manufacturers, not Nvidia (a small part, maybe 50 bucks). The same will be true for FreeSync, I can guarantee that.
I mean, yeah, the FreeSync module definitely costs something; AMD has already indirectly stated that. And honestly, we have no idea how FreeSync compares to G-Sync: whether it will have the same game/driver issues, whether there is added latency compared to it, etc. I still think it's a ****ty move by Nvidia, though. I don't want to be locked into their video cards because I bought a monitor. They should support both, keep making G-Sync better through updates, and give a valid reason as to why it's better than the alternative.
https://forums.guru3d.com/data/avatars/m/196/196284.jpg
The AMD Radeon™ R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming. No other graphics card does, and who in their right mind would use an APU for gaming? It won't be nearly enough for 1080p, so there's no point at this time.

As for why Nvidia won't support it: they need to make back their investment in G-Sync, and until that happens there's no way in hell they'll support FreeSync. And the big thing, as mentioned many times, is that G-Sync is here now, while FreeSync is still on the drawing board (late stages, but still no market implementation).

As for the cost, G-Sync already replaces the scaler in those displays; the extra cost is a premium from the manufacturers, not Nvidia (a small part, maybe 50 bucks). The same will be true for FreeSync, I can guarantee that.
Not everyone games at 1080p. There's a large portion of the market still using 720p and 1050p displays. The A8 and A10 APUs are just fine for 720p and light 1080p gaming. Most of those using AMD's iGPU aren't trying to play the latest games at high resolutions or high graphics settings; instead, they find settings that are actually playable.
https://forums.guru3d.com/data/avatars/m/115/115616.jpg
The AMD Radeon™ R9 295X2, R9 290X, R9 290, R9 285, R7 260X and R7 260 GPUs additionally feature updated display controllers that will support dynamic refresh rates during gaming. No other graphics card does, and who in their right mind would use an APU for gaming? It won't be nearly enough for 1080p, so there's no point at this time.

As for why Nvidia won't support it: they need to make back their investment in G-Sync, and until that happens there's no way in hell they'll support FreeSync. And the big thing, as mentioned many times, is that G-Sync is here now, while FreeSync is still on the drawing board (late stages, but still no market implementation).

As for the cost, G-Sync already replaces the scaler in those displays; the extra cost is a premium from the manufacturers, not Nvidia (a small part, maybe 50 bucks). The same will be true for FreeSync, I can guarantee that.
Cost-wise, G-Sync is so expensive partially because it's an FPGA and not yet in high-volume production. In the long run, it may be a really cheap solution, as you get a single chip replacing a few. It may also result in decreased input lag compared to a multi-chip solution. Of course, manufacturers are free to use more robust chips in place of several simple ones in their adaptive vsync solutions.
data/avatar/default/avatar02.webp
Unless their hand is forced, it's unlikely. NVidia looks for ways to lock people into their hardware. Proprietary "features" are the easiest way to do that. If they "develop" something, and make it a proprietary "feature" then people who want it are stuck with their hardware. If they make it open, then people have more choice. While choice is great for consumers, it's bad for businesses. The more choice people have, the more businesses have to be worried about. Of course, I do find it quite funny that nobody has an issue with NVidia pulling crap like this, but if AMD were to have done it, this thread would have gone into full blown AMD flaming by now.
Well, AMD has their fair share of exclusive tech to counter Nvidia's. PhysX and... TXAA? are the exclusive game techs for Nvidia, I think, while AMD has its TressFX and Mantle, since DX is so bad. Doesn't seem different from AMD saying nope to CUDA/PhysX and forcing Havok down our throats most of the time. In the end it's quite simple: AMD doesn't want to use stuff made by Nvidia, and Nvidia doesn't want to use stuff made by AMD.
https://forums.guru3d.com/data/avatars/m/69/69564.jpg
It's not different at all. It's typical eye-for-an-eye bull, and the only ones who stand to lose are us, the customers.