NVIDIA: G-SYNC Validation Runs Into 94% Failure Rates

That's exactly why I still don't consider upgrading my monitor - with its "forced" refresh rate of 65 Hz (1080p), an old-school 24-inch IPS Acer gets me "there", into the virtual reality. Early adopters will be biting the dust again...
I have an Alienware AW2518HF, the FreeSync one. It's not G-Sync validated, but I've never noticed any issue using G-Sync Compatible on it, so I would call this bullshit. Then again, maybe I'm just not seeing my problems or something like that. I've used G-Sync Compatible since the first driver that introduced it, and I've never noticed any flicker or, I don't know, whatever other problems.
Flickering issues can be inconsistent, and sometimes only occur under certain image conditions. They have special test scenarios designed to provoke such problems more reliably than they occur in real games. But of course that doesn't mean it never happens in real gaming. Some people also don't perceive some types of flicker that much; in other people it just causes headaches without being obvious to "see". So there is a range of issues that not everyone is equally susceptible to. I wish I could see their reasons for failing a screen, but for obvious reasons they are not going to publish that. Personally, I applaud their effort of testing and validating every possible screen. Reviewers have in the past not been that great at checking Adaptive-Sync support/behavior.
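As a rough illustration of what such a provocation scenario might look like (a hypothetical sketch, not Nvidia's actual test suite), the usual trick is abrupt, repeated jumps in frame time: the panel is forced to scan out at very different refresh intervals back to back, which is exactly the condition that tends to expose VRR brightness/gamma flicker.

```python
# Hypothetical sketch of a VRR flicker "provocation" pattern, NOT
# Nvidia's actual test suite: alternate abruptly between two frame
# rates so the panel must scan out at very different intervals.
import time

def oscillating_frame_times(low_fps=30, high_fps=120, hold_frames=8, total_frames=600):
    intervals = [1.0 / low_fps, 1.0 / high_fps]
    current = 0
    for frame in range(total_frames):
        start = time.perf_counter()
        # render_and_present() would go here in a real test harness
        if frame % hold_frames == hold_frames - 1:
            current ^= 1  # abrupt switch to the other frame rate
        while time.perf_counter() - start < intervals[current]:
            pass  # busy-wait for precise frame pacing
```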
What goes on in Nvidia's monitor testing lab. [youtube=Um0PFoB-Ls4]
I think I've read that Nvidia also requires VRR to be enabled out of the box for G-Sync Compatible certification. That eliminates all the monitors that have a switch/OSD setting to turn the feature on and ship with it off by default (which apparently many monitors do), even if they happen to work flawlessly once it's enabled. Many also don't have the required range for certification (though one can usually adjust that with CRU). Small things like that probably factor into so many monitors failing certification. I think in the long haul the certification standards are a good thing, as they will inspire display manufacturers to raise their standards.
Nvidia originally built their GPUs to achieve adaptive sync with very special hardware required inside the screen. AMD built their GPUs so that far less is demanded from the screen. Thus it makes perfect sense that AMD video cards work sufficiently with far more screens than Nvidia video cards, setting aside the samples that are rejected for obvious spec reasons, like not having a wide enough sync range; those wouldn't even need to be tested. It's like taking a very low-riding car and an SUV and comparing the kinds of roads they can survive.
Kaarme:

Nvidia originally built their GPUs to achieve adaptive sync with very special hardware required inside the screen. AMD built their GPUs so that far less is demanded from the screen. Thus it makes perfect sense that AMD video cards work sufficiently with far more screens than Nvidia video cards, setting aside the samples that are rejected for obvious spec reasons, like not having a wide enough sync range; those wouldn't even need to be tested. It's like taking a very low-riding car and an SUV and comparing the kinds of roads they can survive.
Anyway, both FreeSync and G-Sync are bullshit to make people buy monitors... On paper it's nice, but in the real world it's not so nice. The bad thing with G-Sync is the hardware inside the monitor (a remnant of their pro sync add-on): again a nice idea, but it adds an extra cost that the final consumer shouldn't pay 🙁 FreeSync isn't so free either and isn't compatible with everything, but being less expensive, it's more popular. For the mainstream consumer it should be free AND compatible with every brand of GPU and supported by everyone; then it might be the evolution.
rl66:

Anyway, both FreeSync and G-Sync are bullshit to make people buy monitors... On paper it's nice, but in the real world it's not so nice.
Only someone who has never experienced the difference on a good monitor would say that. You can of course get by without it. Just get a 60 Hz display, turn down graphics settings until it's at 100% over 60 FPS, and it'll be smooth as well. But that "solution" is full of compromises.
rl66:

Anyway, both FreeSync and G-Sync are bullshit to make people buy monitors... On paper it's nice, but in the real world it's not so nice. The bad thing with G-Sync is the hardware inside the monitor (a remnant of their pro sync add-on): again a nice idea, but it adds an extra cost that the final consumer shouldn't pay 🙁 FreeSync isn't so free either and isn't compatible with everything, but being less expensive, it's more popular. For the mainstream consumer it should be free AND compatible with every brand of GPU and supported by everyone; then it might be the evolution.
I agree with @nevcairiel that only someone who hasn't experienced adaptive sync makes such a post, especially 4 years after FreeSync! AMD FreeSync is a software implementation of the VESA Adaptive-Sync standard, which has existed on embedded systems for more than 10 years now. The whole "FreeSync" name is marketing nomenclature to go up against Nvidia G-Sync, which requires an expensive module manufactured by Nvidia. In addition, Nvidia's G-Sync laptops use exactly the same VESA Adaptive-Sync ("FreeSync"), not the G-Sync module. That is why it is flaky at best on laptops, especially with the crap Nvidia drivers, which don't even have simple functionality to cap FPS, something found in AMD's drivers for many years now.

As for the article: until Nvidia puts an FPS limiter in their drivers, they won't validate any "FreeSync" monitor properly, because all VESA Adaptive-Sync monitors fail on exactly the same issues laptop G-Sync monitors have, yet Nvidia brushes those (laptop G-Sync monitor) issues under the carpet... (And I have a Predator 15, so I'm talking from experience.)
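For what it's worth, the kind of FPS cap being asked for is simple in principle. Below is a minimal application-side sketch (my own illustration; the target value and the render call are placeholders, not any driver's API) that paces frames so they never arrive faster than a target interval, which in practice keeps frame times inside a panel's VRR window:

```python
# Minimal application-side FPS limiter sketch (illustrative only):
# pace frames so they never arrive faster than the target interval,
# e.g. capping just below a 144 Hz panel's ceiling keeps VRR engaged.
import time

def run_capped(frames=1000, target_fps=141):
    interval = 1.0 / target_fps
    deadline = time.perf_counter()
    for _ in range(frames):
        deadline += interval
        # render_and_present() would go here
        remaining = deadline - time.perf_counter()
        if remaining > 0.002:
            time.sleep(remaining - 0.002)  # coarse sleep first
        while time.perf_counter() < deadline:
            pass  # spin the last ~2 ms for accuracy
```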
Well, most adaptive-sync monitors are 75 Hz at best, with a very limited range. Of course those are going to fail the certification, but it doesn't really mean that they will not work.
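To put a number on "very limited range": a widely cited rule of thumb (an assumption here, not something stated in the article) is that LFC needs the maximum refresh to be at least roughly twice the minimum, so frames can be repeated when FPS drops below the floor. A typical 48-75 Hz budget panel fails that check:

```python
# Rule-of-thumb LFC eligibility check (assumed ~2x ratio; the exact
# certification criteria are Nvidia's own and aren't public in detail).
def supports_lfc(vrr_min_hz, vrr_max_hz):
    return vrr_max_hz >= 2 * vrr_min_hz

print(supports_lfc(48, 75))    # False: 75/48 ≈ 1.56, range too narrow
print(supports_lfc(48, 144))   # True:  144/48 = 3.0
```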
The Nvidia certification is obviously a good thing, as it basically forces monitor makers to put the effort in to pass certification on their future monitors; if they don't bother, they've obviously just done the minimum to stick the FreeSync label on, so avoid them. Anyway, Nvidia is finally giving us a reliable guarantee of FreeSync quality, something AMD should have been doing since they first introduced FreeSync. This is something AMD must change if they want to take GPU share: AMD initiatives (FreeSync) should not require an Nvidia stamp on them for us to know that they actually work properly; the AMD stamp needs to be a stamp of quality.
Dribble:

This is something AMD must change if they want to take GPU share: AMD initiatives (FreeSync) should not require an Nvidia stamp on them for us to know that they actually work properly; the AMD stamp needs to be a stamp of quality.
Apparently FreeSync 2 has higher requirements to be able to use that marketing term, but it's not controlled by AMD, so any manufacturer can claim to fulfill them and get the rubber stamp, even if it's buggy.
Is there any list of which monitors they tested?
Dribble:

The Nvidia certification is obviously a good thing, as it basically forces monitor makers to put the effort in to pass certification on their future monitors; if they don't bother, they've obviously just done the minimum to stick the FreeSync label on, so avoid them. Anyway, Nvidia is finally giving us a reliable guarantee of FreeSync quality, something AMD should have been doing since they first introduced FreeSync. This is something AMD must change if they want to take GPU share: AMD initiatives (FreeSync) should not require an Nvidia stamp on them for us to know that they actually work properly; the AMD stamp needs to be a stamp of quality.
G-Sync screens are priced to high heaven compared to FreeSync ones, on average. It's a big factor that manufacturers can get away with far less for FreeSync. Following your plan, there would be very few adaptive-sync screens available at all, because they'd all be expensive and out of reach for most. However, the reality is different, and so even Jensen needed to bow his leather-jacketed head and make FreeSync screens a possibility in Nvidia drivers, which is the total opposite of what you said. Btw, Nvidia rejecting a screen doesn't mean it won't work perfectly with an AMD video card. In fact, people have found that some rejected screens work just fine with Nvidia video cards as well. Nvidia set super high standards like a child throwing a tantrum because she was forced to do something against her will. You can bet Nvidia would have rather kept selling their overpriced modules to screen manufacturers.
Michal Turlik 21:

Just as stated by someone here... couldn't the VESA Adaptive-Sync standard be enough to enjoy a better experience? Do the players (AMD and NVIDIA) always have to sidestep standards and try to pull the ball to their own side? I'm wondering how much NVIDIA really supports VESA Adaptive-Sync in their drivers... same for AMD. For now I am very happy with my almost 100% RGB-compliant Asus PA329Q, which (fortunately) has no support for either.
FreeSync is VESA Adaptive-Sync. It is not something different, and Nvidia supports it on their laptops as well, which do not have a G-Sync module. Even Intel has said that it will support FreeSync/VESA Adaptive-Sync, and so does the Xbox One.
Fediuld:

FreeSync is VESA Adaptive-Sync. It is not something different, and Nvidia supports it on their laptops as well, which do not have a G-Sync module.
Technically FreeSync builds on VESA Adaptive-Sync. It has features that aren't specified or required in VESA's spec, so it is a bit different, but for the most part, yeah, it's the same thing.
Denial:

Technically FreeSync builds on VESA Adaptive-Sync. It has features that aren't specified or required in VESA's spec, so it is a bit different, but for the most part, yeah, it's the same thing.
FreeSync is adaptive sync the same way G-Sync Compatible is adaptive sync. They are just branding their adaptive-sync implementations with fancy names.
Anarion:

FreeSync is adaptive sync the same way G-Sync Compatible is adaptive sync. They are just branding their adaptive-sync implementations with fancy names.
I get that, but both of them add features and have their own requirements over Adaptive-Sync. For example, you can have an adaptive-sync monitor without LFC, which still makes it adaptive sync but does not make it G-Sync Compatible.
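As a sketch of the LFC idea (the real driver logic is proprietary; the numbers below are assumed typical values, not from the thread): when a frame takes longer than the panel's slowest allowed refresh interval, the same frame is presented two or more times, so every scanout still lands inside the VRR window.

```python
# Illustrative LFC math, not the actual driver implementation:
# repeat a slow frame enough times that each scanout interval
# fits back inside the panel's VRR window.
def lfc_scanout(frame_time_s, vrr_min_hz=48, vrr_max_hz=144):
    slowest = 1.0 / vrr_min_hz          # longest interval the panel accepts
    repeats = 1
    while frame_time_s / repeats > slowest:
        repeats += 1                    # show the frame 2x, 3x, ...
    interval = max(frame_time_s / repeats, 1.0 / vrr_max_hz)
    return interval, repeats

# 25 FPS on a 48-144 Hz panel -> each frame scanned out twice at 50 Hz
print(lfc_scanout(1 / 25))              # (0.02, 2)
```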
That is a shockingly high number of failures, but I think it's good Nvidia is so picky about certification. At this rate, the only reason to pay the high premium for G-Sync is knowing you're getting the best possible experience. However, I think it'd be worth it for Nvidia to have three separate certification tiers. So 94% of displays might fail for a gold rating, but a 600-nit display should still qualify for a bronze.
The G-Sync module does only one thing, and it costs consumers a premium equal to a small computer: APU €50, motherboard €50, 2x4 GB DDR4 RAM €50, 250 GB NVMe M.2 drive €50... People should have learned something about Nvidia's marketing machine after the GTX 970 fiasco, or the GPP nonsense...