NVIDIA: G-SYNC Validation Runs Into 94% Failure Rates
DLD
That's exactly why I still don't consider upgrading my monitor. With its "forced" refresh rate of 65 Hz (1080p), my old-school 24-inch IPS Acer still gets me "there", into the virtual reality. Early adopters will be biting the dust again...
liviut
I have an Alienware AW2518HF, the FreeSync one. It's not G-Sync validated, but I've never noticed any issue using G-Sync Compatible on it, so I would call this bullshit. Then again, maybe I'm just not seeing the problems. I've used G-Sync Compatible since the first driver that introduced it, and I've never noticed flickering or any other issues.
nevcairiel
Flickering issues can be inconsistent and sometimes only occur under certain image conditions. They have special test scenarios designed to provoke such problems more consistently than they occur in real games. But of course that doesn't mean it never happens in real gaming.
Some people also don't perceive certain types of flicker that much; in others it just causes headaches without being obviously visible. So there's a range of issues that not everyone is equally susceptible to. I wish I could see their reasons for failing a screen, but for obvious reasons they are not going to publish that.
Personally, I applaud their effort to test and validate every possible screen. Reviewers have in the past not been that great at checking Adaptive Sync support/behavior.
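To give a rough idea of what such a test scenario could look like, here is a minimal sketch (my own illustration, not NVIDIA's actual methodology) that builds a frame-time schedule jumping between the edges of a display's VRR range, the kind of pacing that tends to expose flicker on marginal panels:

```python
# Hypothetical illustration, not NVIDIA's actual test: build a frame-time
# schedule that jumps between the extremes of a display's VRR range,
# since abrupt frame-time swings are what typically provoke flicker.
import random

def stress_schedule(min_hz=48, max_hz=144, frames=600, seed=0):
    """Return a list of frame times (ms) made of short bursts held at
    one edge of the VRR range, then an abrupt jump to the other edge."""
    rng = random.Random(seed)
    times = []
    while len(times) < frames:
        hz = rng.choice([min_hz, max_hz])   # pick an edge of the range
        burst = rng.randint(5, 20)          # hold it for a short burst
        times.extend([1000.0 / hz] * burst)
    return times[:frames]

if __name__ == "__main__":
    schedule = stress_schedule()
    print(f"{len(schedule)} frames, frame times from "
          f"{min(schedule):.1f} to {max(schedule):.1f} ms")
```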
alanm
What goes on in Nvidia's monitor testing lab:
[youtube=Um0PFoB-Ls4]
Cave Waverider
I think I've read that Nvidia also requires the monitor to have VRR enabled out of the box for G-Sync Compatible certification. That eliminates all the monitors that have an OSD setting/switch for the feature but ship with it off by default (which apparently many do), even if they work flawlessly once it's enabled. Many also don't have the required range for certification (though one can usually adjust that with CRU). Small things like that probably factor into so many monitors failing certification.
I think in the long run the certification standards are a good thing, as they will push display manufacturers to raise their standards.
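For reference, the range requirement mentioned above is commonly reported as a maximum-to-minimum refresh ratio of at least 2.4:1 (e.g. 60-144 Hz). A trivial sketch of that check (my own helper names, not an NVIDIA tool):

```python
# Simple sketch (my own helper, not an NVIDIA tool): check whether a
# monitor's advertised VRR range meets the commonly cited G-Sync
# Compatible minimum of a 2.4:1 max:min refresh ratio.
def meets_range_requirement(min_hz: float, max_hz: float, ratio: float = 2.4) -> bool:
    """Return True if max_hz / min_hz reaches the required ratio."""
    return min_hz > 0 and max_hz / min_hz >= ratio

for name, lo, hi in [("48-75 Hz panel", 48, 75), ("48-144 Hz panel", 48, 144)]:
    status = "passes" if meets_range_requirement(lo, hi) else "fails"
    print(f"{name}: {status} the range check")
```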
Kaarme
Nvidia originally built their GPUs to achieve adaptive sync with very specialized hardware inside the screen (the G-Sync module). AMD built their GPUs so that far less is demanded from the screen. Thus it makes perfect sense that AMD video cards work sufficiently with far more screens than Nvidia video cards, of course leaving aside the samples that are rejected for obvious spec reasons, like not having a wide enough sync range; they wouldn't even need to test those. It's like taking a very low-riding car and an SUV and comparing the kinds of roads they can survive.
Anarion
Well, most adaptive sync monitors are 75 Hz at best with a very limited range. Of course those are going to fail the certification, but it doesn't really mean that they won't work.
Dribble
The Nvidia certification is obviously a good thing, as it basically forces monitor makers to put the effort in to pass certification on their future monitors; if they don't bother, they've obviously just done the minimum to stick the FreeSync label on, so avoid those. Anyway, Nvidia is finally giving us a reliable guarantee of FreeSync quality, something AMD should have been doing since they first introduced FreeSync.
This is something AMD must change if they want to take GPU share: AMD initiatives (FreeSync) should not require an Nvidia stamp on them for us to know that they actually work properly; the AMD stamp needs to be a stamp of quality.
SpajdrEX
Is there a list of which monitors they tested?
schmidtbag
That is a shockingly high number of failures, but I think it's good Nvidia is so picky about certification. At this rate, the only reason to pay the high premium for G-Sync is knowing you're getting the best possible experience.
However, I think it would be worth it for Nvidia to have three separate certification tiers. So 94% of displays might fail for a gold rating, but a 600 nit display should still qualify for a bronze.
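Purely to illustrate that tier idea (the thresholds below are invented for the example, not anything Nvidia has proposed), the rating could be as simple as:

```python
# Toy illustration of the tier suggestion above; the cut-offs are
# made up for the example and are not an actual Nvidia scheme.
def tier(vrr_ratio: float, peak_nits: int) -> str:
    """Map a panel's VRR range ratio and peak brightness to a made-up tier."""
    if vrr_ratio >= 2.4 and peak_nits >= 1000:
        return "gold"
    if vrr_ratio >= 2.4 and peak_nits >= 800:
        return "silver"
    if vrr_ratio >= 2.4 and peak_nits >= 600:
        return "bronze"
    return "not certified"

# A 600 nit panel with a 48-144 Hz range would miss gold but still earn bronze.
print(tier(vrr_ratio=144 / 48, peak_nits=600))
```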
Luc
The G-Sync module does only one thing, yet it costs consumers a premium equal to a small computer: APU 50 €, motherboard 50 €, 2x4 GB DDR4 RAM 50 €, 250 GB NVMe M.2 SSD 50 €... roughly 200 € in total.
People should have learned something about Nvidia's marketing machine after the GTX 970 fiasco, or the GPP nonsense...