NVIDIA: G-SYNC Validation Runs Into 94% Failure Rates

spajdrik:

Is there any list with what display monitors they tested?
They would never publish that, since it would damage their partners if they put products they sell on a shitlist.
It would be nice if Nvidia could release some testing utility. I know that not everything is possible to test with software alone, but it would be a good start. Otherwise, I'm glad that someone is really pushing display quality.
sinnedone:

Nvidia is simply trying to stain the technology to paint their ecosystem in a better light.
In an alternate universe where it was 2017 again and Nvidia was going to support Adaptive Sync displays after G-Sync already launched, what should they have done differently from what they've done here?
Denial:

In an alternate universe where it was 2017 again and Nvidia was going to support Adaptive Sync displays after G-Sync already launched, what should they have done differently from what they've done here?
They could've worked together with VESA for Adaptive Sync from the very beginning, rather than take their usual route of exclusive proprietary tech.
schmidtbag:

They could've worked together with VESA for Adaptive Sync from the very beginning, rather than take their usual route of exclusive proprietary tech.
Okay, but in 2017 AMD had already done that for them. So now that Nvidia wants to support VESA sync, what do they do differently?
I think 144 Hz/144 fps is still very difficult to reach for today's graphics cards in most recent AAA games. That's why I went for a 75 Hz monitor: I can do 75 Hz/75 fps in most games, and somehow it's so much better than 60 Hz, but reaching double that would need a beast of a system. Those of you who have, say, a 2070/2080: can you reach 144 fps in, say, Rage 2?
schmidtbag:

That is a shockingly high number of failures, but I think it's good that Nvidia is so picky about certification. At this rate, the only reason to pay the high premium for G-Sync is knowing you're getting the best possible experience. However, I think it'd be worth it for Nvidia to have three separate certification tiers. So 94% of displays might fail for a gold rating, but a 600-nit display should still qualify for a bronze.
I'd go the other way: the current spec can stay the minimum, there's no point lowering quality, and I can't see why all future displays couldn't support the Nvidia spec. However, if you want HDR and the like, then they should have higher ratings for that. Get rid of original G-Sync altogether, as it's dead now. Now that Nvidia supports Freesync, who will actually buy a G-Sync display? Why would you buy a monitor that locks you into one GPU vendor?
Denial:

Okay, but in 2017 AMD had already done that for them. So now that Nvidia wants to support VESA sync, what do they do differently?
Not go against the grain by creating their own proprietary version of something that does pretty much the same thing? I'm not really sure what you're asking or implying here. To put it differently: Adaptive Sync and G-Sync mostly accomplish the same goal. They're similar enough that Nvidia eventually ended up supporting Free/Adaptive Sync with good results, showing that they never needed to make G-Sync in the first place. If they had cooperated with VESA to perfect Adaptive Sync from the very beginning, everyone would have benefited, including Nvidia. But now that Free/Adaptive Sync is actually gaining some traction, people are questioning "why spend extra for G-Sync?", and pretty much the only reason I can come up with is "you get a certified high-premium experience", which other displays can't guarantee.
schmidtbag:

Not go against the grain by creating their own proprietary version of something that does pretty much the same thing? I'm not really sure what you're asking or implying here. To put it differently: Adaptive Sync and G-Sync mostly accomplish the same goal. They're similar enough that Nvidia eventually ended up supporting Free/Adaptive Sync with good results, showing that they never needed to make G-Sync in the first place. If they had cooperated with VESA to perfect Adaptive Sync from the very beginning, everyone would have benefited, including Nvidia. But now that Free/Adaptive Sync is actually gaining some traction, people are questioning "why spend extra for G-Sync?", and pretty much the only reason I can come up with is "you get a certified high-premium experience", which other displays can't guarantee.
I'm just asking how they should have entered the adaptive sync market without people thinking that they are hijacking it. You are saying they should never have made G-Sync proprietary in the first place... but they did, and that was done. After that happened, AMD was like "hey, we can do that without the module and we're going to make it a standard" - so what should Nvidia's answer have been? Keep in mind that by this point G-Sync as a brand was already established and it had clearly defined rules in terms of supported frequencies, LFC, options above the sync range, etc. People, for example the poster I quoted, keep saying things like "Nvidia is trying to stain the technology" because they have certifications and use their brand name... what were they supposed to do, given that they already invented G-Sync as a proprietary technology?
Still not working on my MSI Optix MAG27CQ.
Denial:

I'm just asking how they should have entered the adaptive sync market without people thinking that they are hijacking it.
Adaptive Sync is an industry standard; anyone accusing Nvidia of hijacking it would be a moron. If anything, AMD is the one who hijacked it, with their own Freesync branding. Since Intel already supported "vanilla" Adaptive Sync, Nvidia could have done the same, at which point it wouldn't make sense to accuse them of anything negative.
You are saying they should never have made G-Sync proprietary in the first place... but they did, and that was done. After that happened, AMD was like "hey, we can do that without the module and we're going to make it a standard" - so what should Nvidia's answer have been? Keep in mind that by this point G-Sync as a brand was already established and it had clearly defined rules in terms of supported frequencies, LFC, options above the sync range, etc.
I'm not saying G-Sync shouldn't have been proprietary, I'm saying it shouldn't have existed, period (or, if it were to exist, it should've been a fork of Adaptive Sync, in the same way Freesync is). We're talking about hypothetical situations here, so I don't really get why you're drawing the line at the point when G-Sync was already released. In other words, if my point all along is that G-Sync never needed to exist and Nvidia should've worked with VESA from the beginning (as in, the beginning of the technology that drives displays to dynamically adapt to frame rate), asking what Nvidia's answer should have been after G-Sync was already established doesn't really make sense to me in this context. That being said:
People, for example the poster I quoted, keep saying things like "Nvidia is trying to stain the technology" because they have certifications and use their brand name... what were they supposed to do, given that they already invented G-Sync as a proprietary technology?
You need to back up the timeline; the issue at hand isn't from 2017. I don't know exactly what was in development first or what year it was in development, but for argument's sake, let's say Nvidia was developing G-Sync before Adaptive Sync and started development in 2014. Nvidia chose to keep their development all to themselves. Considering they were footing the bill, obviously it makes sense they would do that. However, they didn't have to do that. Before they even started working on G-Sync, Nvidia could have proposed the idea to VESA, where they, along with Intel, AMD, Qualcomm, and probably other big names like Microsoft or Apple would all contribute toward the creation of Adaptive Sync. It's worth pointing out the creation of Adaptive Sync was inevitable. So, Nvidia needlessly spent all this time and money working on a technology that pretty much everyone else worked together to replicate. This is what I meant by "working with VESA from the beginning". If Nvidia had gone to VESA instead of just going solo, they wouldn't be in this situation where they've got a technology that doesn't appear to be paying for itself and is now heading toward irrelevancy.
schmidtbag:

You need to back up the timeline; the issue at hand isn't from 2017. I don't know exactly what was in development first or what year it was in development, but for argument's sake, let's say Nvidia was developing G-Sync before Adaptive Sync and started development in 2014. Nvidia chose to keep their development all to themselves. Considering they were footing the bill, obviously it makes sense they would do that. However, they didn't have to do that. Before they even started working on G-Sync, Nvidia could have proposed the idea to VESA, where they, along with Intel, AMD, Qualcomm, and probably other big names like Microsoft or Apple would all contribute toward the creation of Adaptive Sync. It's worth pointing out the creation of Adaptive Sync was inevitable. So, Nvidia needlessly spent all this time and money working on a technology that pretty much everyone else worked together to replicate.
Nvidia developed G-Sync first. AMD looked at it (literally at the event it was announced at - I remember the articles where tech sites were asking AMD what they thought of Nvidia's announcement), said "we don't need a module to do this", spent a year developing, and launched Freesync, minus a few key features that it has added over time. During that year, they proposed adding adaptive sync to the VESA DisplayPort 1.2a standard - while 1.2a was released in 2013, the Adaptive Sync option for it didn't come until over a year later, in May 2014, a year after G-Sync. It wasn't in development prior to G-Sync and it was entirely added by AMD. You say the creation of adaptive sync was inevitable, but its creation spawned from G-Sync and Freesync being out. VBLANK has been a thing far longer than G-Sync, but using it to sync the display and framerate wasn't. I guess theoretically someone would have come up with it eventually, but no one was talking about it prior to Nvidia. Regardless, Nvidia does what Nvidia does best and made the tech proprietary. I agree it would have been better if they didn't, but they did. Now the VESA standard comes out and Nvidia has to switch.

My question, to the specific person I quoted (because he mentioned the hijacking part) was simply what could they have done better than they have now after they made the decision to support adaptive sync? The only thing I wish they'd done is come out with support earlier... other than that, I think branding it G-Sync along with having a certification program for it are both fine.
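(For anyone unfamiliar with the VBLANK mechanism the posts above refer to, here is a rough, illustrative sketch - not any vendor's actual implementation - of the difference between presenting frames on a fixed refresh tick and letting a variable-refresh panel hold its blanking interval until the next frame is ready. The frame times and the 144 Hz ceiling are made-up example numbers.)

```python
# Toy comparison of fixed refresh vs. variable refresh timing (illustrative only).

def fixed_refresh_present(frame_ready_ms, refresh_hz=60.0):
    """Classic v-sync: each frame waits for the next fixed refresh tick."""
    period = 1000.0 / refresh_hz
    return [round((int(t // period) + 1) * period, 1) for t in frame_ready_ms]

def variable_refresh_present(frame_ready_ms, max_hz=144.0):
    """Variable refresh: the panel stretches its blanking interval and scans out
    when the frame is ready, as long as frames don't arrive faster than the
    panel's maximum rate. (Framerates below the panel's minimum are handled
    separately, e.g. by repeating frames - see the LFC sketch further down.)"""
    min_period = 1000.0 / max_hz
    shown, last = [], 0.0
    for t in frame_ready_ms:
        last = max(t, last + min_period)  # never refresh faster than max_hz
        shown.append(round(last, 1))
    return shown

frames = [14.0, 31.0, 45.0, 70.0]        # irregular frame completion times (ms)
print(fixed_refresh_present(frames))      # [16.7, 33.3, 50.0, 83.3] -> quantised to ticks, judder
print(variable_refresh_present(frames))   # [14.0, 31.0, 45.0, 70.0] -> tracks the frame times
```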
Denial:

You say the creation of adaptive sync was inevitable, but its creation spawned from G-Sync and Freesync being out.
None of that changes my point (I figured G-Sync came out first anyway, but thanks for the confirmation). Adaptive Sync's creation was inevitable for two reasons: 1. Nvidia could have pitched the idea to VESA so it would become an industry standard (rather than going solo with G-Sync). 2. Since Nvidia instead kept G-Sync to themselves, competitors (like AMD) weren't about to be left behind. So, because Adaptive Sync was going to exist no matter what, Nvidia could have just saved themselves the time and money by working together with VESA to create it, rather than creating G-Sync by themselves and waiting for the rest of the industry to come up with a response. Nvidia would have been naive to think they were going to be the only ones with this type of technology.
My question, to the specific person I quoted (because he mentioned the hijacking part) was simply what could they have done better than they have now after they made the decision to support adaptive sync?
My response to you was the answer to that question: the better option would have been to never go solo from the moment of G-Sync's inception, and instead to propose the idea to VESA. If you're going to ask about a hypothetical alternative timeline, why does it have to begin after G-Sync was released? I still don't fully understand why that's the moment where you draw the line. However, if we are to ask what Nvidia could have done differently in 2017, I would agree there isn't anything - they had already committed to their decision; a decision I personally would say was a mistake.
The only thing I wish they'd done is come out with support earlier... other than that, I think branding it G-Sync along with having a certification program for it are both fine.
I agree with this.
Denial:

I get that, but both of them add features and have their own requirements on top of Adaptive Sync. For example, you can have an Adaptive Sync monitor without LFC - which still makes it Adaptive Sync, but does not make it G-Sync Compatible.
It doesn't mean that it doesn't work.
Anarion:

It doesn't mean that it doesn't work.
I don't recall saying or even implying that it doesn't work, just that Freesync and Adaptive Sync aren't identical. Freesync is an extension of Adaptive Sync and offers additional features - G-Sync "Compatible" is the same way. That's it.
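(Since LFC comes up a few times in this thread, here is a rough sketch of the idea behind Low Framerate Compensation - repeating frames so the panel's refresh stays inside its variable-refresh window when the game's framerate drops below it. The 48-144 Hz range is a made-up example, and this is not any vendor's actual algorithm.)

```python
# Illustrative sketch of Low Framerate Compensation (LFC); not a real driver algorithm.

def lfc_refresh(frame_time_ms: float, vrr_min_hz: float = 48.0, vrr_max_hz: float = 144.0):
    """Return (multiplier, refresh_hz): how many times a frame is repeated so the
    effective refresh rate stays inside the panel's variable-refresh window."""
    fps = 1000.0 / frame_time_ms
    if fps >= vrr_min_hz:
        return 1, min(fps, vrr_max_hz)      # inside the range: one refresh per frame
    # Below the range: repeat the frame n times so n * fps lands back inside the window.
    n = 2
    while n * fps < vrr_min_hz:
        n += 1
    return n, n * fps

# A 30 fps frame (33.3 ms) on a 48-144 Hz panel is shown twice -> ~60 Hz refresh.
print(lfc_refresh(33.3))               # (2, 60.06...)
# A 40 fps frame (25 ms) doubles to 80 Hz, which fits a 48-144 Hz window...
print(lfc_refresh(25.0, 48.0, 144.0))  # (2, 80.0)
# ...but would not fit a narrow 48-75 Hz window, which is roughly why LFC needs
# the panel's maximum refresh to be at least about twice its minimum.
```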
schmidtbag:

If you're going to ask about a hypothetical alternative timeline, why does it have to begin after G-Sync was released? I still don't fully understand why that's the moment where you draw the line.
Because the person I quoted is claiming that Nvidia is staining the technology - maybe I'm reading into his post a little too much, but presumably he thinks the G-Sync certification program is intentionally putting down Freesync/Adaptive Sync displays in order to bolster Nvidia's proprietary G-Sync solution. I'm asking how they could have done this differently. Should they not have the certification program? Should they not label the G-Sync "Compatible" monitors with G-Sync at all, to differentiate them more? Would this include loosening the standards and allowing monitors with really bad ranges to be labeled whatever it is they come up with? My question was specific to what he was claiming and wasn't really intended to ask whether Nvidia should have made G-Sync proprietary to begin with - which I agree they shouldn't have.
Denial:

Because the person I quoted is claiming that Nvidia is staining the technology - maybe I'm reading into his post a little too much, but presumably he thinks the G-Sync certification program is intentionally putting down Freesync/Adaptive Sync displays in order to bolster Nvidia's proprietary G-Sync solution. I'm asking how they could have done this differently. Should they not have the certification program? Should they not label the G-Sync "Compatible" monitors with G-Sync at all, to differentiate them more? Would this include loosening the standards and allowing monitors with really bad ranges to be labeled whatever it is they come up with?
Ahh ok, I see what you mean now. Yeah, I agree with you there.
gerardfraser:

Nvidia is fooling people and doing a good job of it - top-class propaganda. Soon there will be no Freesync and only G-Sync, LOL. Anyway, I am glad Nvidia finally accepted adaptive sync. Good job for doing so. Trying to discredit Freesync monitors when they can actually work well with Nvidia is, well, typical Nvidia, but it doesn't matter to me because I made an informed decision based on experience with Freesync and G-Sync monitors. For me, a Freesync monitor on an Nvidia card is a better experience than a G-Sync monitor on an Nvidia card. Simple as that for me.
Gsync, like all proprietary nVidia "technologies," exists for one purpose only--to turn a profit for nVidia. AMD comes along and offers a similar technology that does the same thing but is open-sourced instead of proprietary, courtesy of AMD--meaning that anyone can adopt it commercially without charge, even nVidia--and so of course nVidia bucks the trend, because it's Gsync--not AMD's open-sourced Freesync--that earns nVidia a profit. Notice that no one ever claims that nVidia's expensive hardware Gsync implementations are superior to AMD's open-standard FS1&2--not even nVidia makes that claim! And nVidia can't make the claim, because Gsync isn't superior to Freesync at all; both are simply different approaches to doing the very same thing.

Instead, typically, and unfortunately for us, nVidia undertakes a 'Baghdad-Bob' type of ad campaign built around something grandly entitled "nVidia monitor certification"--a "certification" that has as its singular goal to "demonstrate" why people should purchase hardware Gsync BECAUSE "...94% of all the monitors on earth fail nVidia's certification" for open-source, software Gsync support (Freesync 1/2)--while, interestingly enough, that same 94% (or better) might easily pass an AMD Freesync 1/2 "certification"...:D

nVidia's approach, called Gsync, consists of physical, proprietary monitor circuitry that is available for a price and is permanent for the life of the monitor, meaning that although your monitor may outlive your current Gsync hardware and/or your current GPU, or both, your next GPU must be an nVidia GPU and it must support whatever new version of Gsync hardware exists, and so on, ad infinitum. AMD's approach, called Freesync 1 or 2, is open source, requires no proprietary physical circuitry, is software-upgradable, and so on. Somewhat comically, however, nVidia seems to be stretching to its limits to discover new and ever-more impractical and expensive ways in which to gouge its apparently unsophisticated customers 🙄 *bur-r-r-rrrrrp!* (It is not clear whether nVidia merely thinks its customers are unsophisticated or whether they are indeed unsophisticated--[copied from internal nVidia corporation memo #22566, 5/30/2019--for internal consumption only].)

So, what's happening? Why all of this weirdness from nVidia about proprietary "monitor certifications" from on high...? Seems pretty simple to understand. nVidia is comparing its proprietary hardware Gsync specs (which have no bearing whatsoever on the quality of a hardware Gsync display versus a software Freesync 1/2, "software Gsync," display) to the general specification list of monitors that have no hardware Gsync circuitry onboard! It is the display quality of a monitor that counts in the end, certainly, not compliance with any proprietary specification...;) The sole reason for nVidia's insistence on specification compliance instead of image quality appears to be so that nVidia can find a pretext for "failing" a monitor! So, yes, these monitors may fail nVidia's contrived spec tests--but that is not the same thing as saying that these monitors are not capable of supporting Freesync 1/2 to the extent that they look just as good, if not better, than nVidia's costly proprietary hardware Gsync implementation would look in the same monitor!...Jeez.

Ex: Monitor #246 in today's test batch has failed compliance. Please remove it from the network line, thank you, and mark accordingly. Monitor #246 has output compliance instruction #12Ae when the contiguity factor of the non-specific instructional underwrite should be #11.75Aef. Although this error normally does not affect screen and/or pixel output, and is not germane to function, it does however signal that Monitor #246 has failed compliance-regulation testing.

So, there are a couple of points that should be absorbed here, imo: *Failing a monitor on a specifications test contrived by nVidia as a non-image-quality test to measure proprietary hardware Gsync support tells us nothing about the quality of nVidia's hardware Gsync implementation for that monitor compared to a Freesync 1/2 implementation. We learn the salient but wholly unimportant fact that a given monitor fails the nVidia hardware Gsync certification compliance test, but are told nothing at all about the monitor's image quality! nVidia seemingly wishes to focus the attention of its prospective customers on specification compliance results as opposed to any actual image quality differences...;) *If nVidia were to actually show how well the open-source Freesync 1/2 works, all without forcing the customer to buy expensive proprietary "solutions" like Gsync, it would soon become apparent to even the slowest among us that the Gsync hardware approach to adaptive sync is the far inferior approach of the two.
waltc3:

Gsync, like all proprietary nVidia "technologies," exists for one purpose only--to turn a profit for nVidia. [...]
You always write the most contrived nonsense... you literally "burped" in your post; who does that? Why would Nvidia certify a G-Sync Compatible display for image quality? The entire point is that they are guaranteeing adaptive refresh works - not that it meets some quality metric. There are people who have reported issues on non-certified monitors when force-enabling it - which you can do and test yourself regardless of whether it's certified. Also, they are showing how well Freesync works - every single monitor that is certified works fine across brands. Done, shown. I know that if I buy a monitor on that list it's going to work on Nvidia and AMD, and that it's not going to have any problems on either.
DLD:

That's exactly why I still don't consider upgrading my monitor - with its "forced" refresh rate of 65 Hz (1080p), an old-school 24-inch IPS Acer gets me "there," into virtual reality. Early adopters will be biting the dust again...
Yeah, I understand that. Prior to upgrading recently, I just couldn't see how much better a gaming monitor could be than the decent Dell UltraSharp I was using. How wrong I was. My new monitor is far, far superior in every way: the black level, less bleed, less IPS glow, fabulous colours. And games like Doom at a high refresh rate are simply amazing!
You've also got to remember that in the early days the Nvidia hardware was required - none of the monitors could do it. Nvidia took a monitor and built the scaler themselves to make it work, then released it. If Nvidia hadn't done that, the monitor makers wouldn't have done it themselves. It took years - first there were only Nvidia's G-Sync scalers, then AMD put in a spec for Freesync, but we didn't instantly end up with amazing Freesync displays; all the early ones sucked. Then, given a few years and a fair amount of copying Nvidia (all the early decent Freesync displays were G-Sync displays too), we've got to where we are today.

Same with 120 Hz monitors - there were none until Nvidia invented 3D Vision and told monitor makers "we need 120 Hz monitors to make this work." 3D Vision has since died, but the whole high-refresh-rate, low-latency monitor boom that's now so key to fast-paced gaming was kick-started by that. Before then, despite 120 Hz TVs existing for years, monitor makers hadn't lifted a finger to bring that to the PC market. AMD basically did the same thing with multi-monitor support and Eyefinity - it existed in the pro world, but AMD pushed it into the everyday PC and gaming world.

Anyway, we've got to thank Nvidia and AMD for where we are today - multi-monitor, high refresh rate, low latency and variable sync. No point trolling them for investing in future technologies and trying to make money off it. It's the same with all tech - the company inventing the tech only does it to make money, so to pay for that investment they charge the earth, but eventually the costs drop. However, without them putting the upfront money in to invent something new, we wouldn't have it today at all.