AOC Adds Three G90 3-Sided Frameless Gaming Monitors

So basically the same as the old monitors, but since they are frameless, they are more expensive. Definitely worth it when you're a gamer and all you care about is a cheap, well-performing monitor...
From a productivity standpoint the slim frame is good as well - less dark area and less contrast between screen and frame, which means less eyestrain and less distance to move your eyes, keeping more in peripheral vision. Higher resolution would be nice though 🙂 For me, I'd pay more for a frame like that.
From what I understand G-Sync just died
Raider0001:

From what I understand G-Sync just died
Why?
Neo Cyrus:

Why?
Does it offer any advantages over FreeSync now? It's just more expensive. The END
Raider0001:

From what I understand G-Sync just died
Educated consumers still buy nVidia. In fact, most everybody does. Logic.
sammarbella:

It also offers the unique "advantage" of enjoying its adaptive sync feature ONLY with lower performance (fewer FPS) LOL
Who invented that bullshit? Because I came here prepared [youtube=2CE-wSU1KMw]
sammarbella:

The AMD PR stunt did marvels with some people. You came in blinded, like the AMD PR stunt. It was a blind and completely subjective "test" because both monitors were locked at the exact same Hz to avoid the need to show AND unlock FPS. The winner was obviously the cheaper FreeSync monitor's price tag, not the Vega GPU. My $5,000 car will always win a blind performance test at a locked speed of 30 km/h against a $100,000 Ferrari at the same locked speed. It's cheaper and has the same speed for less money! LOL
You are wrong (they were comparing technologies, not graphics cards)... In the video, according to you, they came in blinded - is that correct? Oh, and that test is reproducible... Nvidia never did their own version...
Nothing of interest here, move along now................
TimmyP:

Educated consumers still buy nVidia. In fact, most everybody does. Logic.
I know that, but I would not buy a Vauxhall Astra diesel over a Mustang, per se, just because the Vauxhall is more efficient and has a higher top speed - there are other factors... Unlike with cars, we only have two GPU brands.
I still see a frame (albeit a smallish one) on those other three sides. I therefore call shenanigans on this "frameless" claim.
Nolenthar:

The accurate and unbiased truth is that G-Sync offers a wider variable refresh range than FreeSync, with capabilities down to 30 Hz and up to 240 Hz, whereas FreeSync monitors generally have a lot less leeway (most are 48 to 75 Hz). G-Sync is also a "premium" technology which is often integrated into higher-quality monitors, whereas FreeSync is a bit hit and miss and can be integrated into cheap panels. Not to mention the fact that Nvidia has the uncontested performance crown. So yeah, while you might be right that FreeSync is helping make adaptive refresh technology affordable and mainstream, it will take some time before G-Sync dies. It might indeed do so, as more expensive TV sets release with FreeSync and force Nvidia to support it, but until then G-Sync is well present and the big stars (4K and ultrawide HDR monitors) are all releasing with G-Sync.
Did you read the content of this news? There are FreeSync monitors with ranges from 30 Hz up. The performance crown of a given GPU is not a factor in the performance of a given adaptive sync technology... these are two separate things. G-Sync has no meaning or purpose; it is wasted cash and resources. You can have a ~$10 30Hz-240Hz FreeSync monitor driver chip (which is just a new version of a $10 non-FreeSync chip) or a $200 G-Sync board with the same spec.
LordKweishan:

Freesync vs GSync: One of the key aspects most people forget or don't even know about is that you need to OVERDRIVE the voltage to the pixels currently for higher refresh rates. This creates unique PROBLEMS for variable refresh monitors since the amount of voltage to apply changes depending on how quickly it needs to respond. This can result in smeared or distorted colors. ...GSync as a hardware module has things like this in mind...
https://imgur.com/download/lh0vJAl What is this sorcery doing in the menu of my cheap FreeSync panel? Oh no, Nvidia is not the owner of the overdrive feature?!
LordKweishan:

Combining the larger RANGE of brightness to darkness that HDR offers with the variability of each frame, it MAY be that a hardware module like G-Sync is needed for the best experience.
HDR is available with FreeSync 2-enabled panels
Nolenthar:

My post was more general than those few monitors, which indeed have a good range, but having FreeSync on a monitor doesn't mean you'll have such a variable range. G-Sync guarantees it.
Since when is G-Sync a quality certificate? Because my brother has one of those expensive G-Sync enabled Asus laptops - it doesn't have an overdrive function and that matrix is really not good. Do we have to reach a conclusion about which adaptive sync technology is better at 5-seconds-per-frame slideshow smoothness?
Lee83ant:

... FreeSync monitors may be priced better than G-Sync, but does that really matter when you can't buy an AMD GPU in the first place?
Yes, it matters a lot, because FreeSync isn't going anywhere, but the performance crown might in the near future
Raider0001:

Does it offer any advantages over FreeSync now? It's just more expensive. The END
It works on nVidia cards. nVidia has the large majority of the market which would want any type of adaptive sync. And yes, according to every analysis ever, GSync is more consistent across brands due to nVidia's strict requirements and that otherwise worthless module that jacks up the price even more. GSync isn't going anywhere anytime soon.
Raider0001:

From what I understand G-Sync just died
Raider0001:

Does it offer any advantages over FreeSync now? It's just more expensive. The END
Raider0001:

Did you read the content of this news? There are FreeSync monitors with ranges from 30 Hz up. The performance crown of a given GPU is not a factor in the performance of a given adaptive sync technology... these are two separate things. G-Sync has no meaning or purpose; it is wasted cash and resources. You can have a ~$10 30Hz-240Hz FreeSync monitor driver chip (which is just a new version of a $10 non-FreeSync chip) or a $200 G-Sync board with the same spec.
I know you're here simply trying to confirm your biases without actually looking for legitimate answers. Others have given you the advantages of G-Sync over FreeSync and why it's not dead. Going to reiterate, including the advantages of FreeSync vs. G-Sync, so that it's all here:

1) Abundance: FreeSync >>> G-Sync. Advantage: FreeSync, by a long shot.

2) Cost: the G-Sync premium is generally $200-$250. Advantage: FreeSync, by a long shot.

3) Refresh rate range: G-Sync: a 30Hz minimum refresh rate is a requirement, regardless of the monitor in question and regardless of the maximum refresh rate (60, 100, 120, 144, 165, 180, 240). All monitors support going below the minimum limit without disabling G-Sync functionality. FreeSync: nothing is a requirement. Some Korean monitors even have a laughable 48-60Hz range. Low Framerate Compensation (LFC) is only enabled when the spread between the minimum and the maximum is at least 2.5x. You can adjust the range via CRU, and for several monitor models you can actually achieve the 2.5x spread (but not necessarily on every sample of that model). For some monitors like the MG278Q, if you want 144Hz as the maximum refresh rate, then your minimum HAS to be 57Hz. Some samples can hit 55Hz, others might just miss that 57Hz (e.g. 58Hz). Advantage: G-Sync, by a long shot.

4) Response times: G-Sync: G-Sync modules are fine-tuned for each panel that is G-Sync certified. Variable overdrive is implemented so that when your refresh rate changes, you don't get ghosting / overshoot. Static overdrive on a variable refresh rate monitor is a bad idea, and Nvidia knew this from the start. Very infrequently do you get complaints of G-Sync monitors flickering, except on menus and in cases where the framerate drops to zero or close to it. FreeSync: nothing is guaranteed. It is up to the monitor manufacturer to implement overdrive correctly. So you get a bunch of monitors that have variable overdrive working, another bunch with ghosting / overshoot, another bunch with borked overdrive settings when FreeSync is enabled, etc... Complaints of flicker are much more frequent on FreeSync monitors, sometimes going as far as affecting an entire model. Advantage: G-Sync, by a long shot.

5) Strobing: G-Sync: every single monitor (perhaps with an exception or two) implements ULMB. With TN panels, ULMB supports higher refresh rates, close or equal to the maximum the monitor is capable of in normal / G-Sync mode. With IPS / AHVA panels, it is often limited to lower refresh rates than the monitor is capable of in normal / G-Sync mode. ULMB generally works, but is not necessarily perfectly implemented on every model. FreeSync: nothing is guaranteed. It is up to the manufacturer to implement strobing or not - some implementations work great, others are laughable. Advantage: G-Sync, by a long shot.

6) Compatibility: G-Sync works all the way back to the GTX 680 (released in March 2012). FreeSync works all the way back to the 290X (released in November 2013). Advantage: G-Sync, if ever so slightly now (the 680 is simply outdated - the 290X is still a very capable card).
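To make that LFC point concrete, here is a minimal sketch of how LFC-style frame multiplication can work. This is my own illustration, not AMD's actual algorithm; the function name and the 48-144Hz range are hypothetical examples.

[code]
# Minimal sketch of LFC-style frame multiplication (illustrative only, not AMD's code).
# The 48-144Hz range below is hypothetical; real monitors report their range via EDID.

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Return the refresh rate the panel would actually run at for a given frame rate."""
    # LFC only activates when the range is wide enough (the 2.5x spread mentioned above).
    lfc_capable = vrr_max / vrr_min >= 2.5

    if fps >= vrr_min:
        # Inside the VRR window: the refresh rate simply tracks the frame rate.
        return min(fps, vrr_max)

    if not lfc_capable:
        # Below the window with no LFC: fall back to fixed-refresh behaviour
        # (tearing or classic V-Sync judder at the panel's minimum).
        return vrr_min

    # Below the window with LFC: show each frame N times so the effective
    # refresh rate lands back inside the supported range.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

print(lfc_refresh(30.0))  # 60.0 - each 30 FPS frame is displayed twice at 60Hz
[/code]

The wide spread matters exactly because of that multiplication step: on a narrow 48-75Hz panel there are frame rates (around 40 FPS, for example) where neither the original rate nor any multiple of it fits inside the window.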
Raider0001:

https://imgur.com/download/lh0vJAl What is this sorcery doing in the menu of my cheap FreeSync panel? Oh no, Nvidia is not the owner of the overdrive feature?!
You're being dense on purpose. Reread what he said. He talked about variable overdrive. The option you have just shown us controls the overdrive level, which could be variable (probably not), static (highest probability), or not working at all (still likely with some FreeSync monitors...).
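For what it's worth, the practical difference between static and variable overdrive is just whether the overdrive strength is chosen once or re-chosen as the refresh rate moves around. A rough sketch under assumed numbers - the calibration table and level values below are made up for illustration; a real scaler does this in firmware, tuned per panel:

[code]
# Hypothetical per-refresh-rate overdrive calibration for one panel (made-up values).
CALIBRATION = {60: 1, 100: 2, 144: 3, 165: 3}  # refresh rate (Hz) -> overdrive level

def static_overdrive() -> int:
    # Static overdrive: one level, tuned for the maximum refresh rate and applied
    # no matter what the current frame interval is. At low VRR refresh rates this
    # overshoots and produces inverse ghosting.
    return CALIBRATION[max(CALIBRATION)]

def variable_overdrive(current_refresh_hz: float) -> int:
    # Variable overdrive: every frame, pick the level calibrated for the nearest
    # refresh rate at or below the current one.
    eligible = [hz for hz in CALIBRATION if hz <= current_refresh_hz]
    return CALIBRATION[max(eligible)] if eligible else CALIBRATION[min(CALIBRATION)]

print(static_overdrive())        # 3 - too aggressive when VRR has dropped to ~60Hz
print(variable_overdrive(62.0))  # 1 - matched to the actual frame interval
[/code]

An OSD option like the one in that imgur screenshot only tells you an overdrive control exists; it doesn't tell you which of the two behaviours above the scaler actually implements.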
Since when is G-Sync a quality certificate? Because my brother has one of those expensive G-Sync enabled Asus laptops - it doesn't have an overdrive function and that matrix is really not good. Do we have to reach a conclusion about which adaptive sync technology is better at 5-seconds-per-frame slideshow smoothness?
Nice of you to talk about G-Sync as a whole, then cite ONLY a particular subset of G-Sync monitors - laptops. Not only are they a relatively new breed, they also lack the G-Sync module that I assume you want dead (understandable, given its cost - not understandable, given variable overdrive - which could perhaps be implemented GPU-side with a display standard update, or not). If the display does not give you a menu with an overdrive function, it does not mean that it lacks overdrive - that would be madness, particularly if the display in question has an IPS-type panel. As for the "matrix" (I assume you mean the panel?) not being good - G-Sync is not necessarily a guarantee of good panel quality. It is more so a guarantee of fluidity and cleanliness of motion - something gamers tend to care more about (although I disagree with the trajectory many gamers take with TN panels with absolutely horrible color presets but "low" response times).
Yes, it matters a lot, because FreeSync isn't going anywhere, but the performance crown might in the near future
Time is money, and waiting for AMD to catch up might be worth less to someone than just ponying up the $200-$250 extra for a G-Sync monitor, getting a guarantee of motion smoothness / cleanliness, and pairing that with a GPU that performs (often much) better than what the competition from AMD is capable of (e.g. Vega 56 / 64 vs. 1080 Ti). Now, the latest Titan V release - although that GPU is clearly not meant for gaming, and yields are horribly low - is an indicator that Nvidia are (if only slightly) more than a generation ahead of AMD. Their GTX 1080 released in the middle of last year (2016). AMD's competition arrived a year later, with considerably worse power draw and comparable (sometimes worse, sometimes better) performance. Let's not talk about the laughable Vega 64 - that card is just nonsense. If Nvidia manage to get yields up in the next few months to a year, then by the time AMD has any response to the GTX 1080 Ti (if they can save themselves from Vega), Nvidia would already have a big, fat Volta GPU on the market. So, in conclusion, it really doesn't make sense to talk of a shifting performance crown when that crown is not likely (read: at all likely) to change hands any time soon.
yasamoka:

I know you're here simply trying to confirm your biases without actually looking for legitimate answers. [...]
Have you seen the video link I put up there? There are people not seeing variable overdrive; it's virtual, and response times too, because AMD already won the input lag war in VR, which I assume works everywhere else too. G-Sync features are similar to some audio-voodoo stuff really, or Apple marketing. Good that you write so much while putting up zero proof that any of it works in reality. Diamond cables work for ya? Good. It is obvious to me that Nvidia would do absolutely everything in their power to make sure you buy their $200 crap based on nothing but numbers on a web page. And of course Nvidia, being the single best monitor driver chip manufacturer, got the best driver right at version 1.0, while all the other brands have battled their way to survive for ~50 years? G-SYNC: nothing is mandatory! Not even a G-Sync board inside.
Raider0001:

Have you seen the video link I put up there? There are people not seeing variable overdrive; it's virtual, and response times too, because AMD already won the input lag war in VR, which I assume works everywhere else too.
More nonsense. I have no idea what you're getting at. Show input lag figures for Nvidia vs. AMD where you "assume" they work.
G-Sync features are similar to some audio-voodoo stuff really, or Apple marketing. Good that you write so much while putting up zero proof that any of it works in reality. Diamond cables work for ya? Good.
I have a G-Sync monitor (ViewSonic XG2703-GS) and I have tested what I said up there. What do you have to go on? Some nonsense about your brother's G-Sync laptop? Sure. Your method of proving things by "assumption" seems to work really well. If you have any contention with what I have said up there, then counter-argue. I will have to take you for a kid if all you can do is respond that I have provided zero proof. But yes, erect strawmen as if we are people who would buy expensive cables that offer no advantage. Such great arguing skills - we all know more than you do here, and you're not fooling us at all.
It is obvious to me that Nvidia would do absolutely everything in their power to make sure you buy their $200 crap based on nothing but numbers on a web page. And of course Nvidia, being the single best monitor driver chip manufacturer, got the best driver right at version 1.0, while all the other brands have battled their way to survive for ~50 years?
Unlike you, we here actually own G-Sync monitors and have tested them extensively. You have nothing to go by, we have everything to go by.
G-SYNC: nothing is mandatory! Not even a G-Sync board inside.
More crap.
yasamoka:

6) Compatibility: G-Sync works all the way back to the GTX 680 (released in March 2012). FreeSync works all the way back to the 290X (released in November 2013). Advantage: G-Sync, if ever so slightly now (the 680 is simply outdated - the 290X is still a very capable card).
FreeSync is supported back to the HD7000 series. Aside from that, I really don't care to read any more because this thread appears to have turned into another pathetic AMD bashing thread, just like every other thread that mentions anything even remotely related to AMD. From what I've seen over the last few years, this is no longer a hardware enthusiast forum. It's an Intel/NVidia enthusiast forum. Any time AMD is mentioned to any extent, all the Intel or NVidia loyalists pop up and start bashing AMD. It's time you people grow up. You're actually damaging this forum's reputation.
sykozis:

FreeSync is supported back to the HD7000 series.
Only for video playback. We've never seen it in action even then.
Aside from that, I really don't care to read any more because this thread appears to have turned into another pathetic AMD bashing thread, just like every other thread that mentions anything even remotely related to AMD. From what I've seen over the last few years, this is no longer a hardware enthusiast forum. It's an Intel/NVidia enthusiast forum. Any time AMD is mentioned to any extent, all the Intel or NVidia loyalists pop up and start bashing AMD. It's time you people grow up. You're actually damaging this forum's reputation.
Are you seriously suggesting that *I'm* an Nvidia fanboy? Dear, you must have a goldfish memory. Yeah, whenever someone posts what's actually going on in the GPU market, and criticizes one of the companies, people jump to call them a fanboy. Try your nonsense with someone else; you know my post history over the years on this forum - to accuse me of being a fanboy is utterly laughable.
yasamoka:

Only for video playback. We've never seen it in action even then. Are you seriously suggesting that *I'm* an Nvidia fanboy? Dear, you must have a goldfish memory. Yeah, whenever someone posts what's actually going on in the GPU market, and criticizes one of the companies, people jump to call them a fanboy. Try your nonsense with someone else; you know my post history over the years on this forum - to accuse me of being a fanboy is utterly laughable.
I'm well aware of your posting history. You seem to completely miss my point here though. This thread actually has nothing to do with AMD. This thread concerns monitors from AOC. Yes, those monitors have "FreeSync" as a listed feature. Big deal. If people want a FreeSync monitor, so be it. I own one myself. I've had no issues with it. However, in my case, FreeSync is completely useless since every game I play runs well in excess of 200fps, resulting in a framerate limit set to 60fps for most games. That aside, there was no reason for NVidia to be mentioned in this thread, as these monitors don't support G-Sync. This thread also has nothing to do with the GPU market, seeing as the news article is specifically about 3 new (and vastly overpriced) AOC monitors. Of course, at no point did I refer to anyone as a "fanboy", nor did I make such an accusation of you, but you respond to me with childish insults. I don't know how old you are and quite frankly I don't care. Childish behavior is damaging to a tech forum's reputation and credibility, regardless of the accuracy of the information. You moved from AMD to NVidia. Good for you. I moved from NVidia back to AMD again in my main system. It made more sense than sticking with NVidia when I had already planned to switch to Ryzen upon its release. I personally have no preference with regard to hardware. I buy what fits my needs at the time. My point still stands though. There was mention in the news post of a feature specific to AMD and the thread turned toward bashing AMD, as usual. Support, regardless of how limited, is still support. I could always throw my HD7950 in and test it, but what purpose would that serve aside from irritating me? I don't actually use FreeSync anyway.