ASUS ROG Strix XG248Q Adaptive Sync 240 Hz Monitor
Sixtyfps
Pointless unless you have eyes like a fighter pilot
mdrejhon
I am an inventor of TestUFO and of the pursuit camera method used by many reviewers, with a peer-reviewed conference paper coauthored with NIST.gov, NOKIA and Keltek. I was the world's first person to measure the input lag of G-SYNC (in late 2013) and the world's first to test 480 Hz.
So, as an authority on this subject:
There are OTHER mainstream benefits of high Hz completely unrelated to fighter pilots.
1. Higher Hz means fewer stroboscopic effects.
https://www.blurbusters.com/wp-content/uploads/2017/08/project480-mousearrow-690x518.jpg
2. Higher Hz means less motion blur (without needing a flicker-strobe backlight like ULMB).
https://www.blurbusters.com/wp-content/uploads/2014/03/motion_blur_from_persistence.png
3. Display motion blur goes down with higher Hz even in flickerfree (non-strobed) mode.
See the animations at www.testufo.com/eyetracking and www.testufo.com/persistence for examples.
4. Input lag benefits. Higher Hz has less scanout-related latency.
We can already easily tell apart 125 Hz and 1000 Hz mice. Scientific experiments on laboratory displays confirm a similar latency improvement between a 120 Hz display and a 1000 Hz display. Also, at 1000 Hz, even the input lag of perfect VSYNC ON becomes very small (3 frames of latency is only 3 ms).
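To make that arithmetic concrete, here is a back-of-the-envelope sketch (my own illustration in Python; the 3-frame VSYNC ON chain is the example from the sentence above):

```python
# Refresh-rate-related latency shrinks proportionally as Hz rises:
# one refresh cycle lasts 1000/Hz milliseconds.
def frame_time_ms(hz: float) -> float:
    """Duration of one refresh cycle, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4} Hz: refresh cycle = {frame_time_ms(hz):5.2f} ms, "
          f"3-frame VSYNC ON chain = {3 * frame_time_ms(hz):5.2f} ms")
# At 60 Hz the 3-frame chain is 50 ms; at 1000 Hz it is only 3 ms.
```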
For more reading on the mathematics & science:
Blur Busters Law: The Amazing Journey To Future 1000Hz+ Displays
Blurless sample-and-hold (strobeless ULMB, flickerless ULMB) is only possible via ultra-high Hz.
Virtual reality scientists agree. Many of them!
As one example of many, here is an NVIDIA scientist's tweet:
https://www.blurbusters.com/wp-content/uploads/2018/06/img_5b116df080ebb.png
(and many other 1000 Hz confirmations from many other scientists)
Websites need to stop writing non-scientific "X Hz is worthless" misinformation articles. Maybe the difference isn't important to you, or doesn't show visual benefits due to display limitations (fake Hz or slow response), but until you've actually seen a good, properly engineered high-Hz display, do not make that claim. It is TOTALLY WRONG and does a DISSERVICE to the industry to spread this misinformation. I help display manufacturers engineer their computer monitors.

Future GPUs with "frame rate amplification technology" will overcome the GPU-side problem over the coming decade, too. And panel manufacturers have often delayed Hz progress because they doubted -- until they realized they were wrong. Eventually, Blur Busters plans to start calling out non-scientific media websites that still perpetuate these falsehoods (possibly with paid advertisements to shame websites that post future "X Hz is worthless" articles).

Strobeless ULMB/LightBoost can only be accomplished at 500 Hz-1000 Hz, because blurless sample-and-hold requires ultra-short refresh cycles: 1 ms of persistence requires 1 ms refresh cycles to stay flickerfree (see the quick sketch below). New tests by scientists have already confirmed that the diminishing-returns curve does not reach its vanishing point until far beyond 1000 Hz. And higher-Hz displays make lower-Hz displays cheaper! Telling the industry not to raise Hz is tantamount to telling Intel and AMD not to manufacture faster and cheaper CPUs.
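To make the persistence arithmetic concrete, a minimal sketch (illustrative Python, assuming an ideal flickerfree sample-and-hold display where each frame is held for one full refresh cycle):

```python
# On flickerfree sample-and-hold, each frame persists for the whole
# refresh cycle, so persistence (ms) = 1000 / Hz.
def persistence_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 120, 240, 500, 1000):
    print(f"{hz:>4} Hz -> {persistence_ms(hz):5.2f} ms persistence")
# 1 ms persistence without strobing therefore needs ~1000 Hz,
# matching the 500 Hz-1000 Hz figure above.
```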
Thank you for reading this important public service post.
Much appreciated.
"X Hz is useless to everyone" needs to die like the "humans can't tell 30fps vs 60fps" crap. ๐
Cheers,
Chief Blur Buster
False.
Apples versus oranges.
What a fighter pilot sees has nothing to do with refresh rate, because it's a glimpse test of a single flash that isn't brightness-compensated, used for object identification. It's tantamount to flashing a single frame, once, and asking the user to identify the object in it.
I am an inventor of TestUFO and the pursuit camera method used by many reviewers.
mdrejhon
You are welcome!
fantaskarsef
A screen for CS:GO players. I can't imagine many people besides professional gamers feeding a 1080p display with 240 fps. Strictly hardware-wise, that is; not that it wouldn't make sense.
mdrejhon
It has happened again with 480 Hz, where Zisworks (whom I helped with strobe backlight work) launched 480 Hz well before mainstream manufacturers did. I've got contacts with indies that are already experimenting with 1000 Hz -- expect this by the early half of the 2020s. For example, it costs only $200 of FPGA modifications to a $700 DLP projector to make it output motion-flawless 1000 Hz, though with a severe loss of color depth -- and of course it requires FPGA skills. But there are cheap indie/maker/hacker ways to run 1000 Hz experiments, and they will bear fruit by the early 2020s. It may take until the 2030s before cheap 1000 Hz arrives, but we do try to move the needle before the mainstream does.
Even if it's not important to you, we don't need that "30fps vs 60fps" luddite stuff (like "X Hz is worthless") misinforming people in the interim. That stuff belongs in the garbage bin -- any of it will be unceremoniously shot down by Blur Busters.
Cheers,
Chief Blur Buster
There is work being done on "frame rate amplification technologies" that will raise framerates fairly cheaply (more framerate per dollar) without adding input lag.
Oculus' spacewarp tech is one of these (45fps -> 90fps). It's very rough and Wright-brothers at the moment compared to what will come -- tomorrow's frame rate amplification tech will be built more directly into silicon, and will artifactlessly and laglessly achieve 100fps -> 1000fps amplification. Imagine midrange GPUs doing 1000fps cheaply, within our lifetimes.
This is in overdrive at the moment because of virtual reality, and the benefits will eventually filter down over the coming decade to midrange cards for desktop gaming monitors.
Consoles are finally jumping on HFR (120 Hz Xbox), and LG demonstrated HFR streaming (120fps movies) at CES 2018. It takes time for those things to become much more widespread.
Thanks to VR, there is incredible lab work, but it's going to take years, since much of the focus is on increasing pixel count: higher framerates at 4K and 8K.
Also, it's a vicious cycle -- higher resolutions massively amplify the motion-clarity limitations of Hz. A 4K 120Hz LCD degrades motion clarity, in relative-percentage terms, more than a 1024x768 60Hz LCD does.
For one-screen-width-per-second horizontal panning motion, measured as the length of the TestUFO blur trail behind moving UFO objects (reproduced in the sketch below):
--> 1024x768 at 60fps: blur trail of 1024/60 ≈ 17 pixels behind each moving object (roughly 17x blurrier than stationary graphics) -- a 17:1 degradation in image sharpness between motion and stationary.
--> 3840x2160 at 120fps: blur trail of 3840/120 = 32 pixels behind each moving object (roughly 32x blurrier than stationary graphics) -- a 32:1 degradation in image sharpness between motion and stationary.
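The two bullets above reduce to a one-line formula (a sketch under the stated assumption of one-screen-width-per-second panning on a sample-and-hold display):

```python
# Blur trail length (px) = horizontal resolution / frames per second,
# for one-screen-width-per-second panning on a sample-and-hold display.
def blur_trail_px(width_px: int, fps: int) -> float:
    return width_px / fps

print(blur_trail_px(1024, 60))    # ~17.1 px at 1024x768 @ 60 fps
print(blur_trail_px(3840, 120))   # 32.0 px at 3840x2160 @ 120 fps
# Doubling the frame rate halved blur per refresh, yet the 4K panel
# still shows a longer absolute blur trail (32 px vs 17 px).
```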
So you see, higher resolution amplifies the visibility of sample-and-hold motion blur. Naturally de-blurring retina graphics in fast motion (stroblessly, since real life doesn't strobe/flicker) will require extremely high frame rates at extremely high refresh rates. The more retina a display becomes, the lower the persistence you need to completely eliminate display motion blur. And the only way to achieve strobeless low persistence is ultra-high fps at ultra-high Hz.
Certainly, you need bigger jumps upwards to see humanly visible benefits.
e.g. 60Hz -> 120Hz -> 240Hz -> 480Hz -> 960Hz
Milking the diminishing-returns curve requires progressively bigger jumps upwards to see any benefit (see the sketch below). Obviously, the first time a new high Hz is achieved, it's often not fully efficient (e.g. response-time limitations), but perfect efficiency means a perfect halving of motion blur for every Hz doubling (for comfortably flickerfree sample-and-hold). So saying "240Hz is garbage because I can't tell apart 120Hz vs 144Hz" does not acknowledge how the curve behaves. Each next jump upwards gets increasingly difficult, especially while keeping retina resolutions.
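A quick sketch of how that curve behaves (illustrative numbers, assuming perfect efficiency, i.e. blur proportional to 1/Hz):

```python
# Each halving of sample-and-hold blur costs a full doubling of Hz,
# so the absolute Hz jump needed keeps growing.
prev_hz = None
for hz in (60, 120, 240, 480, 960):
    blur_ms = 1000.0 / hz   # persistence-driven blur, in milliseconds
    note = "" if prev_hz is None else f" (+{hz - prev_hz} Hz for this halving)"
    print(f"{hz:>4} Hz: {blur_ms:5.2f} ms blur{note}")
    prev_hz = hz
# 60->120 costs +60 Hz; 480->960 costs +480 Hz for the same visual halving.
```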
Strobe-based blur reduction (ULMB/LightBoost) is wonderful, but strobing ain't Holodeck Star Trek final frontier stuff -- blurless+flickerless+strobeless (analog refreshrateless display) is more human-eye natural, but we can't go analog, so we have to go ultra-high-Hz to simulate analog naturalness.
Eventually the resolution race will be over once everything is "retina", and one of the last remaining races will be temporal resolution (which requires raising Hertz, for people who don't like input lag or strobing). We're worsening our motion-blur degradation deltas (clarity of stationary versus motion) by going to higher resolutions, which will eventually create more pressure to go far beyond 60Hz over the longer term.
Yesterday, plasma cost an arm and a leg. Today, 4K TVs are almost the same price as 1080p HDTVs (Best Buy has almost stopped selling 1080p TVs now). The same thing may happen to 1000Hz when it costs only a few dollars more than 120Hz -- let's imagine, 50 years from now. Who knows? When consoles play at 120-240Hz, competitive gamers may already be in the 1000Hz leagues.
The progress is much slower than for SSDs and CPUs, but there's already a Hertz version of Moore's Law starting up, where Hz doubles approximately every 5-10 years, as currently observed.
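Purely to illustrate that doubling claim (a toy extrapolation of my own, not a Blur Busters forecast):

```python
import math

# Years to reach a target Hz if Hz doubles every 'doubling_years' years.
def years_to_reach(current_hz: float, target_hz: float, doubling_years: float) -> float:
    return doubling_years * math.log2(target_hz / current_hz)

for period in (5, 10):
    print(f"240 -> 1000 Hz, one doubling per {period} years: "
          f"~{years_to_reach(240, 1000, period):.0f} years")
# ~10 years in the fast case, ~21 years in the slow case --
# i.e. the "decade or few" mentioned below.
```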
Sure, this 1000Hz stuff is obviously "long term", "in a decade or few" territory.
The first 1000Hz indie displays will arrive in the early 2020s. Indies came out with 240Hz in the year 2013 -- several years before mainstream manufacturers did.
Corrupt^
There are very few engines where even a TITAN could hold a constant 240.
Only the very efficient Doom 2016 engine comes to mind; it was putting out massive framerates even on my GTX 1080.
Anyway, I've grown more "casual", but I'm still very nitpicky about latency, so I've settled somewhere around 120 fps & 120 Hz... until 120 fps becomes mainstream and 240 becomes way more affordable.
Personally I wish NVIDIA/ATI also came with some integer scaling mechanism so that 720p looks crisp on a 1440p display (as 1 pixel of 720p maps exactly onto a 2x2 block of pixels at 1440p).
That way I could play my casual games at 1440p and play my more serious stuff at 720p.
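Integer scaling like that is simple to express; here is a minimal sketch (a hypothetical illustration in Python/NumPy, not an actual NVIDIA/ATI driver feature):

```python
import numpy as np

# Nearest-neighbour integer scaling: each source pixel becomes an exact
# factor x factor block, so 720p -> 1440p stays perfectly crisp.
def integer_scale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

frame_720p = np.zeros((720, 1280, 3), dtype=np.uint8)   # height x width x RGB
frame_1440p = integer_scale(frame_720p, 2)
assert frame_1440p.shape == (1440, 2560, 3)              # exact 2x2 mapping
```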
mdrejhon
When one doubles refresh rates, one does have to up the game everywhere else. 240Hz can easily feel worthless in some situations (e.g. 30fps at 240Hz isn't going to be noticeably better than 30fps at 144Hz!) unless you do a lot to eliminate the weak links. Just replacing the monitor is not always enough.
-- GPU upgrade with more framerate, or playing older games (e.g. CS:GO)
-- Engine upgrade to handle more framerate without other problems getting in the way (e.g. microstutters) -- an engine performing as smoothly as TestUFO.
-- Mouse upgrade, since mouse microstutters can become a huge weak link at high Hz (making it impossible to tell 120fps vs 240fps). Upgrade your mouse mat too -- make sure a mouse turn left/right is as smooth as a keyboard strafe left/right -- before judging monitor Hz.
For VSYNC OFF and VRR operation, a mouse poll rate that is unsynchronized with the refresh rate should be at least 4x the display refresh rate (see the sketch below). We've already noticed human-visible microstuttering with 1000Hz gaming mice during strobed/ULMB operation that is only fixable at 2000Hz+ (or by other means, such as perfectly synchronizing the poll rate to the refresh rate).
-- Panel upgrade, since early 240Hz monitors had poor overdrive tuning (pixel-response limitations that don't halve motion blur relative to 120Hz). This happened to a couple of early 240Hz monitors.
-- Impeccable framepacing. Framepacing errors should ideally be a tiny fraction of a refresh cycle. A framepacing error of 4ms doesn't matter at 60Hz, but it produces mega-microstutter at 240Hz (~4ms refresh cycles).
At 8000 pixels per second of panning motion (one screen width of panning in half a second at 4K), a 1ms gametime error generates an 8-pixel jump -- still human-visible as a single microstutter in a TestUFO-style motion test! So gametimes and framepacing should ideally become sub-millisecond accurate during 240Hz+ operation. CS:GO is capable of that nowadays, but many games cannot achieve such framepacing accuracy. VRR operation reduces the need for framepacing accuracy to an extent, so 480Hz VRR will help; that said, the framepacing accuracy required to avoid a visible microstutter (still visible at 240Hz and 480Hz) jumps up a lot.
Sure, the microstutter vibration amplitude halves at twice the Hz, but it remains human-visible (e.g. 479fps at 480Hz still produces 1 slightly visible TestUFO stutter per second). Everything becomes so sensitive to microstutter, due to the huge demands on consistent frametimes, that bad microstutter harmonics can still become visible -- such as pileups of delays, where a 4ms pause means 2 missed refresh cycles at 480Hz (1/480 sec ≈ 2ms)! In fact, motion problems from microstutters are still (barely) visible even at 1000fps @ 1000Hz in scientific tests, so we're still not at the vanishing point of the diminishing-returns curve. Yet. The invention of Hertz (the human idea of using a series of static images to represent moving imagery) is still a royal pain, with artifacts such as tearing, microstuttering and latency that only gradually diminish until the next weak link is hit (e.g. the GPU, the mouse, etc).
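Here is the arithmetic from the list and paragraphs above in one place (an illustrative sketch; the 4x poll-rate factor is the rule of thumb stated earlier):

```python
PANNING_PX_PER_SEC = 8000   # ~one 4K screen width per half second

# Pixel jump caused by a gametime/framepacing error while panning.
def stutter_jump_px(error_ms: float) -> float:
    return PANNING_PX_PER_SEC * error_ms / 1000.0

# Rule of thumb above: unsynchronized mouse poll rate >= 4x refresh rate.
def min_mouse_poll_hz(refresh_hz: float) -> float:
    return 4 * refresh_hz

# Refresh cycles lost to a pause of pause_ms at a given refresh rate.
def missed_refreshes(pause_ms: float, refresh_hz: float) -> float:
    return pause_ms * refresh_hz / 1000.0

print(stutter_jump_px(1.0))         # 8.0 px jump from a 1 ms error
print(min_mouse_poll_hz(240))       # 960 Hz minimum poll rate at 240 Hz
print(missed_refreshes(4.0, 480))   # ~1.9 refresh cycles lost to a 4 ms pause
# And a 479 fps stream on a 480 Hz display beats at |480 - 479| = 1 stutter/sec.
```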
Either way:
The leap to true, genuine display refresh rates of 480Hz, 960Hz, 1000Hz will probably take a couple of decades to mature (including weak links in software and other computer accessories), but it is definitely worthwhile, with lots of solvable problems that engineers are currently working on.
The GPU side of the equation is the hardest one. The "frame rate amplification technology" part will be the biggest issue -- extra framerate for cheaper is critical -- especially since framerates will be kept down by things like real-time ray tracing. However, that does not preclude continued increases in framerates over the long term. Average 3D framerates today are still higher than 20 years ago, so there is progress on average.