Acer XB280HK is a 28-inch 4K Ultra HD monitor with G-Sync

https://forums.guru3d.com/data/avatars/m/211/211933.jpg
Why would you need G-Sync anyway? I still believe screen tearing is made up and doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
You'd be surprised how wrong you are. Screen tearing is real and it's out to get you! It depends on what games you play, I guess; I don't mind it that much myself, but there are a lot of people who can't play because of it.
https://forums.guru3d.com/data/avatars/m/199/199386.jpg
Wow, this and two 780 Ti 6GB cards would set me back £2000. I much prefer lower-scale gaming for now; perhaps when the GTX 990 is here.
I think the problem is deeper than that. For affordable 4,096-horizontal-pixel gaming there needs to be a complete redesign of the basic PC architecture. AMD, Intel, IBM, Apple, nVidia, Samsung, Microsoft and considerable contribution from the global community need to come up with a clean new design from scratch. All this x86 crap needs to go. Too expensive, too much power draw and running out of steam fast. When I was a kid, I was told we'd be in flying cars by 2015 - but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle, ffs. This is what happens when incremental increases are more profitable. /gets off soapbox.
https://forums.guru3d.com/data/avatars/m/235/235224.jpg
4K G-Sync, do want 😀
https://forums.guru3d.com/data/avatars/m/227/227994.jpg
Why would you need G-Sync anyway? I still believe screen tearing is made up and doesn't actually exist.
I'm guessing you also believe there's no difference between 30 and 60 FPS, because the human eye can't see past 30 FPS, eh?
https://forums.guru3d.com/data/avatars/m/238/238382.jpg
Why would you need G-Sync anyway? I still believe screen tearing is made up and doesn't actually exist.
Is this like a joke comment?
https://forums.guru3d.com/data/avatars/m/182/182702.jpg
Why would you need G-Sync anyway? I still believe screen tearing is made up and doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
Nice troll bro 😀
https://forums.guru3d.com/data/avatars/m/115/115616.jpg
Wow, this and two 780 Ti 6GB cards would set me back £2000. I much prefer lower-scale gaming for now; perhaps when the GTX 990 is here.
Well, G-Sync drastically reduces the hardware requirements, as you don't need your system to stay at 60fps+ 100% of the time. So if you accept frame rates around 40, a single 780 Ti or your dual 760s will do.
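To put rough numbers on that point (a simplified sketch with assumed frame times, not a benchmark of this monitor): a game rendering at about 40fps on a fixed 60Hz display with v-sync gets its frames quantized to refresh boundaries, while a variable-refresh display simply shows each frame when it is done.

# Simplified frame-pacing sketch for the point above (assumed numbers, not a benchmark).
# A game finishes a frame every 25 ms (~40 fps). A fixed 60 Hz display with v-sync can
# only flip on 16.7 ms refresh boundaries; a variable-refresh (G-Sync-style) display
# flips as soon as a frame is ready. Buffering stalls are ignored to keep this short.
import math

RENDER_MS = 25.0            # assumed render time per frame (~40 fps)
REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz refresh interval

def fixed_vsync_times(n):
    """Each finished frame waits for the next 60 Hz refresh boundary."""
    return [math.ceil(i * RENDER_MS / REFRESH_MS) * REFRESH_MS for i in range(1, n + 1)]

def variable_refresh_times(n):
    """Each finished frame is displayed immediately (within the panel's range)."""
    return [i * RENDER_MS for i in range(1, n + 1)]

def intervals(times):
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("fixed 60 Hz + v-sync:", intervals(fixed_vsync_times(8)))      # alternates ~16.7 / ~33.3 ms (judder)
print("variable refresh    :", intervals(variable_refresh_times(8))) # steady 25.0 ms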
data/avatar/default/avatar24.webp
Well, G-Sync drastically reduces the hardware requirements, as you don't need your system to stay at 60fps+ 100% of the time. So if you accept frame rates around 40, a single 780 Ti or your dual 760s will do.
Yeah, but if you're packing a display like this, what's the point of medium/high textures instead of ultra? :P I'm guessing single 4K displays are hitting the 3GB VRAM limit in high-end games already (or they probably should be, if the textures are up to snuff), and that's only going to get worse over the next two years. So bundling two 2GB GPUs or two 3GB GPUs with a display like this almost seems like a waste :/
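For a sense of the raw numbers behind the VRAM point (back-of-the-envelope only; the byte-per-pixel and buffering figures are assumptions for illustration, and textures rather than framebuffers are what actually fill a card):

# Back-of-the-envelope pixel/framebuffer math for the VRAM point above.
# (Illustrative only: high-res textures, render targets and shadow maps are what
#  actually eat multi-gigabyte VRAM, and they scale with quality settings.)

def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Swap chain size alone, assuming 32-bit color and triple buffering."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

px_1080p = 1920 * 1080   # ~2.07 million pixels
px_4k    = 3840 * 2160   # ~8.29 million pixels

print(f"4K has {px_4k / px_1080p:.0f}x the pixels of 1080p")
print(f"1080p swap chain: ~{framebuffer_mib(1920, 1080):.0f} MiB")
print(f"4K    swap chain: ~{framebuffer_mib(3840, 2160):.0f} MiB")
# The framebuffers themselves are small; the pressure on a 2-3 GB card comes from
# every resolution-dependent buffer and texture set scaling up alongside them.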
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
All this x86 crap needs to go. Too expensive, too much power draw and running out of steam fast.
I don't get it - how exactly is x86 a problem when it comes to high-res gaming? And what do you propose to improve it? x86 may be old, kind of messy (but it is CISC, after all...) and a bit power hungry, but there's a certain point where x86 becomes very efficient, if not more efficient than any other architecture. Try getting an ARM CPU to compete with an i7 in terms of performance-per-watt and the ARM will most certainly fail miserably. When you try to get an x86 CPU to compete against the power draw of an existing ARM CPU, the ARM will most likely perform better. This is my gripe with Intel - they want to dominate EVERYTHING, but x86 is not a one-size-fits-all architecture by any means.
When I was a kid, I was told we'd be in flying cars by 2015 - but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle, ffs.
I both agree and disagree. x86 should have been obsoleted a long time ago, but in the Windows world, software compatibility would be a nightmare if that were the case. But why go beyond 64-bit architectures? In the server world, where software compatibility in new systems often doesn't matter at all, they still stick with 32-bit and 64-bit architectures. Every year servers have the opportunity to increase the bus width, but they don't. GPUs are the only exception, but their operation isn't comparable to a CPU's. If money weren't in the equation, we'd likely all own a quantum computer at this point. But since that isn't the case, and since companies only do things in their own interest, your demands seem very naive.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Note that AMD64 instructions are also backward compatible with both 16-bit and 32-bit. This is why you can run full 32-bit software on it, and not the inverse; x86_64 is not 32-bit extended to 64-bit, it is fully 64-bit but backward compatible with them.
x86-64 is backward compatible on a HARDWARE level. In contrast, I believe IA64 is a 64-bit x86 architecture, but it isn't backward compatible on either a hardware or software level. This brings me back to my point about Windows being an issue - if you want newer, better architectures, the software has to be designed for them. 32-bit Windows is STILL prevalent. For whatever reason, MS didn't try pushing for wider buses 8 years ago, and because of this they're crippling the computer industry.
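As a small illustration of that backward compatibility from the software side (a sketch, using Python only because it ships in both 32-bit and 64-bit builds): a 32-bit executable keeps running unmodified on an x86-64 machine, and a process can check its own bitness separately from what the machine reports.

import struct
import platform

# Bitness of this process, i.e. of the executable actually running:
# a pointer is 4 bytes in a 32-bit build and 8 bytes in a 64-bit build.
process_bits = struct.calcsize("P") * 8
print(f"this process is {process_bits}-bit")

# What the underlying machine/OS reports itself as (e.g. 'AMD64' or 'x86_64'):
print(f"machine reports: {platform.machine()}")

# On an x86-64 system a 32-bit build of this script still runs fine and prints 32
# on the first line - that per-process backward compatibility is what keeps
# 32-bit Windows software alive on 64-bit hardware.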
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
I think the problem is deeper than that. For affordable 4,096-horizontal-pixel gaming there needs to be a complete redesign of the basic PC architecture. AMD, Intel, IBM, Apple, nVidia, Samsung, Microsoft and considerable contribution from the global community need to come up with a clean new design from scratch. All this x86 crap needs to go. Too expensive, too much power draw and running out of steam fast. When I was a kid, I was told we'd be in flying cars by 2015 - but no, we've still got x86 processors running binary... 64-bit? Pfft, I would have expected 1,024-bit tech running my kettle, ffs. This is what happens when incremental increases are more profitable. /gets off soapbox.
I don't get it - how exactly is x86 a problem when it comes to high-res gaming? And what do you propose to improve it? x86 may be old, kind of messy (but it is CISC, after all...) and a bit power hungry, but there's a certain point where x86 becomes very efficient, if not more efficient than any other architecture. Try getting an ARM CPU to compete with an i7 in terms of performance-per-watt and the ARM will most certainly fail miserably. When you try to get an x86 CPU to compete against the power draw of an existing ARM CPU, the ARM will most likely perform better. This is my gripe with Intel - they want to dominate EVERYTHING, but x86 is not a one-size-fits-all architecture by any means.
I both agree and disagree. x86 should have been obsoleted a long time ago, but in the Windows world, software compatibility would be a nightmare if that were the case. But why go beyond 64-bit architectures? In the server world, where software compatibility in new systems often doesn't matter at all, they still stick with 32-bit and 64-bit architectures. Every year servers have the opportunity to increase the bus width, but they don't. GPUs are the only exception, but their operation isn't comparable to a CPU's. If money weren't in the equation, we'd likely all own a quantum computer at this point. But since that isn't the case, and since companies only do things in their own interest, your demands seem very naive.
There is no reason to retire x86. Modern x86 processors barely resemble "x86" processors anyway. They all have sophisticated frontends that decode CISC-based instructions into internal formats that resemble RISC where they need to. ARM's only advantage is that they focused on low power from the start; Intel has slowly been going through and modifying their instructions/internal architecture to better suit those needs. Moorefield is x86, is power/performance competitive with ARM, and should be seen in products later this year. Also, nothing is black and white - it isn't like ARM is only capable of low power; they could easily scale their design and make internal changes to better suit high-performance needs. They just knew it would be more difficult to compete with Intel, so they went a completely different route to avoid competing with them.

And yeah, the compatibility thing is a problem. It would be like retiring all world languages in favor of a superior one that is more accurate with fewer words/phonics/whatever. Languages evolve naturally to fill the voids/gaps/concerns that populations have. x86 and ARM are exactly the same way; neither one is set in stone, they are constantly evolving. There have also been tons of other instruction sets that have claimed all kinds of benefits but failed to gain traction, mainly because Intel can do whatever it wants internally on a chip and mimic those benefits.

As for G-Sync, I think I may personally wait to see what the deal is with FreeSync and see if my questions about G-Sync get answered. Nvidia keeps saying they are going to fix things in the future with G-Sync, but is that going to require a new module? Are they going to issue firmware updates to the current G-Sync module? Is that why they went with an FPGA instead of an ASIC board? I really don't want to lock myself into Nvidia only with my monitor purchase.
https://forums.guru3d.com/data/avatars/m/80/80129.jpg
At 30Hz, this is not a gaming monitor. Would you run your current monitor at 30Hz? Try it, lol - it's horrible.
It's 60Hz.
https://forums.guru3d.com/data/avatars/m/90/90026.jpg
What a piece of crap - 60Hz only and TN. What sense is there in using G-Sync with a 60Hz-only monitor???
data/avatar/default/avatar20.webp
I would be satisfied if it could handle 4K @ 60Hz on the desktop and 120Hz at 2560x1440 or 1080p with G-Sync when gaming. Would that be possible?
data/avatar/default/avatar40.webp
As excited as I am to see a G-Sync monitor come out, this particular model doesn't do much for me, and definitely not for the average gamer. A) An estimated $949 is a crazy amount of money to spend on a TN panel. 4K is nice and all, but I'd personally be more than happy at 1440p, as I don't intend to game beyond that resolution anyway. B) Acer is just... kind of a ghetto brand in my eyes. It could just be a perception due to some junk I've had from them in the past, but I can't see myself shelling out more than $500 on a monitor if it isn't from what I consider to be a top brand.
data/avatar/default/avatar29.webp
I like the specs of the screen; I just wonder why there's no response time listed.
https://forums.guru3d.com/data/avatars/m/220/220507.jpg
Overclockers UK are selling this for £499: http://www.overclockers.co.uk/showproduct.php?prodid=MO-065-AC

Specification:
G-Sync Technology
Flicker-less Technology
Height-adjustable stand with tilt, swivel and pivot
Display Screen Size: 28" (70.8cm)
Aspect Ratio: 16:9
Panel Type: TN
Brightness: 300cd/m2
Contrast Ratio: 1000:1 (typical)
Dynamic Contrast Ratio: 10,000,000
Resolution: 3840x2160 (only over DisplayPort at 60Hz)
Response Time: 1ms (GTG)
Colours: 16.7M
Bits: 8-bit + Hi-FRC
Viewing Angle (H/V): 170° / 160°
Connectivity: 1x DisplayPort 1.2 (60Hz)
USB 3.0 Hub: Yes (1 up, 4 down)
VESA: 100x100mm
Speakers: No
Warranty: 2 years
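The "3840x2160 (only over DisplayPort at 60Hz)" line is easy to sanity-check with rough uncompressed-bandwidth arithmetic (simplified: blanking intervals ignored, 8 bits per colour channel assumed):

# Rough video bandwidth check for the spec sheet above
# (simplified: ignores blanking intervals, assumes 8 bits per colour channel).

def gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"3840x2160 @ 30 Hz: ~{gbps(3840, 2160, 30):.1f} Gbit/s")
print(f"3840x2160 @ 60 Hz: ~{gbps(3840, 2160, 60):.1f} Gbit/s")

# DisplayPort 1.2 (HBR2, 4 lanes) carries roughly 17.28 Gbit/s of video data,
# while HDMI 1.4 tops out around 8 Gbit/s of pixel data - which is why 4K monitors
# of this generation need DP 1.2 for 60 Hz and drop to 30 Hz over HDMI 1.4.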
https://forums.guru3d.com/data/avatars/m/90/90726.jpg
At 30Hz, this is not a gaming monitor. Would you run your current monitor at 30Hz? Try it, lol - it's horrible.
Whatever dude, it's been proven that the human eye cannot see above 30 FPS anyway.
Sincerely, Console Peasant
data/avatar/default/avatar13.webp
Why would you need G-Sync anyway? I still believe screen tearing is made up and doesn't actually exist. If I ever had it, it was so minor that I never noticed; those screen tearing screenshots sure look ridiculous.
I suppose you think that oxygen is made up too?