CES is about to get underway, but some news is already starting to trickle out of Las Vegas. Acer has taken the wraps off of two new monitors aimed at gamers. The first features NVIDIA G-SYNC technology for smoother frame delivery; for more info on G-SYNC, please refer to this article. The basic idea is that instead of refreshing at a fixed rate, the monitor waits for a new frame from the GPU before refreshing.
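For illustration, here is a minimal Python sketch (ours, not NVIDIA's; the frame render times are made up) comparing when frames appear on screen under a fixed 60 Hz refresh versus a G-SYNC-style variable refresh:

    import math

    # Hypothetical GPU frame render times in milliseconds (illustrative only).
    render_times_ms = [10, 22, 13, 30, 9]

    REFRESH_MS = 1000 / 60  # fixed 60 Hz refresh interval (~16.7 ms)

    # Fixed refresh (with v-sync): a finished frame waits for the next
    # refresh boundary, so delivery is quantized and uneven frame times
    # become visible judder.
    t = 0.0
    fixed = []
    for ft in render_times_ms:
        t += ft  # time at which the GPU finishes this frame
        fixed.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)

    # G-SYNC-style variable refresh: the panel holds off refreshing until
    # the frame arrives, so each frame is shown the moment it is ready.
    t = 0.0
    variable = []
    for ft in render_times_ms:
        t += ft
        variable.append(t)

    print("fixed 60 Hz :", [round(x, 1) for x in fixed])
    print("variable    :", [round(x, 1) for x in variable])

Note that variable refresh doesn't make frames render any faster; it only removes the wait for the next fixed refresh boundary.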

Acer XB270HU

The first monitor is the Acer XB270HU, which, according to the Acer press release, is the world’s first IPS monitor with G-SYNC capability. It is a 27” 2560x1440 IPS panel with a maximum refresh rate of 144 Hz. The IPS panel should deliver much better viewing angles (up to 178°) and generally better color accuracy as well, although that will have to be tested. The XB270HU also comes on a height-adjustable stand that offers tilt and swivel. The specifics of the panel are not mentioned, so at this time we cannot say whether it is a 6-bit panel (which would typically use FRC dithering to approximate 8-bit output) or a true 8-bit panel. Availability is March 2015.

Acer XG270HU

The second gaming monitor is the XG270HU, which has a 27” edge-to-edge “frameless” display according to the press release. It is not completely frameless of course, but the side bezels are much smaller than normal. The bottom of the monitor still has a large bezel, so if you are looking for something frameless to use in portrait mode, these are not the monitors for you. The XG model uses a TN panel, but has the same 2560x1440 resolution and 144 Hz refresh rate as the XB model, and offers HDMI 2.0, DVI, and DisplayPort 1.2 connections. Acer is claiming a 1 ms response time for this model. As with the XB model, availability will be in March 2015.

Pricing has not been announced at this time.

Source: Acer

Comments

  • Dhalmel - Saturday, January 3, 2015 - link

    That feel when you don't know if you have a high refresh 1440p IPS monitor or a HQ IPS 60hz 4k monitor.....

    SOME DAY
  • Dhalmel - Saturday, January 3, 2015 - link

    want*
  • kenansadhu - Sunday, January 4, 2015 - link

    Lol =)
  • r3loaded - Sunday, January 4, 2015 - link

    144Hz IPS 3840x2160 (or even 5120x2880!) monitor with adaptive sync.

    I can dream!
  • DanNeely - Sunday, January 4, 2015 - link

    Unfortunately, to get 5K above 60 Hz, even DisplayPort 1.3 will need two cables. :(
  • Kutark - Sunday, January 4, 2015 - link

    Someone in another forum was mentioning that 1440p, 144 Hz, at 32-bit color is beyond DisplayPort 1.2's capabilities. I looked it up and it appears to be close. Is using two cables an option?
  • DanNeely - Sunday, January 4, 2015 - link

    Since, AFAIK, there aren't any DP 1.3 capable cards out yet (the standard was only finalized in September 2014), all current 1440p 120/144 Hz monitors and all 4K 60 Hz monitors support dual-cable DP 1.2. I believe they all appear as two half-width monitors to your graphics card and require using Eyefinity/3D Surround to combine them into a single screen to be presented to the OS.
  • DanNeely - Sunday, January 4, 2015 - link

    Err, strike part of that; I was off by one DP generation (and 2x speed) mentally. DP 1.2 is able to fit both of those resolutions in a single stream. For high-bit-depth color, output is 30 bits (10 bits/channel), which does just squeeze in at 1440p/144 Hz (see the worked numbers after the comments). What is normally called 32-bit color on your GPU is 24-bit color plus an 8-bit alpha (transparency) channel; the alpha is only used in rendering and is not output to the monitor. I'm not sure if 30-bit color is bit-packed to 40 bits (to minimize memory consumption) when an alpha channel is needed, or expanded to 64 bits/pixel (memory accesses are easier to program and generally faster when aligned to the size of processor data words).
  • Kutark - Monday, January 5, 2015 - link

    Thanks man, not only clarified for me but gave me some ammo to use on another forum.

    (Some guy was saying this monitor is stupid because it can't even support 32-bit color @ 144 Hz/1440p.) By the bandwidth numbers I was looking up it was close, but now that you've clarified how the video card actually handles 32-bit color, it makes a ton more sense.
  • zodiacfml - Sunday, January 4, 2015 - link

    same.
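For readers who want to check the bandwidth claims in the thread above, here is a quick back-of-the-envelope sketch in Python. The link rates are the effective data rates after 8b/10b coding (four lanes of HBR2 for DP 1.2, HBR3 for DP 1.3); blanking overhead is ignored, so real requirements run somewhat higher:

    DP12_EFFECTIVE_GBPS = 17.28  # DP 1.2, 4 lanes of HBR2, after 8b/10b coding
    DP13_EFFECTIVE_GBPS = 25.92  # DP 1.3, 4 lanes of HBR3, after 8b/10b coding

    def required_gbps(width, height, refresh_hz, bits_per_pixel):
        # Raw pixel data rate in Gbit/s, ignoring blanking intervals.
        return width * height * refresh_hz * bits_per_pixel / 1e9

    # 1440p at 144 Hz with 30-bit (10 bits/channel) color: ~15.93 Gbit/s,
    # which just squeezes under DP 1.2's 17.28 Gbit/s, as DanNeely says.
    print(required_gbps(2560, 1440, 144, 30))

    # 5K (5120x2880) at 120 Hz, even at plain 24-bit color: ~42.5 Gbit/s,
    # well beyond a single DP 1.3 link, hence the two cables.
    print(required_gbps(5120, 2880, 120, 24))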
