Brightness and Contrast

For this review, I altered the way I measure brightness and contrast. With display vendors including dynamic modes that shut the backlight off completely on an all-black screen, full-field patterns can no longer be used for testing, as they lead to infinite contrast ratios and don't reflect the real-world contrast and brightness a viewer would actually see. Instead I am now measuring the screen using a 5x5 ANSI checkerboard pattern, once regular and once reversed. This gives a better idea of panel response, and it will reward companies that use better backlighting systems with precise control and less backlight bleed (like LED array systems), or that move to a technology like OLED in the future.
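For readers unfamiliar with the method, the ANSI figure is simply the average of the white patch readings divided by the average of the black patch readings from the checkerboard. A minimal sketch (the patch values below are made-up illustrative readings in nits, not measurements from this review):

```python
def ansi_contrast(white_patches, black_patches):
    """ANSI contrast: mean luminance of the white patches divided by
    the mean luminance of the black patches (all readings in nits)."""
    avg_white = sum(white_patches) / len(white_patches)
    avg_black = sum(black_patches) / len(black_patches)
    return avg_white / avg_black

# A 5x5 checkerboard has 13 patches of one shade and 12 of the other.
# Illustrative values only:
whites = [260.0] * 13
blacks = [0.55] * 12
print(round(ansi_contrast(whites, blacks)))  # → 473
```

In practice each patch is metered individually, and the reversed pattern catches any position-dependent backlight behavior.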

The downside is that the new results are not directly comparable to old results. The method also doesn't scale well from a large display to a smaller one: the smaller targets on a small display mean the meter is more likely to pick up stray light from an adjacent target. I expect numbers will now look a little worse than before because of the harder test, and nowhere near the ridiculous figures often quoted by vendors. It will provide better data for readers, however, and so it is the way to move forward.

White Level - XR Pro, X-Rite i1D2 and XR i1DPro

With the brightness set to maximum, we get a white level in the center of the ANSI pattern of 259 nits. This is lower than I expected from the specs, though a full-field white pattern might produce a brighter reading. For the minimum white level I turned the brightness setting down to 20. At 19 I could still get a white reading of around 1 nit, but black fell below what the meter could read, so I settled on 20. At that setting I measured 3.8 nits of light output (rounded up to 4 in the chart). Some displays only get down to 70-80 nits, which may be brighter than some users want, so this is a good number to see. I just wish the brightness control functioned all the way down to 0 to provide finer control of the backlight overall.

Black Level - XR Pro, X-Rite i1D2 and XR i1DPro

The black level of the Nixeus with the backlight set to maximum is 0.553 nits. This seems high, but this is our first pass with the new ANSI testing method, so we will have to wait for a couple more reviews to see whether it winds up being high or low. The black level with the brightness at 20 is 0.008 nits, which is quite low; it's as low as anything we've measured before, though that is partly the result of a backlight that stops being functional below this setting.

Contrast Ratio - XR Pro, X-Rite i1D2 and XR i1DPro

The resulting contrast ratios here are 469:1 at maximum and 455:1 at minimum. These unfortunately come in at the bottom of the list for 27” monitors; really, it's the bottom of the chart for all the monitors we have reviewed recently. Getting good contrast ratios becomes harder as displays get larger, and the backlighting systems needed to really pull them off become more expensive. I think we are stuck with these lower contrast ratios until we see more screen innovations, like OLED or LED-array backlit displays, but those are also very expensive. I'm not happy about the sub-500:1 number, as dynamic range is very important in a display, but it's a compromise you'll have to weigh yourself.
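The ratio itself is just the white level divided by the black level; a quick check against the maximum-brightness readings above (the tiny mismatch with the quoted 469:1 comes from rounding in the published readings):

```python
# Contrast ratio = white luminance / black luminance (both in nits).
# Values taken from the measurements quoted above at maximum brightness.
max_white = 259.0   # nits, center white patch
max_black = 0.553   # nits, center black patch
ratio = max_white / max_black
print(f"{ratio:.0f}:1")  # prints "468:1", vs. the quoted 469:1
```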

65 Comments

  • Death666Angel - Friday, September 21, 2012 - link

    Or AMD 7xxx. :)
  • Penti - Friday, September 21, 2012 - link

    No, just no. Only 3GHz HDMI 1.4a supports above 1920x1200; don't confuse the two, as this monitor does not support it. DVI-DL or DP is what rules here. The Nixeus might accept a higher-res signal over HDMI, but it doesn't have the bandwidth to handle it, so it causes issues. DP or DVI-DL is recommended, and they are the only ones supported by the vendor. It's basically like trying to run SL-DVI at a higher res than specced. Skip HDMI connections whenever you can, and skip notebooks with only HDMI if you want to run above 1920x1200. Even if you happen to have portable hardware supporting HDMI 1.4a at 3GHz (the 3GHz part is vital here), the monitor doesn't yet support it; they need a new generation of chips driving the displays. GCN and Kepler might be practical if you like to run above 2560x1600, though most monitors still require two DP connections for 3840x2160/2400 when they don't have true DP1.2 support. There isn't really much hardware around to support the other DP1.2 features either, such as daisy chaining.

    HDMI is essentially useless here unless it can scale your console (1280x720/1920x1080) well enough on this screen to be usable and correctly viewed. VGA isn't really any use either. You simply have to use a lower-res screen if you don't have access to DL-DVI and/or DP-capable hardware.
  • atotroadkill - Friday, September 21, 2012 - link

    Thanks for the clarification... I currently use the DisplayPort connection from my GTX 670 to my NX-VUE27. Before, when I was using HDMI 1.4, I did experience artifacts and some sync issues at 2560x1440. After that I tried dual-link DVI, but I couldn't see my BIOS... after switching to DisplayPort those issues went away.
  • Despoiler - Thursday, September 20, 2012 - link

    Too much processing lag. 2 FRAMES!!! Glad I held off. That is a non-starter.
  • atotroadkill - Thursday, September 20, 2012 - link

    "Because I have to run at a non-standard resolution compared to the Nixeus, you might see some additional lag being added to the input than if you ran natively, but there is no way for me to actually test the native input lag time. There is also no way on the Nixeus to set a 1080p image to be centered and not scaled, which might reduce lag by doing 1:1 mapping and bypassing the scaler but at the expense of only using part of the screen."

    The 2 frames of processing lag is a result of testing at the non-native resolution of 1080p... so if you are gaming at 1080p on this monitor then yes, it will bother you - but if you plan to game at 1080p, you shouldn't get this monitor anyway.

    I'm using it at 2560x1440 playing BF3 and it has no effect on my shooting and timing (with V-SYNC off).
  • abhaxus - Friday, September 21, 2012 - link

    I wonder about this also. Since the author doesn't have a CRT capable of 1440p for reference, why not just compare the input lag using the HP as a reference at 1440p? Seems like a solution that might get answers for those of us who are quite interested in this monitor. I suspect it's as you say, that without scaling it can do ok.
  • cheinonen - Friday, September 21, 2012 - link

    Well, testing that would require that I still have the HP monitor, but since I didn't buy it, that isn't really an option. The only CRTs out there that can do 1440p are probably projectors with 9" CRTs, and installing a 100+ lb. projector, not to mention the cost of finding one in great shape to test, precludes that.
  • trynberg - Friday, September 21, 2012 - link

    Not that I expect you to get one, but there are plenty of 19-21" CRTs out there that can do 2048x1536 for dirt cheap... I have two sitting at home right now (a 19" Mitsubishi and a 21" Sun/Sony).
  • cheinonen - Friday, September 21, 2012 - link

    It has to be 2560x1440, though, for the exact same native resolution, and something that can do that is incredibly hard to find.
  • Sabresiberian - Friday, September 21, 2012 - link

    Hmm, well, okay: the Sony GDM-FW900 runs at 2304x1440, not 2560, but wouldn't that give you better results?

    (I'm not sure this is a huge issue, as long as your testing methodology is consistent across LCD displays.)
