Color Quality

We report two main quality metrics in our display reviews: color accuracy (Delta-E) and color gamut. Color gamut refers to the range of colors the display can represent relative to a reference color space. In this case our reference is the AdobeRGB (1998) color space, which is larger than sRGB, so the percentages we report are relative to AdobeRGB, and larger is generally better.
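For reference, that gamut percentage can be roughly approximated from a panel's measured red, green and blue primaries. The short Python sketch below compares the area of the measured primary triangle to AdobeRGB's in CIE 1931 xy space; the "measured" primaries here are hypothetical placeholders (real values come from the colorimeter), and a strict coverage figure would use the intersection of the two triangles rather than a simple area ratio.

    # Rough sketch: express a display's gamut as a percentage of AdobeRGB (1998)
    # by comparing primary-triangle areas in CIE 1931 xy chromaticity space.
    # The "measured" primaries below are hypothetical placeholders.

    def triangle_area(p1, p2, p3):
        # Shoelace formula for the area of a triangle given three (x, y) points.
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

    # AdobeRGB (1998) red, green and blue primaries in xy chromaticity coordinates
    ADOBE_RGB = [(0.6400, 0.3300), (0.2100, 0.7100), (0.1500, 0.0600)]

    # Hypothetical primaries measured off a WLED-backlit panel
    measured = [(0.6450, 0.3350), (0.3050, 0.6200), (0.1500, 0.0650)]

    gamut_pct = 100.0 * triangle_area(*measured) / triangle_area(*ADOBE_RGB)
    print("Gamut area: %.1f%% of AdobeRGB (1998)" % gamut_pct)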

Color accuracy (Delta-E) refers to the display's ability to reproduce the exact color requested by the GPU and OS. The difference between the color the display actually shows and the color the GPU requested is our Delta-E, and lower is better here. In practice, a Delta-E under 1.0 is essentially perfect - the chromatic sensitivity of the human eye isn't great enough to distinguish the difference. Moving up, a Delta-E of 2.0 or less is generally considered fit for use in a professional imaging environment - it isn't perfect, but it's hard to gauge the difference. Finally, a Delta-E of 4.0 and above is considered visible to the human eye. Of course, the big consideration here is frame of reference; unless you have another monitor or some print samples (a color checker card) to compare your display against, you probably won't notice. That is, until you print or view media on another monitor - then the difference will no doubt be apparent.
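To give a sense of how the number is derived, the sketch below uses the original CIE76 definition of Delta-E - the straight-line distance between the requested and measured colors in CIELAB space. Calibration packages typically report CIE94 or CIEDE2000, which weight the terms differently, but the "lower is better" reading is the same; the sample Lab values here are hypothetical.

    # Minimal Delta-E sketch using the CIE76 formula (Euclidean distance in L*a*b*).
    # The sample values are hypothetical; a real measurement pairs the Lab value the
    # OS/GPU requested with the Lab value the colorimeter reads off the panel.

    import math

    def delta_e_cie76(requested, measured):
        # Straight-line distance between two (L*, a*, b*) colors.
        dL = requested[0] - measured[0]
        da = requested[1] - measured[1]
        db = requested[2] - measured[2]
        return math.sqrt(dL * dL + da * da + db * db)

    requested = (50.0, 20.0, -10.0)   # color asked for by the GPU/OS
    measured  = (51.2, 18.5, -11.0)   # color actually shown by the panel

    print("Delta-E: %.2f" % delta_e_cie76(requested, measured))
    # Under 1.0: effectively imperceptible; around 2.0: fine for professional work;
    # 4.0 and up: visible to the eye.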

As I mentioned in our earlier reviews, we've updated our display test bench. We've retired the Monaco Optix XR Pro colorimeter in favor of an Xrite i1D2, since the XR Pro no longer has up-to-date drivers for modern platforms.

For these tests, we calibrate the display and try to obtain the best Delta-E we can get at both 200 nits of brightness for normal use and 100 nits for print brightness. We target a 6500K white point and a gamma of 2.2, but sometimes the best performance lies at the native color temperature and a different gamma, so we try to find the absolute best performance each display can deliver. We also take an uncalibrated measurement to show out-of-the-box performance using either the manufacturer-supplied color profile or a generic one with no LUT data. For all of these, dynamic contrast is disabled.
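For the curious, those targets translate into a simple luminance curve: at a 2.2 gamma, each 8-bit gray level should land at peak nits multiplied by (level/255)^2.2, with the white point pulled to 6500K. The Python sketch below just prints those ideal targets for the 200-nit pass (swap in 100 for the print-brightness pass); the actual calibration software is what builds the video card LUT so the measured grayscale matches this curve, which this sketch doesn't attempt.

    # Sketch of the grayscale targets implied by a 2.2 gamma and a given peak
    # luminance (200 nits here; 100 nits for the print-brightness pass).
    # This only prints the ideal curve - the calibration package is what actually
    # adjusts the video card LUT until measurements match it at a 6500K white point.

    PEAK_NITS = 200.0
    GAMMA = 2.2

    def target_luminance(level, peak=PEAK_NITS, gamma=GAMMA):
        # Ideal luminance in nits for an 8-bit gray level under a pure power-law gamma.
        return peak * (level / 255.0) ** gamma

    for level in (0, 32, 64, 128, 192, 255):
        print("gray %3d -> %7.2f nits" % (level, target_luminance(level)))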

[Chart: Color Tracking - XR Pro and Xrite i1D2]

Uncalibrated, the display's color accuracy isn't very good. I found the 27-inch LED Cinema Display to be way too blue and green out of the box; calibrated, the display did much better:

[Chart: Color Tracking - XR Pro and Xrite i1D2]

The 27-inch LED Cinema Display isn't going to be winning any awards for color reproduction, but it's good enough when calibrated.

[Chart: Color Tracking - XR Pro and Xrite i1D2]

Curiously enough, dropping brightness down to 100 nits caused a noticeable reduction in color tracking. The average Delta-E went up to 2.2, while most of the 27's competitors remained about the same. The 27-inch behaves very differently depending on what brightness setting you run it at.

[Chart: LCD Color Quality]

Apple managed to do relatively well with the WLED backlight, but it's still no match for the color gamut you get from any of the CCFL-backlit displays. Note that my old 30-inch hasn't aged well; it's only able to cover roughly 73% of AdobeRGB today.

Comments

  • Razorjackman - Tuesday, September 28, 2010 - link

    Why are all the displays from 24" and below relatively inexpensive, while the 27"+ ones seem to be disproportionately expensive?

    I'm still using a HannsG 28" from 2 years ago for $400.00. I expected to get larger or denser for less by now.

    Does Moore's Law apply to monitors?
  • B3an - Tuesday, September 28, 2010 - link

    Because there's loads of cheap crappy TN panels in displays 24" and under, which I'd never even consider buying. Most old cheap CRT monitors would have better colours.

    When you move to 27" and especially 30" you start getting some quality IPS panels.
    You don't just get a larger display with more pixels, you get a vastly better image too.
  • B3an - Tuesday, September 28, 2010 - link

    ... You get better viewing angles as well.
  • Alexstarfire - Tuesday, September 28, 2010 - link

    True, but they were saying even the IPS panels for the 24" were $539 and up. Says so on the first page. A near doubling in price for an extra 3 inches isn't very compelling. Though, I don't know if the two 27" monitors they talk about are the cheapest 27" models out there.
  • hughlle - Tuesday, September 28, 2010 - link

    Partly to do with who they are aimed at, I guess. A lot of the folk opting for these high-end 30" displays, as opposed to just whacking an LCD TV on their desk, are using them for high-end audio, graphic design etc. - professional use, you might say - so they can be expected to spend the premium.
  • TegiriNenashi - Tuesday, September 28, 2010 - link

    I'd suggest there is a category between high end 30" and mainstream 24". There is something wrong with the market when 24" TN is <$250 and 30" IPS is >$1000 and there is nothing in-between.
  • juncture - Tuesday, December 14, 2010 - link

    These people also forgot to mention the HUGE difference being the resolution.... Try to find a 24" that has the same resolution as the $1000+ 27"-30". How was this not mentioned in this argument?
  • plonk420 - Tuesday, September 28, 2010 - link

    Seconded... it's either crap dirt-cheap TNs for me, or IPS (but preferably OLED). I hope I never spend more than $140 on a TN again...

    (posting from a dying CRT ... with TN+Dell *VA as secondary)
  • plonk420 - Tuesday, September 28, 2010 - link

    seconded @B3an
  • Targon - Tuesday, September 28, 2010 - link

    It has to do with yields and the production costs. It is far easier to make a PERFECT four inch display than an eight inch display, just because there is a smaller chance of dead or stuck pixels. As the size of the screen goes up, it just becomes that much harder to make a display without any problems.

    As time has gone on, with the older technologies, it has gotten easier to make larger panels that don't have problems, but now we are seeing newer technologies that raise the difficulty again.

    When you got that 28-inch display, was it 1080p, 1080i, or 720p? These days, the LED technologies are where a lot of the focus is. Then you have the move from 60Hz to 120Hz to 240Hz. 3D is also creeping in as something that will push costs up.

    Two years ago, the cost of a 23 inch panel was MUCH higher than it is today, and you are seeing displays that could not do 1080p going away. So, now that 1080p is the norm for flat panel displays, the question is when better displays will become the norm on the desktop.

    We also have a problem with what integrated video can handle. The Radeon 3300 integrated video for example will handle 1280x1024 decently, but starts to have a bit of a problem at 1920x1080 when it comes to basic games and such. So, before the mainstream consumer can make decent use of higher resolution displays, the base level for GPUs needs to go up by a bit more to make the experience properly "smooth".

    Remember, prices come down when the manufacturers can expect high enough sales volumes to allow the drop in price to still provide a good profit. So, how many 1920x1200 displays would sell at $300 compared to 1920x1080 displays selling at $210? How about going up from there, would the general public pay for a higher quality display if their computer couldn't push the pixels well enough at the higher resolutions? Even on the gamer front, would you be willing to pay $400 for a 23 inch display with a higher resolution since the higher resolution means lower framerates?
