Uniformity is tested by measuring the color checker chart at 25 locations across the screen. From there we can pull out contrast, black and white uniformity, and color uniformity. This review is the first to use the newest measurement available in CalMAN 5.1.2: dE From Center. Now, instead of comparing the dE2000 at every location to the reference values, we measure each location relative to the center measurement.

This gives us a true uniformity measure. I could measure the left side and the right side of the monitor and get a dE2000 of 2.0 for each side. What that doesn't tell me is that the left side might be red tinted, the right side blue tinted, and the center green tinted. In that case they could all measure the same dE2000 yet look totally different. By comparing the measured values to the center, we get an actual measure of whether one area of the screen will look the same as another. Since we always use the center of the screen as our calibration target, everything is measured relative to that.
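
To make the distinction concrete, here is a minimal sketch of the two calculations (my own illustration, not CalMAN's internals). It assumes the open-source colour-science Python package, and the patch readings and reference value below are hypothetical.

```python
# dE From Center, sketched: compare each measured patch to the same patch
# measured at the center of the screen, not just to its reference value.
import numpy as np
import colour  # pip install colour-science (assumed available)

# Hypothetical CIE L*a*b* readings of one color-checker patch at three locations.
center = np.array([65.0, 18.0, 12.0])
left   = np.array([63.5, 21.0, 11.0])   # slightly red tinted
right  = np.array([63.5, 15.0, 15.5])   # slightly blue/green shifted

# Classic approach: every location graded against the reference patch value.
reference = np.array([64.0, 18.5, 12.5])
for name, lab in [("left", left), ("right", right)]:
    print(name, "vs reference:", colour.delta_E(lab, reference, method="CIE 2000"))

# dE From Center: the same locations graded against the center measurement,
# which is what tells you whether two areas will actually look alike.
for name, lab in [("left", left), ("right", right)]:
    print(name, "vs center:", colour.delta_E(lab, center, method="CIE 2000"))
```

Two locations can post similar numbers against the reference while still differing noticeably from each other; comparing everything to the center is what exposes that mismatch.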

Starting out with white uniformity, we see decent but not amazing results. The panel stays within +/- 10% of the center, but falls off to a 17% variation at the edges. The light fall-off is relatively high, and it makes me wonder if the look of the panel, and its thin design, place a bit of emphasis on style over substance.
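
As a rough illustration of how those percentages are derived, here is a short sketch with made-up luminance readings for a 5x5 grid; each location is expressed as a deviation from the center patch.

```python
# White-uniformity sketch: percent deviation of each measured luminance from
# the center reading. All values here are hypothetical stand-ins.
import numpy as np

white_cdm2 = np.array([
    [170, 178, 182, 176, 168],
    [175, 190, 196, 188, 172],
    [178, 194, 200, 192, 174],   # 200 cd/m2 at the center patch
    [174, 189, 195, 187, 171],
    [166, 175, 180, 173, 165],
], dtype=float)

center = white_cdm2[2, 2]
deviation_pct = (white_cdm2 - center) / center * 100
print(deviation_pct.round(1))   # in this example the corners fall off by ~15-18%
```

The black-level chart is built the same way, just from the black measurements instead of the white ones.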

With the black level charts we see similar results. The middle of the panel is within +/- 10% again, but the edges fall off to nearly a 20% difference. There is a curious rise in black level at one location where the white level fell off, but otherwise the black results track the white ones closely.

Looking at the resulting contrast, the numbers here are much closer to uniform, which is what we expect to see. Areas with light fall-off affect the white and black levels almost equally, so the contrast ratio is very similar all across the screen. That gives us an expected contrast ratio of roughly 700:1 for the screen as a whole.
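
A quick sketch of why the contrast chart stays flat even where the brightness drops, again with hypothetical luminance values:

```python
# Contrast-uniformity sketch: light fall-off hits the white and black
# measurements almost equally, so white/black stays nearly constant.
import numpy as np

white = np.array([168.0, 200.0, 172.0])   # left edge, center, right edge (cd/m2)
black = np.array([0.24, 0.285, 0.245])    # same three locations (cd/m2)

contrast = white / black
print(contrast.round())   # roughly 700:1 everywhere despite ~15% less light at the edges
```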

Now we can see the new dE2000 From Center data. The issues crop up at the outside edges of the screen, where we saw the backlighting issues earlier. Uneven lighting is the issue most likely to cause color shifts across the screen, and that is certainly borne out here. In the center of the screen you are not going to see a difference in colors when you look directly at it. With a light loss of less than 10%, and a color dE2000 below 2.0 across most of the central area, everything will look identical. As you get to the extreme edges you will run into more issues. I will need to test more monitors with this new method, but I think this is going to wind up being a good result in the end.

Going with an LED lighting system rather than a full-array backlight is always a bit of a concern for me. Overall the PQ321Q does well on uniformity for using one, and it avoids some of the massive issues we have seen from other LED systems. But we are looking for near-perfection from the ASUS, and it can't quite manage that. The center 60% of the screen is excellent overall, and for most people that means you may never notice these issues at all, but they are there.


  • ninjaburger - Tuesday, July 23, 2013 - link

    I feel like this is an easy position to take with very few 4k TVs in the wild, very little content delivered in 4k, and, maybe most importantly, *even less* content being finished at 4k (as opposed to upscaled 2.5k or 3k).

    When you see native 4k content distributed at 4k on a good 4k display, you can see the difference at normal viewing distances.

    I don't think it's worth adopting until 2015 or 2016, but it will make a difference.
  • Hrel - Tuesday, July 23, 2013 - link

    I see no reason to upgrade until 1080p becomes 10800p. Kept the CRT for 30 years, inherited the damn thing. That's how long I intend to keep my 1080p TV; whether tv makers like it or not.
  • Sivar - Tuesday, July 23, 2013 - link

    30 years? I hope you don't have a Samsung TV.
  • althaz - Tuesday, July 23, 2013 - link

    It doesn't matter what brand the TV is, good TVs last up to about 7 years. Cheaper TVs last even less time.
  • DanNeely - Tuesday, July 23, 2013 - link

    My parents' no-name 19" CRT TV lasted from the early '80s to ~2000; the no-name ~30" CRT TV they replaced it with was still working fine ~3 years ago when they got a used ~35-40" 720p LCD for free from someone else. I'm not quite sure how old that TV is; IIRC it was from shortly after prices in that size dropped enough to make them mass market.

    Maybe you just abuse your idiotboxes.
  • bigboxes - Wednesday, July 24, 2013 - link

    You must be trolling. My top of the line Mitsubishi CRT started having issues in 2006, in year seven. I replaced it with an NEC LCD panel that I'm still using today. It could go at any time and I'd update to the latest technology. I'm picky about image quality and could care less about screen thinness, but there are always options if you are looking for quality. I'm sure your 1080p TV won't make it 30 years. Of course, I don't believe your CRT made it 30 years without degradation issues. It's just not possible. Maybe you are just a cheap ass. At least man up about it. I want my 1080p TV to last at least ten years. Technology will have long passed it by at that time.
  • bigboxes - Wednesday, July 24, 2013 - link

    Of course, this is coming from a man who replaced his bedroom CRT TV after almost 25 years. Even so, the tube was much dimmer before the "green" stopped working. Not to mention the tuner had long given up the ghost. This TV had migrated from the living room, where it was the primary set, to the bedroom until it finally died. I miss it, but I'm not going to kid myself into believing that 1988 tech is the same as 2012. It's night and day.
  • cheinonen - Tuesday, July 23, 2013 - link

    I've expanded upon his chart and built a calculator and written up some more about it in other situations, like a desktop LCD here:

    http://referencehometheater.com/2013/commentary/im...

    Basically, your living room TV is the main area that you don't see a benefit from 4K. And I've seen all the 4K demos with actual 4K content in person. I did like at CES this year when companies got creative and arranged their booths so you could sometimes only be 5-6' away from a 4K set, as if most people ever watched it from that distance.
  • psuedonymous - Tuesday, July 23, 2013 - link

    That site sadly perpetuates (by inference) the old myth of 1 arcminute/pixel being the limit of human acuity. This is totally false. Capability of the Human Visual System (http://www.itcexperts.net/library/Capability%20of%...) is a report from the AFRL that nicely summarises how we are nowhere even CLOSE to what could actually be called a 'retina' display.
  • patrickjchase - Tuesday, July 23, 2013 - link

    A lot of people seem to confuse cycles/deg with pixels/deg. The commonly accepted value for the practical limit of human visual acuity is 60 cycles/deg, or 1 cycle/min. The paper you posted supports this limit by the way: Section 3.1 states "When tested, the highest sinusoidal grating that can be resolved in normal viewing lies between 50 and 60 cy/deg..."

    To represent a 60 cycle/deg modulation you need an absolute minimum of 120 pixels/deg (Nyquist's sampling theorem). Assuming unassisted viewing and normal vision (not near-sighted) this leads to an overall resolution limit of 500 dpi or so.

    With that said, the limit of acuity is actually not as relevant to our subjective perception of "sharpness" as many believe. There have been several studies arguing that our subjective perception is largely driven by modulations on the order of 20 pairs/degree (i.e. we consider a scene to be "sharp" if it has strong modulations in that range). Grinding through the math again we get an optimal resolution of 200-300 dpi, which is right about where the current crop of retina displays is clustered.
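
For readers who want to check that arithmetic, here is a rough sketch of the conversion from an acuity limit in cycles per degree to a pixel-density limit at a given viewing distance. The viewing distances are assumptions for illustration, not figures from the comments or the linked report.

```python
# Convert an acuity limit (cycles per degree) into a pixel-density limit (dpi)
# at a given viewing distance, using the Nyquist factor of 2 pixels per cycle.
import math

def dpi_limit(cycles_per_degree: float, viewing_distance_in: float) -> float:
    pixels_per_degree = 2 * cycles_per_degree               # Nyquist: 2 px per cycle
    pixel_pitch_in = viewing_distance_in * math.tan(math.radians(1.0 / pixels_per_degree))
    return 1.0 / pixel_pitch_in

print(round(dpi_limit(60, 14)))   # 60 cy/deg at ~14": roughly 490-500 dpi
print(round(dpi_limit(20, 10)))   # 20 cy/deg at ~10": roughly 230 dpi
```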
