Display

One of my only lingering issues with the Note 2 after long-term use was resolution. The move to a subpixel matrix with a full 3 subpixels per pixel on the Note 2 put most of my concerns to rest at launch, but 720p started to feel limiting later in that product’s cycle. Having more display area is great; in Android, however, what really matters is resolution. It was ironic having a phone with such a huge display stuck at 720p, quickly eclipsed by devices with much smaller displays. With the Note 3, Samsung moves to 1080p at 5.7 inches, up from 720p at 5.5 inches in the Note 2 and 1280x800 at 5.3 inches in the original Note.

Two questions immediately come up every time we get a Samsung phone with AMOLED: first, what kind of panel, and second, what subpixel unit cell is behind it all, be it an RGB stripe or some RG,BG alternative. In the case of the Note 3 we unsurprisingly see Samsung use the same unit cell as on the SGS4, an offset pattern with green on one line and red and blue on another. The square blue subpixel also has more area than the circular red and green subpixels, to compensate for the difference in luminous efficiency of the material used in each subpixel type. As I’ve said in the past, this isn’t PenTile (although people have started using that brand as a proxy for all RG,BG alternatives) but something else entirely; the ultimate end is still the same, however: two subpixels per unit pixel rather than an RGB stripe.


The question for most people then becomes: is this a big deal, or can a normal human being even see it? I’d argue that the subpixels on the Note 3, like the SGS4, are now small enough that they can’t be. I used to be very picky about this, but I don’t find the new offset RG,BG pattern distracting at all on the Note 3. Pixel pitch in angular terms moves from just above 1 arcminute (1.006 and 1.073 for the Note and Note 2 respectively) down to 0.741 for the Note 3, making it small enough to in theory exceed the roughly 1 arcminute resolution of the human eye. I won’t break out the huge table or chart, or go over all of that again, but it’s nice to see that finally be the case with the Note 3.
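For the curious, those angular figures follow from simple display geometry. The sketch below assumes a 12-inch (304.8 mm) viewing distance — the reference distance that reproduces the per-pixel numbers quoted above from each phone's resolution and diagonal size:

```python
import math

def pixel_pitch_mm(h_px, v_px, diagonal_in):
    """Physical pixel pitch in mm, from resolution and diagonal size."""
    diag_px = math.hypot(h_px, v_px)   # pixels along the diagonal
    ppi = diag_px / diagonal_in        # pixels per inch
    return 25.4 / ppi                  # 25.4 mm per inch

def angular_size_arcmin(pitch_mm, viewing_distance_mm=304.8):
    """Angle subtended by one pixel, in arcminutes (12 in = 304.8 mm)."""
    return math.degrees(math.atan2(pitch_mm, viewing_distance_mm)) * 60

displays = {
    'Note (1280x800, 5.3")':   (1280, 800, 5.3),
    'Note 2 (1280x720, 5.5")': (1280, 720, 5.5),
    'Note 3 (1920x1080, 5.7")': (1920, 1080, 5.7),
}
for name, (h, v, d) in displays.items():
    arcmin = angular_size_arcmin(pixel_pitch_mm(h, v, d))
    print(f"{name}: {arcmin:.3f} arcmin")
    # → 1.006, 1.073, and 0.741 arcmin respectively
```

Anything below about 1 arcminute per pixel sits at or beyond the commonly cited acuity limit of the eye, which is why the Note 3 clears the bar where its predecessors did not.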

Brightness (White)

The Note 3 has the same display mode settings we’ve seen in previous generations; these mDNIe toggles allow some control over the display’s color curves. They’re ultimately not a substitute for Android’s lack of a real color management system (CMS), and they don’t completely solve the oversaturation that comes hand in hand with AMOLED’s different spectral curves, but they do help somewhat. The modes are unchanged from the SGS4 – Adapt Display is checked by default and automatically selects a mode for first party apps and a few others, but you can manually choose between Dynamic, Standard, Professional Photo, and Movie, which have different tunings for white point, gamut, and saturation. There’s also still a toggle for automatically adjusting screen tone depending on what’s being displayed.

Of the modes and configuration options available, I don’t doubt for a second that the most used will be the defaults; however, if you’re looking for the sanest option from a color accuracy perspective, it’s still Movie mode with the auto screen tone toggle unchecked. I gave Samsung the benefit of the doubt and ran all my measurements in Movie mode as a result, but I also took saturation measurements of the other modes so you can see the differences in gamut and saturation under each.


The Standard and Dynamic modes show a ton of oversaturation, extending far beyond sRGB. In Dynamic mode we can also see some compression at the higher saturation levels, effectively blowing those colors out even more, with the second to last point almost on top of the last. Pro Photo mode clamps down gamut and makes saturation a bit more linear, but introduces some other odd artifacts. With Movie selected, the Note 3’s display is considerably more controlled and linear, which makes a dramatic difference in how everything appears during normal use. If you really care about display accuracy, this is the only setting you should be using.


White point in Movie mode is still bluer than I’d like, at an average of just over 7100K, but in the all-important GretagMacbeth patch test, Delta-E is pretty low and puts the Note 3 in iPhone 5, HTC One, and G2 territory. The Movie mode results from the Note 3 are actually nicely controlled. They still aren’t perfect, but at least an attempt has been made to give users an accurate option – for anyone who doesn’t want garish colors that might look great on a store display, but not so great if you care about matching photos you’ve taken to a display or print later, or matching web content between desktop and mobile.

CalMAN Display Performance - White Point Average

CalMAN Display Performance - Grayscale Average dE 2000

CalMAN Display Performance - Gretag Macbeth Average dE 2000

CalMAN Display Performance - Saturations Average dE 2000

302 Comments

  • Nathillien - Tuesday, October 1, 2013 - link

    You whine too much LOL (as many others posting here).
  • vFunct - Tuesday, October 1, 2013 - link

    I agree that it's cheating.

    The results don't represent real-world use. Benchmarks are supposed to represent real-world use.

    Geekbench actually runs real programs, for example.
  • Che - Tuesday, October 1, 2013 - link

    Since when do canned benchmarks really represent real world use?

    I don't have a dog in this fight, but benchmarks are very controlled, tightly scripted, and only give you details on the one thing they are measuring. The only way to gauge real world performance is by using said device in the real world for a period of time.

    I care more for his comments on the actual use of the phone; that will tell you more than any benchmark.
  • doobydoo - Saturday, October 19, 2013 - link

    They are meant to be a way of measuring the relative performance that you'll get with real world use.

    Whatever the actual benchmark, provided some element of it is similar to something you'll do on the device, the relative performance of different phones should give you a reasonable indication of how they will perform relative to one another in real world use.

    The problem is when companies specifically enable 'benchmark boosters' to artificially push the phone above what is normally possible in real world use, and thus the relative scores, which were previously useful, no longer are.
  • darwinosx - Tuesday, October 8, 2013 - link

    So you are a kid that owns a Samsung phone. Yes, it really is that obvious.
  • Spunjji - Tuesday, October 8, 2013 - link

    Handbag.
  • runner50783 - Tuesday, October 1, 2013 - link

    Why is this cheating? It's not like they are swapping CPUs or anything; the SoC is still running within specification, so get over it.

    What this does is make benchmarks irrelevant, because manufacturers can tweak their kernels to get better scores that don't reflect daily use.
  • Chillin1248 - Tuesday, October 1, 2013 - link

    No, it is not running under the specification that the consumer will get.

    They raise the thermal headroom and lock the speed to 2.3 GHz (which would normally kill battery life and cause heat issues). Now, if Anand tested battery life while looping the benchmarks, that would be fine, as the discrepancy would show up. However, he uses a completely different workload to measure battery life.

    Thus, Samsung is able to artificially inflate only their benchmark scores (the only time the "boost" runs is during specific benchmark programs) while hiding said power usage to get those scores.
  • vFunct - Tuesday, October 1, 2013 - link

    It's cheating because the results can't be reproduced in the real world by real users.

    Geekbench uses real-world tests, and they need to represent real use.

    Samsung artificially raises the speed for Geekbench so that, for example, its BZip2 compress speeds can't be reproduced when I run BZip2 compress myself.

    Samsung doesn't allow me to run BZip2 as fast as they run it in benchmarks. Samsung gives the benchmarks a cheat to make them run faster than what the regular user would see.
  • bji - Wednesday, October 2, 2013 - link

    You know, you'd think benchmark authors would figure this stuff out and provide a tool with their benchmark to obfuscate the program so that it can't be recognized by cheats like this. Whatever values the cheaters key off of when identifying the program, just make those things totally alterable by the installation tool. If the benchmark program ends up with a randomized name, it's still usable for benchmarking purposes, and the cheaters can't tell it's the benchmark they are trying to cheat on.

    Seriously why do I have to be the one to always think of all of the obvious solutions to these problems!??! Same thing happens at work! lol
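The randomization idea floated in that comment can be illustrated with a toy sketch. The package identifiers below are hypothetical stand-ins, not an actual detector list from any device: the point is simply that a boost which matches on a known app identifier can never match a freshly randomized one.

```python
import random
import string

# Hypothetical identifiers a name-based "boost" might match on.
KNOWN_BENCHMARKS = {"com.primatelabs.geekbench", "com.antutu.benchmark"}

def triggers_boost(package_name):
    """A name-based detector: boost only if the package is recognized."""
    return package_name in KNOWN_BENCHMARKS

def randomized_package_name(prefix="com.bench"):
    """Give the benchmark a fresh random identifier at install time."""
    suffix = "".join(random.choices(string.ascii_lowercase, k=10))
    return f"{prefix}.{suffix}"

print(triggers_boost("com.primatelabs.geekbench"))  # True
print(triggers_boost(randomized_package_name()))    # False
```

In practice a cheat could key off more than the name (signatures, code patterns, behavior), so the installer would need to randomize every value the detector inspects, as the comment suggests.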
