Looking Forward to WUXGA and QXGA Tablets

In a similar vein to the 4K displays, it looks like many tablets are getting a serious resolution bump in the next few months. I complain regularly about the state of laptop displays (I can count the number of good laptop LCDs we saw in the last year on one hand), so it gives me hope to see tablets pushing for higher quality, higher resolution panels. Amazingly enough, ASUS has announced that the Transformer Prime will receive a 1920x1200 update in Q2 this year (and for the record, they’re not the only ones planning on using such a panel). Rumors suggest that the iPad 3 will go one step further and offer a QXGA (2048x1536) panel, sticking with the 4:3 aspect ratio of previous iPads—though of course Apple hasn’t officially announced anything yet—and there's even talk of some QSXGA (2560x2048) and/or WQXGA (2560x1600) tablets shipping later this year.

I had the chance to play with the upcoming Transformer Prime TF700T, and I loved the increased resolution. Surprisingly, the Tegra 3 chipset appeared able to handle WUXGA quite well, though I didn’t get a chance to test any games. Gaming at WUXGA is going to really stress current SoC GPUs, however, at least if you want decent quality settings. Many desktop users—even those with high-end cards like the GTX 570/HD 6970—run at 1920x1200, albeit with significantly higher quality textures and geometry than we see in tablet games. Even so, pushing ~2MP on a tablet at decent frame rates will very likely need more memory bandwidth and faster GPUs; I expect many games will run at a lower resolution and simply scale the output to the screen size. Outside of gaming, however, higher resolutions can be very useful. Browsing the web at 1280x720 is doable, 720x1280 not so much; 1080x1920, on the other hand, is wide enough for all the 1024-pixel-wide websites out there that you won’t have to zoom out to see them. Plus, text and images in general will look that much sharper.
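
To put that ~2MP figure in perspective, here's a minimal back-of-the-envelope sketch in Python; the bytes-per-pixel and overdraw numbers are assumptions for illustration rather than measurements of any real GPU or game, but the ratios hold: WUXGA is roughly 2.25x the pixels of the 1280x800 panels in most current tablets, and framebuffer traffic scales with it.

```python
# Back-of-the-envelope look at why WUXGA gaming is hard on a tablet SoC.
# The bytes-per-pixel and overdraw values below are illustrative assumptions,
# not measured figures for any particular GPU or game.

def fill_bandwidth_gbps(width, height, fps=30, bytes_per_pixel=4, overdraw=2.5):
    """Rough framebuffer write traffic in GB/s at a given resolution."""
    return width * height * bytes_per_pixel * overdraw * fps / 1e9

for name, (w, h) in {
    "WXGA (1280x800)":   (1280, 800),
    "720p (1280x720)":   (1280, 720),
    "WUXGA (1920x1200)": (1920, 1200),
}.items():
    print(f"{name:<20} {w * h / 1e6:4.2f} MP  ~{fill_bandwidth_gbps(w, h):.2f} GB/s")
```

And that only counts framebuffer writes; texture fetches, blending, and geometry push the real bandwidth requirement higher still, which is why rendering at a lower internal resolution and upscaling is such a tempting compromise.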

What really irks me is that all of this comes in a 10.1” IPS package, exactly what I’ve been asking for in laptops for the past several years. What’s more, the price point for these is in the <$600 range, and we’re still getting 16:10 aspect ratio panels instead of being forced into 16:9. I asked several manufacturers, "How is it we're getting 16:10 aspect ratio tablets with IPS WUXGA displays, and you still can't put anything better than a low quality 1366x768 TN panel into your laptops?" Naturally, they blamed the display manufacturers and consumers for not being willing to buy better quality laptops.

There's certainly some truth to that, but it's also a matter of supply and demand; if ASUS, for instance, were to order a million ~13.3" 1920x1200 IPS laptop displays, I'm sure they could get prices down to <$1000 for a quality laptop. Naturally, they're worried that the laptops wouldn't sell well enough and they’d get stuck with a bunch of “too expensive” laptops. With all the $500 Best Buy laptops floating around, they may be right, but I wish I could convince more people to stop settling for low quality displays in their laptops. That brings me to my final top-three device/tech that impressed me at CES.

Comments

  • therealnickdanger - Wednesday, January 18, 2012 - link

    That's your problem. 42" is too small to appreciate the detail. I know, I've got a few 1080p displays (17" notebook, 42" LCD, 60" plasma) and none of them compare to my 1080p projector (120"). 4K would be great to have though to more accurately capture the detail inherent to 35mm and 70mm film. 8K would be great too, but that's a ways away yet.

    We're "stuck" at 24fps because that's how film is shot and has been shot for about 100 years.
  • Finraziel - Wednesday, January 18, 2012 - link

    Well, I'm exaggerating my point slightly; I don't actually mean that I see no point at all in upping the resolution, and obviously on much bigger screens the advantage will be more obvious. I'm just saying that increasing the framerate might be a bigger win for a lot of people. As for being stuck at 24 fps because that's just how it's always been done: well, I guess you still go around with a horse and carriage, or take the steam train/boat for longer distances? Just because something was done a certain way for a long time doesn't mean you can't improve it. But I'm glad to see what name99 and B3an are saying below :)
  • name99 - Wednesday, January 18, 2012 - link

    You are right about frame rate, but there is a small amount of good news on that front. A few Hollywood directors who actually understand tech and are in a position to push the issue (notably James Cameron) are trying to ramp up frame rates.

    http://www.hollywoodreporter.com/news/james-camero...

    Obviously with digital cinemas this is a lot easier to push, but I expect that even if Avatar 2 is shot at 48 or 60 fps, there will be a long, long period of crossover. I mean, my god, we're still stuck with interlacing on plenty of broadcast TV.
  • B3an - Wednesday, January 18, 2012 - link

    The Hobbit movie is being shot in 4K at 48 FPS.
  • sicofante - Tuesday, January 17, 2012 - link

    The problem with high-DPI displays for laptops and desktops is that none of the main operating systems are designed to handle resolution-independent graphics. Even OSX does it in a tricky way, and it works because they control everything (as usual). Windows or Linux should go the true resolution-independence way (not the tricky OSX way). Then, and only then, maybe, and just maybe, manufacturers would consider raising the DPI of their screens and consumers would buy into them. As long as users just get tiny text on a high-DPI display, such displays can't start showing up on ordinary computers. That just doesn't happen on tablets, which is why you get high-DPI displays there.

    BTW, true resolution independence calls for hardware acceleration, but that shouldn't be an issue on laptops, much less on desktops.
  • sicofante - Tuesday, January 17, 2012 - link

    I meant "NO hires displays for computers while on Windows, OSX or LInux" for the title. Don't understsand why there's no edit button here.
  • LesMoss - Tuesday, January 17, 2012 - link

    Not to mention that many web pages break at higher resolutions.
  • JarredWalton - Tuesday, January 17, 2012 - link

    They break because the browsers aren't coded to be DPI aware right now. I think a lot of this will get fixed with Windows 8 and Metro apps; we'll find out later this year. Anyway, I'm using a 30" display with the 120 dpi setting in Windows 7, and browsing the web hasn't been one of my complaints (though I wish text and Flash would scale rather than being done at a fixed pixel size). I suppose if you define "break" as "are really tiny and hard to read on a high DPI display" then I can agree with you.
  • name99 - Wednesday, January 18, 2012 - link

    Bullshit. They break on your crappy browser.

    Do web pages display fine on iPhone Safari? OK then.

    I don't understand why people feel a compulsive need to say something doesn't work when proof that it works has been shipping for almost two years.
  • Malih - Tuesday, January 17, 2012 - link

    does the Yoga 13" have some sort of Thunderbolt port?

    I wish it does, external GPU is something I look forward to with future Ultrabooks to make my desktop obsolete, since my work doesn't use that much CPU anyway.
