Anand and I just spent some time playing with NVIDIA's recently announced Project Shield, NVIDIA's portable handheld gaming device and home to the also just-announced Tegra 4 SoC (4x ARM Cortex A15). I came away pretty impressed with Project Shield's ergonomics after initially going in very skeptical (like GPU editor Ryan Smith) about how 38 watt-hours of batteries, a display, Tegra 4, and all the accompanying hardware could possibly feel comfortable stuffed into what boils down to a game controller.
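For scale, here's a rough back-of-the-envelope sketch of what a 38 Wh pack buys at a few assumed average system power draws. The draw figures below are illustrative guesses on my part, not NVIDIA numbers:

```python
# Back-of-the-envelope runtime estimate for Shield's stated 38 Wh battery pack.
# The average-draw figures are illustrative guesses, not NVIDIA-provided data.
BATTERY_WH = 38.0  # stated battery capacity in watt-hours

def estimated_runtime_hours(avg_draw_watts: float) -> float:
    """Runtime in hours at a constant average system power draw."""
    return BATTERY_WH / avg_draw_watts

for draw in (4.0, 8.0, 12.0):  # light use, heavy gaming, worst case (guesses)
    print(f"{draw:4.1f} W -> {estimated_runtime_hours(draw):.1f} h")
```

If the average draw under gaming load really were around 8 W, that would work out to roughly 4.75 hours; actual runtime will depend on display brightness, radios, and SoC load.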

First off, Shield isn't nearly as heavy as I thought it would feel in the hands. I expected Shield to feel very dense; instead, the device doesn't feel much different from a wireless Xbox 360 controller with the rechargeable battery pack attached. I made a point of holding onto the Shield the entire time in a few different positions and didn't feel my hands or arms fatiguing at all. NVIDIA carefully positioned the batteries right around the palm areas, so the heaviest parts of the device are right in-axis with one's palms and arms rather than somewhere that would torque one's grip. There's something about the construction that feels balanced and masks the fact that this is actually a sizeable piece of hardware. Shield will ship with a soft-touch bottom and a different top texture finish; the first version we played with had neither.

NVIDIA was eager to point out that the D-Pad, triggers, and bumpers are all in the process of being tweaked and tuned, and that the spring preloads and click points on the Shields we played with were nowhere near final. This is good to hear, as the D-Pad at present definitely needed to be less mushy and more communicative; we're told this will be replaced with a much more responsive one before Shield is finalized. I didn't have a problem with the analog sticks, but would love to feel alignment nubs or texture on the domes.

My biggest thoughts were framed around the placement of the two analog sticks, which at present places their tops in the plane of the buttons and D-Pad. This initially felt very alien until I realized it's done because the display needs to close shut (the analog sticks would otherwise press into the screen), but it still felt a little awkward. I'm personally used to the 360 controller, which has analog sticks that protrude above the plane of the buttons. I can see one getting used to this, as the awkwardness is simply an artifact of my prior exposure and trained response to the 360 controller.

The 720p 5-inch "retinal" display indeed exceeded my visual acuity; it was hard to pick out individual pixels on it, though I still think there's too much bezel on Shield that should be taken up with more display. Something like a 5.5" 1080p display would make much more sense, but that's probably reserved for another iteration. The biggest concern is how smaller features in PC games played back on the Shield would be difficult to pick out. NVIDIA claims it will mitigate this by working with game developers on appropriate titles and by offering a pinch-to-zoom feature to let users read small elements. Obviously some games with lots of text (they offered the example of EVE Online) can't possibly work perfectly, but I had no problems playing a few levels of Assassin's Creed 3 and Black Ops 2 on Shield. I could see some H.264 artifacts while playing on Shield; higher bitrates could solve that problem easily.

Wireless connectivity on the Shield is courtesy of Broadcom's latest BCM43241, a 2x2:2 802.11a/b/g/n solution, which is the right thing to do in a platform that so strongly leverages wireless display and control. Responsiveness while playing PC games on the Shield was extremely good; I couldn't detect any lag.

At the front of the Shield is a small gap and grille; it turns out that heatsink in the NVIDIA press event video was in fact real, as air is drawn in from the front of the Shield, over the heatsink, and exhausted out the back. There is indeed a fan inside, albeit a small one. NVIDIA says it won't come on all the time during normal use, and after playing a few native Android games (Hawken) on the device this seemed to be the case. One Shield left in the sun did have its fan kick on, though even then it was essentially inaudible. I didn't actually feel Shield get hot at all during use, but that heatsink wasn't just for show in the press event video/demo.

At the very back are the microSD slot, micro HDMI and micro USB ports, and the headphone jack. I'm told these are also changing slightly; I'd honestly like to see the headphone jack come around to the front.

What really struck me about the Shield was how far from ergonomic the device appears, and yet how surprisingly comfortable it is to hold. NVIDIA nailed the underside ergonomics almost perfectly: there's a small ledge for your fingers to rest on, and the palm cups are indeed reminiscent of the 360 controller.

Android 4.2.1 on the Shield felt extremely responsive and fluid. I was very impressed with browser scrolling in both Chrome and Browser.apk, the latter of which is now a huge optimization target for NVIDIA. The rest of the UI was also very fluid. I should note that NVIDIA is not allowing benchmarking at present, so we can't offer anything beyond subjective impressions of Tegra 4 performance.

We recorded a video of gameplay on the Shield and are uploading it as fast as CES connectivity will afford. Edit: Videos are now up. 

41 Comments

  • Stuka87 - Monday, January 07, 2013 - link

    He meant to say 660 (it requires a GTX 650, or a GTX 660M).
  • cmikeh2 - Monday, January 07, 2013 - link

    That was my bad. You're right.
  • MonkeyPaw - Monday, January 07, 2013 - link

    The fact that Shield uses a fan raises a few questions about Tegra 4. I get the impression this is a higher-TDP chip than what we will see in tablets. That, or future tablets might be gaining active cooling.
  • jwcalla - Monday, January 07, 2013 - link

    Yikes. Not a fan of that idea.
  • cmikeh2 - Monday, January 07, 2013 - link

    Pun intended?
  • ET - Tuesday, January 08, 2013 - link

    Fan intended.
  • cmikeh2 - Monday, January 07, 2013 - link

    I'd guess that the functional TDP of this thing is probably in the realm of 8 W, similar to the IVB ULV parts announced today. Considering the Exynos 5250 would spike to 8 W but typically run at 4 W, I can see this running at around 6-8 W under typical usage.
  • Chloiber - Monday, January 07, 2013 - link

    I don't think so. The T4 in this thing is probably clocked very aggressively and pushed to the limit. It's made for gaming.
    Additionally, coming from the One X: T3 also gets pretty hot if you game for a long period of time. Start a THD game that puts heavy load on all 4 cores and you may see temps beyond 50°C (as reported for the BATTERY). Well, a One X is not made for gaming, at least not for hours - but this thing is. So as Brian pointed out, it did not get hot during normal usage for him and the fan was not needed. But I expect the chip to heat up if you use it heavily for a long period of time.
  • tipoo - Tuesday, January 08, 2013 - link

    Maybe not. The Tegra 3 in the Ouya has a fan and heatsink, but obviously the ones in phones and tablets don't require that. It depends on clocks and voltages. Maybe the Shield's runs higher than others.
  • TareX - Monday, January 07, 2013 - link

    This thing is as portable as a laptop. The Vita, on the other hand, is super light and ultraportable, plus it's backed by the big gaming companies (well, some of them) and Sony. The Shield seems like the older brother of the Xperia Play... Dedicated games will be far from console quality, similar to how FIFA Soccer on the Vita compares to FIFA on the iPad 3/Android. No comparison.
