Introduction

Oooh this is dangerous.

It started with Intel quietly (but not too quietly) informing many in the industry of its plans to enter the graphics market with something called Larrabee.

NVIDIA responded by quietly (but not too quietly) criticizing the nonexistent Larrabee.

What we've seen for the past several months has been little more than jabs thrown back and forth, admittedly with NVIDIA being a little more public with its swings. Today is a big day: without discussing competing architectures, Intel is publicly unveiling, for the first time, the basis of its Larrabee GPU architecture.

Well, it is important to keep in mind that this is first and foremost NOT a GPU. It's a CPU: a many-core CPU that is optimized for data-parallel processing. What's the difference? There is very little fixed-function hardware, and the hardware is targeted at running general-purpose code as easily as possible. The bottom line is that Intel can make this very wide many-core CPU look like a GPU by implementing software libraries to handle DirectX and OpenGL.

It's not so much emulating a GPU as directly implementing, on a data-parallel CPU, functionality that would normally be handled by dedicated hardware. And developers will not be limited to just DirectX and OpenGL: this hardware can take pure software renderers and run them as if the hardware were designed specifically for that code.
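To make the idea concrete, here is a minimal sketch (not Intel's actual code; all names are hypothetical) of the kind of inner loop a software renderer runs: rasterizing a triangle into a framebuffer with edge functions. On a Larrabee-style many-core CPU, each tile of the framebuffer could be handed to a different core or vector lane, which is exactly the data-parallel workload the architecture is built for.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };

// Signed area test: which side of edge a->b does point p fall on?
static float edge(const Vec2& a, const Vec2& b, const Vec2& p) {
    return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
}

// Fill one triangle into a 32-bit framebuffer. A real renderer would
// split the bounding box into tiles and process them in parallel.
void rasterizeTriangle(std::vector<uint32_t>& fb, int width, int height,
                       Vec2 v0, Vec2 v1, Vec2 v2, uint32_t color) {
    int minX = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
    int maxX = std::min(width - 1, (int)std::max({v0.x, v1.x, v2.x}));
    int minY = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
    int maxY = std::min(height - 1, (int)std::max({v0.y, v1.y, v2.y}));
    for (int y = minY; y <= maxY; ++y) {
        for (int x = minX; x <= maxX; ++x) {
            Vec2 p{x + 0.5f, y + 0.5f};  // sample at the pixel center
            float w0 = edge(v1, v2, p);
            float w1 = edge(v2, v0, p);
            float w2 = edge(v0, v1, p);
            // Inside if the pixel is on the same side of all three edges.
            if ((w0 >= 0 && w1 >= 0 && w2 >= 0) ||
                (w0 <= 0 && w1 <= 0 && w2 <= 0))
                fb[y * width + x] = color;
        }
    }
}
```

The point of the sketch is that nothing here requires fixed-function hardware: it is ordinary C++ that a compiler can target at any wide many-core CPU, which is why Intel can layer DirectX and OpenGL libraries on top of the same machinery.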

There is quite a bit here, so let's just jump right in.

The Design Experiment: Could Intel Build a GPU?

101 Comments


  • DerekWilson - Monday, August 04, 2008 - link

    This is a pretty good observation ...

    But no matter how much potential it has, performance in games is going to be the thing that actually makes or breaks it. It's of no use to anyone if no one buys it, and no one is going to buy it because of potential; it's all about whether or not they can deliver on game performance.
  • Griswold - Monday, August 04, 2008 - link

    Well, it seems you don't get it either.
  • helms - Monday, August 04, 2008 - link

    I decided to check out the development of a game I heard about ages ago that seemed pretty unique, not only the game itself but the engine behind it. Going to the website, it seems Intel acquired them at the end of February.

    http://www.projectoffset.com/news.php
    http://www.projectoffset.com/technology.php

    I wonder how significant this is.
  • iwodo - Monday, August 04, 2008 - link

    I forgot to ask: how will the software renderer work out on the Mac? Since all DirectX code is run through a software renderer, doesn't that fundamentally mean most current Windows-based games could be run on the Mac with little work?
  • MamiyaOtaru - Monday, August 04, 2008 - link

    Not really. Larrabee will be translating DirectX to its software renderer. But unless Microsoft ports the DirectX API to OS X, there will be nothing for Larrabee to translate.
  • Aethelwolf - Monday, August 04, 2008 - link

    I wonder if game devs could write their games in DirectX, have the software renderer convert them into Larrabee's ISA on the Windows platform, and capture the binary somehow. Distribute the DirectX version on Windows and the software ISA version on the Mac. No need for two separate code paths.
  • iwodo - Monday, August 04, 2008 - link

    Can anyone point out which of the assumptions Anand makes are false? That would be great, because what he is saying simply seems too good to be true.

    One point worth mentioning: the 4MB cache takes up nearly 50% of the die size. So if Intel could rely more on bandwidth and save on cache, they could put in a few more cores.

    And am I the only one who thinks 2010 is too far away for an introduction? I think summer 2009 seems like a much better time. Then they would have another 6 - 8 months before moving on to 32nm with higher clock speeds.

    And as for the game developers: with the cash Intel has, $10 million for every high-profile studio like Blizzard and $50 million to EA to optimize for Intel would only cost them $100 million of pocket money.
  • ZootyGray - Monday, August 04, 2008 - link

    I was thinking of all the p90's I threw away - could have made a cpu sandwich, with a lil peanut software butter, and had this tower of babel thing sticking out the side of the case with a fan on top, called lazarus, or something - such an opportunity to utilize all that old tek - such imagery.

    griswold u r funny :)
  • Griswold - Monday, August 04, 2008 - link

    You definitely are confused. Time for a nap.
  • paydirt - Monday, August 04, 2008 - link

    STFU Griswold. It's not helpful for you to grade every comment. Grade the article if you like... AnandTech, is it possible to add an ignore-user function for the comments?