CUDA - Oh there’s More

Oh, I’m not done. Beyond PhysX, NVIDIA is stressing CUDA as another huge feature that no other GPU maker in the world has.

For those who aren’t familiar, CUDA is a programming interface to NVIDIA hardware. Modern-day GPUs are quite powerful, easily capable of churning out billions if not a trillion instructions per second when working on the right dataset. The problem is that harnessing such power is a bit difficult. NVIDIA put a lot of effort into developing an easy-to-use interface to the hardware, and eventually it evolved into CUDA.
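To give a sense of what that interface looks like (this sketch is ours, not from NVIDIA's marketing materials), here is a minimal CUDA program: a function marked `__global__` is a "kernel" that runs once per GPU thread, and the host code launches thousands of those threads in one call. It requires NVIDIA's nvcc compiler and a CUDA-capable NVIDIA GPU to actually run.

```cuda
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Kernel: each GPU thread scales exactly one array element.
// Thousands of these threads execute concurrently on the GPU.
__global__ void scale(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)              // guard: the grid may be larger than n
        data[i] *= factor;
}

int main(void) {
    const int n = 1 << 20;              // one million floats
    size_t bytes = n * sizeof(float);

    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) host[i] = 1.0f;

    // Allocate GPU memory and copy the input over the PCIe bus.
    float *dev;
    cudaMalloc(&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);

    // Copy the result back and spot-check one element.
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("host[0] = %f\n", host[0]);  // 2.0 if the kernel ran

    cudaFree(dev);
    free(host);
    return 0;
}
```

The per-thread index arithmetic (`blockIdx.x * blockDim.x + threadIdx.x`) is the core idiom: instead of writing a loop, you describe the work for one element and let the hardware run it massively in parallel.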

Now CUDA only works on certain NVIDIA GPUs and certainly won’t talk to Larrabee or anything in the ATI camp. Both Intel and ATI have their own alternatives, but let’s get back to CUDA for now.

The one area where GPU computing has already had a tremendous impact is the HPC market. The applications there lend themselves very well to GPU programming, and thus we see incredible CUDA penetration in that space. What NVIDIA wants, however, is CUDA in the consumer market, and that’s a little more difficult.

The problem is that you need a compelling application and the first major one we looked at was Elemental’s Badaboom. The initial release of Badaboom fell short of the mark but over time it became a nice tool. While it’s not the encoder of choice for people looking to rip Blu-ray movies, it’s a good, fast way of getting your DVDs and other videos onto your iPod, iPhone or other portable media player. It only works on NVIDIA GPUs and is much faster than doing the same conversion on a CPU if you have a fast enough GPU.

The problem with Badaboom is that, like GPU-accelerated PhysX, it only works on NVIDIA hardware, and NVIDIA isn’t willing to give away NVIDIA GPUs to everyone in the world - thus we have another catch-22 scenario.

Badaboom is nice. If you have an NVIDIA GPU and you want to get DVD-quality content onto your iPod, it works very well. But spending $200 - $300 on a GPU to run a single application just doesn’t seem like something most users would be willing to do. NVIDIA wants the equation to work like this:

Badaboom -> You buy an NVIDIA GPU

But the equation really works like this:

Games (or clever marketing) -> You buy an NVIDIA GPU -> You can also run Badaboom

Now if the majority of applications in the world required NVIDIA GPUs to run, then we’d be dealing in a very different environment, but that’s not reality in this dimension.

Comments

  • SiliconDoc - Monday, April 6, 2009 - link

    Don't worry, it is mentioned in the article their overclocking didn't have good results, so they're keying up a big fat red party for you soon.
    They wouldn't dare waste the opportunity to crow and strut around.
    This was about announcing the red card, slamming nvidia for late to market, and denouncing cuda and physx, and making an embarrassingly numerous amount of "corrections" to the article, including declaring the 2560 win, not a win anymore, since the red card didn't do it.
    That's ok, be ready for the change back to 2560 is THE BEST and wins, when the overclock review comes out.
    :)
    Don't worry be happy.
  • tamalero - Thursday, April 9, 2009 - link

    SD, you seriously have a mental problem, right?
    I noticed that you keep bashing and being sarcastically insulting (among other things) to anyone who supports ATI.
  • SiliconDoc - Thursday, April 23, 2009 - link

    No, not true at all, there are quite a few posts where the person declaring their ATI fealty doesn't lie their buttinski off - and those posts I don't counter.
    Sorry, you must be a raging goofball too who can't spot liars.
    It's called LOGIC, that's what you use against the liars - you know, scientific accuracy.
    Better luck next time - If you call me wrong I'll post a half dozen red rooster rooters in this thread that don't lie in what they say and you'll see I didn't respond.
    Now, you can apologize any time, and I'll give you another chance, since you were wrong this time.
  • Nfarce - Thursday, April 2, 2009 - link

    I just finished a mid-range C2D build, and decided to go with the HD 4870 512MB version for $164.99 (ASUS, no sale at NE, but back up to $190 now). This was my first ATI card and it was a no-brainer. While the 4890 is a better card, to me, it is not worth the nearly $100 more, especially considering I'm gaming at either 1920x1200 on a 40" LCD TV or a 22" LCD monitor at 1680x1050.

    Nvidia has lost me after 12 years as a fanboy for the time being, I suppose. What I will do here when I have more time is determine if buying another 4870 512MB for CrossFire will be the better bang for my resolutions or eventually moving up to the 4890 when the price drops this summer and then sell the 4870.

    Thanks for the GREAT review AT, and now I have my homework cut out for me for comparisons with your earlier GPU reviews.
  • Jamahl - Thursday, April 2, 2009 - link

    Good job with tidying up the conclusion Anand.
  • Russ2650 - Thursday, April 2, 2009 - link

    I've read that the 4890 has 959M transistors, 3M more than the 4870.
  • Gary Key - Thursday, April 2, 2009 - link

    That is correct and is discussed on page 3. The increase in die size is due to power delivery improvements to handle the increased clock speeds.
  • Warren21 - Thursday, April 2, 2009 - link

    Maybe the tables should be updated to reflect this?
  • Gary Key - Thursday, April 2, 2009 - link

    They are... :)
  • helpmespock - Thursday, April 2, 2009 - link

    I've been sitting on my 8800GT for a while now and was thinking about going to a 4870 1GB model, but now I may hold off and see what prices do.

    If the 4890/275 force the 4870 down in price then great I'll go with that, but on the other hand if prices slip from the new parts off of the $250 mark then I'll be tempted by that instead.

    Either way I think I'm waiting to see how the market shakes out and in the end I, the consumer, will win.
