50 Comments

  • mberest - Sunday, July 20, 2008 - link

    The BadaBOOM website (www.badaboomit.com) states that the highest input resolution will be 720x480p. So for us Blu-ray transcoders this is not the product we've been waiting for. The search continues...
  • toyotabedzrock - Thursday, July 17, 2008 - link

    They need to find a way to use 2 GPU cards, in SLI or not, and also offload a small amount to the system CPU.

    Also, the BadaBOOM app looks too simplistic; power users will want more features, even if we don't need the same kind of control over the encode that a pro does.
  • keanu13 - Tuesday, July 8, 2008 - link

    Great info. I really believe that H.264 is one of the best video codecs, and its portability is also very flexible, although it has some limitations (cons).

    Maybe you can have a look at http://video-codecs.blogspot.com/
  • macscoop - Wednesday, July 2, 2008 - link

    Here's my wishlist item. I want to be able to input a list of AVCHD 1080i files from a high-def camcorder and have them transcoded to lower-resolution H.264 videos in MP4 container, suitable for embedding in a web page and playing in the "Moviestar" version of the Flash player. Right now, that takes too many steps.
  • dalleyg - Thursday, June 26, 2008 - link

    As requested in the article, here's a short wishlist:

    (a) Support input from common HD camcorders like the Canon HG10 (they use some "advanced" H.264 encoding features that very few decoders actually handle correctly).

    (b) Have a command-line version with no GUI so that batch scripts can be used.

    (c) Allow different compression options for different decoder complexities, different encoding times, and different bitrates.

    (d) Supply a DirectShow DMO and/or VFW encoder so people can purchase the CUDA encoder and then use it from other applications such as custom applications or VirtualDub.
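    A batch workflow like the one item (b) asks for could be sketched in a few lines of Python. The command-line encoder `cudaenc` used below is purely hypothetical (no such tool is confirmed to exist); it stands in for whatever CLI the vendor might ship:

    ```python
    import subprocess
    from pathlib import Path

    def build_command(src, dst, bitrate_kbps=8000):
        # Assemble the invocation for the hypothetical "cudaenc" CLI encoder.
        return ["cudaenc", "-i", str(src), "-o", str(dst), "-b", f"{bitrate_kbps}k"]

    def transcode_folder(folder):
        # Transcode every .m2ts clip in a folder, one after another.
        for src in sorted(Path(folder).glob("*.m2ts")):
            subprocess.run(build_command(src, src.with_suffix(".mp4")), check=True)
    ```

    With a real command-line encoder substituted for `cudaenc`, the same loop covers overnight batch jobs with no GUI interaction at all.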
  • Spiny - Monday, June 30, 2008 - link

    >(b) Have a command-line version with no GUI so that batch scripts can be used.

    Better: a COM interface so you could script it with Javascript or VB Script.
  • thkbrew89 - Wednesday, June 25, 2008 - link

    I would really like to see .mkv input/output like the article says, because a lot of my files are in that format. I would also like to see support for windows media (WMV) because that's the only format I can stream to my Media Center Extender. Basically, I would like to be able to take any kind of video input, and convert it to a .wmv using the VC1 codec. Obviously options like vbr/cbr, 1-pass/2-pass, auto-scaling, subtitles etc. would all be appreciated. It seems they are including ability to output to ipod/iphone compliant H.264, which would also be nice.
  • kilkennycat - Tuesday, June 24, 2008 - link

    As far as I have heard, Adobe is actively working on CUDA implementations of GPGPU acceleration of both photo-processing and video processing/transcoding elements of their next-gen Creative Suite 4. For the professional photographer or videographer/editor, if successful, this will be an enormous time-saver in transitioning from the raw input to the final finished product. And the cost of a GTX280 card (or several in SLI) would be mere chicken-feed in terms of the potential extra business revenue.
  • shiggz - Tuesday, June 24, 2008 - link

    There are actually a few critical advantages for me of the PS3 H.264 profile over .mkv. The "scene" switched to dual and then quad cores basically as soon as they were purchasable, even at $500-1k prices, because being first with a scene release is, I guess, a big deal for them. So if the speed is much improved and the quality comparable, "the scene" might move to this, unlike what they did with DivX vs. XviD.

    - GPU hardware acceleration (laptops and my weak-CPU HTPC): without GPU acceleration, 1080p .mkv files are just not playable for me; with GPU acceleration, no problem. The PS3 profile is hardware accelerated through players like PowerDVD.

    - Codec compatibility with the future PS4 and with past hardware (as I mentioned earlier).

    Once burnable Blu-ray discs get down to a few bucks each, I hope to do away with having a "Windows" box connected to my TV altogether: just DVD-quality H.264 rips of shows all on one Blu-ray. I could probably fit H.264 rips of all the Simpsons DVDs on 1 or 2 Blu-rays.

    My vision is to have one 30-slot CD case filled with H.264 Blu-rays, basically one disc per show at DVD or better quality, all playable off the PS3/PS4/PS5 or a weak laptop.

    Also, if you're into this sort of stuff: I want to get my laptop solar powered for $300, so in an earthquake, flood, or long-term power outage I can still have music/games/movies. Hardware-accelerated H.264 is the key to it all for me.

    http://www.siliconsolar.com/portable-solar-power-s...
  • LTG - Tuesday, June 24, 2008 - link

    Contrary to what someone else said x264 is being ported to GPUs and the project is active.

    Just Google this: Dark Shikari GPU

    It looks like Avail Media has hired a contractor who gets paid full time to work on the project.

    @Anand: It would be great if you could use your magic reporter skills to get the latest update on this project. Dark Shikari said the code wouldn't be GPL'd until it was completed.

    Also for testing you'll want 1080p uncompressed source video.

    The only place I know of that offers this free for testing and research is here:
    http://www.hdgreetings.com/ecard/video-1080p.aspx

    Can't wait for the followup -
  • 7oby - Wednesday, June 25, 2008 - link

    > Contrary to what someone else said x264 is being ported to GPUs and the project is active.
    > Just Google this: Dark Shikari GPU
    > It looks like Avail Media has hired a contractor who gets paid full time to work on the project.

    I said that, and it looks like my information is more up to date (06/21/2008):
    "Dark Shikari has already said that Avail Media is now going for FPGAs instead of CUDA ..."
    http://forum.doom9.org/showpost.php?p=1151050&...

    Dark Shikari is working for Avail Media
  • PatMeenan - Tuesday, June 24, 2008 - link

    Anand, I'm sure I didn't just see you publicly admit to violating the DMCA by ripping BD movies for your home theater, right?
  • legoman666 - Tuesday, June 24, 2008 - link

    He didn't rip them. He tripped over the power cable for the comp and when he got up, the movie had ripped itself somehow.
  • icrf - Tuesday, June 24, 2008 - link

    hehe, I was going to go with the slightly more feasible "I encoded them on a box overseas"
  • JonnyDough - Wednesday, June 25, 2008 - link

    You've been reading too many news articles about the U.S. Military "not torturing" "terrorists".
  • michal1980 - Tuesday, June 24, 2008 - link

    Nearly all Blu-rays are encoded using a 'next gen' codec, either MPEG-4 or an AVC flavor.

    Given that both are efficient codecs, why on earth would you want to compress the file even more? Loss of quality will happen.

    To me, it's like compressing an MP3 to another MP3. At least when ripping DVDs and compressing them to x264, you are taking MPEG-2 and putting it in a new format that is vastly more efficient.

    And while I see a need to rip a Blu-ray to a hard drive (media center), losing the new audio formats in the process destroys half the point of the new formats.
  • legoman666 - Tuesday, June 24, 2008 - link

    You clearly have never seen an HD rip at 12GB. The difference between the 12GB re-encode and the 30GB original is negligible.

    Re-encoding to 720p is also useful for people who don't have a 1080p display to take full advantage of the Blu-Ray.
  • icrf - Tuesday, June 24, 2008 - link

    It's like buying CBR 320k mp3's from an online retailer, and transcoding them to VBR to save a little space on your mobile audio player. Makes perfect sense, especially if you'll always listen on the mobile device which isn't high fidelity and you can't hear the difference between the two. If I have a 720p HDTV at home and hardware that can't decode a 1080p stream, why would I spend the extra storage space on the original?
  • michal1980 - Tuesday, June 24, 2008 - link

    Except the video encoding is already VBR.

    If you are going for true quality, then why throw out data? I can almost understand the transcode from 1080p to 720p.

    Still, are you going to be stuck with a 720p display forever?

    Another poster said the difference is negligible. Depending on the size of your screen and the movie, then yes.

    But I'd argue, especially if you are using it in a pure theater setting like Anand is, then why? You spent thousands on the display/projector/screen, etc., and a lot of money on the audio gear.

    And then you are going to worry about saving 8GB on your hard drive?

    How cheap is hard drive space these days? I saw 750GB for ~90 dollars, or about 12 cents a GB. Is it worth the dollar, or buck fifty, per movie plus the processing time? I would venture to guess that the electric bill to re-encode the movies would make that difference in cost savings even smaller.

    Unless you are really space limited, or doing it for compatibility reasons, I'd rather have the data as original as possible. The encoders spend a lot of time in most cases optimizing the encode so it always looks good. No automagic encoder is going to do that.
  • chrnochime - Tuesday, June 24, 2008 - link

    I don't see why this is such exciting news when you don't even know what kind of quality setting is being used by the transcoder.
  • lucapicca - Tuesday, June 24, 2008 - link

    I wonder if all this buzz about GPU programming is really a sane idea...
    First of all, from a video coding point of view, one would not transrate a video (I mean... same resolution, same GOP structure) and perform motion estimation again from scratch.
    And that is what GPUs might really be good at.
    Lastly, I'm not that impressed by the speed numbers.
    Is the performance/power ratio favourable to GPUs (in this application)?
    Is the transcoding done entirely on the GPU?
    Because... if 75% of the time is spent in communication/synchronization between CPU and GPU, I think that the future of computation is not in GPUs... and perhaps some sort of less powerful DSPs integrated into the CPU might really do the job better (see Cell).
    After all, it's just a matter of communication speed:
    sometimes sending a job to a remote CPU is not really worth it.
    Any opinion?
  • JonnyDough - Wednesday, June 25, 2008 - link

    I was thinking that myself as I read this article. The CPU simply isn't designed with this in mind, and if it was more specialized it could probably outperform the GPU. I think what you're suggesting is the merging of the GPU and the CPU...and it's my understanding that that merger is now finally underway. Once we hit 32nm and smaller, and begin to utilize more power saving features we'll see laptops REALLY begin to take off. Gaming on a 3 day battery powered laptop here we come. Hopefully.
  • Pjotr - Tuesday, June 24, 2008 - link

    [quote]In the worst case scenario, the GTX 280 is around 40% faster than encoding on Intel's fastest CPU alone.[/quote]

    Why can you never learn simple maths? If something completes in 8 seconds versus something that completes in 14 seconds, it's 75% faster. (If it had run in 7 seconds versus 14 seconds, it's obviously 100% faster, not 50%, isn't it?)
  • strikeback03 - Tuesday, June 24, 2008 - link

    It is not so much math as semantics. The 6 seconds between the NVIDIA number and the fastest 9770 is about a 40% time savings (6/14=0.4285...), which could be thought of as 40% faster.

  • Pjotr - Wednesday, June 25, 2008 - link

    To be done in 40% less time (42.9% in this case, actually) you are done in 60% of the time. To be done in 60% of the time, you must process 1/60% = 1.67x faster = 67% faster. To be done in 57.1% of the time (8 seconds out of 14) you must process 75% faster.

    If something is running 40% faster as stated in the article, it should be done in 71.4% of the time (i.e., use 28.6% less time). The article is incorrect in claiming processing is done 40% faster; it should say 75% faster OR done in about 40% less time.
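    The two ways of expressing the speedup argued over in this thread can be checked directly, using the 14 s CPU and 8 s GPU times quoted above:

    ```python
    cpu_time = 14.0  # seconds for the CPU encode
    gpu_time = 8.0   # seconds for the GPU encode

    time_saved = (cpu_time - gpu_time) / cpu_time  # fraction of wall time saved
    speedup = cpu_time / gpu_time - 1.0            # how much faster the work is processed

    print(f"{time_saved:.1%} less time")  # 42.9% less time
    print(f"{speedup:.1%} faster")        # 75.0% faster
    ```

    Both numbers are correct; they just answer different questions, which is why "40% faster" and "75% faster" can describe the same benchmark.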
  • JonnyDough - Wednesday, June 25, 2008 - link

    The two of you are correct. 40% faster is wrong.
  • DigitalFreak - Tuesday, June 24, 2008 - link

    Why can YOU never learn simple Englishs
  • INNAM - Tuesday, June 24, 2008 - link

    And to make it clear, a VOB file can hold H.264, but I think it has to be under 4GB and no DTS... Also, it can play H.264 as well.
  • INNAM - Tuesday, June 24, 2008 - link

    As a DVD/Blu-ray ripper myself, I suggest other formats besides MKV, given that MKV can only be played on the computer. This makes it painful to then convert it once more to VOB (PS3 playback) or WMV/AVI (X360 playback). Don't get me wrong: if BadaBOOM wants to make it, they NEED to add MKV support, because if you're going to watch via PC/HDMI, it has chapter support all the way to VC-1 and DTS.

    Oh, and I know having a built-in AC3 or DTS encode plugin costs money, but you could leave an option for hardcore users to supply their own plugins. That would make it oh so good!
  • shiggz - Tuesday, June 24, 2008 - link

    Also, shame on ATI. Years ago I spent $380 to buy an X1900 to help speed up video encodes (as they promised) and they never came through on GPU acceleration! Even all these years later it's still software acceleration; they just dropped the program. That was the last expensive ATI card I bought.

    However, with the 4850's total of 800 stream processors, if these could work the same way as NVIDIA's, as mentioned, ATI could potentially blow them away. Or am I missing something about the 4850's SP count that would make it not directly proportional to NVIDIA's?
  • mmntech - Tuesday, June 24, 2008 - link

    ATI still has the software for AVIVO encoding in their drivers section. However, it only works with certain X1000 series cards. Somebody did make a crack of the program to work on all cards but I couldn't get it to work.
    I have no idea why they stopped it for the HD series. So far the only exercise my HD 3850 is getting lately is Folding@Home and Compiz. I agree they really missed the boat on GPU encoding. It would be nice to rip and encode a full DVD movie to DivX or AVC without it taking two hours.

  • djc208 - Tuesday, June 24, 2008 - link

    I would think ATI is either seriously re-thinking that plan or trying to find someone to pair with for similar software on their cards. Their new "shoot for the middle" strategy is good but this will give nVidia not only an advantage in the graphics card market but also help sell more high end cards. ATI may not be able to compete on the high end but they can't afford not to compete in this new space at all.
  • shiggz - Tuesday, June 24, 2008 - link

    I hope they include a PS3-compatible H.264 profile.

    I've started noticing that lots of my 4-6 year old downloaded video files are just not very compatible these days. So I've started encoding all of my files with the PS3 profile in Nero, hoping that they will be 99% compatible with the PS4 or even later hardware, thus giving me hopefully another decade or more of guaranteed compatibility.
  • ViRGE - Tuesday, June 24, 2008 - link

    Speaking of profiles, did Elemental say what profiles BadaBoom and their professional applications will support? I've been told that the BadaBoom beta was limited to the Baseline profile, which is fine if you're encoding for mobile devices but not very useful for your backup scenario. Will the final version be able to encode material with High Profile features?
  • ltcommanderdata - Tuesday, June 24, 2008 - link

    Well, I definitely can't wait for OpenCL to avoid these platform dependent situations.

    In any case, I wonder if trying to reduce CPU load should be the only option. As in, if the GPU can encode very quickly, why not have an option where multiple CPU cores can be loaded to make things even faster, while leaving a single core free for other system tasks? I'm sure there are lots of cases where most of a quad-core processor would be free anyway, so why not use it too?
  • phatboye - Tuesday, June 24, 2008 - link

    I was against CUDA from the start. There really needs to be an Open API so we don't get tied to one developer's hardware. Not sure if OpenCL will be it but I do hope something comes soon.
  • ltcommanderdata - Tuesday, June 24, 2008 - link

    I imagine it could be a bit difficult to deal with all the different architectures out there. I believe only nVidia's new GT200 series has IEEE 754 compliant 64-bit support, while the older 8-series and 9-series were only 32-bit. ATI's 2k-series and 3k-series only had partial IEEE 754 64-bit support, and the 1k-series were GPGPU capable as well through their Pixel Shaders. With so many different platforms on the market right now, it'll be interesting to see where they find the common ground, or whether OpenCL will try to focus more on setting the standard for future generations. I suppose OpenCL could use OpenGL's extensions approach, which would presumably allow all current GPGPU forms to be supported, but that would still leave developers having to optimize for each platform.
  • SlyNine - Tuesday, June 24, 2008 - link

    Also, unless I missed it, what quality level did the GPU transcode equal: Fast, Balanced, or Insane quality?
  • Anand Lal Shimpi - Tuesday, June 24, 2008 - link

    We couldn't really do a direct comparison - the bitrates put it at somewhere between balanced and insane.

    Take care,
    Anand
  • Manabu - Wednesday, June 25, 2008 - link

    >We couldn't really do a direct comparison - the bitrates put it
    >at somewhere between balanced and insane.

    To make a comparison, just set the x264 encoder to produce an encode of the same size as your GPU-accelerated one. It doesn't matter that the GPU-accelerated one isn't tweakable, because x264 is. Bitrate is just size (bits) / time (seconds). Then you can compare quality. If the quality is too high in both to make a good comparison, then a reduction in bitrate would be necessary, to see which one loses quality faster.

    And I couldn't get exact information on the AutoMKV profiles, but maybe the fastest profile tries to retain a minimum of decency. Dark Shikari speculates that you can reach 240+ fps on a quad core with a 720p stream. http://forum.doom9.org/showpost.php?p=1152036&...
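    The size/time relationship behind the matched-size comparison above is easy to sketch. The 700 MB / two-hour figures below are illustrative numbers only, not taken from the article:

    ```python
    def avg_bitrate_kbps(size_bytes, duration_s):
        # Average bitrate is total size in bits divided by duration in seconds.
        return size_bytes * 8 / duration_s / 1000

    # e.g. a 700 MB encode of a two-hour movie:
    rate = avg_bitrate_kbps(700 * 1024**2, 2 * 3600)
    print(round(rate, 1))  # 815.6
    ```

    Force both encoders to the same output size and their average bitrates match automatically, so any quality difference is down to the encoder itself.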
  • tuteja1986 - Tuesday, June 24, 2008 - link

    Err :( IQ testing is needed. Anyway, I use the AVIVO converter to do quick conversions for my Zen. It takes me 3 minutes to do a whole 40-minute episode of a TV show, or 6 minutes for a movie.
  • rhangman - Tuesday, June 24, 2008 - link

    You would need to at least compare profiles. How many reference frames, etc. More advanced settings require more resources to encode (and decode).

    Really comes down to actually viewing the encodes though. If it can't touch x264's quality, then it doesn't matter how much faster it is. The scene will stick with x264 anyway just like they did with Xvid over DivX, 3ivX, etc. So it is x264 selling players, hardware decoding graphics cards, etc. just like it was Xvid selling Standalone DivX players.
  • JonnyDough - Wednesday, June 25, 2008 - link

    Agreed. I don't rip CDs to my computer at low bit rate either. The higher the quality (even if you can't see it) the better. Just because your computer monitor can't display uber-high res doesn't mean that the massive tv set or projector you buy a few years down the road can't. There's simply no point in building a library of movies and music if you aren't going to rip it in the highest quality possible. If you need to transfer it to a limited sized mobile storage device THEN you convert it on the fly. With larger than 1TB hard drives on the way I don't see who wouldn't want to store their movies in the sharpest image possible.
  • 7oby - Tuesday, June 24, 2008 - link

    [quote]You would need to at least compare profiles. How many reference frames, etc. More advanced settings require more resources to encode (and decode).[/quote]

    The bitrate hardly has an impact on encoding speed. As far as I understood, CABAC is done on the 30%-loaded CPU here. That means if you change the bitrate, the quality will change, but the encoding speed should basically stay the same.

    As you said: a profile comparison gives some more information. I expect Baseline Profile at most, since it's a derived product:
    http://elementaltechnologies.com/products.php?id=4

    Maybe it's only intra frame:
    http://elementaltechnologies.com/products.php?id=1

    One would need more advanced quality tests to tell e.g. how good the motion estimation works. If this one is bad, you will need additional bandwidth to compensate.

    In any case: I think this is an innovative product for the intended target use case of transcoding movies for iPhones, PS3, HTPC etc.

    x264 developers recently turned away from CUDA, although they started experimenting in December 2007:
    http://forums.nvidia.com/lofiversion/index.php?t53...
  • Anand Lal Shimpi - Tuesday, June 24, 2008 - link

    The problem is that the beta of BadaBOOM doesn't expose any of what it's doing to the end user; we'll have to wait for the pro version for that, it seems.

    -A
  • rhangman - Tuesday, June 24, 2008 - link

    Should be able to extract the raw video stream and do some analysis on it though. Get an idea what the encoder is doing. For a visual comparison you don't need to know anyway.

    At any rate, in terms of encoding, speed is only half the equation.
  • Anand Lal Shimpi - Tuesday, June 24, 2008 - link

    Agreed :) Working on that part, I may wait until the next beta though so we have the latest code at our disposal.

    -A
  • Rainman200 - Tuesday, June 24, 2008 - link

    That's great news, thanks Anand. Also do compare with Ripbot as well, as it uses more bleeding-edge versions of x264 with patches for film-grain optimization and more.

    Also, a comparison against the constant quality mode would be interesting. I use Ripbot (default High Profile) with a setting of 18, and with most movies I can barely tell the difference on the HDTV vs. the original. On a 3GHz quad-core Core 2 it takes at worst about 1 hour 30 minutes on average for a film, usually shorter, but noisy/grainy movies like Downfall or Minority take that little bit longer.

    It will be very interesting to see how RapiHD fares in terms of image quality.
  • SlyNine - Tuesday, June 24, 2008 - link

    Make it support other transcoding functions, like WMA-to-MP3 for audio. I know that for the most part the only thing that really needs this type of boost is HQ H.264 Blu-ray movies (and HD DVD).

    But it would be awesome to accelerate the other stuff as well.
  • icrf - Tuesday, June 24, 2008 - link

    Those are pretty lightweight compared to H.264, so there's probably little reason. Besides, writing a codec in CUDA is no small task. I think they should focus on making the H.264 encoder as flexible as possible.

    I assume the feature request is more to do with features of the application doing the encoding, and not about what code to run on the GPU itself. My encoding settings of choice tend to be a Level 4.1 compliant MPEG4 file, H.264 video and 5.1 AAC-LC audio. Ideally, it should use any VFW codecs installed and dump to the various common containers (avi/mkv/mp4/ogm with things like mov/wmv being distant seconds).
