Coming Soon to HD DVD: Silicon Optix HD HQV

by Derek Wilson on 2/8/2007 1:25 AM EST
27 Comments


  • bigpow - Monday, February 12, 2007 - link

    I'd like to see more results, maybe from the Xbox 360 HD-DVD and Toshiba HD-DVD players, before I can be convinced that ATI & NVIDIA totally suck

    Reply
  • thestain - Sunday, February 11, 2007 - link

    Suggest a redo Reply
  • ianken - Friday, February 09, 2007 - link

    ...I meant that in the context of post processing. FWIW. Reply
  • ianken - Friday, February 09, 2007 - link

    Since every HD DVD and BRD I've seen is authored at 1080p, I don't think 1080i film cadence support is that critical for either next-gen disc format.

    It is critical for HD broadcasts where 1080i content is derived from telecined film or HD24p content and not flagged, which is very, very common on cable and OTA feeds.
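    The cadence problem the parent describes can be sketched in a few lines. This is a toy model (field contents reduced to frame labels, names mine, nothing like a vendor's real algorithm): unflagged telecined material betrays itself by a repeat pattern that recurs with period 5.

```python
# An unflagged telecined field sequence: 3 fields from frame A, 2 from B,
# 3 from C, 2 from D -- the classic 3:2 pulldown of 24fps film to 60 fields/s.
fields = list("AAABBCCCDD")

def detect_repeats(fields):
    """Positions where a field repeats its predecessor. With no flags, this
    periodic repeat pattern is all a deinterlacer has to lock onto."""
    return [i for i in range(1, len(fields)) if fields[i] == fields[i - 1]]

detect_repeats(fields)   # [1, 2, 4, 6, 7, 9] -- the period-5 cadence
```

    A player that finds this period-5 pattern can reassemble the original progressive film frames instead of deinterlacing blindly.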

    Noise reduction: just say no. It is NOT more important for HD. Noise reduction simply replaces random noise with deterministic noise and reduces true detail, I don't care how much magic is in there. With FUBAR analog cable it can make an unwatchable image moderately palatable, but keep it away from my HD-DVD, BRD or broadcast HD content.

    On my 7800GTX I get film cadence detection and adaptive per-pixel vector deinterlace on 1080i. The problem you're seeing may be with the HD-DVD/decoder app failing to properly talk to the GPU. On XP they need to support proprietary APIs to get anything beyond base VMR deinterlacing, particularly for HD. With Cyberlink there is even a "PureVideo" option in the menus for this. If they do not support PureVideo HD then you will get none of those advanced features on NVIDIA hardware. Not sure what ATI does, but I do believe they only support film cadence and noise reduction on SD content.



    Reply
  • peternelson - Friday, February 09, 2007 - link

    "Noise can actually be more of a problem on HD video due to the clarity with which it is rendered. While much of the problem with noise could be fixed if movie studios included noise reduction as a post processing step, there isn't much content on which noise reduction is currently performed. This is likely a combination of the cost involved in noise reduction as well as the fact that it hasn't been as necessary in the past. In the meantime, we are left with a viewing experience that might not live up to the expectations of viewers, where a little noise reduction during decoding could have a huge impact on the image quality.

    There are down sides to noise reduction, as it can reduce detail. This is especially true if noise was specifically added to the video for effect. We don't run into this problem often, but it is worth noting. On the whole, noise reduction will improve the clarity of the content, especially with the current trend in Hollywood to ignore the noise issue. "

    > Doing noise reduction at the player is less than ideal. You take noisy content, then waste much of your data rate describing noise. The NR should be done as a PRE-processing step prior to feeding the encoder (not post-processing as you suggest). Any movie studios making discs without NR are just lazy, and the customer deserves better. Obviously a generous bitrate and an efficient encoding standard like MPEG-4 are desirable, but you waste the benefit if you don't either noise-reduce the content or have substantively noise-free content like CGI animation sequences from Pixar.

    Thus the workflow ought to be: telecine scan data or digital intermediate (eg 2K film res), into colour correction, into pan/scan cropping or aspect ratio conversion scaling (eg Cinemascope into 16:9), then into noise reduction (spatial and temporal etc), into the encoder.
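    As a sketch of that ordering only — the stage functions below are placeholders that just tag each frame, not real image processing:

```python
# Each stage tags the frame so the final order is visible; in a real
# mastering pipeline these would be actual image operations.
def colour_correct(frames): return [f + ":cc" for f in frames]
def aspect_convert(frames): return [f + ":16x9" for f in frames]  # eg 'scope -> 16:9
def noise_reduce(frames):   return [f + ":nr" for f in frames]    # spatial + temporal
def encode(frames):         return [f + ":enc" for f in frames]

def mastering_pipeline(scan):
    """Telecine scan / DI -> colour correction -> aspect conversion -> NR -> encoder."""
    for stage in (colour_correct, aspect_convert, noise_reduce, encode):
        scan = stage(scan)
    return scan

mastering_pipeline(["frame0"])   # ['frame0:cc:16x9:nr:enc']
```

    The point is that noise reduction is the last thing the frames see before the encoder, so no bits are ever spent describing noise.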

    Done professionally, different portions of the movie can be encoded with different processing parameters which kick in at the desired timecodes. These are often hand-optimised for sequences that can benefit from them. Such setups may be called ECLs (encoder control lists), rather like EDLs (edit decision lists).
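    Something like this, presumably — a hypothetical ECL shape, with the timecodes and settings keys invented purely for illustration:

```python
# (in_tc, out_tc, encoder settings) -- hand-tuned per scene, much as an EDL
# drives an edit. Fixed-width HH:MM:SS:FF strings compare correctly as text.
ecl = [
    ("00:00:00:00", "00:04:10:12", {"nr": "light", "bitrate_mbps": 18}),
    ("00:04:10:12", "00:06:02:00", {"nr": "off",   "bitrate_mbps": 28}),  # grainy scene
]

def settings_at(ecl, tc):
    """Look up which hand-optimised settings apply at a given timecode."""
    for start, end, cfg in ecl:
        if start <= tc < end:
            return cfg
    return {}
```

    The encoder walks the list and switches parameters as each timecode range begins.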

    Equipment to do excellent realtime noise reduction in high definition is readily available, eg from Snell & Wilcox, and if you can't afford it you should either not be in the encoding business or should be hiring it for the duration of the job from a broadcast hire supplier. Alternatively, NR processing may be a feature of your telecine/datacine capture platform.

    Ideally the encoded streams can be compared with the source material to identify any significant encoding artifacts like noticeable DCT macroblocking. This is basic QA and can be done in software and/or visually/manually.

    If the NR is done by the studio prior to disk mastering, I see no reason to rely on the cheap and nasty NR in the player, and of course using a display capable of the proper bit depth and resolution will avoid quantisation banding and scaling degradation.

    Poor attention to production values is diminishing the experience of what ought to be great content.

    Contrary to your statement, noise reduction ought to have been used at standard definition too by anyone doing encoding professionally for DVDs etc. Even moderately priced/affordable gear from FOR-A could do NR and colour correction using SDI digital ins and outs (that's if you can't afford the Snell gear LOL). The difference is certainly noticeable even before moving to HD content and bigger screens.

    Not all noise reduction techniques reduce detail, particularly when done at the preprocessing stage. Taking noise out makes more bits available for the denoised content to be described in MORE detail at an equivalent bitrate. Clever algorithms are able to take hairs out of frames of movie film and replace them with what ought to be there from adjacent frames (including using motion vector compensation). At this stage the maximum uncompressed source data is available on which to perform the processing, whereas NR in the player suffers from only having the bit-constrained compressed material to work from. Other pre-processing might include removing camera shake (eg Snell Shakeout) so that compression bits are not wasted on spurious motion vectors where these are undesired. Genuine pans, zooms etc can be distinguished and still get encoded.
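    A minimal sketch of the temporal side of this, on a single static "pixel". The noise here is a deterministic ±4 pattern so the numbers are reproducible; real NR is far more sophisticated, with the motion compensation described above:

```python
TRUE = 100.0
# Toy alternating noise of +/-4 around a static pixel value, frame to frame.
frames = [TRUE + (4 if i % 2 == 0 else -4) for i in range(9)]

def temporal_nr(samples, radius=1):
    """Average each frame's pixel with its temporal neighbours."""
    out = []
    for i in range(len(samples)):
        window = samples[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

err_before = sum(abs(s - TRUE) for s in frames) / len(frames)               # 4.0
err_after = sum(abs(s - TRUE) for s in temporal_nr(frames)) / len(frames)   # ~1.04
```

    For a static scene the averaged error is much smaller, and done before encoding, the encoder never spends bits on the noise at all; done in the player, NR can only work on the already bit-starved compressed result.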

    You rightly point out that video using deliberately added noise as a simulation of film grain can be troublesome to encode, but there are several other techniques for making video appear film-like, eg Magic Bullet hardware or software as pioneered by The Orphanage, which can do things like alter the gamma curve and replicate various film lab processes like bleach bypass (as in the opening sequences of Saving Private Ryan).
    Reply
  • DerekWilson - Sunday, February 11, 2007 - link

    Thanks for the very informative post.

    I think we've got a bit of a miscommunication though ...

    I'm not referring to post processing as post-encoding -- I'm referring to it as Hollywood refers to it -- post-filming ... as in "fix it in post". You and I are referring to the same step in the overall scheme of things: after filming, before encoding.

    It seems a bit odd that I hadn't heard anyone talk about processing from the perspective of the encoding step before, as a brief look around Google shows that it is a very common way of talking about handling content pre- and post-encoding.

    In any event, it may be that studios who don't do noise reduction are just lazy. Of course, you'd be calling most of them lazy if you say that. We agree that the customer deserves better, and currently they aren't getting it. Again, go pick up X-Men 3. Not that I liked the movie, but I certainly would have appreciated better image quality.

    Does your statement "If the NR is done by the studio prior to disk mastering, I see no reason to rely on the cheap and nasty NR in the player" go the other way as well? If studios do not perform noise reduction (or, perhaps, adequate noise reduction) prior to mastering, is NR in the player useful?

    I think it is -- but I do want to be able to turn it on and off at will.
    Reply
  • Wesleyrpg - Thursday, February 08, 2007 - link

    Reads more like an advertisement for Silicon Optix than an article for Anandtech?

    The future of advertising? Buy an article?
    Reply
  • JarredWalton - Thursday, February 08, 2007 - link

    Hardly. People email us about all kinds of topics, and one of those has been HD video support. We've don't HQV image quality comparisons before, as have many websites, and it's not too surprising that NVIDIA and ATI decoder quality improved after many of the flaws were pointed out. It appears that there are plenty of flaws with the 1080i decoding now, and I'd bet that in the future it will be dramatically improved. We find the results to be useful - i.e. both ATI and NVIDIA are doing essentially nothing with HD video other than outputting it to the display. Now, readers will know that and maybe we'll see improvements. Not everyone cares about improving HD video quality, but for those that do this is good information to have. Reply
  • Wwhat - Sunday, February 11, 2007 - link

    quote:

    both ATI and NVIDIA are doing essentially nothing with HD video other than outputting it to the display

    Well that's clearly not true: they both try to de-interlace, as the test shows; it's just not a good effort. So don't make such silly statements.


    Reply
  • Wesleyrpg - Friday, February 09, 2007 - link

    sorry Jarred, I must have woken up on the wrong side of the bed this morning, I didn't mean to take it out on you guys. I love Anandtech, and may have been a bit confused by the article.

    Sorry again
    Reply
  • JarredWalton - Thursday, February 08, 2007 - link

    *grumble* Should be "we've done HQV...." Reply
  • ShizNet - Friday, February 09, 2007 - link

    a big part of GPU driver problems is backwards compatibility [GF2-7, Rad.7-X1, DX6-9..];
    DirectX 10 is a totally new beast - why not draw the line and develop drivers from now on for legacy devices and DX10+ ones separately?
    this would keep old and new drivers in 'good' shape, with no need for bloated, oversized files full of old junk.
    Reply
  • Wwhat - Sunday, February 11, 2007 - link

    Since DX10 is Vista-only and Vista uses a whole new driver model, it is obvious and inevitable that separate drivers are being developed for post-DX10 hardware, heh.
    So why are you asking for something that everybody already knows is going on and sees happening? Have you not heard about the issues concerning Vista and the issues the graphics companies have/had releasing drivers for it?
    Plus, since ATI-nay-AMD has lots of X1-only stuff, it's clear that they have also already separated their drivers in that sense.
    Reply
  • kilkennycat - Thursday, February 08, 2007 - link

    I'm sure that Silicon Optix would only be too happy to quickly develop a hardware HDTV silicon solution for NVIDIA and ATi/AMD or their board partners as a manufacturing option for their graphics cards. No doubt Silicon Optix developed the HD-HQV tests both to weed out the under-performers AND to encourage the widest possible use of their silicon... It would save NVIDIA and ATi the bother of even more driver complication and possible tweaks to their GPU hardware (for mucho, mucho $$) for the few that want the highest-quality HD reproduction (regardless of whether the source is 1080p, 1080i or even 720p) from their PCs. The same few would probably be only too willing to shell out the $50 or so extra for the "high-quality HD" option version of their favorite video card. Reply
  • abhaxus - Thursday, February 08, 2007 - link

    I use either VLC or DScaler to watch 1080i on my PC. I've got an X800XL so I don't have the ability to use Avivo. I'd be interested to see how this disc fares on those two solutions; I've always liked VLC's X-method deinterlacing. Reply
  • RamarC - Thursday, February 08, 2007 - link

    The testing seemed to focus on de-interlacing issues. HD DVD (and Blu-ray) are intended to store progressive (non-interlaced) content. Some early titles (and crappy transfers) may be stored as 1080i, but by the middle of this year 95%+ of all HD titles will be 1080p and de-interlacing will be a non-issue. Reply
  • ShizNet - Thursday, February 08, 2007 - link

    why focus on interlaced content?
    ______________________________________
    can you say TV-broadcasting?
    the same 95%+ of 'stuff' you'll be watching is TV/Cable/Dish [which are 1080i] and not [HD]DVDs nor IPTV for the next 5 years+
    even when all TV stations go digital it's only 540p; don't confuse it with HDTV - 720/1080[i/p]. only the BIG ones with deep pockets will go HDTV full time.
    Reply
  • autoboy - Thursday, February 08, 2007 - link

    You guys are missing the point of this test. Broadcast TV is almost all 1080i content, and deinterlacing is very important. The HD-DVD is simply the source of the benchmark, but it should be able to test the playback capability of PCs for broadcast HD as well as interlaced HD-DVD, if that exists. Playing progressive scan images is easy, and the only thing that should affect it is the noise reduction, which I don't use because it usually reduces detail.

    Still...this article left me with more questions than answers.

    What decoder did you use for the ATI and NVIDIA tests? The NVIDIA PureVideo decoder or PureVideo HD?
    Did you turn on Cadence Detection on the ATI card and Inverse Telecine on the NVIDIA card?
    What video cards did you use? You usually use a 7600GT and X1900 Pro.
    What drivers did you use?
    What player did you use?
    Is this test only for HD-DVD decoders, or can you use any MPEG-2 decoder? That would make this a much more relevant test, since 1080i HD-DVD is rare and broadcast HD is what really matters here.
    What codec does the HQV disc use? MPEG-2? VC-1? H.264? Because most VC-1 and H.264 content is progressive scan anyway, and NVIDIA does not claim to support PureVideo with anything but MPEG-2.
    Did you turn on noise reduction in the NVIDIA control panel?
    Why does NVIDIA claim HD spatial-temporal deinterlacing, HD inverse telecine, HD noise reduction, etc. in their documentation but cannot do any of the above in reality? Is this H.264 and not supported?
    Reply
  • hubajube - Thursday, February 08, 2007 - link

    Well, this settles whether or not I build an HTPC for HD movie playback. This, combined with needing a fast CPU (read: expensive) as well as an HDCP-capable video card, pretty much kills an HTPC in the short term. I'll just get a standalone player for now. Reply
  • cjb110 - Thursday, February 08, 2007 - link

    Could you get more HD-DVD players and push them through this test?!?!

    Also include the DVD results too, as it's no good if a player can only do one format correctly.

    TBH I think it is pretty atrocious that only recently, with the Denon 5910 and the Oppo players, have we had a DVD player that actually plays DVDs 'properly'.
    Reply
  • ShizNet - Thursday, February 08, 2007 - link

    i agree with the last dude - if we are talking about PC hard/software mixed with consumer electronics [40"+ LCD I guess], why not add this guy (http://usa.denon.com/ProductDetails/623.asp) or similar to the mix? and see: should people put more money into VidCard/CPU [for best 1080p] or save for a receiver/DVD player in their HTPC?

    otherwise - great that you guys are getting down and dirty to address some issues and breaking the ice for the rest of us, before we spend all that $$$ and get middle-of-the-road performance
    Reply
  • Visual - Thursday, February 08, 2007 - link

    i don't even understand exactly what you guys just tested... was this just some test disc played with a software player? why didn't you start the article with more information about the test?
    what was the system's configuration?
    what codec is used for the content, and does it have the proper flags and information needed for correct deinterlacing?
    which player app and decoders did you use, etc?

    if there were flaws in the playback, isn't it the software's fault, not the hardware's? if there were differences on ati/nvidia hardware, isn't it because the software used their built-in capabilities improperly and in different ways? surely there can be player software that handles deinterlacing perfectly without even using any hardware acceleration...

    with a digital source like a hddvd/bluray disc, i don't think these kinds of tests can even apply. noise reduction, wtf? we're talking about digital storage, not audio tapes after all. noise can't just appear with age. if there is "noise" on the source, it was probably put there on purpose; not real "noise" but something that was meant to be there. why should the playback system remove it?
    resolution loss and jaggies are deinterlacing issues, and that just pisses me off. why oh why should anyone be bothered with deinterlacing in this day and age?
    you say "Interlaced media is available on both HD DVD and Blu-ray" but from what i've heard, the majority (if not all) of hd-dvd and blu-ray content is currently stored as 1080p on the discs. who would be so dumb as to produce interlaced hd content, and why?
    Reply
  • DerekWilson - Thursday, February 08, 2007 - link

    I've updated page 3 of the article with information on the HD DVD player used and the drivers used for AMD and NVIDIA cards.

    The software player enables hardware acceleration, which allows AMD and NVIDIA hardware to handle much of the decode and deinterlacing of the HD content. This is a test of the hardware and drivers provided by AMD and NVIDIA.

    Codec doesn't matter and proper flags don't matter -- a good deinterlacing algorithm should detect the type of content being played. In fact, AMD and NVIDIA both do this for standard definition content.

    It might be possible for software HD DVD and Blu-ray players to handle proper deinterlacing, but most software DVD players don't even do it as effectively as possible. There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test.
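    For readers wondering what "adaptive" means here, a toy per-pixel version — the parameter names and threshold are invented for illustration; real implementations are far more elaborate:

```python
def deinterlace_pixel(prev_same_parity, above, below, other_field, thresh=10):
    """Weave where the picture is static (keeps full vertical detail), bob
    where it moved (interpolate within the current field to avoid combing)."""
    moved = abs(other_field - prev_same_parity) > thresh
    if moved:
        return (above + below) / 2      # bob: spatial interpolation
    return other_field                  # weave: reuse the other field's pixel

deinterlace_pixel(50, 40, 80, 52)    # static pixel -> weave -> 52
deinterlace_pixel(50, 40, 80, 200)   # moving pixel -> bob -> 60.0
```

    Non-adaptive players apply one strategy everywhere: always-weave combs on motion, always-bob halves vertical resolution, which is exactly what the HQV resolution-loss tests punish.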

    I do apologize if I didn't explain noise well enough.

    The problem comes in the transfer of a movie from film to digital media. CCDs used to pick up light shining through film will absolutely introduce noise, especially in large blocks of similar color like sky. Even digital HD cameras don't have an infinite color space and will have problems with noise in similar situations due to small fluctuations in the exact digital color at each pixel for each frame.

    This type of noise can be reduced by post processing, but studios usually do not do this. All you need to do is watch X-Men 3 on Blu-ray to see that noise is a huge problem.

    In addition, encoding and compression introduce noise. This noise can't be removed except in the decode process.

    Noise is a major issue in HD content: it looks horrible at high resolution, and while much of it could be fixed with post processing, it usually isn't.

    As for interlacing, most movies will definitely be progressive. But there are some that are 1080i and will need good deinterlacing support.

    The big issue, as has been pointed out elsewhere in the comments, is TV. 1080i is the standard here.

    In fact, when stations start distributing series on HD DVD and Blu-ray, it is very likely we will see them in interlaced format. Most of my DVD collection consists of TV series, so I consider deinterlacing an important step in HD video playback.

    As much as I dislike interlaced content in general, it is unfortunately here to stay.
    Reply
  • RamarC - Friday, February 09, 2007 - link

    The fact that a TV program is broadcast in 1080i in no way means that's the format it was captured/mastered in. "24p" is the current standard for mastering most network programming, and it can result in 720p, 1080i, or 1080p content.

    http://www.digital-digest.com/highdefdvd/faq.html#...
    An interview with Microsoft in Audioholics magazine in January 2006 indicated that HD DVD movies will be stored in 1080p format like BD, even if initial players can only output 1080i.

    Interlaced HD DVD/Blu-ray content will be a rarity, and the performance of playback software with that content is a trivial issue.
    Reply
  • ianken - Friday, February 09, 2007 - link

    " There are no HD DVD or Blu-ray players that we know of that support the type of adaptive deinterlacing necessary to pass the HD HQV test. "

    Because they don't need it, as the content is 1080p.

    Silicon Optix is in the business to sell video processing chips. Their benchmark is designed to get people to look for players with their hardware.

    For properly authored discs, NR and adaptive deinterlace are wasted.

    The thing I like about the HQV discs is that sites like this use them; that motivates ATI and NVIDIA to pass them, and that gets folks a better 1080i broadcast experience. It's in the realm of poorly encoded broadcast HDTV that this stuff is important.

    IMHO.
    Reply
  • autoboy - Thursday, February 08, 2007 - link

    Sorry about being a huge pain in the ass. I really do like reading your articles about video processing, and they are always quite good. For me, though, there is always something that seems to be missing.

    I just found this quote from the head of the multimedia division at NVIDIA:

    FiringSquad: PureVideo seems to do more than regular bob deinterlacing when tested with the HQV Benchmark DVD. Can you give us any more details on what's being done?

    Scott Vouri: Yes, we do much more than regular ‘bob’ deinterlacing, but unfortunately we can’t disclose the algorithms behind our de-interlacing technology. I do want to point out that HQV doesn’t even test one of the best things about our spatial-temporal de-interlacing – the fact that we do it on 1080i HD content, which is quite computationally intensive.

    So it appears that they at least do adaptive deinterlacing, which means they do what they say, which means they should do inverse telecine and 3:2 pulldown correctly as well. I just can't help but think there is something missing from your setup. They should score better than a 0. Is the HQV benchmark copy protected? Can it be played on regular MPEG-2 decoders? Is the PowerDVD hardware acceleration broken?
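    As a sanity check on what inverse telecine has to accomplish, a toy round trip — frame labels standing in for fields, not any vendor's actual code:

```python
def telecine_32(frames):
    """Toy 3:2 pulldown: alternate 3 fields then 2 per film frame (24 -> 60)."""
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * (3 if i % 2 == 0 else 2))
    return fields

def inverse_telecine(fields):
    """Collapse runs of repeated fields back into the original film frames."""
    frames = []
    for f in fields:
        if not frames or frames[-1] != f:
            frames.append(f)
    return frames

inverse_telecine(telecine_32(["A", "B", "C", "D"]))   # ['A', 'B', 'C', 'D']
```

    If a player really does cadence detection, it is effectively doing this reconstruction on the fly; a zero on the HQV cadence tests suggests that path never engaged.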
    Reply
  • autoboy - Thursday, February 08, 2007 - link

    So the codec doesn't matter for deinterlacing? The decoder decodes the video into a sort of raw format and then the video card takes over the deinterlacing? Hmm. I didn't know that. I was under the impression that the codec was the most important part of the equation. Why is interlaced video such a mystery to most of us? I have been trying to fully understand it for 6 months and I find out that I still don't know anything. I just want proper deinterlacing. Is that too much to ask?

    Is it really that hard to get good video playback on a PC for interlaced material? Come on...
    Reply
