Talk about HDMI

So, more specifically, what is HDMI? HDMI - High Definition Multimedia Interface - is essentially a logical progression of DVI. The video portion of the HDMI signal is pin-for-pin compatible with DVI, just in a much smaller package. HDMI improves on DVI by carrying digital audio over the same interface, adding support for HDCP, and giving manufacturers better DDC options.

HDMI provides 5Gbps over copper interconnects up to 15 feet - enough headroom for a 1080p signal and 8 channel audio. For those who like to do the math, a raw 1080p video signal plus eight 192kHz audio channels requires less than 4Gbps, so the HDMI specification has a significant amount of headroom built in. We've seen demonstrations at shows like CES of a DVD player, receiver, and PVR each hooked up with a single cable, and word is that adoption of HDMI is going even faster than originally planned.
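For those who want to check that claim, here is the arithmetic as a short Python sketch. The refresh rate, color depth, and audio sample width are our own assumptions (60 frames per second, 24 bits per pixel, 24-bit samples), since the article doesn't pin them down; the last line shows where the roughly 5Gbps raw link figure comes from (a 165MHz pixel clock, three TMDS channels, 10-bit characters).

```python
# Back-of-the-envelope HDMI bandwidth math (assumed parameters, see above)
WIDTH, HEIGHT = 1920, 1080      # 1080p active pixels
FPS = 60                        # assumed refresh rate
BITS_PER_PIXEL = 24             # 8 bits each for R, G, B

video_bps = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL          # ~2.99 Gbps

CHANNELS = 8                    # 8-channel audio
SAMPLE_RATE = 192_000           # 192kHz
BITS_PER_SAMPLE = 24            # assumed sample width

audio_bps = CHANNELS * SAMPLE_RATE * BITS_PER_SAMPLE       # ~37 Mbps

print(f"video + audio: {(video_bps + audio_bps) / 1e9:.2f} Gbps")  # ~3.02, under 4

# The raw TMDS link: 165MHz pixel clock x 3 channels x 10-bit characters
print(f"raw link rate: {165e6 * 3 * 10 / 1e9:.2f} Gbps")           # 4.95, the '5Gbps'
```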

Below, you can see a cross-section of the 19-pin HDMI cable. The smaller, sturdier connector was designed with laptops and slimmer devices in mind; the DVI cable on the right shows just how large the difference in size is.


[Image: cross-section of the 19-pin HDMI connector alongside a DVI connector]

Right now, HDMI cables, like the original DVI cables, are very expensive. High-quality cables easily retail for more than $100 each, although middle-of-the-pack HDMI cables in the one- and two-meter range can be had for less than $20.

Remember the interoperability and quality issues with older DVI connectors on video cards? Since DVI is a relatively loose protocol, manufacturers were never strictly required to adhere to its design guidelines. Signal quality on DVI connectors hit a low point in 2001/2002, but fortunately, growing awareness of the problem has started to rectify these issues. Since Silicon Image had a significant influence on the original DVI and HDMI specifications, they have taken it upon themselves to set up their own quality control laboratory, PanelLink Cinema (PLC). New devices will go through a very stringent verification process to assure that the next-generation interface doesn't suffer from the same problems which plagued DVI. The lab also works directly with Intel's HDCP spinoff licensor, Digital Content Protection, to assure that HDMI-ready devices adhere to the HDCP guidelines. Copy protection is a large facet of the HDMI specification, so it only makes sense that Intel and Silicon Image have so much invested in building trust with the content providers.
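As an aside on how that copy protection actually pairs two devices: in HDCP 1.x, each device carries 40 secret 56-bit keys plus a public 40-bit key selection vector (KSV) with exactly 20 bits set, and both ends arrive at the same shared secret by summing their own keys at the positions where the other side's KSV has a 1. The Python toy below is our own illustration of that symmetry, using a made-up master matrix in place of the licensor's secret one; it is not the real HDCP cipher or key schedule.

```python
import random

MOD = 2 ** 56  # HDCP 1.x device keys are 56-bit values

# Hypothetical stand-in for the licensor's secret symmetric 40x40 master matrix.
random.seed(1)
master = [[0] * 40 for _ in range(40)]
for i in range(40):
    for j in range(i, 40):
        master[i][j] = master[j][i] = random.randrange(MOD)

def make_ksv():
    """A KSV is a 40-bit value with exactly 20 bits set."""
    ones = set(random.sample(range(40), 20))
    return [1 if i in ones else 0 for i in range(40)]

def device_keys(ksv):
    """Each device key i is a KSV-weighted sum of master matrix row i."""
    return [sum(master[i][j] for j in range(40) if ksv[j]) % MOD for i in range(40)]

def shared_key(own_keys, peer_ksv):
    """Add your own secret keys where the peer's public KSV has a 1 bit."""
    return sum(k for k, bit in zip(own_keys, peer_ksv) if bit) % MOD

source_ksv, sink_ksv = make_ksv(), make_ksv()
source_keys, sink_keys = device_keys(source_ksv), device_keys(sink_ksv)

# Symmetry of the master matrix guarantees both sides derive the same key
# without ever sending a secret across the cable.
assert shared_key(source_keys, sink_ksv) == shared_key(sink_keys, source_ksv)
print("source and sink derived the same shared key")
```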

Today, the largest factor that plagues HDMI in the living room is whether or not devices actually take advantage of 8 channel audio. Many of the first generation HDMI-ready devices only utilized two channels, on the assumption that TVs in particular would not need any more than that. As a result, many new devices still ship with separate stereo inputs just as they do with DVI, but obviously, the push will be for new devices to drop these inputs in favor of the digitally-protected, high fidelity capabilities built into the cable specification. Stereo would just be a fallback.
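For what it's worth, the way a source learns whether the sink can take more than stereo is over that same DDC channel: the sink's EDID carries a CEA-861 extension block in which each supported audio format is a 3-byte Short Audio Descriptor. The sketch below decodes one such descriptor; the byte layout (format code in bits 6-3 of the first byte, channel count minus one in bits 2-0, then sample-rate and, for LPCM, sample-size bitmasks) follows our reading of CEA-861, so treat it as illustrative rather than a reference decoder.

```python
# Decode a CEA-861 Short Audio Descriptor (3 bytes) from a sink's EDID.
# Byte layout per our reading of the spec -- illustrative only.

RATES = ["32kHz", "44.1kHz", "48kHz", "88.2kHz", "96kHz", "176.4kHz", "192kHz"]
LPCM_DEPTHS = ["16-bit", "20-bit", "24-bit"]

def decode_sad(b0, b1, b2):
    fmt = (b0 >> 3) & 0x0F           # audio format code; 1 = LPCM
    channels = (b0 & 0x07) + 1       # max channels, stored as count - 1
    rates = [r for i, r in enumerate(RATES) if b1 & (1 << i)]
    info = {"format": "LPCM" if fmt == 1 else f"code {fmt}",
            "channels": channels, "rates": rates}
    if fmt == 1:                     # for LPCM, byte 3 is a bit-depth mask
        info["depths"] = [d for i, d in enumerate(LPCM_DEPTHS) if b2 & (1 << i)]
    return info

# A sink advertising 8-channel LPCM at every rate up to 192kHz, 16/20/24-bit:
print(decode_sad(0x0F, 0x7F, 0x07))

# The stereo-only first-generation TV the article describes:
print(decode_sad((1 << 3) | 1, 0x07, 0x01))   # 2ch LPCM, 32/44.1/48kHz, 16-bit
```

A source that finds only the second kind of descriptor would do exactly what the article describes: fall back to two channels.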

Comments

  • PrinceGaz - Tuesday, January 18, 2005

    Combining audio and video onto one output is a non-issue really; even though it offers no benefit to most PC users, it is at most a slight inconvenience. If HDMI video cards don't have their own audio hardware, then as someone said, they'll just include suitable audio-in headers on the card so you can connect your soundcard directly to it.

    As for the output of the card, the obvious answer is a splitter cable so you can connect it to both a monitor and an amplifier/receiver. Here in Europe, we've been using SCART cables for A/V connections for years, and you just get the right cable to suit whatever inputs the device or devices on the other end require. Problem solved.

    As for HDCP and other forms of DRM, yes, it sucks. And yes, it will arrive and be adopted despite the fact that it probably reduces signal quality (such as the unnecessary D/A then A/D conversion someone mentioned). Many protection methods impact the quality of the material, so that is nothing new. However, just as surely as HDCP will be adopted, when there is sufficient incentive to do so, it will be cracked. It doesn't matter how complex the protection is; nothing is uncrackable, and HDCP is likely to be a prime target. It may take a while, and it might require inside help from sympathetic people who work with HDCP (there are always some who don't agree with what they are working on and will anonymously spill the beans), but sooner or later HDCP will be cracked and you'll be able to do what you want with your movies on HD discs.
  • Alphafox78 - Tuesday, January 18, 2005

    Who cares how many cables are behind your TV? It's not like everyone can see them. 1 cable or 4, who cares. I'm sure these new devices will have some older ports on them for component or S-Video input, or many people won't be able to use them.
  • gonzo2k - Tuesday, January 18, 2005

    I would want HDMI to go from the cable/sat box to CAPTURE HDTV video/sound on an HTPC. I would have no need for unified A/V transfer from HTPC to monitor/TV since the sound would be handled by a separate audio receiver. An A/V receiver to handle switching between multiple HDMI connections would be nice... but that seems a long way off.
  • crazyeddie - Tuesday, January 18, 2005

    Is it really going to be necessary to combine audio and video onto a single card for the sake of HDMI? I would imagine that a simple coaxial digital feature connector from sound card to video card could solve the problem easily enough. The video adapter would simply have to grab the digital out from the sound card and patch it in with the video signal to the HDMI out.

    Of course getting the picture and sound to sync may or may not be a challenge, but that is another issue entirely...
  • KristopherKubicki - Tuesday, January 18, 2005

    endrebjorsvik: thanks for the catch - they should both read gigaBITS.

    Kristopher
  • endrebjorsvik - Tuesday, January 18, 2005

    In the 2nd paragraph on page 1, you have written that the cable can handle 5 gigabits per second (Gbps), while you say it is okay to transfer 1080p video and 8 channel audio, which requires 4 gigabytes per second (GBps).

    4 gigabytes would be approximately 32 gigabits, but the cable only takes 5. How do you figure that out?
  • arfan - Tuesday, January 18, 2005

    Anand, where is your newest article about Nvidia Ultra vs SLI??? Why did it suddenly go missing??? What happened????
  • Fluff - Tuesday, January 18, 2005

    In a perfect world we would all be using HD-SDI and AES/EBU.

    Remember, HDMI can support up to 12-bit.
    DVI is up to 8-bit.

  • bersl2 - Tuesday, January 18, 2005

    #30: One of the reasons why I voice my displeasure about DRM in places like here is because there aren't enough people aware that such a thing is taking place. Letters and emails are great and all, but unless people start *talking about it*, it's not going to get through politicians' thick skulls that we disapprove of their listening to the lobbying of the entertainment industries.

    I stick behind my earlier assertion that to allow DRM is to relinquish control of one's viewpoint on the world. It will lead to the Internet becoming like TV is, controlled by an oligarchy of "content providers," backed by law.

    And just out of spite, I'm putting this post in the public domain. :P
  • quasarsky - Tuesday, January 18, 2005

    http://www.anandtech.com/news/shownews.aspx?i=2290...

    This neat article I saw was about using GPUs to crunch audio. Worth a look as it relates to HDMI :)

    9 - Posted on Sep 6, 2004 at 10:02 PM by Saist:
    When I first read the title here on Anand, I thought the article was going to be about how Creative Labs was spending more time creating and patenting IP instead of developing newer sound systems (e.g. Carmack's Reverse).

    Another example is this story over at Xbit : http://www.xbitlabs.com/news/mmedia/display/200409...

    Some business has figured out how to get GPUs to crunch out high-quality audio. Now, I don't know about you, but right now a Radeon 9600 XT or a GeForce FX 5700 is cheaper than a top-of-the-line Audigy2. To think that I could use that in place of an Audigy2 and get superior sound quality (and yes... I am aware of the physical issue of how to pipe that sound out of a graphics card, but at least the capability is there).

    I agree... Tweaktown really dropped the ball. There's a whole other place they could have gone to, but didn't.
