23 Comments

  • bigmac - Monday, November 28, 2005 - link

    Actually, this is a clever product and it will generate a fair number of sales in Matrox's traditional marketplace -- Trading Desks.

    Here's the problem: What do you do if you need to add another monitor to your desk?

    Answer -- Most often, you're already maxed out on monitors and you end up adding another computer (we're talking 8-12 monitors here, guys!). Then you've got keyboard switching (and short term memory) issues. Sure, you could swap out a box for a bigger one that can handle more displays -- but most desks are on two to three year equipment cycles (an awful lot of desks still use W2K for just this reason).

    This new Matrox device provides a simple and elegant solution -- just perform mitosis on one of your existing displays -- double the horizontal resolution -- and move and add windows accordingly.

    You gamers ought to get out in the real world and see how people make money with computers instead of spending it on them!
  • Koing - Monday, November 28, 2005 - link

    A lot of guys here are making decent money already.

    This product is 'purely' for the laptop market imo, as far as the gaming idea goes. Anyone with a graphics card decent enough to run at least 2048x768 WILL already have a dual-head card. The projector scenario would likewise only matter for laptops, as most desktop users would already have a decent gaming card.

    If the user wanted more space this would be a decent enough product for them.

    Koing
  • Fluppeteer - Tuesday, November 29, 2005 - link

    Actually, if anything, the portable use seems least plausible to me
    (although clearly Matrox thinks differently, and presumably they've
    done some market research to prove me wrong). That said, I should
    qualify that; *moving it* seems least plausible. Using it with
    laptops makes sense, although rearranging my desktop so that it
    sometimes has lots of pixels and sometimes doesn't would drive
    me nuts in short order.

    I can see that people wanting a multi-screen setup at work might
    want to be able to extract their laptop from it, but I'm not
    sure what the point of making it portable would be. Although, on
    the other hand, why not? :-) I do think the number of people moving
    between desks with two spare screens hanging around not already
    attached to a computer they want to use (unless these boxes
    proliferate) will be small, unless they're also
    carrying the monitors.

    Treating it as part of the docking station does make sense, though
    - so long as you've only got SXGA screens anyway. The idea clearly
    has enough appeal that the VTBook guys can sell their dual output
    splitter cables.

    For desktop use, I don't really consider it a plain dual-head gaming
    solution. I *do* think it's a viable triple or quad-head gaming solution,
    or for non-gaming multihead markets. It's also a dual head gaming solution
    which would work with SLI, if two LCDs float your boat more than a big CRT.
    And it would work for small-form-factor machines limited to half-height
    cards. Sadly, for all my claims that anything modern should handle it, I
    have to eat humble pie and admit that my EPIA's on-board graphics wouldn't...

    --
    Fluppeteer
  • Fluppeteer - Monday, November 28, 2005 - link

    It does seem to be a cheaper option than Matrox's own QID or the Xentera cards;
    I'm worried, if anything, that Matrox are poaching their only remaining market!

    Only thing is: as someone knowledgeable about trading desks, can you say how widely
    17/19" LCDs are still in use? Are people switching up to 20" and larger screens, for which
    the DH2Go isn't useful? There's still a price premium to that, of course, but
    I see it as a product with a limited shelf life when the market starts seeing
    more pixels as a good thing.

    I know someone running six Dell 2405FPWs (off one PC) for trading purposes, so
    I can independently verify the desire for the desktop space!

    We've all been talking about large numbers of heads. Matrox don't seem to support
    this, at least directly (from what I read in their support forums, although I
    thought I saw something saying this more explicitly, which I now can't find). I
    presume the only problem which would arise would be with their Powerdesk software
    - can anyone confirm that everything works okay with more than one of these attached?

    --
    Fluppeteer
  • roontex - Sunday, November 27, 2005 - link

    This will be a big deal for day traders who need maximum screen space. The price is attractive, especially if you are going for more than 4 displays.
  • tjr508 - Friday, November 25, 2005 - link


    I guess the biggest question here is whether this device can really improve your screen real estate. Since my primary monitor uses somewhere between 300-330 MHz of DAC bandwidth at my resolution and refresh rate, this device would be useless to me. Since we are talking about older machines here, let's say you have 300 MHz of bandwidth as opposed to 400 like today (I think it would be even lower on a lot of systems, laptops for sure). I think it would be quite liberal to suggest an older card could drive two 1024x768 displays @ 85 Hz. That makes desktop usability questionable, since basic cards start at around $30. And if I am not mistaken, you would need two external displays for a laptop instead of the LCD and an external. Hmmm. Then again, external LCDs don't need to be refreshed as often as CRTs do, so you could save some bandwidth there, but that's a stretch.

    Anandtech didn't want to come right out and say it, but I will =)
    This thing is useless.
  • Fluppeteer - Monday, November 28, 2005 - link

    It's not so useful for decent CRTs - if I wanted two 1024x768 screens, I'd have
    thrown them away and got a better one which I could drive at 2048x1536. 60Hz
    1280x1024 would annoy me on any CRT with any modern phosphor (that said, I'm
    running that setup at work out of necessity - and it *does* annoy me). On the other
    hand, 60Hz is just fine for all the 17" and 19" LCDs which are so popular right
    now. Bit of a shame it's only got the analogue connection, but if you're going for
    cheap screens (and haven't just got a 24" instead) then I can see the point in that,
    and it avoids the need for dual-link DVI cards for input, for better or for worse.

    1280x1024 at 60Hz, according to VESA timings, only needs a pixel clock of 108MHz.
    Even doubling that for 2560x1024 is well within the range of pretty much every
    plug-in graphics card available in the last 5-10 years; 1024x768 at 85Hz is only
    a 94.5MHz clock, and I'm sure cards from longer ago could double that. Laptops and
    integrated graphics may be more limited (my older laptops wouldn't cope), but more
    modern laptops - capable of 1600x1200 at least - should be fine with it.
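
    Those numbers are easy to sanity-check. Here's a rough back-of-the-envelope
    sketch (Python; the horizontal/vertical totals, which include blanking, are
    taken from the standard VESA DMT tables, so treat them as illustrative):

```python
# Pixel clock = h_total * v_total * refresh, where the totals include
# the blanking intervals. Totals below are the standard VESA DMT values.
modes = {
    # name: (h_total, v_total, refresh_hz)
    "1280x1024@60": (1688, 1066, 60),
    "1024x768@85":  (1376,  808, 85),
}

for name, (h_total, v_total, hz) in modes.items():
    clock_mhz = h_total * v_total * hz / 1e6
    print(f"{name}: ~{clock_mhz:.1f} MHz")  # ~108.0 and ~94.5 MHz

# Doubling the horizontal timing for the side-by-side 2560x1024 stretch
# roughly doubles the clock - still comfortably inside a 400 MHz RAMDAC.
h_total, v_total, hz = modes["1280x1024@60"]
print(f"2560x1024@60: ~{2 * h_total * v_total * hz / 1e6:.1f} MHz")  # ~215.9
```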

    More specifically, 1600x1200 at 85Hz needs a faster pixel clock than any of the
    stretched modes this device supports. If your spare head can do that, it should
    cope. A lot of PCI cards can do that, and relatively few are dual head. Most
    cards with any form of dual head capability have some bandwidth headroom - certainly
    anything modern does. And you can always go with 16bpp. If this hits the limits of
    even the cheapest current PCI cards (other than quad head oddities), I'll be
    surprised.

    If anything, it's disappointing that this device can't make use of higher pixel
    clocks. Modern cards can often double the 1600x1200 60Hz pixel clock (162MHz for
    VESA, without reduced blanking), and four 20" LCDs on one device has some appeal.

    I'm not sure what your point about needing two screens for a laptop is. That's
    the point of this device, no? Whether this disables the internal screen depends
    on the laptop.

    It may be useless to you, and I can't deny that a DVI version would have more
    appeal (but I'm sure that the majority of customers, without dual-link DVI
    graphics cards, would rather have analogue support than nothing - at least,
    without a reduced refresh version holding a complete frame buffer). It *does*
    have a use, though. Four 17" LCDs, two of these, and a decent graphics card are
    much cheaper than a 30" Apple Cinema Display with lower resolution; four heads
    here are nearly filling my machine (two on an AGP card, one each from two PCIs);
    workstation cards are very expensive to buy lots of (and the PCI ones are slow)
    to get the same head count; it *is* an option for all but the oldest laptops,
    and cheaper (and potentially faster) than a VTBook.

    I think their strong point is the proliferation of 17" and 19" TFTs. From that
    point of view, I'm not sure the 1024x768 modes are useful, but since the most
    common TFT sizes can't use higher resolutions, being able to use more of them
    has some appeal. Sure, dual UXGA would be nice, but there's still a premium for
    that. I doubt CRT owners will be quite so tempted - either the refresh or the
    low resolution will be limiting (although, as I said, I have to admit to having
    a 60Hz SXGA CRT on my desk). I'm not sure about laptops, but I can see its
    desktop advantages, and laptop alternatives are at least less obvious.

    I'm sure it's got a market, although I'm not planning to get one just yet myself.
    I'll be more tempted if they make the DVI one I suggested. :-)

    I'm not affiliated with Matrox, btw, lest I sound too keen!

    --
    Fluppeteer
  • GTMan - Friday, November 25, 2005 - link

    I think you are missing the point if you evaluate a product on whether it is useful for everyone. Matrox specializes in niche products, particularly for business. So if it is useful in that area then it deserves a thumbs up.

    The article ended up giving the product a "must have" and a "nice" combined with "price ... too high", "will have even less need for this box" and "impractical". All the negatives are for people the product is not designed for. Should have left those out. People who want the product don't care about an extra two hundred dollars to drive their expensive dual displays.
  • DerekWilson - Saturday, November 26, 2005 - link

    I think we covered the bases pretty well. We pointed out the fact that this is a niche product and that those who have a need for something like this will be well pleased with the functionality.

    But the need for this product isn't huge. I feel very comfortable saying most of our readers will not need or want something like this. I think it's important to let them know that, while there is a curiosity factor associated with the dh2g and gaming, this isn't a product with universal appeal.

    Most people who need one more display than they have would save much money by dropping in a PCI graphics card on the desktop side. For notebooks, there isn't really another solution for multidisplay; that limits the market even more. The real advantage of the product is that it offers the possibility of fully accelerated overlay and 3D windows across multiple displays.

    But the fact is that people who need something like this won't have another option. And this is a good option for them anyway. For everyone else it isn't really worth it. I don't see how a complete review could leave out either point of view.
  • g33k - Friday, November 25, 2005 - link

    hmm, I tried 2560x1024 with AOE3 and you're right, the separation between the monitors is too annoying to be useful. The game did not appear to suffer performance-wise with a 6800GT, though. Maybe if I got used to playing with two screens, I wouldn't notice it as much. But yeah, one large widescreen monitor would be way better. I almost never have to scroll with such a large display.
  • DerekWilson - Friday, November 25, 2005 - link

    If the compatibility list isn't enough, Matrox has a tool here:

    http://matrox.com/graphics/offhome/dh2go/try.cfm

    Derek Wilson
  • Fluppeteer - Friday, November 25, 2005 - link

    This thing may be aimed at laptops (and they really ought to think about
    a USB power adaptor for that market), but it's not a bad thing for
    desktops, too. For laptops, assuming there's some decent hardware
    acceleration on board, it's a valid alternative to a VTBook
    (http://www.villagetronic.com/e_pr_vtbook.html) or Sitecom's USB2/VGA
    adaptor (or a compactflash VGA card plugged into a pcmcia adaptor...)
    and it's a possible alternative to a Colorgraphic Xentera for getting
    lots of screens on a desktop.

    Matrox, ironically, don't seem to like the idea of plugging two in at
    once - but I suspect that just confuses Powerdesk (Erwos - this is
    Matrox's software for splitting the screen so that you don't have
    problems with the border; only useful for desktop stuff, not games,
    but would solve your concerns about how it's treated by Windows).
    I don't see any reason why there should be a problem twinning them
    otherwise.

    I'm running four monitors here, using two PCI cards plus a dual-head
    AGP card, and being able to use the AGP card for more than two of them
    has some appeal. I'd like the idea of plugging two into an nVidia card
    set to vertical span, and having one continuous 2560x2048 desktop (as
    opposed to several display devices, from Windows' point of view) which
    could be used for gaming - although with the borders in the way. They'd
    have to be LCDs, though, or the refresh flicker would drive me nuts.

    Presumably, in order for this to work, there must be a (double)
    scanline-sized buffer in the device. Just to clarify an issue in the
    article, EDID is transmitted from the display device to the graphics
    card, not the other way around - so the DH2Go box will send an EDID
    to the computer (showing it can do the wide screen modes), but not
    to the monitors. I suspect the output mode options are standard VESA
    timings, which the monitors will either cope with, or not - it'd take
    more intelligence (and a full frame buffer) to handle arbitrary
    monitor timings on the output.
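
    To illustrate the direction of flow: the box reads the monitors' EDIDs and
    offers its own to the host. A hedged sketch of what pulling the preferred
    mode out of an EDID block looks like (Python; the 18-byte detailed timing
    descriptor below is hand-made for a hypothetical 2560x1024 stretched mode,
    not a dump from real hardware):

```python
def preferred_mode(dtd: bytes):
    """Decode pixel clock and active resolution from an EDID 18-byte
    detailed timing descriptor (the first DTD is the preferred mode)."""
    clock_mhz = int.from_bytes(dtd[0:2], "little") / 100  # stored in 10 kHz units
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)  # low 8 bits plus upper 4 bits
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return clock_mhz, h_active, v_active

# Synthetic descriptor: 216 MHz pixel clock, 2560x1024 active area.
dtd = bytes([0x60, 0x54,        # 21600 x 10 kHz = 216 MHz
             0x00, 0x30, 0xA3,  # h_active 2560, h_blank 816
             0x00, 0x2A, 0x40,  # v_active 1024, v_blank 42
             0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
print(preferred_mode(dtd))  # (216.0, 2560, 1024)
```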

    To mirror what others have been saying (and there are rumours that
    Matrox *are* working on a DVI version), what I'd really like to see
    is a box with 256MB of video memory, a dual link DVI input (with the
    latest card generation there are lots of people out there with dual
    link DVI outputs which they can't use) and two dual link DVI outputs.
    The decoder should be simpler if it's just a TMDS receiver (DVI-D).
    If there was enough intelligence to decode the monitor EDIDs and
    present a total resolution (at a range of timings) to the card, the
    device could be a lot more flexible; an on-board frame buffer would
    mean, e.g., dual 1920x1200 at single link would work, at reduced refresh,
    and that uneven resolutions or refresh rates would work.

    It'd mean relying on the highest resolution presented by the monitor's
    EDID as being the native panel resolution (*usually* true, except in
    one of Iiyama's recent devices), and might require extra intelligence
    if analogue outputs were also wanted (probably set a lower refresh rate
    limit, and pick a resolution accordingly...), but it could be much
    more flexible. Stick a "horizontal/vertical" toggle on the back (*not*
    some complex and flakey bit of software to do what a button does better)
    and you could daisy-chain them to get a cheap and very large display with
    lots of monitors (at low refresh) - hence my suggestion of 256MB rather
    than anything much smaller; I'm not sure what the largest pixel count
    in a single display is for various devices, but 8192^2 at 24bpp would
    fit in 256MB. All it'd be doing is streaming pixels into and out of a
    buffer (as fast as either end supports doing it), so the electronics
    wouldn't otherwise be all that complex, even compared with the abilities
    of your average video card's DACs/DVI outputs.
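
    The 256MB suggestion checks out arithmetically (quick sketch):

```python
# An 8192x8192 desktop at 24bpp fits comfortably inside 256 MB.
width = height = 8192
bytes_per_pixel = 3  # 24 bits per pixel
buffer_mb = width * height * bytes_per_pixel / 2**20
print(f"{buffer_mb:.0f} MB")  # 192 MB, with room to spare in 256 MB
```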

    So the part cost would be up a bit over the "2Go" analogue version, but
    I bet it'd sell otherwise. I'd buy one at $200. It might kill the sales
    of Matrox's QID products, so unfortunately I doubt they'll do such a
    flexible device (or if they do, they'll have to charge more than a third
    the price of a QID), but I'm sure there's a market.

    Making it all work with HDCP protection would be a bit more complicated,
    but I'd be prepared to wait for version 2 for that. :-)

    Still, fingers crossed. I'm a bit surprised that Matrox have a patent
    pending on this - screen splitters aren't a particularly new idea,
    even if I've not seen many products just yet. I hope this doesn't stop
    someone else producing a DVI one, if Matrox don't.

    --
    Fluppeteer
  • DerekWilson - Friday, November 25, 2005 - link

    To clarify, when we wrote that the dh2g reports EDID to the "display device", we were talking about the graphics card, not the monitor ...
  • Fluppeteer - Friday, November 25, 2005 - link

    Ah, sorry Derek. I'd misread the paragraph talking about the
    EP1Cs and CH7301C-Ts as saying that the Chrontel chips were
    responsible for the EDID rather than the Cyclone's being
    responsible. My bad. "Display device" is an annoyingly ambiguous
    term...

    Given the high bandwidth of the AD9888 and the fact that many
    modern graphics cards have 400MHz pixel clocks, it's a bit of
    a shame that 1600x1200x2 (at, say, 75Hz with reduced horizontal
    blanking, or standard VESA 60 and 70Hz timings) isn't supported.
    It might not be so useful for some laptops, but it'd improve the
    desktop situation for those of us with CRTs. Ah well, here's
    hoping for the next version...

    Big hand to Anandtech for pulling the device apart in such
    detail, by the way. :-)

    --
    Fluppeteer
  • Fluppeteer - Friday, November 25, 2005 - link

    (Follow-up.)

    I mentioned a horizontal/vertical toggle. Come to think of it, for unequal
    resolutions, it would be nice to have an alignment toggle too (left/top vs
    right/bottom). Just to be complete. I'm presuming the missing areas of the
    display would just be invisible (and the price for not using matched monitors),
    rather than providing some complicated virtual desktop scheme or trying to
    tell Windows about them. You could add "centred" to the alignment options,
    but I doubt that's as common.

    Now we just have to hope they make it. :-)

    --
    Fluppeteer
  • erwos - Friday, November 25, 2005 - link

    A DVI variant of this would be pretty slick.

    I also think that the forced stretching across both screens kills a lot of the device's utility. If they could figure out some sort of driver hack to treat it as two discrete monitors, that would make this infinitely more useful.

    -Erwos
  • Donegrim - Friday, November 25, 2005 - link

    Not for games that don't support dual monitors. If a game thinks it's just one big screen, then it won't have any compatibility issues.
  • DerekWilson - Friday, November 25, 2005 - link

    This is a key point - not many games support multiple displays.

    Even if a game does support multiple display devices, performance usually suffers greatly.

    Since a 2560x1024 display requires about as many pixels as a 1920x1440 display, we can expect similar performance characteristics between the two modes (if the hardware doesn't have a problem with custom resolutions or aspect ratios).
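
    The arithmetic behind that comparison (quick check):

```python
# The stretched mode pushes roughly the same number of pixels per frame
# as 1920x1440, so a similar fill-rate load on the card.
wide = 2560 * 1024  # 2,621,440 pixels
tall = 1920 * 1440  # 2,764,800 pixels
print(wide, tall)
print(f"ratio: {wide / tall:.2f}")  # ratio: 0.95
```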
  • Fluppeteer - Friday, November 25, 2005 - link

    I believe both ATi and (certainly) nVidia have modes which
    present both heads on a single device to Windows as though
    they were one monitor, which is better for gaming (but
    arguably worse for general Windows use) than having Windows
    running in "extend my desktop onto this monitor" mode.

    This obviously has the problems:
    1) The display is likely to be a funny shape which the game
    may not support (unless you've got two portrait monitors), and
    2) Assuming the monitors are matched, there'll be a bezel right
    in the middle, where you want to see most.

    However, combining the DH2Go with this feature gives two
    options which are more appealing:
    1) Use two DH2Go boxes and run four monitors, which is back to
    your original aspect ratio (as I suggested in my longer post), and
    2) Using one DH2Go box to present three monitors as a single
    widescreen display, putting the centre of the action in the middle
    of the centre monitor (like Matrox's triple head mode).

    Your pick of whether 2560x2048 or 3840x1024 appeals more. :-)
    (I run 3840x2400+2048x1536 at home and four horizontal monitors
    at work, but not as one display surface.)

    This little box is growing on me, as evidenced by how much I'm
    posting about it. :-)

    --
    Fluppeteer
  • wien - Friday, November 25, 2005 - link

    So... what if I connect this thing to one of my monitor outputs, and another monitor to my other output? Could I in effect get a triple-head system? That would be most excellent for driving and flight sims, or any other type of game really. Gaming with dual-head just doesn't work unless the game is built with that in mind.
  • ViperV990 - Friday, November 25, 2005 - link

    I'd definitely love to see a 1-to-3 solution for some slick triple-head gaming.
  • Donegrim - Friday, November 25, 2005 - link

    Or connect one to monitor out 1, and 1 more to monitor out 2...and have 4 monitors...mmmm...salivating. Although I suppose triple head would be better for getting round the image split down the middle thing.
    Or 4 projectors
    4 dual link DVI projectors
    mmmmmmmm...mortgage required....
  • Calin - Friday, November 25, 2005 - link

    And some of them don't have two "good" video outputs - and in some cases upgrading the video card is the more expensive solution. This would work well for engineering workstations or similar machines that have single outputs and video cards costing hundreds of dollars.
    Matrox is used to having limited success (unfortunately), so I hope they will win with this.
