mini-DVI splitting to DVI plus SVideo

Dear Lazyweb,

Let's say I was thinking about upgrading the second monitor I have attached to my iMac from a CRT (SVGA) to an LCD (DVI). I want that second monitor to run in 1600×1200 or so, and yet be mirrored onto my TV (projector) via an SVideo cable. Right now, I am accomplishing this by having the iMac's mini-DVI going into a mini-DVI-to-SVGA converter, and thence to an AverKey 300 SVGA splitter, which gives me an SVGA pass-through plus a downsampled SVideo output. (As discussed earlier.)

So, there are cables like this which split DVI into DVI-plus-SVGA, but they only work with DVI-I... and as far as I can tell, the only mini-DVI-to-DVI adapters that exist are DVI-D only (suggesting that perhaps Macs don't do analog DVI output at all?)

So, what's the cheapest way to do it?

Current Music: Frostiva -- Pathos ♬

36 Responses:

  1. semiclever says:

    There exist mini DVI to VGA cables, so it's electrically possible. Unfortunately Apple doesn't seem to sell an appropriate cable and I'm not aware of any 3rd party cables.

    • jwz says:

      Are you saying that the existence of mini-DVI-to-VGA implies that the video card in the iMac is, in fact, putting out signal on the analog lines of the mini-DVI connector? (I don't particularly understand how this shit works.)

      • netik says:

        A two port DVI splitter would deal with this nicely.

iMac -> mini-DVI to standard DVI -> Splitter.

        1 leg to the LCD and a 2nd leg to a DVI->VGA adapter cable.

        This is a $200 fix + misc cabling, and probably not the kind of money you want to spend.

I think you're going to run into a problem with display sensing, though. The DVI cables will tell the Mac to adjust its resolution. The projector will ask for one resolution and the LCD will ask for another. This'll cause display issues.

I don't know what your projector's native resolution is, but if you want to do this, the LCD and the projector have to match, or it'll look bad. LCDs look terrible when they're not running at their native resolution.

        There's a way around this:

But it involves cutting pins 6 and 7 (the DDC lines) going to the projector to disable the Mac's display-sense circuit.

        • jwz says:
    I don't know what your projector's native resolution is, but if you want to do this, the LCD and the projector have to match, or it'll look bad. LCDs look terrible when they're not running at their native resolution.

          This is demonstrably not true, since right now I'm going iMac→SVGA→AverKey→SVideo-cable→Tuner→component-cable→Projector (whose native is probably 1024x768) and it looks just fine. (For values of "fine" meaning "as good as SVideo can ever be".)

          • netik says:

My PowerBook G4's LCD looks pretty bad when it's not at its native resolution. I think projectors are much better at dealing with different resolutions because they expect many different input signals.

            I think the solution I gave you will work, though.

            • jwz says:

              A DVI splitter (plus a DVI-to-SVGA cable) will only work if I can get a DVI-I signal from the Mac, not a DVI-D, which, as I said, is what Apple's mini-DVI-to-DVI cable does.

      • baconmonkey says:

        typically, yes, DVI cards output both analog and digital. that's how the dvi->vga works.

        also, doesn't your projector support higher resolutions? if yes, run component or even VGA directly.
        there are many more easy options to convert to component from other high-res systems.

        also, here is that $35 cable for $6

        and mini-DVI to VGA for $21

        100' M/F svga cable - $24

        • jwz says:

          None of those links come anywhere close to solving the problem I am trying to solve.

          Read it again:

          Not just trying to go from computer to TV.

          Trying to go from computer to DVI monitor with DVI monitor mirrored onto probably-different-resolution TV.

          • baconmonkey says:

            here, let me re-phrase this all:

            Does your gigantor-expensive projector have VGA or Component in?

            if yes, try letting it handle the resolution-rescaling and feed it full-res via component or VGA. Most projectors will do that.

            if no, er, well, that sucks.

            • netik says:

              Come to think of it, Does gigantor projector have DVI through or VGA through? Lots of high end projectors do this, and it might solve your problem.

            • jwz says:

              The problem, as stated, involves an SVideo cable. Let's just leave it at that so that I don't have to spend four more paragraphs explaining why.

          • baconmonkey says:

having re-re-re-re-read your post, and untangled everything, I gather what you want is this:

            DVI-out with Digital out to LCD and Analog out split off to something that will probably be converted to svideo.

            in which case, you mean this:

            DVI-I Y cable with DVI-D and VGA outputs

            • jwz says:

              Again, that requires DVI-I. I have not yet seen a way to convert mini-DVI to DVI-I instead of DVI-D.

              • jhnc says:

                DVI-D to VGA converters do exist.
(e.g. Lindy, iTM)

                But it's probably cheaper to use your mini-DVI to VGA adapter and VGA splitter and skip the DVI-D to VGA stage entirely.

      • semiclever says:

        Yup. The analog pins of mini DVI (as well as DVI-I) are just VGA signalling pins stuffed into the DVI connector. These adaptors just change the connector without doing anything smart.

        • cacepi says:

          Have you ever seen Apple's mini DVI to DVI adaptor?

          It looks like this.

          It doesn't have pins for an analog signal. At all.

          • semiclever says:

            Yeah, it's a mini DVI to DVI-D adaptor. I never claimed it did. What I claim is that the mini-DVI port on the back of an iMac has the signal pins for analog. So it should be possible, even though with apple parts it isn't.

            Thanks for finding the pic though. Now I don't have to dig up a mini-DVI to DVI cable when I get to work just to check that apple's marketing wasn't lying.

      • eugene_o says:

I am currently using such an adapter to connect the mini-DVI output on my previous-generation Intel iMac to a TV that has a VGA input - and it works fine.
        What I don't know is whether it outputs both digital and analog signals at once or switches depending on what's connected to it.

  2. ultranurd says:

    This description of the Mini-DVI pinouts makes it sound like the graphics card can only accept a single display identifier at a time, so it gets either the info passed through the DVI-D or SVGA adapter, or it gets the identifier of the Mini-DVI to S-Video/Composite breakout box. I'm not sure why, since DVI-I also only has one set of DDC pins, and the splitter you mentioned can do it... but it also says it depends on the graphics card, which must be the issue with the iMac.

    The BlackBox 2-port DVI splitter is listed for $260 on CDW, and you'd need to add a DVI-to-S-video box (the Apple one is $19) but the manual for the BlackBox is unclear what the box does if you try to have analog output and a digital input. I suspect the box is just duplicating the input signals, as opposed to being able to do downsampling like your current box, so if that's true the combination wouldn't work.
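
    (For the curious: the "display identifier" being fought over here is a 128-byte EDID block that the graphics card reads over the DDC pins. A quick sketch in Python of how the manufacturer ID is decoded from it — the decoding rules are per the VESA EDID spec; the sample bytes are made up, except that "APP" really is Apple's registered PnP code:)

    ```python
    # Minimal EDID sniff: the "display identifier" a card reads over DDC
    # is a 128-byte EDID block. Bytes 8-9 hold the manufacturer ID as
    # three 5-bit letters ('A' == 1) packed big-endian into one 16-bit word.
    def decode_manufacturer(edid: bytes) -> str:
        assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "bad EDID header"
        word = (edid[8] << 8) | edid[9]
        return "".join(chr(((word >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))

    # Hypothetical block: real header, Apple's manufacturer code, padded out.
    sample = b"\x00\xff\xff\xff\xff\xff\xff\x00" + bytes([0x06, 0x10]) + b"\x00" * 118
    print(decode_manufacturer(sample))  # APP
    ```

    When two displays hang off one port, only one such block can win, which is exactly the display-sensing conflict described above.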

  3. dojothemouse says:
    • This is not an answer to your question.
    • iBooks used to do video out through the minijack. You could get a special minijack to S-Video and RCA adapter that had an extra segment or two on the minijack end. This feature was... uh... unheralded. I doubt that it still exists on modern macs because I suspect that was what used to give my iBook the hard freezes when I plugged in headphones.

    Does anyone know what that feature was called or whether it's really gone?

  4. divelog says:

    Is your new lcd panel DVI only? Most have an SVGA input. Can't you just use that? Then you don't have to buy anything.

    • jwz says:

      I haven't bought the new LCD yet, but if I do, I'd sure rather get one that is digital instead of analog.

      • divelog says:

        I'm pretty sure all lcds are digital. Even the ones that only have a vga connector.

        My Samsung SyncMaster 204t's have both a dvi and a vga connector. I've used both; it looks like a 1600x1200@60hz monitor to the os when using the vga. I believe there's just a A->D converter inside the lcd. I could tell no difference in quality between using the vga or DVI connectors, both were detected automatically (in linux!). YMMV.

        • jwz says:

          If there's a VGA cable involved, then it's getting converted from digital to analog and back. And that's not how we roll in the Twenty-First Century.

          • divelog says:

            Yes, but survey says you'll never be able to tell the difference.
            Sorry, I have no easy answers for your all digital desires.

            • semiclever says:

              I gotta second this. It sucks from an engineering aesthetic point of view but has the advantage that it will work with no observable loss of quality up to at least 1600x1200.

              I hate to be the voice of reason but you could always start with this (you've got all the right boxes anyway) and upgrade to digital when you figure it out.

          • fnivramd says:

            If you don't go with this option then you need fancy gizmos (because you need to simultaneously feed DVI-D to a display AND use the same DVI-D feed to generate a lower resolution S Video output), and in my experience if we recommend that you buy lots of fancy gizmos and you have trouble getting even one of them to work, you'll throw your toys out of the pram. So let's take a bit more of a look at the alternative.

DVI-D is, electrically, (RGB) analog signal encodings of three serial bitstreams (six if you have a high resolution display, but let's not worry about that) at up to 165MHz. It doesn't really do error correction, but being notionally digital you can obviously recover from quite a lot of interference. It transmits 8 such serial bits per pixel on each channel (all of which will end up on the screen unless you're too cheap to buy a good one). As a result it will probably work... fine. Or not at all.

            VGA (and thus DVI-A since they're the same except for the connector) is three (RGB) analog signals generated from 8 (or sometimes more) bits per channel at up to 400MHz. Again no error correction, but since you have a digital flat panel it will digitise the input signal to 8 bits (or if you're cheap, 6 bits) effectively recovering from some interference. It will also use a PLL to lock onto the correct signal, just like the DVI-D interface, except that the method is probably patented by some Taiwanese company instead of being specified in a standards document. As a result it will probably work... fine, after you push the "Auto" button on the LCD panel once. Or it might be fuzzy, and you'll need to buy new cables, but you already did that once so probably not.

Arguing that one of these technologies is "digital" while the other is "analog" is a bit... pedantic. One of these days they'll stop sending the same white pixel that hasn't changed to the display, over and over 60 times per second. One of these days one of DVI-D's successors will be so much better than VGA that we'll laugh about it and no-one will use a VGA connector any more. But DVI-D was actually worse than VGA for a lot of applications; it's like Hyperthreading, or the TV series Lost: it should be really good but it's actually just been built up too much and leaves you disappointed. It didn't even feature better hotplug than we already had with VGA thanks to EDID.
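
    (To put a number on that 165MHz figure: the standard VESA timing for 1600x1200@60Hz needs a 162MHz pixel clock once you count the blanking intervals, which just squeaks under the single-link DVI cap. A back-of-envelope check in Python, assuming the standard 2160x1250 total raster:)

    ```python
    # Does 1600x1200@60Hz fit under single-link DVI's 165 MHz pixel clock?
    # Totals include horizontal/vertical blanking (standard VESA DMT timing).
    h_total, v_total, refresh_hz = 2160, 1250, 60
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    print(pixel_clock_mhz)  # 162.0 -- just under the 165 MHz single-link limit
    ```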

  5. telecart says:

    Sorry to bug you on an old post, but did you find anything for this..? Since I'm pretty much in the same situation (only with a MacBook Pro and a DVI rather than miniDVI)..

  6. jcurious says:

Go mini-DVI to DVI, then DVI to HDMI, get an HDMI splitter, then go HDMI back to DVI.

    Some examples:

    DVI to HDMI

    HDMI Splitter

    • jwz says:

      Do those DVI-to-HDMI cables work bidirectionally? Are they electrical, or is there logic in there? Because I think to go that route, I'd need both varieties:

iMac → Mini-DVI-to-DVI-D-female → DVI-D-male-to-HDMI-male → HDMI-female-to-female → HDMI-male-to-2-HDMI-female;
      1→ HDMI-male-to-DVI-D-male → monitor;
      2→ HDMI-male-to-Component → TV.

      Does that make sense?

      Also, does HDMI have an intrinsic resolution? Will the monitor still run at 1600x1200? Or do both outputs of the splitter run in the same (lower) resolution?

    • jwz says:

      Here's why I think that won't work, and neither will the similar attempt using DVI splitters instead of HDMI splitters: there exist cables that convert DVI and HDMI to Component, but, being cables, surely they just break out the analog signals on the DVI/HDMI lines. It seems unlikely that those cables have a tiny D-to-A converter embedded in them.

      The iMac Mini-DVI port is DVI-D only, it does not emit analog signals at all.

      • jcurious says:


        [..]a DVI-D source can drive an HDMI monitor, or vice versa[..]

There is no analog "magic" here... as I said before, this gives you a digital-to-digital feed, end to end... also, HDMI cables are generally cheaper than DVI cables ;)

        • jwz says:

          I don't doubt that a DVI-D source can drive an HDMI monitor, since that HDMI monitor would use the digital lines. That would be great if I was going to (only) an HDMI monitor, but I need component video.

          • jcurious says:

Didn't you mention that it could accept DVI? If so, then you can move back and forth from DVI easily, and HDMI is splittable. Why not just go back to DVI from a split HDMI signal?

            • jwz says:

              No, my tuner (video switcher) doesn't have HDMI or DVI inputs; it has composite, svideo, and component (and toslink). The monitor has SVGA and DVI inputs.