
- A 1x2 HDMI splitter;
- An HDMI LCD monitor; and
- Every HDMI input (but not output) on my receiver.
That sounds unlikely, doesn't it? It does. It really does. But testing against other devices seems to prove this to be true. It happened when I had shut down my iMac to move some cables around, then turned it back on.
I'm guessing that the Mac's already-known-to-be-haunted video card decided to say HEY GUYS HOW ARE YOU, NICE DAY FOR SOME VOLTAGE ISN'T IT?
I already bought a new Mac. Now I get to buy a new everything else. Hooray!
Wasn't there just some new HDMI standard released that requires you to upgrade every component and cable anyway?
Is it possible that you have a ground loop in this mess somewhere? I ask because I don't think you've mentioned it before, but that might explain a few things.
Pretty sure not. And all this crap worked fine for years, until a couple months ago.
Assuming that the basic topology has not changed in eons: Have you ever, in analog times, had any issues with audio or video hum?
I assume the original topology was Mac to splitter to receiver & LCD monitor? If the splitter is an active, powered device (and I'm not sure how a passive HDMI splitter would work), I'd suspect that it was the culprit rather than the Mac -- as in, it died and did nasty things to stuff downstream.
It's unfortunate that HDMI is not an electrically isolated design like ethernet. Not every HDMI device is designed with completely robust protection circuits on its inputs, which means that very bad things can happen on plug/unplug even if both ends started out healthy.
Also, I'd be suspicious of no-name splitters or converters. These are often active, powered devices and the power regulators are built with the same Chinese attention to detail that results in melamine in your kid's milk.
http://www.righto.com/2014/05/a-look-inside-ipad-chargers-pricey.html
To be fair, the decent ones are probably made in China as well as the budget crap. Also, even Apple's official chargers aren't all that.
Yes, everything is made in China. I wasn't painting the whole country with the same brush. But there is "made in a Chinese factory to our specifications and validated with our tests" and "made in China in someone's backyard and shipped in a plain brown envelope".
The official chargers are indeed "all that" in that 1. you get a replacement if they find a defect and 2. they fix a problem a dozen different ways for future devices. Note the response to the 2008 charger recall (prongs embedded in plastic casing that can't be pulled out with pliers):
http://www.righto.com/2012/05/apple-iphone-charger-teardown-quality.html
(midway down page, heading "Apple's 2008 charger safety recall")
Not an Apple apologist, but the quality in these components is not merely cosmetic.
That was exactly my thought as soon as I saw the list of things that died. Luckily for Jamie, this is an academic thought exercise, as he has to replace everything anyway.
That Shatner really ties the post together.
I have an active DVI splitter sitting right here if you would like it. Free. I even have some DVI-to-HDMI cables and converters. It doesn't do HDCP, is the only thing.
http://jeffreyatw.com/static/photos/splitter/
That'll learn you to blow ur bucks on a fuckin' Mac.
j/k. had a rough day at work.
o_O bricked simultaneously?
HDCP does have a key revocation system. It has been broken by academics and researchers, but in most cases I imagine it would still function as designed. Perhaps something triggered it? Maybe someone is testing your mettle as a hacker and wanting you to have to unleash your wrath on shitty copy protection schemes? I have no idea, that sounds like ass; but then HDCP (and DPCP [seriously, WHO COMES UP WITH THESE ACRONYMS?! Worst associations ever]) are vile.
While Europe had SCART, and Japan had its own 21-pin analogue RGB interface (and later, the obscure only-in-Japan D-Terminal connector), by the time the U.S. got anything better than Never Twice Same Colour, we got the ATSC "standard" with no fewer than 18 different resolution/framerate specifications, and digital RGB & audio with terrible copy protection.
To wit, I procured the first sub-$3000 LCD with an HDMI port that I could find many years ago, along with an upscaling DVD player with HDMI out to pair with it, and found that about 50% of the time it would not negotiate the HDCP link correctly. I vaguely cherish the Neo Geo X for being a hackable Linux SoC with HDMI that does not have HDCP, because trying to get a capture device (such as those vended by Blackmagic) to sync to an HDMI signal with HDCP is basically impossible, no matter what caliber of shitty DVI/TOSLINK timing-attack injectors can be procured.
Sigh. At least in the professional realm, they use sane things like HD-SDI! Consumer-grade video devices are anathema.
But it is the simultaneous thing that raised an eyebrow; I have seen the occasional suspicious claim on fora about devices perhaps having had a revocation issued against them, though how to determine such things, let alone debug or circumvent them, is beyond the level of most I have encountered (and some of the published researchers on that subject are ex-coworkers of mine).
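For anyone wondering what "revocation" means mechanically: HDCP 1.x sources can carry a System Renewability Message (SRM) listing revoked 40-bit Key Selection Vectors (KSVs), and a source is supposed to refuse to authenticate a sink whose KSV appears on that list. A minimal, purely illustrative sketch of that check (the KSV value below is made up, not a real device key):

```python
# Illustrative sketch of the HDCP 1.x revocation idea: a source compares the
# sink's 40-bit Key Selection Vector (KSV) against the revocation list carried
# in a System Renewability Message (SRM). All values below are made up.

def is_valid_ksv(ksv: int) -> bool:
    """A legal HDCP 1.x KSV has exactly 20 of its 40 bits set."""
    return ksv.bit_length() <= 40 and bin(ksv).count("1") == 20

def sink_is_revoked(sink_ksv: int, srm_revoked_ksvs: set[int]) -> bool:
    """Refuse to authenticate if the sink's KSV appears in the SRM list."""
    if not is_valid_ksv(sink_ksv):
        return True  # malformed KSV: treat as untrusted
    return sink_ksv in srm_revoked_ksvs

# Hypothetical example: one revoked device in the SRM list.
revoked = {0x0000FFFFF00000}  # placeholder value, not a real KSV
print(sink_is_revoked(0x0000FFFFF00000, revoked))  # True -> no HDCP link
```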
DVI was far better.
ATSC is a red herring here. ATSC is a broadcast television standard, which ends up meaning you take a standard AV stream in a format that can be cheaply handled in hardware, add some metadata like "The Simpsons - Bart gets yet another elephant", multiplex a load of those together into a few megabits of data, smear on a thick layer of error detection and correction, and shove it over some existing broadcast transmission system, maybe radio like it's the 1940s, or maybe satellite or cable. ATSC isn't notably worse or better than DVB, the standard used in most of the countries that don't do ATSC; it's just more American. I guess some of the defaults are accordingly more American: 60 fields per second (ish), some weird resolution choices. But mostly nobody cares. And importantly, none of the display technologies or cables that run between devices give a hoot; they just move pixels.
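For concreteness, the "multiplex a load of those together" step happens in 188-byte MPEG-2 transport stream packets (both ATSC and DVB use them), each tagged with a 13-bit PID saying which programme the payload belongs to. A minimal parsing sketch, with made-up sample bytes:

```python
# Minimal illustration of the MPEG-2 transport stream packets that ATSC (and
# DVB) multiplex together: 188-byte packets, each tagged with a 13-bit PID
# identifying the stream the payload belongs to. Sample bytes are made up.

def parse_ts_header(packet: bytes) -> dict:
    if len(packet) != 188 or packet[0] != 0x47:  # 0x47 is the sync byte
        raise ValueError("not a TS packet")
    b1, b2, b3 = packet[1], packet[2], packet[3]
    return {
        "transport_error": bool(b1 & 0x80),   # set when error correction gave up
        "payload_unit_start": bool(b1 & 0x40),
        "pid": ((b1 & 0x1F) << 8) | b2,       # 13-bit stream identifier
        "scrambling": (b3 >> 6) & 0x03,
        "continuity_counter": b3 & 0x0F,
    }

# Hypothetical packet: 4-byte header plus 184 bytes of payload.
pkt = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
print(parse_ts_header(pkt))  # pid = 0x100, payload_unit_start = True
```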
Yes, it was just contextual (as were the mentions of SCART, D-Terminal, and other common country-sanctioned RGB interfaces that have neither copy protection mechanisms designed into them nor a licensing process to bless "approved" equipment).
I could go deeper into the conniving BS that is the FCC-oriented "standards" sanctioning, but you really don't need to go back much further than Edwin Armstrong's harrowing history to realize that not only is this country geared more towards big business and bribery than towards actual innovators or offering consumers the best technology available, it has only got worse since then.
How that is interwoven with digital RGB is most definitely related to consumer-driven product culture, and the presence of HDCP in HDMI is a damning example (after all, we had open standards for RGB video on computer products ranging from the PC to the arcade JAMMA harness to the Amiga for decades before ATSC was ever ratified).
Thankfully, people increasingly seem to just be viewing things on computer monitors anyway; but in content creation, professionals paying for premium HD-SDI equipment do so to avoid the entire headache that is HDMI and HDCP to begin with. Consumer-focused technologies in North America tend to put big business budgets over consumer quality, every time.
The middle ground (such as folks shooting on high-grade consumer cameras like the Canon 5D Mark II and III) is a very strange land, but there are more disruptive players shaking up the land of 4K, such as Blackmagic, undercutting the price points that RED was already working to disrupt. If you rewind to earlier HD capture, such as what Sony and Lucas partnered on for some of the more recent Star Wars episodes, not only were the costs astronomical, but the resolutions and quality were inferior to what can be procured today. That is just more of Engelbart's scaling observation/Moore's law in action in another field.
But when it comes to inter-coupling devices, Apple stays a bit trailing edge while still pushing their own agenda; Thunderbolt is a fascinating bus from the lineage of FireWire and SCSI and SASI, but it itself uses DPCP for video signals, and there are no Thunderbolt switchers (or even switches, which is a shame, because they would make for a really inexpensive L2 10GbE replacement if there were). Apple is pretty firmly entrenched with content producers though, so if you want things without arbitrary proprietary licensed protections for interconnects, look elsewhere.
"they would make for a really inexpensive L2 10GbE replacement if there were"
You may need to get treatment for your concussion. Thunderbolt is kinda cool, but "inexpensive" it is not. Apple wants $300 for a ten metre cable. That's "audiophile nutter" pricing‡, not a competitive price for L2 networking. And to reach ten metres (which realistically means, any time the two pieces of equipment aren't on the same desk) with Thunderbolt you're already onto glass fibre, which means every few months some clumsy idiot will trash it and you'll have to buy a new one!
‡ In case you might believe that this is the limit of audiophile nuttiness, do a search for P.W.B Real Foil
The $300 10m cable is indeed optical. It has the equivalent of two XFP 10G transceivers integrated into each head of the cable, a total of four transceivers per cable. I invite you to google the price of XFP modules. This isn't audiophile pricing; it's actually rather cheap.
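As a rough sanity check on that, here is the back-of-envelope arithmetic; the per-module price is an assumed ballpark, not a quote:

```python
# Back-of-envelope check of the "four transceivers per cable" point.
# The per-module price is a rough assumption for a standalone 10G XFP
# optical module, not a quote; swap in whatever price you find.
assumed_xfp_price = 150      # USD, ballpark for one 10G XFP module
transceivers_per_cable = 4   # two integrated into each cable head
discrete_equivalent = assumed_xfp_price * transceivers_per_cable
print(discrete_equivalent)   # ~600 USD in discrete modules vs. a $300 cable
```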
Is it a good idea to integrate everything so you have to replace the whole cable, including transceivers, if some asshole violates the fiber's bend radius? Well, that's another question. But it might help to know that this design seems to be a remnant of Intel's grand plans to bring 10G optical to the mass market. Their part was based on their silicon photonics research, which integrates lasers into logic processes in order to dramatically reduce the power, physical volume, and price of 10G transceivers. They also got Corning to work on a new type of fiber which was supposed to be more physically robust, with a very small minimum bend radius.
These two technologies were supposed to be the only Thunderbolt media, back when Intel was calling it Light Peak. That's the context in which transceivers got integrated into the cable heads. High performance optical fiber connectors are a bit dodgy for consumer electronics (as are eye safety warnings). However, as I understand it, Intel didn't quite make the cost targets on its transceivers, and had to fall back to copper media for the short cables.
Note that even the copper Thunderbolt cables contain silicon at each end, most likely signal-conditioning/equalization chips. 10Gbps is hard, even over relatively short cable runs.