Snow Crash (simulated)

When there's no input plugged into the HDMI port on the TVs that we have in DNA Pizza, they do this:

If you are of a certain age, this probably seems normal and sensible to you. But think about it again. Video static is caused by amplifying random radio white noise and feeding it to the input stage of a scanning electron gun. HDMI is digital and DRM'ed all the way through: there is no analog phase here at all!

This TV is playing a built-in MPEG of static, instead of just displaying solid blue or solid black like they used to do.

I think that's kind of awesome. The map has become the territory.

I also bet it's a short loop, and I'd see the pattern if I stared long enough. Either that, or I'd contract a Videodrome tumor.

Previously, previously.


48 Responses:

  1. Dusk says:

    Not to spoil your fantasy, but are you sure it isn't coming from the (analog) tuner?

    • jwz says:

      It was definitely set to the HDMI input when it was displaying this.

      • Tim says:

        It doesn't look like real analog video noise -- if you freeze frame it, the 'grain' is very coarse.

        Another fun map/territory thing is that as technology marched on and analog tuning functions were replaced by DSP, lots of late-model analog TVs had firmware which would never display pure analog static. If the DSP firmware couldn't lock onto an NTSC/PAL carrier, it would output a bluescreen.

        BTW, if it is artificial static, I wouldn't be surprised if it's the output of a hardware PRNG, converted into B&W blotches.

        Previously:

        http://recroomhq.com/downloads/2010/04/14/tv-static-freebie.html

        • Adolf Osborne says:

          It's worse than that: It's three shades (black, medium grey, light grey). Analog noise, by definition, has every shade of grey: It doesn't know what "3 shades of grey" means.

          There seems to be a lot of strange curving going on with the edges of some of the blotches, which random analog noise doesn't generally do (though it could, if it's truly random). It's difficult to tell from this rendition whether that is a product of YouTube, the camera, a combination of both, or the source, but it appears not to be the source.
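
          For illustration, here is roughly what that theory amounts to in code -- a minimal sketch, assuming a plain xorshift32 PRNG and the three observed gray levels; nothing here is from any actual TV firmware:

              #include <stdint.h>

              /* Hypothetical stand-in for the set's noise source: quantize a
                 PRNG stream to three gray levels at a coarse "grain", and let
                 the upscaler smear the cells into rounded blotches. */
              static uint32_t xorshift32(uint32_t *s)
              {
                  *s ^= *s << 13;
                  *s ^= *s >> 17;
                  *s ^= *s << 5;
                  return *s;
              }

              void fill_static_frame(uint8_t *fb, int w, int h, uint32_t *seed)
              {
                  /* black, medium grey, light grey -- the three shades noted above */
                  static const uint8_t level[3] = { 0x00, 0x80, 0xC0 };

                  for (int y = 0; y < h; y++)
                      for (int x = 0; x < w; x++)
                          fb[y * w + x] = level[xorshift32(seed) % 3];
              }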

          • jwz says:

            The blocks definitely have that rounded-edge look IRL too, which I assumed was the result of a low-rez MPEG being fed through an upscaler.

            • Michael Dwyer says:

              For me, the information overlay is what tips it off. To overlay analog video, you have to first get the frames synchronized... and it is pretty damn hard to sync to random noise. So, if they're successfully overlaying a message box on top, then that static has been somehow Well Formed.

              I'm not sure that means much anymore, since it all turns into a digital signal at the LCD anyway, but there it is. By the way, did you happen to mention what brand TVs these are? At the risk of advertising, I think that's a nice feature, and while I wouldn't buy a TV strictly because of it, I really like it.

              • gryazi says:

                I bet people are already saying this further down the thread, but really, the setup in the TV is no different than a PC with a TV card - the PC's just gotten smaller and cheaper and more efficient. So analog frames show up as the synchronized output (or, well, rendering of a bitstream as N pixels on N lines at N frequency) of the analog decoder.

                This isn't like trying to bodge an overlay into a CRT set where the e-beam is actually syncing via magical direct techniques. It's just a window on top of another 'window.' (And even then, didn't the first OSD TVs manage to pull it off fine with nothing but the cheapest possible genlock arrangement? You'd get a little flicker compared to a framebuffered overlay but still usually see what channel you were on.)

                Meanwhile, if it _is_ the analog tuner, it's taking the visible portion of NTSC's ~525i or whatever, scaling it with some less-than-dumb algorithm (because halfassedly good algorithms are necessary to make 720 content on a 1080 panel not look like crap) *and* trying to apply motion comp (possibly even in the LCD controller chipset rendering the LVDS-type output from the 'tuner', which likely includes a level of Magic to make the LCD panel have a better response time than it really has on top of the Magic already in the tuner trying to compensate for LCD response times).

                All these artifacts mean that real analog static looks weird and artificial on both our DLP set (which can actually match pixels 1:1 with optical scaling, as far as I know) and the LCDs I finally picked up to get with the last decade. Including the missing-gray-levels thing - I'm not sure where that comes from, but probably just the sampling in the analog tuner chipset having a different 'curve' to it when fed noise (since with a real signal it needs to make sure the digital output falls perfectly between the black and white points rather than relying on a brightness knob).

                The DLP set seems to display it at a weird slow framerate, suggesting it just drops frames (and holds the last "good" one) that don't seem to sync based on wherever it thinks the sync signal should be (possibly this also has something to do with the chipsets being designed to handle crap down at the weird DVD "cinema" framerates even if the TV built around them doesn't bother to).

                I Want to Believe (even though I'd personally find this really annoying), but given the crap quality of the software in most sets, I think you'd have to populate every analog input with a signal (tuner on Ch3 with a C64 joystick in; something else on AV and S-Vid) to prove it's not just _saying_ it's on HDMI but not actually _switching_ to HDMI until something else raises a "there's a signal on HDMI" pin. [In fact, I _think_ my cheap Viore LCDs are like this - software will understand you want to give HDMI 'priority' by switching to it, but won't actually switch what it's showing until a signal is connected and it knows what resolution and rate it's trying to render.]

  2. Laura Rubin says:

    A whole new class of skeuomorphs emerges! In just a bit, we'll see BSODs with a friendly "reboot" button embedded in them or something.

  3. Ian Young says:

    Without this, Neuromancer becomes fantasy instead of sf:

    "The sky above the port was the color of television, tuned to a dead channel"

    "what, black?"

  4. Joe Crawford says:

    The sky above the port was the color of a digital display, outfitted to mimic television tuned to a dead channel.

  5. This TV is playing a built-in MPEG of static

    It is suddenly my life's ambition to get a job programming firmware for HDTV tuners, so that I can insert an easter egg of the Max Headroom broadcast to play at random (and very long) intervals.

    • jwz says:

      I would totally donate to this Kickstarter campaign.

    • Jered says:

      It would be better to get a job writing HDTV firmware just so they weren't SO FUCKING BROKEN.

      If you want to cry, go read this recent summary of how HDTVs fail at resolution management.

      • Sweet fancy Moses. That explains so much about what happens when I plug my Mac mini into my projector.

        Pass the gin.

      • Owen says:

        At the bottom he says "some," but basically all new TVs have an option not to do overscan. The 720p thing is true, but I doubt 720p TVs will be in production for much longer. There's just no difference in price between that and 1080p.

        tl;dr: this isn't really a big deal and will go away in time.

        • Jered says:

          Most do allow you to turn off overscan, though good luck finding the option -- the comments on the original post give an example of what it's called on a new Samsung TV: "Only scan" (of course that's how to translate "Don't overscan" from Korean)!

          As with the 8086 emulation your x86 processor still does on boot, I think history shows that idiotic decisions of the past will not go away in time.

          • zzz says:

            My .fi LG calls it "Vain haku" which means approximately "search only". Makes perfect sense now.

      • gryazi says:

        That article explains the "problem" well but (GOOD NEWS!*) is probably completely backwards about the most commonly encountered variant of it.

        The official AMD/ATI drivers on at least Linux and Windows (and probably Mac) make an insane assumption that anything connected to the HDMI output will overscan and Something Must Be Done. So they turn on soft-scaling by default. The result? You get a fuzzy fucked-up shrunken scaled display and start cursing at the HDMI display.

        Then you find the switch hidden in the "Catalyst Control Center" to turn that shit off [fellow Ubuntards - it's hidden in /usr/lib/fglrx/bin/amdcccle if they didn't finally start putting it back in the menu again, and you may have to sudo if the settings won't take] - go to Display Manager, select the display, switch to 'Adjustments' tab, pick "Use display for scaling" and drag scaling to 0% anyway to be sure, then possibly restart X - and you discover your cheap Chinese set of course doesn't bother doing insane bullshit to what every engineer knows is a sensible 1:1 digital input, and you get a pixel-perfect display.

        I can dream that the engineers put this in to try to push naive users to Displayport and kill off HDCP (if they haven't HDCPified DP by now anyway?), but it really makes no sense at all.

        *The good news is that you do get a 1:1 output with a couple clicks, whereas if any sets in the real world actually overscanned 1080 you'd be hosed.

  6. Adam A. says:

    I saw the fnords.

  7. Amy says:

    No one's started babbling since you discovered this, right?

    I'd still be on the lookout for a really tall guy with Poor Impulse Control tattooed on his forehead.

  8. Chas. Owens says:

    I am now wondering how much effort it would take to build an Arduino (or maybe a Raspberry Pi) device that has a bunch of HDMI/component/RCA inputs and one HDMI output. So long as one of the inputs is active, it is sent to the output; however, when none are active it generates random static and sends that to the HDMI output. This would add static to any HDMI-based TV. It would also fix the problem of telling certain family members how to switch inputs on the TV (turn off whatever device is on the TV and turn on the device you want to see).
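
    For what it's worth, the selection logic itself would be trivial -- a sketch under heavy assumptions, where every function below is a hypothetical placeholder rather than a real capture/output API (implementing them is the actual hard part; see the HDCP replies):

        #include <stdint.h>

        /* Hypothetical placeholders -- not a real API: */
        int  input_has_signal(int i);           /* sense activity on input i    */
        void route_input_to_output(int i);      /* pass input i through to HDMI */
        void emit_static_frame(uint32_t *seed); /* push one frame of PRNG noise */

        #define N_INPUTS 4

        void run_switcher(void)
        {
            uint32_t seed = 0xDEADBEEF;
            for (;;) {
                int active = -1;
                for (int i = 0; i < N_INPUTS; i++)
                    if (input_has_signal(i)) { active = i; break; }

                if (active >= 0)
                    route_input_to_output(active); /* first live input wins  */
                else
                    emit_static_frame(&seed);      /* no input: fake static  */
            }
        }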

    • The bad news is that I'm pretty sure your entirely reasonable desire will be thwarted by the abomination known as HDCP.

      The good news is that Bunnie has you covered.

      • Chas. Owens says:

        If I pass on everything I see on the line, how does encryption cause a problem? I don't need to insert anything into the devices' streams; I just need to generate static when no devices are sending data. In that case, I am the endpoint and control the encryption.

        • The problem is, I'm not sure there's any way to reliably tell the difference between a live and an idle HDMI connection, save for when the other side is physically powered down, without being able to do an HDCP handshake. Obviously this would only be an issue for the HDMI input: you could certainly do voltage sensing on the component/rca/etc lines.

          But this is not my field, so I could certainly be wrong.

    • It will cost you $119 and probably very little code: http://adafruit.com/products/609

  9. Jim says:

    My XM receiver (Acura) plays a small bit of static right when the signal dies or is marginal. Makes me laugh every time.

  10. Incidentally, I just updated my static noise generator for iOS ("Shush" on the App Store) to generate video "noise" as well. Like this HDMI pseudo-static, it's entirely digital. But I didn't implement it as a video loop or with any technique that is liable to produce a perceivable pattern.

    The clever (to me, anyway) trick in my case (inspired by friend Mike Ash) was to write a simple OpenGL pixel shader that generates pseudo-random pixels in the target scene. Since OpenGL ES doesn't provide (that I know of) anything resembling even passable pseudorandom numbers, I used a trick I found online for generating a reasonable amount of perceived randomness from the fractional result of some math, feeding in the scene coordinates combined with a random number that is changed every frame and provided by the client app. The result looks, IMHO, pretty statictastic, is extremely fast because it's rendering on the GPU, and I doubt it has a perceptible pattern.
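
    For the curious, the widely-circulated trick being described boils down to something like this -- transcribed here into plain C rather than a GLSL fragment shader, and using the magic constants that float around online, which may or may not be what Shush actually ships:

        #include <math.h>

        /* The classic "fract of a scaled sine" hash: combine the scene
           coordinates with a per-frame seed, take the sine, blow it up by a
           large constant, and keep only the fractional part. Adjacent pixels
           land far apart on the sine curve, so the output looks random. */
        float shader_noise(float x, float y, float frame_seed)
        {
            float v = sinf(x * 12.9898f + y * 78.233f + frame_seed) * 43758.5453f;
            return v - floorf(v); /* fract(v): pseudo-random value in [0, 1) */
        }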

  11. Lloyd says:

    Do the TVs have any analogue inputs? What make and model?

    An alternative explanation: when set to HDMI but lacking HDMI input, the TVs default to showing analogue input, to avoid support calls from analogue-only buyers who have inadvertently set HDMI on and now have 'broken' TVs. Shorting across any analogue/coax socket or touching an antenna and seeing if the static changes while set to HDMI is a good way to test this.

  • Previously