I, for one, welcome our new algorithmic snitch overlords

Automated DRM keeps spoiling the show, from the DNC to Mars. Robots aren't smart enough to decide whether a video or song is being used lawfully; instead of trying to improve content-monitoring software, we should look to ditch it.

The Hugo Awards debacle wasn't an isolated incident, either. After last night's Democratic National Convention, anyone who sought to watch the video of the evening's presentations on BarackObama.com or YouTube found it flagged by copyright claims shortly after it finished, according to Wired. Ironically, YouTube is the official streaming partner of the Democratic National Convention, yet the site put a copyright blocking message on the video. Anyone trying to access it was presented with a message claiming the stream had been caught infringing on the copyright of one of many possible content companies, including "WMG, SME, Associated Press (AP), UMG, Dow Jones, New York Times Digital, The Harry Fox Agency, Inc. (HFA), Warner Chappell, UMPG Publishing, and EMI Music Publishing." [...]

"Quantization of discretion" is a nice turn of phrase:

It's not just Web robots: The AirPlay feature in Apple's OS X Mountain Lion similarly blocks the streaming of DVDs to TVs on the network, even though it is perfectly legal to play a commercial DVD on your TV when the DVD player happens to reside on your computer.

People talk of "fair use," but what they actually mean is that we all depend on the exercise of judgment in every decision. Near the bull's-eye of copyright, where it was meant to apply -- the origination of works by specialist producers -- most people are clear about what it means. But as legal scholar Lawrence Lessig eloquently explained in his excellent book "Free Culture," in the outer circles we have to make case-by-case judgments about what usage is fair and what usage is abuse. When a technologist embodies their own or their employer's view of what's fair into a technology-enforced restriction, any potential for the exercise of discretion is turned from a scale into a step, and freedom is quantized. That quantization of discretion is always in the interest of the person forcing the issue.

These technology-imposed restrictions aren't just a problem for now. The natural consequence of having the outlook and business model of one person replace the spectrum of discretion is that the scope for new interpretations of fair usage in the future is removed. Future uses of the content are reduced to just the historic uses it had at the time it was locked up in the technology wrapper (if that -- digital books, for example, generally cannot be loaned to friends, and even when they can, it's treated as a limited privilege).

The law may change, the outlook of society may mature, but the freedom to use that content according to the new view will never emerge from the quantized state the wrapper imposes. The code becomes the law, as Lessig again explains in his book "Code." Although the concept of "fair use" is potentially flexible and forward-looking, "historic use" is ossifying.

Thus the calls for better robots that understand fair use are misguided and pointless, a plot device that would fail the sniff test at the Hugo Awards, or in the wake of the Mars Rover landing or the DNC. Any technology that applies restrictions to text, music, video, or any other creative medium quantizes discretion and inherently dehumanizes culture. We don't need better robots; we need the reform of copyright so that it only applies to producers and not to consumers.

Another article that says the same thing with more examples.

Previously, previously, previously, previously, previously, previously.

Current Music: has been blocked by a strategic content partner

11 Responses:

  1. I would really appreciate it if the entire American (and, frankly, worldwide) legal system would stop working so hard to make RMS's terrible short fiction seem prophetic.

    • Max says:

      The DNC example is YouTube acting on behalf of its partners. The legal system only enters into it if you figure that it forced Google into building a content id system that overwhelmingly favors a small group.

      Would-be silo builders also seem to be working a lot harder than legal systems, with little to show for it except hassled customers (little to show for it in the sense that there are (cheap!) drm stripping tools for pretty much every drm, meaning it presents little hassle for illicit distributors). Some of the systems (like, say, the Kindle) are probably helping the backers build up their marketplaces, but I would guess that works less well over time, as more people get burned by it.

      The DMCA is of course the legal system, but the safe harbor provisions there are about as liberal a system as you can get in a world that has actual copyrights.

      • gryazi says:

        Entirely fitting.

        (And it's sort of surprising how likable Biden turns out to be as a person, after all that. Thankfully he's not drafting legislation now.)

      • Stephen Harris says:

        It's a balancing act.

        The problem the silo builders like google and Ustream have is the consequence of failing to protect rights. If they choose a weaker interpretation and it turns out to be insufficient then they've spent a lot of money building something useless _and_ they become legally liable. (Note that "throwing bodies at the problem" is not an acceptable solution, not even outsourcing to India; it must be a technological solution. So sayeth Corporate Wisdom). So they build something that means they've got a strong defense in case something gets through that shouldn't (it will; everything fails). So what we end up with is something too restrictive, and something that may cost them as a business (how much did the DNC pay to have Youtube as a partner? How many companies will think twice in future; will Ustream survive?).

        This is very similar to regulated environments (banking, insurance, Sarbanes Oxley etc). Each corporation will interpret the regulations their own way; after a couple of failed audits they'll go too strict and have internal productivity issues because controls are put in place that'll meet (overly) strict interpretations of the regulations to the detriment of letting people get the job done. Result is increased costs, which get passed onto consumers.

        On the one hand we have regulations that help protect your pension; on the other hand we have regulations that prevent you from copying DVDs. Both regulations have negative impact on all of us because of overly strict interpretation by corporations. We can't win.

        • Max says:

I don't particularly see YouTube as a silo; I was talking more about all the attempts to limit playback of media to certain devices (turning a free live service off doesn't really have the same problems as DRM).

          As I understand it, Google built content id to appease media companies, not to protect the rights of anybody (their small fry users are a lot less likely to sue than big media companies). They are likely to operate it in such a way as to maintain the legal protections they get from the DMCA. So I'm not sure these takedowns have anything to do with concerns over liability.

          • Stephen Harris says:

You're thinking like a small company. Think large. Small-fry lawsuits are a nuisance, but a $10mm fine is only 40 people; if it takes more than 40 people to solve the problem then don't do it, pay the fine. Heck, Chase/Citi/AIG/Google/MS/Apple/et-al have good lawyers; the chances of losing are only 10%, so let's call that 10 people. If they lose the first one, then they'll change the technology.

DRM device restrictions are a different argument, and not related to the Hugo/DNC/DMCA issues. DRM is losing. BluRay is probably the last bastion 'cos even eBooks are going DRM-free (music went that way once Apple switched).

            • Max says:

              Drm is directly related to RMS's terrible short fiction. I spoke of it because it is a second example of companies doing most of the work (rather than the legal system). Incidentally, Bluray drm is more or less irrelevant to motivated illicit distributors, hdcp has been cracked.

You're saying that content id is some sort of scheme to protect Google from actual liability. I'm pretty sure it is a scheme to avoid an antagonistic relationship with big media companies that happen to produce a lot of the content that is worth wrapping in advertising. It is not a balancing act, it is an "appeasement is the more profitable strategy" act (and who gives a shit if some percentage of less profitable users get screwed by it).

  2. James says:

    Bring on the parody and criticism recognition CAPTCHAs! This is how the AI will arise.

    • Stephen Harris says:

      Many people have blogged that AIs will come into existence as part of the war between spamming and anti-spamming technology. Spammers need to avoid technology _and_ fool people into reading their message; anti-spam needs to distinguish between real communication and stuff designed to look like real communication. We're gonna end up building a model of human cognition on both sides to fight the spam war...

      • James says:

The practical issue here is that there is no way to present a CAPTCHA based on recognition of parody or criticism to a user without presenting an unbounded amount of background information. Any query to the user with a fixed amount of data isn't going to appear as genuine criticism or parody. If you don't believe me, just try to think of examples. Can you think of anything that doesn't appear inauthentic with a fixed amount of text?

  3. Erbo says:

    I would have to agree with my friend Jeff's assessment: "The very idea of copyright, on which artists in many areas depend, is being weakened in the public mind by crap like this. If something eventually kills copyright, it won’t be the pirates."