Surprisingly for a headline ending with a question mark, the answer is yes!

"Can Apple read your iMessages?"

I spent the past week weighing the evidence and believe it's an overstatement for Apple to say that only the sender and receiver of iMessage and FaceTime conversations can see and read their contents. There are several scenarios in which Apple employees, whether compelled by an NSA order or otherwise, could read customers' iMessage or FaceTime conversations, and I'll get to those in a moment. But first, I want to make it clear that my conclusion is based on so-called black-box testing, which examines the functionality of an application or service with no knowledge of its internal workings. No doubt, Apple engineers have a vastly more complete understanding, but company representatives declined my request for more information. [...]

"In the case of iMessage intercept capabilities, Apple is taking a page from Skype's playbook -- make very carefully worded statements about the existence of encryption, and then let people read far more into their claims than they have actually made," Chris Soghoian, who is principal technologist and senior policy analyst for the American Civil Liberties Union, told Ars. "When reading Apple's carefully worded PRISM denial, remember it was written by a hybrid team of lawyers and PR folks. Every word matters. At best, they are being cagey, at worst, outright deceptive."

Previously, previously.


17 Responses:

  1. ix says:

    So, Apple may be able to retrieve stuff backed up on iCloud, which is a nice end-run around the end-to-end encryption of those messages.

    But (from the article) Apple's statement: "For example, conversations which take place over iMessage and FaceTime are protected by end-to-end encryption so no one but the sender and receiver can see or read them," [..] "Apple cannot decrypt that data. Similarly, we do not store data related to customers’ location, Map searches or Siri requests in any identifiable form."

    That is not lawyerly and cagey; that is pretty much outright saying they cannot get that data. Actually replacing people's public keys and outright monitoring conversations seems pretty incompatible with that statement. I would argue that even retrieving old convos from iCloud is incompatible with it. I wonder whether an Apple lawyer would really have allowed this kind of statement if that were the case.

    • Nick Lamb says:

      "no one but the sender and receiver"

      Sure, the sender is you, Alice, and the receiver is Mallory. You thought you were talking to Bob but Apple forgot to mention that they reserve the right (for "national security" or "quality assurance" reasons) to silently substitute Mallory's credentials for Bob's. The article even shows that this seems to be trivially easy for Apple to do, though it doesn't prove they ever have or will.

      Like "bogus" CA certs but unlike say DNSSEC, only the victims would ever have even a chance to detect this malfeasance and only while it was happening. Before and after it would leave no trace, and on other people's systems it would never show up at all.

      In an open system you could fix this by replacing Apple's certificate system with one you trust, or by ripping this infrastructure out and replacing it with the Socialist Millionaires' Protocol, so that you can ask intuitive "If you're really Bob, what was the name of the porn movie you watched when you were 12?" verification questions during crypto setup. But it's not an open system; ultimately you are obliged to trust that Apple aren't lying to you by omission, or accept that they might be and decide you don't care about it.
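      The attack above can be sketched in a few lines. This is a toy model, not iMessage's actual protocol: the `Directory` class and its API are invented for illustration, and the "encryption" is just a tag. The point it demonstrates is that when clients fetch public keys from a directory they cannot audit, the directory operator chooses who the "ends" of the end-to-end encryption are.

      ```python
      class Directory:
          """Key server trusted (blindly) by every client."""
          def __init__(self):
              self.keys = {}       # user -> real public key
              self.overrides = {}  # user -> substituted key (the MitM hook)

          def register(self, user, pubkey):
              self.keys[user] = pubkey

          def lookup(self, user):
              # Clients cannot distinguish an override from the real key.
              return self.overrides.get(user, self.keys[user])

      def encrypt(pubkey, plaintext):
          # Stand-in for public-key encryption: only the matching key
          # holder can "decrypt" (we just tag the ciphertext).
          return (pubkey, plaintext)

      def decrypt(pubkey, ciphertext):
          key, plaintext = ciphertext
          if key != pubkey:
              raise ValueError("wrong key")
          return plaintext

      d = Directory()
      d.register("alice", "PK_alice")
      d.register("bob", "PK_bob")

      # Normal operation: Alice -> Bob, genuinely end to end.
      msg = encrypt(d.lookup("bob"), "hi bob")
      assert decrypt("PK_bob", msg) == "hi bob"

      # Silent substitution: the directory hands out Mallory's key for Bob.
      d.overrides["bob"] = "PK_mallory"
      msg = encrypt(d.lookup("bob"), "hi bob")
      assert decrypt("PK_mallory", msg) == "hi bob"  # Mallory reads it
      # Alice's client behaved identically in both runs: "only the sender
      # and receiver" still holds -- the directory just picked the receiver.
      ```

      Note that Alice's side of the protocol is byte-for-byte the same in both cases, which is why only out-of-band key verification (fingerprint comparison, SMP-style questions) can detect the substitution.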

      • ix says:

        It's clear that you have to trust them not to silently replace keys. But what I'm arguing is that their statements do not (to me) seem compatible with that kind of action, e.g. "no one but the sender and receiver can see or read them". Where does that leave them wiggle room to screw you over?

        (I'm half convinced they must be screwing people over somehow, which is why the categorical nature of their statements surprised me)

        • Nick Lamb says:

          I thought I addressed that. Only the sender and recipient can see or read, but you were just mistaken in believing that the recipient was your friend Bob, and Bob was mistaken in believing that the sender was you. You were both talking to Mallory who trivially impersonated the other party by forwarding.

          If I were a lawyer arguing for Apple (who for this hypothetical we will imagine have been caught red-handed using MitM, and are defending a lawsuit about the practice), I would say that the word "recipient" just means anybody who received the communication, and that only the phrase "intended recipient" could imply Apple meant to actually guarantee some sort of privacy.

          • nooj says:

            I am sure there's even a way around "intended recipient" with a dedicated enough lawyer and a sympathetic court, both of which are in place here.

            • antabakayt says:

              Quite easily. It would come down to the perspective from which the word "intended" is defined. If sniffing the data was a design feature, then Mallory has always been the "intended recipient".

          • nooj says:

            ix, you can read about MitM attacks here:
            Man in the middle attacks

            This kind of attack is not limited to iMessage. In fact, it is widespread and exceptionally harmful. Fraudulent certificates for login.yahoo.com and google.com were issued two years ago: "On March 15th, 2011, an HTTPS/TLS Certificate Authority (CA) was tricked into issuing fraudulent certificates that posed a dire risk to Internet security. Based on currently available information, the incident got close to — but was not quite — an Internet-wide security meltdown."

      • phuzz says:

        But surely if the government show up with a warrant or whatever sekrit way they have of getting data for the NSA then Apple have no choice but to do what they say?

        • jwz says:

          The point is not whether Apple is legally required to comply.

          A reasonable person would interpret Apple's statement as saying that compliance is impossible, for technical reasons.

          Evidence suggests that this is not true. So their claim is either deceptive or an outright lie.

          If Apple had said, "We archive all your messages, and we turn them over when we get a valid warrant", we would be having an entirely different conversation about this.

          • phuzz says:

            I see where you're coming from, but I guess I'm just a cynic. I expect companies, especially large ones, to lie when it suits them.

    • Zygo says:

      We do not store data related to customers’ location, Map searches or Siri requests in any identifiable form.

      allows such data to be stored in a non-identifiable form. They'd be fools not to, because it's a hugely valuable data source to improve the quality of their products.

      Methods of extracting identifying information about anonymous individuals by analyzing and correlating records of their behavior are improving all the time. Sometimes Apple helps by putting straight-up bugs (in multiple senses of that word) into their products. For practical purposes, we can shorten that disclaimer sentence to just:

      We store data related to customers and that's all we know.
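      Re-identification of "non-identifiable" records is often trivial. A toy example (all data invented): matching an anonymous trace of coarse locations against publicly known home/work areas picks out one person, no names required.

      ```python
      # "Anonymous" behavioral record: user id -> coarse location pairs.
      anonymous_log = {"u123": [("home_area", "94110"), ("work_area", "95014")]}

      # Publicly knowable facts about named individuals.
      public_info = {
          "alice": [("home_area", "94110"), ("work_area", "95014")],
          "bob":   [("home_area", "10001"), ("work_area", "10011")],
      }

      def reidentify(trace, directory):
          # A trace that matches exactly one person de-anonymizes them.
          matches = [name for name, pts in directory.items() if pts == trace]
          return matches[0] if len(matches) == 1 else None

      assert reidentify(anonymous_log["u123"], public_info) == "alice"
      ```

      Real-world versions of this use far richer signals (timing, co-location, search history), which is why the correlation methods keep improving.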

      Secretly trojaning the end-to-end encryption by software update is a capability that has already been deployed in the field by court order. It is now a concrete threat, not an abstract possibility (although, as is often the case in such matters, there is no public information about a Mac port yet ;).

    • hattifattener says:

      "Apple cannot decrypt that data" isn't at all the same thing as "pretty outright saying they cannot get that data". My read of the situation based on that article is that:

      - Apple allegedly cannot decrypt your iMessages in transit; only the endpoint phones can.
      - Those phones keep a log of messages sent and received (like any phone does).
      - Those logs are by default iCloudy.
      - Apple can access your iCloud data (mud puddle test), and therefore, can read your message logs, unless you've taken steps to avoid this.
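      The mud puddle test in that last point can be sketched as a toy model (nothing here reflects Apple's real iCloud key handling; the `CloudBackup` class is invented): if the provider can restore your data after you forget your password, the provider must hold a key to it, and can therefore read it.

      ```python
      import hashlib

      def derive_key(password):
          # Stand-in for a proper password-based KDF.
          return hashlib.sha256(password.encode()).hexdigest()

      class CloudBackup:
          def __init__(self):
              self.blobs = {}  # user -> (escrowed key, data)

          def backup(self, user, password, data):
              # The provider keeps the key next to the blob so that it can
              # honor password resets -- which means it can also read the blob.
              self.blobs[user] = (derive_key(password), data)

          def restore_after_reset(self, user):
              key, data = self.blobs[user]
              return data  # no password needed: the provider has the key

      cloud = CloudBackup()
      cloud.backup("alice", "hunter2", "message log")
      assert cloud.restore_after_reset("alice") == "message log"
      # A design that failed this restore (and so passed the mud puddle
      # test) would store only ciphertext under a device-held key, and a
      # forgotten password would mean the backup is gone for good.
      ```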

      • nooj says:

        or the govt can force apple to download a message log retrieval tool to your phone.

  2. gryazi says:

    Herp derp?

    Has anyone determined that private keys are both unpredictable and not being escrowed at time of manufacture or otherwise, for the 'fishing expedition' use-case?

    And I don't think anyone's said anything about them being irretrievable in the "person of interest identified" use-case (at which point they can potentially convince the carrier to do something and/or just use some sort of 'whitehat' commercialized-to-law-enforcement 0-day to sniff from there forwards, too).

    Of course the likely outcome is we're just going to codify having a shadow government for "natsec" that doesn't communicate with civil agencies, because otherwise everyone does get the right to subpoena the alibi archive. (Footnote that the idea of an 'alibi archive' crops up in SF from time to time and assuming the technology for everyone to record everything on everyone doesn't go away - and particularly with some actual crypto that can only be unlocked with the accused's consent - I can't tell if it'd actually be a bad thing.)

  3. volk007 says:

    Hold on, a private company said that they couldn't listen to conversations on their fully controlled devices, transmitted through their fully controlled servers, and you didn't believe them?

  4. nonplus says:

    blah blah blah something about how it's comically naive to trust a closed source application on a closed source OS using proprietary protocols.

    nothing (okay, very little) that Apple says about security is verifiable by 3rd parties so all of it should be treated as 'broken' until proven otherwise.

    Gibberbot - that's the way to do it (https://guardianproject.info/apps/gibber/)