Apple Has Begun Scanning Your Local Image Files Without Consent

Jeffrey Paul:

Imagine my surprise when browsing these images in the Finder, Little Snitch told me that macOS is now connecting to Apple APIs via a program named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files). [...]

To recap:

  • In 2021, Apple said they'd scan your local files using your own hardware, in service of the police.

  • People got upset, because this is a clear privacy violation and is wholly unjustifiable on any basis whatsoever. (Some people speculated that such a move by Apple was to appease the US federal police in advance of their shipping better encryption features which would otherwise hinder police.)

  • Apple said some additional things that did NOT include "we will not scan your local files", but did include a confirmation that they intend to ship such features that they consider "critically important".

  • The media misreported this amended statement, and people calmed down.

  • Today, Apple scanned my local files and those scanning programs attempted to talk to Apple APIs, even though I don't use iCloud, Apple Photos, or an Apple ID. This would have happened without my knowledge or consent if I were not running third-party network monitoring software.

By default, Little Snitch allows all connections to Apple and iCloud. To block this process (and others) you have to un-check the "icloud.com" and "apple.com" rules on the "System" tab. And then endure two days of whack-a-mole while re-allowing the ones you actually want to be able to connect to Apple, like softwareupdated and IMTransferAgent and a dozen others.
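If you aren't running Little Snitch, one rough cross-check is to snapshot which processes currently have open TCP connections using the stock `lsof` tool and look for mediaanalysisd. A minimal sketch (the parsing assumes lsof's default tabular output; note that lsof may truncate long command names, and this only catches connections open at the moment you look):

```python
# Sketch: list processes with open TCP connections via lsof, as a rough
# cross-check for chatty daemons like mediaanalysisd. Assumes macOS and
# lsof's default output format; lsof may truncate command names.
import subprocess

def processes_with_tcp(lsof_output: str) -> set[str]:
    """Extract the COMMAND column from lsof's tabular output."""
    procs = set()
    for line in lsof_output.splitlines()[1:]:  # skip the header row
        fields = line.split()
        if fields:
            procs.add(fields[0])
    return procs

def snapshot() -> set[str]:
    """Run lsof and return the names of processes with open TCP sockets."""
    out = subprocess.run(["lsof", "-nP", "-iTCP"],
                         capture_output=True, text=True).stdout
    return processes_with_tcp(out)
```

Calling `snapshot()` in a loop (or from cron) gives you a crude, point-in-time log of which daemons are talking to the network, without any third-party software.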

Update: Lots of people keep sending me this rebuttal, and telling me "it no longer phones home as of the OS update that was released 5 minutes from now, so problem solved." Ok, that may well be. But when my OS was phoning home on my photos yesterday and happens to not be phoning home on them today... that doesn't really build trust. Intent matters, and we know what Apple's intent is because they told us. Code matters, and we are not allowed to see Apple's code.

Maybe the fact that it phoned home with a null response is only because the test photos didn't match some magic neural net -- congratulations, Apple didn't report your test images to the FBI.

We cannot know. But suspicion and mistrust are absolutely justified. Apple is examining your photos and then phoning home. The onus is on them to explain -- and prove -- what they are doing and why. They are undeserving of you taking them at their word.



26 Responses:

  1. sneak says:
    8

    OP here.  Before any of you too-clever-by-half chime in:

    Siri Suggestions is and was turned off.  This happened anyway.  That eclecticlight.co post claims to contradict me, but he writes:

    > For both images, VisionKit initiated image analysis when the image was being opened in its preview window. For the image which didn’t contain text, this completed in a total processing time of 615 ms, failed to recover any text from that image, and attempted no remote connections. The image containing text took longer, 881 ms, and returned text of length 65 ‘DD’ (as given in the log) after a considerably more elaborate series of processes, including one outgoing secure TCP or Quic connection by mediaanalysisd lasting 58 ms, before the completion of Visual Search Gating.

    • longtimelistener says:
      1

You nonetheless took what is a valid security concern and jumped to conclusions about what it's for, getting you ridiculed on Hacker News as a fearmonger.

      • jwz says:
        35

Duuuuuude, getting ridiculed on "Hacker" "News", the comment section of a VC's blog, should be taken as a badge of honor. They should make award ribbons for that.

        • longtimelistener says:
          1

Fine. I just think the flat assumption that this is being done as a law enforcement thing is a real conclusions-jumping moment. It could be. It could also be happening for any number of other reasons, as we know Apple does other things with photos. There is no evidence establishing that motive, just a very Tucker-esque "as you know..." rundown of last year's iCloud Photos Library outrage that encourages readers to connect the dots and arrive at the author's intended motive.

          • phuzz says:
            2

            Yes, I'm sure the massive corporation has our best interests at heart. After all, who among us hasn't accidentally uploaded the data we were trying to process locally? It's such a simple mistake to make!

            • longtimelistener says:

              There is no reason to trust the giant corporation. None.

However, if one concludes that the mysterious data access is intended to fingerprint users, there should be some sort of supporting confirmation. This is data security we're talking about, and documentation matters more than guessing.

              I give the author credit for finding this issue, but they could just say, "I don't know why it's doing this and I'm concerned." It's honest, and it encourages people with more time and/or skill to dig deeper.

              • MattyJ says:
                5

                Whatever we find out it is, I'm 100% certain it won't be something I want. Remember, this is scanning your local photos, not ones in iCloud. My local storage is MY local storage. Keep out.

          • Joe Luser says:
            4

            for a  long time listener it doesn't appear that you have been, y'know, actually listening. you are confusing what you imagine OP is claiming happened with what was claimed.  the claim is "apple scans photos and uploads something to an apple server". check it out, that is all that was claimed. and it appears to be true. and while obviously, there is an obvious use involving law enforcement, there are also unobvious uses, like say some chinese government looking for chinese dissent. or whatever. attacking an imagined use-case is, as you know from listening for a long time, called a "strawman argument" (google it). the point is that it is happening at all, was something apple said they had developed, never said they had shelved, and stopped happening immediately after it was discovered. if you have some value to add indicating that none of those things are true, then go ahead. otherwise, probably back to listening is the best path

          • asan102 says:
            1

            You're really too dense to understand why we would be concerned about "big company's binary blob with access to all my data is phoning home about my files without informed consent," regardless of whether it turns out they're checking them for kiddie porn or not?

  2. CSL3 says:
    5

First, iTunes deletes all your non-iTunes-purchased music. Then, "the fappening" proves that iCloud is (like all clouds) just an open door through which anyone can walk in and take your personal shit. Now, Apple's gone ahead and put a big sign in front of the door saying "All Cops Welcome". Sounds about right.

    Since AirTags weren't mentioned, I'm guessing that Jeffrey Paul doesn't own any? I'm sure those work just perfectly in Apple's new plan for totally dominant stalkerware that will inevitably be confiscated by cops and feds.

    • CSL3 says:
      2

      Also, the "let's not jump to conclusions" comments slay me.

      Yes, let's not assume that Apple just did something that they have done before, still do now, and will likely continue to do in the foreseeable future.

    • asan102 says:
      1

> First, iTunes deletes all your non-iTunes-purchased music.

      does what now?

  3. 10

    I don't know if I'm just getting older, but I am increasingly exhausted by the constant vigilance necessary to uncrap things I've actually paid for.

    Running uBlock Origin, piHole/Blocky, using a VPN, and now Little Snitch.
    Getting a new phone and spending hours turning off all the attention-grabbing notifications, and opting out of the sell-my-info-and-please-spam-me.

  4. dzm says:
    4

    macOS: 13.2
    Mac: m1 MacBook Pro (13")

    Unable to repro the behavior. Lil' Snitch indicates zero network traffic in the last 24 hours from mediaanalysisd.
    • Kevin says:
      1

Yes; it was apparently a bug. Or it was a real privacy issue that was found and is no longer in use.

      • jwz says:
        6

        Or! They were doing exactly what was hypothesized, and have either temporarily paused it, or are just hiding it better.

        That is just as possible as your credulous theory, since neither of us are allowed to read the code and see what it is actually doing.

        Your blind trust of Apple is unjustified. There is no reason to take them at their word.

        • Kevin says:
          4

          Yes, that was my option #2. Maybe it was a real privacy violation, but they got caught and pulled it.

          However, when the choice is between bug and evil, I’m willing to believe the answer is bug, especially for something like this that someone would have to explicitly be looking for to notice.

  5. Garry says:

    It’s likely not what the clickbaiters are shouting. Please read the very knowledgeable Howard Oakley’s post on the subject.
    The post referring to Visual Look Up is here.

    Thank you for your many contributions to our community of humans worldwide.

  6. thielges says:
    1

Somewhat related: iPhone owners may be familiar with the "For You" slideshows that iOS periodically assembles from your photo library.  Usually they are titled something like "Trip to X" and contain a sample of the photos you took.  I guess they scan your photo library and use GPS and temporal clustering to identify those trips, and then some sort of ML algorithm picks the "best" ones.
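The GPS-plus-time clustering guessed at above is easy to picture: sort photos by timestamp and start a new "trip" whenever consecutive shots are far apart in time or space. This is purely illustrative speculation about the approach, not Apple's actual algorithm (thresholds and the degrees-as-distance shortcut are invented for the sketch):

```python
# Illustrative guess at "Trip to X"-style grouping: cluster photos into
# trips whenever consecutive shots are far apart in time or space.
# Pure speculation about the technique, not Apple's implementation.
from dataclasses import dataclass

@dataclass
class Photo:
    timestamp: float  # seconds since epoch
    lat: float
    lon: float

def cluster_trips(photos, max_gap_s=86400, max_jump_deg=0.5):
    """Group time-sorted photos; a gap of more than a day, or a large
    coordinate jump, starts a new trip. (Raw degrees are used as a
    crude distance proxy for the sketch.)"""
    trips = []
    for p in sorted(photos, key=lambda p: p.timestamp):
        if trips:
            prev = trips[-1][-1]
            near = (abs(p.lat - prev.lat) <= max_jump_deg and
                    abs(p.lon - prev.lon) <= max_jump_deg)
            if p.timestamp - prev.timestamp <= max_gap_s and near:
                trips[-1].append(p)
                continue
        trips.append([p])  # start a new trip
    return trips
```

A real system would presumably also score photos within each cluster (faces, sharpness, variety) to pick the "best" ones for the slideshow.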

    Usually the slideshows are indeed relevant but I'm also seeing a strange bug in their algorithm.  A few years ago I worked on a project in the Sierra foothills to survey a new road and took photos of the marked up road site.  Every few months I get a "Trip to Mariposa" slideshow that contains no photos of people but lots of photos of wooden stakes painted neon pink.  The same damn photos.  Every time.

I had planned to turn off this iOS feature but instead decided to leave it running to see how long this bug persists.  Plus those survey markers trigger an interesting nostalgia.  Dust, frustration, smoke, red county contractors, heat.

  7. dzm says:

    I have no idea if this is nefarious or not (as I noted above - I'm unable to reproduce it using the steps the OP described). I've worked in enough corporate environments to know that Dumb Shit Happens because people just don't think through the repercussions of decisions they're making. I also know an awful lot of stuff in those same corporate environments is downright malevolent. I don't know where this lies.

What I am finding interesting is searching through DNS lookup logs for the last week. Searching for "smoot" turns up DNS queries from myriad Apple OSes (macOS, iOS, iPadOS, etc.) for (among others):

    smoot-feedback.v.aaplimg.com
    fbs.smoot.apple.com
    smoot-searchv2-ausw2b.v.aaplimg.com
    api-glb-ausw2b.smoot.apple.com
    bag-smoot.v.aaplimg.com
    api.smoot.apple.com
    etc

Random searching of the Interwebs seems to produce a lot of speculation that this is all related to Siri and/or Spotlight Search. There's a good amount of fearmongering (again, I don't know whether it's accurate) and speculation. Interestingly, none of the ad-block/privacy/malware block lists I use on my network include these domains in their corpus. I wonder what research they've done to exclude them.
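The log search described above is easy to automate: filter DNS query logs for the smoot/aaplimg-style names and tally them. A hedged sketch, assuming a dnsmasq/Pi-hole-style plain-text log with the queried hostname as a whitespace-separated field (the log format and the suffix list are assumptions; adjust for your resolver):

```python
# Sketch: tally DNS queries for smoot/aaplimg-style Apple domains from a
# plain-text query log. The log format (hostname as a whitespace-
# separated field) is an assumption; adapt it to your resolver's logs.
from collections import Counter

WATCHED_SUFFIXES = (".smoot.apple.com", ".v.aaplimg.com")

def tally_apple_telemetry(log_lines):
    """Count queried domains ending in a watched suffix (or containing
    'smoot' anywhere, to catch variants)."""
    counts = Counter()
    for line in log_lines:
        for field in line.split():
            d = field.rstrip(".").lower()
            if d.endswith(WATCHED_SUFFIXES) or "smoot" in d:
                counts[d] += 1
    return counts
```

Feeding a week of logs through this gives the per-domain query counts directly, instead of eyeballing grep output.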

    I miss the old days when stuff I own didn't constantly chat with the outside world.

    • dshea says:
      1

The various smoots may be unrelated. The dumb shit happening in this case is that someone from MIT needed to come up with a host name and went with "hey, remember that one time a frat hazed some kid?" (https://en.wikipedia.org/wiki/Smoot). That situation could well have come up more than once.

      • dzm says:

        Could be, but seems unlikely. On my macOS the requests are coming from two daemons buried in /System/Library/PrivateFrameworks/CoreParsec.framework

        Lil' Snitch describes the daemons as:

        “parsec-fbf” is a macOS system process that periodically sends Siri search analytics data to Apple servers.

        and

        “Parsec Daemon” is a macOS system process that is used for suggestions in Spotlight, Messages, Lookup, Safari, Siri, and other places.

        Lil' Snitch describes the dozen or so domain names these daemons typically reach out to, but it doesn't seem to have a description of the *smoot* domains.

        Between these two daemons they have reached out to:

        api-glb-ausw2b.smoot.apple.com
        api-glb-ausw2c.smoot.apple.com
        api.smoot.apple.com
        fbs.smoot.apple.com

        and in the last 7 days have pushed 55.6k of data to them, and have received 160k of data from them.

I don't know what's happening under the hood and would only be speculating. What I see superficially appears to be telemetry used by the "Suggestion" system. I rarely use Safari, but sometimes I need to. That may account for the traffic I'm seeing here (typing in the URL bar causes many user agents to send traffic to a server to come up with suggestions).
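The burst of suggestion traffic is easy to picture: a naive client queries the suggestion endpoint on every keystroke, once per prefix of whatever you've typed. A toy illustration of that pattern (not Apple's client, and the debounce scheme is invented for the example):

```python
# Toy illustration: without debouncing, each keystroke in a URL bar
# produces one suggestion query -- one per prefix of what you've typed.
def naive_suggestion_queries(typed: str) -> list[str]:
    """Prefixes a keystroke-per-query suggestion client would send."""
    return [typed[:i] for i in range(1, len(typed) + 1)]

def debounced_queries(typed: str, every: int = 3) -> list[str]:
    """Crude debounce: only query every `every` keystrokes, plus the
    final prefix, cutting request volume substantially."""
    qs = [typed[:i] for i in range(every, len(typed) + 1, every)]
    if not qs or qs[-1] != typed:
        qs.append(typed)
    return qs
```

Even a modest debounce like this cuts a seven-keystroke URL from seven requests to three, which is why real clients batch or rate-limit these lookups.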

  8. memecode says:

    This post probably sold several Little Snitch licenses. Which may or may not have included me.

Leave a Reply

Your email address will not be published. But if you provide a fake email address, I will likely assume that you are a troll, and not publish your comment.

