Apple Has Begun Scanning Your Local Image Files Without Consent

Jeffrey Paul:

Imagine my surprise when browsing these images in the Finder, Little Snitch told me that macOS is now connecting to Apple APIs via a program named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files). [...]

To recap:

  • In 2021, Apple said they'd scan your local files using your own hardware, in service of the police.

  • People got upset, because this is a clear privacy violation and is wholly unjustifiable on any basis whatsoever. (Some people speculated that such a move by Apple was to appease the US federal police in advance of their shipping better encryption features which would otherwise hinder police.)

  • Apple said some additional things that did NOT include "we will not scan your local files", but did include a confirmation that they intend to ship such features that they consider "critically important".

  • The media misreported this amended statement, and people calmed down.

  • Today, Apple scanned my local files and those scanning programs attempted to talk to Apple APIs, even though I don't use iCloud, Apple Photos, or an Apple ID. This would have happened without my knowledge or consent if I were not running third-party network monitoring software.

By default, Little Snitch allows all connections to Apple and iCloud. To block this process (and others) you have to un-check the "" and "" rules on the "System" tab. And then endure two days of whack-a-mole while re-allowing the ones you actually want to be able to connect to Apple, like softwareupdated and IMTransferAgent and a dozen others.
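
A minimal sketch (purely illustrative, not Little Snitch's actual rule engine) of the difference between the stock allow-Apple-by-default behavior and the deny-by-default allowlist you end up curating by hand; the daemon names are just the ones mentioned above:

```python
# Illustrative only: models Little Snitch's allow/deny decision as a
# simple allowlist. Daemon names are examples from this post, not a
# complete or authoritative list of what needs to reach Apple.
ALLOWED_DAEMONS = {"softwareupdated", "IMTransferAgent"}

def may_connect_to_apple(process: str, allow_apple_by_default: bool) -> bool:
    """Decide whether a process may reach Apple hosts."""
    if allow_apple_by_default:
        # Stock configuration: every Apple system process gets through.
        return True
    # Deny-by-default: only daemons you have explicitly re-allowed pass.
    return process in ALLOWED_DAEMONS

# Stock rules let mediaanalysisd phone home silently:
print(may_connect_to_apple("mediaanalysisd", True))    # True
# With the System rules un-checked, it is blocked:
print(may_connect_to_apple("mediaanalysisd", False))   # False
print(may_connect_to_apple("softwareupdated", False))  # True
```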

Update: Lots of people keep sending me this rebuttal, and telling me "it no longer phones home as of the OS update that was released 5 minutes ago, so problem solved." Ok, that may well be. But when my OS was phoning home on my photos yesterday and happens to not be phoning home on them today... that doesn't really build trust. Intent matters, and we know what Apple's intent is because they told us. Code matters, and we are not allowed to see Apple's code.

Maybe the fact that it phoned home with a null response is only because the test photos didn't match some magic neural net -- congratulations, Apple didn't report your test images to the FBI.

We cannot know. But suspicion and mistrust are absolutely justified. Apple is examining your photos and then phoning home. The onus is on them to explain -- and prove -- what they are doing and why. They are undeserving of you taking them at their word.

42 Responses:

  1. acb says:

    It looks like mediaanalysisd is an on-device image classifier used with local search, and the Apple API calls are empty and were a bug, which has now been fixed.

    • jwz says:

      I witnessed this process phoning home on me yesterday (fully updated system) so "has now been fixed" is very much [citation needed]

      • people keep citing that mysk thread but that's not a very definitive proof of anything, IMO. I think it's very reasonable to be concerned about an OS service making a network query to Apple's servers upon viewing an image. I _hope_ it's just a bug, but considering sneak hadn't opted into any online services, certainly no network request should be attempted in this situation. I'm definitely curious where this will all end up...

        • I also made a speculation in response to mysk's thread - that perhaps no network payload was sent because no CSAM was detected in the image? Or, perhaps this is merely a stubbed out functionality that will later be used for reporting image hashes? Again I'm purely speculating. Regardless, some people are being far too trusting of Apple, considering their statements and past conduct. We have every reason to distrust them (and indeed, corporations will *never* be your friend).

  2. Elliot Shank says:

    These folks have had a look at it:

    Supposedly, the behavior does not exist in the new macOS version released yesterday.

  3. sneak says:

    OP here.  Before any of you too-clever-by-half chime in:

    Siri Suggestions is and was turned off.  This happened anyway.  That post claims to contradict me, but he writes:

    > For both images, VisionKit initiated image analysis when the image was being opened in its preview window. For the image which didn’t contain text, this completed in a total processing time of 615 ms, failed to recover any text from that image, and attempted no remote connections. The image containing text took longer, 881 ms, and returned text of length 65 ‘DD’ (as given in the log) after a considerably more elaborate series of processes, including one outgoing secure TCP or Quic connection by mediaanalysisd lasting 58 ms, before the completion of Visual Search Gating.

    • longtimelistener says:

      You nonetheless took what is a valid security concern and jumped to conclusions about what it's for, getting you ridiculed on Hacker News as a fearmonger.

      • jwz says:

        Duuuuuude, getting ridiculed on "Hacker" "News", the comment section of a VC's blog, should be taken as a badge of honor. they should make award ribbons for that.

        • longtimelistener says:

          Fine. I just think the empty assumption that this is being done as a law-enforcement thing is a real conclusions-jumping moment. It could be. It could also be being done for any number of reasons, as we know Apple does other things with photos. There is no evidence supporting that motive, just a very Tucker-esque "as you know..." rundown of last year's iCloud Photos Library outrage that encourages readers to put the dots together and arrive at the author's intended motive.

          • phuzz says:

            Yes, I'm sure the massive corporation has our best interests at heart. After all, who among us hasn't accidentally uploaded the data we were trying to process locally? It's such a simple mistake to make!

            • longtimelistener says:

              There is no reason to trust the giant corporation. None.

              However, if you're going to conclude that the mysterious data access is intended to fingerprint users, there should be some sort of supporting confirmation. This is data security we're talking about, and documentation matters more than guessing.

              I give the author credit for finding this issue, but they could just say, "I don't know why it's doing this and I'm concerned." It's honest, and it encourages people with more time and/or skill to dig deeper.

              • MattyJ says:

                Whatever we find out it is, I'm 100% certain it won't be something I want. Remember, this is scanning your local photos, not ones in iCloud. My local storage is MY local storage. Keep out.

          • Joe Luser says:

            for a long time listener it doesn't appear that you have been, y'know, actually listening. you are confusing what you imagine OP is claiming happened with what was claimed. the claim is "apple scans photos and uploads something to an apple server". check it out, that is all that was claimed. and it appears to be true. and while there is an obvious use involving law enforcement, there are also unobvious uses, like say some chinese government looking for chinese dissent. or whatever. attacking an imagined use-case is, as you know from listening for a long time, called a "strawman argument" (google it). the point is that it is happening at all, was something apple said they had developed, never said they had shelved, and stopped happening immediately after it was discovered. if you have some value to add indicating that none of those things are true, then go ahead. otherwise, probably back to listening is the best path

          • asan102 says:

            You're really too dense to understand why we would be concerned about "big company's binary blob with access to all my data is phoning home about my files without informed consent," regardless of whether it turns out they're checking them for kiddie porn or not?

  4. CSL3 says:

    First, iTunes deletes all your non-iTunes-purchased music. Then, "the fappening" proves that iCloud is (like all clouds) just an open door through which anyone can walk in and take your personal shit. Now, Apple's gone ahead and put a big sign in front of the door saying "All Cops Welcome". Sounds about right.

    Since AirTags weren't mentioned, I'm guessing that Jeffrey Paul doesn't own any? I'm sure those work just perfectly in Apple's new plan for totally dominant stalkerware that will inevitably be confiscated by cops and feds.

    • CSL3 says:

      Also, the "let's not jump to conclusions" comments slay me.

      Yes, let's not assume that Apple just did something that they have done before, still do now, and will likely continue to do in the foreseeable future.

    • asan102 says:

      First, iTunes deletes all your non-iTunes-purchased music.

      does what now?

  5. Well without what else? Consent is for people with power. We are not such.

  6. 11 says:

    I don't know if I'm just getting older, but I am increasingly exhausted by the constant vigilance necessary to uncrap things I've actually paid for.

    Running uBlock Origin, piHole/Blocky, using a VPN, and now Little Snitch.
    Getting a new phone and spending hours turning off all the attention-grabbing notifications, and opting out of the sell-my-info-and-please-spam-me.

  7. Endareth says:

    Nope, that was a bug (and never actually did anything other than make an empty connection request):

  8. dzm says:

    macOS: 13.2
    Mac: m1 MacBook Pro (13")

    Unable to repro the behavior. Lil' Snitch indicates zero network traffic in the last 24 hours from mediaanalysisd.

    • Kevin says:

      Yes; it was apparently a bug. Or, it was a real privacy issue that was found and is no longer being used.

      • jwz says:

        Or! They were doing exactly what was hypothesized, and have either temporarily paused it, or are just hiding it better.

        That is just as possible as your credulous theory, since neither of us are allowed to read the code and see what it is actually doing.

        Your blind trust of Apple is unjustified. There is no reason to take them at their word.

        • Kevin says:

          Yes, that was my option #2. Maybe it was a real privacy violation, but they got caught and pulled it.

          However, when the choice is between bug and evil, I’m willing to believe the answer is bug, especially for something like this that someone would have to explicitly be looking for to notice.

  9. Ale Muñoz says:

    you probably want to read this article, if you haven’t already

    I think it does a good job of explaining what is happening (and why) under the hood, and ways to disable the behavior if you are not convinced anyway.

    Hope it helps!

    • TL;DR: it was an empty call. A bug, now fixed.

      • jwz says:

        You people are just *shockingly* credulous about Apple. You'd have to see Tim Cook actually holding a bloody chainsaw before you wouldn't say "nothing to see here".

        • I don’t appreciate the tone. Having said that, you might be right. There’s not much choice in the consumer space, so you have to choose the lesser evil. Given you found the original behavior, perhaps worth validating if the behavior persists after the upgrade?

          • jwz says:

            Well if you don't appreciate the tone, I'd recommend unfollowing me. But just because a black box that phoned home yesterday doesn't phone home tomorrow.... Let's say that does little to increase my level of trust. (And I didn't find the original behavior. I did however also observe it.)

            • it was the “you people” bit, sorry if I was harsh. Anyway, I didn’t want to get into this debate, just wanted to point out something. Have a nice day.

  10. Garry says:

    It’s likely not what the clickbaiters are shouting. Please read the very knowledgeable Howard Oakley’s post on the subject.
    The post referring to Visual Look Up is here.

    Thank you for your many contributions to our community of humans worldwide.

  11. this has been analyzed and was likely a bug, which since then has been fixed.

  12. thielges says:

    Somewhat related: iPhone owners may be familiar with the "For You" slideshows that iOS periodically assembles from your photo library.  Usually they are titled something like "Trip to X" and contain a sample of the photos you took.  I guess they are scanning your photo library and using GPS and temporal clustering to identify those trips, and then some sort of ML algorithm is used to pick the "best".
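
    The temporal-clustering part guessed at above can be sketched (hypothetically — Apple's actual Photos pipeline is not public) as a simple gap-based grouping over photo timestamps, where a new "trip" starts whenever the gap since the previous photo is too large:

```python
from datetime import datetime, timedelta

def cluster_by_time(timestamps, max_gap=timedelta(days=2)):
    """Group photo timestamps into 'trips': a new cluster starts
    whenever the gap since the previous photo exceeds max_gap.
    (Hypothetical sketch; a real system would also cluster on GPS.)"""
    clusters = []
    for ts in sorted(timestamps):
        if clusters and ts - clusters[-1][-1] <= max_gap:
            clusters[-1].append(ts)   # close enough: same trip
        else:
            clusters.append([ts])     # big gap: start a new trip
    return clusters

photos = [
    datetime(2020, 6, 1), datetime(2020, 6, 2),  # two-day survey trip
    datetime(2020, 9, 10),                       # unrelated day out
]
trips = cluster_by_time(photos)
assert len(trips) == 2 and len(trips[0]) == 2
```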

    Usually the slideshows are indeed relevant but I'm also seeing a strange bug in their algorithm.  A few years ago I worked on a project in the Sierra foothills to survey a new road and took photos of the marked up road site.  Every few months I get a "Trip to Mariposa" slideshow that contains no photos of people but lots of photos of wooden stakes painted neon pink.  The same damn photos.  Every time.

    I had planned to turn off this iOS feature but instead decided to leave it running to see how long this bug persists.  Plus those survey markers trigger an interesting nostalgia.  Dust, frustration, smoke, red county contractors, heat.

  13. dzm says:

    I have no idea if this is nefarious or not (as I noted above - I'm unable to reproduce it using the steps the OP described). I've worked in enough corporate environments to know that Dumb Shit Happens because people just don't think through the repercussions of decisions they're making. I also know an awful lot of stuff in those same corporate environments is downright malevolent. I don't know where this lies.

    What I am finding interesting is to search through DNS lookup logs for the last week. Searching for "smoot" is turning up DNS queries from myriad Apple OSes (macOS, iOS, iPadOS, etc) for (among others):

    Random searching of the Interwebs seems to produce a lot of speculation that this is all related to Siri and/or Spotlight Search. There's a good amount of fearmongering (again, I don't know if it's accurate or not) and speculation. Interestingly, none of the ad-block/privacy/malware block lists I use on my network include these domains in their corpus. I wonder what research they've done to exclude them.
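
    For anyone who wants to run the same search, a rough sketch of scanning Pi-hole/dnsmasq-style logs for queried names containing "smoot" (the log lines and host names below are illustrative stand-ins; real log formats vary):

```python
from collections import Counter

# Illustrative dnsmasq/Pi-hole-style log lines, not a real capture.
log_lines = [
    "Jan 20 10:01:02 dnsmasq[123]: query[A] api.smoot.apple.com from 10.0.0.5",
    "Jan 20 10:01:07 dnsmasq[123]: query[A] example.org from 10.0.0.5",
    "Jan 20 10:02:11 dnsmasq[123]: query[A] api.smoot.apple.com from 10.0.0.9",
]

def count_matching_queries(lines, needle="smoot"):
    """Count DNS queries whose queried name contains `needle`."""
    hits = Counter()
    for line in lines:
        if "query[" not in line:
            continue  # skip replies, cache notices, etc.
        # "... query[A] NAME from CLIENT" -> NAME
        name = line.split("query[", 1)[1].split("] ", 1)[1].split(" from ")[0]
        if needle in name:
            hits[name] += 1
    return hits

print(count_matching_queries(log_lines))
# Counter({'api.smoot.apple.com': 2})
```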

    I miss the old days when stuff I own didn't constantly chat with the outside world.

    • dshea says:

      The various smoots may be unrelated. The dumb shit happening in this case is that someone from MIT needed to come up with a host name and went with "hey, remember that one time a frat hazed some kid?" That situation could well have come up more than once.

      • dzm says:

        Could be, but seems unlikely. On my macOS the requests are coming from two daemons buried in /System/Library/PrivateFrameworks/CoreParsec.framework

        Lil' Snitch describes the daemons as:

        “parsec-fbf” is a macOS system process that periodically sends Siri search analytics data to Apple servers.

        “Parsec Daemon” is a macOS system process that is used for suggestions in Spotlight, Messages, Lookup, Safari, Siri, and other places.

        Lil' Snitch describes the dozen or so domain names these daemons typically reach out to, but it doesn't seem to have a description of the *smoot* domains.

        Between these two daemons they have reached out to:

        and in the last 7 days have pushed 55.6k of data to them, and have received 160k of data from them.

        I don't know what's happening under the hood and would only be speculating. What I see superficially is what appears to be telemetry used by the "Suggestion" system. I rarely use Safari, but sometimes I need to. That may account for the traffic I'm seeing here (typing in the URL bar causes many user-agents to send traffic to a server to come up with suggestions).

  14. memecode says:

    This post probably sold several Little Snitch licenses. Which may or may not have included me.
