Facebook says "send nudes"

This article is composed entirely of "sick burn":

The naked truth about Facebook's revenge porn tool:

Facebook has announced it's trialling a tool in Australia to fight revenge porn on its platform, one that requires victims to send the company a copy of the violating images. Amazingly, this is true, and not a Clickhole story. It's the kind of thing that makes you wonder if there are human people at Facebook, and do they even understand what words mean? Because as we unravel the details of this tool -- totally not conceived by actual robots or a company with a zero percent trust rating among users -- we realize it's a very confusing tool indeed. [...]

This apparently sends a copy of the image to the probable-Cybermen behind the scenes at Facebook, who momentarily pause from massaging advertisers with whale tears, laughing at people worried about Holocaust denial, high-fiving over scenes of unbelievable human devastation, and destroying democracy.

Then a person who works for Facebook, and totally not a heartless tech bro, looks at it. They decide if it is revenge porn, or if on that day you are just shit out of luck for getting your nonconsensual nudes removed.

At some point, according to what Facebook told Motherboard, the image has portions of it blurred out. This may happen with magic grey alien technology in transit, somehow preserving the privacy and dignity of the revenge porn victim. Maybe the employee just blurs their eyes over the sensitive parts by squinting really hard or rubbing their eyelids. Perhaps a superhacker Facebook cyber-script blurs the private bits so quickly you can feel a breeze come off the Facebook employee's computer.

But probably not. A Facebook spokesperson told Motherboard that when the image is blurry, a highly specialized and incredibly trained team are the only people who have access to it for a few days. It is my personal hope that their training is in martial arts. [...]

Anyway. As best we know, after employees look at the photo (and it may or may not be altered for the privacy and dignity of its subject), Facebook's machines take over. Facebook makes a hash of the photo and stores it in a repository that's cross-checked against photo uploads on the service. We can rest assured that this part will work perfectly because Facebook has never made a mistake. [...]
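
For the technically curious, that hash-and-match step amounts to something like the following minimal sketch. Every name in it is made up, since Facebook hasn't published its implementation, and note that a plain cryptographic hash only catches byte-identical copies; a real system would need a perceptual hash (PhotoDNA-style) that survives resizing and re-encoding.

    import hashlib

    blocked_hashes = set()  # the "repository" of flagged images

    def fingerprint(image_bytes):
        # SHA-256 stands in for whatever fingerprint Facebook actually uses.
        return hashlib.sha256(image_bytes).hexdigest()

    def flag_image(image_bytes):
        # Called once a human reviewer confirms the report.
        blocked_hashes.add(fingerprint(image_bytes))

    def upload_allowed(image_bytes):
        # Cross-check every new upload against the repository.
        return fingerprint(image_bytes) not in blocked_hashes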

Facebook is asking people to trust it. The company that said Russian propaganda advertising only reached 10 million people, then was forced to admit the true number was 126 million. The company that reached into people's address books on phones and devices, and altered Facebook users' contact information, re-routing communications to Facebook. The company that enforces a "real names" policy on users despite the fact that the National Network to End Domestic Violence proved that Facebook is the social media platform most misused by abusers. The company that let advertisers target users by race, outs sex workers, said "fake news" was not a real problem, and experimented on its users' mental health.

Trust is something Facebook literally has none of.

Previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously, previously.


5 Responses:

  1. Jens Knutson says:

    "If you work for Facebook, quit" would be a good candidate for a Previously link here.

  2. Jan Kujawa says:

    You'd have to be ... how was it put?

    Oh, yeah. You'd have to be a "dumb fuck" to do this.

  3. Derpatron9000 says:

    Open question: If you use Facebook or their services, can you explain why you're happy to become part of the product this scum sells?

  4. thielges says:

    Computer-savvy people know that you can't recover a photo from a checksum or signature. Still, the crunch from jpg to signature should occur on the client side, and even savvy people have a hard time confirming that the racy jpg never flew over teh intertubes (a sketch of the client-side approach follows below).

    But the big question is why not just rely on generic naughty photo detection? That’s much more scalable and doesn’t rely on anyone being proactive to submit their photos.
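
    A minimal, purely illustrative sketch of that client-side approach in Python (the endpoint URL is hypothetical, and nothing here solves the verification problem the comment raises):

        import hashlib
        import json
        import urllib.request

        # Hypothetical endpoint; the point is that only the digest,
        # never the image itself, leaves the device.
        REPORT_URL = "https://example.com/report-hash"

        def report_image(path):
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            # Only the hash crosses the wire; the photo stays local.
            req = urllib.request.Request(
                REPORT_URL,
                data=json.dumps({"sha256": digest}).encode(),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)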

