"Yeah, sure, sorry, whatever," muttered Twitter, while making the "wanking" hand motion.
Everything Twitter has said or done about trolls, abuse, harassment, threats, Nazis, or "the health of conversations" treats the problem as someone else's. To Twitter these are PR issues, and its solutions are oriented toward media coverage. This is why nothing ever really changes, and why it shows so little enthusiasm for product features that might actually accomplish something. I suspect Twitter can't do better because the problem is incomprehensible to it. Its understanding of its own product precludes any understanding of what's bad about promoting "Kill All Jews" a week after 11 people were slaughtered by a wingnut. Much that is human is alien to them.
Earlier this week, The Intercept was able to select "white genocide conspiracy theory" as a pre-defined "detailed targeting" criterion on the social network to promote two articles to an interest group that Facebook pegged at 168,000 users large and defined as "people who have expressed an interest or like pages related to White genocide conspiracy theory." The paid promotion was approved by Facebook's advertising wing. After we contacted the company for comment, Facebook promptly deleted the targeting category, apologized, and said it should have never existed in the first place.
Our reporting technique was the same as one used by the investigative news outlet ProPublica to report, just over one year ago, that in addition to soccer dads and Ariana Grande fans, "the world's largest social network enabled advertisers to direct their pitches to the news feeds of almost 2,300 people who expressed interest in the topics of 'Jew hater,' 'How to burn jews,' or, 'History of "why jews ruin the world."'" The report exposed how little Facebook was doing to vet marketers, who pay the company to leverage personal information and inclinations in order to gain users' attention -- and who provide the foundation for its entire business model. [...]
Facebook draws a distinction between the hate-based categories ProPublica discovered, which were based on terms users entered into their own profiles, and the "white genocide conspiracy theory" category, which Facebook itself created via algorithm. The company says that it's taken steps to make sure the former is no longer possible, although this clearly did nothing to deter the latter. Interestingly, Facebook said that technically the white genocide ad buy didn't violate its ad policies, because it was based on a category Facebook itself created.