
"Right-wing populism is always more engaging", a Facebook executive said in a recent interview with POLITICO reporters, when pressed why the pages of conservatives drive such high interactions. The person said the content speaks to "an incredibly strong, primitive emotion" by touching on such topics as "nation, protection, the other, anger, fear."
"That was there in the [19]30's. That's not invented by social media -- you just see those reflexes mirrored in social media, they're not created by social media," the executive added. "It's why tabloids do better than the [Financial Times], and it's also a human thing. People respond to engaging emotion much more than they do to, you know, dry coverage. ...This wasn't invented 15 years ago when Mark Zuckerberg started Facebook." [...]
"It's absurd for Facebook to say this is just something that's playing out in a neutral way. Facebook is not a mirror -- the newsfeed algorithm is an accelerant."
Absolutely cannot stop thinking about this quote. They're saying the forces that drove the world into war and killed millions of people simply happen to perform better on the website, and treating that as a reason they don't have to change anything.
And remember, kids!
If you work for Facebook, you are a white supremacist.
If you have a "friend" who works at Facebook, cut them out of your life, like you would your racist cousin.
You can do it. I believe in you.
To be, ah... frank, I'm beginning to wonder if at this point the only thing to alter FB's effect in time to make a difference would be internal monkeywrenching. I'm okay with anyone working there and actively working to do/be in a position to do that, though one must keep in mind the need to manage one's kool-aid intake.
At least rank-and-file conservatives are convinced that FB hates them, not that they can quit the platform any easier than anyone else. You'd think enlightened self-interest would eventually point towards a different position to take.
There is no "mak[ing] a difference" from inside Facebook. At all. They have to be broken up. As someone wiser than I said earlier this year:
And as for why breaking up Big Tech would be great for the economy:
Also, RWNJs aren't the brightest bulbs, so their claim that FB is "infringing on [their] free speech" holds no more water than those same RWNJs claiming their rights are infringed by marginalised people gaining equality.
Zuckerberg and his shithole company know what they're doing. They always have. They won't change on their own. I'll end with this vid on how their targeted advertising works (quote at 4:37 - "It cost [advertisers] 1 1/2 times more for an ad linking to [Bernie] Sanders' site to reach the same number of conservatives as a Trump ad."):

To be clear, my use of "monkeywrenching" right after the "making a difference" you quote was pointing at outright sabotage, not some hearts-and-minds thing. The outright sabotage can be of the algorithm, RWNJ groups, or even the whole enchilada.
AT&T is now the AT&T of old.
I'm very curious about this. Not trying to sealion or anything (hopefully I don't sound like a douche, because I'm just too ignorant about the matter) - are there any reasonable steps that a corporation like Facebook could take in order not to be, well, Facebook? If, for some mysterious reason, a miracle happened and Facebook ceased to exist, what could be done to prevent another corporation from just taking its place?
The answer to both questions is "Regulation and regulation". That's what keeps companies in check, especially the big ones. You push back regulation, you get FB, Google, Amazon, Microsoft, Comcast/Universal, Disney, and every other monopoly that Libertarians falsely trumpet as "champions of the free market".
Regulation is a tactic, not a policy. You might as well say that the answer is for "someone" to do "something". What regulations? Does Congress need to set up a Facebook post review board that Donald Trump appoints someone to head, and that board sets up a system for deciding which posts are disallowed? What regulation, run by whom, is the answer? I'm not trying to be flippant, I just don't see what the rule fix is, much less understand who is supposed to be given the power to enforce the rules when both sides are going to be eventually handed the reins of power.
And the worst part is that Facebook isn't even the problem. Facebook could vanish tomorrow, and nothing would get better. Everyone would just move somewhere else, and that somewhere else would almost certainly find a way to be as bad or worse than Facebook. Facebook is the symptom of a much larger social media problem that we don't understand how to combat. We cover up our ignorance about what to do by making a hand waving motion and saying that the solution is "more regulation", whatever that means.
If you think Trump would set up any regulatory body, you haven't been paying attention (and anyone who says "I'm not trying to be flippant..." is about as sincere as someone who says "I'm not trying to be racist, but..."). Regulation is the goddamn point of a working government; it's why you can't just sell a bottle of cyanide on the shelf next to Pepsi.

Trump has spent the last four years pushing back each and every regulation Obama put in place. You want to know "what regulation"? Start with restoring all of those, especially the push for Net Neutrality. Bring back the anti-monopoly policies that would have crushed Microsoft into a billion pieces, had Dubya not rolled all those back.

Yes, I say "more regulation", because a lack thereof is one of the things that got us into this shit in the first place.
The movie The Social Dilemma makes a pretty damning case against the human ability to sort through the kind of bubble-reinforcing bullshit that Facebook algorithmically serves up. Centralized and monopolistic media sites manipulate human attention & behavior to drive shareholder profit. Full stop.
Waving your hands and saying “oh no, we can’t regulate this, no one understands it” is disingenuous. As they point out in the film, certain businesses are illegal because they always produce a net harm: the human organ trade, human trafficking. Why is trafficking in human attention any different? We weren’t born to spend our lives at the beck and call of social media apps, despite the koolaid they’re peddling.
F*cebook and friends believe they can self-regulate their way out of this—see how well that’s gone.
No matter who is in charge, my question stands - what kind of regulations? Size, market share? Implementing measures to "avoid fascism"?
I'm extremely wary of regulation, because most of the time it just builds unclimbable walls around corporations' gardens. Take, for example, the copyright regulation imposed by the EU. It cost YouTube about 100 million to implement, and it's an ongoing cost, but they gladly paid it because it effectively got rid of all that pesky competition. Of course, I didn't think of this myself; have a look at Cory Doctorow's column in The Economist.
Ok. So if we restore all of the regulations that Trump removed (something very likely to happen if Biden wins), bring back anti-monopoly policies and break Facebook up into a billion pieces, and restore net neutrality, will that fix the problem? Are you saying that once the Obama regulations are back, net neutrality is restored, and Facebook is utterly destroyed, the problem of hyper polarization is over? Are you sure the hyper engagement of networked rage is cured if the government never allows a tech company to get over a few billion dollars big again?
I'll be honest, I don't think that restoring net neutrality, restoring Obama's regulations, and destroying Facebook are going to fix the hyper polarization. I think these problems are bigger than Facebook. I don't think Facebook, or even Facebook's size, is much of a part of the problem. I think we'd be equally fucked if there were 10 different shitty competing social media platforms and Facebook was dead. Hell, we already half live in that world. Facebook IS dead to a large number of people, and they are happily tearing themselves apart on other social media networks, big and small.
So yeah, I don't think you offer any solutions. I don't think any of the policies you listed will fix or even really touch the problem. You say "regulation", and sure, I agree that some regulation might very well be the answer, but the thing no one seems to have a good answer for is "what regulation". And by "what regulation", I don't just mean a wish list of mean things you want to do to Facebook because they suck; I mean things that are real, enforceable, will actually start to fix the problem, and are a power you are willing to entrust to your enemies when they gain control of the government.
Don't give me a wish list of bad things you want to do to shitty companies. Tell me, even in the vaguest of terms, what regulation you imagine will cause Facebook, Twitter, Reddit, TikTok, Instagram, and all of the other social media platforms that will take their place when they fall, to not profit off of the rage bait and conspiracy that is driving people crazy. I don't think there is any such regulation that can actually improve the situation, and I think that's why no one can describe anything other than a wish list of mean things they want to do to shitty companies. But seriously, prove me wrong and point me to a website, a policy paper, someone's confused ramblings, or whatever where someone describes the types of regulations that would keep a bunch of conspiracy and extremist insanity from running rampant across social media.
Huh? Sorry, what? I trailed off early when you basically said:
Well, if you happen to find one of those ideas that you think attacks the problem, feel free to point it my way. I'd like to know the answer too, and I'm pretty interested in anyone who can even begin to articulate an answer that they imagine being executed in the real world that they think would actually improve the situation. It's pretty hard to advocate for a policy that can't be described or articulated.
I don’t have an answer for you, but I did want to say that these are excellent questions, thank you, and if you keep on asking these questions politely, it will get more and more people thinking about possible solutions...
What regulation? See below, with the assumption that we agree that monopolies can pervert the exchange of information needed for a free market (either by hiding information that's vital to your decision to trade, especially pricing, or by removing entire portions of the marketplace from equal access so you have to be on their platform to see the things you might buy and sell). If we don't agree, and you want to be as great as our current monopolistic tech companies, you will have to rein them in so that you can unseat them and take their crowns.
So the standard mechanisms are:
* limit the extent of influence that single organisations or cartels can have by breaking monopolies up
* cap the percentage of the market that influential organisations can control
* tax the interactions between organisations so that there's a slowing effect on integration (coupling organisations together makes monopolies possible); and
* make everybody and every organisation pay their taxes so you can enforce protection from monopolies
I think those things are solid, and you know who has to do it: governments curate marketplaces. I'm unsure about tracking profit-per-employee and adding extra taxes to the extremely efficient organisations (sharing in their efficiency, when the free market is supposed to eat the extremely inefficient ones).
However, Facebook is a business-to-business concern, and voters are divorced from much of a stake in the firm. Most people are producing the content that feeds in; the people buying the adverts are the ones with a stake, in those adverts purchasing eyeball time and attention in service of sales. This disconnect makes it harder to put a story in front of voters that they can no longer share their photos or memes.
K3n.
You describe fine regulations if the only thing that upsets you about Facebook is its market share and influence. If the thing that upsets you is the extremism, conspiracy, and insanity in social media that is tearing society apart, your proposed regulations are useless.
I don't think that even in some fantasy land where the government executes every tech company that gets over a few billion dollars big, any of the problems of hyper-connectedness will go away. Rage-inducing conspiracy content works as well for smaller corporations as it does for multi-national juggernauts. It isn't like Facebook is making any of the content; they are just reposting it. A small company can do that too. Any idiot can do that, and then select for stuff that encourages engagement. I don't think a world of 20 Facebooks is all that much better. Hell, it might be worse, if for no other reason than that it is easier to regulate and keep an eye on one big company.
Wanting to burn down Facebook is like thinking that burning down one of the first printing presses would stop the chaos of the Reformation. Sure, the press might be turning out the inflammatory material causing the chaos, but that particular printing press isn't the problem; it's the technology. Lots of printing presses were burned, but it didn't stop anything. Facebook is a part of the problem, but it isn't THE problem. You can burn down Facebook, but social media will still be with us, and it will be just as rage-inducing.
Facebook's size, reach, and power are a huge part of the problem, but you are right that they are not the entire problem. I have seen a lot of the same problems with Twitter crop up over on Mastodon, consistently, and sustained targeted harassment happened well before the age of "social media". Societies need to both rein in powerful media companies and take a good hard look at themselves in the mirror. But every action k3ninho lists would help immensely; they would not be useless in massively reducing the power of media companies (and I include news and entertainment media in that). Hell, it would help break up the consolidation of corporate power across the board, if you can get a government with enough of a spine to do it and enforce it.
i quit fb a few years ago and i'm cool with it. this ny times tech newsletter hit my inbox the other day. i decided i had no idea what to tell "the regulator" to do - "make facebook stop making me feel bad" isn't useful, and it's one thing for a customer to say "stop publishing that bullshit" and quite another for the (us) government to do the same. until i can see a path i'm fine not participating.
i disagree with most of this article but i did think it'd be relevant on this forum. calling facebook essential is laughable. saying that choosing to align your pocketbook and your values is an aberration in history, like that's not a good thing, or even true, is sad. "obey! consume!"