A Billion Nazis at the Table
If you're too big to moderate, you are not responsible enough to moderate
“If there's ten people and one Nazi at the table, you’ve got a table with eleven Nazis.”
— German aphorism
If you encounter a Nazi on social media (whose account isn’t banned within a few hours), it means one of two things:
The owner is OK with Nazis.
The owner is bad at moderating.
This may appear unfairly black-and-white at first, but there really is no middle ground. Either the person in charge is aware of the Nazi and chooses not to ban them, or they want to ban Nazis but for some reason aren’t aware of them, in which case they are failing to do their job.
Part of your brain right now is probably searching for excuses: “Nazis employ coded language! It’s impossible to search through billions of posts and identify Nazis with a 100% success rate! Non-Nazis could get accidentally banned!”
None of these excuses justifies it, though, and here’s why:
These are private platforms. Nobody is forcing them to do anything. If they are “too big to moderate,” it’s because they’ve decided to be that way. Unlike the government, corporations are not obligated to permit universal freedom of speech. They’ve chosen to be the way they are, and they alone are responsible for it.
It’s actually not hard at all to have a social gathering without Nazis. If you’ve ever hosted a potluck and none of the guests were spouting antisemitic and/or authoritarian talking points, congratulations! You’ve achieved what some of the most valuable companies in the world claim is impossible.
Private Parties
When online spaces seem complicated and confusing, thinking about them like IRL social spaces can help cut through the marketing and excuses. For example, at a house party you may have one room with people cracking jokes; in another, people sharing advice; in yet another, people debating politics. The host creates the space for socializing, and it’s their job for the evening to make sure everyone is having a good time.
If a party isn’t fun, if people aren’t able to be social, it is a bad party. There are myriad reasons why a party may be disrupted —someone starts loudly spewing Nazi stuff maybe— and when that happens it is the responsibility of the host to take immediate steps to fix the problem and make it fun again.
Guests of course may intervene on the host’s behalf and solve the problem —thus making the party fun again— but they shouldn’t have to. Guests are there to socialize, and after a few cases of having to take responsibility for a bad host, a reasonable guest will just stop attending that host’s parties. Few people would go back to a restaurant where the responsibility of preventing the cook from spitting in the food falls to the customer while the manager sits idly by. Few non-Nazis would continue to attend social gatherings where there’s a strong possibility of having the awkward job of kicking Nazis out.
This gets to the “paradox of tolerance” that the quote at the top of this essay is referencing. If a party allows Nazis, it will inevitably become a party for Nazis, as those who won’t acquiesce quickly tire of the burden of managing the unmanageable and stop attending. But even if a host doesn’t want Nazis and yet is unable to keep them out, the social effects are the same. If everyone knows the host is unable to responsibly monitor their own parties, either because the parties got too big or the host got too drunk, eventually most sane guests just… won’t attend anymore.
What I want to illustrate is that the excuse of being “too big” to effectively moderate oneself shouldn’t apply when nobody is forcing social media companies to grow to these gargantuan sizes in the first place. If the person in charge of keeping Nazis off a platform is unable to consistently identify context-dependent and coded Nazi language, it means they are bad at their job. If you can’t throw a dinner party without accidentally inviting Nazis, you are bad at dinner parties. Similarly, if you are unable to run a social media company that bans Nazis quickly, you aren’t responsible enough to be running a social media company.
The “balance” a lot of the big companies strike (instead of paying for human moderators, or simply not growing too big to moderate) is to use far-from-perfect automated tools combined with a handful of humans to remove just enough of the Nazi stuff so as not to give off “Nazi site” vibes, while hoping users will put up with some Nazis. They are hoping that you will sit at their table and tolerate some Nazis.
Will you?
Addendum: A Seat at The Table
The Fediverse model demonstrates that it’s actually very easy to run a social media platform without Nazis running roughshod over it. Mastodon instances don’t generally have any unwanted guests because (unlike corporate social media) there’s zero incentive to grow beyond the ability to self-moderate. If an instance were to become known for hosting Nazis —either through malice or an incompetent owner— other, more responsible instances would simply de-federate (cut themselves off) from the Nazi instance until it got its shit together. Problem solved, and no “trust and safety dance” required!
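To make “de-federate” concrete: a Mastodon admin can block an entire remote domain, which severs federation with it in one step. Here is a minimal sketch of what that might look like through Mastodon’s admin API; the instance URL, access token, and target domain are placeholders, and the endpoint and fields assume a reasonably recent Mastodon release that exposes admin domain blocks.

```python
# Illustrative sketch only: de-federating from a problem instance via
# Mastodon's admin API. All values below are placeholders.
import requests

INSTANCE = "https://example.social"   # your own instance (placeholder)
TOKEN = "ADMIN_TOKEN"                 # token with admin:write:domain_blocks scope
BLOCKED_DOMAIN = "nazis.example"      # the domain to cut off (placeholder)

resp = requests.post(
    f"{INSTANCE}/api/v1/admin/domain_blocks",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "domain": BLOCKED_DOMAIN,
        "severity": "suspend",  # stop federating with this domain entirely
        "public_comment": "Hosts Nazis; de-federated until they clean up.",
    },
)
resp.raise_for_status()
print(f"De-federated from {BLOCKED_DOMAIN}")
```

In practice most admins apply the same block from the admin web interface rather than a script; the point is that cutting off a bad-faith instance is a single, well-defined action, not an impossible moderation problem.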
One might assume that after the rug pull Elon did with Twitter, journalists and media types would want control over the platforms that distribute their work. But for whatever reason that doesn’t seem to be a selling point, as many are instead turning to Threads (the latest product from the genocide-enabling company Meta) and the startup company BlueSky, whose lack of moderation tools is addressed in this mildly technical but still interesting analysis by ethical hacker and activist Michał "rysiek" Woźniak:
The only way to effectively fight harassment in a social network is effective, contextual moderation. The Fediverse showed that having communities, which embody that context and whose admins and moderators focus on protecting their members, is pretty damn effective here. This is exactly what [BlueSky] is not doing.
“Neutrality” and “speech” and “voice” and “protection from bans” is mentioned right there, front and center, in BS’s overview and FAQ. At the same time moderation and anti-harassment features are, at best, an afterthought.
Basically, BlueSky’s method for dealing with Nazis is to do nothing and rely on users to… argue them away, I guess? (I’m reminded of an adage about wrestling with pigs.) Threads, on the other hand, seems to be taking the opposite approach: trying to make a sterile, “friendly” place full of ads and free of politics, which of course isn’t working, because Nazis will soapbox wherever they are allowed.
For the record: I don’t fault Joe Emerging Artist for posting his work to 𝕏, Instagram, or Reddit. The reality is that these companies have a monopoly on the deciding-who-gets-attention market and, for better or worse, represent the only feasible path towards financial independence for emerging creatives who aren’t born into wealth.
…But I do fault every well-established media company, journalist and otherwise influential person who continues to contribute their efforts and content exclusively to these platforms, knowing full well the kind of intolerance they offer sanctuary to. A year or two ago perhaps most people could be given a pass, but today I see no excuse. Leading one’s following to a place like 𝕏 is irresponsible at best, and these professionals are smart enough to know better.
I dream of a world where the media companies grow a spine and take back control of their own means of distribution by endorsing open platforms. If the Place to Talk About Things becomes owned and operated by the companies and communities who want to talk about things, nobody will have to tailor their headlines to appease Elon’s or Zuck’s toxic algorithms, or be forced to sit at the table with Nazis because it’s the only table available. We know now that we don’t need social media companies in order to be social about the media. We can do it on our own, and not only that, we can do it better.
Consider for a moment that even if automated tools did function perfectly, you would still be limiting yourself to a single perspective of what constitutes allowed speech. Why put our faith in the good intentions of Mark “genocide enabler” Zuckerberg when we don’t have to?
Meanwhile, he now says the word “decolonize” is hate speech. Gotta hand it to the guy: he seems to like ethnic violence no matter what ethnicity the victims are.