Wartime in the ‘digital wild west’ 

Ellery Roberts Biddle


As Israel continues its advance into Gaza, the need for oversight and accountability around what appears on social media feels especially urgent. Forget for a minute all the stuff online that’s either fake or misinformed. There are reams of real information about this war that constantly trigger the censorship systems of Big Tech companies. 

Consider the subject of terrorism. The biggest players all have rules, many of them aligned with national laws, against content that comes from terrorist groups or promotes their agendas. This might sound uncomplicated, but the governing entity in Gaza, for instance, is Hamas, a designated terror organization in the eyes of Israel and, even more importantly, the U.S., home to the biggest tech companies on earth. Experts on digital censorship have expressed well-founded fears that between Big Tech’s self-imposed rules and regional policies like the EU’s Digital Services Act, companies could be censoring critical information such as evidence of war crimes or making it impossible for people in the line of fire to access vital messages.

Although the stakes here couldn’t be higher, we also know that content moderation work is too often relegated to tiny teams within a company or outsourced to third parties.

Companies are typically coy about how this works behind the scenes, but in August the Digital Services Act went into effect, requiring the biggest of the Big Techs to periodically publish data about what kinds of content they’re taking down in the EU and how they’re going about it. And last week, the companies delivered. The report from X showed some pretty startling figures about how few people are on the front lines of content moderation inside the company. It’s been widely reported that these teams were gutted after Elon Musk took over a year ago, but I still wasn’t prepared for the actual numbers. The chart below shows how many people X currently employs with “linguistic expertise” in languages spoken in the EU.

X has expertise in fewer than half of the bloc’s official languages, and for most of them, it employs literally one or two people per language. The languages with teams in the double digits are probably explained by a combination of regulation, litigation and political threats that have tipped the scales in places like Germany, Brazil and France. But for a company with this much influence on the world, the sheer lack of people is staggering.

Industry watchers have jumped all over this. “There is content moderation for the English-speaking world, which is already not perfect, and there is the Digital Wild West for the rest of us,” wrote Roman Adamczyk, a network analyst who previously worked with the Institute for Strategic Dialogue. “Will this change in light of the 2024 elections in Finland, Lithuania, Moldova, Romania and Slovakia?” asked Mathias Vermeulen, director of the privacy litigation group AWO. Great question. Here are a few more, in no particular order:

What are people who speak Hungarian or Greek — of whom there are about 13 million each in the EU — supposed to make of this? What about all the places in the EU where the Russian language has a big presence, sometimes of the fake news variety? What happens if the sole moderator for Polish gets the flu? Is there any recourse if the two moderators for Hebrew, whose jobs I seriously don’t envy right now, get into an argument about what counts as “terrorist” content or “incitement to violence”? These moderators — “soldiers in disguise” on the digital battlefield, as one Ethiopian moderator recently put it to Coda — have serious influence over what stays up and what comes down.

After reading accounts from moderators working through Ethiopia’s civil war, I shudder to think of what these staffers at X are witnessing each day, especially those working in Arabic or Hebrew. The imperative to preserve evidence of war crimes must weigh heavily on them. But ultimately, it will be the corporate overlords — sometimes with their hands forced by governments — who decide what gets preserved and what will vanish.

GLOBAL NEWS

Elon Musk has once again been taking potshots at the world’s largest online encyclopedia. Two weeks back, he poked fun at the Wikimedia Foundation’s perennial donation drive and then jokingly considered paying the foundation $1 billion to change the platform’s name to — so sorry — “Dickipedia.” It is hard to know where to begin on this one, except to say that while Wikipedia functions on a fraction of the budget that X commands, it takes things like facts and bias a lot more seriously than Musk does and supports 326 active language communities worldwide. In the meantime, Wikipedia’s fate in the U.K. still hangs in the balance. Regulators are sorting out the implementation of the country’s new Online Safety Act, which will require websites to scan and somehow remove all content that could be harmful to kids before it appears online. There’s a lot wrong with this law, including the fact that it will inspire other countries to follow suit.

One recent copycat is Sri Lanka, where the parliament is now considering a bill by the same name. Although proponents say they’re trying to help protect kids online, Sri Lanka’s Online Safety Bill treads pretty far into the territory of policing online speech, with an even broader mandate than its British counterpart. One provision aims to “protect persons against damage caused by communication of false statements or threatening, alarming or distressing statements.” Another prohibits “coordinated inauthentic behavior” — an industry term that covers things like trolling operations and fake news campaigns. A committee appointed by Sri Lanka’s president gets to decide what’s fake. Sanjana Hattotuwa, research director at the New Zealand-based Disinformation Project, has pointed out the clear pitfalls for Sri Lanka, where digital disinfo campaigns have been a hallmark of national politics for more than a decade. In an editorial for Groundviews, Hattotuwa argued that the current draft will lead to “vindictive application, self-serving interpretation, and a license to silence,” and predicted that it will position political incumbents to tilt online discourse in their favor in the lead-up to Sri Lanka’s presidential election next year.

Greek lawmakers pushed through a ban on spyware last year, after it was revealed that about 30 people, including journalists and an opposition party leader, were targeted with Predator, mobile surveillance software made by the North Macedonian company Cytrox. But efforts to get to the bottom of the scandal that started it all — who bought the spyware, and who picked the targets? — have been stymied, thanks in part to the new conservative and far-right elements in parliament. The new government has overhauled the independent committee that was formed to investigate the spyware scandal, in what opposition lawmakers called a “coup d’état.” And now two of the committee’s original members are being investigated over allegations that they leaked classified information about the probe. When it comes to regulating — in this case, banning — spyware, EU countries probably have the best odds of actually making the rules stick. But what’s happened in Greece over the last 18 months shows that it’s still an uphill battle.

WHAT WE’RE READING

  • Wired’s Vittoria Elliott has a new report on the rise of third-party companies that provide what’s known in the tech industry as “trust and safety” services. A key takeaway of the piece is that when companies outsource this kind of work, it means they’re “outsourcing responsibilities to teams with no power to change the way platforms actually work.” That’s one more thing to worry about.
  • Beloved sci-fi writer and open internet warrior Cory Doctorow brought us a friendly breakdown this week of some really important legal arguments being made around antitrust law and just how harmful Amazon is to consumers and sellers alike. In a word, says Doctorow, it is “enshittified.” Read and learn.
