Fleeing war? Need shelter? Personal data first, please

Ellery Roberts Biddle


More people have been displaced by violence and natural disasters over the past decade than ever before in human history, and the numbers, which already exceed 100 million, keep climbing. Between ongoing conflict in the Democratic Republic of Congo, Pakistan’s mass expulsion of people of Afghan origin and Israel’s bombardment of Gaza, millions more people have been newly forced from their homes since October.

When people become displaced en masse, organizations like the U.N., with its World Food Program and refugee agency, often step in to help. But today, often before they distribute food or medicine, they collect refugees’ data. Fingerprints, iris scans and even earlobe measurements are now a common requirement for anyone seeking to meet their basic needs.

This week I caught up with Zara Rahman, a tech and social justice researcher who studies the push by international humanitarian and intergovernmental organizations like the U.N. and the World Bank to digitize our identities.

“Of course, U.N. agencies are trying to figure out how much food and what resources we need,” Rahman told me. But “the amount of information that is being demanded and collected from people in comparison to what is actually needed in order to provide resources is just wildly different.” 

In “Machine Readable Me: The Hidden Ways Tech Shapes Our Identities,” her new book on the global push to digitize our lives, Rahman looks at the history of data collection by governments and international agencies and what happens when their motives change or data falls into the wrong hands. Nazi Germany is a top pre-digital case study here — she has a great passage about how members of the Dutch resistance bombed Amsterdam’s civil registry office during World War II to prevent the Nazis from using the registry to identify and persecute Jews.

She then leaps forward to Afghanistan, where U.S. occupying forces deployed data collection systems that were later seized by the Taliban when they swept back into power in 2021. These databases gave Taliban leadership incredibly detailed information about the lives of people who worked for the U.S. government — to say nothing of women, whose lives and opportunities have been entirely rewritten by the return to Taliban rule. We may never know the full extent of the damage done.

Data collection and identity systems are also used, or could be used, to persecute and victimize people whose nationality is contested, like many of those being expelled right now from Pakistan. Rahman emphasized that what happens to these people may depend on who the state perceives them to be and whether they are seen as people who “should return to Pakistan at some point.”

Rohingya Muslims, she reminded me, were famously denied citizenship, and the documentation to match, by the Myanmar government for generations. Instead, in the eyes of the state, they were “Bengalis” — an erroneous suggestion that they had come from Bangladesh. In 2017, hundreds of thousands of Rohingya people fled the Burmese military’s ethnic cleansing operations in western Myanmar and landed in Bangladesh, where the government furnished them with IDs saying that they were from Myanmar, thereby barring them from putting down roots in Bangladesh. In effect, both countries leveraged their identity systems to render the Rohingya people stateless and wash their hands of this population.

What recourse do people have in such circumstances? For the very rich, these rules don’t apply. People with deep pockets rarely find themselves in true refugee situations, and some even purchase their way to citizenship — in her book, Rahman cites a figure from Bloomberg, which reported that “investor-citizens spent $2 billion buying passports in 2014.” But most of the tens of millions of people affected by these systems are struggling to survive — the financial and political costs of litigating or challenging authorities are entirely out of reach. And with biometric data part of the package, so is the option of slipping through the system or establishing yourself informally. Your eyes are your eyes and can be used to identify you forever.

GLOBAL NEWS

Facial recognition tech is a key tool in China’s campaign against ethnic Uyghurs. This isn’t news, but the particular ways in which Chinese authorities deploy such tools to persecute Uyghur people, most of whom are Muslim, continue to horrify me. It came to light recently that Hikvision, the popular surveillance camera maker that offers facial recognition software, won a state contract in 2022 to develop a system that conducts “Assisted Analysis Of Ethnic Minority Students.” It’s worth noting that Hikvision has in the past boasted of its cameras’ ability to spot “Uyghur” facial features, a brag that helped get the technology blacklisted in the U.S. But while you can’t buy it here, it’s pretty common across Asia, Africa and even in the U.K. The recently leaked tenders and contracts, published by IPVM, show that the company developed tools that alerted Chinese authorities to university students who were “suspected of fasting” during Ramadan, and that monitored travel plans, the observation of holidays and even what books ethnic minority students checked out at the library. Paging George Orwell.

Israel is also doubling down on facial recognition and other hardcore surveillance tech, after its world-renowned intelligence system failed to prevent the deadly attacks of October 7. In the occupied West Bank, Palestinians report that their daily movements are being watched and scrutinized like never before. That’s saying a lot in places like the city of Hebron, which has been dotted with military checkpoints, watchtowers and CCTV cameras — some of them supplied by Hikvision — for years now. In a dispatch this week for Wired, Tom Bennett wrote about Wolf Pack, a digital profiling and facial recognition database that allows military officers to pull up detailed profiles of any Palestinian in the territory simply by scanning their face. In a May 2023 report, Amnesty International asserted that whenever a Palestinian person goes through a checkpoint where the system is in use, “their face is scanned, without their knowledge or consent.”

Some of the world’s most powerful tech companies are either headquartered or present in Israel. So the country’s use of technology to surveil Palestinians and identify targets in Gaza is a burning issue right now, including for engineers and tech ethics specialists around the world. There’s an open letter going around, signed by some of the biggest names in the responsible artificial intelligence community, that condemns the violence and the use of “AI-driven technologies for warmaking,” the aim of which, they write, is to “make the loss of human life more efficient.” The letter covers a lot of ground, including the surveillance systems I mentioned above and Project Nimbus, the $1.2 billion deal under which Amazon and Google provide cloud computing services to the Israeli government and military. Engineers from both companies have been advocating for their employers to cancel that contract since it first became public in 2021.

The letter also notes the growing pile of evidence of anti-Palestinian bias on Meta’s platforms. Two recent standout examples are Instagram’s threat to suspend the account of acclaimed journalist Ahmed Shihab-Eldin over a video he posted that showed Israeli soldiers abusing Palestinian detainees, and the shadowbanning of digital rights researcher Mona Shtaya after she posted a link to an essay she wrote for the Middle East Institute on the very same issue. Coincidence? Looking at Meta’s track record, I very much doubt it.

WHAT WE’RE READING

  • I’ve written a few times about how police in the U.S. have misidentified suspects in criminal cases based on faulty intel from facial recognition software. Eyal Press has a piece on the issue for The New Yorker this week that asks whether the technology is pushing aside older, more established methods of investigation, or even leading police to ignore contradictory evidence.
  • Peter Thiel is taking a break from democracy — and he won’t be bankrolling Trump’s 2024 presidential campaign. Read all about it in Barton Gellman’s illuminating profile of the industry titan for The Atlantic.
