Nina Dewi Toft Djanegara


I am the Associate Director of the Technology & Racial Equity Initiative at Stanford University's Center for Comparative Studies in Race & Ethnicity.

I am also pursuing my PhD in the Anthropology Department at Stanford University, where I use ethnographic methods to better understand the intersection of politics and technology. In particular, I examine how surveillance technology — such as facial recognition and biometric identification — is applied to border management and law enforcement.

The Objective
Why are women of color erased from their own science stories?

Scientific facts cannot be separated from the context of their production or the identities of the researchers who discovered them. Without Joy Buolamwini’s personal experience of going undetected by facial recognition, researchers might never have realized that these algorithms can be riddled with bias.

Fast Company
How 9/11 sparked the rise of America's biometrics security empire

In the highly charged months after 9/11, identity was framed as a matter of national security, and biometrics was positioned as the solution. Now, however, the wisdom of this program is being called into question, especially with the recent news that sensitive biometric records collected by U.S. troops may have fallen directly into the hands of the Taliban.

Fast Company
The great misunderstanding at the core of facial recognition

Ultimately, any computer-vision project is based on the premise that a person’s outsides can tell us something definitive about their insides. These are systems based solely on appearance, rather than identity, solidarity, or belonging. And while facial recognition may seem futuristic, the technology is fundamentally backward-looking, since its functioning depends on images of past selves and outmoded ways of classifying people. Looking forward, instead of asking how to make facial...

Privacy International Special Report
Biometrics and counter-terrorism: Case study of Iraq and Afghanistan

"Detecting adversaries and “denying anonymity” became a matter of national security. That is to say, Iraq and Afghanistan were not merely sites where biometric information was collected; the DOD’s biometrics policies and practices represented a political and policy shift, which set precedent for more recent intelligence and counter-terrorism operations, such as the collection of biometric data from suspected ISIS fighters and affiliates in Raqqa."

Technology Pill podcast
Biometrics domination under the pretext of combating terrorism

"Biometric data collection and use in the name of countering terrorism has been accelerating around the globe, often abusively, without being effectively regulated or subject to accountability mechanisms. This week we talk to Fionnuala Ní Aoláin, UN Special Rapporteur on Human Rights & Counter-Terrorism, Nina Dewi Toft Djanegara about biometrics in Afghanistan and Iraq, and Keren Weitzberg about uses in Somalia and Palestine."

AI Now Institute
A New AI Lexicon: Recognition

I was inspired to reflect on the different meanings of identity and recognition when Noor, an L.A.-based anti-surveillance activist, told me, “That’s how we defeat surveillance…instead of watching each other, seeing each other.” Noor’s words helped me understand how seeing is about mutual understanding and validation, while watching is about objectification and alienation.

The Power to Selectively Reveal Oneself: Privacy Protection among Hacker-activists

"Privacy advocates and hacker activists (‘hacktivists’) oppose biometric technologies, such as electronic fingerprinting and facial recognition, that are increasingly used to sort and identify people. This paper draws from ethnographic fieldwork among technology experts and hacktivists in Denmark and Germany to explore how these individuals work to resist the encroachment of surveillance technologies in their physical and digital lives."

Anthropology News
Watching Our Words

"Creepiness gestures at the affective nature of privacy intrusions, an unease that is felt at the level of the body. Data collection becomes creepy when it breaches social norms about intimacy, transparency, consent and trust. While the recent EU GDPR has outlined stricter standards for the collection and storage of personal data, creepiness exceeds legal and technical definitions of data (in)security. Framing intrusions in terms of creepiness rather than illegality, injustice or other...

Real Life
Take the Wheel

"The development of autonomous cars is enmeshed with contemporary debates about emerging technologies, such as the future of artificial intelligence, the automation of labor, government regulation of technology, and the ever-increasing trust in computation as the wellspring of truth and safety."