The age of artificial intelligence is upon us—from the cars we drive to the way we do business, it’s everywhere. As companies increasingly rely on algorithmic decision-making and facial recognition technology becomes more widespread, researchers have pierced the veneer of technological neutrality and revealed troubling concerns about surveillance, privacy, and bias. Learn more about the ways algorithmic decision-making and facial recognition technologies reinforce human bias and racism, and the work being done to support algorithmic accountability.
Timnit Gebru is a computer scientist and the technical co-lead of the Ethical Artificial Intelligence Team at Google, where she works on algorithmic bias and data mining. She is an advocate for diversity in technology and the cofounder of Black in AI, a community of Black researchers working in artificial intelligence.
AI, Ain't I A Woman, a film by Joy Buolamwini (2018, 4 min.); learn more at the Algorithmic Justice League.
Looking for more? Here are some of the resources you'll hear about in Dr. Gebru's presentation:
Black in AI
Founded by Dr. Timnit Gebru, Black in AI (BAI) is a multi-institutional, transcontinental initiative creating a space for sharing ideas, fostering collaborations, and discussing initiatives to increase the presence of Black individuals in the field of AI.
Algorithmic Justice League
The Algorithmic Justice League is an organization that combines art and research to illuminate the social implications of artificial intelligence.
The Perpetual Line-Up
From the Georgetown Law Center on Privacy & Technology, The Perpetual Line-Up is the result of a yearlong investigation and over 100 records requests to police departments around the country—the most comprehensive survey to date of law enforcement face recognition and the risks it poses to privacy, civil liberties, and civil rights.
America Under Watch
Also from the Georgetown Law Center on Privacy & Technology, America Under Watch looks at how biometric surveillance technologies are used in various cities.
A Critical Summary of Detroit’s Project Green Light and Its Greater Context
Released by the Detroit Community Technology Project, this report looks at the history of Detroit's recent real-time crime surveillance program and its relationship to facial recognition. Research for this paper was conducted by Noah Urban, Jacob Yesh-Brochstein, Erica Raleigh, and Tawana Perry.
Our Data Bodies
Our Data Bodies is a mixed-method participatory research project that explores the nature and experience of digital privacy and “data rights” in three cities facing historic forms of socioeconomic disparity.
Data for Black Lives
Data for Black Lives is a movement of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black people.
Race After Technology, by Ruha Benjamin, 2019
In Race After Technology, Ruha Benjamin argues that automation, far from being a sinister story of racist programmers scheming on the dark web, has the potential to hide, speed up, and deepen discrimination while appearing neutral and even benevolent compared to the racism of a previous era. Benjamin is an associate professor in the Department of African American Studies at Princeton University, where she studies the social dimensions of science, technology, and medicine; race and citizenship; and knowledge and power.
Algorithms of Oppression: How Search Engines Reinforce Racism, by Safiya Noble, 2018
Dr. Safiya Umoja Noble is an associate professor in the Department of Information Studies at the University of California, Los Angeles (UCLA), where she serves as the co-director of the UCLA Center for Critical Internet Inquiry. In Algorithms of Oppression, she challenges the idea that search engines like Google offer an equal playing field for all ideas, identities, and activities.
Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, by Mar Hicks, 2018
Mar Hicks is a historian of technology, gender, and labor, specializing in the history of computing. In Programmed Inequality, they explore how labor feminization and gendered technocracy undercut British efforts to computerize.
Lessons from Archives: Strategies for Collecting Sociocultural Data in Machine Learning
A paper co-authored by Eun Seo Jo and Timnit Gebru.
Seeta Peña Gangadharan is an associate professor in the Department of Media and Communications at the London School of Economics and Political Science. Her work focuses on inclusion, exclusion, and marginalization, as well as on questions around democracy, social justice, and technological governance.