Dr. Ruha Benjamin identifies two contrasting stories about technology: one portrays technology as our savior, the other as our slayer.
“Both stories assume technology is in the driver’s seat,” Benjamin said. “Rather than agonizing about a coming dystopia or longing for a future utopia, we have to reckon with an ‘UStopia,’ a word that describes the fact that the future is us.”
As a part of Harvey Mudd College’s “Being Human in the Age of AI” Nelson Speaker Series, Dr. Ruha Benjamin presented her lecture “From Artificial Intelligence to Collective Wisdom” on Oct. 19. Benjamin discussed how inequitable practices long present in non-digital industries have bled into digital technologies, most recently in AI.
Dr. Benjamin, Alexander Stewart 1886 Professor of African American Studies at Princeton University, has authored four books and is the founding director of the Ida B. Wells JUST Data Lab.
At the start of the talk, Benjamin invited the audience to consider the costs of collectively subscribing to the imposed visions of the few.
“Millions [are] being poured into the future of medicine, but healthcare for all [is] somehow farfetched,” Benjamin said. “[I’m speaking of] … a lopsided imagination, where we can imagine regenerating sick bodies and not an ailing body politic.”
This conversation on healthcare inequities brought Benjamin to the story of Georges Cuvier, a French naturalist who used physical characteristics to claim that Black people were inferior.
“Scientists are creating these distortions,” Benjamin said. “We know the academy created it, naturalized it, diffused it, so we are responsible for tackling it head on.”
Recognizing the racism that has historically pervaded the sciences, she cautioned the audience to be wary of the often empty promise of diversity and inclusion in oppressive systems.
“Diversity and inclusion doesn’t necessarily mean liberation or fairness,” Benjamin said. “[People of color] can be included into harmful and violent systems … We have to look beyond the buzzwords to develop a new vision to meet our social reality.”
Benjamin proceeded to use hostile architecture as a metaphor for coded inequity. She pointed to benches fitted with spikes, found in places such as the United Kingdom and China, including designs in which the spikes retract only when a user pays for each 15 minutes of sitting.
She pointed out that in the digital world, these metaphorical spikes are often built into hierarchical structures. She cited the widely used Student Success predictor score, which is more likely to assign Black students higher risk scores, as an example of society’s hidden spikes.
“Why are we labeling students medium, low or high?” Benjamin said. “We should label the departments, the fields that are producing the risk … Always we should [consider] from whose perspective are the questions being asked and the data being collected.”
Turning to how these inequities have migrated into the technological sphere, she introduced her term “the New Jim Code” to describe how a history of exclusion produces discriminatory inputs and outputs in technology.
“Here, we’re talking about the ways in which the imagined objectivity of technology is coupled with coded bias in order to obscure forms of containment,” Benjamin said.
Benjamin then offered guidance on ways we can rethink the impact of technology and challenge existing power structures.
Currently, leaders are attempting to regulate AI through measures like the AI Bill of Rights and the AI Accountability Act. Benjamin said this reckoning should not be only a top-down process; she advocated for building “consentful” technology, which applies the principles of consent to data storage, access and user interaction.
Benjamin concluded her lecture with a call to action.
“If inequity is woven into the very fabric of society, then each twist, coil and code is a chance for us to weave new patterns, practices, politics,” she said. “Its vastness will be its undoing once we accept that we are patternmakers.”
During the Q&A after the lecture, Bradley Gommiah HM ’24 asked how the power structures of Big Tech operate.
“Is it tech-driven versus people-driven, or is it rich people-driven versus the rest of us-driven?” Gommiah asked.
Benjamin emphasized that the harms of inequitable technology are universal.
“The elite know the harm they’ve created,” Benjamin said. “How do we design differently? Some call it public interest technology or decolonizing technology … we should be thinking about how to create different approaches that are sustainable … Instead of platform capitalism, think about a cooperative approach to [an] economic model for creating digital tools.”
Seohyeon Lee PO ’25 said the lecture reflected her concerns about her potential career path.
“As a student studying computer science and mathematics, I do have a fear that I will be sucked into a career that will be contributing to implicit harm that’s baked into technology,” Lee said.
Zachary Dodds, a computer science professor at Mudd, felt empowered by Benjamin’s final call to action.
“This vision … is part of the framing of everything we do as a professor,” Dodds said. “I found it very inspiring to think that at every moment, there’s an opportunity to embrace [these] truths and perspectives.”