“Technology does not create the problems. It reflects and reinforces – and often hides – already existing forms of inequality and hierarchy,” she said during her lecture.
“When we talk about remaking technology and society, we always want to open up the framework and look not only at the impact of technology – that’s only half the story – but the input, the previous configurations of history and society that are part of the training of these newfangled tools,” she added.
Benjamin’s visit to campus was sponsored by the Harold and Iris Chandler Lectureship Fund, which brings researchers to Bowdoin who look at the impact of technology on the humanities, social sciences, and our society. The event was also part of the College’s celebration of the tenth anniversary of its Digital and Computational Studies (DCS) program.
Mohammad Irfan, Bowdoin’s John F. and Dorothy H. Magee Associate Professor of Digital and Computational Studies and Computer Science, introduced Benjamin, describing her as an extraordinary teacher and scholar who writes books that are “at once powerful, hopeful, and even practical.”
Benjamin is the Alexander Stewart 1886 Professor of African American Studies at Princeton University. She is also the founding director of the Ida B. Wells Just Data Lab and the author of the award-winning book Race After Technology: Abolitionist Tools for the New Jim Code, among many other publications.
In her Bowdoin lecture, “Race to the Future? Reimagining the Default Settings of Technology & Society,” Benjamin emphasized the importance of education and academic programs like DCS to “radically expand who and what shapes our collective future. The issues are too important to leave it to those who only have the technological know-how when so many other forms of knowledge are needed as part of the process.”
Benjamin also argued that we need to broaden our framing of technology to include the people whose labor supports our consumer desires, such as Amazon warehouse workers, or those who perform the critical but invisible task of tagging data.
Part of the problem with our current notions of technology, she said, is that we remove ourselves as active agents, repeating two stock narratives that cast technology as a driver of either dystopia or utopia. “While these sound like opposing narratives and certainly have different endings, they actually share an underlying logic,” she said. “We can call it techno-deterministic logic, the idea that technology determines us. But the people behind the screen are missing from both scripts.”
She placed some of the burden of constructing a just technological future on scientists and scholars. “I want to emphasize this because a lot of times when we think about racism and other forms of domination, we associate it with people who are ignorant, people in the backwoods, the hills, the trailer park. When in fact it is those who have been in the citadels of learning who have created these stories, ideologies, and lies and spread them to the rest of us,” she said.
“The reason this is so important,” she added, “is because if science and scientists have been so essential in erecting this architecture, then it means that science and scientists, researchers and scholars, are responsible for dismantling it.”
She talked about how we could avoid the pitfalls of “discriminatory design”—when the knowledge and design that teaches and constructs AI is harmful and biased—by embracing, for example, “liberating technologies.”
As an example, she described Breonna’s Garden, “a digital public sphere” and immersive virtual reality experience created in memory of Breonna Taylor, a 26-year-old Black woman killed by police in her apartment. The garden, which can be downloaded from the App Store, allows people to share their sorrows and hopes. The project was created by developers who worked with Taylor’s family to turn their pain into purpose, Benjamin said.
“Developers listened to the family and checked in with them during every step of building the platform,” she continued.
In Breonna’s Garden, “they created a really beautiful, immersive space that we can put in the category of a liberating digital space.” When building such products, she said, designers should ask, “Who do you consult? Who do you involve? Are the people most affected by the problem your technology is supposed to solve part of the process?”
Other positive developments she encouraged the audience to explore include President Biden’s AI Executive Order and the European Union’s AI Act. (But she cautioned students about finding “loopholes” that could exclude or even harm some communities.)
In terms of “bottom-up organizing, community-building efforts,” she mentioned The Data Nutrition Project, which applies the model of nutrition labels—which tell us what is in our food—to datasets, so developers can see what a dataset contains before using it to train a model. “So you see the biases and omissions that can be in it,” Benjamin said.
Another hopeful example is a movement called “consensual technology,” which applies the FRIES framework for sexual consent to technology and to the way data is developed, stored, accessed, and interacted with. FRIES stands for freely given, reversible, informed, enthusiastic, and specific.
Benjamin purposefully ended her talk on an optimistic note: “I promised to be a little hopeful!” she said. “If inequality is woven into the very fabric of our society — in policing, education, health care, and work — we can feel overwhelmed. But in my view, all of these become fronts for change, places where we can remake the status quo.”