

When it comes to tech, think inputs before impacts

By Joe Arney
Photos by Malinda Miller (Engl, Jour'92; MJour'98) and Annie Hergenrader

There is perhaps no more coveted title among the disruptors of Silicon Valley than "innovator."

It's a term Ruha Benjamin wants to take back.

The tech set, she argues, has continued to create products and services that offer convenience and demonstrate the power of imagination. But in doing so, it has advanced discrimination by ignoring a history of technical advances that came at the expense of marginalized groups.

"Those who've created the technical problems that we see here … are constantly putting themselves in the position of the thought leaders for the very problems they've been instrumental in creating," said Benjamin, the Alexander Stewart 1886 Professor of African American Studies at Princeton University and founding director of its Ida B. Wells Just Data Lab.

"One of the things we could do is to stop calling them innovators. It shouldn't count as innovation if you're not wrestling and dealing with the social, ethical and political implications of your work."

Benjamin visited the University of Colorado Boulder on March 6 to help launch the Center for Race, Media and Technology at the College of Media, Communication and Information, while also serving as the first speaker in the college's Distinguished Lecture Series.

A need for courageous voices

In introducing Benjamin and the speaker series, Nabil Echchaibi, associate dean of creative and scholarly work at CMCI, talked about a world of growing violence and fear, disinformation, and complicity in the face of hostile powers and technologies.

"In this world, we need more than ever courageous voices, voices that speak truth to power, voices that slow things down so we can see things again," Echchaibi said.

It's a fitting overview of key themes in Benjamin's work. Her four books draw on her extensive research into technology and racism, including what she calls the "New Jim Code," a nod to both the Jim Crow laws that enforced segregation and the biases encoded into technology. She implored students to play an active role in questioning assumptions about technology and the world around them by bringing historical and cultural contexts to the forefront.

As an example, she showed students something decidedly low tech: a park bench with armrests. She noticed the benches on a trip to the Bay Area, and while she appreciated the convenience, the armrests had a hidden function, deterring people from sleeping on the benches, in an area struck by high homelessness amid vast income inequality.

It got worse: she showed single-seat benches, benches that were caged overnight and a coin-operated bench with retractable spikes on the seat.

Fortunately, that last one was an art project, but Benjamin called it a strong metaphor for something "that's nominally for everyone, but with forms of harm and exclusion beneath the surface." Technology, she said, often plays by the same rules, whether it's algorithms that make assumptions about a student's test scores based on their ZIP code, or facial recognition technology, deployed in consumer products and by police, that struggles to correctly identify Black faces.

Technology as a mirror

Notably, technology isn't the problem, she said. Rather, the technology is simply reflecting the biases of the people who create and code it.

"In every conversation about technology, we have to start not with technology's impact, but with the social inputs," Benjamin said. "What's being used to train, to design; what are the values, the ideas, the preexisting forms of hierarchy that are shaping that design process.

"Computational depth without the social and historical depth is, in my view, superficial learning, not deep learning."

Benjamin covered a range of concepts around artificial intelligence and its hidden costs (labor, societal and environmental), but also challenged students to find ways to use these tools to create positive societal change. She pointed to a project that aimed to use A.I. to better assess patient pain, whose design team took into account that doctors have historically underdiagnosed pain in Black patients.


So rather than train the A.I. on doctors' notes, they had it crawl through patient-reported data on the pain they experienced, improving accuracy and removing bias.

"This is an example of questioning where we go looking for knowledge and data," Benjamin said, "and instituting that idea of social and historical literacy into our framing of the problem, which leads us down a different road and gets us perhaps better results."

In other words, be imaginative about possibility, and don't get locked into what she called "the default settings."

"We are in many ways trapped inside the lopsided imaginations of those who monopolize power and resources to benefit the few at the expense of the many," she said. "The fantasies of these futurists too often rest on the nightmares of others. … How might the power of our collective imagination begin to transform the world around us?"

Bryan Semaan, founder of CMCI's Center for Race, Media and Technology, had hoped Benjamin would be available to formally launch the center, and said her lecture "was beyond anything I could have imagined."

"What I really hope people took away from this is a new sensibility and ethic for how they can approach every facet of their everyday life," said Semaan, associate professor of information science and associate chair for undergraduate studies. "So much of what Ruha pointed out are things that exist in our world that are super invisible to many, but hyper visible to others. And I hope people left inspired, and with a greater understanding that not everyone is living in their world in the same way and experiencing life in the same way, but that together we can address these widescale and pressing societal issues."
