


By Joe Arney

Digital recommender systems have long been a part of our lives. But those systems might be serving up inequality along with new music, viral videos and hot products.

Now, a leading expert on the technology powering these systems is turning his attention to the way news is recommended and shared. 

“If a system only shows us the news stories of one group of people, we begin to think that is the whole universe of news we need to pay attention to,” said Robin Burke, professor and chair of the information science department.

Burke’s research studies bias in recommender systems, which tend to favor the most popular creators and products, usually at the expense of newcomers, underrepresented groups and, ultimately, consumers who have fewer choices. That’s problematic because these systems are proprietary, so researchers aren’t able to examine how they work.

“The people who do this kind of research in industry don’t publish very much about it, so we don’t know exactly what’s going on in terms of how their systems work, or how well they work,” he said.

A quick primer for the uninitiated: Recommender systems use data from individual subscribers to serve personalized content (art, news, commerce, politics), which may limit exposure to new ideas and influences.

It’s why the National Science Foundation awarded Burke and others, including associate professor Amy Voida, a nearly $1 million grant in 2021 to develop “fairness-aware” algorithms that blunt biases baked into recommender systems. And the NSF saw the potential to do something similar in news, leading to a $2 million grant earlier this year to build a platform for researchers eager to experiment with the artificial intelligence that powers news recommender systems.

A platform like this could be game-changing for academic researchers, who are locked out of the proprietary systems built and studied by tech and social media companies. And as more nontraditional providers become sources of news, understanding how these algorithms work is essential: You may think of TikTok as a place for music videos, but a Pew Research Center survey found one in four American adults under 30 get their news from the platform.

“We have put all this control over the public square of journalistic discourse into the hands of companies that don’t have any transparency or accountability relative to what they’re doing,” Burke said. “I think that’s dangerous. And so, it’s important to think about what the alternatives might look like.” That includes the business model itself, which is predicated on selling ads while keeping users on a platform.

If successful, this latest project will build a robust system for live experiments on recommender systems, one that will eventually become self-funded through contributions from other researchers. Burke compared it to the way space telescopes and supercolliders have created platforms where experts can better understand the world around them.

“Unless you work at one of these companies, you don’t have any insight into how these systems work, or control over them,” Burke said. “I hope that, through this infrastructure, we’re able to understand how these things are governed, and for what objectives, and who gets to decide what those objectives are. That’s something I’m very interested in.”

Lisa Marshall (Jour, PolSci’94; MJour’22) contributed reporting.