Panel addresses media’s role in racism
Our free press has historically guarded democracy, but rarely equality, said Janae Cummings, moderator of Tuesday’s panel discussion, Confronting Racism: Media, Technology, and Social Media.
Media School associate professor Marissa Moorman, Media School doctoral candidate Ryan Comfort, University of Minnesota assistant professor Danielle Kilgo (formerly of The Media School) and Simon Fraser University professor Wendy Chun spoke Tuesday in the livestreamed discussion, part of the IU Arts and Humanities Council’s ongoing series, Confronting Racism: Conversations on Systemic Racism and Protest.
The panelists each gave an overview of their research on racism and media before diving into audience questions submitted in the livestream’s comments section.
Kilgo researches the news media’s coverage of protests. Because no reported story can fit every detail and perspective, she said, reporters settle on specific framings. Those framings are then assumed to represent objectivity, and reporting that reflects bias can be mistaken for its opposite.
“The truth is that journalism’s end product — what we see and what we read — isn’t always fair, and hasn’t always been fair for a long time,” Kilgo said.
In the case of anti-racist protest coverage, which often sensationalizes and exaggerates rather than communicating the ideals or demands of protestors, that can severely skew public perception, she said.
Kilgo tied this to an explanation of the protest paradigm: A growing movement needs the media, which then trivializes it. As negative coverage sways public opinion against the protestors, the status quo is reinforced.
She described several protest coverage tropes as “delegitimizing,” such as framing protests as violent, disruptive, antagonistic or shocking, and routinely quoting official sources rather than protestors. She offered “legitimizing” alternatives: recognizing protestors’ ideals, demands and social critiques.
That shift is important in furthering a public reckoning with ideas of race and realities of racism, she said.
For Indigenous communities, getting their stories into the mainstream media at all is often difficult, said Comfort.
His research focuses on Indigenous communities trying to achieve environmental and resource conservation goals. He said the news media typically depict Indigenous people as activists, warriors, historic relics or victims of poverty, but rarely as scientists.
“It’s going to be up to Indigenous nations and Indigenous people to take advantage of new media technologies and new media channels to tell these stories first,” he said, “to hopefully start to create some change in terms of public awareness and understanding.”
In his research, Comfort found that Indigenous communities consider Facebook a vital tool for spreading their messages.
Moorman, author of “Powerful Frequencies: Radio, State Power, and the Cold War in Angola, 1933-2002,” said radio was cultural connective tissue for the colonial state in Angola. First deployed as a propaganda tool for the state, the medium was later turned toward liberation by anti-colonialist revolutionaries.
This history and her work, she said, can help us think about authoritarianism and activism in the United States today: What do they look like, and what is the relationship between technology and trauma?
Chun’s research spotlights technology’s failure to break from society’s racist norms. As corporations and software developers promise a bright future in which “colorblind” software ensures equal opportunities for all, the technology’s shortcomings become increasingly clear.
Despite its potential to correct for human error, manmade software more often comes to reflect it. Correctional Offender Management Profiling for Alternative Sanctions, or COMPAS, a program marketed during the Obama era as a solution to human bias, has been extensively documented as discriminatory.
“The problem was that they got rid of individual bias by replacing it with institutional bias,” Chun said.
Because machine-learning algorithms are built and tested on existing data, and then verified against that same data, a program meant to correct for racial bias continued to reproduce it, just under the guise of objectivity.
“More starkly, they’ll only be verified as correct if they make racist predictions,” Chun said.
Ignoring race doesn’t help combat racism, the panelists said.
“For the public, the idea of colorblindness and the denial of race as a thing still drive many of our public epistemologies of ignorance,” Kilgo said.