Countering online networked extremist conspiracy theories

Phase II 

Countering online networked extremist conspiracy theories

Due to a lack of gatekeeping, misinformation can spread unimpeded on social media. The team’s Phase II work will focus on the forms of misinformation that cause the most harm: extremist conspiracy theories (ECTs). ECTs have motivated not just incorrect beliefs, but also polarization, prejudice, criminal behavior, and political violence. An online ECT about white genocide, for example, recently motivated social media users to murder fifty people at two New Zealand mosques, eleven people in a Pittsburgh synagogue, and one mother in a California synagogue. Despite the ubiquity and consequences of online ECTs, scholars do not yet understand the links between ECT content, readers' cognitive and psychological processes, and network amplification. Moreover, scholars do not know how to design countermeasures to combat the spread of beliefs in ECTs. The team’s overarching goals are (1) to design a Unified Network COgnitive Virtual Ethnography Rhetorical (UNCOVER) Model that quantitatively captures the causal processes and propagation dynamics of how ECT beliefs spread in online social networks; and (2) to develop effective countermeasures to curb the spread of ECTs and mitigate their harmful effects. The team will take a broad approach that links ECT content, cognitive and psychological processes, and social networks in a unique model that allows for the design of effective countermeasures for stymieing ECTs. To that end, the team will adopt a multidisciplinary approach blending text and visual rhetorical analysis, computational and cognitive linguistics, social and behavioral science, network science, and signal and information processing.

Phase I Renewal (2019) 

Extremist Content and Conspiracy Theories in Online Social Networks – Understanding and Disrupting the Causal Processes Linking Content to Violence. 

Previously known as the SCORE team, the U-LINK ‘Extremism’ team will build on its Phase I work toward understanding how extremist groups attract and motivate members via social media. Noting that “social media has allowed extremist groups to bypass traditional gatekeepers and share their messages directly with potentially billions of people”, the team will adopt a transdisciplinary approach to understanding and counteracting the causal mechanisms linking extremist content to violence. To do this, scholars from English, Political Science, Electrical and Computer Engineering, Communication, Computer Science, and Anthropology will come together to conduct innovative analyses to better understand the process leading to the incitement of violence via social media, and to develop strategies for countering the harmful effects of extremist ideas on democratic norms.

Phase I (2018) 

Systems Approach to Controlling the Online Rise of Extremism (SCORE)

Combining the know-how of nine faculty members from eight disciplines, this proposal advances a methodology that three team members previously published in the journal Science, which uses a combination of big data and complex network systems science to understand the online ecology of extremism and hate speech.

SCORE broadens and generalizes that earlier pilot study led by Physics Professor Neil Johnson by going beyond the numbers and focusing on online narratives and content. By examining a wide range of extremist hate groups and hate-speech forums across U.S. social media outlets, investigators hope to determine how extremism develops online across platforms, target groups, and languages, and to suggest technological, social, and legal avenues for controlling and curbing its impact.

“By embedding this know-how built on solid science as opposed to guesswork, SCORE will ultimately be in a position to develop effective automated software bots that circulate in the online space to do the policing,” the team wrote in its proposal.

Team

Manohar Murthi, Electrical & Computer Engineering; Kamal Premaratne, Electrical & Computer Engineering; Michelle Seelig, Cinema and Interactive Media; John Funchion, English; Caleb Everett, Anthropology; Stefan Wuchty, Computer Science; Casey Klofstad, Political Science; Joseph Uscinski, Political Science; Lisa Baker, Richter Library