New journal article examines vaccination misinformation on social media

Tre Tomaszewski
Jessie Chin, Assistant Professor

Research conducted by Assistant Professor Jessie Chin's Adaptive Cognition and Interaction Design Lab (ACTION) provided the foundation for an article recently published in the high-impact Journal of Medical Internet Research. PhD student Tre Tomaszewski is the first author on the peer-reviewed article, "Identifying False Human Papillomavirus (HPV) Vaccine Information and Corresponding Risk Perceptions from Twitter: Advanced Predictive Models."

According to the researchers, uptake of the HPV vaccine remains low despite the fact that its effectiveness has been established for over a decade. Their new article traces this vaccination gap to misinformation regarding the risks of the vaccine.

"If we can understand the contents of these misconceptions, we can craft more effective and targeted health messaging, which directly addresses and alleviates the concerns found in misconceptions about various public health topics," said Tomaszewski.

Tomaszewski uses the analogy of an outbreak of infectious disease in characterizing the spread of misinformation about vaccination, colloquially called an infodemic. The detection of misinformation is a mitigation method that reduces further spread after an "outbreak" has begun, he said. Understanding the types of concerns people have regarding public health measures, such as HPV vaccination, could lead to improved health messaging from credible sources.

"If we can target root causes—reasons people believe misinformation in the first place—through methods akin to those we devised, health messaging can provide valid information prior to the exposure of misinformation. Continuing the analogy of a disease, this pre-exposure to valid information can act as a psychological 'inoculation' from the known falsehoods," he said. "Of course, while the analogy of misinformation as a disease or epidemic is useful for conceptualizing the problem, it is imperfect and should not be taken too literally, as goes for most analogies."

For their study, the research team used machine learning and natural language processing to develop a series of models to identify and examine true and false HPV vaccine–related information on Twitter. Once a model was developed that could reliably detect misinformation, the researchers could automatically classify messages, creating a much larger data set.
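The classification step can be illustrated with a minimal sketch. The study used advanced predictive models; the pipeline below is a deliberately simplified stand-in using scikit-learn's TF-IDF features and logistic regression, and the example tweets and labels are hypothetical, not drawn from the study's data set.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled messages (illustrative only, not from the study's data).
tweets = [
    "The HPV vaccine is safe and effective at preventing cancer",
    "HPV vaccination protects against cervical cancer",
    "Get vaccinated: the HPV vaccine has a strong safety record",
    "The HPV vaccine causes infertility in teenage girls",
    "HPV shots damage the nervous system",
    "Doctors hide that the HPV vaccine causes paralysis",
]
labels = [0, 0, 0, 1, 1, 1]  # 0 = valid information, 1 = misinformation

# A simple bag-of-words classifier stands in for the study's predictive models.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(tweets, labels)

# Once trained, the model can label new, unseen messages automatically,
# which is what allows a much larger data set to be built.
new_tweets = ["The vaccine prevents several HPV-related cancers"]
predictions = model.predict(new_tweets)
```

In practice, a model like this would be trained on thousands of manually annotated messages and validated before being used to classify at scale.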

"We were able to extract cause-and-effect statements in a process called 'causal mining.' This resulted in sets of concepts (or misconceptions) related to a given 'cause' term," said Tomaszewski.
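A toy version of this extraction step can be sketched with a regular expression. The study's causal-mining pipeline is far more sophisticated; the pattern, function name, and example sentence below are illustrative assumptions.

```python
import re

# Match "<cause> causes/leads to/results in <effect>" style statements.
# This simple pattern is a stand-in for the study's causal-mining method.
CAUSAL_PATTERN = re.compile(
    r"(?P<cause>[\w\s]+?)\s+(?:causes|leads to|results in)\s+(?P<effect>[\w\s]+)",
    re.IGNORECASE,
)

def extract_causal_pairs(text):
    """Return (cause, effect) tuples found in a message."""
    return [
        (m.group("cause").strip().lower(), m.group("effect").strip().lower())
        for m in CAUSAL_PATTERN.finditer(text)
    ]

pairs = extract_causal_pairs("The HPV vaccine causes infertility")
```

Grouping the extracted effects by their "cause" term then yields the sets of concepts (or misconceptions) associated with, say, "HPV vaccine."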

The researchers found that valid messages containing "HPV vaccination" often include terms in the category "effective" (expressing the vaccine's efficacy) as well as "cancer" (since the vaccine helps prevent cancers that may develop over time following an HPV infection). They found that HPV vaccine misinformation is linked to concerns about infertility and issues with the nervous system. After categorizing the messages as positive or negative cause-effect statements, the research team found that misinformation strongly favors negative-leaning, "loss-framed" messaging.

"Misinformation tends to be more fear-provoking, which is known to capture attention," said Tomaszewski.

This research was funded by the National Institutes of Health (National Cancer Institute). In addition to Tomaszewski and Chin, the research team included Alex Morales (Department of Computer Science, University of Illinois Urbana-Champaign); Ismini Lourentzou (Department of Computer Science, Virginia Polytechnic Institute and State University); and, from the University of Illinois at Chicago, Rachel Caskey (College of Medicine); Bing Liu (Department of Computer Science); and Alan Schwartz (Department of Medical Education).

