School of Information Sciences

New study reveals TikTok’s spread of COVID-19 misinformation

Morgan Lundy

A new study by PhD student Morgan Lundy, recently published in the International Journal of Communication, reveals how TikTok's unique features have been used to spread COVID-19 misinformation. Unlike text-based platforms such as Twitter, TikTok's micro-video format makes deceptive information more difficult to detect.

"That's why I lean towards qualitative methods for deeply understanding how misinformation is appearing on TikTok," Lundy said. "The information is passed through such rich media objects—you have sound, visuals, text, body language, captions, and meme elements that require context, and all these factors interact at once to create the 'meaning' or (mis)information that is being shared."

Lundy used a dual approach of algorithm training and hashtag sampling to gather data for her research. She also searched for "community language" rather than expected terms, yielding a more representative and useful picture of how misinformation looks on the platform. According to Lundy, the high reactivity of TikTok's collaborative filtering algorithm poses a particular challenge to containing the infodemic.

"The more misinformation you interact with, the more that you see—you can quickly find yourself immersed in massive numbers of TikTok videos relating to COVID-19 vaccine misinformation just after liking a few videos," she said.

Lundy learned that TikTok users who oppose the COVID-19 vaccine use intentionally coded language, misspelled words, and alternate hashtags to evade anti-misinformation efforts. She found that misinformation topics featured in previous COVID-19 vaccine hesitancy literature—parodies of vaccine side effects, concerns about vaccine production and approval, conspiracies about governments and vaccine contents, and claims that COVID-19 is not dangerous—are still prevalent despite public health efforts. Her research illustrated how COVID-19 vaccine misinformation often appears in the form of logical fallacies, where some information may be true but misleads to false conclusions.

"I very much hope that better understanding of how misinformation spreads on TikTok will be helpful to public health officials," said Lundy, who is a member of Assistant Professor Jessie Chin’s research group. "I received some exciting feedback from CDC (Centers for Disease Control and Prevention) and WHO (World Health Organization) contacts of my instructor, Dr. Ian Brooks (iSchool Research Scientist and the Director of the Center for Health Informatics), and it would be wonderful if this paper could in some way contribute to important conversations about health research involving complex videos/images/audio."


Related News

iSchool participation in iConference 2026

The following iSchool faculty and students will participate in iConference 2026, which will be held virtually from March 23–26 and in person from March 29–April 2 in Edinburgh, Scotland. The theme of this year's conference is "Information Literacies, Authenticity and Use: The Move Towards a Digitally Enlightened Society."

Wang receives AccessComputing funding for video game project

Informatics PhD student Olive Wang has been awarded a minigrant by AccessComputing, an organization that supports people with disabilities in computing. The $5,000 grant will support Wang's work on the video game Loadouts, which teaches players why accessibility is important. In the game, players learn why video games can be inaccessible to low-vision players and how accessibility features such as high contrast, auditory cues, and multimodality can help.


Hassan and Bashir receive distinguished paper award

A paper co-authored by PhD student Muhammad Hassan and Associate Professor Masooda Bashir received the Distinguished Paper Award at the Workshop on Security and Privacy in Standardized IoT, which was held last month in San Diego, California, in conjunction with the Network and Distributed System Security (NDSS) Symposium 2026. 

iSchool researchers to present work at Technocracy Conference

This week, iSchool PhD students and faculty will present their research at the Technocracy Conference. Hosted by the Unit for Criticism and Interpretive Theory at the University of Illinois on March 5–6, the conference will begin with a panel of graduate student papers and continue the following day with invited speakers and a keynote. All events will take place at the Levis Faculty Center on the Urbana campus. 

New multi-institutional project to use AI to represent historical periods

A new project led by a team of researchers from four universities aims to create and evaluate language models that represent historical periods. The project, "Artificial Intelligence for Cultural and Historical Reasoning," was recently selected for a 2025 Humanities and AI Virtual Institute (HAVI) award from Schmidt Sciences. The $800,000 grant will be split among four institutions: Cornell University, the University of Illinois Urbana-Champaign, the University of British Columbia, and McGill University. Professor Ted Underwood will serve as the principal investigator for the Illinois portion of the project.

