Across the nation, 911 dispatch centers are facing a worker shortage. Unfortunately, this understaffing, combined with the demanding nature of the job itself, leaves dispatchers overworked and stressed. Meanwhile, when community members need to report a crime, their options are to call 911 in an emergency or, in a non-emergency situation, to call a non-emergency number or fill out an online form. A new chatbot, SafeRBot, designed and developed by Associate Professor Yun Huang, Informatics PhD student Yiren Liu, and BSIS student Tony An, seeks to improve the non-emergency reporting process for both community members and dispatch centers.
SafeRBot is a chatbot powered by a large language model (LLM) that helps dispatch centers offer community members a way to report non-emergency incidents online through a consistent series of questions and answers, both informational and empathetic. According to Huang, the principal investigator on the project, SafeRBot's unique strengths are that it turns an unstructured chat into a structured form, supports both English and non-English speakers, and automatically asks follow-up questions to improve the quality of the report.
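For readers curious how such a pipeline might look, the following minimal sketch shows one way an LLM could map a free-form chat onto structured report fields and choose the next follow-up question. The model name, prompts, and field names are illustrative assumptions; SafeRBot's actual implementation has not been published.

```python
# Illustrative sketch only; SafeRBot's internals are not public.
# Assumes the OpenAI Python SDK; field names and prompts are hypothetical.
import json
from openai import OpenAI

client = OpenAI()

REPORT_FIELDS = ["incident_type", "location", "time", "description", "reporter_contact"]

def extract_report(chat_transcript: str) -> dict:
    """Ask the model to map an unstructured chat onto structured report fields."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": ("Extract a non-emergency incident report from the chat. "
                         f"Return JSON with the keys {REPORT_FIELDS}; "
                         "use null for anything the reporter has not mentioned yet.")},
            {"role": "user", "content": chat_transcript},
        ],
    )
    return json.loads(response.choices[0].message.content)

def next_follow_up(report: dict) -> str | None:
    """Pick the first empty field and turn it into a follow-up question."""
    questions = {
        "incident_type": "What kind of incident would you like to report?",
        "location": "Where did this happen?",
        "time": "Roughly when did it happen?",
        "description": "Can you describe what you saw or experienced?",
        "reporter_contact": "How can a dispatcher reach you if they need more details?",
    }
    for field in REPORT_FIELDS:
        if not report.get(field):
            return questions[field]
    return None  # all fields filled; the report is complete
```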
"SafeRBot aims to provide immediate responses for users who prefer not to or cannot engage with human dispatchers, or when human dispatchers are unavailable," she said. "By automatically asking relevant questions, SafeRBot reduces the time required to collect incident details and improves the quality of the information gathered. It also helps reduce dispatcher workload, potentially preventing burnout."
When a dispatch center elects to use SafeRBot, a community member who needs to report a non-emergency situation can go to the SafeRBot website and start answering questions asked by the chatbot on the left side of the user's screen. The details are automatically filled into fields of the incident report on the right side of the screen. SafeRBot is multilingual, so if the reporter's first response is in Spanish, the follow-up questions will switch from English to Spanish.
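One simple way to achieve this kind of language mirroring is to detect the language of the reporter's first message and instruct the model to continue in it. The sketch below is illustrative and assumes the langdetect library; it is not a description of SafeRBot's internals.

```python
# Illustrative sketch of language mirroring; langdetect and the prompt wording are assumptions.
from langdetect import detect  # pip install langdetect

def reply_language(first_response: str, default: str = "en") -> str:
    """Detect the language of the reporter's first message and reuse it for follow-ups."""
    try:
        return detect(first_response)  # e.g. "es" for Spanish
    except Exception:
        return default                 # fall back to English if detection fails

def localize_instruction(lang_code: str) -> str:
    """System-prompt fragment telling the LLM which language to use from now on."""
    return f"Ask all follow-up questions in the reporter's language (ISO code: {lang_code})."
```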
"We plan to launch SafeRBot as a grassroots effort, getting people's consent to use the system," said Liu. "This will provide a solution for the police department and greatly assist the multilingual members of our community."
According to Huang, when SafeRBot is launched, police agencies will be able to access, process, and download data from the system's dashboard and easily integrate the information into the systems that they currently use. "The information collected from our system is encrypted and stored on Amazon Cloud, which offers multiple layers of security," added Liu.
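As a hypothetical illustration of encrypted storage on Amazon's cloud, a completed report could be uploaded to S3 with server-side encryption enabled, as in the sketch below; the bucket name, key scheme, and use of boto3 are assumptions rather than details from the team.

```python
# Hypothetical sketch of encrypted report storage; bucket, key scheme, and KMS use are assumptions.
import json
import boto3

s3 = boto3.client("s3")

def store_report(report: dict, report_id: str, bucket: str = "saferbot-reports") -> None:
    """Upload a finished incident report to S3 with server-side encryption at rest."""
    s3.put_object(
        Bucket=bucket,
        Key=f"incidents/{report_id}.json",
        Body=json.dumps(report).encode("utf-8"),
        ServerSideEncryption="aws:kms",   # encrypt at rest with an AWS-managed KMS key
        ContentType="application/json",
    )
```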
SafeRBot's design and development were inspired by empirical evidence from the team's previous research, in which they studied LiveSafe, a community safety reporting system popular with universities. Through their analysis of the system's logs, the researchers learned that the amount of emotional support a community member receives via a text-based system varies.
"Our research found that different users have varying levels of need for emotional support when reporting incidents. The goal is to enable users to personalize their reporting experience based on their emotional needs," said Huang.
The researchers found that community members are more responsive to follow-up questions when empathetic support is provided, so SafeRBot can be deployed with the level of empathy users desire. A paper discussing this work, "Discovering the Hidden Facts of User-Dispatcher Interactions via Text-based Reporting Systems for Community Safety," was published in Proceedings of the ACM on Human-Computer Interaction.
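A configurable empathy level could, for example, be expressed as alternative system-prompt instructions selected at deployment time. The levels and wording below are hypothetical, intended only to illustrate the idea.

```python
# Hypothetical empathy-level configuration; the three levels and their wording are assumptions.
EMPATHY_STYLES = {
    "low": "Be brief and factual; acknowledge the reporter only when necessary.",
    "medium": "Acknowledge the reporter's situation before each follow-up question.",
    "high": ("Open with an empathetic statement, validate the reporter's feelings, "
             "and reassure them before asking each follow-up question."),
}

def build_system_prompt(empathy_level: str) -> str:
    """Compose the reporting instructions with the deployment's chosen empathy level."""
    style = EMPATHY_STYLES.get(empathy_level, EMPATHY_STYLES["medium"])
    return (
        "You are a non-emergency incident reporting assistant. "
        "Collect incident details one question at a time. "
        f"{style}"
    )
```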
"SafeRBot complements human dispatchers by asking similar questions, optimized with emotional support through empathy and compassion," said Huang.
The Urbana Police Department has been a key collaborator in developing SafeRBot's features, and Huang's team has been collecting its feedback to improve the system for community use. The Police Training Institute at the University of Illinois Urbana-Champaign has also been an active research partner with Huang's development group; a version of SafeRBot was created as a training tool that lets recruits experience being interviewed with varying levels of empathy and practice their interviewing skills early in their training.
Huang's team will present a paper describing their recent work, "Improving Emotional Support Delivery in Text-Based Community Safety Reporting Using Large Language Models," at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW 2025).
Huang specializes in human-AI interaction and social computing. She is passionate about developing systems that foster collaborative innovation between humans and AI, whether it is to conceive new services or enhance existing ones. Her work is sponsored by government agencies such as the National Science Foundation, Institute of Museum and Library Services, and Administration for Community Living, as well as companies such as OpenAI, Google, and IBM. Huang received her PhD from the Donald Bren School of Information and Computer Sciences at the University of California, Irvine.