Doctoral candidate Si Chen successfully defended her dissertation, "Designing Ethical Emotion AI-based Learning Experience among Ability-Diverse Users," on July 30.
Her committee included Associate Professor Yun Huang (chair); Professor Yang Wang; Assistant Professor Nigel Bosch; and Raja Kushalnagar, professor in the Department of Science, Technology, Accessibility, Mathematics, and Public Health at Gallaudet University.
Abstract: Emotion AI, also known as affective computing, encompasses recognizing, interpreting, simulating, and responding to human emotions and cues. Despite its potential, there has been limited systematic exploration of its ethical and inclusive design, particularly in online learning. This thesis examines the ethical considerations surrounding emotion AI for inclusive online education. Specifically, the research makes novel contributions for two learner groups: the hearing community and the d/Deaf or hard of hearing (DHH) community. For hearing learners, recognizing emotions from facial movements can enhance their self-awareness and improve knowledge sharing in video-based learning. Combining facial expression recognition with self-reported emojis enables these learners to express and reflect on their emotions more comprehensively than using emojis alone. For DHH learners, emotion AI is more effective in video-based online learning when they have access to video comments in American Sign Language (ASL) rather than English captions alone. Additionally, ASL video comments featuring cartoon-like filters that display human-like emotions are more entertaining and engaging for DHH learners, fostering a stronger sense of connection with their peers. To promote inclusive learning between hearing and DHH learners, a design fiction approach is further employed to propose customizable overlay solutions for seamless interactions among these diverse learners, enhancing inclusivity while preserving emotional authenticity. While this thesis centers on designing inclusive emotion AI for video-based learning, the insights offer both theoretical and practical implications for broader application domains.