Jett presents at digital humanities conference

Doctoral candidate Jacob Jett presented his research on digital cultural heritage collections at the Japanese Association for Digital Humanities annual conference (JADH 2018), which was held September 9-11 in Tokyo, Japan. The theme of this year's conference was "Leveraging Open Data."

Jett presented the paper, "Towards Unifying our Collection Descriptions: To LRMize or Not?," which he coauthored with Professor J. Stephen Downie and Katrina Fenlon (MS '09, PhD '17). The paper examines a new aggregate model set forth by the International Federation of Library Associations' Library Reference Model (LRM), which treats aggregates such as digital cultural heritage collections as FRBR manifestations. According to Jett and his coauthors, this modeling choice results in metadata that fails to express the topicality of digital collections. In the paper, the researchers maintain that these collections should be treated as first-class bibliographic objects in their own right. This approach would benefit scholars by providing a method for linking collections together by topic, thereby fulfilling FRBR's identification and selection user tasks.

Jett's research interests include the conceptual foundations of information access, organization, and retrieval, especially with regard to web and data semantics. He received his MS/LIS from the iSchool in 2007 as well as his CAS in digital libraries in 2010.


Related News

Faculty receive support for AI-related projects from new pilot program

Associate Professor Yun Huang, Assistant Professor Jiaqi Ma, and Assistant Professor Haohan Wang have received computing resources from the National Artificial Intelligence Research Resource (NAIRR), a two-year pilot program led by the National Science Foundation in partnership with other federal agencies and nongovernmental partners. The goal of the pilot is to support AI-related research with particular emphasis on societal challenges. Last month, awardees presented their research at the NAIRR Pilot Annual Meeting.

Winning exhibits highlight evolution of music media and Uni High magazine

MSLIS students Monica Gil, Holly Bleeden, and Harrison Price were selected as winners of this year's Graduate Student Exhibit Contest, sponsored by the University of Illinois Library. Gil and Bleeden won first place for their exhibit, "Echoes of Time: The Evolution of Music Media," and Price won second place for his exhibit, "Unique-ly Illinois: Creative Writing from High School to Higher Education." The exhibits will be on display in the Marshall Gallery in the library through the end of March.

MSLIS students Monica Gil and Holly Bleeden standing next to their exhibit, "Echoes of Time: The Evolution of Music Media," at the Main Library.

Wei receives Amazon Post Internship Fellowship

PhD student Tianxin Wei has been awarded an Amazon Post Internship Fellowship, which will provide $20,000 in unrestricted funds and $20,000 in Amazon Web Services (AWS) credits to support Wei's research with his advisor, Professor Jingrui He. For the past two summers, Wei has served as an applied scientist intern at Amazon in Palo Alto, California. He has been part of a team that is working on search query understanding within Amazon apps and services, as well as developing shopping foundation models.


iSchool participation in iConference 2025

The following iSchool faculty and students will participate in iConference 2025, which will be held virtually March 11-14 and in person March 18-22 in Bloomington, Indiana. The theme of this year's conference is "Living in an AI-gorithmic world."

Youth-AI-Safety named a winning team in international hackathon

A team of researchers from the SALT (Social Computing Systems) Lab has been selected as a winner in an international hackathon hosted by the Berkeley Center for Responsible, Decentralized Intelligence. The LLM Agents MOOC Hackathon brought together over 3,000 students, researchers, and practitioners from 127 countries to build and showcase innovative work in large language model (LLM) agents, grow the AI agent community, and advance LLM agent technology.