1st Workshop on Reliable Evaluation of LLMs for Factual Information (REAL-Info)
June 3, 2024
Co-located with ICWSM 2024
Dates
Workshop Papers Submission deadline: March 31, 2024
Notifications: April 15, 2024
Final Camera-Ready Paper Due: May 5, 2024
ICWSM-2024 Workshops Day: June 3, 2024
LLMs can influence the information tasks of millions of users, from personal content creation to education, financial advice, and mental health support. However, there is growing concern about LLMs' ability to identify and generate factual information, and there is currently no standardized approach for evaluating their factuality. This half-day workshop will enable a broad and diverse conversation on assessing the factuality of content generated by LLMs and their performance on factuality-related tasks such as fact-checking, misinformation detection, rumor detection, and stance detection. The objective is to encourage the development of new evaluation approaches, metrics, and benchmarks that better gauge LLMs' factuality. The workshop will also explore human-centric approaches for mitigating and correcting inaccuracies to improve LLMs' factuality in critical applications. It will facilitate interactions among academic and industry researchers in computational social science, natural language processing, human-computer interaction, data science, and social computing.
Organizing Committee
If you have any questions, please reach out to the organizing committee.
Dartmouth College
University of Edinburgh
University of Toronto
Carnegie Mellon University