1st Workshop on Reliable Evaluation of LLMs for Factual Information (REAL-Info)
June 3, 2024
Co-located with ICWSM 2024
Dates
Workshop Papers Submission deadline: March 31, 2024
Notifications: April 15, 2024
Final Camera-Ready Paper Due: May 5, 2024
ICWSM-2024 Workshops Day: June 3, 2024
Workshop on Reliable Evaluation of LLMs for Factual Information (REAL-Info)
LLMs can influence the information tasks of millions of users, ranging from personal content creation to education, financial advice, and mental health support. However, there is growing concern about LLMs' ability to identify and generate factual information, and there is currently no standardized approach for evaluating their factuality. This half-day workshop will enable a broad and diverse conversation on assessing the factuality of content generated by LLMs and their performance on factuality-related tasks, e.g., fact-checking, misinformation detection, rumor detection, and stance detection. The objective is to encourage the development of new evaluation approaches, metrics, and benchmarks that can better gauge LLMs' performance in terms of factuality. Additionally, the workshop will explore human-centric approaches for mitigating and correcting inaccuracies to enhance LLMs' factuality in critical applications. The workshop will facilitate interactions among academic and industry researchers in computational social sciences, natural language processing, human-computer interaction, data science, and social computing.
Keynote Speaker
Munmun De Choudhury
Associate Professor
School of Interactive Computing
Georgia Institute of Technology
Title: Ensuring Quality and Factuality in Health Information from Large Language Models
Abstract: Large language models (LLMs) are increasingly used to seek health information, a trend that holds promise but also presents significant challenges. Ensuring the accuracy and equity of health information from these AI systems is crucial for public health. This talk surveys these challenges through a series of interlinked studies. First, I will present cultural disparities in the factual accuracy, consistency, and verifiability of health information provided by LLMs across different languages. The findings reveal substantial inequities, with non-English responses often less reliable, and highlight the urgent need to enhance the multilingual capabilities of LLMs to ensure equitable access to accurate health information globally. In parallel, I will discuss red-teaming efforts that reveal the potential for LLMs to generate misleading health information. Our studies demonstrate that AI-generated content can be highly persuasive and sometimes more credible than human-created misinformation, underscoring the critical need for media literacy and critical evaluation skills to mitigate the risks of AI-driven misinformation. Ultimately, the talk will emphasize the necessity of robust evaluation frameworks and comprehensive solutions to ensure the safe and equitable dissemination of high-quality health information. By engaging healthcare professionals, policymakers, and technologists, the talk aims to inspire targeted interventions that address these challenges and promote a more reliable and inclusive digital health information landscape.
Bio: Munmun De Choudhury is an Associate Professor in the School of Interactive Computing at the Georgia Institute of Technology. Trained as a computer scientist, Dr. De Choudhury is passionate about how novel forms of social interaction online might shape, benefit, or harm our health and well-being. She is best known for laying the foundation of a new line of research that develops human-centered computational techniques to understand and improve mental health outcomes, grounded in ethical analyses of online data. Dr. De Choudhury has been recognized with the 2023 SIGCHI Societal Impact Award, the 2023 ICWSM and 2022 Web Science Trust Test-of-Time Awards, the 2021 ACM-W Rising Star Award, the 2019 Complex Systems Society Junior Scientific Award, and nearly two dozen paper awards from the ACM and AAAI. In 2024, she was inducted into the SIGCHI Academy. Dr. De Choudhury's research has had practical and policy impact, ranging from collaborating with the Centers for Disease Control and Prevention on suicide prevention, to supporting mental health and gun control advocacy, to contributing to a consensus report by the National Academies of Sciences, Engineering, and Medicine on the impact of social media on the wellbeing of young people. Notably, Dr. De Choudhury was an invited contributor to the U.S. Surgeon General's 2023 Advisory on the Healing Effects of Social Connection and currently serves on the Technical Advisory Group of the World Health Organization's Commission on Social Connection.
Organizing Committee
If you have any questions, please reach out to the organizing committee.
Dartmouth College
University of Edinburgh
University of Toronto
Carnegie Mellon University