Description: Social support is a critical resource for coping with a range of life challenges. Supportive communication has been shown to buffer stress, enhance coping, and improve mental and physical health. The utility of social support has led researchers to examine the potential of artificial intelligence (AI) chatbots to increase the availability of this resource among the lay public. AI chatbots can serve as support providers that interact directly with humans to help them cope with stress. From a different perspective, generative AI and large language models (LLMs) can serve as intermediaries in human-human communication for various interpersonal goals. Our research focuses on both AI-human supportive communication and AI-mediated human-human supportive communication, as well as their downstream mental health outcomes.
AI-Mediated Social Support: The Prospect of Human-AI Collaboration
Building on research on artificial intelligence (AI)-mediated communication (AI-MC), this study examines how people use LLM-based chatbots to generate support messages and how patterns of human–AI collaboration shape message features and evaluations of message helpfulness and authenticity. We propose the process-adoption model, categorizing message generation into four patterns: human-only, AI-only, modified-AI, and AI-guided. Results showed that AI-only and modified-AI messages were more likely than human-only messages to include informational and emotional support, which, in turn, enhanced viewers’ evaluations of message helpfulness and authenticity. AI-guided messages were more likely than AI-only messages to provide reciprocal self-disclosure, which enhanced perceived authenticity. Lastly, AI-guided messages were rated as more authentic than AI-only messages even after accounting for the mediating effects of message features. These findings offer a nuanced understanding of the AI-MC spectrum and inform discussions of human–AI collaboration in support provision.
Selected Publications
Meng, J., Zhang, R., Qin, J., Lee, Y. J., & Lee, Y. C. (2025). AI-mediated social support: The prospect of human-AI collaboration. Journal of Computer-Mediated Communication, 30(4), zmaf013.
Meng, J., Rains, S. A., Qin, J., & Rheu, M. (2025). Examining the content and form of supportive conversations with chatbot. International Journal of Human-Computer Interaction.
Rheu, M., Dai, Y., Meng, J., & Peng, W. (2024). When a chatbot disappoints you: Expectancy violation in human-chatbot interaction in a social support context. Communication Research, 51(7), 782-814.
Meng, J., Rheu, M., Zhang, Y., Dai, Y., & Peng, W. (2023). Mediated social support for distress reduction: AI chatbots vs. human. Proceedings of the ACM on Human-Computer Interaction, 7(CSCW1), 72.
Meng, J., & Dai, Y. (2021). Emotional support from AI chatbots: Should a supportive partner self-disclose or not? Journal of Computer-Mediated Communication, 26, 207-222.
Description: Online health misinformation is health-related information disseminated on the Internet that is false, inaccurate, misleading, biased, or incomplete and that contradicts the consensus of the scientific community based on the best available evidence. Online health misinformation carries serious social and public health implications, including links to vaccine hesitancy, hesitancy toward cancer treatment and screening, and distrust in science, medicine, and the medical and research communities. Although health misinformation can be traced back to the time of hunting and gathering societies, online health misinformation, especially on social media, poses unique challenges because of its use of persuasive strategies, which make it more difficult to identify and address. It is critical to understand what persuasive strategies are used in misinformation, how individuals process and succumb to them, and how technology-enabled interventions can effectively correct misbeliefs and facilitate the identification of misinformation.
Persuasive strategies in online health misinformation: a systematic review
Health misinformation proliferates online, especially during public health crises. While prior studies have classified misinformation types and debunking strategies, little attention has been paid to the persuasive strategies embedded within misinformation. This systematic review examined such strategies by searching four databases for studies published between 2011 and 2021. From 1,700 articles, 58 met the inclusion criteria, yielding 258 persuasive strategies. Using affinity diagramming, 225 strategies were categorized into 12 thematic groups, including narrative fabrication, anecdotes, distrust of authorities, politicization, misuse of science, rhetorical tricks, biased reasoning to make a conclusion, emotional appeals, highlighting uncertainty and risks, and establishing surface legitimacy. Findings highlight the need for media literacy education to counter health misinformation.
Selected Publications
Peng, W., Lim, S., & Meng, J. (2023). Persuasive strategies in online health misinformation: A systematic review. Information, Communication & Society, 26, 2131-2148.
Chen, A., Chen, K., Zhang, J., Meng, J., & Shen, C. (2023). When national identity meets conspiracies: The contagion of national identity language in public engagement and discourse about COVID-19 conspiracy theories. Journal of Computer-Mediated Communication, 28, zmac034.
Lee, S., Ma, S., Meng, J., Zhuang, J., & Peng, T. Q. (2022). Detecting sentiment toward emerging infectious diseases on social media: A validity evaluation of dictionary-based sentiment analysis. International Journal of Environmental Research and Public Health, 19, 6759.
Chen, K., Chen, A., Zhang, J., Meng, J., & Shen, C. (2020). Conspiracy and debunking narratives about COVID-19 origination on Chinese social media: How it started and who is to blame. Harvard Kennedy School Misinformation Review.
Meng, J., Peng, W., Tan, P. N., Liu, R. W., & Cheng, Y. (2018). Diffusion size and structural virality: The effects of message and network features on spreading health information on Twitter. Computers in Human Behavior, 89, 111-120.
Description: Social networks—our direct and indirect connections with others—serve as vital channels for acquiring resources and exchanging information. These ties strongly shape our health and well-being: the behaviors and outcomes of those around us can influence our own, while the quality of support we receive through our networks is closely linked to our overall well-being. In this line of research, we examine how the properties of people’s social networks, both online and offline, relate to their physical and mental health. We also design and evaluate network-based interventions, often implemented on social media platforms, to promote positive behavior change. Leveraging computational methods, we analyze digital trace data to uncover patterns of influence within networks and their impact on health outcomes.
Linking Network Structure to Support Messages: Effects of Brokerage and Closure on Received Social Support
Building on the network theory of brokerage and closure, this study takes a structural approach to examine how network structure influences the reception of different types of social support. The study extracted the ego networks of 227 active users from a large online health social network and tracked the supportive comments they received on their personal profiles over 3 months. A total of 3,270 comments were analyzed. Results showed that network brokerage (operationalized as effective size) predicted the amount of informational and network support received, whereas network closure (operationalized as the local clustering coefficient) predicted the amount of emotional and esteem support received on one's profile. In addition, brokerage also predicted received emotional support.
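To make these operationalizations concrete, the brief sketch below is a hypothetical illustration, not the study's actual analysis code; it computes Burt's effective size (brokerage) and the local clustering coefficient (closure) for an ego node in a toy ego network using the networkx Python library.

import networkx as nx

# Hypothetical toy ego network: "ego" plus five alters, with two ties among alters.
G = nx.Graph()
G.add_edges_from([
    ("ego", "a"), ("ego", "b"), ("ego", "c"), ("ego", "d"), ("ego", "e"),
    ("a", "b"),  # redundant tie between two alters
    ("c", "d"),  # another alter-alter tie
])

# Brokerage: Burt's effective size of the ego's network. For an unweighted,
# undirected graph this equals n - 2t/n, where n is the number of alters and
# t is the number of ties among alters.
brokerage = nx.effective_size(G)["ego"]  # 5 - 2*2/5 = 4.2

# Closure: local clustering coefficient, the share of alter pairs that are tied.
closure = nx.clustering(G, "ego")        # 2 ties / 10 possible pairs = 0.2

print(f"Brokerage (effective size): {brokerage:.2f}")
print(f"Closure (local clustering): {closure:.2f}")

In this hypothetical example, the two alter-alter ties reduce the effective size from 5 to 4.2 and yield a local clustering coefficient of 0.2; in the study, such measures were computed on each user's extracted ego network.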
Selected Publications
Qin, J., & Meng, J. (2025). Support network typology and psychological well-being among young adults. Health Communication.
Reynolds, R., Meng, J., & Hall, E. D. (2020). Multilayered social dynamics and depression among older adults: A 10-year cross-lagged analysis. Psychology and Aging, 35, 948-962.
Meng, J., Peng, W., Shin, S. Y., & Chung, M. (2017). Online self-tracking groups to increase fruit and vegetable intake: A small-scale study on mechanisms of group effect on behavior change. Journal of Medical Internet Research, 19.
Meng, J., Martinez, L., Holmstrom, A., Chung, M., & Cox, J. (2017). Research on social networking sites and social support from 2004 to 2015: A narrative review and directions for future research. Cyberpsychology, Behavior, and Social Networking, 20, 44-51.
Meng, J., Chung, M., & Cox, J. (2016). Linking network structure to support messages: Effects of brokerage and closure on received social support. Journal of Communication, 66, 982-1006.
Meng, J. (2016). Your health buddies matter: Preferential selection and social influence on weight management in an online health social network. Health Communication, 31, 1460-1471.