By Hanh Ho and Emily Luong
Summary:
In this podcast, we discuss how AI chatbots affect older Americans (ages 60 to 70) and adolescents. We explore what AI chatbots are, how AI has evolved over the past few years, real-life examples of AI usage, human relationships versus AI relationships, and AI in the therapy space.
With AI use becoming more prominent and embedded in our lives, we can't avoid its influence on how we see relationships, both with AI and in real life. In the podcast, we talk about the ways AI has become something of a replacement for real-life relationships and how AI relationships affect our mental health. This resource is useful because it connects real-life examples to research on how adolescents and older adults use AI chatbots. One of the research articles we mention examines the effectiveness of AI chatbots for mental disorders, finding that they benefit individuals with early symptoms of depression but produce no significant improvement in generalized anxiety, stress, or overall mental well-being (Feng et al., 2025). The best way to use this resource is to set aside time to listen, reflect on how AI affects you as we walk through research results and examples, and consider how those findings could be applied in practice.
Episode outline:
Introduction of who we are and the main topic
What AI chatbots are and what they do in the therapy space
Discussion of the academic articles and their main topics
Talking through our thought process and analyzing results (differences between teens and boomers, negatives and positives)
Real-life examples (in moderation) and a discussion of culture when approaching AI
Brain plasticity, excessive use, and dependence
Discussion of why humans avoid real-life relationships and seek AI
Volume warning! Turn it down a notch.
Cultural and identity-based questions
How do norms around emotional expression in the client’s culture influence their comfort with AI vs. human disclosure?
A person may feel more open to expressing their emotional burdens to an AI because it does not judge or criticize them. In some cultures, therapy and counseling are perceived negatively, which can make it difficult for a person to reach out to an in-person therapist and can prompt them to use AI as emotional support instead.
While AI chatbots are often more available than in-person therapists or counselors, they lack the constructive criticism that often comes with therapy. Having someone point out things a client may not see helps the client think about how to approach those issues and discuss better ways to handle situations that negatively affect them. That kind of growth is something AI can't provide at this time.
What are some potential reasons that the client may believe that AI "understands" them more than real people?
One reason is that the AI validates the client's beliefs, whereas real people may have rejected those beliefs before the client began using AI.
What role do communities (extended family, ethnic organizations, faith groups) play in reducing AI usage?
Having ample support from those around you can help meet the emotional needs that AI chatbots tend to fill. Reach out to the people around you and express your need for support.
What is the biggest difference between AI Chatbot usage among adolescents and boomers?
Adults in their 60s and 70s often use AI chatbots to alleviate loneliness, while adolescents use them for therapy and emotional support.
What are the positive and negative effects of using an AI Chatbot?
AI chatbots provide immediate responses that validate the user. Unfortunately, overuse can make users dependent on the AI for constant validation, which can cause their real-life relationships to diminish over time.
References:
Feng, Y., Hang, Y., Wu, W., Song, X., Xiao, X., Dong, F., & Qiao, Z. (2025). Effectiveness of AI-driven conversational agents in improving mental health among young people: Systematic review and meta-analysis. Journal of Medical Internet Research, 27, e69639. https://doi.org/10.2196/69639
Huang, S., Lai, X., Ke, L., Li, Y., Wang, H., Zhao, X., Dai, X., & Wang, Y. (2024). AI technology panic—Is AI dependence bad for mental health? A cross-lagged panel model and the mediating roles of motivations for AI use among adolescents. Psychology Research and Behavior Management, 17, 1087–1102. https://doi.org/10.2147/PRBM.S440889
Oliver, M. (2025, August 5). Older Americans turning to AI-powered chatbots for companionship. CBS News. https://www.cbsnews.com/news/ai-chatbot-companionship-older-americans/
Sanford, J. (2025, August 27). Why AI companions and young people can make for a dangerous mix. Stanford Medicine News Center. https://med.stanford.edu/news/insights/2025/08/ai-chatbots-kids-teens-artificial-intelligence.html
Xia, Q., Chiu, T. K. F., Chai, C. S., & Xie, K. (2023). The mediating effects of needs satisfaction on the relationships between prior knowledge and self‐regulated learning through artificial intelligence chatbot. British Journal of Educational Technology, 54(4), 967–986. https://doi.org/10.1111/bjet.13305
Yang, Y., Wang, C., Xiang, X., & An, R. (2025). AI applications to reduce loneliness among older adults: A systematic review of effectiveness and technologies. Healthcare, 13(5), 446. https://doi.org/10.3390/healthcare13050446