Written by: Chiara Salvatore
This article will explore how artificial intelligence is being used in the field of speech-language pathology through interviews with three clinicians.
Image: ChatGPT prompt. "Hey ChatGPT, write me an article about speech-language pathology and artificial intelligence for the University of Toronto Alumni Association newsletter."
Artificial intelligence (AI) refers to technology designed to simulate human learning, problem-solving, creativity, and decision-making (Cordella et al., 2025; Stryker & Kavlakoglu, 2024). Platforms like OpenAI’s ChatGPT and Google’s Gemini are generative AI, meaning they create content, including writing, art, and music, by learning from vast amounts of data (for debate on the originality of this content, see Appel et al., 2023). These tools rely on machine learning algorithms, such as artificial neural networks modeled after human brain connections (Stryker & Kavlakoglu, 2024).
Public use of AI is booming. In 2022, arguably early in the eruption of generative AI platforms, almost half of surveyed Americans reported engaging with AI regularly, with 27% stating they used AI multiple times a day (Kennedy et al., 2023). This growing usage strengthens AI systems; the more data they process, the better they perform (Samborska, 2025). OpenAI, the creator of ChatGPT, had a valuation of $157 billion towards the end of 2024 and has been working on converting to a for-profit company, generating revenue through subscription services and other initiatives (OpenAI, 2024a; Pequeño, 2024). Currently, these for-profit initiatives are said not to include the sale of data that users input into ChatGPT (see OpenAI, 2024b).
In the last few years, AI seems to have been mentioned in connection with practically every field, including speech-language pathology (SLP). Speech-Language and Audiology Canada hosted a webinar for clinicians discussing AI, describing how it could soon benefit SLP practice (Sperry, 2024). However, AI is already benefiting, or at least impacting, SLPs. In response to an anonymous question I posed on the GTA SLP Facebook group, clinicians described using AI for a variety of purposes, such as documentation through AI scribes like Heidi (Heidi Health, n.d.), treatment planning and material development, or as a compensatory tool for clients.
To learn more about how clinicians are using AI in practice, I had the opportunity to talk with Goldie Litvack, an SLP working in the outpatient stroke and neuro program at Providence Healthcare. She described utilizing the AI platform ChatGPT to help her develop personalized materials for her clients that are relevant to both their goals and interests. Some examples of materials that Goldie has created using ChatGPT are word lists for motor speech intervention and scripts to help clients relearn specific information necessary for their jobs.
Goldie also added she frequently uses AI to translate materials into a client’s preferred language, noting that her multilingual clients have stated that ChatGPT’s translation is more accurate than other platforms, such as Google Translate.
“It’s enabled me to be more person-centered in my therapy. It used to be so laborious to do all these things, and so much more challenging to be customized and personally relevant.” – Goldie Litvack
When asked about ethical considerations for using AI platforms to provide services, Goldie emphasized the importance of protecting patient privacy through inputting only non-identifiable information. Though she’s interested in seeing how AI may change the field of SLP, she stated:
“AI doesn’t replace the rapport you have with a patient—the repetition and practice is fabulous, but working with a therapist where you have the counselling piece, and the personal aspect is so important.”
Image: ChatGPT-generated image of an SLP conducting an oral mechanism exam, in a New Yorker-esque style.
Regarding how AI may change SLP, much recent research has focused on potential ways these tools can be integrated into the field. From AI interpretation of videofluoroscopic swallow studies (Girardi et al., 2023) to AI systems that help individuals communicate through brain-computer interfaces (Uyeno, 2025), it seems as though researchers are eager to explore these possibilities.
To learn more about research initiatives involving AI in SLP, I was able to sit down with Dr. Monika Molnar, an assistant professor in the Department of Speech-Language Pathology.
Dr. Molnar is currently carrying out a pilot project with Dr. Karla Washington and Dr. Tom Chau at Holland Bloorview, working on using AI to equitably evaluate young children’s speech. Dr. Molnar discussed how monolingual and multilingual children produce English speech sounds differently, and how these differences may be flagged as disordered. She shared that children may not always have access to SLPs who are familiar with their language background, and that standardized speech assessments are most often normed on monolingual English speakers. As it would not be feasible to generate standardized tests that are normed for every individual’s unique language background, Dr. Molnar and her team are working to train AI algorithms with diverse child speech samples to help differentiate between children with speech sound disorders and those without.
“We want to use artificial intelligence to help, not replace, SLPs’ analysis of the speech of monolingual and bilingual children and tell us if they might be at risk of any speech sound disorders.” – Dr. Monika Molnar
This tool is planned to be available open access and not for profit. Privacy is also a consideration for this project; all collected samples are completely anonymous, and parents were explicitly informed about how the samples would be used. When asked about the implications of tools like this for SLPs, Dr. Molnar shared:
“This algorithm is also built by humans; it’s as good as we build it. It is designed to complement the toolkit of an SLP. This tool cannot be and should never be alone as a diagnostic; AI cannot get certified. It’s a human’s decision at the end of the day.”
Olivia Petric, an SLP who currently works as a healthcare consultant, echoed the sentiment that AI should serve as a tool for SLPs rather than a replacement.
“I believe the role of AI in healthcare and in speech-language pathology should be as a co-pilot designed to save time and allow clinicians to focus on work only a trained clinician could do,” shared Olivia.
However, Olivia also discussed potential concerns about job loss due to the automation of administrative and assessment tasks (Farhud & Zokaei, 2021). Though a common theme across all three interviews was that AI may support but not replace SLPs in providing services, this concern exists. Speech-Language and Audiology Canada refers to AI as posing potential threats to SLPs (Sperry, 2024), and a recent survey found that clinicians and student clinicians felt AI could replace their role (Austin et al., 2025). A controversy erupted on social media with the introduction of Jessica, an AI speech therapy platform claiming to provide “personalized speech therapy sessions tailored to each user's unique needs” (Better Speech, 2024, para. 2).
“AI is a tool that requires oversight from a knowledgeable individual who is able to identify AI’s mistakes and correct them. Our role is to think critically to make the best decisions for our patients. There will need to be intervention from professional colleges to create practice standards and educational opportunities to help clinicians understand how to leverage AI in ways that benefit each individual’s practice.” – Olivia Petric
Olivia recently spoke to the University of Toronto SLP Year 2 students regarding ethical considerations such as consent, privacy, and the handling of personal health information. She used this opportunity to discuss AI and some of the ethical considerations of its use. Olivia stated that it would be beneficial for clinicians to receive more education surrounding AI, as well as opportunities to interact with these new technologies.
“Education would also help to combat the opposite from occurring, which is clinicians becoming complacent and not understanding the limitations and biases of AI.” – Olivia Petric
To conclude, when I reflect upon the use of AI in SLP as a student, I think about one of the principles of neuroplasticity discussed in our coursework: Use It or Lose It. It has been established that “new neurons are kept alive by effortful learning, a process that involves concentration in the present moment of experience over some extended period of time” (Shors et al., 2012, p. 450). Therefore, we must think critically about the implications for our own performance when we choose to use AI to accomplish tasks as SLPs. Using ChatGPT to create treatment activities or AI algorithms to screen children for speech sound disorders is very different from offloading clinical interpretation and decision-making to a computational system. Clinical practice in SLP requires constant reflection and examination of one's own limitations, influences, and values, whereas AI models simply do what they are told using the information they have been fed, information that may frankly be incorrect, out of date, and biased. One can also consider the implications of SLP students utilizing AI in foundational courses as it relates to skill development and independent clinical reasoning (see Walsh, 2025, for a discussion of AI use in higher education).
Furthermore, if it is necessary for SLPs to rely on AI platforms in order to meet workplace demands, what does this say about the expectations placed upon them?
Overall, if AI is used as a tool to supplement SLP practice, its effectiveness and utility depend on SLPs. It will be interesting to see how AI and other technological advancements reshape the field in the coming years.
As a note, there are many aspects of AI that this article did not touch on, such as environmental impact. For readers interested in learning more, check out the AI Index from Stanford University at https://hai.stanford.edu/ai-index.
Or, you could just ask ChatGPT.
A huge thanks to Goldie Litvack, Dr. Monika Molnar, and Olivia Petric for their participation in this article. Additional thanks to Vicky Luo and Devora Goldberg for their help with the conceptualization of this article, and to Adrienne Yau and Abiramy Thayanantha for editing support.
References
Appel, G., Neelbauer, J., & Schweidel, D. A. (2023, April 7). Generative AI has an intellectual property problem. Harvard Business Review. https://hbr.org/2023/04/generative-ai-has-an-intellectual-property-problem
Austin, J., Benas, K., Caicedo, S., Imiolek, E., Piekutowski, A., & Ghanim, I. (2025). Perceptions of artificial intelligence and ChatGPT by speech-language pathologists and students. American Journal of Speech-Language Pathology, 34(1), 174-200. https://doi.org/10.1044/2024_AJSLP-24-00218
Better Speech. (2024). Meet Jessica - the first AI speech therapist helper. Retrieved April 20, 2025, from https://www.betterspeech.com/post/jessica-ai-speech-therapy-helper
Cordella, C., Marte Manuel, J., Liu, H., & Kiran, S. (2025). An introduction to machine learning for speech-language pathologists: Concepts, terminology, and emerging applications. Perspectives of the ASHA Special Interest Groups, 10(2), 432-450. https://doi.org/10.1044/2024_PERSP-24-00037
Farhud, D. D., & Zokaei, S. (2021). Ethical issues of artificial intelligence in medicine and healthcare. Iran Journal of Public Health, 50(11), i-v. https://doi.org/10.18502/ijph.v50i11.7600
Girardi, A. M., Cardell, E. A., & Bird, S. P. (2023). Artificial intelligence in the interpretation of videofluoroscopic swallow studies: Implications and advances for speech–language pathologists. Big Data and Cognitive Computing, 7(4), 178. http://doi.org/10.3390/bdcc7040178
Heidi Health. (n.d.). AI medical scribe for Canadian clinicians. Retrieved April 1, 2025, from https://www.heidihealth.com/en-ca
Kennedy, B., Tyson, A., & Saks, E. (2023, February 15). Public awareness of artificial intelligence in everyday activities. Pew Research Center. Retrieved March 15, 2025, from https://www.pewresearch.org/science/2023/02/15/public-awareness-of-artificial-intelligence-in-everyday-activities/
OpenAI. (2024a). New funding to scale the benefits of AI. Retrieved April 20, 2025, from https://openai.com/index/scale-the-benefits-of-ai/
OpenAI. (2024b). Privacy policy. Retrieved April 20, 2025, from https://openai.com/policies/row-privacy-policy/
Pequeño, A. (2024, October 2). OpenAI valued at $157 billion after closing $6.6 billion funding round. Forbes. https://www.forbes.com/sites/antoniopequenoiv/2024/10/02/openai-valued-at-157-billion-after-closing-66-billion-funding-round/
Samborska, V. (2025, January 20). Scaling up: How increasing inputs has made artificial intelligence more capable. Our World in Data. https://ourworldindata.org/scaling-up-ai
Shors, T. J., Anderson, M. L., Curlik, D. M., 2nd, & Nokia, M. S. (2012). Use it or lose it: How neurogenesis keeps the brain fit for learning. Behavioural Brain Research, 227(2), 450-458. https://doi.org/10.1016/j.bbr.2011.04.023
Sperry, D. (2024). AI and SLP: Friend or foe? Exploring the use of artificial intelligence in speech-language pathology. Speech-Language and Audiology Canada. https://www.sac-oac.ca/event-education/ai-slp/
Stryker, C., & Kavlakoglu, E. (2024, August 9). What is artificial intelligence (AI)? International Business Machines Corporation. https://www.ibm.com/think/topics/artificial-intelligence
Uyeno, G. (2025, March 12). New brain tech gives voice to ALS patients: Cognixion’s headset offers a communication tool for people with locked-in syndrome. Institute of Electrical and Electronics Engineers. https://spectrum.ieee.org/als
Walsh, J. D. (2025, May 7). Everyone is cheating their way through college. New York Magazine. https://nymag.com/intelligencer/article/openai-chatgpt-ai-cheating-education-college-students-school.html