"I get to experience the world in a new way, and I believe that these unique experiences that people with disabilities have are what's going to help us make and design a better world."
Elise Roy
AI and Disability refers to the use of artificial intelligence tools to support accessibility and inclusion for people with disabilities. This can include assistive technologies such as speech-to-text, screen readers, predictive text, personalised learning supports, and adaptive communication tools.
When designed and applied ethically, AI has the potential to reduce barriers, promote independence, and enhance participation in education, work, and community life.
At the same time, there are risks and considerations, including bias, privacy concerns, over-reliance on technology, and the need to ensure tools are co-designed with people with disabilities.
This page helps educators explore how AI can play a supportive role in accessibility and disability inclusion without replacing the essential human relationships that underpin learning and participation.
If you're new to the topic: Start with the overview below of what disability is and how models of disability shape the use of AI.
If you want practical strategies: See Benefits, Limitations, and Challenges for actionable insights.
If you need examples and resources: Explore Case Studies, Real World Examples, Research Papers, and Webinars and Talks.
Use this page as a conversation starter with colleagues, inclusion leads, and accessibility coordinators.
Note: The tools included on this page are not recommendations or endorsements. Each tool is free, or has a free option, so educators can explore it.
We must first understand what disability is. In Australia, there has been a common shift from the medical model of disability to the social model.
These models shape how we view disability and, in turn, how we engage with and use AI.
Full image available at: https://piet.apps01.yorku.ca/assistivetechreadingsinreview/
Individual identities, cultures, and contexts are complex, and as such, AI tools must respect and reflect this complexity.
It is common for disability, and related ableism, to co-occur with other forms of discrimination.
These combined and compounded experiences of oppression are highlighted in the following video: Don't Put People in Boxes by NewHope Church www.youtube.com/watch?v=O1islM0ytkE
Disability is diverse and often intersects with other identities and experiences. Ableism can overlap with racism, sexism, classism, and other forms of discrimination, creating compounded challenges. Explore the UDL page to find out more.
Disability is diverse, and disabilities often co-occur with one another.
"Recognizing the diversity within disability categories is crucial. Disabilities can encompass physical, sensory, cognitive, or neurological impairments, and they can vary significantly from one individual to another. Taking a user-centered perspective means acknowledging these variations and tailoring AI solutions to meet the unique needs and challenges faced by people with disabilities."
Source: https://www.scienceopen.com/hosted-document?doi=10.57197/JDR-2023-0060
AI should be used in ways that respect the individual first, not just the disability label. This means recognising the student’s identity, goals, and preferences, and actively involving them in decisions about which AI tools are used and how. Co-designing solutions with students and families ensures that AI supports are meaningful, empowering, and contextually relevant.
AI can significantly advance inclusion and accessibility, transforming how individuals with disabilities learn, work, and participate. By removing barriers and enhancing autonomy, AI can reshape disability support, fostering empowerment and self-sufficiency. AI tools can create inclusive learning environments that adapt to individual needs, improving communication and participation for students with disabilities.
AI empowers students to realise their full potential and actively engage in social, intellectual, and professional spheres, shifting the paradigm from merely embracing limits to fully embracing abilities. For instance, AI-driven assistive technologies for mobility, communication, and cognitive functions increase independence, allowing students to navigate the world on their own terms. Personalised learning platforms are a key benefit, dynamically adjusting content and strategies to individual styles, thereby promoting academic success and equity.
Furthermore, AI directly supports advocacy by providing data-driven insights into student performance and engagement. This enables educators and policymakers to develop targeted interventions and improve outcomes, leading to evidence-based policy changes that enhance equity and inclusion for students with disabilities. The integration of AI tools can also foster collaboration and community building among students, teachers, and parents, allowing students with disabilities to connect with peers and share experiences, which is vital for collective advocacy. Ultimately, AI has the potential to facilitate a shift towards a more equitable and inclusive learning environment, recognising and valuing human diversity.
Sourced from the research articles below.
AI can significantly contribute to co-designing learning for students with disabilities by actively involving them in the development and refinement of educational technologies. A key approach is user-centric design, which ensures AI solutions are tailored to the distinct needs and preferences of individuals with disabilities through continuous user input. This process goes beyond merely addressing functional limitations, aiming to align assistive technologies with users' goals and lifestyles, thereby fostering empowerment and self-confidence.
Collaboration and coproduction are essential, advocating for disabled people, advocacy groups, and experts to be partners in the research and development process from the outset. This ensures that the community's needs and viewpoints are properly reflected in AI solutions, helping to create genuinely universal technologies and avoid "disability dongles"—well-intended but often useless solutions.
Furthermore, it is crucial to centre disabled voices in discussions about implementing AI technologies, as their lived experiences offer invaluable insights for developing inclusive approaches. Empowering students with disabilities in the AI landscape also requires informed consent and meaningful participation in decision-making, ensuring they understand how these technologies work and how their data is used, and maintaining their autonomy over choices and actions. Ultimately, AI in education should be designed to empower and enhance students' existing capabilities, providing them with technological agency to co-create inclusive learning environments rather than simply being passive recipients of the technology.
Sourced from the research articles below.
AI offers substantial support for students with disabilities in fostering relationship building and human connection by significantly improving communication and facilitating social interaction.
Communication is fundamentally enhanced through AI-powered tools. Speech recognition and text-to-speech technologies empower students with physical or speech impairments to express themselves clearly, cultivating a profound sense of belonging in both social and professional settings. For students who are deaf or hard-of-hearing, real-time transcription and AI-based sign language recognition and interpretation systems eliminate communication barriers, allowing them to actively participate in classroom discussions and collaborative projects. Augmentative and Alternative Communication (AAC) devices, integrated with AI features, further support clearer verbal and non-verbal communication. These devices enable individuals to effectively express their thoughts and needs, leading to stronger friendships, more frequent and meaningful social interactions, and active engagement in family and community activities.
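To make this concrete, the sketch below shows how basic text-to-speech support can be built with the free, offline Python library pyttsx3. It is an illustrative example only, not one of the tools referenced on this page; classroom tools add voice choice, pacing controls, and integration with a student's AAC device or reading material.

```python
# Minimal text-to-speech sketch using the free, offline pyttsx3 library.
# Illustrative only; the text and rate used here are example values.
import pyttsx3

def speak(text: str, rate: int = 160) -> None:
    """Read the given text aloud at a comfortable speaking rate."""
    engine = pyttsx3.init()           # uses the platform's built-in speech engine
    engine.setProperty("rate", rate)  # words per minute; slower can aid comprehension
    engine.say(text)
    engine.runAndWait()               # block until the utterance finishes

if __name__ == "__main__":
    speak("Your group presentation starts in ten minutes.")
```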
Moreover, AI-powered social robots and virtual environments directly aid in social skill development. Humanoid robots, such as Kaspar and ZB, are specifically designed to improve social interaction skills for students with Autism Spectrum Disorder (ASD), providing companionship and facilitating engagement. Virtual Reality (VR) and avatar-based interactions can simulate real-life social scenarios, promoting social skill improvement and integration for students with intellectual disabilities. AI also fosters community building by enabling students with disabilities to connect with peers, share experiences, and receive support through online forums, virtual classrooms, and social learning networks, thereby creating more inclusive educational spaces. This, combined with the observed boost to students' emotional well-being from engaging with AI tools, significantly helps in overcoming isolation and forming meaningful relationships.
Sourced from the research articles below.
AI creates significant privacy concerns for students with disabilities due to the extensive collection and analysis of sensitive personal data. AI systems often require information about health, communication patterns, and mobility, which heightens existing worries about data protection for individuals whose lives are intertwined with multiple external systems, such as clinical care, social support, and education. This interconnectedness increases the risk of inadvertent data disclosure.
A primary ethical issue is the lack of transparent consent processes, as many students and their caregivers are not fully informed about the extent of AI-based monitoring or how their data will be used. The "black box" nature of many AI systems further complicates understanding their decision-making, fostering distrust and making it difficult to identify discriminatory practices. There is also a substantial risk that collected data could be misused for punitive rather than supportive measures, potentially causing harm to vulnerable students.
Robust privacy measures, including strong encryption, data anonymization, and secure storage protocols, are paramount to prevent unauthorized access, breaches, and potential misuse of personal information. Compliance with data protection regulations like HIPAA or GDPR is essential to safeguard privacy rights and uphold trust. AI must be developed with a keen awareness of these specific sensitivities to avoid exacerbating existing inequities.
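As an illustration of what "data anonymization" and "strong encryption" can look like in practice, the sketch below pseudonymises a student identifier and encrypts a usage record before storage, using the open-source cryptography library for Python. All names, values, and the salt are hypothetical; a real system also needs key management, access controls, retention policies, and compliance review.

```python
# Illustrative sketch: pseudonymise a student ID and encrypt a record before storage.
import hashlib
import json
from cryptography.fernet import Fernet  # pip install cryptography

SALT = b"replace-with-a-secret-salt"  # hypothetical value; keep out of source control

def pseudonymise(student_id: str) -> str:
    """Replace a student ID with a salted one-way hash."""
    return hashlib.sha256(SALT + student_id.encode()).hexdigest()

def encrypt_record(record: dict, key: bytes) -> bytes:
    """Encrypt a record (e.g. speech-to-text usage data) before writing it to storage."""
    return Fernet(key).encrypt(json.dumps(record).encode())

key = Fernet.generate_key()            # in practice, load from a secrets manager
record = {"student": pseudonymise("S12345"), "tool": "speech-to-text", "minutes": 18}
token = encrypt_record(record, key)
print(Fernet(key).decrypt(token).decode())  # only holders of the key can read the record
```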
Sourced from the research articles below.
AI creates significant bias concerns for students with disabilities, primarily because AI systems are often designed and trained based on normative assumptions about ability, intelligence, and learning. The datasets used to train these machine learning algorithms frequently lack adequate representation of minority populations, meaning the unique characteristics and needs of students with disabilities can be either overlooked or treated as "outliers," leading to distorted algorithmic responses.
This can manifest in various ways within educational AI tools. For example, AI-driven facial recognition technologies may embed normative assumptions about typical facial expressions and behaviours, potentially pathologising neurodivergent students' natural expressions as suspicious or disengaged. Similarly, e-proctoring software has demonstrated systemic bias, classifying original work by neurodivergent students or non-native English speakers as fraudulent due to variations from neurotypical norms. Chatbots often privilege linear and unambiguous communication styles, which can lead to misunderstandings or dismissals of students who communicate differently. Even personalised learning platforms, while intended to adapt, can encode prescriptive pedagogical models that may systematically invalidate nonlinear learning paths common among neurodivergent students.
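One practical response is a simple disparity check before, and while, a flagging tool is used. The sketch below uses hypothetical data and field names to compare flag rates across self-disclosed student groups; real audits need larger samples, statistical testing, and human review of individual cases.

```python
# Minimal sketch of a disparity check on an AI flagging tool (e.g. e-proctoring).
# Data and field names are hypothetical and for illustration only.
from collections import defaultdict

# Each record: whether the tool flagged the student, plus a consented, self-disclosed group label.
results = [
    {"flagged": True,  "group": "neurodivergent"},
    {"flagged": False, "group": "neurodivergent"},
    {"flagged": True,  "group": "neurodivergent"},
    {"flagged": False, "group": "neurotypical"},
    {"flagged": False, "group": "neurotypical"},
    {"flagged": True,  "group": "neurotypical"},
]

flags, totals = defaultdict(int), defaultdict(int)
for r in results:
    totals[r["group"]] += 1
    flags[r["group"]] += int(r["flagged"])

for group in totals:
    print(f"{group}: flag rate {flags[group] / totals[group]:.0%}")

# Large gaps between groups are a signal to pause the tool and investigate:
# not proof of intent, but evidence the system may encode normative assumptions.
```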
A major concern is the "black box" nature of many AI systems, which makes it challenging to identify and address these embedded biases or discriminatory patterns. This opacity, combined with the risk that collected data could be misused for punitive rather than supportive actions, means AI has the potential to amplify existing inequalities for vulnerable students. The "Blueprint for an AI Bill of Rights" acknowledges the risk of algorithmic discrimination based on protected characteristics like disability, stressing the need for proactive measures to prevent such outcomes.
Sourced from the research articles below.
An over-reliance on AI can negatively impact students with disabilities by diminishing their autonomy and fostering excessive dependence. While AI offers valuable support, it's crucial to strike a delicate balance to avoid inadvertently reducing individuals' self-efficacy and decision-making capabilities. AI should be designed to empower and enhance existing human capabilities rather than replacing them, with transparent communication and user education ensuring individuals retain agency over their choices and actions.
Furthermore, AI systems often institutionalise medicalised and individualistic models of disability, aiming to "fix" disabled individuals rather than addressing systemic barriers. This technosolutionist approach can devalue the expertise and lived experiences of students with disabilities, positioning them as passive recipients of technology rather than active co-creators of their learning environments.
The "black box" nature of many AI systems creates further concerns, as their opaque decision-making processes can foster distrust and make it difficult to identify and address embedded biases or discriminatory practices. This lack of transparency means students may not fully understand how their data is used or how AI tools shape their learning experiences, potentially leading to algorithmic marginalisation rather than genuine inclusion.
Sourced from the research articles below.
Assistive technology is broadly defined as any tool, device, or system that helps individuals with disabilities perform functions that might otherwise be difficult or impossible. Traditionally, this has included things like screen readers, hearing aids, AAC devices, or mobility aids. Increasingly, AI-powered tools are being developed that enhance or extend these supports.
The free tools on this page predominantly fit into the following categories: Augmentative and Alternative Communication (AAC); real-time transcription; writing assistance; reading support; executive functioning aids; automated image descriptions; audio description generation; real-time descriptions of surroundings; and translations, captions, and speech recognition.
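As one example of the automated image descriptions category, the sketch below drafts alt text for an image using a free, open captioning model via the Hugging Face transformers library. The model choice and file name are illustrative assumptions, and auto-generated descriptions should always be reviewed before being shared with learners.

```python
# Illustrative sketch of automated image description with an open captioning model.
from transformers import pipeline  # pip install transformers pillow

captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

def describe(image_path: str) -> str:
    """Return a short draft alt-text description for the given image."""
    result = captioner(image_path)        # e.g. [{"generated_text": "a diagram of ..."}]
    return result[0]["generated_text"]

print(describe("worksheet_diagram.png"))  # hypothetical local file
```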
For more information about these AI-enhanced assistive technologies, see:
Harrison, M., Rowlings, J., White, E. H., Vallence, M., & Potemkin, N. (2024). Neurodiversity and digital inclusion: creating the conditions for inclusive education through universal design for learning [Commissioned report]. The University of Melbourne and SMART Technologies.
This project aims to develop an understanding of how technology is utilised within inclusive classrooms, focusing on neurodiverse students and students with disability. The report explores two main themes: technology use in inclusive classrooms, and trends in research and practice.
The potential impact of Artificial Intelligence on equity and inclusion in education (OECD, 14 August, 2024)
A Conceptual Model for Inclusive Technology: Advancing Disability Inclusion through Artificial Intelligence (Almufareh, Kausar, Humayun, & Teshin, 2024)
Advancing Personalized and Inclusive Education for Students with Disability Through Artificial Intelligence: Perspectives, Challenges, and Opportunities (Ahmed, Rahman, Shamim, & Hosen, 2025)
Intellectual disability and technology: an artificial intelligence perspective and framework (Almufareh et al., 2023)
Diversity and Inclusion in Artificial Intelligence (CSIRO, 2025)
Disability and AI: Much more than assistive technologies (Scully, 2025)
The Use of Artificial Intelligence with Students with Identified Disabilities: A Systematic Review with Critique (Rice & Dunn, 2023)
Disabling AI: power, exclusion, and disability (Foley & Melese, 2025)
The Impact of AI in Advancing Accessibility for Learners with Disabilities (Gibson, 10 September, 2024)
How AI in Assistive Technology Supports Students and Educators with Disabilities (Every Learner Everywhere, April 2025)
Brenda McDermott from the University of Calgary explores how generative AI can enhance accessibility for students with invisible disabilities such as ADHD, dyslexia, and dysgraphia. She highlights practical tools like text prediction, interactive reading aids, and AI-supported writing assistance, while also emphasising ethical use, inclusivity, and the need for critical thinking as students and educators navigate AI in education.
University of Calgary. (2024). AI for Inclusion: How generative AI can support students with invisible disabilities [Video]. YouTube. https://www.youtube.com/watch?v=Y5aiBF9RiwM
Michelle Deal from Landmark College presents how AI tools can support students with learning disabilities in reading, writing, focus, time management, and organisation. She highlights prompt design for personalised learning, self-advocacy, and practical strategies for safe, responsible, and collaborative AI use.
Michelle Deal. (2024). Empowering students with disabilities: Harnessing AI for success [Video]. YouTube. https://www.youtube.com/watch?v=qtvcminrYtY
This webinar, presented by Tiana Blazevic and hosted by ADCET, explores how AI-powered tools can support students with ADHD in academic settings. It covers a range of practical strategies and study aids designed to help with organisation, focus, time management, and improving learning outcomes. Additional Resources.
ADCET. (2025, April 15). ADCET Webinar: ADHD & artificial intelligence – Strategic tools and academic practices for students with ADHD [Video]. YouTube. https://www.youtube.com/watch?v=yxrJPsMPDpM
This website contains links to numerous webinars discussing AI and inclusive education.
This website describes how AI can help neurodivergent adolescents as they transition into post-school academic environments such as university or TAFE.
We value all contributions to this page.
Please contact Alfina Jackson or Annelise Dixon on LinkedIn if you would like to contribute.