This site follows WCAG 2.1 AA accessibility guidelines. For support or alternative formats, contact the AI and Accessibility project team.
Artificial Intelligence (AI) Basics
Artificial Intelligence (AI) - AI refers to computer systems that can perform tasks that normally require human intelligence, such as reasoning, pattern recognition, and learning. In education, AI can support accessibility, automate feedback, and personalize learning experiences, though it must be used responsibly to ensure transparency and accountability (Educause, 2023; Cheong, 2024; UNESCO, 2023).
Algorithm - An algorithm is a set of programmed instructions that tell a computer how to solve a problem or perform a task. Algorithms are the foundation of AI systems and directly affect how data is processed and decisions are made (Cheong, 2024).
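To make the idea of "a set of programmed instructions" concrete, here is a minimal illustrative sketch (not drawn from the cited sources): an algorithm that finds the largest number in a list by following explicit, repeatable steps.

```python
def find_largest(numbers):
    """Return the largest value in a non-empty list.

    Each line is an explicit instruction, which is what makes this
    an algorithm: start with the first value, compare it against
    every remaining value, and keep whichever is larger.
    """
    largest = numbers[0]           # step 1: assume the first value is largest
    for value in numbers[1:]:      # step 2: examine every remaining value
        if value > largest:        # step 3: compare against the current best
            largest = value        # step 4: update when a larger value appears
    return largest                 # step 5: report the result

print(find_largest([3, 41, 8, 12]))  # prints 41
```

AI systems rest on far more complex algorithms than this, but the principle is the same: the instructions determine exactly how data is processed and what decision comes out.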
Machine Learning (ML) - Machine Learning is a branch of Artificial Intelligence that allows computers to identify patterns and improve their performance through experience, without being explicitly programmed. The system learns by analyzing large amounts of data to make predictions or decisions (Educause, 2023; UNESCO, 2023; Cheong, 2024). In education, ML can be used to personalize learning materials or automatically generate accessibility supports, though it must be carefully monitored to avoid biased results (Cheong, 2024).
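A tiny sketch of "learning a pattern from data" (the data here is hypothetical, invented purely for illustration): fitting a straight line to past observations and using it to predict a new case. Real ML models are far more elaborate, but they follow the same pattern of estimating parameters from examples rather than being hand-programmed.

```python
def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Hypothetical training data: hours studied vs. quiz score
hours  = [1, 2, 3, 4, 5]
scores = [52, 60, 68, 76, 84]

a, b = fit_line(hours, scores)
print(f"predicted score for 6 hours: {a * 6 + b:.0f}")  # prints 92
```

Notice that no rule like "each hour adds 8 points" was ever written into the program; the relationship was estimated from the data, which is the essence of machine learning (and also why biased data produces biased predictions).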
Generative AI - Generative AI refers to advanced AI models such as ChatGPT that can create new content like text, images, audio, or video based on patterns learned from existing data (OpenAI, 2025; MasterWriter, 2025).
Large Language Model (LLM) - A Large Language Model is a type of AI that learns from huge amounts of text to understand and create human-like language. It can write, summarize, translate, or explain information based on what it has learned (OpenAI, 2025; Educause, 2023).
Prompt - A prompt is the input or instruction a user gives an AI system to generate a response. The quality and clarity of the prompt influence how accurate or useful the AI’s output will be (OpenAI, 2025; MasterWriter, 2025). For example, asking ChatGPT to write Alt Text for an image of a student using a screen reader helps the AI generate a focused, accessible description.
Bias in AI - Bias in AI occurs when data or algorithms produce unfair outcomes that reflect or reinforce social inequalities. Ensuring fairness requires transparency, accountability, and human oversight (Cheong, 2024; Fedele et al., 2024).
Accessibility - Accessibility means designing digital spaces, tools, and materials so everyone can use them, including people with disabilities. It ensures content can be seen, heard, and understood in different ways (Level Access, 2024; W3C, 2018; CAST, 2018).
Inclusive Design - Inclusive Design focuses on creating products and learning materials that work for the widest range of people. It values different abilities, experiences, and needs so that everyone feels included and supported (CAST, 2018; UNESCO, 2023).
Universal Design for Learning (UDL) - UDL is a framework for designing inclusive learning environments that give all students equal opportunities to succeed by providing multiple means of engagement, representation, and expression (CAST, 2018).
Assistive Technology (AT) - Assistive technology includes digital tools and software designed to support individuals with disabilities, such as screen readers, speech-to-text programs, or transcription tools like Otter.ai and Microsoft Stream (Otter.ai, 2023; Microsoft, 2023).
Alt Text (Alternative Text) - Alt Text is descriptive text added to images that allows people using screen readers to understand visual content. Effective Alt Text improves accessibility for blind and visually impaired users (Alcázar, 2022; Tri Julianto et al., 2023).
Closed Captions - Closed captions are on-screen text that displays spoken dialogue and relevant sound cues, making video content accessible to deaf and hard-of-hearing users (UCF Learning, 2023; YouTube Help, 2024).
Screen Reader - A screen reader is software that reads aloud what is on the screen for people who are blind or have low vision. It can describe text, links, and images when Alt Text is added (Alcázar, 2022; Tri Julianto et al., 2023).
Digital Literacy - Digital literacy means knowing how to find, use, create, and share information safely and effectively online. It includes understanding how digital tools and AI work, and how to use them responsibly (UNESCO, 2023; Educause, 2023).
Ethical AI Use - Ethical AI use means using Artificial Intelligence in fair, transparent, and responsible ways. It focuses on protecting privacy, reducing bias, and keeping humans accountable for AI decisions (Cheong, 2024; Fedele et al., 2024).
Data Privacy - Data privacy means protecting personal information that is collected or stored online. It ensures that AI tools and websites handle user data safely and only for the right reasons (Cheong, 2024; UNESCO, 2023).
Human-in-the-Loop - Human-in-the-Loop means that people are always involved in checking and guiding what AI systems do. Humans make the final decisions to keep AI tools accurate and ethical (Cheong, 2024; Fedele et al., 2024).
Accessibility Checker - An accessibility checker is a digital tool that scans websites or documents to find accessibility issues such as missing Alt Text or poor colour contrast. It helps creators make sure their content meets accessibility standards like WCAG (Skynet Technologies, n.d.; Level Access, 2024).
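As a simplified sketch of what one such check looks like (production checkers like those from the vendors cited above are much more thorough), the snippet below uses Python's standard-library HTML parser to flag images that have no alt attribute at all. Note that an intentionally empty alt="" is valid under WCAG for purely decorative images, so only a missing attribute is reported.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags with no alt attribute (one common WCAG issue)."""

    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_dict = dict(attrs)
            # alt="" is allowed for decorative images, so we only
            # report images where the attribute is absent entirely.
            if "alt" not in attr_dict:
                self.problems.append(attr_dict.get("src", "(unknown source)"))

# Hypothetical page fragment for demonstration
page = """
<p>Course page</p>
<img src="diagram.png" alt="Bar chart of weekly study hours">
<img src="logo.png">
"""

checker = AltTextChecker()
checker.feed(page)
print(checker.problems)  # prints ['logo.png']
```

A full checker would also test colour contrast, heading structure, caption availability, and keyboard navigation, but each check follows this same pattern: scan the content, compare it against a WCAG rule, and report what fails.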