In the last section we saw that running AI systems requires significant computing power and a reliable internet connection. These resources are not equally available everywhere, which leaves many communities at a disadvantage before they can even access or use the tools.
At its core, the challenge is bringing AI into classrooms with the resources that are already available. In many LMICs, mobile phone penetration is higher than that of any other device, while data remains expensive. So the question becomes: how can we work around these constraints?
Start with the apps you already use
Teachers and students can access AI through familiar apps on their phones, even with limited internet. In some cases, social bundles or zero-rated data packages make these apps free or much cheaper to use than regular browsing. All the heavy computing happens elsewhere, on a remote server. The app is simply the doorway.
Run AI directly on the devices you have
Advances in compressing AI models now make it possible to run them on devices with limited memory, completely offline. This reduces the need for constant internet access, though there are still limits based on the device’s computing power and storage.
Set up a central server that works as a hotspot
If individual devices cannot handle the load, a classroom or school can set up a small local server that works as a WiFi hotspot. Students connect to it and the AI runs from the server instead of their devices. This removes the burden from each phone or laptop, though it shifts the challenge to maintaining the central server.
Before introducing any technical complexity, it helps to start with the apps that teachers and students are already using. WhatsApp is one of the most powerful of these tools. It works in low-bandwidth environments and, in many LMICs, is included in social bundles or zero-rated data packages, which makes it free or much cheaper than regular browsing.
What makes WhatsApp particularly interesting is that it already supports AI models, so people can interact with AI directly through chat. Many organizations have built education-focused solutions on top of it, such as:
TheTeacher.AI – provides AI-driven pedagogical advice, lesson plans, and learning assessments designed to support teachers in remote areas of Sub-Saharan Africa.
Rori (Ghana) – an AI math tutor that helps students in grades 3–9 improve their scores through interactive exercises.
Dzidzo PaDen / Imfundwe’ndlini (Zimbabwe) – delivers curriculum-aligned resources such as notes, past exam papers, and SRHR materials to learners with limited internet.
This is not limited to WhatsApp. Other platforms are also being used in creative ways:
On Telegram, Solve Education!’s Ed the Learning Bot in Nigeria delivers gamified learning modules through a lightweight chat interface.
Through SMS, M-Shule in Kenya provides AI-powered micro-courses and personalized tutoring that reach students even on basic phones with no internet access.
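Under the hood, most of these chat-based tutors follow a similar pattern: an incoming message reaches a webhook, an AI model drafts a reply, and the reply is sent back over the same channel. As a rough illustration only, here is a minimal sketch assuming the WhatsApp Business Cloud API; the access token, phone-number ID, and generate_reply() helper are placeholders, and the products listed above may be built quite differently.

```python
# Minimal sketch of a chat-based tutor webhook, assuming the WhatsApp Business
# Cloud API. The credentials and generate_reply() are placeholders, not a
# description of how Rori, TheTeacher.AI, or the other tools actually work.
import os
import requests
from flask import Flask, request, jsonify

app = Flask(__name__)

ACCESS_TOKEN = os.environ["WHATSAPP_TOKEN"]      # placeholder credential
PHONE_NUMBER_ID = os.environ["PHONE_NUMBER_ID"]  # placeholder business number ID
GRAPH_URL = f"https://graph.facebook.com/v17.0/{PHONE_NUMBER_ID}/messages"


def generate_reply(question: str) -> str:
    """Placeholder: call whichever AI model the deployment actually uses."""
    return f"Let's work through this together: {question}"


@app.route("/webhook", methods=["POST"])
def webhook():
    # Incoming messages arrive as JSON; pull out the sender and the text.
    data = request.get_json()
    try:
        message = data["entry"][0]["changes"][0]["value"]["messages"][0]
        sender = message["from"]
        text = message["text"]["body"]
    except (KeyError, IndexError):
        return jsonify(status="ignored"), 200    # status updates, non-text messages

    # Send the AI-generated answer back through the same chat channel.
    requests.post(
        GRAPH_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={
            "messaging_product": "whatsapp",
            "to": sender,
            "type": "text",
            "text": {"body": generate_reply(text)},
        },
        timeout=30,
    )
    return jsonify(status="ok"), 200


if __name__ == "__main__":
    app.run(port=5000)
```

The same shape carries over to Telegram bots or SMS gateways: only the channel-specific send-and-receive calls change, while the AI step in the middle stays the same.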
If you need a fully offline solution that does not rely on the internet at all, you can run the AI model directly on your own device. This means the phone, tablet, or laptop itself does the computing. In recent years, there have been many advances in compressing AI models so they can fit on devices with limited memory.
On mobile devices, apps like PocketPal allow you to download AI models and use them offline. These can include both language models and multimodal models (text and images).
On desktops and laptops, lightweight frameworks such as LM Studio, Ollama, GPT4All, and Llamafile let you run smaller models directly on your computer. These tools are designed so that even without cloud servers, you can ask questions, generate lesson plans, or work with text completely offline.
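As a rough sketch of what this looks like in practice, the snippet below asks a question of a model running locally through Ollama's HTTP API. The model name and prompt are only examples, and it assumes a small model has already been downloaded onto the machine (for instance with `ollama pull llama3.2`).

```python
# Sketch: querying a model that runs entirely on the local machine via Ollama.
# After the model has been downloaded once, no internet connection is needed.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",  # Ollama listens locally by default
    json={
        "model": "llama3.2",                # example model name; any pulled model works
        "prompt": "Draft a 40-minute lesson plan on fractions for grade 5.",
        "stream": False,                    # return the full answer in one response
    },
    timeout=300,
)
print(response.json()["response"])
```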
For voice-based AI, tools like Whisper.cpp, Vosk, and Picovoice allow speech recognition to run locally on devices. This can make AI more accessible for classrooms with literacy challenges or multiple local languages. For example, teachers can use offline transcription to create captions for lessons, or students can interact with AI by speaking instead of typing.
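For instance, a short script along the following lines could transcribe a recorded lesson entirely offline using Vosk's Python package. The model folder and audio file names are placeholders, and it assumes a Vosk language model has already been downloaded onto the device.

```python
# Sketch: offline transcription of a recorded lesson with Vosk (pip install vosk).
# Assumes a Vosk language model has been unpacked into "vosk-model" and that
# lesson.wav is a 16 kHz mono WAV recording.
import json
import wave

from vosk import Model, KaldiRecognizer

model = Model("vosk-model")                 # path to the downloaded language model
wf = wave.open("lesson.wav", "rb")
recognizer = KaldiRecognizer(model, wf.getframerate())

transcript = []
while True:
    chunk = wf.readframes(4000)             # feed the audio in small chunks
    if len(chunk) == 0:
        break
    if recognizer.AcceptWaveform(chunk):
        transcript.append(json.loads(recognizer.Result())["text"])
transcript.append(json.loads(recognizer.FinalResult())["text"])

print(" ".join(transcript))                 # lesson captions, produced fully offline
```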
This approach reduces dependence on constant connectivity, though it is still limited by the storage and processing power of the device you are using.
Sometimes the devices you have simply do not have enough memory or processing power to run AI models. In that case, a school or community can set up a central server that works as a hotspot. This way, the heavy computing happens on one shared device, and teachers and students connect to it through WiFi just like they would connect to the internet.
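To make the idea concrete, here is a hypothetical sketch of the student side, assuming the classroom server exposes its model through an Ollama-style HTTP endpoint on the local network. The address, model name, and prompt are illustrative only; real products may expose their AI features differently.

```python
# Sketch: a student device on the classroom WiFi querying the shared server.
# The server address below (192.168.4.1) is an illustrative local-network
# address; the heavy computing happens on the server, not on the phone or laptop.
import requests

SERVER = "http://192.168.4.1:11434"          # the classroom server's local address

answer = requests.post(
    f"{SERVER}/api/generate",
    json={
        "model": "llama3.2",                  # example model hosted on the server
        "prompt": "Explain photosynthesis in simple terms with a local example.",
        "stream": False,
    },
    timeout=300,
).json()["response"]

print(answer)
```

Many devices can share the one server this way, which is what removes the burden from each individual phone or laptop.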
One good example is Beekee Box, a portable server that already works as a classroom hotspot for offline digital content. Beekee has begun experimenting with running large language models directly on its device, which means AI could be accessed in classrooms even without internet.
Another example is Kolibri, a widely used offline learning platform. Kolibri can act as a central hub for curriculum-aligned resources, and it is beginning to explore AI capabilities that would allow teachers and students to get personalized support aligned with national curricula.
If you would like to see more examples of central server solutions, you can explore them here and here. They are not necessarily AI by design, but they show how offline hubs can help address challenges in limited-resource environments.