The Australian eSafety Commissioner defines a digital footprint as "the name given to the unique data that’s created from online activities" (Australian Government, 2024). This includes data collected when students interact with AI tools such as chatbots and similar applications. It is possible that these early interactions will shape educational and career opportunities in the future.
Data can be used to enhance learning by providing evidence-based, scalable support that is easy to understand, age-appropriate, and matched to the needs of students.
Enhances learning opportunities and digital literacy skills.
Establishes more effective and better-informed conversations about consent regarding data collection and use.
Some AI tools are not age-restricted, putting younger students at risk of accessing inappropriate content surfaced through the collated digital footprints of other users.
Biased information may be presented as learning material, which can impede learning progression and the development of deeper understanding.
Data from a student's digital footprint could be accessed and reused by other generative AI tools and platforms.
The wellbeing of students may be harmed if data is inappropriately used.
Historical learning data may be used to generate career suggestions, potentially narrowing options on the basis of outdated information.
The eSafety Commissioner has released the following guide to support schools in managing risks and protecting data. For more information about the potential risks and ways to minimise them, follow the link below:
eSafety Guide - information on the tools and apps students regularly use.
A neurodivergent Year 10 student attends a school that uses an AI-powered classroom analytics tool to monitor engagement. The system tracks eye movement, facial expressions, and time spent on tasks. The student often looks away when thinking and prefers to work offline. The AI flags the student as “disengaged” and sends automated alerts to teachers. Without context, the student is placed in a support program for underperforming students. The student's parents are unaware that the system is interpreting a learning style as a deficit. The school later reviews the tool after concerns from staff and discovers that the algorithm was not calibrated for neurodiverse behaviours: a digital footprint misread as a red flag.
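To make the failure concrete, here is a minimal sketch of the kind of rule such a tool might apply, assuming a simple threshold on gaze and time-on-task. Every name, field, and threshold below is hypothetical and invented for illustration; it is not any actual product's logic.

```python
# Hypothetical sketch only: a naive engagement rule of the kind the scenario
# describes. All field names and thresholds are invented for illustration.

from dataclasses import dataclass


@dataclass
class ObservationWindow:
    """One monitoring window for a single student (synthetic fields)."""
    seconds_gaze_on_screen: float    # time the eye tracker saw the screen
    seconds_on_task_software: float  # time spent inside the monitored app
    seconds_total: float             # length of the window


def flag_disengaged(w: ObservationWindow,
                    gaze_threshold: float = 0.6,
                    on_task_threshold: float = 0.5) -> bool:
    """Flag 'disengaged' when gaze or in-app time falls below fixed cut-offs.

    The cut-offs encode the assumption that engagement always looks like
    on-screen gaze and time in the monitored software: exactly the assumption
    that misreads a student who looks away to think or who works offline.
    """
    gaze_ratio = w.seconds_gaze_on_screen / w.seconds_total
    on_task_ratio = w.seconds_on_task_software / w.seconds_total
    return gaze_ratio < gaze_threshold or on_task_ratio < on_task_threshold


# A student who thinks while looking away and drafts answers on paper:
window = ObservationWindow(seconds_gaze_on_screen=150,
                           seconds_on_task_software=200,
                           seconds_total=600)
print(flag_disengaged(window))  # True: the rule reads the behaviour as a deficit
```

Note that nothing here is a coding bug; the rule does exactly what it was written to do. The harm comes from the unexamined assumptions baked into the thresholds, which is why calibration for neurodiverse behaviours matters.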
A young student engages with an AI-powered careers guidance tool and inputs that they’re interested in fashion, dislike public speaking, and feel overwhelmed in maths. That data is recorded and stored as part of their learner profile. Years later, when the student uses a different AI-enhanced platform in high school, the system draws on this historical data to recommend design-focused careers and to steer them away from pathways involving STEM or leadership. The student never realises that their earlier responses, made at a time of insecurity or uncertainty, are still shaping the algorithm's assumptions about their future. This illustrates how digital footprints can follow learners across systems and years, affecting their opportunities without transparent oversight or any ability to revise outdated self-representations.
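The sketch below shows this carry-over mechanism in miniature, assuming a stored profile and a simple filter-style recommender. The profile fields, career list, and matching logic are all hypothetical, chosen purely to mirror the scenario above.

```python
# Hypothetical sketch only: how a stored learner profile can keep steering
# recommendations years after it was recorded. All data here is invented.

from datetime import date

# Profile captured when the student was young and unsure of themselves,
# then carried forward into a different platform unchanged.
learner_profile = {
    "recorded": date(2019, 5, 3),
    "interests": {"fashion"},
    "avoids": {"public_speaking", "maths"},
}

careers = [
    {"name": "Fashion designer",    "needs": {"fashion"}},
    {"name": "Data scientist",      "needs": {"maths"}},
    {"name": "Engineering manager", "needs": {"maths", "public_speaking"}},
]


def recommend(profile: dict, options: list[dict]) -> list[str]:
    """Drop any pathway that touches a recorded dislike, however old the
    record is. Nothing in this logic re-checks whether the self-assessment
    still holds, and nothing tells the student why options vanished."""
    return [c["name"] for c in options
            if not (c["needs"] & profile["avoids"])]


print(recommend(learner_profile, careers))
# ['Fashion designer']: STEM and leadership pathways never surface, because
# a years-old entry silently filtered them out.
```

Notably, the profile stores a `recorded` date, but the recommender never consults it; storing provenance is not the same as acting on it, which is why outdated self-representations persist.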
Australian Government. (2024, February 26). Digital footprint. eSafety Commissioner. https://www.esafety.gov.au/young-people/digital-footprint
Australian Government. (2022). AI companions: Information sheet. eSafety Commissioner. https://www.esafety.gov.au/educators/training-for-professionals/professional-learning-program-teachers/ai-companions-information-sheet
Department of Education. (2023, November). Australian Framework for Generative Artificial Intelligence (AI) in Schools. Education.gov.au. https://www.education.gov.au/download/17416/australian-framework-generative-artificial-intelligence-ai-schools/35399/australian-framework-generative-ai-schools/docx