To what extent can students use AI tools (e.g. ChatGPT, Claude, Grammarly) during the drafting phase of written assessments?
Students may also use generative AI tools to support their assessment development and writing. These tools are increasingly integrated into the software students use, and at times students may be using them without realising it (including the predictive writing support provided by many word processors). While the use of these tools is appropriate, students must ensure that the work submitted for assessment is their own. These tools can only be used to support a student’s own writing processes, not to replace them.
For example, it may be appropriate for a student to use an app that provides suggestions they can consider to improve their writing during the drafting process. However, it is not appropriate for students to enter their draft into an app that changes the syntax and structure of the text without the student making decisions about phrasing.
Is the use of AI to generate practice questions, simulate arguments, or provide brainstorming prompts considered acceptable as part of the learning process (not the final assessment)?
Students can access generative AI sources to research and inform their assessment, just as they would a textbook or other traditional information source. As above, it may be appropriate for a student to use AI to brainstorm topics or compile lists of resources at the beginning of, or during, an assessment task. This could also include generating questions in order to prepare for a test or examination.
It is expected that students will acknowledge any use of AI in a way that is appropriate for the subject and school context. This acknowledgement should declare which tools were used and provide a list of all prompts entered to generate information for the task. This practice is particularly useful for tasks where individual sources are not directly referenced throughout, or where the AI provided broader support for the student’s work. In some cases, such as image-generating AI, it would also be appropriate to provide the output images generated and any reference images entered into the tool.
In some cases, it is also appropriate for students to make specific references to AI-generated work used throughout their task, as they would when citing other information sources. In most cases, this means providing a reference to work created by generative AI when it is quoted or paraphrased in the task, including: the name of the AI tool used, a link to access the resource (if appropriate), and any prompts that were entered to generate the response.
What are the Board’s current expectations for educators in terms of supervising or verifying that AI has not been misused, particularly for take-home tasks?
The SACE Board does not specify which ways of accessing knowledge are valid across subjects. We rely on the discipline expertise of educators to teach students how to evaluate the many sources available to them, and which are appropriate for use in their disciplines. We encourage students to research broadly across a range of sources and to always view those sources critically, as we know their teachers have taught them to.
Further advice and clarification can be found in our Supervision and Verification of Students’ Work Policy and Procedure: https://www.sace.sa.edu.au/documents/652891/91d6c2ae-1e6d-4d07-8c03-6abd619f1070
Are there any plans to release updated or subject-specific exemplars that demonstrate appropriate AI integration or boundaries within SACE assessments?
We currently have no plans to release subject-specific exemplars around the use of AI.
Jane Marshall
Faculty Manager (SACE Board)
Education Services
While we still don’t have a national policy for AI in education (it’s on its way…), several states have released guidelines. The NSW Department of Education, SACE, and QCAA all have dedicated pages with AI advice, and I’m sure more will come soon.
Each of these policies has strengths and limitations, but they all offer a tentative first step towards clear approaches for secondary schools. All three institutions share a common view of generative AI as a tool with substantial potential for learning and assessment purposes, but caution against uncritical use. While the SACE Board and the NSW Department make specific references to ChatGPT, the QCAA addresses AI technologies more broadly, encompassing chatbots, deep learning, machine learning, and natural language processing.
The guidelines from all three entities stress the importance of creating original work and not presenting AI-generated content as the student’s own. They agree that AI usage should be cited, with specific details provided when the generated content is quoted or paraphrased. This emphasis underscores the shared commitment to academic integrity.
There are, however, differences in the approach to assessment practices. The SACE Board allows AI in school-based assessments and task design but prohibits its use in external exams. The NSW Department focuses more on the need for staff to verify the accuracy and suitability of AI content, and the QCAA recommends incorporating guidelines on AI usage into school assessment policies.
The importance of data privacy and accuracy is highlighted differently across the guidelines. The NSW Department provides a detailed approach to safeguarding personal information, including de-identification and anonymisation, while the QCAA suggests educating users about potential privacy and accuracy issues associated with AI use. The SACE Board does not explicitly address this issue.
Finally, all institutions advocate for continuous learning and adaptation, both in writing school policies and in using these technologies. The NSW Department emphasises mandatory staff training in cybersecurity and child protection, while both the QCAA and SACE recommend schools use their existing academic integrity courses and resources to discuss ethical scholarship with students.
Leon Furze - https://leonfurze.com/2024/08/28/updating-the-ai-assessment-scale/