The Programmatic Assessment and AI Review (PAAIR) project is part of a suite of assessment advancement activities conducted in collaboration with the Office of the Deputy Vice Chancellor of Education (DVCE). The University's aim for the review is to ensure our courses are calibrated for contemporary contexts, engaging with design that enables a programmatic approach to assessment and learning, and advancing assessment by strategically integrating AI or securing assessments from its use, as needed. These actions play a role in addressing the plan set out in the Tertiary Education Quality and Standards Agency (TEQSA) 2024 Request for Information (RFI) on AI in education. To learn more about the origins of the review and the University's goals, refer to PAAIR: TeachHQ.
University principles for programmatic approaches to assessment
To guide this work, Monash University has articulated a set of programmatic assessment principles, adapted from Heeneman et al. (2021). The principles are grouped into three guiding themes:
Continuous and meaningful feedback
Mix of assessment methods used over time
Equitable and credible decision-making processes
To explore the principles, refer to Programmatic approaches to assessment: TeachHQ.
Faculty vision for PAAIR
Our vision is to strengthen the established foundations of assessment reform, to create an environment in which every course benefits from exceptional leadership and innovation that produces evidence-informed educational practice, and to further embed the principles of programmatic assessment into all our courses. We will achieve this through empathy, equity, connection, responsiveness, flexibility, transparency, and professional and personal identity when enacting digital literacy within health care environments.
Faculty goals for PAAIR
Strengthen the existing assessment reform foundations, which were established over the last two years
Facilitate leadership and educational enrichment to provide evidence-informed education contextualised for MNHS
Embed the core principles of programmatic approaches to assessment into all courses in the Faculty
Ensure students' and industry partners' voices are included in assessment reform and programmatic design
Review the role of emerging technologies in discipline specific practice and research, embedding strategies to improve digital literacy
Got a question about PAAIR? Scroll down to our FAQ Padlet to ask your question (or see if someone else already has).
In this Panopto playlist, Claire Palermo and Tim Fawns discuss Designing programmatic assessment at a discipline level, and Tim Fawns and Ari Seligman discuss Using artificial intelligence - new options for securing assessments.
Watch on Panopto here (19 mins).
CRADLE Deakin ran a seminar in July 2024 on Assessment beyond the individual unit/module, exploring the what, why and how of systemic approaches to assessment.
Watch below (1 hour 25 mins).
Sylvia Heeneman et al. (2021) present a consensus statement and principles for programmatic assessment (referenced in the Monash Programmatic approaches to assessment).
Access here.
Kimberly Wilder-Davis et al. (2021) discuss the benefits of a programme-level approach to feedback, offering a list of principles to support implementation.
Access here.
Use the Padlet board below to review Frequently Asked Questions (FAQs) and/or post your own. The EDS team will monitor the board and respond to your questions as soon as possible. Click on the plus icon underneath the relevant section heading to add your question(s).
Programmatic approaches to assessment
Baartman, L. K. J., et al. (2022). Programmatic assessment design choices in nine programs in higher education. Frontiers in Education, 7. https://doi.org/10.3389/feduc.2022.931980
Baartman, L. K. J., & Quinlan, K. M. (2023). Assessment and feedback in higher education reimagined: using programmatic assessment to transform higher education. Perspectives: Policy and Practice in Higher Education, 28(2), 57–67. https://doi.org/10.1080/13603108.2023.2283118
CRADLEdeakin. (2025). CRADLE Seminar Series 2025 #1: Programmatic assessment: Insights from theory and practice, with Liesbeth Baartman. https://www.youtube.com/watch?v=_ZK9XpK8dek
Schuwirth, L. W. T., & Van der Vleuten, C. P. M. (2011). Programmatic assessment: From assessment of learning to assessment for learning. Medical Teacher, 33(6), 478–485. https://doi.org/10.3109/0142159X.2011.565828
The University of Sydney. (2024). Program level assessment and the two-lane approach. https://educational-innovation.sydney.edu.au/teaching@sydney/program-level-assessment-two-lane/
Van der Vleuten, C. P. M., et al. (2012). A model for programmatic assessment fit for purpose. Medical Teacher, 34(3), 205–214. https://doi.org/10.3109/0142159X.2012.652239
Van der Vleuten, C. P. M., Schuwirth, L. W. T., Driessen, E. W., Govaerts, M. J. B., & Heeneman, S. (2014). Twelve tips for programmatic assessment. Medical Teacher, 37(7), 641–646. https://doi.org/10.3109/0142159X.2014.973388
Artificial intelligence & assessment
Bassett, M. (2025). DVCA Portfolio GenAI in HE Primer. Charles Sturt University on Vimeo.
Charles Sturt University. (2025). Rethinking assessment strategies in the age of artificial intelligence. https://www.csu.edu.au/division/learning-teaching/assessments/assessment-and-artificial-intelligence/rethinking-assessments
Lodge, J., et al. (2023). Assessment reform for the age of artificial intelligence. TEQSA. https://www.teqsa.gov.au/guides-resources/resources/corporate-publications/assessment-reform-age-artificial-intelligence
Monash University. (n.d.). AI in Education Learning Circle. https://www.ai-learning-circle-mon.com/
Monash University. (2025). AI and Assessment. https://www.monash.edu/learning-teaching/TeachHQ/Teaching-practices/artificial-intelligence/ai-and-assessment