Summarizes the purpose of the evaluation.
Describes relevant stakeholders and appropriately considers them in selection of evaluation methods.
Articulates the logic model of the program: the premise about how and why the program works, including the resources involved and the short- and long-term outcomes anticipated (see the sketch following this list).
Analyzes and synthesizes information from multiple sources that address the evaluation question.
Communicates major program evaluation findings.
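The logic model named above has a conventional shape: resources/inputs feed activities, which produce outputs, which are expected to lead to short- and then long-term outcomes. Below is a minimal sketch of that structure in code, with a hypothetical example program; the field names and the sample curriculum are illustrative only, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """The premise of how and why a program is expected to work."""
    resources: list = field(default_factory=list)   # inputs the program consumes
    activities: list = field(default_factory=list)  # what the program does
    outputs: list = field(default_factory=list)     # direct products of activities
    short_term_outcomes: list = field(default_factory=list)
    long_term_outcomes: list = field(default_factory=list)

# Hypothetical example: a resident quality-improvement curriculum.
qi_curriculum = LogicModel(
    resources=["faculty time", "QI coaches", "protected curriculum hours"],
    activities=["monthly workshops", "mentored QI projects"],
    outputs=["12 workshops delivered", "8 resident QI projects completed"],
    short_term_outcomes=["improved resident QI knowledge and skills"],
    long_term_outcomes=["sustained unit-level quality improvement"],
)
```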
"Evaluation" refers to the appraisal and judgment of the effectiveness of an educational program. Because such programs come in so many shapes and for so many purposes, evaluation also take many forms. There are some basic resources in the "general" category below and more specific itens in the categories describing component steps of the evaluation process.
Note: there are many evaluation models, often named after their originator (i.e., Patton, Stufflebeam, Kellogg, etc). They all address many of the same issues but highlight different goals and priorities. Because of this, the assessment of the "evaluation" competency will often want to know why you picked one model over another - so be prepared to provide your reasoning for it.
Program Evaluation
Sklar D, et al. Developing Programs That Will Change Health Professions Education and Practice: Principles of Program Evaluation Scholarship. Acad Med. 2017;92(11):1503-1505.
Smith MJ. Handbook of Program Evaluation for Social Work and Health Professionals. Oxford University Press; 2010.
Goldie J. Evaluating educational programmes: AMEE Education Guide no. 29. Med Teach. 2006;28(3):210-224.
Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE Guide no. 67. Med Teach. 2012;34:e288-e299.
Program Evaluation Research Guide - a UM Library website containing a collection of resources about program evaluation
Dr. Rooney provides a high-level overview of program evaluation in the context of medical education.
Following this session, attendees will be able to:
Identify the primary components of a program evaluation logic model
Apply a relevant educational framework to the practical example provided
Recognize critical challenges of program evaluation and devise possible solutions for challenges in their own settings
Mayne J. Contribution Analysis: An approach to exploring cause and effect. ILAC Brief 16.
Stame N. What Doesn't Work? Three Failures, Many Answers. Evaluation. 2010;16(4):371-387.
Recognize pitfalls of evaluating impact of education interventions within complex and complicated systems
Describe the difference between attribution and contribution analysis for understanding impact (see the note following this list)
Explain the role of program theory in understanding change and impact
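On the attribution/contribution distinction above: attribution asks how much of an observed change the program itself caused, which is naturally expressed as a counterfactual difference, while contribution analysis (Mayne) builds and tests a credible performance story linking the program to the change. A rough formalization of the attribution side only, as a sketch; Mayne's brief is narrative, not algebraic:

```latex
% Attribution: the counterfactual effect of the program on outcome Y,
% comparing the world with the program (Y_1) to the world without it (Y_0):
\Delta = \mathbb{E}[Y_1] - \mathbb{E}[Y_0]
% Contribution analysis instead asks whether the evidence supports the
% program's theory of change as a plausible contributor to the observed
% change, alongside other influencing factors.
```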
Caracelli VJ, Greene JC. Crafting mixed-method evaluation designs. In: Advances in Mixed-Method Evaluation. New Directions for Evaluation. 1997:19-32.
Harris L, Brown G. Mixing interview and questionnaire methods: practical problems in aligning data. Practical Assessment, Research and Evaluation. 2010;15(1).
Helberg C. Pitfalls of data analysis. Practical Assessment, Research and Evaluation. 1996. Retrieved from http://PAREonline.net/getvn.asp?v=5&n=5
King JA, Nielson JE, Colby J. Lessons for culturally competent evaluation from the study of a multicultural initiative. In: Thompson-Robinson M, et al. (eds). In Search of Cultural Competence in Evaluation: Toward Principles and Practices. New Directions for Evaluation. 2004;102:67-80.
Osborne J, Overbay A. The power of outliers and why researchers should always check for them. Practical Assessment, Research and Evaluation. 2004. Retrieved from http://PAREonline.net/getvn.asp?v=9&n=6
Accuracy and accountability criteria. In: Yarbrough DB, Shulha LM, Hopson RK, Caruthers FA (eds). The Program Evaluation Standards: A Guide for Evaluators and Evaluation Users. 3rd edition. Joint Committee on Standards for Educational Evaluation. Sage; 2011. ISBN: 1412986567
Implementing the evaluation study and analyzing the data. In: Smith MJ. Handbook of Program Evaluation for Social Work and Health Professionals. Oxford University Press; 2010:303-354. ISBN: 0195158431
Lincoln YS. Emerging criteria for quality in qualitative and interpretive research. In: Denzin NK, Lincoln YS (eds). The Qualitative Inquiry Reader. Thousand Oaks, CA: Sage; 2002:327-346. ISBN: 0761924922
Joint Committee on Standards for Educational Evaluation, with Sanders JR. The Program Evaluation Standards: How to Assess Evaluations of Educational Programs. 2nd edition. Sage. ISBN: 0803957327
Trevisan MS, Huang YM. Evaluability assessment: a primer. Practical Assessment, Research & Evaluation. 2003;8(20). Retrieved from http://PAREonline.net/getvn.asp?v=8&n=20
Evaluability assessment perspective. In: Rossi PH, Lipsey MW, Freeman HE. Evaluation: A Systematic Approach. 7th edition. Thousand Oaks, CA: Sage; 2004:136-138. ISBN: 0761908943
Wholey JS. Evaluability assessment. In: Wholey JS, Hatry HP, Newcomer KE (eds). Handbook of Practical Program Evaluation. 4th edition. San Francisco: Jossey-Bass; 2015.
Evaluability Assessment — Link to a resource for building skills in program evaluation
Evaluation Checklists — Link to an evaluation checklist site by Western Michigan University Evaluation Center.
W.K. Kellogg Foundation. Logic Model Development Guide. 1998.
Analysis of program assumptions and theory: logic models. In: Rossi PH, Lipsey MW, Freeman HE. Evaluation: A Systematic Approach. 7th edition. Thousand Oaks, CA: Sage; 2004. ISBN: 0761908943
Joint Committee on Standards for Educational Evaluation. The Program Evaluation Standards: A Guide for Evaluators and Evaluation Users. Los Angeles, London, New Delhi: Sage; 2011.
Knowlton L, Phillips C. The Logic Model Guidebook: Better Strategies for Great Results. 2nd edition. Thousand Oaks, CA: Sage; 2013. ISBN: 1452216754
Kiernan NE. Using an evaluation for a needs assessment: Tipsheet #87. University Park, PA: Penn State Cooperative Extension.
Rogers S, Ahmed M, Hamdallah M, Little S. Garnering grantee buy-in on a national cross-site evaluation. American Journal of Evaluation. 2010:447.
Scriven M. Needs assessment. American Journal of Evaluation. 1990;11:144.
Assessing the need for a program. In: Rossi PH, Lipsey MW, Freeman HE. Evaluation: A Systematic Approach. 7th edition. Thousand Oaks, CA: Sage; 2004:101-132. ISBN: 0761908943
Needs assessment studies. In: Smith MJ (ed). Handbook of Program Evaluation for Social Work and Health Professionals. Oxford University Press; 2010:139-162. ISBN: 0195158431
Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implementation Science. 2007;2:40.
Century J, Rudnick M, Freeman C. A framework for measuring fidelity of implementation: a foundation for shared language and accumulation of knowledge. American Journal of Evaluation. 2010;31:199-208.
Stirman SW, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science. 2012;7:17.
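As a deliberately simplified illustration of the adherence dimension of fidelity discussed in the Carroll and Century frameworks above, the sketch below scores the fraction of planned program components actually delivered in a session. The checklist and function are hypothetical; the published frameworks also weigh dose, quality of delivery, and participant responsiveness, which this sketch omits.

```python
def adherence_score(planned, delivered):
    """Fraction of planned components actually delivered (0.0 to 1.0).

    An adherence-only measure; the fidelity frameworks cited above
    treat dose, quality of delivery, and responsiveness as essential
    companion dimensions.
    """
    if not planned:
        return 1.0
    return sum(1 for component in planned if component in delivered) / len(planned)

# Hypothetical session checklist for a workshop-based program.
planned = ["didactic", "case discussion", "skills practice", "feedback"]
observed = {"didactic", "case discussion", "feedback"}
print(f"Session adherence: {adherence_score(planned, observed):.0%}")  # 75%
```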
Reineke R. Stakeholder involvement in evaluation: suggestions for practice. American Journal of Evaluation. 1991;12(1):39-44.
Fetterman D. Empowerment evaluation. American Journal of Evaluation. 1994;15:1.
Program evaluation and evaluating community engagement. In: Principles of Community Engagement. 2nd edition. Clinical and Translational Science Awards Consortium. NIH Publication No. 11-7782; 2011:161-182. ISBN: 0160888034
Fetterman D, Wandersman A (eds). Empowerment Evaluation: Principles in Practice. New York, NY: Guilford Press; 2005. ISBN: 1593851146