Student Learning Improvement Plan (SLIP)
During the 2025-2026 academic year, the College adopted the Student Learning Improvement Plan (SLIP) as its new model for assessment. Spring Hill College assesses every academic program (i.e., majors and certificates) every academic year. An academic year typically comprises the fall and spring semesters, though summer terms may also apply to some graduate programs.
Each program designates one or more Assessment Liaisons, who are responsible for communicating with the Faculty Director of Accreditation and Assessment and submitting these documents. However, assessment is a program-wide process, and it is the responsibility of the entire program to ensure accurate and timely assessment.
Purpose and Description of the SLIP
The purpose of the Student Learning Improvement Plan (SLIP) is to document that each year our programs strive to improve the learning experience and success of our students.
Each academic year, SLIPs should be a summary of the following process:
Select one student learning outcome (SLO) for special attention. A selected SLO may remain the focus for up to three academic years, but no more.
Provide your rationale for choosing this SLO. Why did you focus on this SLO versus the other SLOs in your program?
Describe and reflect on pedagogical and curricular strategies used to improve student learning.
Measure, analyze, and evaluate the apparent results of those strategies.
Suggest strategies for continuous improvement.
The Faculty Director reviews your plans for thoughtful reflection on past efforts, your rationale for focusing on this particular SLO, and your willingness to consider implementing new strategies.
Please keep in mind that programs should continue to collect data and evaluate the extent to which targets and benchmarks are achieved for ALL student learning outcomes. However, you will submit a SLIP for ONE selected outcome, and that will be the only document you need to formally submit as your program assessment report for the year. Collecting data and evaluating targets and benchmarks will help you as you work towards continuous improvement of your outcomes, and it will be useful for future SLIP submissions when you select other SLOs as your focus.
The Faculty Director will provide feedback on your annual SLIP. The SLIPs will be published on Badgerweb and will also be included in the College's Fifth Year Interim Report for SACSCOC (and ultimately our next Reaffirmation Report), but your program should be prepared to have data (including whether benchmarks and targets were met) available for all Student Learning Outcomes.
The due date for SLIPs is September 15th of each year.
Steps Involved in the Creation of Your SLIP
1. Provide a list of all of your Program Learning Outcomes.
2. Select the Student Learning Outcome (SLO) you will focus on and indicate how many years you have focused on it.
3. Results from Previous Year—What are your results from last year? List data associated with the SLO from the previous academic year, including benchmarks (minimum acceptable scores) and targets (the expected percentage of your students who will achieve the benchmark):
Example: Of 25 majors attempting the assignment in the 24-25 academic year, 18 (72%) achieved the benchmark of x. Our target was that 80% of students attempting would meet the benchmark of x. Our students did not meet the target for this measure.
4. Strategies Used in the Current Year--Briefly describe the strategies you implemented for improving student learning during the current academic year:
Examples: new teaching techniques, the use of review sessions, changes in readings or other materials, changes in the amount of time spent on particular topics, changes in the assessment techniques or artifacts, etc.
5. Tools/Artifacts for Current Year--Include and describe the tool(s) used and the associated course(s) in which they were used during the current academic year:
Example: The tool used to assess the SLO was an essay response to a question regarding the theory of evolution that was part of the BIOxxx final exam.
6. Results for Current Year--What are your results for the current academic year after implementing strategies for improving student learning?
Example: Of 18 majors attempting the assignment in the 25-26 academic year, 17 (94.4%) achieved the benchmark of x. Our target was that 80% of students attempting would meet the benchmark of x. Our students did meet the target for this measure. (A short worked calculation of these figures appears after step 8.)
7. Analysis and Discussion--Did the strategies appear to be effective? Some things to consider may include: Are the new strategies promising? Should they be tried again? Should they be tweaked? If something did not work, think about that too. What have you learned from this experience?
8. Continuous Improvement—What are some possible strategies for improving student learning next year?
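If it helps to tabulate results like those in the step 3 and step 6 examples, here is a minimal sketch in Python of the benchmark/target arithmetic. The function name and variables (achieved, attempted, target) are illustrative only and are not part of the SLIP form.

```python
def evaluate_slo(achieved: int, attempted: int, target: float) -> str:
    """Report the percent of students meeting the benchmark and whether the target was met.

    achieved  -- number of students who met the benchmark
    attempted -- number of students who attempted the assessment
    target    -- expected proportion meeting the benchmark (e.g., 0.80)
    """
    rate = achieved / attempted
    met = "did" if rate >= target else "did not"
    return (f"Of {attempted} attempting, {achieved} ({rate:.1%}) achieved the benchmark; "
            f"target was {target:.0%}; students {met} meet the target.")

# The figures from the step 3 and step 6 examples above:
print(evaluate_slo(18, 25, 0.80))  # 72.0%: target not met
print(evaluate_slo(17, 18, 0.80))  # 94.4%: target met
```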
Optional: Please feel free to include supplemental materials (rubrics, assignments, disaggregated analyses, etc.). Keep in mind that this tool is for your program's use, so include any information that will help you and the faculty in your program as you work to continuously improve student learning outcomes.
You will submit your completed SLIP for each year in the Google folder for your area. The submission procedure has not changed.
You can download the complete form for submission here. When you click the link, you will need to make a copy and rename it using your program name, "Assessment," and the academic year (e.g., PsychologyAssessment25-26).
You should already have access to your division's folder, but if not, please let lhager@shc.edu know. Here are the links to the folders:
Division of Business and Communication
Division of Health and Science
Division of Humanities & Social Sciences
Master of Engaged and Applied Humanities
Master of Public Health and Graduate Certificate
1. Create or Review Your Program's Mission
Your program's mission should reflect the mission of the College, and your learning outcomes should flow from the mission of your program. It is advisable to review your program's mission every 3 to 5 years.
2. Identify Outcomes
Identify 2–5 intended outcomes for your academic program.
Base these on your program's mission and the content of its curriculum.
Ideally, outcomes are stated in a way that explicitly mentions by when the outcome will occur and how it will be demonstrated to have occurred. You may wish to state your outcomes in this form:
"By completion of the B.A. in Underwater Basket-Weaving, students will:
know and understand the history of underwater basket-weaving, as demonstrated by their final-examination grades in ARH 456;
produce developed round and square baskets in sizes small, medium, and large, as demonstrated in their final projects in ART 424; and
synthesize water environments of optimal salinity and pH, as demonstrated in their water-sample grades in CHM 421."
Notice that these outcomes specify a measurable outcome, how it will be measured, and by when it will be achieved. They are not limited to simply reporting knowledge that graduates will have, since knowledge is best measured by how it is demonstrated.
You may wish to consult Harvard University's Bok Center for ideas of actions that graduates will take in achieving learning outcomes.
If your program's main contact with students is through satisfying Core or program requirements, rather than through producing Bachelor's graduates, you may wish to state your outcomes in terms of what students who take courses in your program will achieve.
You may wish to produce a Curriculum Map for your program as well.
Notice that the outcomes' measurement is in terms of coursework. Course grades are not acceptable measures of student outcomes, but grades on individual assignments, projects, exams, and exam questions are acceptable. You may want to consider other ways of measuring outcomes, including measures external to SHC courses. For example, you might be able to track students' placement rates in jobs or graduate school, or survey alumni. For more on measurement, proceed to the next step.
3. Develop Assessment Plans
Explain in detail how your program will measure whether and the degree to which its graduates or students will achieve those outcomes. If you crafted your outcomes according to the advice above, then this should be straightforward. You can mention specific in-class examinations or projects.
As noted in the previous step's description, it's ideal to include a mixture of internal and external measures.
It is also ideal to include a mixture of formative and summative assessments. Formative assessments are applied during the learning process, typically within a course, so that students can use the feedback to improve their performance and learning during the course. Summative assessments are applied after the learning process is complete, to some instrument or product that the student has produced. For more information, you can consult various sources on the Internet.
Another possible strategy is to include both direct and indirect assessments.
Direct assessments come from demonstrations of students' knowledge or skills in samples of student work: scores from in-class or external (e.g., licensure or post-graduate) exams; written essays (or lab reports, term papers, discussion posts, or case-study analyses); and performances (e.g., artistic performances and products, exhibits, presentations) or capstone experiences (e.g., senior or honors theses, portfolios, or research projects) scored using a rubric.
Indirect assessments are not from direct demonstration but instead from evaluation of, or reflection about, students' direct demonstrations of skills or knowledge. Some possible indirect measures are student attributes (such as hours spent in class or class-participation rates), admission rates, placement rates, internship or field evaluations (e.g., from employers or from observations of fieldwork), student surveys, alumni surveys, surveys of others' perceptions of students, exit interviews, self-reports of learning, and awards or scholarships earned.
Rubrics are an ideal way to determine the degree to which students meet the outcomes. Investing time now in rubrics will make actual assessment easier, especially when it comes time to analyze your results.
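As an illustration of how rubric scores feed directly into analysis, here is a brief sketch assuming a hypothetical three-criterion rubric scored 1-4, with a benchmark of an average score of 3. The criteria, scores, and threshold below are invented for illustration and are not prescribed by the College.

```python
from statistics import mean

# Hypothetical rubric scores (1-4 per criterion) for each student.
scores = {
    "Student A": {"thesis": 4, "evidence": 3, "mechanics": 3},
    "Student B": {"thesis": 2, "evidence": 2, "mechanics": 3},
    "Student C": {"thesis": 4, "evidence": 4, "mechanics": 2},
}
BENCHMARK = 3.0  # minimum acceptable average rubric score (illustrative)

# Percent of students whose average rubric score meets the benchmark.
meeting = [name for name, crit in scores.items() if mean(crit.values()) >= BENCHMARK]
print(f"{len(meeting)} of {len(scores)} students "
      f"({len(meeting) / len(scores):.0%}) met the benchmark of {BENCHMARK}.")

# Per-criterion averages show where students struggle, informing next year's strategies.
for criterion in ("thesis", "evidence", "mechanics"):
    avg = mean(student[criterion] for student in scores.values())
    print(f"  {criterion}: average {avg:.2f}")
```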
As with your Intended Outcomes, ideally explain how the assessment plan you will use supports implementation of your Continuous-Improvement Plan.
4. Analysis
At this step, your program's faculty and other stakeholders consider the results delivered by the measurements mentioned in the previous step. If you used rubrics, then it's relatively easy to get started with your analysis. Your students' performance according to the rubrics helps tell you the degree to which your program's students are achieving their outcomes—which, in turn, tells you the degree to which your program is achieving its mission.
Explain whether your program followed its previous year's Continuous-Improvement Plan. If it didn't, explain why not.
Provide a discussion of whether and when your students achieved the outcomes, and to what degree. If they achieved an outcome with high-quality performance, say so, and briefly suggest an explanation or explanations for why they did so; this may include following the previous year's Continuous-Improvement Plan. If they did not achieve some outcomes, provide some informed speculation about why. As before, cite the degree to which your program followed the previous year's Continuous-Improvement Plan, and whether following it had any effect on the degree to which your program's students achieved their outcomes.
Whether or not your students achieved the outcomes, use this information to begin thinking about how to improve your program, which you will detail in the next step: the Continuous-Improvement Plan.
5. Continuous-Improvement Plan
Now that your program has had time to develop, inspect, and digest its Analysis, your program's faculty and other stakeholders can collaborate to compose a Continuous-Improvement Plan.
If your program is generally achieving its mission (and its students are generally achieving the intended outcomes), include in your plan how you will continue to implement your successful strategies. Even if your students are performing well, think carefully about your program's mission and decide how you can further improve the degree to which the mission is accomplished. You may at least wish to identify a new outcome or modify an existing one, or develop a new rubric or curriculum map.
If your program is generally not achieving its mission, or your students are generally not achieving the intended outcomes, include in your plan what you will change in order to improve their performance. If the problems are chiefly external (i.e., outside your program's control), explain how you will nevertheless mitigate them or otherwise better prepare your students to achieve your program's intended outcomes.
Please consult with your Faculty Director of Accreditation and Assessment, Lisa Hager, at lhager@shc.edu.
To view your division's progress and completed submissions, please find the corresponding link to your division's subfolders, and inspect the contents of the subfolders.