Northeast Iowa Community College Assessment Newsletter | Fall 2024
"Assessment is an ongoing awareness of students' learning and their needs, rather than an occasional event in the program." - Unknown
"Assessment is an ongoing awareness of students' learning and their needs, rather than an occasional event in the program." - Unknown
Recommended Reading:
4 Assessment Trends in Higher Education from HelioCampus
Rethinking Student Retention at Community Colleges from Modern Campus
The deadline for blueprinting your courses is approaching! By May 2025, all faculty must have a completed Course Alignment Blueprint for each of their courses. For a refresher on course alignment blueprinting, visit the DIID team's Quality Course Design website.
Starting this fall, you can link your blueprints in a new section of Simple Syllabus. Including your blueprint in your syllabus helps the DIID Team track blueprint completion while also showing students how their coursework aligns with their expected learning outcomes.
If you need help with blueprinting, feel free to email the DIID team, and an instructional designer will assist you.
Hello Colleagues,
As we begin another exciting academic year, I want to take a moment to welcome everyone back and provide an update on how the Core C.L.A.S.S. Team plans to revise and refine our assessment work moving forward.
Assessment doesn’t have to be a daunting process of big data and complex analysis. At its core, it’s about understanding what’s working, where we can improve, and how we can make informed, evidence-based decisions to ensure our students receive a valuable and impactful learning experience.
To keep the momentum going, our assessment work will revolve around these three critical questions:
What are we doing well, and how do we know?
Where are there gaps we need to address, and how do we know?
What will we continue doing, or change, based on the answers to the first two questions?
Using this simple yet effective approach, we’ll streamline our assessment work and make assessment a meaningful tool for enhancing our academic programs, fostering a culture of continuous improvement, and supporting student success at NICC.
I want to thank everyone for their work during our October 21, 2024 Faculty Professional Development (PD) Day, where we shared the new work ahead for our assessment efforts and spent time developing our action plans, including those created through program and discipline reviews. These plans highlight both our strengths and the areas that need further attention, providing an excellent starting or continuation point for your annual assessment work. Moving forward, we will collaborate closely with program and discipline assessment leads to keep these action plans relevant and actionable, reinforcing what is working well and making improvements where needed.
I look forward to working with you as we refine our assessment efforts this year. Together, we can capitalize on our strengths, address any gaps, and continue advancing toward academic excellence.
Here’s to a productive and successful year ahead!
The CLO rubrics were updated by the CLO committees in Spring 2024, underwent a final review during Workshop Week in August, and are now available in Brightspace. The new process for connecting your CLOs to your defined assessment is listed below. A short video is also provided to assist you in the process.
The new process is as follows:
Determine whether your program or discipline will use an embedded or other common assessment, or allow individual instructors to choose.
Attach your course's primary CLO rubric to your chosen assessment as a primary or secondary rubric.
Score all enrolled students using the rubric, even those who did not complete the assessment.
Use your data to improve your course. Real-time data will be provided to each instructor through a dashboard that can be viewed anytime.
2024-25 BENCHMARK GOALS:
65% of enrolled students will complete the selected assessment and be scored with the appropriate CLO rubric.
80% of scored participants will achieve at least an 80% score on the rubric, indicating proficiency was met.
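To make these benchmarks concrete, here is a minimal sketch of how a program could compute its own completion and proficiency rates from an exported list of student scores. The record format, field names, and sample numbers below are illustrative assumptions for this newsletter, not an official NICC tool or a Brightspace export format.

```python
# Minimal sketch: computing the 2024-25 benchmark rates from a hypothetical
# list of student records. Field names and data are illustrative only.

def benchmark_rates(students):
    """students: list of dicts with 'completed' (bool) and 'rubric_score'
    (0-100, or None if the student was not scored)."""
    enrolled = len(students)
    scored = [s for s in students if s["completed"] and s["rubric_score"] is not None]

    # Benchmark 1: 65% of enrolled students complete the assessment and are scored.
    completion_rate = len(scored) / enrolled if enrolled else 0.0

    # Benchmark 2: 80% of scored participants reach at least 80% on the rubric.
    proficient = [s for s in scored if s["rubric_score"] >= 80]
    proficiency_rate = len(proficient) / len(scored) if scored else 0.0

    return completion_rate, proficiency_rate

roster = [
    {"completed": True,  "rubric_score": 92},
    {"completed": True,  "rubric_score": 74},
    {"completed": False, "rubric_score": None},
]
completion, proficiency = benchmark_rates(roster)
print(f"Completion: {completion:.0%}  Proficiency: {proficiency:.0%}")
# Completion: 67%  Proficiency: 50%
```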
You can scroll through the shared rubrics in the image carousel below:
DATA ACTION PLANS
Instructions for finishing your 2023-2025 Data Action Plans.
Instructions for utilizing the Action Plan items identified in your previous program/discipline review.
This summer, a unique experiment was conducted in one of our college's courses to explore the relationship between assignment and exam grades. Science instructor Jeremy Durelle made a significant change: he omitted graded assignments for the semester to see how this would affect student performance and learning outcomes. The intent was to address the notable discrepancy between assignment grades and exam grades observed in previous semesters.
In the prior two summers, the median assignment score was typically high, around 86%, while the median exam score was significantly lower, at just 61%. This gap suggested that while students were performing well on assignments, they were struggling to replicate this success in exams, potentially because graded assignments gave a false sense of security or students did not fully grasp the material despite high assignment scores.
To further support learning, the exams in this course were structured to allow multiple attempts. This formative approach provided students with detailed feedback after each attempt, enabling them to reflect on mistakes, adjust their understanding, and try again. The goal was to help students better engage with the content and make meaningful improvements over time.
For example, in one case, feedback on an exam question helped students understand how to calculate enthalpy changes by reminding them of key principles and equations. The process was not only about memorizing information but also applying feedback to grasp more complex concepts.
Before the experiment, in the summers of 2022 and 2023, there was a clear discrepancy between assignment and exam performance:
Assignment Median Score: 85.95%
Exam Median Score: 60.86%
This gap highlighted that although students performed well on assignments, which often reflected effort and understanding in a low-pressure environment, they struggled to perform similarly under exam conditions.
However, this year, after removing graded assignments from the course, the median exam score improved to 69%. The shift suggests that by focusing on exams as the primary graded activity, students may have been more motivated to engage deeply with the material throughout the course. The instructor also noted that this improvement in exam performance could be attributed to the multiple-attempt structure, which allowed students to learn from their mistakes and continuously improve.
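For instructors who want to run a similar before-and-after comparison in their own courses, the short sketch below shows one way to compute median assignment and exam scores from a simple gradebook export using Python's statistics module. The scores and term labels are made up for illustration (chosen only to mirror the medians reported above) and are not Jeremy's actual data.

```python
# Illustrative sketch: comparing median assignment and exam scores across
# terms from a simple gradebook export. All scores below are made up.
from statistics import median

gradebook = {
    "Summers 2022-23": {"assignments": [88, 84, 90, 79, 86], "exams": [58, 63, 61, 55, 66]},
    "Summer 2024":     {"assignments": [],                   "exams": [71, 66, 69, 73, 64]},
}

for term, scores in gradebook.items():
    exam_med = median(scores["exams"])
    if scores["assignments"]:
        assign_med = median(scores["assignments"])
        print(f"{term}: assignment median {assign_med:.1f}%, exam median {exam_med:.1f}%")
    else:
        print(f"{term}: no graded assignments; exam median {exam_med:.1f}%")
```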
This summer experiment provided valuable insights into how different grading strategies can affect student performance. The omission of graded assignments, combined with the opportunity for multiple exam attempts and meaningful feedback, seemed to encourage students to engage more thoroughly with the course material and improve their exam performance. While assignments are traditionally a staple of academic assessment, this experiment shows that alternative approaches can foster deeper learning and help bridge the gap between coursework and exam success.
When asked about his overall thoughts on the assessment experiment, Jeremy responded: "I noticed that, despite some complaints during the semester, the IDEA survey comments were actually more positive compared to the last two years. While no one truly 'loved' the format, and there were still a few negative remarks, even those recognized the value in the 'struggle.' Some students expressed a desire to bring back graded assignments, but for now, I plan to stick with this format and see how it develops and compares over the long term."
Moving forward, these findings could influence future course designs, particularly in subjects where the goal is to ensure that students not only complete assignments but also demonstrate strong performance in higher-stakes assessments like exams.
On October 11th, the Psychology Department met in Elkader over fish sandwiches to finalize the design of a Common Capstone Project. This project will require students to conduct interviews and write a compare/contrast paper based on their findings. Each course will include five specific interview questions tailored to align with some of the Common Learning Outcomes (CLOs) for that course.
Our goal is to create a shared assessment tool that all instructors in the program will use. We plan to meet again in the spring to finalize the interview questions for each course, with the rollout of the Common Capstone Project set for Fall 2025.
Students will be asked to select three individuals to participate in a brief 15-20 minute interview consisting of five questions. Each participant will represent a different generation: students will select one participant in their 20s, one in their 40s, and one in their 60s. After conducting the interviews, students will construct a 4-6 page paper comparing and contrasting the participants' responses.
Page 1: Introduction of participants and summary of the five major topics (questions)
Page 2: Participant #1 responses
Page 3: Participant #2 responses
Page 4: Participant #3 responses
Page 5: Compare/contrast and conclusion
In a recent 5-Minute Flair recorded for the Instructional Innovation and Technology Council (IITC), Michelle shared a neat approach she's recently implemented in her courses. She's allowing students a second attempt on quizzes, but only for the questions they got wrong the first time. As she put it, "It doesn't matter if they didn't learn it the first time, they've now learned the second time." By combining the scores from both attempts, she's making sure students get multiple opportunities to grasp course material.
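The newsletter does not spell out exactly how the two attempts are combined, but one reasonable reading is that a question counts as correct if it was answered correctly on either attempt. The sketch below illustrates that assumption only; Michelle's actual quiz setup may combine the scores differently.

```python
# Sketch of one possible "second attempt on missed questions" scoring scheme.
# Assumption: a question counts as correct if it was answered correctly on
# either the first attempt or the retake. The actual combination used in the
# course may differ.

def combined_quiz_score(first_attempt, second_attempt):
    """Each attempt maps question id -> True/False (correct/incorrect).
    The second attempt contains only the questions missed the first time."""
    combined = dict(first_attempt)
    for question, correct in second_attempt.items():
        if not combined.get(question, False):
            combined[question] = correct
    return sum(combined.values()) / len(combined)

first = {1: True, 2: False, 3: True, 4: False, 5: True}
retake = {2: True, 4: False}  # only the questions missed on attempt 1
print(f"Combined score: {combined_quiz_score(first, retake):.0%}")  # 80%
```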
Michelle also shared her perspective on dropping the lowest test score. She explained, "What I think you're telling the students is I don't really value that material" and instead, she prefers to focus on remediation. Her approach prioritizes student learning, and focuses on giving students another chance to ensure each concept is understood.
Accreditation is a vital aspect of maintaining the quality and standards of higher education institutions. Criterion 4, one of the critical components of the Higher Learning Commission’s (HLC) accreditation process, focuses on the institution's responsibility for ensuring the quality of its educational offerings. This video explores the core aspects of Criterion 4, including program review, student learning assessment, and continuous improvement. We invite you to watch for a better understanding of our assessment efforts.
Criterion 4 emphasizes the institution's commitment to the quality of its educational programs. This includes ensuring that the programs are well-structured, with clearly defined learning outcomes that align with current industry standards. The following four areas are important for institutions to provide evidence of during the writing of their HLC Accreditation Assurance Argument.
Program reviews are not a one-time process but an ongoing effort to keep educational offerings relevant and up-to-date. Institutions must demonstrate that they have a systematic process for evaluating the quality of their academic programs. This includes reviewing course content, evaluating the effectiveness of teaching methods, and ensuring that programs meet the needs of both students and the job market.
Institutions are encouraged to gather data from various sources, including enrollment trends, student feedback, and employer surveys. This data helps inform decisions about curriculum updates, faculty development, and resource allocation.
A key aspect of Criterion 4 is the assessment of student learning. Institutions must show that they have effective processes in place to measure how well students are achieving the desired learning outcomes. This includes both academic and co-curricular activities, which contribute to the overall development of students. For example, service learning projects or internships may be considered co-curricular activities that reinforce academic learning.
The assessment process should involve regular evaluation by faculty, with a focus on continuous improvement. Institutions are expected to use the results of these assessments to make informed changes to their programs and teaching methods.
In some disciplines, external accreditation is necessary to maintain the legitimacy and quality of the program. For example, Nursing and Allied Health programs often require accreditation from professional bodies. This external accreditation ensures that the curriculum meets industry standards and that graduates are prepared for professional practice. Institutions must provide evidence of their compliance with these standards during the accreditation process.
Finally, Criterion 4 emphasizes the importance of tracking student success post-graduation. Institutions are encouraged to conduct exit surveys, track employment outcomes, and maintain relationships with alumni to ensure that their programs are preparing students for successful careers. This feedback loop helps institutions make necessary adjustments to their programs to better align with job market demands.
Criterion 4 serves as the foundation for maintaining educational quality and ensuring that institutions are continuously improving their programs. By focusing on program reviews, student learning assessments, and external accreditation, institutions can demonstrate their commitment to providing high-quality education and preparing students for success in their chosen careers.
Data collected in the past has been placed in the Data Dashboard, where faculty can review benchmarks and hold discussions about improvements to their programs and disciplines.
That data has been used internally to improve processes by those who have felt comfortable doing so. The College is focused on data-driven decisions, and more data will need to be shared externally with our stakeholders in the future. With the revisions to the assessment process, faculty will use this data each year to advance the action plan projects developed from their program and discipline reviews.
Full-time faculty are required to attend Professional Development (PD) day as specified in their contracts. However, part-time adjunct and concurrent faculty may have other commitments that prevent them from participating. It is important that these faculty members still receive relevant program and discipline updates from their departments. They should also be invited to provide feedback and engage through the program and discipline leads designated by the deans for their respective areas.
The Brightspace LMS will gather assessment data from courses, and it will be shared in real time on an Assessment Dashboard. The Data Evaluation Team is available to work with CLO teams and programs/disciplines to understand their data and complete their report-outs. Customized data dashboards can be developed to support your program or discipline's annual assessment efforts.
As we revise the assessment process, if your program or discipline requires specific data for your action planning, we’re here to assist. We can help connect you with the data experts on campus who can provide the information and support you need to move your plans forward.