Northeast Iowa Community College Assessment Newsletter | Fall 2023
"Assessment is an ongoing awareness of students' learning and their needs, rather than an occasional event in the program." - Unknown
"Assessment is an ongoing awareness of students' learning and their needs, rather than an occasional event in the program." - Unknown
It's with immense pleasure that I introduce the first edition of our biannual Assessment Newsletter. This platform aims to:
Share updates from our CLASS team.
Spotlight the commendable efforts and accomplishments of our faculty and staff.
Shed light on upcoming assessment initiatives.
Our vision is for this newsletter to chronicle the College's journey as we strive for continuous improvement in gauging student outcomes. Two great articles are included in this issue; I encourage you to read them and gain insights from them.
For our faculty members who've recently joined, it's essential to note a few pivotal points:
Accreditations: Northeast Iowa Community College (NICC) holds accreditation from the Higher Learning Commission (HLC) and the Iowa Department of Education (IDOE). Both entities periodically audit our College to verify compliance with HLC criteria and the Iowa Code. [Here are the specific HLC criteria and IDOE codes we abide by.]
HLC's Past Visits: In 2016, during the College's ten-year visit, HLC provided formal feedback highlighting areas requiring immediate growth, including onboarding processes for new faculty, the assessment of general education, assessment and measurement of student learning, data analysis, and co-curricular assessment. NICC demonstrated commendable growth within 18 months, and our interim virtual visit in 2020 was a success. We anticipate our next ten-year accreditation visit in 2026. Notably, a primary objective of our upcoming faculty development day on October 16, 2023, stems from a charge by HLC to incorporate data into systematic processes to evaluate and ensure progressive attainment of student learning outcomes.
IDOE's Interactions: Following a successful ten-year accreditation visit in 2017 with minimal recommendations made, we're preparing for IDOE's interim site review on February 7-8, 2024. Our teams are diligently preparing to showcase our compliance with both HLC and specific Iowa Code elements.
As I reflect on the past ten years, the College has made great progress in not only meeting but exceeding federal and state standards. A milestone came in 2015, when a select faculty team and I joined the HLC Assessment Academy, leading to the inception of the Celebrating Learning and Student Success ("CLASS") Committee. Today, a variety of subcommittees work under this umbrella (CLO, Course Guide, Course Mapping, Evaluation, and Co-curricular), engaging over forty faculty and staff in assessment initiatives. As a result, all College members are part of the CLASS team as we learn to live out and celebrate ways to assist students in reaching program and personal goals.
I look forward to having the full-time faculty join Susan Hatfield on October 16th in Elkader, Iowa as we apply data to enhance an identified need. After this meeting, we will outline the next steps for our continued journey. These will include:
Reaffirming the four CLO committee groups and having them review the current CLO rubrics;
Rolling out new DOMO dashboards;
Continuing the work to identify common assessments for the same course that all faculty utilize;
Continuing the work on course mapping;
Ensuring the Assessment website reflects the subcommittee members, charges, and outcomes;
Working with the Evaluation team to review their recommendations; and
Revamping Assessment mapping in Brightspace.
Again, thank you for your exceptional work at the College. I look forward to our continued journey!
Kathy
Recommended Reading:
"Designing Meaningful and Measurable Outcomes: A First Step in Backwards Design" from Faculty Focus
"What are rubrics and how do they affect student learning?" from Turn-it-In
Professional Development Day is right around the corner! Don't forget, this day is required for all full-time faculty. A Zoom link will be provided for fully-remote faculty to attend the two seminar sessions. Remote participants will need to make plans to Zoom with a colleague in the room to participate in discipline/program discussions.
View the agenda to prepare for the event, and be sure to register so we can plan for the day accordingly.
Susan Hatfield is Professor Emerita in the Communication Studies Department at Winona State University (WSU), where she taught from 1981 to 2015. During that time, she developed and directed WSU’s assessment program, served as a department chairperson, and coordinated the First Year Student Experience. She has previously served on the Board of Visitors for the Marine Corps University and the Board of Directors of the Joint Review Committee on Education in Radiologic Technology. Susan is currently a Trustee with the Palmer College of Chiropractic.
In Brightspace, select Insights Portal in the Navigation Bar.
Scroll down and select Insights Report Builder.
On the left menu, select Program Assessment Dashboard.
Once you are in the Program Assessment Dashboard, you can filter the data by selecting a Program from the Program Selection card at the bottom of the page. You may also adjust the current Semester filter at the top of the page, or create your own filter.
To create a filter, click the Plus icon, then select the field you wish to filter by, such as Course Code or Course Name.
If you have any problems finding your data in Brightspace, send an email to online@nicc.edu and a member of the DIID team will assist you.
Join Us! Contribute to NICC’s Assessment Work by Serving on a CLO Sub-Committee
Northeast Iowa Community College is actively seeking enthusiastic faculty and staff members to join our Common Learning Outcomes (CLO) Assessment sub-committees. These groups are essential in defining the expectations and maintaining the high standards of an NICC education.
The CLOs, established through thoughtful input and leadership from our faculty, provide the benchmarks to which our College holds itself accountable.
The four established areas with corresponding performance indicators are: Communicate Effectively, Critical Thinking, Lifelong Learning, and Diversity.
Your participation in one of these sub-committees will greatly contribute to our continuous effort in ensuring that our educational programs meet the highest standards and effectively serve our students. If you have an interest or passion in any of these areas, we encourage you to reach out and join us in this essential work. Please express your interest and direct any inquiries by emailing Kelly Kramer.
For more detailed information about NICC’s Common Learning Outcomes and to better understand the impact of your contribution, please visit the CLO page on the Assessment Hub.
We look forward to your participation and are excited about the valuable insights and perspectives you will bring to these sub-committees. Together, we can continue to foster a learning environment that empowers our students and upholds the esteemed reputation of Northeast Iowa Community College. Thank you for considering this opportunity to contribute to the College’s assessment work.
The Program Assessment Dashboard provides a snapshot of our students' proficiency at various scales. Through the use of filters and simple button clicks, you can see assessment trends in your program, at your particular campus, in a particular delivery method, etc. You can drill down to observe these trends at granular as well as broad levels. The data is presented through 5 "cards."
Click each card below to learn more.
The CLO Assessment Trend displays CLO embedded assessment results from Fall 2017 to the present. Critical Thinking is the most commonly assessed outcome in every semester. Overall, students achieve proficiency on their respective CLOs 95% of the time.
The Embedded Assessment Trend displays direct and summative embedded assessment results from Fall 2017 to the present. The results are closely matched, with a noticeable downward trend in the number of measurements. Overall, students achieve proficiency on the embedded assessments 69% of the time.
The Delivery Breakdown compares proficiency measurements and rates across 3 modalities: face-to-face, online, and hybrid. Most of the data comes from face-to-face courses, followed by online, and then hybrid. The proficiency rate follows the same pattern (73%, 71%, and 65%, respectively).
The Campus Breakdown compares proficiency measurements and rates across the campuses. This also includes high schools. Most of the measurements come from the Peosta and Calmar campuses, followed by high schools. The proficiency rate is similar among these campuses (72%).
The Assessment Heat Map functions as a poor man's correlation metric. The average final grade is 85%. The map shows that students who achieve high final grades are also more likely to achieve proficiency on their embedded assessments.
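For anyone curious about the arithmetic behind these cards, the short sketch below shows one way the Delivery Breakdown figures could be recomputed from a raw export of embedded assessment records. It is a minimal illustration only, not the actual Brightspace or DOMO implementation; the file name assessments.csv and the columns delivery and proficient are hypothetical stand-ins.

```python
# Minimal sketch (not the actual Brightspace/DOMO implementation) of how the
# Delivery Breakdown figures could be recomputed from a raw export of embedded
# assessment records. The file name and column names are hypothetical.
import csv
from collections import defaultdict

measurements = defaultdict(int)  # total embedded-assessment measurements per delivery method
proficient = defaultdict(int)    # measurements rated proficient per delivery method

with open("assessments.csv", newline="") as f:
    for row in csv.DictReader(f):
        mode = row["delivery"]  # e.g., "Face-to-Face", "Online", "Hybrid"
        measurements[mode] += 1
        if row["proficient"].strip().lower() == "yes":
            proficient[mode] += 1

# Print measurements and proficiency rate per modality, largest group first
for mode in sorted(measurements, key=measurements.get, reverse=True):
    rate = proficient[mode] / measurements[mode]
    print(f"{mode}: {measurements[mode]} measurements, {rate:.0%} proficient")
```

The same grouping idea extends to the Campus Breakdown (group by campus instead of delivery method) and, with a final grade column added, to the Heat Map's comparison of final grades and proficiency.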
For questions or help with using your data dashboards, contact a member of the Data Evaluation Team: Jeremy Durelle (Lead), Evelyn Buday, Chelsea Clegg, Tim Doffing, Denise English, Caleb Feuling, Lora Hannan, Travis Hunt, and Brandon Kadlec.
The assessment of learning plays a pivotal role at the community college level, particularly in CTE programs like Radiologic Technology, where the synthesis of theoretical knowledge and practical skills is crucial. In such programs, an ongoing and rigorous assessment structure is integral to monitoring student progress, identifying areas for improvement, and ensuring students attain the competencies required for a career in the field.
Here are some insights into the importance of assessment in program areas:
By continually assessing students' understanding through various methods ranging from classroom activities to unit and final exams, I can gain valuable insights into the learning curve and proficiency of each student. This approach facilitates the identification of individual learning needs, enabling me to tailor my teaching methods and strategies to address gaps in knowledge and skills, thereby promoting an environment of continuous learning and improvement.
For my program, adhering to the standards and requirements set forth by the Joint Review Committee on Education in Radiologic Technology (JRCERT) is paramount. The comprehensive accreditation process ensures that my program maintains high-quality standards and produces competent, ethical, and qualified professionals. Integrating goals, measurement tools, timeframes, benchmarks, and historical data into my program assessment plan is essential in demonstrating continuous improvement, accountability, and compliance with JRCERT standards.
The systematic and regular review of my program assessment plan, coupled with the integration of feedback and historical data, facilitates a cycle of continuous improvement. This process allows for the refinement of curriculum, teaching methodologies, and assessment strategies, ensuring that the program remains adaptive, relevant, and aligned with the evolving demands of the radiology field.
The detailed and thorough assessment of learning contributes to enhanced student outcomes. By identifying and addressing the individual learning needs of students, I can foster a more supportive and enriching learning experience. This, in turn, contributes to the overall development of my students to help ensure that they are equipped with the knowledge, skills, and critical thinking abilities required in the field.
Emphasizing the importance of assessment fosters a culture of accountability among us as educators and students alike. This culture encourages self-reflection, responsibility for learning outcomes, and a commitment to upholding the highest standards of education and professional practice.
In conclusion, the assessment of learning is indispensable at the College. It ensures the continuous improvement and adaptability of our programs and the cultivation of competent and ethical professionals. Please reach out if you want to hear more about the assessment work I am doing every day to improve continuously!
It is no secret that teachers are in constant need of, and often required to find, some form of validation. Whether the pressure comes from academic institutions, stakeholders, or students, the measure of a "good" teacher comes down to how we define success. Many times, this measure has come in the form of student and peer ratings, which can unfortunately be riddled with bias and conflicts of interest. Historically, the most utilized measure of success has been learning outcome measures, which, to be fair, can also fail to account for test validity and reliability, student motivation, or curriculum goals. As technology advances and the impacts of the COVID-19 pandemic continue to shift how we deliver education, larger-scale considerations of what merits student success will have to be weighed.
As a pre-pandemic, fully online new hire at NICC, defining success seemed to depend on all facets of evaluation: peer review, student review, and assessment. The question came down to what a measure would look like for students that could withstand any bias or platform inequalities. As a science instructor primarily assigned to Anatomy and Physiology, courses required for our large nursing student population, it seemed pertinent to look into this demographic first. Anatomy education, specifically, has traditionally been viewed as "building block" subject matter driven by traditional instructional methods, heavy in hands-on laboratory practice, and marked by historically low pass rates. Success in these courses is regarded as exceedingly important for community college health care certifications and other health science degree programs, making it an ideal focus for measurement.
With the support of Brandon Kadlec from Institutional Research and several nursing department faculty, we decided to first compare my online Anatomy 1 and 2 pass rates over the last three to four years with those of the department (Image 1), and then to follow with the HESI standardized test results (Image 2) taken prior to entering the nursing program. While course pass rates are a useful initial data set, they paint an incomplete picture and often only provide reflective measures of course design and instructional method. They would, however, create an important baseline showing that basic department standards were likely being met. Once pass rates were found to be "consistent" with those of the anatomy courses across platforms, the next step was utilizing the HESI (a standardized summative general knowledge assessment taken close to anatomy course completion) results for a more comprehensive analysis. These results again showed a seemingly consistent outcome, but with more year-to-year deviation. This information allowed me, as an instructor, to consider what could allow students to be successful in an online course yet not maintain that result during the pandemic. Based on this information, I shifted my courses to incorporate test-taking skills in science as a secondary focus alongside the anatomical content.
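As a rough illustration of the kind of pass-rate comparison described above, a minimal sketch is shown below. It is not the analysis Institutional Research performed; the file enrollments.csv and the columns year, section_group, and passed are hypothetical stand-ins for however the enrollment records are actually stored.

```python
# Minimal sketch of a year-by-year pass-rate comparison between two groups of
# sections (e.g., one instructor's online sections vs. the department overall).
# The file name and column names are hypothetical assumptions.
import csv
from collections import defaultdict

totals = defaultdict(int)  # (year, group) -> students enrolled
passes = defaultdict(int)  # (year, group) -> students who passed

with open("enrollments.csv", newline="") as f:
    for row in csv.DictReader(f):
        key = (row["year"], row["section_group"])  # e.g., ("2021", "online")
        totals[key] += 1
        if row["passed"].strip().lower() == "yes":
            passes[key] += 1

# Print the pass rate for each year/group combination for side-by-side review
for year, group in sorted(totals):
    rate = passes[(year, group)] / totals[(year, group)]
    print(f"{year} {group}: {rate:.0%} pass rate ({totals[(year, group)]} students)")
```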
While the results were promising for my personal goal of measuring basic platform-transition success for Anatomy courses, the study has many limiting factors, such as repeat student attempts, the shift of educators online post-pandemic, and the timing and continued use of the HESI. And although the results display a seemingly clear comparison and potential successes, the post-COVID student mindset continues to change, and continued evaluation will be necessary to meet students' shifting demands. Working with a department of like-minded peers who all strive for data-driven success allows for a more comprehensive analysis of our courses and, in turn, creates more consistent content delivery for all students, regardless of platform preference.
The deadline for blueprinting your courses is approaching! By May 2025, all faculty are required to have a completed Course Alignment Blueprint for each of their courses. Visit the DIID team's Quality Course Design website for a refresher on blueprinting and Google Form submission instructions. Beginning in Fall 2024, you can link your blueprints in a new section in Simple Syllabus (hidden from student view).
If you need assistance with blueprinting, do not hesitate to email the DIID team, and an instructional designer can help you with the process.