Day 2: Building an Assessment Program

The deeper I dig in, the more excited I get; however, there is work -- major work -- to be done. That said, others have done a lot, and there are excellent resources out there, including a program I found today that provides a great deal of free (!) content for the teaching of reading: Read Works. I found it after reading about the South Bronx Classical Charter School, which is called out in the aforementioned Driven By Data; SBC's Director of Curriculum, Ms. Murtha, used to work at Read Works. What first caught my eye on the Read Works site is their "concepts of comprehension," a list of 19 inferential thinking skills that they believe are most important for helping children understand what they read. These are the same types of concepts we worked with at the Grow Network, where we built teaching tools designed to help teachers better understand the concepts and develop numerous strategies for working with students so that they could become close, careful, and critical readers.

The work I did today allowed me to start building a list -- or perhaps a rough draft of one -- of steps that may be critical in shaping the program.

Here goes...

1) Craft and shape the details of a comprehensive assessment plan/system/calendar. It should include the interim assessments (the SUNY doc says they will take place three times a year) and a variety of classroom-based formative assessments, which could include everything from exit tickets to quizzes, tests, writing assignments, and major projects. We also need clarity about the nature of the follow-up once we have analyzed assessment data -- particularly the data from the interim assessments, but also the data gathered daily, weekly, and monthly. The system must be spiraling in nature: standards aren't abandoned once they are covered; they have to resurface, especially when students haven't gained mastery, but also so that knowledge and skills don't erode. In shaping the plan and in determining and building assessments, we can take advantage of Driven By Data's "five core assessment drivers," which are as follows:
  • Transparent starting point: Assessments need to be written before the teaching starts, and teachers and schools need to see them in advance; they define the road map.
  • Common and interim: Assessments should apply to all students in a grade level and should occur every six to eight weeks.
  • Aligned to state tests and college readiness: Assessments should be aligned to state tests in format, content, and length, and also aligned to the higher bar of college readiness via SAT/AP/IB exams, research papers, and so on. [And here I will add that another way to determine college readiness is to use the ACT's College Readiness Standards as well as an awareness of the ACT test and the rigor of its items.]
  • Aligned to instructional sequence: Assessments should be aligned to the teachers' sequence of clearly defined grade-level and content expectations, so teachers are teaching what will be assessed.
  • Reassessed: Interim assessments should [continually] reassess previously taught standards.
2) Develop a format/plan/infrastructure for student data. The work might begin with determining what types of data we will gather and how to store them so that we can slice and dice the information and take full advantage of it. What kind of system is best for managing all this information and enabling its crucial usage? I'd love for us to define data broadly and create a system that is truly comprehensive and allows for a three-dimensional understanding of every student (and one that evolves and grows as the students do). We might also want to consider what data/information we want parents to access.
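To make the "slice and dice" idea concrete, here is a minimal sketch of what a student record might look like, assuming a simple Python data model; the field names, standard codes, and scores are all illustrative placeholders, not a proposed schema.

```python
from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class AssessmentResult:
    # All fields are hypothetical examples, not a prescribed format.
    assessment: str   # e.g., "Interim 1" or "Exit Ticket"
    standard: str     # e.g., a placeholder code like "NY.ELA.4.R.2"
    score: float      # fraction correct on items aligned to this standard
    date: str         # kept as a plain ISO string for simplicity

@dataclass
class StudentRecord:
    name: str
    grade: int
    results: list = field(default_factory=list)

    def mastery_by_standard(self):
        """One useful 'slice': average score per standard over time."""
        buckets = defaultdict(list)
        for r in self.results:
            buckets[r.standard].append(r.score)
        return {s: sum(v) / len(v) for s, v in buckets.items()}

student = StudentRecord("Jane", 4)
student.results.append(AssessmentResult("Interim 1", "NY.ELA.4.R.2", 0.6, "2011-10-15"))
student.results.append(AssessmentResult("Interim 2", "NY.ELA.4.R.2", 0.8, "2011-12-10"))
print(student.mastery_by_standard())
```

The point of the sketch is that once results are stored per standard rather than per test, the same records can be cut by student, by standard, or by time span -- the three-dimensional view described above.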

3) For all four subject areas (ELA, math, social studies, and science), analyze the standards (NY State for all four areas plus CCSS for ELA, math, and social studies) and identify clear learning objectives for each standard. We might also provide sample items from the NY State tests for each standard. Further, we could determine "levels of mastery" within a standard, or perhaps come up with precursors, which would help when students are struggling. (Such work could involve going back to standards in previous grades or simply determining the foundational skills students need to build before tackling certain standards.) This chart could also contain specific ideas about what type of "evidence of learning" students need to display to show mastery of particular standards (beyond, of course, the aligned items). Finally, we might align the ACT's College Readiness Standards so that we are aware of how students will need to further develop each standard's skills in order to be college ready.
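A single row of the chart described above might be sketched as follows; every code, objective, and precursor here is a made-up placeholder, not actual NY State or ACT language.

```python
from dataclasses import dataclass, field

@dataclass
class StandardEntry:
    # Illustrative fields only -- one per column of the proposed chart.
    code: str
    objective: str
    sample_items: list = field(default_factory=list)       # released NYS items
    precursors: list = field(default_factory=list)         # prior-grade or foundational skills
    evidence_of_learning: list = field(default_factory=list)
    act_alignment: str = ""                                # College Readiness Standards note

entry = StandardEntry(
    code="ELA.4.X",  # hypothetical code
    objective="Make inferences about character motivation in grade-level texts",
    precursors=["Identify explicitly stated character traits",
                "Retell key events in order"],
    evidence_of_learning=["Short written response citing two text details"],
    act_alignment="Placeholder note linking to a College Readiness skill",
)
```

Laying the chart out this way makes the precursor idea operational: when a student struggles with `entry.code`, the `precursors` list tells the teacher where to back up to.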

4) Do a comprehensive analysis of the NY State tests, several years of which are available online. In doing this analysis, we can classify ("type") the items (and NYS provides a "performance indicator" alignment for each item). We might also ask of each item: "What skills and knowledge are needed to master this assessment question?" Because we can see all the items, we can also do error analysis and determine what kinds of answer choices trip students up, aiming to understand how and why things go wrong for kids. There is often logic in wrong answers; the key is to seize the opportunity error analysis gives us to help students learn from their wrong turns and begin to understand why the right answer is better than the one they selected.
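The mechanics of that error analysis are simple enough to sketch: tally which wrong answer students chose on a missed item and surface the most common distractor. The item, answers, and student IDs below are invented for illustration.

```python
from collections import Counter

# Hypothetical data: each student's answer choice on one multiple-choice item.
responses = {
    "s1": "B", "s2": "C", "s3": "B", "s4": "D", "s5": "B",
}
correct = "C"

# Count only the wrong answers.
wrong = Counter(a for a in responses.values() if a != correct)

# The most common wrong answer is the distractor worth unpacking in class:
# why did it look right, and why is the keyed answer better?
top_distractor, count = wrong.most_common(1)[0]
print(top_distractor, count)  # B 3
```

Run across a whole grade's item file, a tally like this turns "kids got it wrong" into "most kids who missed it chose B," which is exactly the opening the author describes for discussing the logic behind a wrong turn.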