

Better Lab Learning; Easier Lab Grading

posted Jul 18, 2019, 7:49 PM by Mark Schober   [ updated Jul 18, 2019, 7:50 PM ]

Using Standards-Based Grading to Assess Labs   

I don’t like grading lab reports, and I especially don’t like grading them when I get the sense that the lab report is not adding to a student’s content knowledge, lab skills, or communication skills.

For years I did lab reports in a pretty standard way: I'd introduce some equipment, ask the students to make some observations, ask what they could measure, refine the list of variables, discuss a procedure, and set the students loose to gather and analyze data. Afterwards, the students would share their results, and through our discussion we'd develop the important ideas and definitions. I'd send students home with a rubric and ask them to eloquently summarize what happened in the lab and what they learned about physics -- and then I'd end up with a pile of lab reports to assess that generally didn't accomplish what I was looking for. Students regurgitated things without thinking, didn't dig into the analysis like I asked, skipped important points, and spent way too much time on making everything pretty. My grading was filled with cringeworthy moments. I wrote futile comments and suggestions that would be long forgotten by the next lab report, if read at all, as students looked only at their grades before stuffing the reports in their backpacks.

I’ve been using standards-based grading in my course for five years, and I wanted to meaningfully extend it to laboratories. After several iterations, I think I’m headed in the right direction.

I chose to assess lab skills in labs, and to assess physics content through other assessments. There are many times, of course, when the two complement each other so well that I assess both in the same lab report. 

I’ve been thinking about the lab skills I want students to acquire and demonstrate. I consulted existing standards, particularly the Next Generation Science Standards’ Science and Engineering Practices, which provide a pretty comprehensive starting point. For practical purposes, learning objectives for SBG need to be fairly broad so that the number of objectives doesn’t become overwhelming. As you develop objectives for your students, decide what’s really important for students to be able to do, and then design your instruction, labs, and assessments to scaffold the students in meeting those objectives. 

I originally used just three lab objectives, and for more advanced students, this may be a good starting place.
Lab.1 I can conduct an experiment, record data, and distinguish between independent, dependent, and controlled variables.
Lab.2 I can analyze and represent data with graphs and equations.
Lab.3 I can construct explanations and models to interpret patterns observed in the lab.
For my 9th graders, I found that breaking down the objectives into smaller skills has helped enormously. I’ve reorganized my objectives several times and am currently using the following:

Lab.1 I can design and communicate data collection procedures with well-explained and labeled diagrams, distinguishing between independent, dependent, and controlled variables.
Lab.2 I can conduct an experiment and properly record qualitative and quantitative data.
Lab.3 I can represent data with graphs, linearizing as needed.
Lab.4 I can write the equation for the trend in lab data, using variables and units appropriately.
Lab.5 I can explain the relationship between variables and explain any physical significance of the slope and y-intercept.
Lab.6 I can make a scientific claim, support the claim with evidence, and provide the reasoning that connects the claim to the evidence.
Lab.7 I can quantify and explain the limits to my precision in my data collection and analysis.

I’ve listed them in lab-report order, but I don’t assess them in this order, as the first one is one of the hardest and takes the longest to learn. I model how we design experiments, but I’m very much guiding students to a particular procedure. Later in the year, there are a number of labs that lend themselves well to saying: here is what we don’t know, how are we going to do a lab to figure it out?

Lab.2, 3, and 4 are the objectives students first learn in our kinematics and force labs. Mastery involves establishing consistent conventions and developing quantitative analysis skills. For older students, these probably don’t need to be broken into separate ideas, but I find it helpful to split up these tasks for 9th graders. After a few labs of discussion and guided practice, on a subsequent lab, we’ll do our typical pre-lab setup, and then I’ll dispatch students in their groups to demonstrate that they can conduct the investigation, record, graph, and mathematically represent the data — for credit. We then come together to interpret the meaning of the data, modeling a skill that will later be assessed.

By the time we are developing quantitative expressions for calculating energy storage, I ask students to design an experiment to determine how to calculate kinetic energy. 

Electrostatics, electric circuits, and light provide rich opportunities for developing models from largely qualitative observations and data through the claim-evidence-reasoning format. The practice for demonstrating this skill first comes through a qualitative introduction-to-energy lab and a conservation of momentum lab.

Although we referred to uncertainty and precision, I never took the time to really ground the students in this concept, so I never assessed Lab.7. This objective fits so naturally into chemistry labs, I tend to kick that can down the road to next year’s science class. I also wrote an eighth objective that I didn't assess: Lab.8 I can describe further investigations that may clarify, refute, or expand my claim. This lends itself to sophisticated investigations, but can also be used in introductory courses. I try to introduce this idea so that it can be built upon in future courses.

This is an entire lab report!

How does this make my life easier?

  • The objectives signal to students the important ideas that will be seen throughout the year, so rather than ignoring a comment about how to set up data tables or graphs, students recognize that they will need that skill in upcoming labs, and that it’s worth figuring out how to execute the skill and earn credit for it.
  • Rather than nickel-and-diming students based on a rubric, I’ve placed the job on the student to convince me that they understand the objective.
  • The labs serve to develop key content for the course, but I assess content problem-solving understanding separately. This allows students to truly follow their data to logical conclusions, even if it isn’t “right”. We spend plenty of time in class debate after a lab to come to a consensus understanding, and the best ideas with the most support rise to the top.
  • I’m generally assessing only a few objectives in each lab. This helps the students to focus their preparation, it keeps the activities short enough to be completed in class, and it keeps the grading manageable. I ask myself, "did this student convince me that they can perform this lab skill?"
  • When students work together, the discussion between peers is dynamite. They work to convince each other that they are on the right track, and their arguments are rooted in the data they collect.
  • Students complete the labs for assessment in class, which helps to nip the issue of tutor-written lab reports. I do have the students complete the lab assessments in groups because it builds positive interdependence in the lab teams. As lab tasks get more difficult, the only way they can really accomplish them is to work as a team.

This feedback is so much more useful to the students than a letter or percentage grade. When they recognize they did not fully demonstrate their proficiency on an objective, they do go back to the notes I've made on the lab report and try to figure out how to accomplish that skill satisfactorily. They also know that these objectives will be assessed again in future labs, so it is to their advantage to ask questions and seek help in order to become proficient in that lab skill.

By the end of the year, I feel that my students have not only developed these foundational lab skills, but they have developed a larger picture view about how the various steps in an investigation tie together and reinforce one another. 

Building a Local Professional Development Community

posted Jul 18, 2019, 1:38 PM by Mark Schober

I have been fortunate to be deeply involved with two successful local area physics groups, the St. Louis Area Physics Teachers and STEMteachersNYC. The long-standing St. Louis Area Physics Teachers was ever-present in my formative professional development. When I moved to New York City, I found that no equivalent organization existed. Leveraging our connections, we assembled a group of teachers for our first workshop in the spring of 2011. Eight years later, we’ve offered over a hundred workshops serving over 1,200 teachers, with ever-expanding plans. 
I prepared a talk for the 2019 AAPT Summer meeting to share key components of these organizations that will help you to start or strengthen your own. 

What follows is a series of anecdotes that compare and contrast the St. Louis Area Physics Teachers, with which I was involved from 1996 to 2010, and STEMteachersNYC, in which I am currently active. Although quite different in size, they share many of the same features that have benefited teachers and will continue to serve them into the future.
It is helpful to keep in mind why local professional development communities are so important and so valuable:
We form professional development teacher groups because there are never enough physics teachers in any one school to have meaningful content-oriented professional development. Other physics teachers are the best source of pedagogical content knowledge. Local teachers have contextual knowledge about state testing, political climate, funding, unions, local events and resources. Person-to-person interaction builds relationships that give you people you can call in a teaching pinch.
I’m going to divide up the basic ingredients of our teacher alliances into four essential parts, though, of course they are all interconnected:
People - those willing to share their expertise and those eager to grow in their profession
Programming - events, activities, and workshops that are of value to teachers
Communication - between members of the group, reaching out to invite new teachers, and communicating our work and mission to school officials and potential donors.
Resources - meeting locations, equipment, organizational structure, human capital, funding

A core group in geographical proximity to each other that is able to get together, face-to-face, approximately monthly is necessary to build and keep momentum. In St. Louis, teachers trained as PTRAs were tasked with sharing their resources. They formed a core group that pulled teachers together from all over the city.
In New York, we had a university professor who had trained teachers all over the city, including one of my colleagues. That contact led to a discussion that brought together a top-notch group of physics teachers that formed the nucleus for STEMteachersNYC.

Identify a few leaders who will do a bit of everything; since you are reading this, that is probably you. In St. Louis, my mentors were the planners and leaders of workshops. I volunteered to update the group’s website and became the person who dogged people for workshop details to post on the website and send out to the group via email. In New York, Fernand Brunschwig made STEMteachersNYC his focus in his “retirement,” putting far more energy into the group than I had imagined possible.

Volunteers come in two types: those who ask “what can I do to help?” and those who did not know they were workshop or organizational leaders until they were persuasively asked. Perhaps the best PD you can offer to another teacher is to take note of something interesting that is going on in their teaching and then ask them to share it with other teachers. 

Participants are your future leaders, volunteers, and core group. Giving each participant the opportunity to be known and to know others moves them from being a passive observer to someone who is integral to the success of the workshop. Immersive workshops in which participants do exactly the activities that their students would do were the norm in St. Louis, and soon I had some neat approaches to light going on in my classroom that I was asked to share with others. 
In New York, we quickly realized that calling ourselves PhysicsteachersNYC was too limiting -- so many teachers teach in multiple disciplines that after a year we widened our umbrella by renaming ourselves STEMteachersNYC. This shift in identity has helped us to reach physics teachers that we may not have otherwise met -- people who teach a class or two of physics but identify as a physical science or chemistry teacher according to their training. In St. Louis, we held an annual joint meeting with area chemistry teachers, and one of SLAPT’s greatest supporters was a professor of chemical education.

Finding people requires contacts, and word-of-mouth is still the most powerful hook. In both organizations, we’ve mined databases of schools for teacher and administrator contact information to send out mass paper mailings to get the word out, but the most effective approach is still “I heard they hired a new science teacher over at school X, let’s reach out to them”: the long, slow process of building the organization one person at a time.

While the St. Louis group has always been all-volunteer, STEMteachersNYC grew so quickly that we needed staff to handle the day-to-day web maintenance, workshop registration, and money handling. In New York, we now have an executive board with active subcommittees that guide the big picture while letting the staff take care of the nitty gritty.

In New York, we’ve adopted the slogan, “For Teachers, By Teachers, About Teaching.”

Whenever we bring teachers together for a workshop, we want to immerse them in the learning process, just as their students would, and then reflect on the experience from a teacher’s perspective: what teaching tools were used? What variations would work in different school environments? What support will students need? Such workshops strengthen content knowledge in a non-threatening way, and the pedagogy discussion pools the experience and ideas of all participants. In both groups, Modeling Instruction pedagogy has factored in significantly, but so has ISLE, TIPERs, PTRA, Quarknet, Hands-on-Universe, and CASTLE, among many others. Most of our school-year workshops are three hours long on a Saturday or Sunday morning, and in the summer we run workshops from a few days to a few weeks long.

I am a big fan of make-n-take workshops. I like sending teachers away with a class set of materials so that they are much more likely to use what they’ve experienced in a workshop with their students. I’ve made low-cost electrostatics kits, homemade photoelectric effect devices, slow acceleration apparatus, customized globes with laser-cut measuring tools for exploring the seasons, inertial mass balances, wave generators, double slit diffraction . . . lots of stuff. And there’s no better way to spend time with other people than making things together. The conversations are great, food is involved when appropriate, and the best kind of collegial bonding occurs. You are sure to come to the next workshop -- to see your friends as much as to grow as a teacher.

Field trips for teachers are great scouting trips for enrichment activities you might do with your students. In St. Louis, we held our annual meeting/election/workshop planning get-together at the interpretative center for a superfund cleanup site. It was so interesting. In New York, we got a tour of Highbridge, the original aqueduct that brought water into the city with the engineer who oversaw its restoration. We established a great partnership with Six Flags St. Louis to really make physics day a special event where students actually do some very interesting physics. We offered a fall workshop to help teachers learn how to incorporate amusement park physics into their curriculum and prepare them to take advantage of the resources we made available to them for Physics Day in April.

While less of our focus, we’ve also had some fantastic guest speakers. In New York, we heard from the IBM scientist who discovered the foundational principles of Lasik eye surgery, from Christopher Emdin, author of “For White Folks Who Teach in the Hood... and the Rest of Y’all Too,” on culturally relevant teaching, and from climate scientist James Hansen.

Communication has become cheaper than ever before, but that doesn’t mean that it’s easier. As people become attached to a small number of electronic media sources, our organizations now have to post content and monitor responses on multiple platforms. 

Back in the old days, the 2000s, I managed the website for the St. Louis Area Physics Teachers, dogged people with emails for information, and sent out notifications to our mailing list as I updated the website. The mailing list was a low-tech affair that required behind-the-scenes maintenance from me; it has since been upgraded to Google Forms for capturing member sign-ups.

In New York, we’ve used a Google group that allows anyone to post to the group. The emails it generates can contain rich content and we use it to direct teachers to our website (which is really nice). There are such good tools readily available to build the interactivity of the site: Workshop registrations are handled through Eventbrite, donations are accepted through PayPal, and new member signups are managed through Google forms. 

Yet with all this, it’s still nice to have a phone number and an address. When you’re trying to plan a series of workshops, it’s so nice to just be able to talk to someone instead of sending yet another email. And when you’re trying to reach those who do not yet know all you have to offer, snail mail is still a good option to promote what you can offer to teachers and to schools. 
It is possible to run an organization on a shoestring budget, as we did in St. Louis. Current membership dues are $10 per year and are collected on an honor system. It funds website costs, mailings, and an annual award. Workshops are hosted by teachers at their schools on Saturdays or after school. Make-n-take workshops carry a fee that covers the expense of the equipment. 

In New York, we quickly realized that we would need a formal accounting system and non-profit status. For a short time, the American Modeling Teachers Association let us bank through them until we had completed the paperwork and approvals for non-profit status. Part of the process requires adopting a constitution and by-laws, and thinking through that structure provides stability and transparency. Being a formally recognized organization has allowed us to form an agreement with Teachers College, Columbia University, where we hold the majority of our workshops. We also obtained insurance, and we work with an accountant and a lawyer to make sure we are doing everything just right. Having our governance in order has allowed us to apply for grants and solicit funding from individuals and corporations. Lucky for us, our fundraising committee is led by someone who has spent an entire career raising funds for non-profits, and our finance committee is bolstered by several members of the Harvard Business Club, bringing expertise that we teachers never trained for. They helped us to see that $20 for a high-quality three-hour workshop is still cheap compared to the $14 George Washington Bridge toll. Further, we developed a partnership with the New York City Department of Education that allows city teachers to receive per session pay for participation in our workshops and also receive continuing education credit.

In St. Louis, I worked with SLAPT to meet AAPT’s requirements to become a section of AAPT. SLAPT is more active than many AAPT sections and is a model for effective local professional development. I hoped that section membership would give us a voice in improving the section structure, and then I got distracted by moving to New York and participating in the fantastic growth of STEMteachersNYC!

Check out our websites for a sampling of the fantastic things that are going on in these organizations!


St. Louis Area Physics Teachers

A New Physics Curriculum Project

posted Aug 18, 2018, 9:07 PM by Mark Schober   [ updated Aug 18, 2018, 9:12 PM ]

Through a fantastic collaboration between STEMteachersNYC and New Visions for Public Schools, I have the great privilege to work with Kelly O'Shea in developing a new high-school physics curriculum. Kelly's leadership in this project has been visionary and the bulk of the credit for what we have produced must go to her. We are bringing together best practices in physics teaching, our own experiences, and insights from our New Visions collaborators, Kiran Purohit and Libby Chatham. 

Beyond the best physics education research inspired practices incorporated into the materials, the curriculum is structured around classroom practices that overtly support and facilitate equity and inclusion. Some of our guiding beliefs are that:
  • Every student can and should learn physics, therefore, classroom interactions must help every student to feel that they belong in the community of physics learners.
  • Every teacher can and should feel confident teaching physics, acknowledging that the majority of new physics teachers are science teachers from other disciplines.
  • Physics is a collaborative endeavor based upon observation and data, and made sense of through reasoned discussion. Collaboration must be taught and practiced in the classroom while downplaying individualistic, competitive behaviors.
The materials are designed to meet the current needs of teachers preparing students for the New York Regents exam in physics while integrating practices and approaches that will enable teachers to prepare their students well for the upcoming revision of the Regents exam that will be based on the Next Generation Science Standards. 

Kelly and I also desire that all the materials be open-access, freely available, and shared under an attribution, non-commercial, share-alike Creative Commons license. Complete this brief form to get access to the materials, which are accessible through the New Visions site. We thank New Visions and STEMteachersNYC for supporting us as we develop materials to be freely shared. This is a super-exciting project, but very much a work in progress. I'll post progress reports as we develop these materials.

Progress reports:
  • As of July 2018, seven learning cycles for force and motion (version 0) have been posted and we've run a three-day workshop for a dozen New York City teachers. 
  • As of August 2018, over 300 teachers have requested access to the materials.

Grading Scales, Rubrics, and Adding it All Up

posted Jul 4, 2018, 6:25 PM by Mark Schober

Once you have created your learning objectives and determined assessment questions that address those objectives, you next need to choose how you will rate student performance on those objectives. Finally, you will need to choose how to aggregate all of the scores into a current grade or a final grade. You’ll have to consider the constraints of your school’s learning management system and grading software, and you have to consider your workload. SBG generates a lot of student performance data, and you need to be cautious about how each choice you make can affect the quantity of data generated and the amount of time it will take you to process it.


Two-Level Rubric

The simplest rubric system is a binary yes/no; perfect/not quite there yet. It’s what I use for my physics class, and I’ve chosen it because I have a relatively large number of objectives (~35) and not all objectives are equally important. It also works because my students are highly motivated, and when they do well, but not well enough to earn a proficient rating, they don’t need partial credit to encourage them to practice and reassess -- they want to perfect their understanding.


P = proficient (1)

L = learning (0)

 = unable to assess (no gradebook entry)

I want to see a certain number of proficient scores on each objective, and I want to see more proficiencies on some objectives than others. So in my system, there’s no averaging of scores for an objective or across objectives, it’s just a count of proficiencies earned out of the total number of proficiencies I want to see during the year. A student’s gradesheet is a double-sided piece of paper that they can use to track their progress. I also keep a gradesheet of their progress, and we compare before the end of each grading period to correct any errors or omissions.
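The proficiency count described above is easy to tally mechanically. Here is a minimal sketch in Python; the objective names, target counts, and attempts are hypothetical examples, not my actual course targets:

```python
# Sketch of a binary (P/L) proficiency count: no averaging, just a tally of
# proficiencies earned against per-objective targets. All data is made up.
from collections import defaultdict

# How many proficient (P) ratings the teacher wants to see on each objective.
targets = {"Lab.2": 4, "Lab.3": 4, "Lab.4": 3}

# Each assessed attempt: (objective, rating); "P" = proficient, "L" = learning.
attempts = [("Lab.2", "P"), ("Lab.2", "L"), ("Lab.3", "P"), ("Lab.4", "P")]

counts = defaultdict(int)
for objective, rating in attempts:
    if rating == "P":            # "L" ratings and unassessed work add nothing
        counts[objective] += 1

# Cap each objective at its target so extra proficiencies don't inflate the total.
earned = sum(min(counts[obj], goal) for obj, goal in targets.items())
total = sum(targets.values())
print(f"{earned} of {total} proficiencies earned")  # → 3 of 11 proficiencies earned
```

A paper gradesheet records exactly the same information; the point is that the "grade" is a running count against targets, not an average.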

Three-Level Rubric

I based my scale on Kelly O’Shea’s scale, which I liked for its simplicity.


2 = Mastery shown (this is a “yes”)

1 = Developing mastery— could be an error in process, arithmetic, units, etc, but something about the approach was correct. (this is a “no”)

0 = No mastery shown— so many errors or confusions that the student does not seem at all close to mastering this skill. (this is a “no”)

– = No data —  student misinterpreted a question so much that the skill I’m trying to test is not observable in their response, or I don’t see their response as good evidence either way, or their response simply did not involve the skill. It is sometimes possible to have a completely correct solution without showing a particular skill that I was expecting to see.

Notably, Kelly divided her learning objectives into two levels, “A” objectives and “B” objectives. The foundational ideas, the B objectives, build a student’s grade from 70 to 90%, and the more sophisticated synthesis tasks, the A objectives, allow students to earn grades from 90 to 100%. This makes aggregating the overall score a bit more complicated, but it is a good way of acknowledging that some objectives are more sophisticated than others. See Kelly’s blog for the details.

Four-Level Rubric

Kelly O’Shea’s four-level system uses numerical values that translate into conventional grades easily: taking a student’s best score on each objective, averaging those best scores, and moving the decimal point one place to the right yields a conventional percentage grade.

10: Good understanding. All (or all of the relevant) sub-skills are shown. The errors (if any) are merely cosmetic.

8: Developing understanding. Several of the sub-skills are shown. There are some errors or omissions in the work.

6: Beginning understanding. Some of the sub-skills are shown. Significant conceptual errors may be present.

5: No attempt made or work shows no understanding of this skill.
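The aggregation for this scale can be sketched in a few lines of Python. The objective names and score histories below are invented for illustration:

```python
# Sketch of the 10-8-6-5 aggregation: best score per objective, averaged,
# decimal moved one place right to get a conventional percentage.
scores = {  # objective -> scores from repeated assessments (illustrative data)
    "Objective 1": [6, 8, 10],
    "Objective 2": [8, 8],
    "Objective 3": [5, 6, 8],
}

best = [max(attempts) for attempts in scores.values()]  # best score per objective
grade = sum(best) / len(best) * 10                      # average, then shift decimal
print(f"{grade:.1f}%")  # → 86.7%
```

Because only the best score on each objective counts, reassessment can never lower a student's grade under this scheme.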

Here is a 4-point scale that I originally used for my astronomy and modern physics class:

4 = correct with exceptional clarity and organization

3 = answer correct

2 = principles applied with partial success

1 = basic principles identified

0 = no understanding demonstrated

I applied the same rating scale for every objective, and at the end of the semester I added up the best score a student had earned on each objective to get a total score, which was compared against a numerical scale that I had set. Earning all 3’s results in a B; some 4’s are needed to earn an A.

I've now switched to Kelly’s 10-8-6-5 scale for the astronomy class using the following rubric:
10 = principles applied perfectly with attention to details*
8 = principles applied demonstrating understanding
6 = principles identified
5 = no understanding demonstrated 

* Details. My problem solutions (a) are mathematically accurate, (b) include units on all numbers, and (c) report answers to appropriate precision given the precision of the measurements or provided values.

Robert Marzano advocates a four-level marking system and has developed lots of detail as to why. Once he goes into detail about fractional points (2.75 vs. 3.0), he loses me -- that kind of hair-splitting is a logistical nightmare. Don't do it! If you adopt a multi-level marking system, it is important to have a small number of learning objectives, or you will be buried in bookkeeping. He also suggests that for every one of your objectives, you write tasks characteristic of each level of that objective. While that could bring some consistency between different teachers of the same course, it’s a crazy amount of work of the sort that district curriculum coordinators might tackle.

Here is my mash-up of several of Marzano’s charts into one that I share in order to inspire a much more nuanced sense of what each level in a four-level grading system might mean and how to assess for each level of understanding. Pick out of this what is useful, and if you want to learn more, check out his book, Designing & Teaching Learning Goals & Objectives.

Each row gives the level of difficulty, the learning goal in Marzano’s New Taxonomy, and the action verbs and tasks characteristic of that level:

Level 4 (score 4; traditional grade: A) -- more complex learning goal: Knowledge Utilization (decision making, problem solving)
Action verbs and tasks: decide, what is the best way, which is most suitable, solve, how would you overcome, adapt, develop a strategy, figure out a way, how will you reach a goal given constraints, experiment, generate and test, test the idea that, what would happen if, how would you test that, how can this be explained, based on the experiment what can be predicted, investigate, research, find out about, take a position on, what are the differing features of, how did this happen, why did this happen, what would have happened if

Level 3 (score 3; traditional grade: B) -- target learning goal: Analysis (analyzing errors)
Action verbs and tasks: categorize, compare and contrast, differentiate, discriminate, distinguish, sort, create an analogy, create a metaphor, classify, organize, assess, critique, diagnose, evaluate, edit, revise, generalize, infer, create a principle, create a rule, trace the development of, form conclusions, make and defend, predict, judge, deduce, develop an argument, under what conditions

Level 2 (score 2; traditional grade: C) -- simpler learning goal: Comprehension
Action verbs and tasks: describe, explain, paraphrase, summarize, symbolize, depict, represent, illustrate, draw, show, use models, diagram, chart

Level 1 (score 1; traditional grade: D) -- with help, partial success at level 2 and 3 content
Action verbs and tasks: recognize, select, identify, determine, exemplify, name, list, label, state, describe, use, demonstrate, show, make, complete, draft

Score 0 (traditional grade: F) -- even with help, no success

Given the additional student performance data from your rubrics, it should occur to you that this won’t fit into a conventional gradebook. In a conventional gradebook, students are listed in rows, assessments in columns, and assessment scores are recorded at the intersection of the two. With SBG, you’ll have multiple scores for each assessment. An SBG gradebook lists students in rows, and the learning objectives as the column headers. The assessments live in a third dimension, as each objective can be assessed multiple times, acting like layers on top of the student-objective grid. Software has been written just for this task, but there are also ways of recording scores on paper, as well.
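One simple way to hold that three-dimensional structure is a nested mapping with a list of assessment layers at each student-objective intersection. This is only a sketch; the student, objective, and assessment names are invented:

```python
# A 3-D SBG gradebook: students x objectives, with a stack of
# (assessment, score) layers at each intersection. Data is illustrative.
from collections import defaultdict

# gradebook[student][objective] -> list of (assessment_name, score)
gradebook = defaultdict(lambda: defaultdict(list))

gradebook["Ana"]["Lab.3"].append(("Quiz 2", 8))
gradebook["Ana"]["Lab.3"].append(("Lab practical", 10))
gradebook["Ana"]["Lab.4"].append(("Quiz 2", 6))

# Best score per objective for one student -- the usual input to
# whatever aggregation scheme the course uses.
best = {obj: max(score for _, score in layers)
        for obj, layers in gradebook["Ana"].items()}
print(best)  # → {'Lab.3': 10, 'Lab.4': 6}
```

The same structure maps directly onto a paper gradesheet: one page per student, objectives in rows, and each assessment adding a column of scores.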

Ideally, the nuanced objective-by-objective scores would constitute the preferred grade report because it provides so much more information than a single letter grade provides. Unfortunately, our educational systems tend towards simple grade reporting and eschew subtlety, and, therefore, once the data is recorded, we need to find a meaningful way of aggregating the data to generate overall grades. The rubrics above and their linked explanations suggest systems for coming up with an overall grade, and most SBG gradebooks accommodate these calculation schemes for you.

No matter which system you use, I believe it is vitally important for each teacher to have a voice in the student objectives, the rubric, and the system of grade aggregation. You need to fully believe in what you are doing, and a system that has been forced upon you will be frustrating in all the ways it is not optimal and in your inability to improve it. SBG does not lend itself well to top-down implementation. To fully integrate SBG into their instruction and assessment, teachers need the flexibility to modify, revise, and adapt it based on their daily experience in the classroom.


SBGbook – by Josh Gates

As some of the free-to-teachers standards-based gradebooks like ActiveGrade were bought up by other companies, Josh Gates created a standards-based gradebook from scratch that does all of the things you would want. It supports multiple teachers and individual or shared objectives, and Josh has been wonderfully responsive in providing customizations to meet users' needs. It's definitely worth a close look.

Each student will need a page with objectives listed in rows and assessments listed in columns. For every objective measured in an assessment, the score is recorded in that assessment's column. The nice thing about a paper gradesheet is that each student can be given a blank copy to record their progress during the course, encouraging them to think about their learning in terms of the objectives. I use paper gradesheets for my physics and astronomy classes and it works great.

PowerSchool (formerly Haiku and ActiveGrade)

PowerSchool is the most widely used learning management system in the country, and it has the capability to support SBG, but you might need your school’s PowerSchool administrator to turn on the SBG functions. In most schools, the PowerSchool administrator is the only one who can set the objectives, rubric, and grade aggregation system, so you’ll need to spell out what you need very clearly. Here are the directions for the administrator to actually make those modifications to the PowerSchool settings:

To experiment with PowerSchool without the constraints of your site administrator, you can create a solo teacher account in which you control all the parameters. That account, however, won’t have access to your school database of student and parent information that would give students and parents access to the gradebook. You, of course, could add all of that information in your spare time.   :)

Q. How do I create a Solo Teacher account in PowerSchool Learning?

A. Go to and click New to PowerSchool Learning? Let's get started! at the bottom of the page.

That's it! Once you click that link, select I am a Teacher, and we'll walk you through the account creation process.

JumpRope – free for individual teacher users

JumpRope does everything you need SBG software to do, though some components still require the Adobe Flash plug-in; the programmers are working to eliminate that hassle. It's also backed by the NYC Department of Education, so NYC teachers can lean on that connection for support.

Schoology – supports SBG if your school pays extra for the Enterprise version.

Skedula – supports SBG.

Kiddom – standalone SBG software that is developing integrations with the Edmodo, Schoology, Engrade, and PowerSchool LMSs.

Google Classroom, Edmodo, and Whipple Hill do not support SBG at this point. Yeah, my school uses Whipple Hill. Therefore, I use the wonderfully reliable technology of paper.

Crafting Multipurpose Learning Objectives

posted Sep 12, 2017, 8:01 PM by Mark Schober   [ updated Jul 7, 2018, 8:24 PM ]

Well-written learning objectives make expectations clear to students, bring focus to instructional time, encompass the external requirements on your course, and guide the design of assessments. In my teaching, learning objectives also serve as the starting point for conferences with students, valuable content for grade narratives, and they form the heart of my standards-based grading system.

Depending upon whom you read, goals, objectives, and standards are all defined differently. It’s important to know your administrator’s definitions of these terms so that you can please the powers that be, but this document takes a more pragmatic approach in order to create statements useful for standards-based grading: what we’ll eventually call student learning objectives.

Multiple Purposes of Learning Objectives

  • Learning objectives clarify models and connections between models
  • Learning objectives make expectations clear to students and support metacognition 
  • Learning objectives bring focus to instructional time
  • Learning objectives guide the design of assessments 
  • Learning objectives jump-start conferences with students
  • Learning objectives provide meaningful content for grade comments
  • Learning objectives form the core of a standards-based grading system
  • Learning objectives summarize your course for colleagues/administrators
In broad terms, what do you want students to take away from your course? What do you want them to remember years from now? What skills do you want them to develop? You may have prescribed state, district, or national course goals that you are to follow; you may be following the goals for particular end-of-course assessments; or you may be free to design the course as you see fit. In any case, begin by writing down what you want students to know and do by the time they finish your course. If it’s important to you, write an objective for it. Such a list could initially be many pages long, but through successive revisions, you’ll pare it down into a compact, robust list of student learning objectives. 

Version 0: Gathering resources. There's no writing at this stage, it's just researching and cutting and pasting. Get out your course syllabus and any curriculum guides or textbooks you follow. Find the national, state, local, and standardized exam objectives that apply to you. Even if you aren't beholden to these standards, you'll find that these statements/standards/objectives will jumpstart the process and give you something to react to as you write your own learning objectives. Note also that few of these are worded in a way that will be helpful for assessing student growth in your classroom. Through the steps that follow, you'll develop a more user-friendly set of learning objectives.

Version 1: Write down your teaching objectives. Robert Marzano suggests starting your list of learning objectives using these prompts:

Students will be able to _______________________

Students will understand ______________________

The former refers to procedural knowledge (skills) and the latter to declarative knowledge (content). As you craft this initial list, you are writing from a teacher’s perspective, so feel free to use discipline-specific terminology that may not be that useful to the student. If you have been teaching for a while, you will be able to write these down off the top of your head, but if you’re teaching a new course or are new to teaching, other teachers’ objectives are a good place to start.

Version 2: Choose your words carefully. While “understand” and “will be able to” are helpful for starters, both should eventually be replaced by more specific, measurable verbs. Marzano suggests Name, List, Label, State, Describe, and Identify for demonstrating declarative knowledge, and Use, Demonstrate, Show, Make, Complete, and Draft for procedural knowledge. 

Version 3: Organize and simplify your list. Here is where you think about the deep structure of the content of your course. Ultimately, you want fewer than forty objectives for the year -- aim for twenty or fewer if you can. That kind of consolidation requires identifying the big themes, recurring skills, and core concepts of your course. Here are some things to think about when streamlining your list of objectives. 

1. Specific Outcomes: Student learning objectives should be concrete in terms of what the students should be able to demonstrate, but broad in scope. If you get buried too deep into content details, you’ll have too many objectives to keep track of.

2. Reusable Objectives: Ideally, each objective is assessed many times throughout the course -- each new topic builds upon the ideas introduced earlier in the course and requires the application of previously learned skills. Therefore, a new unit of study may only add a few new learning objectives while still using many of the previous learning objectives.

3. Content AND Skill Objectives: Avoid the tendency to think too much along content-driven objectives and conscientiously include skill-based objectives. For example, problem solving skills can be shown in the context of all sorts of different content and may make a better starting point for a unifying objective in your course.

4. Skip Trivia: Topics disconnected from the central themes of the course are what students will forget almost immediately -- so either leave them out or find a way to connect them to the central themes of the course.

5. Show Content Understanding through Skills: Skills and content are often intertwined in middle and high school math and science courses, and combining them into objectives that demonstrate understanding through a problem-solving skill makes for robust learning objectives.

6. Tiered Objectives: Some of your objectives may refer to different levels of mastery of the same underlying concept. Such groupings lend themselves well to a rating scale, where each level of understanding is specifically defined. (If you don’t have objectives that group in this way, that’s fine, you can use a more generic rating scale.)

Version 4.0: Rewrite as student objectives. Write your objectives so that they make sense to students. The “I can _______” format for each learning objective gets students thinking about what they need to be able to do, where the blank is essentially identical to what you wrote from the teacher’s point of view. The only change you might make is to adjust the language to be more meaningful to the student.

Version 4.x: Keep revising. Your classroom activities will shift your objectives and assessments, even as preparing students to meet your learning objectives drives your classroom activities. Use this feedback loop to improve both your objectives and your instruction. Also, while grading a pile of assessments in about March, you’ll think about how much easier the task in front of you would be if you had just worded an objective a little differently, or combined two objectives, or added a different type of objective. Make note of those thoughts and keep revising. It’s an ongoing process.

Sharing Your Expertise

posted Jul 30, 2017, 8:17 PM by Mark Schober   [ updated Jul 30, 2017, 8:29 PM ]

Presented at the 2017 AAPT summer meeting in Cincinnati upon receiving the 
Paul W. Zitzewitz Award for Excellence in Pre-College Physics Teaching. 

Thank you, Dr. Bailey, the AAPT awards committee, my students, my nominators, and all of you from whom I have learned so much. We can accomplish so much more with each other's help, and so many instrumental people who shaped my career have not yet received this award. Please join me in nominating the physics teachers who influenced you for the Paul W. Zitzewitz Award for Excellence in Pre-College Physics Teaching. 

Workshops are great venues for sharing expertise. This is not a workshop, so I won’t try to share much expertise. Instead, I have some stories to get you thinking about the professional nudges you’ve received and those you have given. Perhaps, too, my tales about professional learning communities, Modeling Instruction, and standards-based grading might pique your interest to learn more.

It’s easy to imagine that we forge our own professional path -- that our choices and decisions have built our careers. But it seems to me that our paths are much more like Brownian motion – where countless interactions with others shape our path, sometimes obviously, but often invisibly -- pushing us in a particular direction. 

So throughout this talk I’ll ask you to remember people that have influenced you. First one: Think about the experiences and people who first opened your eyes to the world of science . . . I hope you’re thinking happy thoughts.

In my case, my parents ran a newspaper in southern Minnesota where I was surrounded by technical wonders -- a 3000 pound paper cutter with a massive flywheel that rumbled the building as it spun -- the temperamental offset printing press infused with the smells of ink, oil, gum arabic, and quickwash solvent -- the old trays of moveable type and the heavy lead linotype slugs -- the 14-foot long flatbed camera with one end in the dark room where the funky smell of developer contrasted with the sharp acid smell of fixer which accumulated silver ions the more it was used -- the clanky 9-foot tall linotype on its way to the scrapper. We even had a hand pump for water until my parents rebuilt, remodeled, and replaced the ancient utilities and walls, always allowing me to be fully involved in fixing and building.
At home, we maintained a huge yard and garden — planting, weeding, watering, and noting what grew well and what didn't. Harvest involved lots of cooking, canning, and freezing - work that we enjoyed as little tastes of summer during the long Minnesota winter.  Mom and Dad’s curiosity and inventiveness around physics, chemistry, biology and ecology, as well as their dedicated service to the community, created a formative environment that made science and service almost inevitable for me. 

No doubt, most of us see teaching as service. And I’ll bet that those who freely shared their time and expertise are still some of your favorite people. 

Wayne Johnson and Tom Butterfield helped me learn how to lead others in Boy Scouts while using their vacation time to take us camping in the Minnesota boundary waters. Who pushed you out of your comfort zone?

Leo Morgan endured my unending questions as I learned woodworking and industrial technology with him. Who had unlimited patience for you?

Bob Stephans taught me acting, singing, and tech theater. Who assigned you a task and said, figure it out?

Gladys Meyers taught me how to run that cantankerous printing press at the newspaper. Who was your all-in-one tutor, trouble-shooting manual, and personal support?

Band director Burt Svendsen helped me build sets for four community theater productions. Who kept you going on big projects with, “What’s next, boss?”

Professor Mark Gealy helped me to understand physics -- with a sense of humor. Who helped you keep things in perspective?

Professor Orv Haugsby taught me to love differential equations and took me on a month-long mathematical exploration of Europe. Who showed you what math, science, and art have in common?

College roommate and fellow physics-teacher-to-be Eric Koser and I road-tripped to Boise, Idaho in 1993 for my first AAPT summer meeting. At the meeting, Eric’s dad, John Koser, introduced us to everyone and made us part of this community. Who stood by your side and who welcomed you in?

My point is that these nudges of shared expertise shape us even more than our conscious decisions.

I went to grad school just up the road from here at Miami University where I worked with Jim Poth and Beverly Taylor. Dr. Taylor involved me in the Teaching Science with Toys program, and I still use toys in my teaching today. But most of my time was spent working with Dr. Poth. 
I assisted him in teaching two courses where we tested draft versions of Tutorials in Introductory Physics, and Physics by Inquiry. Jim and I would frequently have lunch together to go over every line of the curriculum. Why is this question being asked? What is the student difficulty that this is seeking to counter? What are the key questions that should be asked at this checkpoint? Jim gave me a sense of how you could teach in a rich learning environment by asking good questions. Jim also encouraged me to make a presentation at the Ohio section of AAPT on my work in assessing student understanding of electricity and magnetism. I didn't have very dramatic results, but Dr. Poth started me into the world of sharing with other teachers.

Pursuing a teaching position in St. Louis, I met Bill Brinkhorst as he led an AAPT make-and-take workshop with Val Michael. 
I had skilled, generous colleagues in Bill and my office mate, Jerry Taylor -- Who took you under their wing? I still struggled as a new teacher, recognizing that the wonderful materials I had worked with in grad school were completely unsuited to my high school students. I borrowed extensively from my colleagues, without really knowing how to use their materials to serve my students best.

In my second year of teaching, I was encouraged to apply to a professional development program called Modeling Instruction, recommended by Rex Rice. Who looked out for you -- identifying opportunities that you didn’t know existed? 
I committed to two summers at UC Davis under the guidance of Don Yost, Wayne Finkbeiner, and Jeff Hengesbach, and it was just what I needed. 

The multi-week modeling workshop placed participants in student mode while the workshop leaders guided us through labs, problem solving, and lab deployments as they would with their own students. We represented problem solutions on big dry-erase boards and presented them to each other, fielding inquiries and Socratic questions from fellow participants and the leaders. Despite our degrees in physics, we found that we often hid behind terminology or algebra. The immersive experience allowed us to relearn physics content, practice new pedagogical approaches, and discuss common student issues. 
We turned away from a topics-based approach to physics and started thinking about how to use lab data and multiple representations such as graphs, diagrams, equations, and words to develop the underlying and unifying models of physics.

Modeling gave me a practical framework to implement an instructional philosophy that built upon my prior experiences with Dr. Poth. The sample curriculum was editable and adaptable to my students’ needs. The Modeling Instruction community, born out of the intense process of questioning everything we thought we knew about physics teaching, formed committed bonds among fellow teachers. 
We believed so strongly in keeping Modeling Instruction going that we formed the American Modeling Teachers Association to promote, strengthen, and train teachers in Modeling Instruction. I led scores of workshops and worked with the most wonderful teachers across the country in the process. I served as AMTA’s president and I’m now back on the board as we strive to fulfill our mission while staying true to our grass-roots origins.
There are a number of in-depth professional development programs for high school teachers like Modeling, and I hope that you have connected with one. My teaching, before and after I started Modeling, is the difference between playing notes and making music.

Of course, we should be attentive to our students’ feedback, because any one of them has spent more time in our classroom than any adult ever will. So, once I started Modeling, I asked my students what happens in our physics class that helps them learn well.

One student said, “I like how we work in small groups then share our work with the rest of the class, and we all give each other beneficial comments.”

Another said, “When we work through problems as a class and have different groups present them. Also, when we are shown physical/drawn representations of different ideas.”

A third said, “Completing labs helps me to understand the math behind the physics we do and helps me to have a greater understanding of the material by seeing it first hand.”

In St. Louis, I was warmly welcomed into the St. Louis Area Physics Teachers – a group of mostly high school teachers that got together once a month in each other's schools to share a workshop on some aspect of physics teaching. Physics teachers are rare enough that we really need local professional learning communities to find peers beyond our schools. 
Here I learned from peers while also being encouraged to share what was going well in my classroom. We learned from each other and we shared with one another. 

I hope that you have been able to find a local professional support community, but these are still few and far between. For example, when I moved to New York City, I looked forward to finding the equivalent of the St. Louis Area Physics Teachers. To my surprise, no such organization existed. 

My new colleague, Nate Finney, had just completed his program at Teachers College, Columbia University, and suggested that I talk to his science education professor, Fernand Brunschwig. Fernand was also interested in forming a teachers’ group, and he started reaching out to other teachers. We offered a weekend workshop that attracted enough people and produced enough momentum to run a series of weekend workshops throughout the coming year. It quickly became clear that “physics teachers NYC” was too limiting a name for the diverse group of teachers we were attracting, so we rebranded ourselves STEMteachersNYC. 
Five years later, our Google Group has 800 members, we run about 20 weekend workshops a year, we’ve hired staff, and we’ve formed partnerships with AAPT and AMTA among others, enabling us to offer extensive summer workshops: for teachers, by teachers, about teaching.

It's hard to imagine starting a group like STEMteachersNYC (especially without Fernand and his professional-fundraiser spouse, Jennifer Herrig), but an all-volunteer group like the St. Louis Area Physics Teachers has pretty straightforward ingredients: Gather some teachers in the same space. Ask each teacher to introduce themselves and share one thing that went particularly well in their classroom this year. That’s it. You could add coffee and donuts, of course. Note the teaching practices that you want to learn more about, and build workshops around them. It’s worked in St. Louis for over 30 years.

Sometimes the expertise shared with you is less like a nudge and more like getting hit by a truck. Seth Guinals-Kupperman and I had dinner to talk about upcoming workshops for STEMteachersNYC, and I asked him to explain standards-based grading to me. He offered an analogy: when you get the credit card receipt for dinner, it shows the total amount, but it doesn’t indicate what was paid for the way the itemized bill does. Similarly, a single letter or number grade doesn’t tell students what concepts they do and don’t understand, so standards-based grading provides itemized feedback according to your learning objectives. The learning objectives help keep instruction and assessment focused on what’s important in the course, and they provide students with clear criteria for success. Also, if a student has trouble with a concept, they practice and try again.

I looked at Seth and took a deep breath. I had a lot of work ahead of me. Have you had one of those moments of realization? He was right -- this would serve my students better. I had been hiding behind the false objectivity of points-based grading and I couldn’t really say what concepts each student did and didn’t understand. Further, like learning a sport or an instrument, mastery requires sustained practice, specific feedback, and room to make mistakes. I wanted to provide that in my physics assessment.

So, sold on the concept but daunted by the logistics, I got the nudge I needed from meeting Kelly O’Shea. She talked me through a manageable system for keeping track of all of the student data, so I jumped in with both feet, and it made a monumental shift in my teaching. My old tests walked the line between being too long to finish and having so few items that missing one question completely screwed up a kid’s grade. I now give a series of short quizzes, often containing a single rich, robust, multifaceted problem that makes it clear what the student does and doesn’t understand. Kelly describes SBG assessments as low stakes (because you can try again) with high expectations.

I asked my students to describe our system of assessment and how it impacts their learning:

One student said, “I really, really love the system. Especially because with retakes, you’re constantly having to remind yourself of old material and everything is cumulative.”

Another said, “In Physics class, I can feel that it is okay to make mistakes because Mr. Schober encourages us to try our best and work on our weaker strengths.” (weaker strengths?)

Another said, “Suddenly, it's not about me and how 'smart' I am but about what I know.”

And another, “I enjoy it because it is very low stress; instead of worrying about each assessment being a grade set in stone, I can consider it a learning opportunity and a way to prove what I understand.” 

We all share our expertise with our students and children -- that’s our job. But I hope I’ve illustrated how important sharing an idea with another teacher can be. I make this plea to you because 75% of the nation’s high school physics teachers aren’t connected to AAPT, and you, by being present at the summer meeting, are the one percent. Further, two-thirds of those teaching physics don't have a degree in physics. These non-physics, non-AAPT physics teachers are wonderful people who mostly come to us from the other sciences. Physics got added to their teaching load because an administrator looked at them and said, "Oh, you teach science, teach physics too." We can be an extremely valuable resource to this large population of physics teachers. In fact, most of you could lead a whole variety of workshops in a heartbeat. For example:

Student activities for developing a particle model of light.

Teaching electrostatics with sticky tape, pie plates, soda cans and foam cups. The MacGyver approach.

Using energy bar graph representations - or as the kids call them, LOL’s!

Using labs to introduce a concept - From the holy book of Arnold Arons: concept first, then give it a name. 

Or a make-and-take workshop to send participants away with lab or demo equipment they can use.

Such workshops would benefit those new to teaching physics, those who have physics as a fourth prep, or anyone else looking to sharpen their content knowledge and teaching skills. No matter the topic, sharing your expertise locally is vital to improving physics education nationally. 

There are also workshops maybe only a few of you could lead, and if you're one of those teachers, please share: 

How do I go beyond “I treat all my students equally” to “How can I best meet the diverse needs of diverse learners in the physics classroom, acknowledging that, because of the inequities in our society, not all students need the same attention?” 

How do I teach physics as part of a unified STEM experience for my students, and how can I work with my colleagues to make this happen?

What can I do in my physics class that helps students transfer physics skills to a greater science awareness and scientific literacy?

I hope during my ramblings you have daydreamed a little bit, and that you are reminded about how much you have to share and how much other teachers need what you have to share. Thank you for this award and to all of you who have shared so much with me. Keep sharing your expertise so that your ideas outlive your teaching career in the teaching practice of others. Thanks to Jerry Taylor, Jim Poth, and Paul Zitzewitz, who did just that.

An 8-Step Transition from Points-Based Grading to Standards-Based Grading

posted Apr 14, 2017, 10:13 AM by Mark Schober   [ updated Jul 30, 2020, 8:37 PM ]

Even if you're sold on standards-based grading, the transition can be daunting. Here is a roadmap for sequentially implementing key aspects of SBG, improving a points-based grading system in the interim, and setting yourself up for a successful transition to SBG. Join Manjula Nair and me for a three-day online workshop July 27-29, 2020, in which we will flesh out the elements below for making the shift to SBG. Registration is through STEMteachersNYC.

There are excellent articles about why SBG is a less-imperfect system of assessing student performance than points-based systems. (No system is perfect!) I'm assuming that you are interested in SBG and are ready to give it a try.

1. Build your allies and identify your resources. The best laid plans can be frustratingly thwarted without a support base beyond your classroom. Communicating to various constituencies is an ongoing project. SBG is different from what people are used to, and it will take time to win some people over. But as you grow in your understanding of SBG, you will also find more effective ways to communicate why you are doing what you are doing to parents, students, colleagues, and administrators.

2. Write learning objectives. No matter what kind of grading system you use, clearly define what you expect students to be able to do after instruction and practice. Express the objectives in terms that the students can understand and share them with the students so that they know what is expected of them. Writing good objectives that are clear, useful, non-trivial, and concise is a difficult task. 
  • Start with objectives others have created and adjust them to fit your needs.
  • Keep the number of objectives small, perhaps 10-40 for the year.
  • Many objectives will be assessed repeatedly throughout the year, providing the students opportunities to demonstrate growth.
  • Write objectives for the skills and traits you value. (I don't have a homework objective, but next year I will add an "organized notebook" objective for freshman physics.)
3. Design assessments and performance tasks that assess your objectives. Discard question content that isn’t relevant to your objectives or assessment tasks that don’t use the skills your objectives value. Also, use your assessments to rethink your objectives. If there is something in your assessment that you find is really key, modify your objectives to accommodate that concept or skill.
  • Assessments can take many forms: from tests and quizzes to performances, debates, lab reports, papers, projects, discussions, interviews, and team challenges.
  • Keep assessments short - assess just a few objectives at a time. (This gets around the conundrum of designing percentage-graded tests that need to be short enough that students can finish in the allotted time, yet have enough items that failure on one test item won't result in a failing grade on the assessment.)
  • Assessments should have high expectations but low stakes. Ask sophisticated questions and require demonstration of skills that push the students. Students not yet able to succeed are encouraged to practice and try again.
4. Provide feedback in terms of the learning objectives. Rather than rating student performance on a quiz or test in terms of points or a percentage, let students know how well their work met the objectives. Give students metacognitive opportunities to consider for themselves how well their work met the objectives.
  • Keep the rating system simple, especially when starting out. (Three level scale: proficient, learning, unable to assess. Four level scale: exceeds target, meets target, partially meets target, unable to assess.)
  • If a student is absent for an assessment or unable to attempt an answer, the rating is “unable to assess”. This does not count against a student the way a score of zero would in a points-based system where scores are combined by averaging. Instead, no proficiency is earned, and the student prepares to assess again in order to demonstrate their understanding.
  • Students can mark up their own assessments, recording feedback that is useful to them and getting instant feedback on their work. When students finish their quizzes, I have them leave their pencil behind and I give them a blue pen to make corrections on their quiz. In this way, I spend a lot less time writing corrections (that the students don't really read) because they have already made meaningful notes to themselves. They should also self-rate how they think they did in terms of the learning objectives. The teacher then adds any comments that the students may have missed and provides their rating of the student work. 
  • Homework can become more self-directed: students practice the concepts and skills they need to demonstrate success in. They don’t need to practice what they’ve already mastered.
5. Record student progress according to the objectives. Instead of a gradebook consisting of a grid of students vs. assessments, each student gets a page with a grid of objectives vs. assessments. Paper versions can be given to students so that they can track their own progress. Electronic gradebooks, such as SBGbook by Josh Gates, JumpRope, or Haiku (now a part of Powerschool), are designed for standards-based grading and can be configured so that students can monitor their progress online.
  • Again -- keep things simple. Your number of students times the number of assessments and reassessments times the number of objectives per assessment means a lot of information to keep track of. Don’t bury yourself in bookkeeping -- spend your time teaching!
  • With the focus on objectives, the discussion around grades completely changes. Instead of a student coming back to you after a poor test and asking how they can raise their grade, they instead ask how they can show their understanding of a particular objective.
  • Allow online access to grades cautiously. You want students to focus on the content and skills they need to explore, practice, and master, not the grade.
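The objectives-vs-assessments grid described in item 5 can be made concrete with a small data structure. This is a hypothetical Python sketch, not the author's actual gradebook; the objective names and ratings are illustrative only, using the simple three-level scale mentioned earlier.

```python
# A minimal sketch of a per-student gradebook: a grid of
# objectives vs. assessments rather than students vs. assessments.
# Objective names and ratings below are illustrative only.

RATINGS = ("proficient", "learning", "unable to assess")

def record(gradebook, objective, assessment, rating):
    """Record one rating for one objective on one assessment."""
    assert rating in RATINGS
    gradebook.setdefault(objective, []).append((assessment, rating))

def needs_practice(gradebook):
    """Objectives with no 'proficient' rating yet."""
    return [obj for obj, marks in gradebook.items()
            if not any(r == "proficient" for _, r in marks)]

student = {}
record(student, "constant velocity graphs", "Quiz 1", "learning")
record(student, "constant velocity graphs", "Quiz 2", "proficient")
record(student, "balanced forces", "Quiz 2", "learning")

print(needs_practice(student))  # ['balanced forces']
```

A structure like this also makes the changed grade conversation visible: the question is never "what's my average?" but "which objectives still need a proficient rating?"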
6. Foster a growth mindset in your students. Complex tasks can't be mastered in a short period of time. However, with focused, appropriate practice, a student can grow in their mastery of complex ideas and skills. 
  • Too often, students see assessment as a battle between teacher and student. They see themselves as perfect, starting at 100% until the teacher acts as gatekeeper and "takes off" points. In SBG, it is clear that the teacher has high expectations for the students and will give students the instruction, resources, practice, and opportunities to keep practicing until mastery is achieved.
  • In standards-based grading, the students start at zero and build up their proficiencies on the objectives cumulatively over the course of the year. The teacher helps the students to acquire the skills and knowledge to demonstrate mastery, however long it takes.
7. Provide opportunities to learn from mistakes, practice, and reassess. The important thing is that students learn the content and skills of the course, not whether they learn the content and skills at the same pace as their classmates. 
  • Ask students to show the written practice they have done to prepare for an extra assessment before allowing them to reassess. I require proof of practice before students reassess, and that will look different for different students.
  • Build up a library of assessments so that you have many extra assessments.
  • Feel free to reuse the same assessment when looking for mastery of core skills.
8. Gather feedback from students. Students are pretty perceptive. They find that this system makes the learning objectives clear, they are less stressed by assessments, they feel empowered to determine their level of content mastery (and their resultant grade) as a result of their practice and effort, they realize the importance of not procrastinating -- and they also have great suggestions about how to make the system work better.

There are, of course, a lot more details to be worked through to apply these ideas in the classroom. All the more reason to join me this summer for a 3-day workshop in which we can work together in order to help you develop a standards-based grading system that fits you and your teaching. I hope to see you there!

True Story

posted Mar 22, 2016, 9:33 PM by Mark Schober

This year's Chapel theme at Trinity School has centered on truth and truthfulness. Members of various departments explored aspects of truth in their disciplines and personal lives. I joined forces with my colleague Tammy Gwara to talk about truth in science. A link to Tammy's talk is below.

We introduced the chapel with two readings:

[excerpted from] The Value of Science
by Physics Nobel Prize Laureate Richard Feynman

I stand at the seashore, alone, and start to think.

Deep in the sea
all molecules repeat
the patterns of one another
till complex new ones are formed.
They make others like themselves
and a new dance starts.

Growing in size and complexity
living things
masses of atoms
DNA, protein
dancing a pattern ever more intricate.
Out of the cradle
onto dry land
here it is
atoms with consciousness;
matter with curiosity.

Stands at the sea
Wonders at wondering: I
a universe of atoms
an atom in the universe.


“What we do see depends mainly on what we look for. ... In the same field the farmer will notice the crop, the geologists the fossils, botanists the flowers, artists the colouring, sportsmen the cover for the game. Though we may all look at the same things, it does not all follow that we should see them.” 
― John Lubbock, The Beauties of Nature and the Wonders of the World We Live in


Many of the year’s top movies were "based on actual events.” The “based on” disclaimer is important, because even with actual events, can a story ever completely capture the truth about what happened? 

Waves crash on the shore, salt scents the air, and the sun warms our face. Emotion and curiosity are stoked, and we dig deeper. Scientific storytelling is a never-finished draft of a great novel, overrun by editors who obsessively revise and rewrite — trying to get the story just right — striving to be true to nature, working towards a "true story.” Historians work in the same way towards true stories, but with human actions at the heart of their subject matter. Inevitably, the stories from all disciplines cross, moving us closer to a true story. 
It’s hard work to write scientific stories — but it’s deeply satisfying. Richard Feynman, the "atoms with consciousness; matter with curiosity" guy, said “Physics is like sex: sure, it may give some practical results, but that's not why we do it.” For Feynman, the profound spiritual experience of the story of science should be exclaimed through poetry, art, and music, not left to textbooks and lectures. When Albert Einstein found that his theory of gravity exactly predicted the anomalies of Mercury’s orbit, he was overcome with such a deep joy that he couldn’t work for days. True story. 

So I have a story for you — a mix of science and history — about why the sun shines.  

In the late 1800s, scientists began recognizing that the earth is really, really old, and that was a problem. Keeping the sun hot enough to shine for that long was impossible by the known processes of chemical burning or gravitational contraction. A string of discoveries from all areas of science would eventually come to the rescue. In 1905 Einstein discovered that mass could be converted to energy. In 1911 Rutherford discovered that the mass of the atom is highly concentrated in a tiny nucleus. In 1920, Arthur Eddington proposed that atomic nuclei might fuse together and release energy, but also found that the temperatures inside the sun were too low to fuse positively charged protons to one another. In 1927, quantum tunneling was discovered, meaning that the impossible binding of protons could actually happen — like walking into a wall billions of times, and then suddenly finding yourself on the other side of the wall. Cool! Unfortunately, these diprotons immediately fall apart. The fusion doesn’t stick and the book of science needs further revision. In 1932 James Chadwick discovered neutrons and Eugene Wigner proposed the nuclear strong force — the glue that holds protons together. In 1938 Hans Bethe attended a conference on the dynamics of the sun — which was not his area of expertise — but working in collaboration with Charles Critchfield, they solved the problem of proton-proton fusion before the conference was over. Their results were crazy. On average, once in every billion billion billion proton-proton collisions, the protons stick just as one proton decays into a neutron. As improbable as that is, every second, 8 billion kilograms of mass is converted to solar energy according to E = mc^2 and the sun shines. True story. 
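The E = mc^2 bookkeeping in that last step is easy to check. This quick sketch simply plugs the 8-billion-kilogram figure quoted above into the formula with a rounded value of c; it is an illustration of the arithmetic, not an independent measurement.

```python
# Energy released per second via E = m c^2, using the figure
# quoted in the text (8 billion kg of mass converted each second).
c = 3.0e8            # speed of light in m/s (rounded)
m_per_second = 8e9   # kg of mass converted per second (figure from the text)

E_per_second = m_per_second * c**2  # joules released each second
print(f"{E_per_second:.1e} J each second")  # 7.2e+26 J each second
```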

Six weeks later, Hans Bethe worked out a different fusion cycle on his own that explained the formation of heavier elements in stars. That night, during a late-night stroll, his fiancée, Rose, remarked on how beautiful the stars looked. He responded: “Yes, darling, and I'm the only one on Earth who knows how they do it.” Perhaps the hottest date in the history of science. 

Bethe submitted his paper for publication and then, in need of money, withdrew it. He entered and won a contest by the New York Academy of Sciences and used the prize money to secure his mother’s belongings. His mother had recently fled Germany in fear after Kristallnacht. For adding a page to the story of science, Hans Bethe won the Nobel Prize in 1967. He said that he found the answer by "looking through the periodic table [of elements] step by step. So you see, this was a discovery by persistence, not by brains.”

True Story.

Wait — Really? — Think about it. This story is of a place where we have never been and will never go, involving particles only indirectly detected or not detected at all in Bethe’s time, occurring at time and size scales that dwarf us and our short lives. Yet this story allows predictions about supermassive stars and tiny dwarf stars, exploding stars, and dying stars, that match our observations to a ridiculous degree of precision. The predictions match the data — true story. But is our story about why stars shine true? Will future editors need to clean up some details or account for new observations? Look up what’s called the solar neutrino problem and you’ll see that the story has been edited very recently — and that’s why science is reluctant to deem our models “true”.

Albert Einstein is quoted, “One thing I have learned in a long life: that all our science, measured against reality, is primitive and childlike -- and yet it is the most precious thing we have.” 

We could have cut the story short:  Why does the sun shine? Because it’s hot — because of fusion. But that isn’t a story at all: It’s like only reading the title and the last page of a novel. Any belief in the truthfulness of such a story becomes a matter of faith. Scientific belief is rooted in unbroken chains of reasoning carefully edited and revised by the intersecting work of thousands of scientists. And beliefs change with the data. It’s the tool we use to separate sense from nonsense, astronomy from astrology, archaeology from ancient aliens. 

Science also has fictional stories where theory has raced far ahead of observations. We even hold onto some false stories, such as imagining electrons making neat loops around the nucleus of an atom, because, just as Coach Schmidt's fantastic, fabricated story of his past demonstrated, false stories can point us toward great truths.

There are unfinished pages in the book of science as well — about ten years ago, I met astronomer Vera Rubin. After graduating from Vassar in the late ’40s she tried to enroll at Princeton for graduate school but never received their course catalog — Princeton wouldn’t admit women to their graduate astronomy program until 1975. Instead Ms. Rubin went to Cornell and worked with both Richard Feynman (who wrote the opening poem as well as the rules of quantum electrodynamics) and Hans Bethe (who figured out how the stars shine). True story. She earned her PhD, raised four children who all earned science or math PhD’s, and measured the rotation of galaxies that, surprisingly, do not rotate according to Newton’s Laws. Dr. Rubin tried everything to salvage a Newtonian explanation, and to her dismay, she couldn’t. She was forced to the conclusion that there is a lot of unseen matter, unlike the atoms we are used to, that exerts gravitational influences on galaxies. Her data became the basis for speculation on dark matter. Based on gravitational behaviors, we know that there is five times more dark matter than regular matter, and even crazier, all of this matter isn’t collapsing together, it’s exploding away from itself — at an increasing rate. In sum, the explosive energy and the dark matter make up 95% of our universe — and we don’t know what it is. True story.
What we do with the science story matters.  Lead is toxic in our water supply. Burning fossil fuels increases gases that hold heat to our planet. Buzzing bugs are an important part of the food chain. Exposure to ultraviolet radiation over spring break increases the incidence of skin cancer. Evolution happens. The science story is as close as we can get to “True Story”, though any good scientist with a sense of history knows that we have so often thought we had it right, only to find that we had missed some detail that requires a line edit, new paragraphs, or even whole new chapters to be written.  

Let me close by returning to Richard Feynman:
"The same thrill, the same awe and mystery, comes again and again when we look at any question deeply enough. With more knowledge comes a deeper, more wonderful mystery, luring one to penetrate deeper still. Never concerned that the answer may prove disappointing, with pleasure and confidence we turn over each new stone to find unimagined strangeness leading on to more wonderful questions and mysteries—certainly a grand adventure!”

True story.

Tammy Gwara's thoughts about science and truth completed the chapel. It's a must-read:

STEMteachersNYC 2016 Summer Workshops

posted Feb 14, 2016, 9:56 PM by Mark Schober   [ updated Jul 10, 2018, 11:24 AM ]
Here are some of the neat features of the nine workshops offered by STEMteachersNYC this summer that you won't find in the official workshop descriptions (the official stuff is here). And it's all in New York City -- a pretty cool place to hang out for a few weeks while engaging in great professional development. Even better, bring a friend. Click on the image of the flyer and share the pdf with your colleagues. Our summer lineup is loaded with great opportunities, and we would love to have you join us!

Physics/ Mechanics Modeling Workshop
July 18 – August 5, 2016
Led by Paul Bianchi & Zhanna Glazenburg
I had a great experience in graduate school at Miami University. Working with Dr. James Poth, I was trained to co-teach draft versions of Physics by Inquiry and Tutorials in Introductory Physics, both developed by the Physics Education Group at the University of Washington. In my lunchtime training sessions, Dr. Poth and I discussed the curricular flow and instructional strategies line-by-line. I learned a lot about the design of physics-education-informed curriculum materials, Socratic questioning, strategies for redirecting incorrect understandings, and the world of pedagogical content knowledge. But when I got my first teaching job, I found that I didn't have curriculum materials that fit my students and could be used with the skills I acquired in grad school. My teaching struggled for a couple of years until I participated in the Modeling Physics Workshop. I very quickly saw how the Modeling teaching framework provided me with the additional tools I needed to implement the other skills I had learned along the way. I was hooked. 

So this is why the workshop is three weeks long -- yes, participants are introduced to a sample curriculum, but, more importantly, it's designed to help teachers learn many of the things I learned in grad school, providing a framework for understanding why things are as they are in an effective curriculum and how to adapt it to your own students' needs. The workshops are designed to help you understand an approach to teaching that's the metaphorical equivalent of teaching you how to fish rather than handing you a fish.  Paul and Zhanna are experts in their own classroom practice as well as experienced workshop leaders. They bring a wonderful mix of expertise that allows them to step to the foreground when needed or to pull back and let participants develop their teaching skills through practicing various elements of Modeling Instruction. And, if you're open to it, I predict that this workshop will transform your teaching like it did mine.

Chemistry I Modeling Workshop
July 18 – August 5, 2016
Led by Donghong Sun & Rachel Ward
When I took chemistry for the first time, the first unit was a completely out-of-context barrage of scientific method, significant figures, and dimensional analysis. The second unit was the quantum model of the atom, and the rest of the year was one random thing after another to the point that the chapter numbering didn't even matter. I'm thankful for my interesting and engaging professor, but this pretty standard chemistry course was filled with endless rules and exceptions to the rules. Therefore . . . I majored in physics, where I never felt that I had to memorize things, but could figure things out from fundamental principles and scientific thinking. When I took the Chemistry Modeling workshop I was so excited to see chemistry instruction built upon observation, fundamental principles, and scientific inquiry instead of the game of Memory. Starting from what we can observe about matter, we make inferences about matter at size scales smaller than we can see. The inferences constitute testable particle models of matter that suggest followup observations and subsequent revisions to the model. I found I could again use my intuition for patterns, electrical forces, and energy to predict what should and often does happen at the atomic level. I sure felt a lot better about teaching chemistry the following school year, and a few years later, my workshop leader, Tammy Gwara, accepted a teaching position at my school and became an integral member of STEMteachersNYC. Donghong has been leading the chemistry modeling workshop since we began offering it in NYC, and Rachel has recently joined the ranks of modeling workshop leaders by apprentice leading last year's chemistry workshop and participating in the leadership workshop. Donghong and Rachel will help you to see chemistry in a new way that you'll love.

Middle School Science Modeling Workshop
July 11 – July 29, 2016
Led by Erin Conrardy & Kathryn Bauer
I taught middle school astronomy and meteorology for a dozen years and found that there are lots of wonderful resources out there (mixed in with some truly awful stuff) but almost all of it lacked larger-scale coherence. Lessons were self-contained and did not build upon one another, representations tended to be largely word-based, and the generally low-level questions were unengaging. I had to work very hard in my own class to develop materials and approaches that engaged the kids' minds and curiosity to produce deep learning. (Check out my meteorology materials if you're interested.) The middle school modeling workshop addresses these issues across a variety of content areas through an infusion of modeling principles: use of the modeling cycle, emphasis on multiple representations, facilitation of student discourse, and structuring course content around key models in science. Through the workshop, we want to move you from being an end-user of curriculum to someone who feels empowered to modify materials in a pedagogically sound manner. Erin and Kathryn have been involved from the start of AMTA's work with middle school science and have seen it grow and develop. They are eager to work with you: helping you think about why you do what you do in your classes and how to make your teaching even better. 

Chemistry II Modeling Workshop
July 5 – July 15, 2016
Led by Larry Dukerich & Donghong Sun
There's only so much that can be done in a three-week workshop, so once you've taken the Chemistry I workshop, you'll want to continue on to Chemistry II to see the ongoing model development through modern models of the atom, periodicity and bonding, a particle view of heating and temperature, intermolecular forces in biological contexts, chemical equilibrium, and acids and bases. With the particulate and energy representations of matter developed in Chemistry I, students (and teachers) have much more robust tools to reason through and predict what should happen in these sophisticated chemistry topics. If all of the good chemistry isn't enough reason to come, working with Larry and Donghong for a couple of weeks certainly is. In addition to being our STEMteachersNYC chemistry advocate and chair-elect, Donghong is perhaps the most cheerful person on the planet. Larry has been with Modeling Instruction from the very beginning, and the style and tone of the physics and chemistry materials are due to his writing and editing. Come to learn at the feet of the master. You'll also pick up on some of his "Dukerichisms," such as his criticism of the "factino" model of information transmission. Just ask him about it. . . when you don't have anything in your mouth that might fly out as you start laughing.

Introduction to Modeling
July 11 – July 15, 2016
Led by Mark Schober & Craig Buszka
A three-week investment in a Modeling workshop is a remarkable commitment that thousands of science teachers have made. Of course, it is logistically difficult for many teachers. Therefore, we're offering a one-week introduction to the principles of Modeling Instruction. Craig and I will introduce you to several aspects of Modeling Instruction such as facilitating student discourse, the modeling cycle, multiple representations in problem solving, and model-based curriculum design. We will explore these ideas by examining the role of energy, electricity, and light in biological and physical science contexts at middle school and high school levels to provide relevant contexts for all participants.  You'll be able to apply these teaching tools right away, and you'll also have a good sense of what you could gain through one of the three-week workshops. It's like a movie trailer: enough to get the gist of what's going on, but leaving you wanting more.

Graphical Problem Solving in Physics
July 11 – July 15, 2016
Led by Kelly O’Shea
Most physics teachers know that graphical problem solving is possible in kinematics and dynamics, but seldom do we teach it as the centerpiece of our problem-solving techniques. After you’ve spent a week with Kelly, you will. Graphical problem solving gets kids away from “searching for the right equation” and provides them with much more robust tools to solve sophisticated problems. Heck, you could try an experiment. Ask your students to solve this problem or try it yourself: 
A subway train moving at 18 m/s slows uniformly for 8 seconds, and then slows uniformly at a different rate for 12 seconds until coming to rest 210 meters from where braking began. Find the two acceleration rates of the train. 
I’ll bet that students who start by listing knowns and unknowns before searching for an equation will have a much harder time than those that tackle this by drawing a velocity-time graph and solving it graphically. Then see how well your students tackle angled force problems when they forgo the component vectors and use scale vector addition diagrams. I had no idea how much better these approaches would be for my students until I followed Kelly's lead and gave it a try. This workshop is a don’t miss!

Curriculum Development Camp
July 18 – July 22, 2016
Led by Mark Schober
Any time you go to a conference or workshop, it takes at least as much time as you spent in the PD trying to extract the useful ideas and prepare them in a format tuned to you and your students. The opportunity to have a full week to work on anything you want in a low-distraction environment with the opportunity to bounce ideas off of your peers is a great way of getting school-year ready. I call it a camp because it doesn't have a fixed agenda.  Some projects might involve writing or adapting curriculum for your students, testing labs/projects/curriculum, creating videos for "flipped" instruction, developing learning objectives or assessments for standards-based-grading, curating supplemental resources for your class, practicing your programming skills to prepare your students to learn computational modeling, or whatever else is on your "to do" list. I am by no means an expert on all of these things, but I think that I am a good sounding board who can help you to produce your best work. Every day I work with each participant to hear their thought process, see what they have been doing, and then challenge, extend, question, (and sometimes help!) participants in their work. It makes for a great week in which you can leave with something useful for your own classroom to show for your efforts.

Modeling Leadership Session I Workshop
July 18 – July 22, 2016
Craig Buszka & Ray Howanski
Modeling Leadership Session II Workshop
July 25 – July 29, 2016
Mark Schober & Colleen Megowan-Romanowicz
As the Modeling Instruction movement grows, there is an ever-growing demand for workshop leaders, so the American Modeling Teachers Association made leadership training a priority. Existing workshop leaders contributed their ideas about what should happen during a leadership workshop, and I got the opportunity to work with Art Woodruff to lead the first leadership workshop in 2014. Last year, Craig Buszka and I led two leadership workshops and, in collaboration with our participants, developed an even richer experience with student mode, teacher mode, and leader mode roles. In short, a good leadership workshop offers participants the opportunity to lead and then reflect on their leadership.  Participation in the leadership workshop is by AMTA invitation only. Here is what we are looking for in our leadership candidates: Teachers who have taken two three-week Modeling Workshops; teachers who implement Modeling Instruction thoughtfully and successfully in their classroom practice; teachers who are interested in leading Modeling Workshops and are willing to commit to multiple summers of workshop leadership; and teachers who have been recommended by their workshop leaders.  If you're interested in workshop leadership, work towards meeting AMTA's criteria, ask your workshop leader to refer you, and reach out to AMTA to express your interest.

Pick your favorite workshops, and sign up now
I hope to see you this summer!

And, if you can't make it to NYC, workshops like these are being offered all across the country. Check the AMTA website for the complete summer 2016 workshop list.

A Simple-ish System for Standards-Based Grading

posted Feb 5, 2016, 12:26 PM by Mark Schober   [ updated Jan 10, 2018, 3:34 PM ]

The best advice for implementing SBG is to keep it simple. A complex system can do more harm than good if it mires you in paperwork and is too hard for students to understand. After several years of revisions, my system now works smoothly for me and my students and provides the benefits that attracted me to SBG in the first place. So here is an explanation of my standards-based grading logistics.

The screenshot below is part of my objectives and grading sheet for my 9th grade physics course. (Clicking the image will open up the full pdf of the objectives and grading sheet.) I've packed all of the key elements for the grading system onto one double-sided sheet of paper. Each student has this sheet in their binder so that they can track their progress, and I keep a sheet like this for each student as my gradebook. So that's one of the simplifications: a paper gradebook. I found that keeping track of all of the individual pieces of data on the computer wasn't worth the effort (I had used ActiveGrade). Before the end of each marking period, I make a copy of my gradesheets and hand them out to the students so that we can rectify any discrepancies.

The learning objectives (standards) I use are listed down the left hand side of the page. I started from objectives that others had written and I keep editing them to better fit my course. I try to make each objective sophisticated, clear, and broad so that each can apply in multiple models. For example, when we get to unbalanced forces only two new objectives are added, but objectives from balanced forces and uniform acceleration are also assessed. This helps to keep the number of objectives small and it also helps students to see the connections between units of study. The idea is to write objectives for anything that you value and want the students to value. Therefore, I have three laboratory objectives for assessing lab work (which I describe as take-home quizzes). I've grouped computational accuracy, significant figures, and units into an objective I call "Details" for those situations when students clearly understand the content objective but have slipped on one of these other problem-solving skills. Finally, I have a "Synthesis" objective that requires multiple-model problem solving. Standards-based grading can sometimes become very reductionist, and this helps to address that issue. 

Students complete quizzes about once a week that I announce in terms of the objectives assessed on the quiz. Once students finish their quiz, I give them a colored pen and an answer key to mark their own quizzes with corrections and annotations. This gives them the instant feedback they crave and it also forces them to reflect on their current state of understanding. The relevant objectives are listed at the end of the quiz, where I ask students to self-rate their work on each objective with either a "P" for proficient or an "L" for learning. I then collect the quizzes, add my comments to their work, and make my ratings for each objective. The simplicity of a binary grading system makes record-keeping easier -- it's either good enough or it isn't -- and there are no multi-level rubrics for each objective. There are good arguments for the complexity of more rating levels (see Bob Marzano's work), but with a highly motivated student body that consistently performs well on assessments, the binary system has worked well for us.

The weekly-ish whole-class quizzes are open-ended, a bit hard, and push the students.  Most quizzes look like this: thoroughly represent what is going on in a given problem situation and solve for everything you can to convince me that you understand the concept. Each of the whole-class quizzes is numbered starting from 1, and for multiple class sections I number alternate versions of the quiz 1a, 1b, 1c and so on. The goal is for students to become proficient with every objective, and some students need more practice than others before they can successfully demonstrate their understanding. Therefore, students are welcome to take extra quizzes, after demonstrating their practice, as often as needed. Students can always fill in missing proficiencies from earlier marking periods as well. I've developed an arsenal of extra quizzes that are grouped according to clusters of related objectives. Quizzes are named with a letter for the cluster followed by the quiz number. For example, the cluster of objectives related to quantitative problem solving with unbalanced forces is G, so these extra quizzes are named G1, G2, and so on. The short name for each quiz makes record-keeping easier.

The planner below helps students keep track of which clusters of objectives they need to practice further. 

To facilitate student practice, I've been writing a fleet of extra practice/quiz tickets. Practicing the concepts is the requirement to take an extra quiz. The header of the extra practice/quiz tickets contains the objectives, a place to indicate when the student is available to take the extra quiz, and an explanation of what happens if the practice isn't completed well enough.

Before I had developed the extra practice/quiz tickets, I just asked students to sign up through a Google form. (Extra Quiz Request Form) Students select which cluster of objectives they want to assess, choose when they want to take the assessment, tell me how they practiced, and reflect on what skills they have improved. The quiz request could be done on paper instead, but I've programmed a Google apps script (with help from John Burke) that takes information from the form submission and sends an email confirmation to the student and to me, and also creates a calendar item including the student name and quiz cluster. This makes it easier for me to print out the set of extra quizzes each morning as I'm preparing for the day. It also made the bar for taking an extra quiz so low that the students ended up using extra quizzes as practice, and that's why I implemented the extra practice/quiz ticket system. All the quizzes I give have the date auto-inserted into the header, so when I print quizzes out, the date is already there.

Every bit of assessed student work goes through a double-sided page scanner (Fujitsu's ScanSnap) and is sent to Evernote. I keep an Evernote folder for each student, and sort the scanned files into their folders. The result is a portfolio of each student's work. The students should also have a portfolio of their work in their binder -- as long as they keep things organized. When the students get their quizzes back, for each proficiency they earn on an objective, they record the quiz number in a blue box next to that objective. 

The number of blue boxes indicates how many times I want to see a proficient score on each objective. For example, I want to see multiple proficient scores on fundamental ideas and skills such as Newton's third law and using graphical representations to solve accelerated motion problems. Late in the year, when there is less time for reassessment, a single proficiency is sufficient. Even though I want to see many proficiencies on the details objective, the large number is mainly to keep students focused, as these proficiencies are not hard to earn. Proficiencies on the synthesis objective are what distinguish the students who know all of the basic concepts in the course from those who can use those concepts to solve novel problems. The number of blue boxes, or expected proficiencies, is chosen so that the number of earned proficiencies divided by the number of expected proficiencies forms a percentage that can be converted into a grade. The transparency in how proficiencies translate into a final grade is very comforting to the students. Every quarter grade is a progress report that culminates in the year-end grade -- the only grade our school displays on a student's transcript.
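The arithmetic behind this is simple enough to sketch. The objective names and box counts below are invented for illustration; the one modeling choice worth noting is that proficiencies beyond an objective's blue-box count shouldn't add extra credit, so earned counts are capped at the expected counts.

```javascript
// Each objective carries the number of blue boxes (expected proficiencies)
// and the number of proficiencies the student has earned so far.
function computeGrade(objectives) {
  let earned = 0;
  let expected = 0;
  for (const obj of objectives) {
    expected += obj.expected;                       // total blue boxes
    earned += Math.min(obj.earned, obj.expected);   // cap: extras don't count
  }
  return Math.round((100 * earned) / expected);     // percentage grade
}

// Hypothetical student partway through a marking period:
const grade = computeGrade([
  { name: "Newton's third law", expected: 3, earned: 3 },
  { name: "Graphical accelerated motion", expected: 2, earned: 1 },
  { name: "Synthesis", expected: 1, earned: 2 }, // extra proficiency is capped
]);
```

With these invented numbers the student has 5 of 6 expected proficiencies, so the sketch reports 83.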

With all of their work scanned, I don't have to immediately record each student's work on my copy of their objectives and grading sheet. When I go through the Evernote folder of their assessments, it's easy to record the quiz numbers in the blue boxes where the kids have earned proficiencies, and it's easy to count up the number of proficiencies earned in order to calculate the grade. Every student ends up taking a different set of quizzes depending on the extra quizzes they take, so I keep a running list of the quizzes taken on the front of the objectives/grade sheet. This also helps me avoid giving a student the same extra quiz twice.

No grading system is perfect, and this one isn't either, but students see how their work translates into their grades, building up from zero rather than down from 100. Taking risks is encouraged: there's no penalty for wrong answers, and even if a synthesis problem isn't answered perfectly, students demonstrate understanding of many other objectives along the way. Students aren't stressed out by assessments and really do see them as opportunities to show what they know. It helps them clearly see that I'm on their side as they grow in their skills.

Thanks to the many people from whom I've learned and gained ideas and advice! A few in particular: Kelly O'Shea, Seth Gunials-Kupperman, Sammie Smith, and Frank Noschese.
