Using Data to Inform Learning

The concept of "using data to inform instruction" is a goal that teachers aim for when they gather data on student performance. What is also useful is to put the ball on the students' side of the court by allowing students 24/7 access to their own data based on their performance. With the "immediate, nonjudgmental feedback" that online learning offers, students can use their data to inform their learning. They can grasp on a personal level that data analysis is not just something that teachers do. Here's an example of how a student used the availability of taking a practice vocabulary quiz online before the real quiz was taken in class. This example will reveal how she used that feedback to inform her learning.

One of my 8th grade students, Olivia, used a website called Quia to exercise her word attack skills as she practiced for a vocabulary quiz. She practiced the 33 words I had selected from the novel Johnny Tremain from Monday, December 19, through Friday, December 23. Her last encounter with the words was on the morning of the 23rd, the day of the quiz. (The activity was in a multiple choice format, with the answer revealed each time students submitted a choice. You can see or try out a copy of the practice quiz here. As her teacher, I have access to her practice and actual quiz scores on the "Grading Workbench.") Olivia did very well on the quiz itself, earning 29/33 points, just one point shy of her best practice quiz score.

But the real point is how Olivia's efforts improved what she learned and retained over the course of the week. She encountered each word a total of 9 times during the week, using what we call "spaced repetition" (2x on Mon., 2x on Tues., 4x on Thurs., and 1x on Fri. morning). Spaced repetition is the antithesis of cramming and is much more likely to lead to long-term learning. Olivia received "immediate, nonjudgmental feedback" through Quia's automatic grading of the multiple choice questions I had put together, and her score steadily climbed. Each time she logged back on, her time-stamped tally of results awaited her in Quia's "Student Zone."

Sure, paper flashcards can do the same, and they are great for students who need a lot of kinesthetic activity while studying or who don't have much access to the internet. But feedback like the above is vivid and memorable. It is particularly helpful for students on IEPs or students who are ELLs, though homework points can be awarded to all students who complete several encounters with the words before a quiz. With this incentive, students are willing to work through the words several times ahead of the quiz and can see the benefits of spaced repetition.

Teachers who use programs like Quia not only have access to results for particular students; they can also see data in the aggregate and discuss challenging words in detail with the whole class (results and data informing instruction). Part of Quia's Grading Workbench for this same quiz is displayed below:

To sort the questions by percentage correct, I click on the "Average Score" column. I can then mouse over the green dots on the left (a window pops up with the question asked) and see, without leaving the page, that the most challenging words in the aggregate were these: carriage -- as in one's posture (Question 30), seditious (Question 9), condescended (Question 15), implicit (Question 17), instigated (Question 18), inundated (Question 17) and enmity (Question 3). This view of the data is available online as soon as students practice for the quiz. It is in effect a "formative assessment" -- the next day I can show students this view (without revealing individual scores) and we can review the most challenging words. This is a targeted intervention. We can talk about word roots, prefixes, or suffixes in these words, and we can connect them to related words students might already know. For example, "implicit" (literally, "folded in") can be compared to "explicit" and the word root "plic," meaning "to fold," as in explicate, implicate, duplicate, multiplication, etc. So instead of trying to discuss all 33 words and losing their attention, I can focus on the difficult words their practice results reveal.

The actual quiz, later on, is a summative assessment. Using an online learning approach lets me assign more words (33 on this quiz) without overdoing it, which prepares students for the intensive vocabulary study that accompanies SAT prep in high school. The ease of tapping into and using data like this opens teachers up to the "data culture" mindset; for me, having this data readily available has opened my mind to the spirit of working with data. Because objective quizzes and tests are graded immediately online, analyzing the data is not an additional layer of work that demands burning more midnight oil. Online learning is a game changer.

But back to Olivia: Go, Olivia! -- and students like her -- who are using data to inform their own learning. Sure, teachers can reflect on and use student data to inform classwide instruction, but students are closest to their own data, in real time, and are best positioned to let it inform their own learning.

Correlating Practice Testing to Actual Testing in the Aggregate

This spring we completed a unit on Clauses (Adverb, Adjective and Noun Clauses), which culminated in two assessments. Clause patterns are not a breeze to teach, since they build on an understanding of other elements of grammar: distinguishing between a phrase and a clause, and knowing the parts of the sentence, such as the subject, direct object, and indirect object. Nonetheless, we proceeded with paper activities in class and online activities for homework.

The first extended assessment was a Practice Test (formative, online, with instant feedback), and the second was the Test itself (an online summative assessment). The average score on the test for this class was a C+. With the tests scored and the data available, I wanted to see the correlation between how often students attempted the Practice Test (each time they open it online is an "attempt"), their best score on the Practice Test, and their score on the Clause Test itself. I made the Practice Test available for five days ahead of the Test. Many students practiced a lot. One student attempted the Practice Test 19 times, probably proving that the law of diminishing returns also applies to grammar, since a number of students who did well practiced fewer times. Upon closer inspection, about 8 of those 19 attempts were abandoned partway through, but the other 11 were completed more fully. (There were other ways of reviewing, too, including activities on paper and online.)

The average number of practice attempts for the class as a whole was 4.4, but the average for those scoring in the A range on the test was 7.2 attempts. Being committed to practice sessions is strongly correlated with doing well on the actual test: engagement leads to learning. Conversely, students who did not practice much earned scores that reflected that lack of commitment: diminished engagement yields diminished learning. All these students received the same instruction in class and the same handouts, so the teaching inputs were pretty similar. What differed was the amount of time and effort they spent practicing, learning and reflecting on their own. The chart below says a lot.
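For teachers who want to crunch similar numbers themselves, here is a minimal sketch in Python. It assumes you have exported each student's practice-attempt count and test score to a CSV file; the file name and column names here are hypothetical stand-ins (your gradebook export will likely differ). It computes the class-average number of attempts, the average among A-range scorers, and a simple correlation between attempts and test scores.

```python
# Sketch: correlating practice attempts with test scores.
# Assumes a hypothetical CSV export "clause_test_results.csv" with columns:
#   student, attempts, test_score   (test_score as a percentage)
import csv
from statistics import mean, pstdev

with open("clause_test_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

attempts = [int(r["attempts"]) for r in rows]
scores = [float(r["test_score"]) for r in rows]

# Class-wide average attempts vs. average attempts among A-range scorers (90%+)
print("Average attempts (all students):", round(mean(attempts), 1))
a_range = [int(r["attempts"]) for r in rows if float(r["test_score"]) >= 90]
if a_range:
    print("Average attempts (A-range students):", round(mean(a_range), 1))

# Pearson correlation between practice attempts and test score
def pearson(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = mean((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (pstdev(xs) * pstdev(ys))

print("Correlation (attempts vs. test score):", round(pearson(attempts, scores), 2))
```

With a class-sized spreadsheet the same comparison is easy to do by hand or in a spreadsheet program; the point is that once quizzes are graded online, the numbers are already sitting there waiting to be crunched.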

I provided a Retest on Clause Patterns three weeks later. (We took the test itself on April 13, had a week off, then a field trip to Washington, D.C., so the retest came at the end of the following week, on May 4.) I told them that I wanted everyone to take it again, and I recommended but did not require that they practice for the retest. Students knew it was a no-risk situation -- they would keep the better of the two grades. So not all were motivated to improve their scores, but some clearly were. Here are the results of the students who improved on the retest. Those who did not improve are not listed -- perhaps they were too busy with a research paper, were content with their first score, or were not sufficiently motivated. Students would have 100% if they scored 66 points; the 67th point was extra credit. I crunched these numbers to determine what effect practicing had on improving their test scores.

What conclusions can we draw here? What I see is this: the students who improved marginally (by a point or so) mostly knew the material well already and didn't have much room to improve on this test. By the way, every student who did worse on the retest (none are represented on this spreadsheet) did not practice at all. Many students who improved only somewhat (by 0 to 6 points) did not seem to practice much either -- or perhaps they practiced in ways I couldn't track, such as on paper or by doing independent research online. But the seven students toward the bottom who practiced the most also improved the most -- by 7, 10, 11, 12 and 15 points. It's a pretty strong correlation.

Most significantly, consider the results of two of the three students who practiced 3 times for the retest. One went from 52 to 59 points, or 79% to 89% -- she had practiced several different activities that were available online. The student who improved the most (+15 points) went from 40 to 55 points, or 61% to 83%. He let me know that he had practiced a lot, and I could tell he was pleased and proud of his preparation. So the students who were motivated to raise their scores improved the most, and their efforts helped: challenging themselves to practice, learning from the "immediate, nonjudgmental feedback" I had arranged, and asking questions after class. (In class, we had moved on to other topics.) Perhaps if they had used spaced repetition by spreading their practice over a number of days, they might have improved even more. So: Motivation + strategies --> progress. You never know when a student's potential is going to be revealed.
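For anyone checking the arithmetic, the percentages above come from dividing each raw score by the 66 points possible (the 67th point was extra credit). A tiny sketch using the two students' scores quoted above:

```python
# Sketch: converting raw retest scores to percentages and point gains.
# 66 points = 100%; the 67th point was extra credit.
TOTAL_POINTS = 66

def percent(points, total=TOTAL_POINTS):
    return round(100 * points / total)

# The two students discussed above (score before -> after the retest)
for before, after in [(52, 59), (40, 55)]:
    print(f"{before} -> {after} points: {percent(before)}% -> {percent(after)}% "
          f"(+{after - before} points)")
```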

-- John Chamberlain, Clarke Middle School, Lexington, MA

Steps for our work together:

1. Register on Quia for a free month's trial, or use one of the accounts I provide.

2. Click on "Create New Quiz."

3. Assemble a multiple choice quiz, using words or terms you have brought to the session. (You can use the Basic or Advanced quiz manager, though the Advanced has more options and is just as easy.)

a. Select "Add x# of multiple choice questions to your quiz. I group together 7-10 words at a time.

(You could arrange a matching quiz, but then the feedback is more complicated to analyze.)

b. Put the definitions in the Question area.

c. Put the words for that group in the Answer Choice areas of the first question only.

d. Use the "Duplicate Previous Answers" to copy all those answers for those questions.

e. Chose "randomize answers" and "randomize question orders" as settings if you wish.

4. Find and incorporate a visual, using the "Files" section.

a. Find a visual online or send it from a phone or camera to your computer.

b. Upload the .jpg or .gif to Quia's "Files" area.

c. Incorporate the visual into your quiz: "Edit," then "Insert," then "Image."

5. Set up a class for your students. (Once you know how to do one, the others are easy.)

6. Have student login information emailed to you for you to print out and distribute.

7. Take the leap and assign the quiz. Tell students to log on directly to Quia and click on the class web page appearing near the top of their Student Zone on Quia. Or have a link from TeacherWeb.

8. Recommendation: First have students practice logging on and taking the quiz using school computers.