Data Collection and Analysis

Initially led by Allan Feldman

Currently led by Kathy Shafer

Introduction

Action research is fundamentally a political, moral, and ethical endeavor. It goes beyond traditional scholarship because we want it to have a direct effect on our practice situations: for ourselves; for our students, patients, or clients; and for the communities in which we live and work. It is political work because what we learn and share through action research can influence policies that affect educational, health, and other social structures. Because we want the effects to be positive changes, that is, to make our practice situations better, we need to be clear about what we mean by better, and better for whom. Therefore, action research has a normative, teleological component -- we aren't just studying our practice; we want to improve it in a particular direction that is tied to our political and moral stance. More about this can be found under the Ethics tab of this site. Also, more about my particular take on this can be found in articles that I've written about validity in action research (Feldman, 2007) and in self-study (Feldman, 2003). I wrote these essays in response to pieces by Heikkinen et al. (2007) and Bullough and Pinnegar (2001). You may find it interesting to compare and contrast these different perspectives on what constitutes quality and validity in practitioner inquiry.

Action research as political, moral, and ethical work affects other people. Therefore, we need to know that any claims that we make about it are well grounded, just, and can lead to the results that we desire. Consequently, there is a need for much of action research to have an empirical basis -- data ought to be collected and analyzed.

Bullough, R. V., & Pinnegar, S. (2001). Guidelines for quality in autobiographical forms of self-study. Educational Researcher, 30(3), 13-22.

Feldman, A. (2003). Validity and quality in self-study. Educational Researcher, 32(3), 26-28.

Feldman, A. (2007). Validity and quality in action research. Educational Action Research, 15(1).

Heikkinen, H. L. T., et al. (2007). Action research as narrative: Five principles for validation. Educational Action Research, 15(1), 5-19.

What is data in action research?

In the introductory section I briefly argued that there are good reasons why action research ought to have an empirical basis. Usually that means that it should rely on some form of data. Here are some of the ways that my co-authors and I have described what we mean by data in the forthcoming 3rd edition of Teachers Investigate Their Work (Feldman, Altrichter, Posch, and Somekh, 2018):

Data have several important features:

  • They are material traces or representations of events and therefore are givens in a physical sense, which can be passed on, stored and made accessible to many people.

  • They are regarded as relevant by a researcher, providing evidence with respect to the issue investigated.

  • They are selective, because of the limited ability of either the researcher or the research instruments to include all pertinent information.

  • The selection of data by the researcher is theory-laden. What makes data data is that the researcher has decided in some way to designate them as such. That decision process is guided, either explicitly or implicitly, by some type of theoretical perspective.

  • Finally, data are static. They are produced from events and interpretations that occurred at particular times and places in the past. While errors in data can be corrected, the data arise from events that have already happened.

Please add what you consider to be important aspects or characteristics of data in the context of action research.

Creativity in Collecting and Analyzing Evidence

Shared by Margaret Riel

The five features that Feldman et al. (2018) describe are extremely helpful in thinking about what evidence will be useful in making sense of the actions taken to change an activity system. I also want to highlight the role of creativity in deciding which "material traces" of change to collect and how to make sense of them.

When first learning about action research, many practitioners bring notions of research drawn from controlled, experimental approaches. They think that they need to control variables, apply a treatment, and attempt to draw causal relationships between actions and outcomes. In action research, practitioners study an action within an activity system over time. While one can use many of the research tools that come from other forms of research (surveys, focus groups, questionnaires, interviews, etc.), the action researcher is also free to find creative forms of evidence that help to make sense of the changes that are made. I will give two examples.

A teacher who was the head of a high school science department wanted to build a professional learning community among the science teachers. His assumption was that if a professional community was forming, the nature of informal discussion would change, with teachers initiating more dialogue around their craft. He wanted to see whether the actions he was taking in action research cycles (making it possible for teachers to visit classrooms, sharing readings) would have an effect on informal professional dialogue. What evidence could he collect to test his assumptions? He decided to monitor the informal exchanges he had with teachers. As soon as possible after an exchange, he used his phone to record the date, time, number of people involved, the approximate length of the exchange, and the percentage of professional versus social talk. This form of record keeping was a creative way of showing a shift over time in both the frequency and the length of professional exchanges.
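Neither the study nor this site includes any analysis code, but a brief sketch may make the record keeping concrete. The following Python fragment is purely illustrative, with hypothetical field names (the department head simply used his phone); it shows one way such a log of informal exchanges could be stored and then summarized by month to reveal shifts in frequency, length, and the share of professional talk.

```python
from collections import defaultdict
from dataclasses import dataclass
from datetime import date

# Hypothetical record mirroring the fields the department head logged
# after each informal exchange: date, people involved, approximate
# length, and share of professional (vs. social) talk.
@dataclass
class Exchange:
    day: date
    people: int
    minutes: float
    pct_professional: float  # 0-100

def monthly_summary(log):
    """Group exchanges by month; report frequency, average length,
    and average share of professional talk."""
    by_month = defaultdict(list)
    for ex in log:
        by_month[(ex.day.year, ex.day.month)].append(ex)
    for year, month in sorted(by_month):
        exchanges = by_month[(year, month)]
        n = len(exchanges)
        avg_min = sum(e.minutes for e in exchanges) / n
        avg_pct = sum(e.pct_professional for e in exchanges) / n
        print(f"{year}-{month:02d}: {n} exchanges, "
              f"avg {avg_min:.1f} min, {avg_pct:.0f}% professional talk")

# Fabricated entries, included only to illustrate the record format.
log = [
    Exchange(date(2017, 9, 14), people=2, minutes=3, pct_professional=20),
    Exchange(date(2017, 11, 2), people=3, minutes=12, pct_professional=75),
]
monthly_summary(log)
```

The entries at the end are made up; the point is that even a very simple, consistently kept log can support a trend analysis across the action research cycles.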

Another creative example of data collection came from an educator offering workshops at a museum. She wanted a measure of teacher interest and engagement following changes in museum programs for educators. Asking educators whether they were more engaged can be problematic for many reasons. Instead, she noticed that in workshops that were not going well, when people took necessary bio-breaks, they often used the time to make phone calls, chat, and check email. If teachers were really engaged, she reasoned, the length of time away from the workshop would decrease. When a teacher left the workshop, she would note the time, and note it again when the teacher returned. This was a creative inversion of "time on task," which has been used by researchers in schools. In this case it was time away from task, and she was able to show a dramatic decrease in the amount of time teachers were missing from the workshop.
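Again as a purely illustrative sketch (no code accompanied the original example, and the timestamps below are fabricated), the "time away from task" measure amounts to noting paired departure and return times and totaling the minutes per session:

```python
from datetime import datetime

def minutes_away(departures):
    """Total minutes spent out of the room, given (left, returned)
    timestamp pairs noted as each teacher departed and came back."""
    return sum((ret - left).total_seconds() / 60 for left, ret in departures)

# Fabricated example sessions, one (left, returned) pair per absence.
before = [
    (datetime(2017, 3, 1, 10, 5), datetime(2017, 3, 1, 10, 22)),
    (datetime(2017, 3, 1, 11, 0), datetime(2017, 3, 1, 11, 14)),
]
after = [
    (datetime(2017, 6, 7, 10, 8), datetime(2017, 6, 7, 10, 12)),
]

print(f"Before program changes: {minutes_away(before):.0f} min away")
print(f"After program changes:  {minutes_away(after):.0f} min away")
```

Comparing such totals across sessions before and after the program changes is what let her show the dramatic decrease she reported.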

I share these examples of finding non-traditional yet valid evidence of the outcomes of action research to emphasize the role of creativity in designing data collection and analysis strategies. If the data collected are a valid index of behavioral change, and can be analyzed in a reliable way, the action researcher has a great deal of latitude in what to collect and how to analyze it.

Understanding the Difference Between Data and Evidence

Shared by Linda Purrington

In the action research strand of courses in our leadership program, we have found it helpful to share definitions and scenarios from multiple sources to help students understand the difference between data and evidence, and a process for generating evidence. Using real or realistic scenarios is particularly helpful, as they illustrate generating evidence in context.

The terms “data” and “evidence” are sometimes mistakenly used interchangeably. McNiff and Whitehead (2006) help action researchers understand the difference between data and evidence: “Evidence is not the same as data. Data refers to pieces of information you have gathered about what you and others are doing and learning…Your task is to turn some of these pieces of data into evidence” (p. 148). David Wilkinson, Editor in Chief of the Oxford Review, writes in a blog post that “Data is factual information such as numbers, percentages, and statistics. Evidence is data that is relevant and furnishes proof that supports a conclusion. Data is just data and has no meaning on its own. Evidence has to be evidence of or for something: an argument, an opinion, a viewpoint, or a hypothesis. The evidence you use depends upon your argument. As we get more evidence or different types of evidence, our argument might change”. Wilkinson further states, “When people in organizations understand this distinction between data and evidence, we tend to find evidence-based practice not only starts to become real, but also it enables people to make better decisions and judgement, and you may be surprised to find that people also become much more adaptable and flexible as well”. In his blog post, Wilkinson provides a scenario to help illustrate the difference between data and evidence: https://www.oxford-review.com/data-v-evidence/

McNiff and Whitehead (2006, pp. 148-152) describe “… evidence as more than illustration. Generating evidence is a rigorous process which involves

1. Making a claim to knowledge. You know something now that was not known before. Knowledge generated through action research is about practice and theory. Practice example: We have created better communications in our office. Theory example: We know how and why to communicate better in our office.

2. Establishing criteria and standards of judgement. As a practitioner-researcher, your job is to set your own standards of practice and judgement, and show how you are fulfilling them. In relation to practice, do you live out your values of love, compassion and purpose? In relation to judgement, do you use these values as your standards? Furthermore, do you articulate these standards and communicate them to others so that others can see how you judge your practice and negotiate judgement with you?

3. Selecting data. Having established your criteria and standards of judgement, you now need to search your data archive and find instances of values in action.

4. Generating evidence. In the same way you produce evidence to support your claim that you have improved your work, you search the archive and find artifacts that contain data. You take out of these data those specific instances that you feel show your values in action, such as a special comment, or a picture, or a field note. You use data in your research report, but now you explain how they represent both your capacity to realize your values in practice and also your capacity to articulate and communicate your specific standards of judgement.”

McNiff and Whitehead further illustrate the rigorous process of generating evidence through a fictitious example titled, The Educational Leader’s Story (p. 153).

References

McNiff, J. & Whitehead, J. (2006). All you need to know about action research. Thousand Oaks, CA: Sage Publications.

Wilkinson, D. (n.d.). What’s the difference between data and evidence? Evidence-based practice [Blog post]. Retrieved from https://www.oxford-review.com/data-v-evidence/

Early Experience with Data Collection and Analysis

Shared by Kathy Shafer

In this section of our website, Allan pointed to resources that compare and contrast “different perspectives on what constitutes quality and validity in practitioner inquiry.” He also outlined several important features of data. Margaret shifted the discussion to creative forms of documenting evidence, illustrated with concrete examples, that reach beyond traditional tools. I found these most interesting and look forward to sharing them with future students.

Feldman et al. (2018) also expand on non-traditional tools such as photography, archival data, and narrative data, and address the issue of triangulation of data (see Chapter 5).

Linda followed with a discussion of the terms data and evidence and also the process of generating evidence. Novice researchers often struggle with these constructs, so providing selected resources for teachers of action research is critical. My contribution focuses on traditional tools that are typically found in qualitative research.

We all learn best by doing. Accordingly, in the fall semester of my mathematics education research methods course, teachers complete five data collection and assessment (DC&A) assignments: a reflective journal, a video observation, a postmortem, a focus group interview, and an artifact assessment (an assignment on surveys is optional). In completing these DC&A assignments, the teachers carry out five mini action research cycles prior to writing their AR research proposals. In the fall of 2017, I asked Josh Giebel and Caitlin Bedwell to reflect back on their coursework [emphasis added].

Josh (2015-16 Cohort):

The Data Collection and Analysis assignments gave me an opportunity to experiment with the types of data collection tools that I felt would work well in my action research study. Specifically, it allowed me to get a better sense of what the data collection would look and feel like. This experience proved vital to my action research proposal because after testing out several different data collection tools, I was able to identify methods that would allow me to triangulate data and arrive at valid conclusions. I was also able to adjust my approach to using these tools as well as gain valuable experience analyzing the data that was collected. The timing of these assignments really could not have been better; this work propelled me into working on my methodology and proposal. After completing the Data Collection and Analysis assignments, I truly began to see the reflective power of action research.

Caitlin (2016-17 Cohort):

When I started working on the Data Collection & Assessment assignments, I didn't understand how useful they would be when I was implementing my action research second semester. I thought the assignments were simply aspects of research we needed to be familiar with. The truth was that when it came time for me to write my proposal and then implement my action research, these assignments were both a source of confidence and useful reference tools. The assignments that were of most value to me were the lesson postmortem, the focus group interview, the survey, and the artifact assessment. The postmortem assignment forced me to look at myself and my classroom in a new way. I was paying attention to different details and aspects of the class than I normally would. This aided me in my research because I had an experience of how I needed to pay attention to my students while implementing standards-based grading. The focus group interview (FGI) assignment pushed me outside of my comfort zone with my students. Doing this assignment before the FGI in my action research was vital. Since I had this prior experience interviewing my students, I was very comfortable talking with them, asking follow-up questions, and paying attention to their responses in the action research FGI. For me, I needed the FGI assignment before my research to get my jitters out and focus on what was important. The survey and artifact assessment also ended up being critical assignments for my research. I conducted two surveys in my action research and spent a lot of time with student artifacts as well. Doing both of these assignments prior to my pilot helped to work out kinks, shed light on assumptions and misconceptions I was prone to making, and give me an idea of what would be required of me during analysis. Since I had a brief experience analyzing survey results and student artifacts, I was aware of and prepared for the amount of time and work this area of my research was going to need when preparing my final report.