Engaging Students in Conducting Data Analysis: The Whole-Class Data Advantage

Virginia Oberholzer Vandergon, John M. Reveles, Norman Herr, Dorothy Nguyen-Graf, Mike Rivas, Matthew d’Alessio, and Brian Foley

California State University, Northridge, USA


Abstract

Computer Supported Collaborative Science (CSCS) is a teaching pedagogy that uses collaborative web-based resources to engage all learners in the collection, analysis, and interpretation of whole-class data sets, and is useful for helping secondary and college students learn to think like scientists and engineers. This chapter presents the justification for utilizing whole-class data analysis as an important aspect of the CSCS pedagogy and demonstrates how it aligns with the Next Generation Science Standards (NGSS). The chapter achieves this end in several ways. First, it reviews the rationale outlined in the NGSS science and engineering practices for adapting 21st century technologies to teach students 21st century science inquiry skills. Second, it provides a brief overview of the basis for our pedagogical perspective on engaging learners in pooled data analysis and presents five principles of CSCS instruction. Third, we offer several real-world and research-based excerpts as illustrative examples indicating the value and merit of utilizing CSCS whole-class data analysis. Fourth, we offer recommendations for improving the ways science, as well as other subject matter content areas, will need to be taught as the U.S. grapples with the rollout of the new Common Core State Standards (CCSS) and NGSS. Taken together, these components of CSCS whole-class data analysis constitute a pedagogical model for teaching that shifts the focus of science teaching from cookbook data collection to pooled data analysis, resulting in deeper understanding.

Keywords: Data Analysis, Student Engagement, Student Explanations, Technology Tools, Scientific Investigations, Computer Supported Collaboration, Science Education

Introduction

Science education in the United States is about to undergo one of the most significant shifts since it was overhauled in response to the Soviet Union's launching of Sputnik I in 1957. After Sputnik, our nation's science curricula were renovated to meet the evolving needs of a technologically threatened society. As we entered the 21st century, it was recognized that the U.S. was once again behind in the teaching and learning of the core concepts needed to build a strong foundation for life-long learning, including in science (NCES, 2013). The problems in science education in this country are all too familiar. Conditions have hardly changed since the 1989 report "Science for All Americans" (AAAS, 1990). Science classes are often still taught by underprepared teachers in a highly didactic manner that does little to promote understanding of science or the nature of scientific knowledge (McNeill & Krajcik, 2008; Newton, 2002). These issues might contribute to the fact that American students still lag far behind those of other leading countries in science achievement, which will inevitably result in a looming shortage of science and technical workers in the U.S. (Augustine, 2007; OECD, 2010). The majority of American students are still taught in large urban schools that often lack adequate science instructional resources and tend to have low expectations of students (Tal, Krajcik, & Blumenfield, 2006). The need to update 21st century teaching in the U.S. has led to the introduction of the Common Core State Standards (CCSS) and the Next Generation Science Standards (NGSS). These standards are already changing how teachers will be required to teach as well as what they will need to teach. With such mandated changes quickly approaching, increasing effort is being invested in how teachers will be expected to teach students 21st century skills. This chapter focuses on the how of the new standards implementation by bringing cloud technology to K-20 science classrooms to teach the NGSS and CCSS through collaboration and whole-class analysis of data as it is gathered in inquiry-based classrooms.

Recognizing that science is the systematic study of the structure and behavior of phenomena in the physical and natural world through observation and experimentation, it is clear that there should be an emphasis on inquiry. This should be modeled in the classroom as it would be practiced in a research laboratory setting. The National Science Education Standards were developed by the National Research Council to “promote a scientifically literate citizenry”. The Standards frequently encourage the use of inquiry in the science classroom, defining it as:

A multifaceted activity that involves making observations; posing questions; examining books and other sources of information to see what is already known; planning investigations; reviewing what is already known in light of experimental evidence; using tools to gather, analyze, and interpret data; proposing answers, explanations, and predictions; and communicating the results. Inquiry requires identification of assumptions, use of critical and logical thinking, and consideration of alternative explanations. (NRC, 1996, p. 23)

We recognize that ideally all science classrooms should provide hands-on activities that are driven by inquiry and that allow students to investigate phenomena, test ideas, make observations, analyze data, and draw conclusions. Unfortunately, students in science classrooms still spend most of their time on data collection at the expense of analysis and interpretation (Alozie & Mitchell, 2014). In other cases, when there is time for analysis, it remains topical in nature and is teacher-led rather than in-depth and student-centered (Levy, Thomas, Drago, & Rex, 2013). With the introduction of the NGSS, teachers are expected to facilitate in-depth data analysis so that students can develop the skills to analyze, interpret, and communicate inquiry findings using evidence-based reasoning (NGSS Lead States, 2013).

Science Inquiry: Emphasized in the NGSS Framework

The Framework for K-12 Science Education (NRC, 2011), the document guiding the creation of the NGSS, identifies eight practices of science and engineering that are essential for all students to be able to utilize:

1. Asking questions (for science) and defining problems (for engineering).

2. Developing and using models.

3. Planning and carrying out investigations.

4. Analyzing and interpreting data.

5. Using mathematics and computational thinking.

6. Constructing explanations (for science) and designing solutions (for engineering).

7. Engaging in argument from evidence.

8. Obtaining, evaluating, and communicating information.

(NGSS Appendix F, 2013)

Developing strong lessons that meet the needs of today's students means that teachers need to incorporate these eight practices of science and engineering into their own science teaching practice. The NGSS makes it clear that a substantial emphasis is placed on science process skills and scientific explanations (Reiser, Berland, & Kenyon, 2012).

Science Inquiry: Facilitated by Technology

At the same time as the NGSS are being introduced, technology is becoming more prevalent in classrooms. Previous research on teaching science with technology has shown that technology can be an effective support for science inquiry and learning. Projects such as WISE (Slotta & Linn, 2009) and LeTUS (Hug, Krajcik, & Marx, 2005) have demonstrated the potential of technology when used effectively. Computers have been used for simulations and scientific visualization, to provide feedback and hints, for educational games, and to connect students to scientists. In 2010, Lee et al. published a paper linking technology and inquiry in the science classroom. They described how changes to the curriculum for an inquiry classroom should involve certain knowledge-change principles, such as making science accessible to all learners, making thinking visible, fostering collaboration among students, and promoting life-long learning. They found that well-designed inquiry units improved students' understanding of science content and that students maintained their knowledge and used it in future courses. They noted that the success of the student depended on the success of the teacher, emphasizing the importance of professional development (see below). All the teachers studied by Lee et al. (2010) used technology daily or weekly to accomplish inquiry in the classroom. One of the ways that technology contributes to inquiry is by enhancing collaboration between students. In the literature, this approach is called Computer Supported Collaborative Learning (CSCL); it focuses less on introducing new activities into a classroom and more on enhancing the work students already do in class (Suthers, 2006; Scardamalia & Bereiter, 2003). In science classrooms, this involves pooled data analysis, which makes students' thinking visible so they can "see" outliers, trends, and patterns in data as real scientists do. This ability provides a powerful learning tool that affords opportunities for students to recognize their own errors when participating in data analysis activities. It is clear that technology tools can help teachers adapt to the new requirements by supporting the core practices in the NGSS. The purpose of this chapter is twofold: a) to provide specific research-based examples of technology tools and their use in science classrooms, and b) to analyze the value of supporting science teachers in the use of such technology to facilitate their students' development of the essential 21st century skills they will need to enter the evolving job market.

Background

The authors of this chapter have worked together as a team of university scientists and science education researchers for over 15 years to prepare both in-service and pre-service K-12 science teachers in content and pedagogy. Over these years, we have evolved our own teaching practices and have seen the importance of integrating technology into both our university teaching and our professional development (PD) institutes. We recognize the value of transforming the way science is taught and the need to integrate technology as a pedagogical tool to create engaging and interactive science classrooms. Building on the literature concerning Computer Supported Collaborative Learning (CSCL) (see Stahl, Koschmann, & Suthers, 2006), we provide a focus for the science classroom and call our approach Computer Supported Collaborative Science (CSCS) (Herr & Rivas, 2010; Rivas & Herr, 2010). The CSCS model employs freely available cloud-based technologies (e.g., Google Docs, Microsoft OneDrive) to promote student collaboration and whole-class data analysis.

The Five Principles of CSCS Instruction. We have identified five key principles that can help teachers implement collaborative documents in science classes (Foley & Reveles, 2014):

1. Information is shared with the class online. Teachers need to post information for the class on their website, so it is accessible to students and parents at all times. Student work is often posted online for the class, so students can give feedback to each other and share ideas. Sharing increases the accountability for students to do quality work and ensures that previously covered topics can be quickly accessed so prior work is always “alive” (Bereiter & Scardamalia, 2010).

2. Teachers check on students' understanding often. Formative assessment helps teachers know which students are struggling and when the entire class needs to slow down or speed up. Instant polling with quickwrites (live written student responses to teacher questions) and online forms provides much more effective assessment than traditional questioning because all students respond to each question and results are immediate, allowing teachers to address student needs in real time. Teachers can also use collaborative documents on mobile devices to quickly collect and assess student diagrams or visualizations (see adjacent chapter in this book; Herr & Tippens, 2013).

3. Data from experiments and simulations is pooled. Students can get much more out of the analysis of a large data set than when they are limited to their own data. Having students pool their data allows them to see trends and helps them learn to identify outliers and correct errors instantly rather than turning in flawed results (d'Alessio & Lundquist, 2013). Google Forms and spreadsheets allow teachers to collect data in the cloud; because the data is captured there, students can collaborate on its analysis at various times during and after experimentation (a minimal programmatic sketch of this kind of pooling appears after this list).

4. Collaborative data analysis is emphasized. Many science teachers focus on the hands-on part of labs and, for a variety of reasons, shortchange the data analysis. We agree with the NGSS that analysis and conclusions are indispensable to authentic science inquiry. Pooling data on an online, collaborative spreadsheet instead of a paper worksheet lets classes access the data easily and do authentic scientific analysis. Students work from the same data set and can compare results with each other and decide on the merits of any particular way of making sense of the data.

5. Students’ explanations are shared and compared. Explanations are a key part of the NGSS but rarely discussed in classrooms (Songer & Gotwals, 2012). Collaboration tools allow students to share their explanations and get feedback on their ideas and writing. Shared conclusions allow for further discussion and the consensus building that is essential for inquiry (Berland & Reiser, 2009). Tools like Google Moderator allow students to think about the quality of different explanations and come to consensus as to the best one (d’Alessio, 2014).
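To make Principle 3 concrete, the sketch below shows one way a lab group's measurements could be appended to a shared, class-wide Google Sheet from a short script using the open-source gspread library. It is a minimal illustration only: the spreadsheet name, column layout, and values are hypothetical, and the classroom examples in this chapter collected data with Google Forms rather than code.

    # A hypothetical sketch of Principle 3 (pooling data in the cloud) using the
    # gspread library for Google Sheets. The spreadsheet name, columns, and
    # measurements are illustrative, not from the chapter's classrooms.
    import gspread

    gc = gspread.service_account()                 # authenticate with a service account
    sheet = gc.open("Class Reaction Data").sheet1  # one shared, class-wide sheet

    # Each lab group appends one row: group id, variable tested, final reading (deg C).
    sheet.append_row(["Group 3", "double citric acid", 17.4])

    # Every group reads the same pooled data set for whole-class analysis.
    pooled = sheet.get_all_records()
    print(f"{len(pooled)} rows pooled so far")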

The theme through all these principles is collaboration. Collaboration in science includes the exchange of both ideas and data. Our adjacent chapter in this book describes how the exchange of ideas enabled by CSCS provides teachers with the opportunity to conduct Continuous Formative Assessment (CFA) that can be used to adjust instruction immediately; this is most clearly seen in Principle 2 (see adjacent chapter in this book; Foley & Reveles, 2014; Herr & Rivas, 2010; Herr, Rivas, Foley, Vandergon, & Simila, 2011a and 2011b). This chapter emphasizes Principles 3 and 4, though all principles are touched on in the examples given below.

The CSCS principles are not meant as a comprehensive definition of good science teaching. Rather, they articulate key instructional techniques that support collaborative science using technology tools. It is also important to us that the principles not burden teachers with additional responsibilities. Using these techniques, teachers can efficiently create a classroom environment that is evidence-rich and stresses data interpretation, evaluation, and explanation especially as it is integrated into inquiry-based classrooms.

Issues

Throughout our collaborative efforts, the authors have been working with 6-12th grade teachers in both pre-service and in-service capacities to model and demonstrate our CSCS pedagogy. While we believe strongly in the transformation of science classrooms with technology, we recognize that some issues may arise when utilizing technology in the classroom. Having a one-to-one device for each student is ideal but not always realistic. Recently, a large urban district (Los Angeles Unified School District) tried an iPad rollout with mixed results (Blume, 2013). We feel that some of the issues with the rollout might have been avoided if teachers had been trained in advance regarding ways to use technology to engage students around science activities and lessons. The infrastructure also has to be in place to allow consistent and safe access to cloud-based technologies via the internet (Shirley et al., 2011). With that said, we also feel that over the next few years these issues will be resolved in the following ways: a) the presence of the needed technology in classrooms will become increasingly ubiquitous as schools administer new standards-based assessments; b) districts will upgrade their internet connections, improving speed and reliability; c) a new generation of teacher candidates will graduate with experience using technology in their teaching, having progressively seen it modeled in their credential coursework; and d) with the rollout of the NGSS and CCSS, professional development opportunities will expand to include using technology to teach to the new standards, allowing teachers to update their skills and experience.

Another real issue in U.S. classrooms is the diversity of learners. Many students today come from homes where English is not the primary language and where parents often have little more than an 8th grade education and lack adequate resources for helping their children. In addition, K-20 American classrooms are increasingly inclusive of students with varied learning disabilities. This makes having an engaging classroom more of a challenge. We believe that the CSCS pedagogy can help relieve the burden of these realities because all students can remain engaged with class content and all learners can participate, as we outline below. English Language Learners (ELLs) and learners with disabilities are able to participate and feel like true scientists themselves.

In order to address these issues, the authors have taught CSCS lessons in their own university classrooms as well as to in-service teachers in professional development institutes. The teacher participants of the PD institutes have applied CSCS principles in elementary, middle, high school, and university classes across multiple science topics (d'Alessio & Lundquist, 2013; Herr, Rivas, Foley, Vandergon, & Simila, 2011a & 2011b). Below we focus on lessons the authors and some of our participating teachers have used in their courses.

Examples of Data Analysis: Real-World and Research-Based Illustrations

Cloud-based collaborative forms, spreadsheets, and documents are tools that can be used to implement inquiry learning in the STEM classroom, and they form the basis of our CSCS pedagogy. In this section, we discuss five examples of how students can pool their data, perform statistical analyses, and interpret trends in large-group data gathered as they participate in inquiry lessons. The examples presented here are modified from classic labs and activities previously carried out in non-technology classroom settings. They are drawn from different science disciplines and show how teachers can make data analysis visible to students, emphasizing the scientific process to engage all students in the NGSS science and engineering practices.

Example 1: Genetics, Using Pooled Data to Discover Trends & Patterns. A great way to engage students in a unit on genetics is to start by asking them some questions about how they look (their phenotypes). Traditionally, this was done either by asking students to raise their hands to questions such as "Raise your hand if you can roll your tongue like a taco" or "Raise your hand if you have attached earlobes," or with a paper-and-pencil worksheet. To adapt this lesson to the CSCS pedagogy, a Google survey/form can be created in which students report whether they have attached earlobes, whether they can roll their tongues, whether they have a widow's peak, and so on. Images of these traits can be either projected on a screen at the same time or embedded within the survey. Figure 1 shows such a survey. After students input all their data, the teacher can show/project the results page (it is suggested that the teacher hide the column with names to make the environment more anonymous) and a discussion can ensue about what the results indicate. Because the data is captured electronically, the teacher can use the same survey across multiple class periods and even across years, creating larger pooled data sets whose sample size keeps growing. Students can then look at a bar graph (see Figure 2) generated from this data to show the results pictorially; the percentages will change slightly as more data is input, but the general trend will remain the same. If this data were gathered by hand-raising or paper and pencil, the graph would have to be made either by the teacher during class time or outside of class, and therefore not shown to students until the next meeting. By utilizing a CSCS approach, however, the data is presented instantly in a way that allows the instructor to get at misconceptions about inheritance patterns by asking deeper probing questions drawn from the data.

One standard question is to ask the students to do a quickwrite (a Google spreadsheet in which each student, identified in their own row, responds to the question posed at the top of a column) about which traits they think are dominant and why, using evidence from the graph (see Figure 3). A large majority of students define dominance as the traits with the higher percentage of individuals expressing the phenotype. They mistakenly assume that the higher the percentage, the more dominant the trait. With this misconception, students in this example choose tongue rolling as a dominant trait and refer to traits like having a widow's peak as recessive (it turns out that both of these are dominant traits). The teacher can then leave it at that and go on to teach about Mendel and how his four conclusions shaped the study of genetics, one of which is the principle of dominant and recessive traits. Because the above exercise is carried out in the cloud, the teacher can then bring back the graph (even if it is a week later) and re-ask the question of which traits are dominant based on what students now know about genetics. Most students then figure out that you cannot tell which trait is dominant and which is recessive without doing crosses and looking at parental types and offspring, so that you can see which trait gets masked in a generation (see Figure 3).
With the ability to go back to the data set, students are able to see the difference between the common misconception they previously may have held and what they now understand about dominant and recessive traits. Students can take this further by going home and asking these same questions of their parents. They can use the same data to help dispel the common misconception that dominant traits are dominant because a greater percentage of individuals in the population show that trait, rather than the correct definition: dominant traits are dominant because they mask the expression of the other trait when both are inherited in the same individual. When this exercise is done on paper, a teacher is limited to the sample of the class and usually to the day the exercise is used. Though the paper exercise would still be engaging, the CSCS version allows for a more complete experience, providing opportunities for all students to participate and to see how patterns and trends can be interpreted.
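For readers who want to see the tally behind a graph like Figure 2, the sketch below computes the percentage of students reporting each phenotype from pooled form responses. The trait names and responses are hypothetical stand-ins, not the 282-person data set, and the closing comment flags the misconception the lesson targets.

    # A minimal sketch (hypothetical responses, not the Figure 2 data) of the
    # whole-class tally behind the genetics survey.
    from collections import Counter

    responses = [  # each row: (trait, answer), as a form might export it
        ("tongue rolling", "Yes"), ("tongue rolling", "Yes"), ("tongue rolling", "No"),
        ("attached earlobes", "No"), ("attached earlobes", "Yes"),
        ("widow's peak", "No"), ("widow's peak", "No"), ("widow's peak", "Yes"),
    ]

    counts = Counter(responses)
    for trait in sorted({t for t, _ in responses}):
        yes = counts[(trait, "Yes")]
        total = yes + counts[(trait, "No")]
        # Note: a high percentage does NOT make a trait dominant; surfacing that
        # misconception is the whole point of the lesson.
        print(f"{trait}: {100 * yes / total:.0f}% show the trait (n = {total})")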

Figure 1

Note. Screenshot of the Google form-survey called Genetics Phenotype Survey for measuring phenotypes in individuals.

Figure 2

Note. Bar graph of data collected from the Genetics Phenotype Survey (Fig. 1). Data collected from 282 individuals.

Figure 3

Note. Quickwrite example with first question asked as students observe the graph presented in Figure 2 and second question asked after learning content on Mendel’s conclusions in Genetics.

Example 2: Chemical Reactions, Using Pooled Data to Explore the Effect of Variables. Another example comes from a chemistry lab in which students explore chemical reactions and what happens to the reaction rate when parameters are changed. An important understanding in doing any experiment is the distinction between the control and the experimental variable. In this lesson, students work in groups of 3, following the directions outlined in Figure 4; traditionally, they would do a lab write-up with paper and pencil or take notes in their lab notebooks. Each group would perform all the experiments within their own group but would never fully understand the importance of a control because they had nothing to compare it to. Using CSCS, students follow the directions as outlined in the figure and enter their results in a Google form such as the one shown in Figure 5. Each group first performs the control reaction with the same parameters, all starting at the same time. Once the data is entered, theoretically all the data should overlay, since all groups used the same parameters. Then each group is assigned a variable to change in its next experiment. Each group runs about 3 reactions, resulting in a class's worth of data. Once all students have completed their set of reactions and clean-up has occurred, data analysis can proceed (note: since everything is in the cloud, it doesn't matter whether this occurs now or at the next class meeting). A few samples of what students' data looks like before analysis are shown in Figure 6.

DIRECTIONS:

  • Get in groups of 3.

  • Material for each group:

    • Citric Acid, Sodium Bicarbonate, Red Cabbage Juice

    • Beakers for the chemicals

    • Beaker for the reaction

    • a graduated cylinder

    • thermometer

  • Put about 4 spoonfuls of the chemicals in each beaker.

  • Measure out 200 mL of the red cabbage juice into the graduated cylinder.

  • Place the juice into the larger beaker.

  • Take the initial temperature (record your reading to 1 decimal place).

  • Record T initial at t = 0 seconds.

  • Measure out a teaspoon of each reactant.

  • Get a stop watch (computer, iPhone, Stopwatch Gadget)

  • Quickly pour the chemicals into the larger beaker and take the readings every 15 seconds.

    • person #1 - write down the temperatures every 15 seconds

    • person #2 - hold the beaker and try to mix and stir.

    • person #3 - read the reading on the thermometer

  • Stop when the temperature doesn't fluctuate over 3 consecutive readings.

  • Enter your data here.

  • Instructor will assign a parameter change to your group.

  • Let's look at the data here.

Figure 4

Note. Directions students will follow in order to establish the change in temperature in chemical reactions that have different variables.

Figure 5

Note. Screenshot of partial Google form used to collect data on the chemical investigations as outlined in Figure 4.

As students look at the data, they begin to see the importance of having control and experimental variables, how to interpret graphs with regard to chemical reactions, and how graphs change when a parameter is manipulated. Establishing a control is important because experimenters need a baseline against which to compare their variations of the reaction.

In the first part of the experiment, even though all the groups are given the same procedure and parameters, many groups do not get the same results. This is usually the case (see Figure 6), and a beneficial conversation about experimental technique (reading thermometers, initial temperatures, measurement of reactants, timing of readings, etc.) can follow. Teachers often assume that their students know how to use the tools given to them, and the students themselves often think they do, yet there may not be time to review the importance of proper scientific use. By doing the controls first and then stopping to look at the data, it becomes clear when students need to be reminded how to read a thermometer and how to take correct measurements. Students can then retry the control experiment to get a class-averaged control for everyone to use as a comparison (see Figure 7). They can then continue on with their variables and have a more accurate comparison of the reactions using different variables.
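The class-averaged control described above can be computed in a few lines; the sketch below averages hypothetical 15-second temperature readings from three groups into a single baseline curve (illustrative numbers, not the Figure 6 or 7 data).

    # A minimal sketch (hypothetical readings, not the Figure 6/7 data) of the
    # class-averaged control: average each 15-second reading across the groups
    # that ran the control reaction.
    import numpy as np

    times = range(0, 75, 15)      # seconds: 0, 15, 30, 45, 60
    group_readings = np.array([   # deg C, one row of readings per lab group
        [21.0, 19.2, 18.1, 17.6, 17.5],
        [21.3, 19.5, 18.4, 17.9, 17.8],
        [20.8, 18.9, 17.8, 17.4, 17.3],
    ])

    baseline = group_readings.mean(axis=0)  # class-averaged control curve
    for t, temp in zip(times, baseline):
        print(f"t = {t:2d} s: {temp:.1f} deg C")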

Figure 6

Note. Graphs of the controls for the chemical investigations as outlined in Figure 4 (also see Figure 5).

Figure 7

Note. The graphs from Figure 6 after groups corrected their data.

Example 3: Hurricanes, Using Datasets from Databases for Hard-to-Capture Data.

Although it is relatively easy to conceive of how inquiry lessons can be used in a laboratory-based science class such as chemistry or physics, it is more challenging to see how they may be employed in field-based classes such as earth science, where one cannot change independent variables to study changes in dependent variables. Fortunately, however, there are numerous large, free databases that can be used to engage students in inquiry learning. For example, the National Oceanic and Atmospheric Administration (NOAA) maintains a database on hurricanes (coast.noaa.gov/hurricanes) that can be used to study hurricane behavior. Each student can select a historic hurricane to investigate and, by examining the animation of the hurricane track, can record such things as season (August, September, etc.), highest speed (over land or over sea), origin (northeast Atlantic Ocean, Central Atlantic, etc.), path (does it hook clockwise, counter-clockwise, or go straight), landfall (Florida, Georgia, etc.), and the relation of pressure and speed (proportional, inversely proportional, no relationship). Each student submits their data using an online form, and the data is collected in a cloud-based spreadsheet in which the rows represent individual hurricanes and the columns represent hurricane characteristics (path, landfall, origin, etc.). Rather than analyzing a specific hurricane in isolation, students look for patterns in the class data. Figures 8 through 13 present the data collected by 184 students, with each question summarized in its own bar graph. Students will notice that North Atlantic hurricanes occur in the late summer (see Figure 8), start in warm waters in the Gulf of Mexico, the Caribbean Sea, and the tropical Atlantic Ocean as far east as the Cape Verde Islands (see Figure 9), don't cross the equator (see Figure 10), tend to veer clockwise (see Figure 11), and often make landfall in the southeastern United States (see Figure 12), at which time wind speed decreases significantly (see Figure 13). Rather than being told these facts, students discover them by observing large data sets from real hurricanes studied by them and their peers. Patterns in data suggest underlying causes and, once students have discovered patterns by examining whole-class data collected in cloud-based collaborative spreadsheets and graphs, they are better prepared to discuss the underlying causes for these patterns, such as differential heating, pressure gradients, latent heat of condensation, and the Coriolis effect.
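The pattern-finding step lends itself to a simple tally; the sketch below counts one categorical column of a pooled hurricane table to produce the kind of distribution shown in Figures 8 through 13. The rows are hypothetical, not the 184-student data set.

    # A minimal sketch (hypothetical rows, not the 184-student data set) of the
    # hurricane analysis: tally one categorical column of the pooled spreadsheet.
    from collections import Counter

    # Each row: (month of peak, origin, path, landfall) as students reported them.
    hurricanes = [
        ("September", "Cape Verde", "clockwise", "Florida"),
        ("August", "Gulf of Mexico", "clockwise", "Louisiana"),
        ("September", "Caribbean", "straight", "no landfall"),
        ("October", "Caribbean", "clockwise", "Georgia"),
        ("September", "Cape Verde", "clockwise", "no landfall"),
    ]

    month_counts = Counter(month for month, *_ in hurricanes)
    for month, n in month_counts.most_common():
        print(f"{month}: {n} hurricane(s)")  # a late-summer peak emerges as n grows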

Figure 8

Note. Output of data from a Google form collected by students looking at when hurricanes occur.

Figure 9

Note. Output of data from a Google form collected by students looking at where hurricanes occur.

Figure 10

Note. Output of data from a Google form collected by students looking at the path hurricanes take during their run.

Figure 11

Note. Output of data from a Google form collected by students looking at the direction of a hurricane’s path.

Figure 12

Note. Output of data from a Google form collected by students looking at where hurricanes hit land.

Figure 13

Note. Output of data from a Google form collected by students looking at the speed of hurricanes on land and in the water.

Example 4: Osmosis, Pooled Data for Statistical Testing. The "power" of a statistical test is the ability of that test to detect an effect if an effect actually exists. In other words, it is the ability to correctly reject the null hypothesis when it is indeed false and to correctly accept the alternative hypothesis when it is indeed true. Although it is desirable to increase the sample size for statistical purposes, it is not necessarily desirable from a curricular perspective. Requiring students to collect large sample sizes may indeed improve the statistical power of experiments in STEM classrooms, but may also introduce an element of tedium while robbing time from other valuable activities. Fortunately, however, it is possible to increase the sample size without these negative side effects by pooling data from all students in the class.
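The link between sample size and power can be demonstrated with a short Monte Carlo simulation. The sketch below illustrates the general principle rather than anything from the chapter's labs: it estimates how often a two-sample t-test detects a modest, real effect as the sample size grows; the effect size and alpha are illustrative assumptions.

    # A minimal Monte Carlo sketch of statistical power: how often does a
    # two-sample t-test detect a real effect at various sample sizes?
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    true_effect = 0.5   # assumed effect size (in standard deviations)
    alpha = 0.05
    trials = 2000

    for n in (4, 8, 16, 32, 64):   # e.g., one lab group vs. pooled class data
        detections = 0
        for _ in range(trials):
            control = rng.normal(0.0, 1.0, n)
            treated = rng.normal(true_effect, 1.0, n)
            _, p = stats.ttest_ind(control, treated)
            detections += p < alpha
        print(f"n = {n:3d}  estimated power = {detections / trials:.2f}")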

Genetic and environmental factors introduce significant variability into any biological population. Thus, an experiment on one organism may yield different results than the same experiment on another organism from the same population. To make generalizations about populations, it is necessary to collect data on random samples from within the population. The greater the sample size, the greater the statistical power to detect effects that actually exist. One of the most common ways to determine the osmotic concentration of tissues is to submerge samples in solutions of varying concentrations. If placed in hypertonic solutions, the tissues will shrink over time as water leaves the tissues to enter the solution; if placed in hypotonic solutions, the tissues will swell as water leaves the solution to enter the tissues. If, however, the tissues are placed in isotonic solutions (ones in which the concentration is the same as that of the tissue), they neither shrink nor swell. Thus, one can infer the osmotic concentration of tissues by placing them in a series of solutions of increasing osmotic concentration, plotting the percent of weight gained or lost in each solution as a function of the molarity of the solution, and looking for the molarity at which there is no change in weight. Figure 14 is a plot of actual student data from 19 lab groups. One can see significant variability in the data due to variability in the tissues, techniques, and experimental errors. For example, the outlier at the 0.4 M solution is an obvious case of experimental error. If lab group 19, which reported this measurement, tried to draw a conclusion about the osmotic concentration of the potatoes based solely on their own data, they would not have seen a trend and would not have been able to reach a conclusion. By contrast, seeing their data in the context of the whole-class data, lab group 19 would likely have recognized that their value was an outlier and would have re-examined their data or re-measured.
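The analysis described here can be reproduced in a few lines: fit a least-squares line to the pooled percent-weight-change data and find where it crosses zero. The numbers below are illustrative stand-ins for the Figure 14 data, with one deliberate outlier at 0.4 M to mimic lab group 19's error.

    # A minimal sketch (illustrative data, not the Figure 14 class data) of the
    # osmosis analysis: fit a trend line to pooled "% weight change vs. molarity"
    # data and estimate the isotonic point where the line crosses zero.
    import numpy as np

    molarity = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6])          # solution (M)
    pct_change = np.array([22.0, 15.0, 8.0, 1.0, -9.0, -14.0, -21.0])  # % weight change

    slope, intercept = np.polyfit(molarity, pct_change, 1)  # least-squares line
    isotonic = -intercept / slope   # molarity where % change = 0 (no net osmosis)
    print(f"Estimated osmotic concentration: {isotonic:.2f} M")

    # Flag likely outliers: points far from the trend line relative to the spread
    # of residuals (a simple 2-standard-deviation rule, chosen for illustration).
    residuals = pct_change - (slope * molarity + intercept)
    outliers = np.abs(residuals) > 2 * residuals.std()
    print("Possible outliers at molarities:", molarity[outliers])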

Figure 14

Note. Student data from weighing potato slices that have been placed in different solutions and plotted as a whole class data set.

Figure 15 shows that there is substantial variability in the data. Students can examine their own data in light of the whole-class data, then go back to Figure 14 and compare it with the trend line for the entire data set, which indicates an average osmotic concentration of 0.3 M (the point where the trend line crosses the x-axis, indicating that the tissue and solution have the same osmotic concentration). By pooling their data in collaborative cloud-based spreadsheets, students not only learn that they can increase the statistical power of an investigation, but can also examine their own data and techniques in light of class patterns and trends.

Figure 15

Note. Data inputted by 19 groups of students measuring the percentage of gain or loss of a potato slice after it was placed in solutions of different molarities. This data is plotted in Figure 14.

Example 5: Exothermic and Endothermic Reactions, Using Pooled Data to Encourage Metacognition. Education in STEM-related courses benefits when students become more metacognitive during investigations and experiments. Metacognition is the process of thinking about one’s own thought processes, and students who are metacognitive will reflect not only on their own thought processes, but also upon the data that they obtain through investigation. Metacognitive students reflect on their data to see if it is reasonable before they proceed to interpret such data. Although it is possible to reflect on one’s data in isolation, it is more effective to reflect on one’s data in the context of whole-class data. If students enter their data into a collaborative, cloud-based database, then they have the advantage of comparing and contrasting their data with that collected by their peers. Major discrepancies in data may indicate significant findings, or simply poor technique or inaccurate reporting. Metacognitive students will analyze these discrepancies to see if perhaps they had poor techniques, misreported their data, or indeed found something significantly different than their peers. The following activity illustrates how collaborative cloud-based data can be used to encourage self-reflection and metacognition.

Students were asked to determine whether the following series of chemical reactions were endothermic or exothermic. Students used probes to measure the change in temperature as the chemicals were mixed:

(1) CaO(s) + H2O(l) → Ca(OH)2(s) (Lime + water)

(2) NH4NO3(s) + H2O (l) → NH4+(aq) + NO3-(aq) (Ionization of ammonium nitrate, a fertilizer)

(3) HCl(dilute) + NaOH(dilute) → H2O + NaCl (Neutralization)

(4) NaCl + H2O → Na+(aq) + Cl-(aq) (Dissolving table salt)

(5) CaCl2 + H2O → Ca2+(aq) + 2Cl-(aq) (De-icing roads)

(6) NaHCO3(s) + HCl(aq) →H2O(l) + CO2(g) + NaCl(aq) (Neutralization)

(7) CH3COOH(aq)+NaHCO3(s) →CH3COONa(aq)+H2O(l)+CO2(g) (Baking soda & vinegar)

(8) C12H22O11 + H2O (in 0.5M HCl) → C6H12O6 (glucose) + C6H12O6 (fructose) (Decomposing table sugar)

(9) KCl + H2O → K+(aq) + Cl-(aq) (Dissolving potassium chloride)

(10) NaCl + CH3COOH(aq) → Na+(aq) + CH3COO- + HCl (Preparing HCl to clean tarnished metals)

Figure 16 shows real class data from such an investigation. Note that, although only one group collected data from all reactions, everyone was able to see multiple measurements for each reaction because of cloud-based data pooling. As students examined their data, they noticed discrepancies in value, sign, and number of significant digits. The shaded cells indicate reactions that students re-investigated as they saw the data appear. Teams 1, 2, and 5 noticed potential problems with their data and changed their entries after noting errors in measurement or reporting. By contrast, group 7 did not re-examine their data until prompted by the instructor. As students become more accustomed to sharing and pooling their data, they become more metacognitive, reflecting not only on their techniques and reporting, but also on their interpretations.
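A teacher (or the class itself) can formalize this kind of second look with a simple screen for discrepant entries. The sketch below flags pooled measurements that disagree with their reaction's median in sign or by a large margin; the reaction names echo the list above, but the numbers and the threshold are hypothetical.

    # A minimal sketch (hypothetical numbers, not the Figure 16 data) of flagging
    # pooled entries worth a second look: within each reaction, mark measurements
    # that disagree with the group median in sign or by a large margin.
    import statistics

    pooled = {  # hypothetical temperature changes (deg C), one value per team
        "CaO + H2O":    [12.4, 11.8, 13.1, -12.0],   # one sign error
        "NH4NO3 + H2O": [-5.2, -4.9, -5.5, -5.1],
        "NaHCO3 + HCl": [-1.3, -1.1, -9.8, -1.2],    # one magnitude outlier
    }

    for reaction, values in pooled.items():
        med = statistics.median(values)
        for team, dt in enumerate(values, start=1):
            wrong_sign = (dt > 0) != (med > 0)
            far_off = abs(dt - med) > 3 * abs(med)   # illustrative threshold
            if wrong_sign or far_off:
                print(f"{reaction}: team {team} reported {dt} (median {med}); re-check")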

Figure 16

Note. Student data on temperature change (using probes) during a chemical reaction. Each of the reactions is outlined in the text. Shaded boxes indicate outliers that the student groups then reassessed.

Results of Classroom Data Pooling

All of the above are examples of ways the CSCS pedagogy can be used in science classrooms. Over the last several years, we have been modifying our CSCS pedagogical model as we learn lessons from teaching in university classrooms as well as in the summer PD we are doing with 6-12th grade teachers. Our hope is that these teachers then go on to utilize CSCS tools in their 6-12th grade science classrooms. We are looking for gains in content knowledge in science. In order to test content knowledge, we did an initial screening with university students, comparing instruction with and without CSCS for one lecture in an undergraduate biology course (the lecture was on glycolysis and cellular respiration) across two different classes (see Figure 17). In Class 1, the instructor used only a PowerPoint and talked straight from the slides, providing no time for interaction among the students. In Class 2, the same instructor used the same slides but embedded some of the CSCS tools (e.g., quickwrites, iPad drawings, and think-pair-share activities), gathering formative assessment data and interacting more with the students. The students took a short exam on the subject the week before and the week after the class. Both classes failed the pretest, though the traditionally taught class scored higher. The post exam shows that the CSCS class gained more on average (46%) than the traditional class (22%). A single lecture is a small teaching sample, but this data suggests that CSCS instruction using online collaborative documents was considerably more effective for learning biology content than traditional instruction. At the level of an entire course, students in a science content course for pre-service elementary teachers used CSCS to evaluate the quality of their pooled data and the physical interpretations of their results in real time throughout the semester. As a result, these students were able to analyze data and construct explanations that were more expert-like than those of students who did not use CSCS in the same course (d'Alessio & Lundquist, 2013).

Figure 17

Note. Pilot data comparing a class taught using CSCS instruction in class collaboration via Google Docs to one taught using traditional lecture-style instruction.

Since 2010, we have been working with in-service science teachers on using CSCS instruction. They were excited about the potential of the new tools, but only a few (22% in 2010) reported that they had computers in their classes that would allow them to put these pedagogical tools into practice (see the section on issues above). Recently, we have seen a slight increase in computers in the classroom (29% in 2013). However, a great deal of interest in our CSCS pedagogy continues to develop as districts make plans to provide all students with devices (tablets or laptops). One of the most appealing aspects of the CSCS pedagogy is that educators are encouraged to use their own favorite inquiry lessons as a basis for engaging their students in cloud-based data analyses. Therefore, they do not need to "throw out" lessons that have worked for them in the past. Rather, they learn to utilize the CSCS pedagogy to enhance and update their lessons to engage their students in 21st century skills.

Our first attempt to train teachers in CSCS used traditional PD models, with teachers coming to the university for summer workshops. As is found with many PDs, the teachers left excited about what they learned and appeared willing to try the new pedagogy and tools in July, but when school started in August, they became less confident and mostly went back to their traditional way of teaching (Darling-Hammond, 2006). To address this, we began to integrate more clinical teaching into our summer PD. Teachers participating in the PD now teach a CSCS-modified lesson to groups of middle school students enrolled in enrichment summer school courses on the university campus. This approach has been more successful at preparing teachers to use CSCS in the classroom. Figure 18 shows how teachers gained confidence (often the limiting factor in the use of CSCS in the past) in using collaborative technology during our summer workshop when they were able to "practice" what they learned in CSCS immediately in a clinical site setting. We also surveyed teachers in our summer programs before the workshops and then at the end of the following school year (N=15 teachers completed both surveys). Figures 19 and 20 show some of the positive effects we are seeing (i.e., teachers are spending less time with the textbook and more time analyzing data in the classroom). The transformation of teachers' practice away from textbooks and toward data analysis is exactly the change called for by the NGSS. These ideas are further described in the teachers' comments in Figure 21.

Figure 18

Note. Data from CSCS PD (2012). Each data point is a specific CSCS computer skill (e.g., creating online surveys) with teachers’ average confidence level before and after PD.

Figure 19

Note. Teacher self-report of frequency of textbook use before and after CSCS training (N=15; summer 2012/spring 2013).

Figure 20

Note. Teacher self-report of frequency of data analysis activities before and after CSCS training (N=15; summer 2012/spring 2013).

Figure 21

Note. Teachers’ comments on CSCS one year after attending CSCS PD.

Conclusion

Science education in the U.S. is at a pivotal point in history as we prepare our students to become both educated and productive 21st century citizens. The rollout of the more integrative standards for teaching in this nation provides a perfect opportunity to create more engaging, student-centered classrooms. In this chapter, we outlined a pedagogical approach called CSCS that helps K-20 instructors use cloud-based technology to engage all learners in collaborative data analysis that is more akin to the true nature of doing science. We presented several illustrations of modified CSCS science lessons that emphasize whole-class data analysis and can be applied within any classroom context. CSCS allows students to clearly see trends and patterns, to analyze larger data sets, to correct their misuse of tools, to input data, and to see the value of inquiry lessons. As such, we have shown that, through the application of whole-class data analysis, students a) are more engaged with the science content being learned, b) collaborate with one another more readily, c) are more clearly able to see the scientific process they are engaged in, d) are more likely to take ownership of their work, and e) are all able to participate regardless of learning styles or learning roadblocks that might appear. Consequently, we believe that students in the large urban schools that our teachers serve will also begin to perform better on standardized tests. Therefore, using the CSCS model in classrooms will help decrease U.S. achievement gaps, particularly in STEM and STEM-related fields.

Another critical aspect of the work presented in this chapter is the training of in-service science teachers. None of this will happen unless there is quality PD. We feel that our PD model (which includes clinical site training) significantly improves the ways teachers teach science in alignment with the new NGSS. Our evidence indicates that teachers are more confident when utilizing CSCS despite issues with existing technology infrastructure and the availability of computers. In the future, these issues will ultimately be resolved, resulting in a transformation of classrooms that will benefit all students. Additionally, we will be able to see growth in the ability of CSCS-taught students to participate fully in science using CSCS tools for whole-class data analysis. Teachers will be well-equipped to bridge current achievement gaps, especially with students who come from underrepresented populations in urban school settings, by making content knowledge accessible to students who may approach learning in different ways. In doing so, we will better prepare the next generation of citizens, who will invariably be expected to: a) use technology with competence, b) work in collaborative groups, and c) critically apply what they are learning and doing in a myriad of public and private industries, universities, and school settings. Lastly, we believe that the CSCS model can be extended across disciplines and look forward to providing these opportunities to teachers in a variety of subject matter content areas.

Future Research

Continue Research on CSCS-CSCL. As the issues with technology get resolved in more and more classrooms, it will be easier to collect student artifacts and compare results on standardized tests. This will help guide our next steps in the use of the CSCS pedagogy. We believe that more students are engaged and learning content in classrooms that use CSCS tools than in non-technology classrooms, and we expect to be able to support these claims as we gather more evidence.

Expand our Pedagogical Approach. We will continue to pursue this research strand and have recently been funded by the Bechtel Foundation to do just that. With this new funding, we intend to begin providing a model for instruction in other content areas (e.g., math, history, special education) as well as to strengthen the career pipeline for STEM teachers in secondary education. Providing competent new teachers for our local schools will help create CSCL classroom environments more quickly. We have a large-scale evaluation project underway, with external evaluators from the field examining teacher and student outcomes to analyze the effect of CSCS learning in science classrooms. We will take lessons learned and apply them to other subject matter content areas.

CSCS/CSCL Dissemination. Our summer institutes are ongoing and critical for the professional development of local teachers as the rollout of the new CCSS and NGSS continues. Many districts are not equipped to provide the PD needed by their teachers. At the university, we can provide both the content and the pedagogy (as our team is comprised of cross-college, cross-discipline scientists and researchers) to help teachers take advantage of the integrative approach of both new sets of standards. Over time, the technology will improve and the tools will become more and more accessible. We will continue to teach in university classrooms, producing students who are tech-savvy in the use of CSCS tools. This will help not only our future K-12 teachers but also our students who choose other STEM content area specialties, providing them with 21st century tools.

As discussed, collaborative cloud-based technologies such as those used in CSCS provide opportunities to instantly collect and analyze large data sets with speed and accuracy while simultaneously keeping students "on task." The chapter authors, along with their own students and in-service teachers, have developed the CSCS pedagogy using technology to create a classroom environment that mirrors the collaborative environment of a professional scientific community and provides a way for NGSS-rich lessons to occur in any science classroom. As such, all students in these classes can acquire a better understanding of the nature of science as they view their findings within the context of larger data sets collected by themselves and their peers. By engaging in laboratory activities to analyze whole-class data using wikis and collaborative web-based documents, teachers and students gain an understanding that the scientific enterprise requires collaboration, independent verification, and peer review.

References

AAAS. (1990). Science for all Americans. New York: Oxford University Press.

Alozie, N., & Mitchell, C. (2014). Getting students talking: Supporting classroom discussion practices in inquiry-based science in real-time teaching. The American Biology Teacher, 76(8), 501-506.

Augustine, N. (2007). Rising above the gathering storm: Energizing and employing America for a brighter economic future. Washington, DC: National Academies. Retrieved from http://www.nap.edu/openbook.php?isbn=0309100399

Bereiter, C., & Scardamalia, M. (2010). Can children really create knowledge? Canadian Journal of Learning and Technology/La revue canadienne de l’apprentissage et de la technologie, 36(1).

Berland, L. K., & Reiser, B. J. (2009). Making sense of argumentation and explanation. Science Education, 93(1), 26–55.

Blume, H. (2013, August 27). LAUSD launches its drive to equip every student with iPads. Los Angeles Times. Retrieved from http://www.latimes.com/local/la-me-lausd-ipads-20130828,0,906926.story#axzz2mcXRdEuT

d'Alessio, M.A. (2014). What kinds of questions do students ask? Results from an online question ranking tool. Electronic Journal of Science Education, 18(5).

d'Alessio, M., & Lundquist, L. (2013). Computer supported collaborative rocketry: Teaching students to distinguish good and bad data like an expert physicist. The Physics Teacher, 51(7), 424-427.

Darling-Hammond, L. (2006). Powerful teacher education: Lessons from exemplary programs. John Wiley & Sons.

Foley, B. J., & Reveles, J. M. (2014). Pedagogy for the connected science classroom: Computer supported collaborative science and the next generation science standards. Contemporary Issues in Technology and Teacher Education, 14(4). Retrieved from http://www.citejournal.org/vol14/iss4/science/article1.cfm

Herr, N., & Rivas, M. (2010). Teaching the nature of scientific research by collecting and analyzing whole-class data using collaborative web-based documents. In J. Sanchez & K. Zhang (Eds.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2010 (pp. 1029-1034). Chesapeake, VA: AACE.

Herr, N., Rivas, M., Foley, B., Vandergon, V., & Simila, G. (2011a). Using collaborative web-based documents to instantly collect and analyze whole class data. In Proceedings of the 9th Annual Hawaii International Conference on Education, January 3-7 (pp. 2497-2503). Honolulu, Hawaii.

Herr, N., Rivas, M., Foley, B., Vandergon, V., & Simila, G. (2011b). Computer supported collaborative education - strategies for using collaborative web-based technologies to engage all learners. In Proceedings of The 9th Annual Hawaii International Conference On Education, January 3-7 (pp. 2504-2506). Honolulu, Hawaii.

Herr, N., & Tippens, M. (2013). Using scanning apps on smart phones to perform continuous formative assessments of student problem-solving skills during instruction in mathematics and science classes. In T. Bastiaens and G. Marks (Eds.), Proceedings of World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2013 (pp. 1138-1143). Chesapeake, VA: AACE.

Herr, N., Tippens, M., Rivas, M., Vandergon, V., d'Alessio, M., Reveles, J., & Foley, B. (2015, submitted). Continuous Formative Assessment (CFA) - A cloud-based pedagogy for evaluating student understanding to optimize STEM teaching and learning. In L. Chao (Ed.), Cloud-based STEM education for improved learning outcomes. To be published by IGI Global.

Hug, B., Krajcik, J. S., & Marx, R. W. (2005). Using innovative learning technologies to promote learning and engagement in an urban science classroom. Urban Education, 40(4), 446–472.

Lee, H. S., Linn, M. C., Varma, K., & Liu, O. L. (2010). How do technology-enhanced inquiry science units impact classroom learning? Journal of Research in Science Teaching, 47(1), 71-90.

Levy, B. L. M., Thomas, E. E., Drago, K., & Rex, L. A. (2013). Examining studies of inquiry-based learning in three fields of education: Sparking generative conversation. Journal of Teacher Education, 64(5), 387-408.

McNeill, K. L., & Krajcik, J. (2008). Scientific explanations: Characterizing and evaluating the effects of teachers’ instructional practices on student learning. Journal of Research in Science Teaching, 45(1), 53–78.

National Research Council. (1996). The national science education standards. Washington, DC: The National Academy Press.

NCES. (2013). Public school graduates and dropouts from the common core of data: School year 2009–10 (No. NCES 2013309REV). Washington, D.C.: National Center for Education Statistics. Retrieved from http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2013309rev

Newton, L. D. (2002). Teaching for understanding in primary science. In L. D. Newton (Ed.), Teaching for understanding across the primary curriculum (pp. 27-37). Multilingual Matters. Retrieved from http://books.google.com/books?id=KAhbNIyyGyUC&lpg=PA27&pg=PA27#v=onepage&q&f=false

NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Achieve, Inc. on behalf of the twenty-six states and partners that collaborated on the NGSS.

NGSS Appendix F. (2013). Retrieved Feb. 1, 2015, from http://www.nextgenscience.org/next-generation-science-standards

OECD. (2010). Educational research and innovation: Are the new millennium learners making the grade? Technology use and educational performance in PISA 2006. Retrieved December 16, 2012, from http://www.oecd.org/edu/ceri/educationalresearchandinnovationarethenewmillenniumlearnersmakingthegradetechnologyuseandeducationalperformanceinpisa2006.htm

Reiser, B. J., Berland, L. K., & Kenyon, L. (2012). Engaging students in the scientific practices of explanation and argumentation. Science Scope, 35(8), 6-11.

Rivas, M., & Herr, N. (2010). The use of collaborative web-based documents and websites to build scientific research communities in science classrooms. In Proceedings of the 8th Annual Hawaii International Conference on Education, January 7-10 (pp. 851-858). Honolulu, Hawaii.

Scardamalia, M., & Bereiter, C. (2003). Beyond brainstorming: Sustained creative work with ideas. Education Canada-Toronto, 43(4), 4-8.

Schweingruber, H., Keller, T., & Quinn, H. (Eds.). (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. National Academies Press.

Shirley, M. L., Irving, K. E., Sanalan, V. A., Pape, S. J., & Owens, D. T. (2011). The practicality of implementing connected classroom technology in secondary mathematics and science classrooms. International Journal of Science and Mathematics Education, 9(2), 459-481.

Slotta, J. D., & Linn, M. C. (2009). WISE science: Web-based inquiry in the classroom. In Technology, education--connections. Teachers College Press.

Songer, N. B., & Gotwals, A. W. (2012). Guiding explanation construction by children at the entry points of learning progressions. Journal of Research in Science Teaching, 49(2), 141-165.

Stahl, G., Koschmann, T. D., & Suthers, D. D. (2006). Computer-supported collaborative learning: A historical perspective. In R. K. Sawyer (Ed.), Cambridge handbook of the learning sciences (pp. 409–426). Cambridge, UK: Cambridge University Press. Retrieved from http://scholarspace.manoa.hawaii.edu/handle/10125/22870

Suthers, D. D. (2006). Technology affordances for intersubjective meaning making: A research agenda for CSCL. International Journal of Computer-Supported Collaborative Learning, 1(3), 315–337.

Tal, T., Krajcik, J. S., & Blumenfield, P. C. (2006). Urban schools’ teachers enacting project-based science. Journal of Research in Science Teaching, 43(7), 722–745.