To get a sense of the success of the Sun Camp program, we planned two types of assessments: a content assessment that evaluates what participants learned about heliophysics and a satisfaction assessment that explores participants' enjoyment and use of the program as well as their relationship to science itself. On this page I'm focusing on the content aspect. The grant required an evaluation of content knowledge and skill level against a NASA-provided heliophysics question bank. As part of the program, I was also required to develop a learning assessment tool that educators could use with students.
In addition to the quiz-style content assessment, I developed a form that requested photos from participants showing their work. Alongside each photo, we asked, "Tell us a bit about your picture. What does it show? What did you learn? How did you feel when you did this activity?" The goal was to informally assess participants' understanding of the content.
Because the age group was young and the timing of the program placed the assessment near a holiday (Thanksgiving), I wanted to keep the evaluation short and focused. I decided on a 10-question quiz that featured both multiple-choice and open-ended questions. Topics were selected based on the information provided in the six educational resources developed for Sun Camp. The specific focus and phrasing of questions were further developed and refined using the provided question bank, though I did not use any of the questions verbatim as supplied. If I had had the question bank before developing the resources, I might have been able to tie these aspects together more closely, but unfortunately, I did not have access to it until late in the program development. Nonetheless, I think the assessment adequately reflects the intent and goals set by the grant.
In addition to evaluating content knowledge, I had other goals for the assessment as well. I wanted it to be easy for teachers to copy and distribute. Therefore, I opted to use a Google Form that can be easily integrated into Google Classroom as needed or distributed independently for OST programs, families, and homeschoolers. This offered the most flexibility and ease of use. It also allowed me to embed an answer key with links back to the Sun Camp resources to make evaluation and remediation easy for those using the quiz.
As the last goal, I wanted to make the assessment as engaging as possible to encourage its use. I started by making it visually engaging, embedding photos into the quiz to demonstrate concepts. I was also able to offer a free gift to anyone who completed the assessment: any participant who finished the quiz, regardless of performance, would receive a Science Friday-branded desk organizer.
The request to complete the assessment was sent via an email newsletter to all participants. A reminder was also posted in the SciFri STEM Educator's Lounge. A copy of the assessment is linked below.
As of December 7, I have received 16 responses. This is fewer than I had hoped for, but given the time of year, low participation is not really surprising. Even with a small sample size, it is possible to draw some conclusions from the results.
We'll start with the basics. The average age of the respondents was about 8 years old, which fits our targeted demographic of 5 to 9-year-olds. (One participant responded with the parent's age rather than the child's age. This response was removed from the calculation of the average age.) All of the respondents so far are family participants or homeschoolers; OST and in-class educators do not appear to have participated yet.
The average score was 67%, which is lower than I had hoped for, but not entirely surprising for a voluntary, online program that relied primarily on written (English-only) asynchronous content for delivery. Of the 16 participants, half received a grade of 70% or above on the quiz. A quick look at the score distribution shows it is skewed: most participants clustered at or above the average, while those who scored below it scored very poorly. This is borne out by the individual scores. So it looks like most participants gained average mastery, but the overall average was pulled down by poor understanding among a small group of participants.
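The gap between "half scored 70% or above" and a 67% average is the signature of a skewed distribution: a few very low scores pull the mean below the typical score. A minimal sketch with hypothetical scores (not the actual Sun Camp responses) shows the effect:

```python
import statistics

# Hypothetical quiz scores (percentages) chosen to illustrate the pattern
# described above -- NOT the actual Sun Camp data. Most respondents cluster
# near 70%, while a few very low scores drag the mean down.
scores = [80, 80, 75, 75, 70, 70, 70, 70, 70, 70, 65, 65, 30, 25, 20, 15]

mean = statistics.mean(scores)      # pulled down by the low scores
median = statistics.median(scores)  # closer to what most participants got

print(f"mean = {mean:.1f}%, median = {median:.1f}%")
# prints: mean = 59.4%, median = 70.0%
```

Whether the real Sun Camp scores follow this exact shape would require the raw responses, but the mechanism is the same: the median is a more robust summary of "typical" performance when a small group scores far below the rest.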
Looking at the actual results, we get a sense of which topics were well understood by participants. They scored especially well on the questions about what causes the apparent change in the Sun's position across the sky during the day, why planets follow an orbit around the Sun, and how scientists study the Sun and solar system. Nearly 90% of participants were able to answer these questions correctly.
Most participants also understood that light from the Sun helps humans make Vitamin D, can cause sunburn, influences photosynthesis, causes shadows, is absorbed by the land, water, and air, and can damage pigments. However, 57% believed that light from the Sun can disrupt satellites and radio signals, which is not true. And 50% believed that light from the Sun causes the auroras of the Northern Lights, which is a completely different phenomenon. In fact, the question about auroras showed that 33% of respondents believed UV light causes auroras. Clearly, a lot of misunderstanding still exists on this topic, which indicates an area for further development of resources.
If I hold the camp again, I plan to rewrite or select different pictures for the question, "What is the Sun made of?" Options included "Electrically charged plasmas," "Fiery cloud of gas," "Very hot rocks," and "Molten lava." Most participants (69%) chose the correct answer, "Electrically charged plasmas." However, 33% chose "Fiery cloud of gas." Neither of the other incorrect options was selected. This makes me wonder whether I failed to clearly explain the difference between plasmas and gases in the educational resource or whether this misunderstanding comes from elsewhere.
Of particular interest was the question on energy from the Sun. The educational resource focused on electromagnetic radiation, specifically visible light and ultraviolet light, and magnetic fields, as energy from the Sun that affects Earth. 63% of respondents correctly selected both magnetic fields and electromagnetic radiation. 88% correctly selected magnetic fields, and 75% correctly selected electromagnetic radiation. Only 6% correctly identified X-rays as a type of energy from the Sun that reaches the Earth. I expected this, as X-rays and gamma rays were not specifically addressed, but I wanted to see whether participants were using the additional resources provided. Given the focus of the resources, these results seem appropriate.
What was very surprising was the number of participants, 19%, who believed that sound from the Sun reaches Earth. This indicates a significant misunderstanding about the nature of sound waves, which cannot travel through the vacuum of space. Again, this may be an area where we can develop additional resources.
I was also surprised by the number of participants who did not understand the nature of the solar cycle. Only 56% understood that this cycle is the result of "A pattern of increasing and decreasing solar activity on the Sun's surface." 38% believed it was the result of the rotation of the Sun. However, this level of understanding seems very different from that expressed through the informal collection of photos of the project (as seen in the next section). I only have two points of overlap between those who took the quiz and those who submitted photos. Those who did the hands-on project related to the topic did get the answer correct, but this is not enough to imply that there is a correlation between the activity and better understanding. I plan to reach out to participants to better understand this difference.
Below is a summary of the performance on the quiz.
I worked with our social media manager to develop an Airtable form allowing Sun Camp participants to submit photos and videos related to their project work. We used Airtable instead of Google Forms because it allows users to easily attach media and submit it through the form. Plus, the app embeds images easily into the resulting spreadsheet for easy review and access. This was the first time I've used Airtable, and I look forward to using its features for other projects in the future. It certainly made it easy to collect this kind of information.
In addition to collecting photos, we asked participants to share what they learned. We asked, "Tell us a bit about your picture. What does it show? What did you learn? How did you feel when you did this activity?" The goal was two-fold: 1) We wanted to get a sense of what information the participants remembered from reading the instructional text and completing the project, and 2) We wanted to gauge how completing the projects engaged and connected participants to solar science.
We received the most submissions for the first project of Sun Camp, Make A Swirling Shaving Cream Sun Model. Given the comments shared, it was clear that participants understood the main concepts: that the surface of the Sun changes, that features such as sunspots, solar flares, and CMEs occur, and that the increase and decrease of activity is part of the solar cycle. They used vocabulary words such as solar maximum and solar minimum. This was very encouraging, though it paints a somewhat different picture than the results of the formal assessment (see above).
The picture is a little less clear as we move through the week. Photo submissions sharply declined after the first week or two, and we received few submissions for other projects. However, the evidence we did collect shows a good understanding of UV light, for example. And the photo submission for the Use Engineering To Design A Solar Space Probe project shows that the participants did define a problem to solve before making their prototypes.
However, the submissions we got for the Gravity And Centripetal Force In Our Solar System project did not show a strong understanding of the underlying concepts. For example, one video submission shows a participant dropping two plastic eggs, one with 4 marbles in it and the other empty. She does not drop them from the same height as instructed. Based on how gravity works on Earth, as described in the section "Understanding Gravity In Free Fall," the eggs should land at the same time if dropped from the same height: the heavier egg feels a stronger gravitational pull, but it also has proportionally more inertia, so both eggs accelerate at the same rate. This concept was missed by the participant. They wrote, "The egg without the sticker dropped in the big first and faster because of the weight of the egg (contained 4 marbles) and gravitational [sic] force as opposed to the one with the sticker (no marbles, empty)." This was disappointing. But the results of the formal assessment do indicate that participants at least recognize gravity as an important force that helps to create the orbits of planets and moons.
More encouraging were the statements that indicated a connection to science. For example, participants wrote:
"I felt happy when I was making the model."
"I felt exited [sic] because it was a new activity. I learned there was something known as a CME."
"I was happy to do this activity and I learned that there are dark spots on the Sun!"
"It was exciting making a model of the sun."
"We had so much fun learning about gravity."
These kinds of statements make it clear that those participating in Sun Camp were engaged and excited to be part of the process. We had similar comments regarding the Q&A sessions as well.
All the submissions and comments are available in this compilation. I've shared some examples below as well.