Welcome to the UYSEG blog
This blog is written by members of UYSEG for the UK school science education community.
The UYSEG network is a termly meeting for teachers, researchers and those involved in science education to engage in discussion about the relationship between science education research and classroom practice.
At the latest meeting, Mary Whitehouse led us in thinking about assessment for learning, and in particular, how questions can be used in low-stakes situations to improve student learning in science.
As in medicine, questions can be used diagnostically, i.e. to identify problems on the basis of evidence. Mary explained that in science education, diagnostic questions can give teachers information about what children have understood, so that informed decisions can be made about what to teach next. Using examples from particle theory and drawing on work done by Phil Johnson and others, Mary demonstrated how carefully designed questions can help teachers get to the crux of what children understand about scientific ideas.
Prompted by a video of boiling water to consider how our students would be likely to respond when asked what the bubbles are made of, Mary demonstrated that designing diagnostic questions requires teachers to know not only the correct answer (and the reasons why it is correct), but also the likely incorrect answers and the reasoning that leads students to give them. Provided with different diagnostic question formats (PEOE (Predict, Explain, Observe, Explain), focused cloze exercises and talking heads), we were charged with creating diagnostic questions that could be shared via a Google group.
Teachers interested in creating and sharing examples of diagnostic assessment items under a Creative Commons licence are invited to join the group by contacting firstname.lastname@example.org.
“Sharing high quality questions may be the most important thing we can do to improve the quality of student learning”
Wiliam, 2011, p.104
Look out for the next meeting on Friday 16th October (4.30 refreshments for 5pm start).
Wiliam, D. (2011). Embedded formative assessment. Bloomington, IN: Solution Tree Press.
RISES, Research Into Students’ Engagement with STEM, will help further understanding of young people’s decisions to pursue a career in STEM (Science, Technology, Engineering and Mathematics), space-related or otherwise.
In November 2015 the European Space Agency is sending its first British astronaut, Tim Peake, to the International Space Station for his six month Principia Mission. A host of educational programmes have been developed surrounding these events, in order to engage young people, and to inspire the next generation of UK scientists and engineers. Our team at York will be evaluating these educational efforts, investigating students’ attitudes to, and engagement with, STEM subjects (science, technology, engineering and mathematics), in particular in relation to human spaceflight. We invite you and your schools to take part in our project.
The research project is funded by the UK Space Agency and the Economic and Social Research Council, led by Professor Judith Bennett and conducted by a team of researchers at the Department of Education, University of York. The team includes Dr Jeremy Airey, Dr Lynda Dunlop and Dr Maria Turkenburg. The research has been approved by the Department's Research Ethics Committee.
Data collection involves a survey of a range of schools in a variety of contexts, before and after the launch of Tim Peake’s Principia mission. We will be using questionnaires and interviews with children, young people and their teachers. The purpose of the questionnaires is to gauge young people’s attitudes towards STEM subjects, while the interviews will explore the reasons and explanations for these attitudes. In addition, we are looking to make a connection between the information the interviewed students provide, and their background and attainment data from the National Pupil Database.
Taking part in the pilot phase
At the moment we are urgently looking for schools, both primary and secondary, to be part of our pilot study (although we would also be delighted to work with you in the full research project, if you prefer).
The pilot study would involve at least one group of year 5 or year 8 students (preferably all of them) completing our online questionnaire, which asks about their attitudes to STEM subjects inside and outside the school context, specifically in relation to their learning about space. This should take no more than 45 minutes (and most likely considerably less), and will help inform the design and use of the survey so that it is ready to administer to a larger sample for the full project.
Taking part in the full project
If you would prefer to join the full research project, we would look to recruit your current year 4 or year 7 group.
What we would like from your school:
● for a single year group of students to complete a 45-minute (maximum) online questionnaire three times (Autumn 2015, Summer 2016 and Summer 2017), following the same group of students as they move up the school, at a time convenient to you and your school. Paper copies would be available on request;
● on a follow-up visit to a subset of the schools soon afterwards, for one of our researchers to have interviews with the year group’s teacher(s) and a group of the students who completed the questionnaire (around 30 minutes each). We would like the focus group students to be diverse across all categories of social difference, if possible, including gender, ethnicity and ability;
● to make a link with focus group students’ data in the National Pupil Database, for which we will require their full name, date of birth, home postcode and, ideally, their UPN/ULN. Procedures are all in full compliance with Data Protection legislation.
Photo: ESA-M. Alexander
Benefits to your students, you and your school
A summary of our final project report as presented to the UKSA and ESRC might be used by schools to inform practice. You will be recognised as a “University of York Science Education Group Research Partner School”.
For more information about the project or to sign up, please contact Maria Turkenburg, the project’s Research Officer, by email at email@example.com or by telephone at 01904 323444.
We are looking forward to hearing from you, and hopefully working with you in the near future.
It's York Pride on Saturday 20th June, and this year the theme is 'Raise your Rainbow'. The University of York Science Education Group (UYSEG) will be running some hands-on, rainbow-themed experiments on the University stall and explaining the science behind rainbows.
Salters Horners Advanced Physics (SHAP) is a context-led, A level physics course developed by UYSEG. The course has been updated and relaunched for the new A levels in 2015.
The SHAP course provides a context-led approach to the Edexcel specification, and is now in its third edition. It has been revised and updated by writers able to draw on their own experience of teaching SHAP.
The move to terminal assessment means that we have been able to restore much of the original structure of SHAP (we are no longer constrained by modular assessment requirements), as well as updating the contextual aspects and revising the materials to match the new specification.
The new requirements for practical assessment fit in well with SHAP. The directors of SHAP and its sister project, Salters-Nuffield Advanced Biology (SNAB, also assessed by Edexcel), have devised a coherent framework for the development and assessment of practical skills, and all the new 'core practicals' are an integral part of the SHAP course.
New course materials
Thanks to a splendid team of writers, the Salters Horners Advanced Physics (SHAP) course materials for the first year of the new A-level are now published. The second-year materials are going through the editing process and are on schedule for publication in the autumn.
Click here to find out more about the SHAP course materials.
SHAP is fortunate in being sponsored by the Salters and Horners companies, and both offer grants to centres to help with the purchase of SHAP course materials. The deadline for Salters applications has now passed (sorry!), but Horners offer grants of up to £250 to centres moving to SHAP in September. The application deadline is 15 June 2015. Click here to begin the application process.
I write this as I prepare for researchED New York, the latest in a line of conferences that provide a forum for teachers, education researchers, and others interested in evidence-informed practice to meet and exchange ideas.
My presentation at this conference is here.
Teachers who want to engage with research evidence will already be reflective practitioners; their reflections may lead to questions that they turn to research to answer. During their PGCE course, trainee teachers use the research literature to help them reflect on their practice, but once they become classroom teachers it is difficult to continue this level of interaction with current research findings. Many teachers struggle to find time for reflection: as one class leaves the room, another almost immediately arrives. It can also be difficult to access current research papers, although there are some routes, which I have described on the researchED website.
All our projects are a collaboration between researchers, science writers, educators, and teachers in schools; the team draws on research carried out at York, but also, of course, research from the wider community.
In thinking about the approaches we will take to developing new resources, we will consider where we can support teachers to make the most difference to their students' learning. There is a large body of research evidence to support the effectiveness of formative assessment, including the work of Paul Black and Dylan Wiliam (1998a, 1998b, 2003).
These three publications together show how putting research evidence into practice can work. Following the publication of their original research paper in 1998, Black and Wiliam wrote Inside the Black Box, a booklet that summarised the key findings for teachers. Then, together with colleagues, they worked with science and mathematics teachers to show how formative assessment can lead to significant improvements in students' learning.
For formative assessment to be effective, there need to be high quality assessment items available to teachers; that was the motivation behind the York Science project – developing assessment items that would provide evidence of students’ understanding of key ideas in science.
Many of the assessment items that we are developing for York Science are ‘diagnostic’ questions of the kind described earlier. These kinds of questions can help focus the teacher’s thinking on the ideas and skills that really matter if students are to make progress in their studies.
Even where research indicates an intervention is capable of leading to student improvement, it may not be implemented well enough to bring about any visible improvement, as Smith and Gorard (2005) showed in their case studies of the implementation of Assessment for Learning.
For this reason one of the strands of work at UYSEG is to develop professional development materials which will support teachers in implementing the resources we publish. Currently we are offering training to teachers who will be teaching our revised A level courses from September 2015. See the Events page for details.
Our A level science courses (Salters Advanced Chemistry, Salters Horners Advanced Physics and Salters Nuffield Advanced Biology) are context-led courses. Similarly, Twenty First Century Science helps students think about the science they will meet in their everyday lives.
Clearly it is essential for these courses that the science we teach is not only correct, but also up to date. So our collaborations go wider than the education research community: we also work with scientists and engineers, not only here at the University of York, but also in other universities, in research institutes and in industry.
We will write more about these collaborations in future blogs.
It is often said, in a pejorative tone, that teachers spend too much time ‘teaching to the test’. Teachers might argue that it is their professional duty to prepare students for the tests they will face, and ask what else could they be expected to do. Given that these ‘tests’ are high stakes assessments that not only affect the futures of students, but also provide accountability data about the teacher and the school, you would not expect anything else of teachers.
Over the past few weeks I have been involved in discussions about assessment of school science both in the UK and US. This has led to me reflecting yet again on the purposes of assessment and how we can ensure that assessment supports good teaching and learning.
But what is relevant to all discussions about assessment is that unless an assessment is both valid (that is, it assesses what it claims to assess) and reliable (the outcomes truly reflect the competencies of the students), it is not worth implementing.
I have recently returned from a conference at Stanford University discussing the assessment of science in schools. Currently in many states in the US the assessment of students’ ability in science at each grade is through the use of multiple-choice tests. However the new Next Generation Science Standards (NGSS) are being introduced across many states; science educators and others are considering how best to assess students against these standards.
There seemed to be general agreement that a broader portfolio of assessment tools should be used than multiple-choice tests alone. However, in deciding which tools to use, the purpose of the assessment must be clarified. These purposes may include:
● formative use by teachers and learners;
● summative grading of students;
● accountability of teachers and schools.
Whilst many kinds of assessment could be used for each of these purposes, it becomes complicated if the outcomes from an assessment are used for more than one purpose. Stobart (2008) suggests that "in accountability cultures with frequent high-stakes testing, making headway with formative assessment will be more difficult" (p. 159).
Designing assessments of understanding that are used formatively by teachers requires efficient ways of capturing the information they yield, so that teachers and learners can use it to good effect. If there is a significant time delay between the assessment being taken and the outcomes being available, it is likely that the teaching will have moved on – and those students who did not understand the previous lesson may now be struggling with the later ideas. For this purpose multiple choice questions that incorporate common misconceptions are often ideal. Our 'cupboard under the stairs' question is such an assessment item.
The question can be displayed by the teacher and the information collected can be acted on immediately.
The new NGSS not only lists the science ideas that students should know and understand, it also describes the science and engineering practices that they need to develop.
Teachers in England will recognise these as being very similar to Working Scientifically in the new National Curriculum Programmes of Study for Science. These Practices are not easily assessed by multiple-choice questions; they are much better assessed in the context of more extended tasks, including practical investigations. It is possible to write workable marking criteria (rubrics) for such investigations; indeed we (UYSEG) have done this with OCR in developing the coursework tasks for Twenty First Century Science, where Investigations, Data Tasks and Case Studies have been part of the assessment since 2006.
Using these rubrics with students provides an opportunity for formative assessment – the students can see for themselves what they need to do to improve their work. Encouraging students to mark their own work, or that of their peers, with the oversight of the teacher, would reduce the marking load for the teacher and develop students’ understanding of the requirements of the Practices.
The same kinds of tasks, using the same rubrics, could also be used for summative purposes, with the marking carried out by teachers. There would need to be some form of moderation to ensure that teachers in different schools were marking to the same standard. There is a precedent for this in the UK from the 1970s and 1980s, when science teachers in local consortia of schools worked together to produce Mode 3 Certificate of Secondary Education (CSE) courses which were locally assessed and marked, with the local examinations board ensuring that the standards were comparable with other similar qualifications. More can be read about developing such assessments in Nuffield Secondary Science Examining at CSE Level (Nuffield, 1972).
I would suggest that encouraging teachers to work together to develop suitable tasks, which they then mark against commonly agreed rubrics, would help them to engage with the detail of the NGSS and support the planning of their teaching. This would be excellent professional development for those teachers and so benefit the students they teach.
However, if we consider the use of assessment for accountability purposes, teachers marking their own students' work brings its own problems. The problems have been such that in England, after more than 30 years of teacher-marked work counting towards qualifications in science, new science specifications at GCSE and A level will no longer include such marks. This has been the result of many consultations by the regulator, Ofqual. In 2013 they reported:
A possible way to remove the pressure on teachers is to organise local clusters of schools to work together to moderate each other's marking, as in the CSE model described above. Ofqual felt this could not be imposed in England, where there are currently five different science suites being taken across the country (2014, p. 14). However, in the US, where the same assessments are taken across a state, it might be possible to develop such a system. I look forward to hearing how these developments progress, and shall probably refer to them when I speak about research-informed curriculum design at researchED New York in May.
I wrote about Ofqual's decision to remove teacher assessment of practical work when their report was published, and my colleague Alistair Moore wrote about it on this blog.
Mary Whitehouse is a member of UYSEG with an interest in secondary education and particularly in the relationship between teaching and learning and assessment. You can follow Mary on Twitter.
Nuffield (1972). Nuffield Secondary Science: Examining at CSE Level. Longman. Downloadable from the National STEM Centre Library.
Ofqual (2013). Review of Controlled Assessments in GCSEs.
Quinn, H., Schweingruber, H., & Keller, T. (Eds.) (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas. National Academies Press.
Stobart, G. (2008). Testing Times: The Uses and Abuses of Assessment. Routledge.
Many of the questions and tasks we have been developing in the York Science project can be described as 'diagnostic'. That is, they not only tell you which students have some understanding of the idea in question; they also give you some information about the misconceptions of those who do not use the accepted scientific explanation.
For instance take a look at this question about how we see.
The correct answer is D, but A, B, and C are all ideas held by many students (and adults).
In the Programme of Study for Science for Year 6 (10-11 year olds), students are expected to be taught that we see things because light travels from light sources to our eyes, or from light sources to objects and then to our eyes. So this question could be used in primary school to check that students understand this idea at the end of a sequence of teaching, or it might be used in secondary school prior to teaching ideas that further develop students' understanding of light.
For all the diagnostic questions we develop, we use the available research evidence to inform our writing. A good starting point for teachers looking for information about students' ideas is the work of Ros Driver, including Making Sense of Secondary Science, which includes useful bibliographies for each chapter.
Rather than simply asking students to select one correct answer, you can gain additional information by asking them how sure they are about their answers, since they may hold more than one of the ideas listed.
A confidence rating grid like this can supply that information quickly.
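If responses are collected electronically (via a voting system or a simple form), the answer and confidence pairs can be tallied together. Here is a minimal, hypothetical Python sketch; the response data and the tallying approach are illustrative assumptions, not part of the York Science materials:

```python
from collections import Counter

# Hypothetical class responses to the 'how we see' question:
# each tuple is (answer choice, self-reported confidence).
# D is the correct answer.
responses = [
    ("D", "sure"), ("A", "sure"), ("D", "not sure"),
    ("B", "guessing"), ("D", "sure"), ("A", "not sure"),
]

tally = Counter(responses)

# Confidently held wrong answers (a misconception plus 'sure')
# are the ones most worth addressing in the next lesson.
confident_misconceptions = sum(
    count for (answer, confidence), count in tally.items()
    if answer != "D" and confidence == "sure"
)
print(confident_misconceptions)  # 1 student is sure of a wrong answer
```

The point of the grid is exactly this distinction: a student who is 'sure' of answer A needs a different response from one who was guessing.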
When planning to use the question with a class, you need to think about what you will do next when some of the class give the answers A, B, or C. Do you have a dark room you can use to show them that in complete dark you can see nothing? Or maybe the geography department will be taking them on a trip to some caves? (I am not suggesting you bring a cat into the dark room, or take one down the caves!)
This item is one of the Evidence of Learning Items developed for the York Science project. The resource sheet and presentation are available in the Download section.
Mary Whitehouse is a member of UYSEG with an interest in secondary education and particularly in the relationship between teaching and learning and assessment. You can follow Mary on Twitter.
Why will the Sun appear to go dark tomorrow morning? Why will the eclipse look different from various places on Earth? And what’s the difference between a solar eclipse and a lunar eclipse?
Tomorrow’s eclipse of the Sun offers teachers and students the opportunity to explore how models can help us explain phenomena and answer questions about familiar and unfamiliar events.
The ‘Working Scientifically’ strand of the new National Curriculum requires students, by the time they complete Key Stage 4, to be able to use a variety of models to solve problems, make predictions and develop scientific explanations. Models and modelling are fundamental to science, and are really very useful; even the simplest of representational models can help us to visualise scientific explanations and mechanisms using physical analogies. Almost anything will do – a lamp, a tennis ball and a football can help students at Key Stage 3 and Key Stage 4 to understand what’s going on with the solar eclipse.
We’ve teamed up with Oxford University Press to give free access to an activity from Twenty First Century Science, one of our biggest curriculum projects, that will enable students to explore models and explanations of the science behind the eclipse.
Click here to download a free copy of the ‘Modelling an eclipse’ activity.
This hands-on activity does not require any specialist equipment and comes with student worksheets and teacher guidance. The activity was developed by UYSEG, the Nuffield Foundation and Oxford University Press for GCSE Twenty First Century Science.
We hope you enjoy the eclipse tomorrow morning. Coverage of the sun will range from 85 to 98% in the UK (better the further north you are), and we won’t experience coverage like that again in the UK until 2026 – so it’s one not to be missed. And whatever you do – watch safely.
For further information on Twenty First Century Science please visit our curriculum projects page on the University of York Department of Education website, or follow @C21Science on Twitter.
Alistair Moore is a member of UYSEG with an interest in secondary science education and assessment. You can follow Alistair on Twitter.
The reformed science A level and GCSE courses launching over the next two years will have an increased emphasis on the use of mathematics. Fundamentally, we might hope that all students of science will appreciate that mathematics helps us develop scientific explanations and solve scientific problems. Using mathematics is as much a part of science as using words and using apparatus, and is not just something that happens in a classroom down the hall with no relevance here.

The Association for Science Education has launched a new project called The Language of Mathematics, which aims to develop guidance materials for teachers on vocabulary, processes and approaches to teaching. Visit the project page on the ASE website for further information.
So how can we make links between science and mathematics in the science classroom?
One strategy might be to highlight how a number of the explanations we explore in school science depend on mathematical ideas, and how a single mathematical idea can underpin many explanations in science. Proportional reasoning is one such idea, at the heart of scientific explanations in physics, chemistry and biology.
Teachers at a recent meeting of the UYSEG Network brainstormed science ideas from the reformed GCSEs that depend on proportional reasoning.
Can you think of others?
Proportional reasoning is more than just solving proportions. Some research in mathematics education provides examples of pupils who can solve proportions by setting up two equivalent ratios and solving for an unknown, but who cannot reason about proportions in a way that makes sense of a proportional situation (e.g., Lobato, Ellis, and Zbiek, 2010). Inability to reason proportionally presents more than just a problem in mathematics lessons, though. Pupils who lack an understanding of proportions are left without one of the essential tools for making connections between science and mathematics, or among different concepts in science.
Lobato, Ellis, and Zbiek (2010) offer this as the essential understanding about proportions: “When two quantities are related proportionally, the ratio of one quantity to the other is invariant as the numerical values of both quantities change by the same factor.”
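This invariance is easy to check numerically. A Python sketch (the 40 ml and 31.4 g values anticipate the density example that follows; the scaling factors are arbitrary illustrations):

```python
# When two quantities are related proportionally, the ratio of one
# to the other is invariant as both change by the same factor.
volume_ml, mass_g = 40.0, 31.4
ratio = mass_g / volume_ml  # the density, about 0.785 g/ml

for factor in (0.5, 2.0, 2.5, 10.0):
    scaled_mass = mass_g * factor
    scaled_volume = volume_ml * factor
    # The ratio of the two scaled quantities has not changed.
    assert abs(scaled_mass / scaled_volume - ratio) < 1e-9

print(f"{ratio:.3f} g/ml")  # 0.785 g/ml
```

The assertions pass for any common factor, which is precisely the 'invariant ratio' idea in the quotation above.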
Using double number lines is one way to focus on proportional reasoning. Consider this proportional situation:
40 ml of a substance has a mass of 31.4 g. What is the mass of 100 ml of the substance?
Maths and science teachers alike will recognize this as a situation about constant density. Here’s one way to reason about the problem using a tool that’s different from the usual approach of setting up and solving a proportion. Using a double number line, the quantity of the mass and the quantity of the volume are represented by the same distance along the number line:
There are a variety of ways to reason about which value corresponds to 100 ml. One strategy might be to divide the distance between 0 and 40 ml into four equal parts; this leads to 10 ml corresponding to 7.85 g. Taking ten copies of that length gives 100 ml, which corresponds to 7.85 g × 10 = 78.5 g. The number line is a visual and structural tool used to keep track of these lengths; it isn't important that the line be drawn to scale:
Similar reasoning with a slightly different approach might be to divide the distance between 0 and 40 ml into two equal parts; this leads to 20 ml corresponding to 15.7 g. Taking five copies of that length gives 100 ml, which corresponds to 15.7 g × 5 = 78.5 g.
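The two number-line strategies can be written as short calculations. A Python sketch (an illustration of the reasoning, not part of any published activity):

```python
# 40 ml of a substance has a mass of 31.4 g. What is the mass of 100 ml?
volume_ml, mass_g = 40.0, 31.4

# Strategy 1: divide the 40 ml length into four parts (10 ml <-> 7.85 g),
# then take ten copies of that length to reach 100 ml.
mass_per_10_ml = mass_g / 4
answer_1 = mass_per_10_ml * 10

# Strategy 2: divide the 40 ml length into two parts (20 ml <-> 15.7 g),
# then take five copies of that length to reach 100 ml.
mass_per_20_ml = mass_g / 2
answer_2 = mass_per_20_ml * 5

print(f"{answer_1:.1f} g, {answer_2:.1f} g")  # 78.5 g, 78.5 g
```

Both strategies keep the mass-to-volume ratio fixed while rescaling the shared length, which is what the double number line makes visible.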
Other approaches include using strip diagrams or ratio tables to reason about proportions.
When a student appreciates that proportional reasoning can be used to solve other problems and explain other phenomena in science, the mathematical idea becomes a threshold concept that could transform the student’s approach to problem solving regardless of context.
By making clear the links between mathematics and science (and different ideas within science), teachers can help students see mathematical problem solving as helpful in the process of doing science, rather than as something just to be rote learned in specific contexts for an exam.
This blog is based on a session led by Beth at a recent meeting of the UYSEG Network, which brings together researchers and teachers with an interest in implementing research in the classroom. Beth’s presentation from the event is attached below.
Lobato, J., Ellis, A. B., & Zbiek, R. M. (2010). Developing Essential Understanding of Ratios, Proportions, and Proportional Reasoning for Teaching Mathematics in Grades 6-8. NCTM.
Let’s get one thing straight: scrapping problematic summative assessment of practical work is not the same as ‘scrapping science practicals’. As is often the case with education policy announcements, Ofqual’s proposal about the future of GCSE practical work illustrates the need to look beyond the headlines.
What Ofqual published last week was the outcome of its public consultation on the assessment of practical work in reformed science GCSEs for 2016. The consultation presented, and sought to answer, the following problem:
How can we:
Direct and indirect assessment
The original consultation document considered the difference between direct assessment and indirect assessment of practical skills and understanding. Direct assessment generates a mark based on observation of the student doing practical work (manipulating apparatus and materials, working safely, etc.). Indirect assessment is based only on written work associated with a practical activity, which could be a student’s write-up of the activity or their answers to questions based upon it.
For both forms of assessment there are questions about who should do the marking (teachers or external examiners), whether the assessment is valid and whether or not the marks should contribute to the final grade.
This proposal suggests an arrangement in which direct assessment does not contribute to the final grade. It certainly does not amount to the scrapping of practical work in science GCSEs.
Some will argue that without direct assessment, practical work will be seen as less important and will be squeezed out of lessons. I disagree. Present and past arrangements, including coursework and controlled assessment, have not included direct assessment (relying instead on the written record produced by the student). Most science teachers want to do practical work; they just want the freedom to do it their way. Removing the need for laborious internal assessment (characteristic of coursework and controlled assessment) will, hopefully, free up time for planning and embedding practical work into teaching to support learning.
The proposal includes safeguards to ensure that practical work will be done. The first – the carrot – is the indirect assessment in the exam papers. Students who have had a wide range of practical experience ought to be better able to answer these questions, and thus boost their grade as a result. The challenge for the exam boards will be setting questions and mark schemes that are a valid assessment of practical understanding and that differentiate between students who have experienced relevant practical work and those who have not. Examiners will need to be trained and predictability in assessment (which could have a narrowing effect on what practical work is done) avoided.
The second safeguard – the stick – is the requirement for schools to confirm that students have completed a range of practical work, and the threat of sanctions if they haven’t. Ofqual admits that how this will be regulated is yet to be decided, but any sanctions to be applied should be sufficient to deter a school from taking this course of action whilst not penalising the students themselves.
The format of the ‘student record’ is also yet to be decided, but the regulator and the exam boards should ensure that maintaining it does not become cumbersome and a disincentive to doing practical work. It should primarily be a useful learning and revision tool for the student, and it should take whatever format is appropriate for the student and the school.
A practical solution?
Claims of ‘government ire’ stem from a letter to Ofqual from the Secretary of State for Education. In the letter, Nicky Morgan expresses her desire that the new system should be properly regulated to ensure a sufficient amount of practical work is done, and that the system is monitored over time to ensure effectiveness. Reasonable requests. In return, the DfE should ensure that schools receive sufficient funding to provide for both the practical work they want to do and the professional development they need to excel.
Given the time scale in which the proposal was drawn up, it seems to be a sensible approach to a difficult problem and should enable teachers to embed a range of practical work in GCSE science lessons. It removes the need for laborious internal assessment and the conflict for teachers between assessing performance and performance measures.
Until somebody can come up with a form of direct assessment that is manageable with large GCSE cohorts and does not limit the range of practical work that is done, leaving direct assessment out of the equation may just be the best solution.
Alistair Moore is a member of UYSEG with an interest in secondary science education and assessment. You can follow Alistair on Twitter.
Read Mary Whitehouse’s thoughts on this issue on the Education in Chemistry blog.