By Simon Cook, Department of Geography
In redeveloping and reflecting upon my tutorial sessions supporting two first-year groups undertaking the introduction to human geography module, I recognised the difficulty some students faced in transitioning to the higher standard of academic skills required at HE compared to FE. Therefore, I developed a tutorial session around essay writing, which aimed to introduce, apply and experiment with various fundamental academic skills using crowd-sourced, student-produced learning resources. I have previously discussed this session in Cook (2017).
The session itself adopted elements of a flipped learning approach (Bergmann and Sams, 2014; Flipped Learning Network, 2016), where the students engaged with the core material/direct instruction individually in preparation for the session, and explored, applied and challenged the material further in the group learning space under my guidance. Shifting the direct instruction around essay writing skills to the individual learning space freed up the face-to-face time of the tutorial to be richer and more meaningful for the students, making for a more student-centred approach (Sams and Bergmann, 2013).
This session took place in the run-up to the students’ first essay submission and was designed to explain and test out the essentials of essay writing. This was important as the students often had differing experiences of essay writing and varying ideas of what a good university essay looks like. The session was structured around the book Good Essay Writing: A Social Sciences Guide (Redman and Maples, 2013), which empathetically introduces readers to essay writing in an approachable and accessible manner, making it well suited to this purpose.
Prior to the session, each person was assigned a chapter to read and summarise in under two pages. As well as developing their critical reading, note-taking and summary skills, this meant that the whole book was covered by the group. The summaries were collected and collated before the session into a mini-guide to writing human geography essays. Each student had their own copy, which provided hints and tips about what an essay is; planning an essay; answering the question; critical thinking; developing an argument; writing introductions, main bodies and conclusions; as well as troubleshooting key concerns. The mini-guide ended with a couple of blank pages for students to add their own notes to, including key points from their essay feedback. In this way, the resource was intended to be a living resource which develops and grows as the students gain more practice in essay writing.
The tutorial itself entailed a group discussion about each of the sections in the guide, with any questions or concerns being discussed and worked through by everyone. We then worked through a hypothetical essay question as a group to demonstrate how these skills might work in practice and to apply essay-writing theory to something a little more concrete. As a result of this session, the students found writing their first essay less daunting than previous years’ cohorts had and scored better marks on average. They possessed a strong but developing understanding of what a successful essay does and looks like, as well as some strategies for implementing the skills they need to produce one.
References
Bergmann, J. and Sams, A. (2014) Flipped Learning: Gateway to Student Engagement, International Society for Technology in Education, Eugene.
Cook, S. (2017) Tutorials for introducing human geography and academic study skills at university [online]. Royal Holloway University of London. Available from: https://intranet.royalholloway.ac.uk/staff/teaching/teaching-learning/tutorials-for-introducing-human-geography.aspx [Accessed 18 July 2017].
Flipped Learning Network (2016) What is Flipped Learning? [online]. Flipped Learning Network. Available from: https://flippedlearning.org/wp-content/uploads/2016/07/FLIP_handout_FNL_Web.pdf [Accessed 18 July 2018].
Redman, P. and Maples, W. (2013) Good Essay Writing: A Social Sciences Guide, 4th edition, SAGE, London.
Sams, A. and Bergmann, J. (2013) ‘Flip Your Students’ Learning’, Educational Leadership, 70(6), pp. 16-20.
This spring, I was preparing a Master’s seminar in which I was supposed to talk about my “research design”. I skimmed through a number of ideas, as I was not sure from which angle to approach this broad topic. Against all odds, I eventually decided to focus on dry theory. The concepts of ontology and epistemology had always been addressed in isolation throughout my education. Yet I found that consciously identifying and choosing one’s own ontology and epistemology is very important for the development of a sound research design. I tried to build up the intellectual journey of finding my own methodology from the very beginning, and started with an element that is often neglected in academia: the researcher and her own positionality.
Thus, the very first activity of the class should encourage the students to think critically about their own positionality. They should assess how central it is to the development of a research design that fits not only the project but also the researcher. For this I used the “Power Flower” as a first icebreaker and basis for discussion. I was introduced to this activity at a seminar on Community Based Participatory Action Research for Arctic communities. The Power Flower was developed by Canadian education scholars to “identify who we are (and who we aren’t) as individuals and as a group in relation to those who wield power in our society” (Arnold et al., 1991).
I distributed copies of the Power Flower. On each piece of paper was the outline of a flower with about ten petals. Each petal was divided into an outer and an inner part, while the core of the flower indicated the socio-economic “theme” of each petal: class, family structure, gender, religion and so on. As a group we quickly determined the “dominant” group in our society for each theme and put it on the outer part of the petal. Then the students were asked to write (if they wanted to, and in private) what they consider themselves to be in each respective category. I highlighted several times that they would not need to share their insights afterwards and that they would not have to write anything if they did not want to.
It was nice to see that all students participated eagerly and that some realised afterwards how close to or how far from the “norm” they were. We then discussed the potential power imbalances or positive connections between researcher and participant that might result, and the opportunities or limitations these might create, which need to be taken into consideration from an ethical standpoint and beyond. Having understood this, we moved on to see that our ontologies, based on our self-identification, might differ from those of the societies and people we engage with in our research, and that this in turn has implications for the epistemology, methodology, methods and sources that we use.
Arnold, R., Burke, B., James, C., Martin, D. and Thomas, B. (1991) Educating for a Change. Toronto: Doris Marshall Institute for Education and Action and Between the Lines Press.
Angharad Jones
Geography is fundamentally an understanding of the world, its environments and inhabitants. Problem based learning (PBL) is an approach that requires students to use a range of information to solve or understand a problem (Savin-Baden, 2000), and thus seems suited to learning in geography. This post will explore the potential application of PBL in physical geography undergraduate tutorials.
A PBL course was developed at the University of Texas whereby students were presented with a number of dinosaur fossils (Montgomery and Donaldson, 2014). In groups, they were required to conduct their own research in order to identify the fossils and make inferences about the dinosaurs and the environment. The opening question of the task was ‘How can I characterize the Cretaceous ecology of the Big Bend using these extinct organisms as my guide?’ (Montgomery and Donaldson, 2014, p.716). While this course lasted a number of weeks, the principles could be modified to fit a single tutorial session.
At Royal Holloway, part of the first-year physical geography module is an introduction to Quaternary environments (the past 2.6 million years). A few lectures look at proxies (sources of past environmental information) and archives (where the proxies are found). The lectures thus give a foundation of knowledge and methods. The tutorials that accompany these lectures seem the ideal venue to conduct PBL and to give the students hands-on practice at reconstructing past environments using the methods they’ve been exposed to in lectures. The students could be presented with a theoretical archive containing one or more proxies (although with fewer specimens than in Montgomery and Donaldson’s (2014) example), which they would then have to identify. In this case, the driving question could be ‘How old are the deposits from Site A, and what can they tell us about the environmental conditions at that time?’ To answer this, the students would have to consider what environmental information each proxy contains, weigh the drawbacks and advantages of each proxy, and link together information from the lectures. Ultimately they would apply the learned material from the lectures to a real-world research scenario.
This may encourage a deep learning approach, following Entwistle and Peterson (2004), whereby the students have to look at the patterns in their findings, assess why these patterns occur, and critically assess what the data show, i.e. the limitations or biases of each proxy record. This may be more enjoyable for students than surface, rote learning. Indeed, the students at the University of Texas felt that the PBL approach helped them to link class-based learning to the world beyond the classroom (Montgomery and Donaldson, 2014).
References
Entwistle, N.J. and Peterson, E.R. (2004) ‘Conceptions of learning and knowledge in higher education: relationships with study behaviour and influences of learning environments’, International Journal of Educational Research, 41, pp. 407-428.
Montgomery, H. and Donaldson, K. (2014) ‘Using problem-based learning to deliver a more authentic experience in paleontology’, Journal of Geoscience Education, 62, pp. 714-724.
Savin-Baden, M. (2000) Problem-based Learning in Higher Education: untold stories. Buckingham: SRHE and Open University Press.
Katy Lawn
Also called a ‘spectrum line’ or ‘barometer’, the opinion line is an activity I first came across at a workshop in a gallery in 2017, run by two artists who admitted that they’d ‘never done a workshop before’. The workshop was attended by about 12 people, aged 24-65, and the activity functioned as a real ice breaker, even though it was the last one in the workshop. It was the activity that really opened people up for discussion. For that reason (as well as because it is simple enough for first-time workshoppers/teachers!) I decided to try it out in my teaching on a third-year seminar workshop.
Step 1: Explain the activity and prepare the space
Set an imaginary line going from one corner of the room to the other. One end of the line is ‘strongly agree’ and the other is ‘strongly disagree’. Introduce a statement or question; in my case, I used the statement ‘I feel optimistic about the future of work’. Ask students to spend a couple of minutes thinking about the statement and how they feel about it, and then to go and stand somewhere along the line that reflects their thoughts.
Step 2: Discussion
Re-iterate to students that this is a discussion activity, and that they must respect everyone’s opinion and debate respectfully, as there is no right or wrong answer. Then ask students (one by one, or selected at random) to state why they have chosen to stand where they have. Encourage them to use case studies and their own experience (if appropriate) in their argument.
In terms of evaluating this exercise, I found it, on the whole, to be a great way of getting students out of their seats (which in itself seems to be a good first step in increasing participation) and more engaged with the content. The exercise seems to lead to more discussion amongst the students, but it requires careful monitoring, as there is a sense that you’re pitting them against each other in some way. It is useful to refer to Morss and Murray’s ‘Ground Rules for Acceptable Behaviour’ here, the first of which is “respect for others: listening and valuing others’ contributions” (see Morss and Murray, 2005: 141). This underlines that the tone of the discussion must be kept respectful. In all four instances where I ran this activity, the conversation was absolutely fine, but needed to be managed a little, as some of the topics under discussion concerned the welfare state and more left- and right-leaning students clashed. The best way to manage this was to reiterate that there is no right or wrong answer, and that the whole point of the session was to have a dialogue rather than to reach some definitive conclusion of ‘right’ or ‘wrong’.
This exercise also usually involves ‘picking on’ people to contribute, which can be an issue. If you know the students well (i.e. if you’re doing a series of seminars over a period of time), you may know whether there are particular students who don’t respond well to this. I did pick on people, but the students responded well in general and it worked well as an ice-breaker which actually led to more natural dialogue. To alter this, you could ask for volunteers to discuss their opinions.
This exercise could be adapted or developed in a number of ways. Firstly, some accounts of this activity suggest it’s productive to put sheets of paper with ‘strongly agree’ and ‘strongly disagree’ on the walls, as this helps people to identify each pole of the line. You could also add a reflection activity, where students consider whether the discussion in any way altered their own opinion, or run the activity at the beginning and end of a course to monitor the development of knowledge and thought.
In general, however, I found this to be a really great teaching method (especially appropriate towards the end of a session or course) which allows open dialogue between students, increases engagement, and also addresses the need to synthesise learning and make an argument after learning course content.
References:
Morss, K. and Murray, R. (2005). Teaching at university. London: SAGE Publications.
I like this approach, and I would be interested in applying it in the future to engage all students in a question. It also might be interesting to perform it twice, before and after a further piece of information has been introduced, similar to Trufant’s (2003) onion-peel method.
I can think of one scenario in my tutorials when it could have been used. I described an environment that is currently under threat, and asked the students whether it should be conserved. I then told the students that the only reason the environment exists is because of human activities hundreds of years ago. The opinion line could have been run first before, and then again after, the second piece of information was revealed. This would have encouraged all students to participate and think about the reasons for their answer, and then consider whether their answer would change, and why.
References
Trufant, L.W. (2003) ‘Move Over Socrates: online discussion is here’. Available at: https://www.educause.edu/ir/library/pdf/NCP0330.pdf (Accessed 4 May 2018).
Angharad Jones
This post shows that the ‘opinion line’ is an excellent way to encourage students to participate and engage. It might also help them remember new information: by relating newly encountered ideas and people to different locations in the classroom as they move along the ‘opinion line’, students may embed this information in their hippocampal ‘cognitive map’ (Manns and Eichenbaum, 2009).
I think this is a great idea, Katy, not least because it gets the students up and moving. When I was teaching last year, two- to three-hour-long lecture sessions were put back to back, and even in their breaks students didn’t really move around that much. I think this would have been a particularly useful activity to help keep the students focused and engaged, and a nice way to bring discussion and active learning into a class. If there isn’t space in a room to carry out this activity by standing up, I have also read about using post-it notes stuck on the board with people’s names on (https://busyteacher.org/12329-get-students-writing-6-post-it-activities.html). I think that loses the benefit of getting people moving, though.
Emily
By Nina Willment
As a workshop leader this year, I was tasked with facilitating discussion on the future of work with a group of around 20 students from across a variety of disciplines. At the beginning of the workshop, I asked the students what they wanted to gain from attending. They stressed that they wanted to gain knowledge about the future of work from students in other disciplines, but also that they wanted to be able to actively use and apply this diverse knowledge to their own work, particularly in areas they might be struggling with. From my own experience of a previous workshop on a similar topic, I had also noticed that not every student entered into discussion, so levels of participation varied. This was something I wanted to improve on in this subsequent workshop. I therefore redesigned and adapted the ‘whip’ or ‘whip around’ activity for this workshop. In the 'whip around' activity, the teacher asks a question and 'whips' around the classroom, with every student providing a verbal answer (Tanner, 2013).
I gave each student at the workshop a piece of paper and instructed them to write a question on it relating to their own research and the future of work. I signalled that this question could be something they had been struggling with in their own work or something they wanted to find out more about, anything they wanted! I then asked students to pass their piece of paper, with their question on it, to the front of the room. I then randomly redistributed the pieces of paper so that each student had a different student’s question. I instructed the students that they had one minute to write down as many themes, ideas, interesting thoughts, authors, literatures and answers relating to the question in front of them as they could. After one minute, I shouted ‘swap’ and they passed the question to their left. When they received a new piece of paper containing a new question, they started writing down as many interesting thoughts about this new question as they could, again for another minute. This was repeated until each student had their original question back. At the end of the workshop, students could collect their original question (complete with annotations).
I feel this activity was successful as it enabled me as the teacher to actively manage and facilitate the participation of all students within the workshop (Tanner, 2013). The ‘whip around’ technique also provides a stimulating environment which teaches students to think for themselves. As well as contributing their own viewpoints, students can read, comment on or add to other students’ ideas, which encourages both self and peer reflection and assessment (Erickson et al., 2010). Moreover, having to attempt in some way to answer the diverse range of questions in front of them can help students to condense, select and organise material from their own reading and research. Through their engagement with the 'whip around' activity, students may also begin to see and form their own ideas about how they can connect and link themes and ideas related to the topic of work from both their own discipline and beyond (Edwards-Groves, 2003).
References:
Edwards-Groves, C. (2003) On Task: Focused Literacy Learning. Sydney: Primary English Teachers Association.
Erickson, P.M., Fox, W.S. and Stewart, D. (eds.) (2010) National Standards for Teachers of Family and Consumer Sciences: Research, Implementation and Resources. National Association of Teacher Educators for Family and Consumer Sciences.
Tanner, K.D. (2013) ‘Structure Matters: Twenty-One Teaching Strategies to Promote Student Engagement and Cultivate Classroom Equity’, CBE-Life Sciences Education, 12(3), pp. 322-331.
I have also tried this method and found that it was really successful in getting contributions from every student, which is really important because less ‘extroverted’ participants might not feel able to contribute their own knowledge verbally. It was also good because it produced written feedback which each person could then take with them, which is not something people often come away from a seminar with! I completely agree that it increases participation. I would also argue that the format elicits different kinds of responses: things that people might not think are relevant to the whole group are ‘put out there’ because they are relevant to one person, so you actually get through more content and more responses because the whole group are working on different questions simultaneously.
~Katy Lawn
This activity, a speed dating of sorts, seems an excellent one for encouraging full class participation in a way that goes beyond lip-service or trying to avoid the awkward silences after a teacher’s question. The written format allows for meaningful participation from all members of the class, not only those with the confidence to speak out, and in doing so values everyone’s thoughts equally and gives a clearer picture of what students are actually thinking about. The activity further builds students’ confidence by putting them in the position of the teacher/expert as they answer the questions of other students, helping to build a peer-support network. It is all topped off with the huge benefit of having a physical take-home from the session, a resource the students can draw on and think about long after the session has ended. I think it is an excellent activity and will consider how I can include it within my teaching.
- Simon Cook
I was teaching a single session on Violence Against Women in a Master’s course on Sustainability, Development and Society, and had to think about how best to communicate some of the evidence of prevalence and incidence nationally and internationally, whilst also encouraging students to think critically about how data are generated and what counts as evidence (Matheson, 2008).
As part of their assessment for the course, they needed to produce a Policy Briefing, and I have experience of working in policy development and implementation, so I know the importance of presenting valid statistics to make effective arguments (Nutley and Webb, 2000). However, violence against women is extremely pervasive and often hidden, so generating convincing statistics is difficult (especially if policy makers and politicians are reluctant to be convinced of the need for action...) (Walby, 2005, 2006). I did not want to spend too much of the seminar presenting statistics or debating their validity, but I wanted the students to appreciate the work that authoritative statistics can do in making the case for law, policy and projects.
A similar session had been presented in previous years, and I could see that statistics had been provided from a range of sources as answers to a quiz on myths and realities. However, I felt that a quiz might encourage students to focus on right/wrong answers, rather than develop critical questioning of all sources. So I used a video produced by the United Nations Economic Commission for Europe (UNECE) and the World Bank, which presented statistics and other data on violence against women from a range of quantitative and qualitative sources across diverse countries (http://www.unece.org/fileadmin/DAM/stats/gender/vaw/resources.html). I also used an online resource of policy and practice documents, the “Global Database on Violence against Women” (http://evaw-global-database.unwomen.org/en), to look at the evidence for countries the students were familiar with. This enabled them to think about what counts as evidence – including statistics – in national and international contexts, and also to compare their expectations about different countries’ responses to violence against women with the record that those states present to the United Nations. It encouraged interesting debate about what surprised them in the statistics and policies, and how authoritative they thought different sources were.
References:
Matheson, C. (2008) ‘The educational value and effectiveness of lectures’, The Clinical Teacher, 5, pp. 218–221. doi: 10.1111/j.1743-498X.2008.00238.x.
Nutley, S. M. and Webb, J. (2000) ‘Evidence and the policy process’, in Davies, H. T. O., Nutley, S. M., and Smith, P. C. (eds) What Works? Evidence-based policy and practice in public services. Bristol: The Policy Press.
Walby, S. (2005) ‘Improving the statistics on violence against women’, Statistical Journal of the United Nations Economic Commission for Europe, 22, pp. 193–216.
Walby, S. (2006) Towards International Standards for Data Collection and Statistics on Violence Against Women. ECE/CES/GE.30/2006/7. Geneva, Switzerland: UN Economic Commission for Europe Statistical Commission. Available at: http://www.unece.org/fileadmin/DAM/stats/gender/vaw/resources/7%20e%20Lancaster_Univ.pdf.
You teach highly interesting material! I can identify with the sentiment of wanting to present accurate data, especially in an environment of very inquisitive students.
From teaching on a qualitative research methods course, my advice would be to encourage students to question the research methodologies behind the papers they are consulting. If the papers use quantitative research methods, are there any confounding factors that affect the implied causality? Was a sound sampling method employed? If the papers use qualitative research methods, how many interviews were conducted? How much time was spent in the field, in the case of an ethnography?
I appreciate that there might not be much time to dedicate to research methodology in a lecture that focuses on a different subject. It might be worth briefly considering the methodology of just one paper as an example. You have probably already been recommended this book on qualitative research design, but here it is in case you have not:
Flick, U. (2014) An Introduction to Qualitative Research. 5th edition. London: SAGE Publications Ltd.
Effectively assessing learning has many benefits: it helps inspire students to want to learn more, helps measure performance, and encourages creativity (Attwood, 2009; Bond and Clark, 2013; Nicol and Macfarlane-Dick, 2006). I taught on a course with a method of assessment which I thought was not very effective. As part of their assessment, students submit two written essays, each carrying 25% of the final mark. The other 50% comes from the final written examination, which they sit towards the end of the academic year.
I had one student who always actively participated in discussions during seminars. The student submitted her two essays, on which she was awarded fail marks of 35% and 38%. She was furious with these marks and, quite frankly, I also felt they did not adequately reflect her knowledge of the subject. This therefore led me to start questioning the course’s current method of assessment.
It seems to disadvantage some categories of students, including those whose first language is not English. Also, other equally important skills and abilities, such as teamwork, oral communication and presentation skills, are often not cultivated or assessed under this system. Students also tend to focus anxiously on the final mark for their written assignments and final written examinations, often leading to dejection and frustration when these marks turn out lower than expected. I also thought that, due to their rigidity, written forms of assessment do not give students the chance to be asked questions so that they can clarify and explain their thinking.
As a result, I am thinking of engaging the course leader and proposing a restructuring of the current assessment method. I want to propose the addition of an oral component as part of the coursework (Nightingale, 1996), and of participation in the course’s online forum. This would develop the deliberative capacity of students, while also enabling them to engage confidently with technology (since students would be able to draft and post comments, critically engage with others’ ideas, and upload documents). Would these two assessment techniques work?
Assessing participation in the course’s online forum is a really interesting idea. Maybe I’ve been thinking about it particularly because of participating in this inSTIL Teaching & Learning forum! I was interested in the example of the “One-sentence response (OSR)” in the Carless book (Carless, 2015, p. 73): students have to attend the class to find out the question and then produce a one- to two-sentence response. This records attendance and enables students to hone brevity in their communication. He reports that students say they also value being able to look at other students’ responses to the same question. He collected responses handwritten on slips of paper, but it could be adapted to require a response on the course’s online forum.
Reference:
Carless, D. (2015) Excellence in University Assessment: Learning from Award-winning Practice. London: Routledge.