For the past year, members of the Digital Wildfire team have been involved in a research project that seeks to develop teachers’ and students’ understanding of responsible social media usage. Led by Carina Girvan (Lecturer in Education and member of the Digital Wildfire Steering Committee) at Cardiff University, this project was funded by the British Academy. Fieldwork took place in two schools and promoted the co-production of innovative teaching and learning materials. Schools are currently caught between developing students’ knowledge and understanding of responsible technology use and governing that use. The co-production of materials provides a means to address this tension and foster a growing knowledge of digital responsibility.
Coming soon! Digital Wildfire online social media ethics resource
One of the final tasks of the Digital Wildfire project is to create an online resource that promotes the responsible use of social media. This ethics ‘toolkit’ or ‘map’ will be designed to support different groups of users to reflect on current tensions around social media and to identify opportunities for the responsible governance of digital social spaces. The resource will not provide a set of do’s and don’ts for social media but will instead recognise the complexity of the issues involved.
The resource will present:
· findings from the Digital Wildfire project
· case studies of controversies arising from the spread of social media
· perspectives from key organisations and individuals involved in the management of social media content.
The resource is aimed towards professionals who are required to deal with social media content – in particular potentially harmful content – as part of their work. For instance:
· teachers and other educators
· law enforcement professionals
· policy and (local) government professionals
· staff members at anti-harassment and equality organisations
· staff members at social media platforms
Using the resource will help these professionals to understand the ways in which harmful content can spread on social media, the consequences this content can have and the different steps that might be taken to respond to it. Users will then be able to reflect on what kinds of governance measures might be most appropriate and effective in relation to their own work.
The resource is still being developed and we will make it available online as soon as it is ready. Until then, here is an early preview of some of the content.
Digital Wildfire project team member Adam Edwards summarises the results of our Policy Delphi
The third work package of the Digital Wildfire project recruited key informants on the challenges of reducing the harms associated with abusive social media communications, particularly amongst adolescents, whilst also seeking to protect the positive freedoms of speech enabled by this new technology. Given this tension, key informants from the police, criminal legal practice, education and social media platforms were asked to participate in a ‘policy Delphi’ to discuss the political and technical feasibility of policing abusive social media communications. The policy Delphi is a deliberative method in social science that seeks to identify key points of agreement and disagreement about a problem through iterative rounds of debate and dialogue. It is an especially useful method for investigating problems that are evolving rapidly and surrounded by a great deal of uncertainty, as when innovations in digital technologies such as social media ‘disrupt’ established ways of thinking, such as the enforcement of criminal law as a remedy for abusive behaviour.
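The iterative logic of a policy Delphi can be illustrated with a short sketch. Everything below is hypothetical: the statements, the 1-5 rating scale and the spread threshold are illustrative assumptions, not the project's actual instrument. Panellists rate policy statements over successive rounds; a statement whose final-round ratings converge is flagged as a point of agreement, while persistently divided ratings mark a point of disagreement.

```python
# Illustrative sketch of Delphi aggregation (hypothetical data and thresholds).
from statistics import median, pstdev

def classify_statements(rounds, spread_threshold=1.0):
    """rounds: {statement: [[round 1 ratings], [round 2 ratings], ...]}.
    Returns each statement's final-round median rating and whether panellists
    converged (low spread = agreement) or stayed divided (disagreement)."""
    results = {}
    for statement, history in rounds.items():
        final = history[-1]          # only the last round is classified
        spread = pstdev(final)       # population std. dev. as a convergence measure
        results[statement] = {
            "median": median(final),
            "verdict": "agreement" if spread <= spread_threshold else "disagreement",
        }
    return results

# Hypothetical panel: ratings tighten on the first statement but stay split on the second.
ratings = {
    "Criminal law is an effective remedy for online abuse": [[2, 4, 5, 1], [2, 3, 2, 2]],
    "Platforms should pre-moderate content": [[1, 5, 2, 5], [1, 5, 1, 5]],
}
for statement, result in classify_statements(ratings).items():
    print(statement, "->", result["verdict"])
```

In a real policy Delphi, the qualitative rationales panellists give for their ratings matter as much as the numbers; this sketch captures only the aggregation step between rounds.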
To read more click here.
Digital Wildfire: Raising awareness of social media discrimination through football
Kick It Out, football’s equality and inclusion organisation, has been at the forefront of tackling discrimination in football for over 20 years. Here they explain their work involving social media.
Working throughout the football, educational and community sectors to challenge discrimination, encourage inclusive practices and campaign for positive change, Kick It Out has helped football become a more tolerant and inclusive sport.
However, there are still many issues blighting the nation’s game. With the organisation now into its third decade of existence, new challenges stand in the way of making football a bastion of inclusion.
With society moving towards a digital future, the issue of discrimination has seemingly shifted from the openness of the terraces to the vast and anonymous space that is social media.
To read more click here.
THE ACT OF SEEING: REALITIES IN THE MAKING
Art Event, Academic Centre for Artistic Initiatives: AOIA, Łódź, Poland, 27-30 September 2016
AOIA provided a generous venue reconfigured into four adjacent areas that exposed and cross-referenced different aspects of a figurative painting: as a physical object, an image to be interpreted, real life here and now, or a place where different past and present realities meet.
The art event attracted the patronage of the President of Łódź City Council (Honorary), the Academic Centre for Artistic Initiatives AOIA, two cultural foundations – Łódź Jest Kulturą and Plaster Łódzki – Digital Wildfire, the Arts at the Oxford Fire Station, and the I-Ex Firm. The event was well attended and had great educational value. To the nearly 300 high school art students invited by the Director of AOIA, Monika Kamieńska, the event was, in their own words, impressive, incredible, superb, or totally unexpected. They were ecstatic about what they had seen, inspired to create their own art in a different way, and surprised at how art could influence human lives. To one it was the best spectacle he had seen in his life, without exception. Another composed a thank-you poem in which she wished that, on every human being’s road to wisdom, there would be space for artists.
Read more about the exhibition here.
As the project moves into its reporting phase we have spent much of the last few weeks writing up and disseminating our key findings. The main conclusions of our study relate to:
1) The scale and breadth of the ‘problem’ of harmful content spreading on social media
Through our interviews, observations and surveys we have found that a very wide range of agencies are now having to deal with rapidly spreading social media content that is in some way inflammatory, antagonistic or provocative. This includes the police, councils, news agencies, anti-harassment organisations, anti-bullying groups and schools.
2) The complexities and limitations of current governance
Various mechanisms currently exist to deal with social media content and/or its impact but these tend to have practical limitations. For instance, the law and governance mechanisms enacted by social media platforms (removing posts, suspending accounts etc.) are mostly retrospective – dealing with content after it has already spread and caused harm. They also tend to act on individual posts or users, rather than the multiple posts and users associated with a digital wildfire.
3) The potential value of counter speech and user self-governance
In contrast to other governance mechanisms, we find that user self-governance has some capacity to be prospective and limit the spread of harmful content in real time. The posting of counter speech to disagree with an inflammatory comment or unsubstantiated rumour can serve to encourage others to reflect carefully before sharing or forwarding content. It also upholds rather than undermines freedom of speech. Our analysis of social media content (involving qualitative and computational approaches) suggests that multiple voices of disagreement in a Twitter conversation can function to quell hate speech. Click here for further information.
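As a rough illustration of the computational side of such an analysis, the sketch below counts disagreeing replies in a thread. The keyword list and the example thread are invented for illustration only; the project's actual coding combined qualitative and computational approaches and was far richer than simple keyword matching.

```python
# Toy illustration: estimate how much counter speech a provocative post attracts.
# The marker list is a crude, hypothetical stand-in for a qualitative coding scheme.
DISAGREEMENT_MARKERS = {"false", "untrue", "wrong", "not true", "no evidence"}

def count_counter_speech(replies):
    """Count replies (plain-text strings) that voice disagreement,
    using naive substring matching against the marker list."""
    return sum(
        any(marker in reply.lower() for marker in DISAGREEMENT_MARKERS)
        for reply in replies
    )

# Hypothetical replies to an unsubstantiated rumour.
thread = [
    "Spread the word!!",
    "This is simply not true - there is no evidence for it.",
    "Wrong. Please check before you retweet.",
    "Totally agree",
]
share_of_disagreement = count_counter_speech(thread) / len(thread)
print(f"{share_of_disagreement:.0%} of replies are counter speech")
```

A measure like this could be tracked over the lifetime of a thread to test whether a rising share of disagreeing voices precedes a slowdown in resharing, which is the kind of question the project's qualitative and computational analyses addressed together.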
4) The value of education and engagement
When we ask respondents to tell us what they feel are appropriate ways forward for the responsible governance of social media, they frequently emphasise the idea of communities working together and the value of fostering responsibility on social media through education.
Recognising the value of education, we have focused much of our project impact activities on engaging with and providing resources for educators and young people. We have run two youth panel competitions and produced two sets of educational materials for secondary schools – focusing on e-safety and digital citizenship. We have also co-produced two video animations - #TakeCareOfYourDigitalSelf and Keeping Social Media Social.
Further questions and next steps
The next step for our project is the development of a ‘toolkit’ that will help users to navigate through social media policy and practice. This will be available online and more information will be posted here when it is ready.
We are also continuing to look into certain methodological questions raised by the project. These questions challenge us to consider how we can analyse social media content in ways that are both accurate and ethical.
The contributions of our project will continue through a number of activities. These include an impact acceleration project being led by team member Rob Procter at the University of Warwick and Marina Jirotka’s work as specialist advisor to the House of Lords Select Committee on Communications for the Inquiry into Children and the Internet.
For more information about our project activities and findings, please contact email@example.com
Team members Adam Edwards, William Housley and Helena Webb also took part in a panel session on Social Media and Social Futures. This discussed the theories, concepts, methods and ethical research practices we can draw on to understand social media and its transformative capacities in modern social life.
The judging panel members were:
Katherine Fletcher – Co-ordinator of Cyber Security Oxford, University of Oxford (Chair of meeting)
Giles Lane – Artist, Designer and Researcher, Proboscis
Sarah Wilkin – Engagement Officer, University of Oxford
Leslie Haddon – Visiting Lecturer, London School of Economics
Menisha Patel – Research Associate, Human Centred Computing, University of Oxford
Anna Jönsson – Reporting Officer, Kick It Out
Christopher Greatorex – Regional Cyber Protect Coordinator for the South East Regional Organised Crime Cyber Crime Unit
Helena Webb spoke at the Go_Girl Code+Create symposium "How can we challenge inequalities in further and higher education?" to describe the Digital Wildfire project's engagement activities with young people.
In March we focused on the qualitative analysis of social media data and have also launched the second round of our Delphi panel survey. Project leader Marina Jirotka spoke about Digital Wildfire at a meeting of the Partnership for Conflict, Crime and Security Research Initiative and project researcher Helena Webb talked about the qualitative coding of tweets to students at the Oxford Internet Institute.
Feb 9th: Safer Internet Day. The Digital Wildfire project is an official supporter of Safer Internet Day. We contributed to the day by preparing a set of teaching and learning materials about the safe use of social media. These materials promote the themes of digital citizenship and #TakeCareOfYourDigitalSelf. They are available free to secondary schools and youth groups. For more information contact firstname.lastname@example.org
We are making preparations for the second round of the Policy Delphi survey and continuing to analyse samples of provocative and inflammatory posts on social media. We will draw on the results of our survey, social media data analysis and fieldwork on governance practices to produce guidance that will assist different users to navigate through social media policy and practice.
We continue to engage with schools to identify their concerns over the vulnerability of young people on social media and the ways in which digital maturity and resilience among young people can be fostered. As part of this we are producing a short video animation called #TakeCareOfYourDigitalSelf. This animation, plus an accompanying set of teaching and learning materials, will be made available to secondary schools in the new year. We have also recently run a youth panel in which over-16s at a small number of schools and youth organisations were invited to submit pieces of work addressing the topic 'What makes a good digital citizen on social media?' We received a very impressive set of submissions in the form of essays, personal narratives, poems, artwork and videos. These submissions highlight the continuous presence of social media in the lives of young people and the benefits and harms it brings them. The youth panel submissions will be displayed on this website in early 2016, and the young people who produced the five best entries will attend our workshop on January 12th to receive a certificate and a small prize.
Meredydd Williams has presented the findings of our work on 'positive wildfires' and humanitarian campaigns at the Social Media, Activism and Organisations symposium held at Goldsmiths, University of London. Helena Webb visited the young women taking part in the Go_Girl initiative at the Department of Education, University of Oxford to talk about social media and digital citizenship. You can see their blog post about the visit here. Helena also conducted a seminar on social media and emerging ethical and regulatory controversies as part of the postgraduate Innovation and Society module at the University of Nottingham. Project leader Marina Jirotka spoke as part of a roundtable discussion at an Alan Turing Institute workshop on The Ethics of Data Science.
We are analysing responses to round 1 of our Policy Delphi survey. We are also continuing to look in detail at the properties of 'provocative' posts on social media and the responses they receive. We are asking: What are the characteristics of provocative posts?' 'In what different ways do users reply to provocative posts?' and 'Can we identify forms of user self-regulation in which social media users employ counter speech etc. to challenge or slow the spread of hate speech and rumour?'
Examples of provocative posts include messages posted by celebrity Katie Hopkins. Responses to her posts include disagreements with her opinions as well as challenges to – or attacks on – her personal credibility. We also find examples of provocative posts and responses at sentinel sites such as 'Yes you're homophobic', which retweet offensive posts in order to educate, mock or shame the original poster.
October 2015 This month we are continuing to solicit the informed opinion of different experts on the appropriate regulation of social media through our Delphi panel. We are also looking in depth at examples of provocative content on social media and asking 'What are the characteristics of provocative posts?' 'In what different ways do users reply to provocative posts?' and 'Can we identify forms of user self-regulation in which social media users employ counter speech etc. to challenge or slow the spread of hate speech and rumour?'
We are continuing to work on workpackages 2 and 4, which explore how provocative content such as hate speech and rumour spreads on social media and how different stakeholder groups respond to it. As part of this we are examining examples of provocative content on social media. We are also conducting interviews with stakeholders from various organisations - anti-harassment groups, schools etc. - to find out how they deal with the challenges presented by the posting of provocative content and their views on the regulation of social media.
On July 28th the University of Nottingham's Citizen-centric approaches to Social Media Analysis (CaSMa) research group visited the Human Centred Computing group workshop in Oxford. The CaSMa team are conducting a variety of projects that address the ethical challenges raised by the use of social media data in research. These projects also seek to design tools and services that enable users to have more control of their personal data.
The conference season is underway so we have also been busy talking about our work to different academic audiences. Project members William Housley and Adam Edwards presented on "Social media and civil society: participation, regulation and governance" at the 2015 Conference of the Wales Institute of Social and Economic Research, Data and Methods (WISERD), held in Cardiff. At the WebSci15 conference in Oxford, project leader Marina Jirotka took part in a panel discussion on ethical web science and we presented a poster summarising our project.
We are making preparations for data collection in workpackages 2 and 4, which explore the spread of provocative content on social media and the ways in which different stakeholder groups respond to it.
Our project artist in residence, Barbara Gorayska, has been working on a painting on the subject of digital wildfires. Since 2007 Barbara has been exploring meanings and issues that can be communicated through the poetry of painting – with a particular interest in the interface between science and visual art. Her work is informed by her professional background in Semantics and Pragmatics of Natural Language and her academic work on Artificial Intelligence and Human-Centred Computing. She spent most of her university career investigating how people communicate and fabricate or adapt to their environments, in particular how they decide what is relevant and how their use of tools impacts on that process. The paintings inspired by digital wildfires will cover the themes of "cyber haven" and "metamorphosis". They will symbolically reference past art related to wisdom, truth and justice.
April 20th 2015 project Research Associate Helena Webb talked about Digital Wildfire at the University of Oxford Women in Computer Science Day.
January and February 2015
January 22nd 2015: Helena Webb and Marina Jirotka discussed the Digital Wildfire project with members of the Human Centred Computing group at the University of Oxford.
November and December 2014
We have begun the project by making arrangements for our various workpackages and recruiting members to our Steering Committee. On December 5th we held a Kick Off meeting to set out plans for the next 18 months. This meeting was attended by the project team and members of the Steering Committee.