Workshop 2:

Trust and Technology

Reintroducing and revisiting trust 

As a team, we explored the definition of trustworthy autonomous systems (TAS) together. One of our student co-researchers explained that he knew someone who worked at Curry’s (a high street technology store) and that they sold washing machines (an autonomous system). Another student described a defibrillator as an autonomous system.


Following on from definitions and examples of what TAS might constitute, we explored the role of autonomous systems in decision making. We posed the question, ‘Would we let robots make decisions for us?’ We talked about examples such as cooking our tea.


We explored different examples, such as getting a robot to do your hair or your makeup, and how seeing a robot do makeup on friends, and the robot knowing what you like or what suits you, is important to being able to use and trust it. These examples raised the importance of robots knowing what’s important to us as individuals, what matters to us and what we like.


Our session revisited a key contribution from one of our co-researchers in workshop one: exploring the concept of trust through the example of animals. In workshop one, our co-researchers documented that you could trust a dog more than a cat. We discussed this again in depth, talking about dogs’ ability to follow instructions and the ease of training a dog compared with a cat. We also talked about loyalty and reliability in cats and dogs, how cats still practise commitment to their owners by looking to please them, and how this is interesting in relation to trust and relationships between humans and non-humans.

Beebots, Biscuits and Apples: An Exercise in Autonomous Decision Making

On a beechwood table, there is a line of snacks: an orange, a Nice biscuit and a red apple. Below the snacks is the ‘beebot’, with red buttons, blue lights and a bug-like face.

To prompt discussions regarding decision making and choice when it comes to robots, we introduced ‘beebots’. 

Returning to the definition of TAS, with its emphasis on technology assisting us or acting on our behalf and its potential for decision making, we posed the question, ‘Would we let the robot choose food for us?’

We set out an apple, an orange and a Nice biscuit on each table and, in teams, programmed the beebots together to try and reach our snack of choice.

Sometimes the beebot fell off the table, and some students became frustrated by how much instruction and programming the beebot needed. Nevertheless, each of the students successfully programmed the beebots and reached their choice of biscuit, orange or apple. Through the programming, we invited the young people to consider ideas of trust and autonomy through the experience of active learning and co-design.
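For readers unfamiliar with beebots, the minimal sketch below illustrates the style of programming involved: a queued sequence of button presses (forward, back, left, right) that the robot executes all at once when ‘go’ is pressed. The grid layout, class name and snack position are illustrative assumptions, not details recorded from our workshop.

```python
# Minimal sketch of beebot-style programming: button presses are queued
# and then executed in one go. Grid cells, the class name and the snack
# position are illustrative assumptions, not workshop measurements.

DIRECTIONS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # facing N, E, S, W

class BeebotSketch:
    def __init__(self):
        self.x, self.y, self.heading = 0, 0, 0  # start at origin, facing 'north'
        self.program = []  # queued button presses

    def press(self, *buttons):
        self.program.extend(buttons)  # queue moves before pressing 'go'

    def go(self):
        for button in self.program:
            dx, dy = DIRECTIONS[self.heading]
            if button == "forward":
                self.x, self.y = self.x + dx, self.y + dy
            elif button == "back":
                self.x, self.y = self.x - dx, self.y - dy
            elif button == "left":
                self.heading = (self.heading - 1) % 4
            elif button == "right":
                self.heading = (self.heading + 1) % 4
        self.program = []  # a real beebot keeps its program; cleared here for simplicity
        return self.x, self.y

# Hypothetical layout: the apple sits two cells ahead and one to the right.
bot = BeebotSketch()
bot.press("forward", "forward", "right", "forward")
print(bot.go())  # -> (1, 2): the bot reaches the apple's cell
```

Even this simple sequencing shows why some students grew frustrated: every step and turn must be specified in advance, and one wrong press sends the robot off the table.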

‘Circles of trust’: Concentric circle methods to explore trust in TAS 

To further explore relationships of trust in relation to TAS, we co-developed a method of ‘trust circles’. This drew on concentric circles as an established research method, often used to creatively explore and articulate relationships and connectedness by mapping people and things according to their value and closeness. We felt this method would help us explore conversations on TAS.

A cluster of circles sketched on the ground. The inner circle contains printed pictures of a family, friends and a dog. The circle beyond that contains printed pictures of a teacher, a doctor, a shopkeeper, a security guard, a bus driver and a dentist. Coloured pens and satsumas are holding down the pictures.
A printed image of a robot washing a person's hair, with the factor ‘talk to creator’ sketched onto the image.

Our concentric circles were created with chalk on the school playground. Each of our student co-researchers was involved in creating the circles and assembling our framework. The central (or inner) circle represented those we trusted most, those closest to us. Each circle surrounding and expanding out from the inner circle represented increasing distance in terms of relationships, familiarity and, thus, trust.


Together, we had a series of images representing key figures, people and objects that we could collectively explore to discuss notions of trust. These included visual representations of family, friends and pets, through to ‘strangers’, security guards, robotic arms for dressing and driverless cars.


We each had a selection of images, which we took turns to talk about; students shared their thoughts on the trustworthiness of each and placed it in our trust circle. Friends, family and dogs were right in the centre. A ‘stranger’ was placed right at the edge. Still close to the centre, but further away than family and friends, were bus drivers, teachers and doctors. We then introduced images of driverless cars and assistive robots for dressing; our student co-researchers placed both at the edges of the circle (less trusted). We then asked questions such as ‘Would we trust the assistive robot if it was working with a doctor?’ We agreed we would, and this moved the robot closer to the centre.


We asked what made or broke trust. Examples included evidence of training and the length of time spent seeing people or things in action, or being used by others. With the driverless car, reasons for broken trust included speeding and the brakes failing. Our co-researchers often cited specific lengths of time it would take to trust a technology (e.g. two years for a car).

We introduced the beebot and discussed what it would need to do to move closer to the centre. Students talked about robots being able to talk, about how they are programmed, and about the role of AI and the ability to adapt. We wrote down these criteria as part of the circle. Students also raised critical questions here, such as ‘What if you’re Blind?’. This was referred to in multiple activities, and students unpacked the multiple ways in which robots and technology could sense, communicate and, importantly, adapt. Throughout the day, we discussed how meeting the designers was important to trust and technology, something we developed through the next phase of the project.