6. Developing Counselling Skills through AI Client Simulation

Authors:  Beverley Pickard-Jones, Luke Sanders, Fay Short


Institution: Bangor University, UK



Situation

Remote chatroom therapy has become increasingly popular, providing clients with improved accessibility, anonymity, and the ability to communicate asynchronously with a counsellor. As such, it is vital for trainee counsellors to be confident in delivering therapy through this medium. Students on the MSc Counselling course at Bangor University, UK, undertake core training in remote counselling in the first year of their two-year Master’s degree.

Trainee counsellors require extensive practice before engaging with real clients. However, exposing trainees to a diverse range of client problems can prove challenging, because it depends on the limited availability and willingness of peers to act repeatedly as clients, and because peers may not present the diversity of complex scenarios that trainees will encounter in professional practice. GenAI chatbots therefore offer a realistic and accessible way to practise remote counselling skills by convincingly simulating clients. Trainees were consequently able to access a more comprehensive and varied training experience, freed from the limits previously imposed by peer participation and the range of scenarios available to them.



Task  

Using ChatGPT (GPT-3.5), we trialled a simulated therapy chatroom for trainee counsellors to develop and refine their remote counselling skills. The simulated chatroom offered a controlled environment that we considered “safe” because it did not involve any real humans sharing genuine emotional trauma. The task also gave trainees an opportunity to reflect on their practice and identify areas for improvement and skill development by requesting immediate feedback from ChatGPT on their performance.



Action 

ChatGPT was prompted to simulate a client struggling with a variety of presenting issues and scenarios (see Figure 1 below). The activity offered trainees a highly accessible means of practising their remote counselling skills, exposing them to a wide range of virtual client personalities and scenarios and safely preparing them for real-world remote counselling.

Figure 1 Extract from dialogue between counselling trainee and ChatGPT in client role
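
In the trial itself, trainees simply pasted a role-play brief into the standard ChatGPT interface and chatted with the simulated client. For readers who would prefer to script the same exercise, the minimal Python sketch below illustrates one way of doing so against the OpenAI API; the model name, prompt wording, and presenting issue are illustrative assumptions rather than the exact materials used with our trainees.

```python
# Minimal sketch of a scripted client simulation (illustrative only; the prompt
# wording and scenario are assumptions, not the exact trial materials).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# System prompt casting the model as a remote counselling client.
CLIENT_ROLE = (
    "You are role-playing a client in a text-based counselling chatroom. "
    "Present as someone experiencing work-related stress and low mood. "
    "Reply naturally and briefly, stay in character, and do not give advice. "
    "If the counsellor asks you to step out of role, give constructive feedback "
    "on their use of core skills such as reflection and open questioning."
)

history = [{"role": "system", "content": CLIENT_ROLE}]

print("Type your counselling responses (Ctrl+C to stop).")
while True:
    trainee_turn = input("Trainee: ")
    history.append({"role": "user", "content": trainee_turn})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model; the trial used ChatGPT (GPT-3.5)
        messages=history,
    )
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    print(f"Client: {reply}")
```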

We prioritised an inclusive and easily scalable solution. GenAI chatbots can accommodate many users concurrently, so the activity scales readily to larger cohorts, and because ChatGPT is accessible from any location and in many languages, trainees who wished to practise in a language other than English, who faced mobility challenges, or who were managing other commitments could be better accommodated and included.

The concept can also transfer to different disciplines. Law students, for instance, could use simulated clients to improve their legal interviewing and advocacy skills. Educators might use simulations to practise navigating complex conversations with students, parents or guardians. Law enforcement personnel could develop de-escalation, interviewing, and crisis management techniques through simulated scenarios. 


Results

The simulated counselling sessions allowed trainees to experiment with the practical application of theory in a safe environment. The activity also mitigated the anxiety associated with practising new skills on real clients, while allowing for a more focused learning experience than was previously possible.

This activity simulated a broad range of client scenarios and personalities, offering students a breadth of experience that would be difficult to replicate in traditional training settings. This diversity in practice scenarios enriched the learning experience by exposing students to various aspects of counselling, including managing challenges such as client resistance and transference. 

We identified potential biases in ChatGPT's language in roleplay scenarios, such as a skew towards Western demographics that could limit trainees' exposure to diverse client backgrounds unless the model was specifically prompted otherwise. Trainees also noted that careful prompting strategies were required to generate more realistic client simulations, acknowledging that ChatGPT’s tendency to respond positively might not always reflect real-life scenarios. Introducing periodic human oversight, for example by submitting chat logs for peer or tutor review, might mitigate the associated risks.

To further improve both scalability and the range of scenarios a trainee might encounter, the activity could be enhanced with a custom bot designed to generate variability in scenarios beyond the user’s own experience and imagination, and to provide more explicit scaffolding of skills aligned with the learner's level of competence, accommodating trainees at various points in their training.
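
A custom bot of this kind could, for instance, pair a bank of presenting issues with behavioural instructions matched to the trainee's level of competence. The short Python sketch below is purely indicative: the scenario bank, competence levels, and prompt wording are hypothetical illustrations rather than a specification of the proposed bot.

```python
# Hypothetical sketch of level-based scaffolding for a custom client-simulation
# bot. The scenarios, levels, and wording are illustrative assumptions.
import random

SCENARIOS = [
    "bereavement following the loss of a parent",
    "workplace bullying and a loss of confidence",
    "relationship breakdown and social isolation",
    "health anxiety after a recent diagnosis",
]

SCAFFOLDING = {
    "beginner": "Be open and cooperative, responding well to basic reflections.",
    "intermediate": ("Occasionally go quiet or change the subject, so the trainee "
                     "must practise open questions and summarising."),
    "advanced": ("Show resistance and mild transference, challenging the trainee's "
                 "reflections before gradually engaging."),
}

def build_client_prompt(level, seed=None):
    """Pair a randomly chosen scenario with behavioural scaffolding
    appropriate to the trainee's stated level of competence."""
    scenario = random.Random(seed).choice(SCENARIOS)
    return (
        f"Role-play a counselling client presenting with {scenario}. "
        f"{SCAFFOLDING[level]} Stay in character unless asked for feedback."
    )

print(build_client_prompt("intermediate"))
```

In practice, the generated prompt would serve as the simulated client's opening instruction, with tutors curating the scenario bank to counter the demographic biases noted above.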


Stakeholder Commentary 

Feedback from trainees highlighted the potential for this activity to enhance counselling competencies. They appreciated ChatGPT’s 24/7 accessibility, which gave them the flexibility to fit the training around their schedules, and its ability to offer developmental and positive feedback during and at the end of their sessions.

Educators noted two particular benefits. Firstly, the activity was offered at no additional cost to trainees or the institution. Secondly, trainees’ ability to run repeated sessions with the simulated client, and to adapt the parameters to match their skill level as they progress through their training, could facilitate more effective and personalised scaffolding of skill development.

We concluded that the activity was a valuable addition to the training package, offering more variety and flexibility than was previously possible, while still allowing for the human oversight needed to ensure that students are prepared to work with real clients when they graduate.

Author biographies

Beverley Pickard-Jones

Beverley Pickard-Jones teaches psychology at Bangor University and takes great pride in developing students into independent thinkers and effective communicators. She researches AI-driven methods to enhance active engagement and information processing.


Fay Short 

Professor Fay Short is APVC for Employability at Bangor University and Course Director for the MSc in Counselling in the School of Psychology and Sport Science.


Luke Sanders 

Luke Sanders is a postgraduate student and trainee counsellor at Bangor University. He is currently studying for an MSc in Counselling and is passionate about how AI intersects with wellbeing and learning initiatives.