Plenary 1
As educators we expect our postgraduate learners to be engaged, motivated and driven to succeed. In reality, we see a range of behaviours and complex factors influencing their attitudes towards their studies. Are we inhibiting their progress by having unrealistic expectations of their potential? What can we do to support them and still maintain a robust healthcare training programme? Physician Associate educators from St George’s and the University of Birmingham have come together to review these questions and consider how to move away from unhelpful preconceptions of our postgraduate students.
aperrott@sgul.ac.uk
Professional behaviour is something we expect PA students to understand and demonstrate in preparation for professional practice. Within the university system there is a process for managing professional behaviour, but this normally operates at institutional level and does not always make provision for students with low-level professionalism issues. At programme level, whilst students are taught about professional expectations, there is a lack of formal assessment despite the importance placed on it. This session proposes a fair, supportive and transparent process to formally recognise, record and manage students with low-level professionalism concerns at programme level, building the evidence to escalate where necessary.
Jean.watkins@swansea.ac.uk
Objectives: To identify the presence and key triggers of stress amongst Physician Associates,
including possible ongoing consequences of stress. Additionally, to determine if there are
sufficient support services available to deal with the consequences of stress within this
population of clinicians.
Methodology: An online survey containing a combination of qualitative and quantitative questions was distributed via social media. Forty-five qualified Physician Associates working within the NHS responded, following which parametric statistical and inductive thematic analyses were conducted.
Results: Physician Associates do experience stress, due to triggers that are NHS-wide as well as others that are PA-specific, commonly relating to a lack of knowledge of their role and capabilities. This stress most commonly has a negative impact on performance at work and on the individual’s mental wellbeing; as a result, PAs find themselves in need of support that is not currently adequately available or accessible to them.
Conclusion: Clinicians should be educated regarding the remit of the Physician Associate role, to avoid ineffective communication and division of responsibility arising from misinformation. Furthermore, Physician Associates do not currently have access to adequate support, a gap that must be addressed to ensure their continued professional and personal wellbeing.
Rebecca.gray@outlook.com
Breakout Workshops
Embedding multi-disciplinary simulation learning into PA education
Clinical simulation as a learning tool can bridge the gap from theory to practice. From an educator’s
perspective, it allows the teaching and assessment of complex clinical situations in a controlled
environment with no direct risks to patients.
In 2018, Health Education England (HEE) promised a well-trained multi-professional workforce through simulation-based education. Nevertheless, guidelines and standards are non-specific regarding which types of simulation are permissible. Interdisciplinary or multidisciplinary collaboration can foster teamwork, improve communication and promote positive attitudes, but evidence specifically involving Physician Associate (PA) students is scarce.
We have developed and delivered several multi-disciplinary simulation sessions, which have been incorporated into our curriculum. Through student evaluation and reflection, we have revised and updated our sessions.
In this workshop, we will share our experiences and evaluate multi-disciplinary simulation. We aim to equip educators with knowledge and techniques that can be incorporated into their own teaching, underpinned by evidence and educational theory.
Rameez Tariq - R.Tariq@bolton.ac.uk
Michael Hannides - M.Hannides@bolton.ac.uk
Michelle Powell - Michelle.Powell@bolton.ac.uk
Reflection is essential to healthcare professionals’ development, and those who embed regular reflection into their practice are known to be safer practitioners (1). The GMC advocates that group reflection helps to identify complex issues and effect change across systems (2). In line with this, the St George’s, University of London physician associate (PA) programme incorporates reflective practice into the curriculum and utilises the ‘reflecting teams’ model (3). The model also has a role in supporting educators and can be employed on faculty training days.
During this interactive workshop attendees will explore the ‘reflecting teams’ model using the Gibbs cycle. Participants will be encouraged to share some of their own challenging scenarios to appreciate the value of the ‘reflecting teams’ model first hand. Guidance will also be given on the practicalities of implementing reflective practice in PA curricula and on staff training days. For those interested, there will be an opportunity to collaborate with SGUL in setting up and running ‘reflecting teams’.
REFERENCES
1. Academy of Medical Royal Colleges/COPMeD. Reflective practice toolkit. 2018. https://www.aomrc.org.uk/wp-content/uploads/2018/09/Reflective_Practice_Toolkit_AoMRC_CoPMED_0818.pdf
2. General Medical Council. The reflective practitioner guidance (gmc-uk.org). Developed by the Academy of Medical Royal Colleges, the UK Conference of Postgraduate Medical Deans, the General Medical Council and the Medical Schools Council.
3. Launer J. Clinical case discussion: using a reflecting team. Postgraduate Medical Journal. 2016;92(1086):245–246.
mvijikan@sgul.ac.uk
Single Best Answer (SBA) questions are used in the Physician Associate National Examination (PANE) and by UK PA programmes. Blueprinting is a powerful tool to help SBA item writers ensure they cover a breadth of topics within a single exam. With the large number of items needing to be developed each year, this workshop will also help participants learn how to write multiple items in a timely manner.
This workshop is designed to support those involved in exam blueprinting and with an interest in
SBA development. The session will start with a presentation covering both topics. Participants will
receive top tips on how to develop high quality items in a time-effective way.
During this interactive and collaborative session, attendees will work in small groups to develop an
example SBA blueprint for a 100-question exam, based on the new GMC PARA Content Map.
Participants will then work together to write multiple SBAs from a single stem.
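For readers who would like a concrete feel for the arithmetic behind blueprinting, the short Python sketch below allocates a 100-question paper across content domains in proportion to their weightings. The domain names and weights are hypothetical placeholders, not the actual GMC PARA Content Map figures, and largest-remainder rounding is just one reasonable way of making the counts sum to 100.

# Minimal sketch of proportional blueprint allocation (illustrative only):
# the domain names and weights below are hypothetical placeholders, not the
# actual GMC PARA Content Map weightings.

def allocate_items(weights: dict[str, float], total_items: int = 100) -> dict[str, int]:
    """Allocate exam items across domains in proportion to their weights,
    using largest-remainder rounding so counts sum exactly to total_items."""
    total_weight = sum(weights.values())
    exact = {d: total_items * w / total_weight for d, w in weights.items()}
    allocation = {d: int(x) for d, x in exact.items()}      # floor of each share
    shortfall = total_items - sum(allocation.values())      # items still unassigned
    # Give the remaining items to the domains with the largest fractional remainders.
    for d in sorted(exact, key=lambda dom: exact[dom] - allocation[dom], reverse=True)[:shortfall]:
        allocation[d] += 1
    return allocation

if __name__ == "__main__":
    hypothetical_weights = {
        "Cardiovascular": 15, "Respiratory": 12, "Gastrointestinal": 10,
        "Mental health": 8, "Musculoskeletal": 8, "All other domains": 47,
    }
    print(allocate_items(hypothetical_weights, total_items=100))

In practice the allocation would be checked against the blueprint's second dimension (for example skill or setting), but the proportional step above is the core calculation workshop participants would be performing by hand.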
1. Dr Nicola Dearnley
Brighton & Sussex Medical School
n.dearnley@bsms.ac.uk
2. Mrs Karen Roberts PA-C/R
Brighton & Sussex Medical School
k.roberts@bsms.ac.uk
Plenary 2
St George’s, University of London, runs six objective structured clinical examinations (OSCEs) over the two-year Master’s in Physician Associate Studies. In year one, the OSCEs are intended to test students’ clinical skills. In year two, there is an increased focus on assessing students’ ability to correctly detect, diagnose and manage patients with abnormal clinical signs. As such, in first-year OSCEs actors are used for examination stations, but in second-year OSCEs real patients with clinical signs have traditionally been used. In March 2020, the COVID-19 pandemic rendered many patients vulnerable and necessitated the running of OSCEs without any real patients. Instead, actors were used for all stations, and clinical photographs were developed to supplement second-year examination stations. Students were required to examine a clinically normal actor for seven minutes and, with one minute remaining, were shown a photograph by their examiner and asked for a differential diagnosis or management plan. Clinical photographs were sourced from open-access websites with consideration of copyright. All modified OSCE stations were re-trialled, re-Angoffed and re-weighted (domain-based marking). Analysis of data from the 2018/19 summative examinations (real patients) compared with 2020/21 (actors and clinical photographs) showed comparable pass rates for stations examining the same system. The advantages and disadvantages of using clinical photographs were considered. Ultimately, real patients remain the gold standard for examination stations. However, clinical photographs were effective in allowing assessment of second-year students’ clinical application of knowledge and will remain a valuable tool within OSCEs.
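As a generic illustration of the Angoff element of this standard-setting process, the Python sketch below averages judges’ estimates of the probability that a borderline candidate would achieve each checklist item and sums them to give a station cut score. The numbers are invented for illustration; this is not St George’s actual standard-setting data or procedure.

# Illustrative Angoff-style cut-score calculation with invented numbers;
# not the actual St George's standard-setting data or procedure.

from statistics import mean

# judge_estimates[j][i] = judge j's estimate of the probability that a
# borderline ("just passing") candidate achieves checklist item i.
judge_estimates = [
    [0.8, 0.6, 0.7, 0.5, 0.9],   # judge 1
    [0.7, 0.5, 0.6, 0.6, 0.8],   # judge 2
    [0.9, 0.6, 0.8, 0.4, 0.9],   # judge 3
]

# Average across judges for each item, then sum to get the expected score of a
# borderline candidate -- used here as the station's pass mark (cut score).
items = list(zip(*judge_estimates))
item_means = [mean(est) for est in items]
cut_score = sum(item_means)

max_score = len(item_means)  # one mark per item in this toy example
print(f"Cut score: {cut_score:.2f} / {max_score} ({cut_score / max_score:.0%})")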
lwolff@sgul.ac.uk, pguppy@sgul.ac.uk
Simulation is an essential part of all healthcare courses, enabling students to practise clinical skills and procedures in a safe environment without risk to patients. It is also a valuable tool for continuing professional development for qualified healthcare practitioners. However, simulation can be problematic. It requires clinicians to have access to a simulation laboratory, highly trained staff to devise and run simulation sessions, and equipment that is expensive to purchase and maintain. Furthermore, the equipment and content of simulation sessions can be of very variable quality. Virtual reality (VR) is a potential solution to these problems as it can be accessed anywhere using relatively cheap and readily available computer equipment and headsets, requires minimal staffing resources, and standardises the quality of content. Software to support education for healthcare is becoming more available. In Autumn 2022, Bournemouth University embarked on a pilot trial, in association with Oxford Medical Simulation, to introduce and evaluate the use of virtual reality for Physician Associate (PA) education. This presentation showcases our evaluation of the first phase of this project, exploring the feasibility of using VR with both PA students in a university setting and newly qualified PAs in a hospital setting, and will include a VR video-simulation of a session run for our PA students.
csimon@bournemouth.ac.uk
The recent release of ChatGPT and other large language models (LLMs) has created concern for
educators across all disciplines, including healthcare. The technology has the ability to form coherent
‘human-like’ sentences and paragraphs: a feature which has caught the world’s attention. This
technology’s capabilities are likely to grow rapidly. As educators, we find ourselves on the precipice of an advanced technological information age, which both raises numerous concerns about how students are assessed and challenges the traditional academic establishment.
Moreover, the question of how reliable our assessments will be in the future is both timely and critical, as the capabilities of Artificial Intelligence (AI) are advancing rapidly, leaving in their wake an expanding void between the technology and the underpinning governance required to protect assessment processes and students alike. We can protect our knowledge-based assessments by offering secure in-person delivery; however, this knowledge is often learned by rote and such assessments do not test students’ ability to think critically (1). Traditionally, critical analysis is tested via research and
written submissions; yet the authenticity of this work could be compromised by AI, given the
remarkable capability of LLMs to retrieve and assimilate relevant information and use this to
construct essay-worthy prose practically in real time. Freely available online LLM platforms have
demonstrated that they are able to perform the task of critical analysis in a few seconds, providing
everything the student needs for their assignment while simultaneously removing the need for them
to apply logic and reasoning. This technology is developing at an exponential pace, rendering this academic enquiry out of date even at the time of writing, and it has prompted AI leaders globally to call for a temporary halt to further development. There are, however, limitations to the technology: there is some uncertainty regarding the accuracy of chatbots’ responses, as text is often generated without citations, or imitation citations are provided which, on closer inspection, point to no legitimate source or information. Despite this concern, the incidence of inaccuracies will diminish as the technology is more widely accessed and further advances in the AI algorithms are made. A further argument in support of LLMs is that the output generated can only be as good as the initial input question. Regardless, the assessment of critical analysis needs to evolve in order to maintain the integrity of healthcare qualifications and to ensure we have fully developed, well-rounded clinicians in the future. The aim here is to explore LLMs in more detail and to suggest what this evolution could look like and what it implies for educators.
References:
1. Douthit NT, Norcini J, Mazuz K, Alkan M, Feuerstein T, Clarfield MA, et al. Assessment of global health education: the role of multiple-choice questions. Front Public Health [Internet]. 2021 [cited 2023 Apr 20];9:640204. doi: 10.3389/fpubh.2021.640204
a.ryder@bham.ac.uk
Plenary 3
In this session the Research and Professional Development Subgroups will give short 10-minute presentations on the role of the subgroups and take ideas from the floor about what educators would like the subgroups to do and support going forward.
A general overview of where the GMC have got to with the regulation of PAs, together with some early insights from their baselining quality assurance work.
Closing Talk
As PA educators, our focus tends to be on delivering the necessary knowledge and skills to students to enable them to be successful. There appears to be little time for educators to engage in pedagogic scholarship, and a lack of support for PA educators to develop the skills and confidence to write for publication. This session looks at ways to enable and empower PA educators to place equal value on educational pedagogy, and to reconceptualise, develop and improve their academic writing practices and output.