Update on OSCEs and new program of assessments
10 Mar 2023
Trainee news
Communique – March 2023
Key Messages
While OSCEs appeared to be a fit-for-purpose examination, there was widespread disquiet that they were not assessing the core skills of psychiatrists.
The rationale for change is based on the recommendations made by ACER and the regulators’ prompting of specialist medical colleges to adopt contemporary approaches to assessment and reduce reliance on high-stakes summative examinations.
Prof Schuwirth’s evaluation and external review of the AAP provides context and recommendations for future developments, and also clearly summarises the most relevant lessons from the assessment literature in medical education.
We are commencing the development of a new ‘Program of Assessments’ that builds on the strengths of workplace experiential training and practice and is more compatible with a programmatic assessment approach.
The focus is not on a one-time measurement of achievement but on using assessments to foster the learning process, with all assessment information collated and synthesised.
Any future program will be based on sound stakeholder consultation.
Background
Further to the implementation of the Alternative Assessment Pathway (AAP), the College has been holding Stakeholder Forums to discuss the future of assessments, including the OSCEs. The forums comprised various College stakeholder groups, whose contribution the College sought in discussing the delivery models for future assessments. As an outcome of these forums, there was agreement on key assessment principles to guide the College’s transition to a more contemporary assessment methodology.
The College proceeded with the AAP and CCA for 2022-2023 and concluded the delivery of OSCEs, with the announcement made on 16 December 2022.
Objective Structured Clinical Examination (OSCE)
In the communication of 16 December 2022 from the Chair, Education Committee, the College announced that we will not be delivering OSCEs in the future, as they no longer form part of the philosophy of the longer-term assessment strategy. The College is committed to implementing contemporary approaches to assessment in medical education, using multiple measures of trainees’ competencies over time that are aggregated and synthesised to inform judgements about candidate progression.
Some College members have expressed concerns regarding the discontinuation of OSCEs and have asked that the decision to remove this high-stakes assessment be reconsidered. The reasons voiced for the retention of OSCEs include:
- OSCEs are valid high-stakes examinations with external accountability.
- OSCE is an objective examination that is perceived to be fair, robust, and reliable.
- The lengthy preparation for OSCEs provides candidates with an opportunity to practise skills and undertake focused learning, as well as a collegial individual and group learning experience.
While these views are valid, the Australian Council for Educational Research (ACER) in its review of RANZCP examinations (January 2020), highlighted the following findings:
- While OSCEs appeared to be a fit-for-purpose examination, there was widespread disquiet that they were not assessing the core skills of psychiatrists.
- Not all core areas of psychiatric practice are currently assessed by the OSCE.
- The OSCE operates on a compensatory model of assessment whereby a candidate could fail a station that might be regarded as containing a number of core clinical competencies but pass the overall exam.
- Concerns were raised that a large number of stations and the short time available resulted in the assessment of a very narrow spectrum of practice; each station only allows for a cursory assessment of psychiatric practice rather than a deeper probing of the art of psychiatry.
OSCEs as high-stakes assessments
OSCEs are considered high-stakes examinations because they are externally examined and therefore have external accountability. Further, the standardised structure of OSCEs provides a basis for objectivity in the examination.
Increasingly, contemporary approaches in medical education are questioning the value and fairness of one-off high-stakes assessments that purport to allow valid inferences to be made about a candidate’s medical competence.
Rationale for a change
The rationale for change is based on the recommendations made by ACER, the regulators’ prompting of specialist medical colleges to adopt contemporary approaches to assessment and reduce reliance on high-stakes summative examinations, and the College’s own experience with the recently introduced alternative assessment pathways, which supports candidates undertaking summative assessments that align with the workplace environment rather than a one-off high-stakes assessment.
It should also be noted that, while the OSCE has ceased as a summative assessment, other types of summative clinical assessments have been proposed as part of the new ‘Program of Assessments’.
AAP evaluation report
The College engaged Professor Lambert Schuwirth, a medical education and assessment pioneer, to provide an external independent evaluation of the AAP.
The experiences of the AAP and CCA have provided the College with an insight into how workplace-based assessments can be used to make progression judgements for candidates. Professor Schuwirth in his evaluation of the AAP (November 2022) viewed the AAP as a longitudinal, multi-instrument, multi-supervisor assessment pathway that better aligns with the contemporary literature and philosophy on assessment and the nature of competence.
The Australian Medical Council (AMC) has urged specialist medical colleges to consider reducing reliance on centralised high-stakes summative examinations and move towards more workplace-based competency assessments.
The external independent evaluation of the AAP by Professor Schuwirth also provides the basis for a change of future assessments.
Prof Schuwirth’s evaluation and external review of the AAP provides context and recommendations for future developments, and also clearly summarises the most relevant lessons from the assessment literature in medical education.
The paper refers to important concepts such as:
- Competence is domain specific – broad sampling is required for reliable assessment programs; with better sampling strategies higher reliabilities are achieved
- Medical problem-solving is idiosyncratic – experts use individual strategies and problem-solving pathways to reach the same key decision, and assessments need to take this into account
- Validity – whether the assessment actually measures the construct it purports to measure
- Reliability – the extent to which outcomes on the assessment are generalisable
- The role of content for validity – it is not the format of the assessment that determines its validity but the content
- The role of human judgement in assessment – more complex aspects of competence require subjective (but expert) human judgement rather than a measurement approach
- Assessment of learning versus assessment for learning – the focus is not on a one-time measurement of achievement but on using assessment to foster the learning process longitudinally – assessment for learning requires the learner to engage meaningfully with the feedback and be able to demonstrate the necessary improvement.
- Assessment as a program – the underlying principle for programmatic assessment is that assessment information must be collated and synthesised through a meaningful narrative.
The AAP evaluation found that the WBA-ITAs and the Portfolio Review processes are better aligned with modern views in the assessment literature and the nature of competence being acquired in psychiatry training. Prof Schuwirth concluded that the WBA part of the AAP has been a more defensible assessment approach than the OSCE. The AAP evaluation report has raised valid issues in relation to the rigour of the Case-based Discussions that require consideration as we develop the strategy and modelling of the new ‘Program of Assessments’. Despite his concerns about the Case-based Discussion element of the AAP, Prof Schuwirth calculated the false-positive rate of the CbD to be no worse than that of an OSCE.
Logistics of the OSCE
Delivery of an OSCE is an extremely complex logistical exercise, and its sustainability was presenting issues prior to COVID.
Workforce pressures have become a considerable concern, and holding large-scale OSCEs in the current environment would not be feasible given the health service and workforce issues, in the context of:
- the inability of health services to meet the venue and space requirements for an ever-increasing number of candidates,
- the increasing number of examiners and ancillary and support staff required,
- the escalating costs of centrally administered OSCEs, and
- projected increases in trainee intakes in future years.
To continue delivering OSCEs, it would have become necessary to make one of the following adaptations:
- restricting the number of OSCE stations to what was feasible (impacting the reliability of the exam) or
- creating barriers for OSCE enrolments and limiting the number of candidate enrolments (impacting fairness in the accessibility of the exam) or
- introducing a non-compensatory model as recommended by the ACER report (in turn impacting the pass rates).
Such modifications to OSCEs from their original delivery model would have raised the issue of their fitness for purpose.
New ‘Program of Assessments’
The Board agreed that the College will explore opportunities to commence the development of a new ‘Program of Assessments’ that builds on the strengths of workplace experiential training and practice and is more compatible with a programmatic assessment approach.
It is envisaged that the new ‘Program of Assessments’ will incorporate elements of a programmatic approach that integrates both workplace-based assessments and centrally administered summative assessments. This is strongly supported by evidence in the medical education literature highlighting the need for a trainee’s competence to be evaluated on a longitudinal basis, which also assists with early detection of underperformance, monitoring, and remediation.
The transition is consistent with the contemporary and international assessment philosophy and aligns with the AMC’s direction as it urges the medical specialist colleges to move away from centrally administered high-stakes exams.
The integration of summative assessments into the workplace will assist in transitioning the building blocks of the existing program using modern assessment theory, better align training and assessment, improve feedback consistency through multiple points of assessment, and enable greater flexibility and adaptation to any potential disruptions.
It will be important to ensure that the new program comprises assessments that assess the core psychiatry skills around meeting and communicating with real patients, taking a history, conducting a patient examination, making a diagnosis, and formulating a treatment plan - skills essential for a psychiatrist and which were not assessed through the OSCEs in a robust and defensible way. The College continues to discuss the suitable format of such a clinical assessment and is seeking consultation via the stakeholder forums and committee engagement to determine the model that the College can agree upon.
An important element of the new ‘Program of Assessments’ will be to consider assessments in a holistic manner. A multitude of assessments should be used in making progression decisions rather than relying heavily on a single one-off high-stakes examination. The focus is not on a one-time measurement of achievement but on using assessments to foster the learning process, with all assessment information collated and synthesised.
The central importance of the supervisor and the workplace-based apprenticeship model will require the development of resources and supports for supervisors and directors of training. The College has commenced work on these resources and supports, informed by the supervisor survey undertaken in 2022. The guiding principles for the new program of assessments will be that its workplace-based assessments map as closely as possible to the current workplace-based assessments, and that digital and internet technologies do not increase, but wherever possible reduce, the burden of assessment documentation while supporting feedback for learning and development.
We are currently in the early stages of the consultation process to determine the new ‘Program of Assessments’. The consultation commenced with a stakeholder meeting held via Zoom on 3 February 2023 to discuss possible options, and we will continue to seek views on the development of an agreed model for the new ‘Program of Assessments’.
While there were representatives from all groups at the Stakeholder meeting, the group of stakeholders will be further broadened to include representatives from:
- Trainees/SIMGs/candidates
- Supervisors
- Directors of Training
- Education Committee members
- Committee for Training (including its various subcommittee members)
- Committee for Examinations (including its various subcommittee members)
- Committee for SIMGs
- Community members
Genuine concerns and issues were highlighted at the February 2023 Stakeholder Forum meeting and other relevant committee meetings regarding the role of high-stakes exams, supervisor workload, the burden of assessments (for both trainees and supervisors), the feasibility and sustainability of any option considered, the quality and types of assessments to be considered, and the communication strategy. These are important issues that will need to be discussed and agreed upon to enable progress on the substantive strategy for the long-term Program of Assessments.
It will be important for the membership to support the proposed direction and transformation in assessment before the future model of the Program of Assessments is finalised and an implementation plan is developed.
The College acknowledges that the burden and complexity of training and workplace-based assessment for supervisors is currently very high, and that any changes to assessments must be genuinely feasible and sustainable. The College is embarking upon a Supervisor Program – a project to develop supports, training, and calibration for supervisors to reduce unworkable pressure on them and to equip them with the skills necessary to continue in their evolving roles as some assessment moves into the workplace environment.
We value your contribution and input into the design, development, and delivery of the future program of assessments. We understand that we are going through a significant transformation of processes, assessment philosophy, and mindsets as we embark upon new challenges in medical education to address the needs of our candidates, the competencies of future psychiatrists, workforce challenges, and community requirements.
Dr Nick O’Connor
Chair, Education Committee