References

Sandler PJ. It beggars belief. Orthod Update. 2022; 15
van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Med Educ. 2005; 39:309-317 https://doi.org/10.1111/j.1365-2929.2005.02094.x
Norcini J, Anderson B, Bollela V et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011; 33:206-214 https://doi.org/10.3109/0142159X.2011.551559
RCSEd. Important announcement of changes to Part B of the membership examination in the specialty of orthodontics (M Orth RCSEd). 2022. http://www.rcsed.ac.uk/view?id=ca824300-99bf-40f1-98db-585eed64d21f&type=exam (accessed May 2022)
Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990; 65:S63-67 https://doi.org/10.1097/00001888-199009000-00045
ISCP. The new surgical curriculum for August 2021. 2021. https://iscp.ac.uk/iscp/curriculum_2021/ (accessed May 2022)
Health Education England. Annual review of competency progression – England. https://specialtytraining.hee.nhs.uk/ARCP (accessed May 2022)
NHS England. Guides for commissioning dental specialties – orthodontics. http://www.england.nhs.uk/commissioning/wp-content/uploads/sites/12/2015/09/guid-comms-orthodontics.pdf (accessed May 2022)
GDC. Curriculum and specialist training. Programme in orthodontics. 2010. http://www.gdc-uk.org/docs/default-source/specialist-lists/orthodonticcurriculum.pdf?sfvrsn=76eecfed_2 (accessed May 2022)

Extraordinary Guest Editorial

From Volume 15, Issue 3, July 2022 | Pages 111-112

Authors

Jonathan Sandler

BDS (Hons), MSc, PhD, FDS RCPS, MOrth RCS, DOrth RCS

Consultant Orthodontist, Chesterfield Royal Hospital, Chesterfield, UK


Philip D Taylor

BDS, MGDS, MSc, MRD, FDS, FHEA, FDTFEd

Dean of the Faculty of Dental Surgery, Royal College of Surgeons of Edinburgh, Scotland



Jonathan Sandler

I penned an editorial in the previous issue of Orthodontic Update entitled ‘It beggars belief’, describing the unfathomable decision by RCS Edinburgh to remove treated cases from their version of the MOrth examination.1 Quite some email ‘chatter’ then followed. We have therefore decided to take the unusual step of printing an ‘Extraordinary Guest Editorial’ giving RCS Edinburgh's ‘justification’ of their monumental change, alongside a sample of the letters I have received on the subject. I will let you, the readership, be the judge of the wisdom of their approach.

Extraordinary Guest Editorial

The recent editorial opinion in Orthodontic Update argued that removing treated cases from the Royal College of Surgeons of Edinburgh (RCSEd) MOrth examination will be detrimental to the overall quality of orthodontic training.1 RCSEd contends that the assessment of treated cases is more appropriate as part of the longitudinal review of progression through the Annual Review of Competency Progression (ARCP) process, and that this change aligns the examination with modern educational theory. We argue that the changes to the assessment format make for a fairer and more robust exam.

Common criteria for evaluating assessment utility include validity, reliability, feasibility, educational impact, and acceptability, the first three being most critical for high-stakes summative assessments such as MOrth.2,3 These criteria were more difficult to meet with the old-style examination that used self-selected cases.

Self-selected cases weakened the validity of the assessment, as there was no surety about how much input tutors had into the treatment. Experience shows that weaker candidates tend to present cases predominantly managed by their clinical tutor, with a lack of clarity on the tutor's contribution to the write-up. RCSEd therefore cannot be assured how much of a presented case reflects a candidate's own understanding, or their ability to carry out such work independently and within a reasonable timeframe.

The reliability criterion relates to the accuracy and reproducibility of scores. This too could be compromised by the old-style exam. With self-selected cases, candidates each submit different material with varying levels of complexity, and are then asked different questions relating to their specific cases. This lack of standardization and equivalence can create an ‘uneven playing field’ for candidates, and the small number of cases sampled is also likely to have a negative impact on reliability.

The self-selected case approach can have a negative or undesirable educational impact if undue time is spent perfecting the presentation of a small select group of cases.

It should be recognized that the examination is only one part of the longitudinal process of assessment that a modern trainee must undertake. This includes entry of the full range of cases treated during training onto the Intercollegiate Surgical Curriculum Programme4 portal, verified through the ARCP5 process. This is the key area where any contention about case management should rest, not at a single examination point. Such a process allows external validation of the range of cases treated and, assuming it is properly managed, should ensure that they fulfil the requirements set out in the training blueprints, ranging from the NHS England Commissioning Standards6 to the SAC/JCPTD specialty training curricula7 (the latter currently under revision). This element of training is overseen by the four national health education boards, with the blueprint to be covered and assessed set by the GDC.8,9

The self-selected case approach can also raise feasibility issues where a candidate fails this aspect of the examination but has reached the end of their training. The problem then arises of how they can provide other appropriate cases before the retake sitting.

Acceptability is an area of ongoing debate. Some see merit in continuing with self-selected cases and argue that they provide a chance to test technical competence. However, a written report with a structured oral would not be considered an appropriate approach to assessing technical skills: these should be directly observed and sampled sufficiently to generate reliable scores, neither of which the self-selected case approach supports. Again, the evidence points to this area being addressed more appropriately in the workplace throughout training.

The key question is what we have replaced these cases with. Essentially, the treated cases will be replaced by four unseen cases related specifically to orthodontic treatment mechanics, as set out below. The Part B examination will therefore consist of:

  • Four 30-minute structured oral examinations related to diagnosis, treatment planning and patient care;
  • An additional four 30-minute structured oral examinations related to diagnosis, treatment planning and management of orthodontic treatment mechanics, with four sets of patient records;
  • A 1-hour communications examination over four stations, on aspects of communication relating to orthodontic treatment.
This format allows RCSEd to better ensure that syllabus content is covered consistently, and at the appropriate level, across candidates, and that no candidate is disadvantaged. It can be achieved without losing the case-based approach, which can continue as an assessment more robustly conducted in the workplace.

We thank the Editorial Director for this opportunity to highlight the advantages conferred by the new exam format. It is important that all trainees and trainers understand the rationale underpinning their examinations, and the need to employ current best evidence in their design and delivery. This ensures fair outcomes for trainees, the profession and patients and, for the GDC, the provision of safe, competent care.