

Letter

Too hot to handle? Assessing the validity and reliability of the College of Intensive Care Medicine "Hot Case" examination

Jeremy Cohen, Mary Pinder, Daniel Angelico, Frieda Keo

Crit Care Resusc 2022; 24 (1): 93-4

Correspondence: Jeremy.Cohen@health.qld.gov.au

https://doi.org/10.51893/2022.1.L

  • Competing Interests
    All authors declare that they do not have any potential conflict of interest in relation to this manuscript.
  • References
    1. Hoffman KR, Nickson CP, Ryan AT, Lane S. Too hot to handle? Assessing the validity and reliability of the College of Intensive Care Medicine "Hot Case" examination. Crit Care Resusc 2022; 24: 87-92.
TO THE EDITOR: We have read the thoughtful article from Hoffman and colleagues1 with great interest, and agree with much of its content. No assessment process is perfect, and the issues of validity, reliability, transparency and fairness raised by the authors are ones which the College of Intensive Care Medicine of Australia and New Zealand considers to be of great importance. Indeed, the College would be failing in its role as an educational body were its assessment processes not subject to constant review and refinement.

Since its inception more than 40 years ago, the General Fellowship Examination has gone through changes in all its sections to match the current best practices in educational theory and assessment. The changes to the clinical section have included a formalised approach to cases, examiner pairs, examiner calibration and workshop training, candidate reading time, improved feedback, independent marking, and clarity around passing standards.

However, despite changes in how the hot cases have been implemented, the fundamental reasoning behind them has not altered: the belief that the ability to perform a time-critical, accurate, directed clinical assessment of a critically ill patient is a key skill for an intensive care physician. In this respect, we strongly agree with the authors' comments on the educational importance of the hot cases as an assessment for learning as much as an assessment of learning, and it must be acknowledged that this is a key function of the College exam. This is supported by the findings of a study in progress evaluating the lived experience of trainees who made multiple attempts at the exam before achieving success. The study participants report that the skills acquired while preparing for the hot case exam have translated into better performance as consultants, with an enhanced ability to perform focused patient assessments, construct a management plan and communicate their findings to colleagues.

Balancing the necessity for accurate assessment of a trainee's clinical skills against some of the limitations of assessment procedures identified by the authors is not straightforward. Furthermore, the logistic difficulties of running high-stakes examinations in working intensive care units, the increasing number of candidates presenting for examination, and of course the constraints imposed by the coronavirus disease 2019 (COVID-19) pandemic all add to the complexity of the process. As part of its commitment to improvement in this area, the College has recently engaged an external organisation, the Australian Council for Educational Research (ACER), to perform an in-depth review of the full suite of College examinations: the First Part written and oral sections, the Second Part (General) written and oral sections, and the Second Part (Paediatric) written and oral sections. The ACER review has generated a detailed report with recommendations for improvements, which the College is already acting on. These recommendations address four key areas:
  • enhancing examiner capability and ensuring training in issues relevant to medical education assessment;
  • improving exam structure and marking for all College examinations to ensure more robust blueprinting;
  • addressing the quality and quantity of feedback given to all trainees, not just those who were unsuccessful; and
  • ongoing review and refinement of the training program curriculum, including the implementation of elements of programmatic assessment.
We give our assurance, however, that changes will not be implemented without adequate consultation with relevant stakeholders and appropriate notice to trainees, supervisors and examiners.

In summary, the College strives to continuously improve the assessment processes of its training program, subjecting them all to robust internal and external review. Commentaries such as that provided by Hoffman and colleagues are a welcome and useful part of the discussion, and contribute to our shared aim of providing fair, transparent and valid assessments for all College trainees.
 
