Please use this identifier to cite or link to this item: http://hdl.handle.net/11434/1441
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Kelly, D.
dc.contributor.author: McKenzie, Dean
dc.contributor.author: Hanlon, Gabrielle
dc.contributor.author: Mackley, L.
dc.contributor.author: Barrett, Jonathan
dc.date.accessioned: 2018-07-18T03:19:57Z
dc.date.available: 2018-07-18T03:19:57Z
dc.date.issued: 2018-06
dc.identifier.uri: http://hdl.handle.net/11434/1441
dc.description.abstract (en_US):

Background: Deficiencies in doctors' non-technical skills (NTS) contribute to critical incidents and poor patient outcomes, but it is unclear how to assess these skills reliably. We developed a standardised NTS assessment rubric and evaluated its reliability.

Methods: This was a prospective observational study evaluating the inter-rater reliability of an NTS assessment rubric. Intensive Care Registrars and medical students participated in high-fidelity, immersive, in-situ simulated scenarios of medical emergencies. Following a short period of calibration, two Intensive Care Consultants independently viewed the videoed scenarios and scored each scenario leader using the assessment rubric. The primary outcome was the inter-rater reliability of the overall score. Secondary outcomes included the inter-rater reliability of the 5 domains and the 14 individual questions.

Results: 40 scenarios were videoed, including 5 used for consultant calibration. The mean (SD) score was 12.7 (4.0) for rater A vs 13.0 (4.8) for rater B; Lin's concordance coefficient was 0.74 (95% CI 0.60 to 0.89). Inter-rater agreement for the domains and individual questions was assessed using Cohen's kappa. Mean kappas for the domains ranged from 0.36 (fair) to 0.64 (substantial), and kappas for individual questions ranged from 0.15 (slight) to 0.75 (substantial).

Conclusion: The NTS assessment rubric demonstrated good correlation between raters on the overall score. However, agreement on individual questions varied from slight to substantial. Overall, the tool shows promise, but individual questions require further refinement.
dc.subject: Non-Technical Skills (en_US)
dc.subject: NTS (en_US)
dc.subject: Patient Outcomes (en_US)
dc.subject: Critical Incidents (en_US)
dc.subject: NTS Assessment Rubric (en_US)
dc.subject: Reliability (en_US)
dc.subject: Intensive Care Registrars (en_US)
dc.subject: Medical Students (en_US)
dc.subject: Medical Emergencies (en_US)
dc.subject: Critical Care Clinical Institute, Epworth HealthCare, Victoria, Australia (en_US)
dc.title: Evaluation of a tool to assess non-technical skills in ICU. (en_US)
dc.type: Conference Poster (en_US)
dc.type.studyortrial: Prospective Observational Study (en_US)
dc.description.conferencename: Epworth HealthCare Research Week 2018 (en_US)
dc.description.conferencelocation: Epworth Research Institute, Victoria, Australia (en_US)
dc.type.contenttype: Text (en_US)
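
Note on the statistics in the abstract: the overall-score agreement is reported as Lin's concordance correlation coefficient, and the domain- and question-level agreement as Cohen's kappa (the slight/fair/substantial labels match the conventional Landis and Koch benchmarks). As a rough illustration of how these two statistics are computed, here is a minimal Python sketch using numpy and scikit-learn. The study's raw scores are not part of this record, so the data, the sample sizes, and the lins_ccc helper below are entirely hypothetical.

import numpy as np
from sklearn.metrics import cohen_kappa_score

def lins_ccc(x, y):
    # Lin's concordance correlation coefficient for paired continuous scores:
    # ccc = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2)
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    cov = np.mean((x - x.mean()) * (y - y.mean()))  # population covariance
    return 2.0 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical overall scores for the 35 scored scenarios (the 5 scenarios
# used for consultant calibration are excluded, as in the study).
rng = np.random.default_rng(42)
rater_a = rng.integers(5, 21, size=35).astype(float)
rater_b = rater_a + rng.integers(-3, 4, size=35)
print(f"Lin's CCC: {lins_ccc(rater_a, rater_b):.2f}")

# Per-question agreement on categorical ratings uses Cohen's kappa.
question_a = [2, 0, 1, 1, 0, 2, 1, 0, 2, 1]  # hypothetical ratings, one question
question_b = [2, 0, 1, 2, 0, 2, 1, 1, 2, 1]
print(f"Cohen's kappa: {cohen_kappa_score(question_a, question_b):.2f}")

Unlike Pearson's correlation, Lin's coefficient penalises any systematic offset or scale difference between the two raters, which is why it suits inter-rater reliability of a continuous overall score.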
Appears in Collections:
Critical Care
Research Week

Files in This Item:
There are no files associated with this item.

Items in Epworth are protected by copyright, with all rights reserved, unless otherwise indicated.