Objective
To determine the intrarater, interrater, and retest reliability of facial nerve grading of patients with facial palsy (FP) using standardized videos recorded synchronously during a self‐explanatory patient video tutorial.
Study Design
Prospective, observational study.
Methods
The automated videos from 10 patients with varying degrees of FP (5 acute, 5 chronic FP) and videos without tutorial from 8 patients (all chronic FP) were rated by five novices and five experts according to the House‐Brackmann grading system (HB), the Sunnybrook Grading System (SB), and the Facial Nerve Grading System 2.0 (FNGS 2.0).
Results
Intrarater reliability for the three grading systems was very high using the automated videos (intraclass correlation coefficient [ICC]; SB: ICC = 0.967; FNGS 2.0: ICC = 0.931; HB: ICC = 0.931). Interrater reliability was also high (SB: ICC = 0.921; FNGS 2.0: ICC = 0.837; HB: ICC = 0.736), but for HB, Fleiss' kappa (0.214) and Kendall's W (0.231) were low. Interrater reliability did not differ between novices and experts. Retest reliability was very high (SB: novices ICC = 0.979; experts ICC = 0.964; FNGS 2.0: novices ICC = 0.979; experts ICC = 0.969). The reliability of grading of chronic FP with SB was higher using automated videos with tutorial (ICC = 0.845) than without tutorial (ICC = 0.538).
Conclusion
The reliability of grading using the automated videos is excellent, especially for SB grading. We recommend using this automated video tool regularly in routine clinical practice and in clinical studies.
Level of Evidence
4 Laryngoscope, 2018
http://bit.ly/2ScJlsR