Publication

P235 Development and validation of the direct observation of Barrett’s imaging/endotherapy skills (DOBES) assessment tools

McGoran, John
de Caestecker, John
Sweis, Rami
Smart, Howard
Barr, Hugh
Trudgill, Nigel
di Pietro, Massimiliano
Haidry, Rehan
Banks, Matt
Graham, David
et al.
Date
2021-01-21
Journal Title
Gut
Type
Conference Abstract
Abstract
Introduction Endoscopic resection (ER) and radiofrequency ablation (RFA) have become the standard of care worldwide for the treatment of early Barrett’s neoplasia. Procedural outcomes are highly dependent on operator skill and training, yet validated tools for assessing competency in these two procedures are currently lacking. We aimed to develop and validate ER and RFA assessment tools for use in clinical practice.

Methods A working group of 15 experts, each meeting one or more predefined inclusion criteria, was set up. Using published evidence-based criteria, the group devised a structured checklist of graded competency descriptors (scores ranged from 1 = required maximal supervision to 4 = competent). These were grouped into four main competency domains: pre-procedural; specific skills; post-procedural; and endoscopic non-technical skills (ENTS). Consensus agreement and piloting were undertaken to ensure content validity. Construct validity was measured by independent assessment of 60 videos per procedure (ER and RFA) by 7 assessors selected from the working group, in random order. Procedures were performed by 15 operators of variable expertise, including both experts and trainees. Statistical analysis used Generalizability theory, which partitioned the ‘variability components’ between: operators; cases; assessors; assessors across (×) operators; and unexplained variation.

Results Data on a minimum of 45 videos per procedure were available for analysis. The mean (± standard deviation) competency scores were 3.4 (0.8) and 3.7 (0.6) for ER and RFA, respectively. The variability components are detailed in table 1. Variation in scores between operators, between assessors, and between assessors across different operators was small, accounting for <10% of the total variation, suggesting good reliability. Most of the variance was attributable to variation between cases or was unexplained.
Abstract P235 Table 1: Variability components in the assessment of construct validity of the assessment tools, using Generalizability theory.

Conclusions The DOBES assessment tools for ER and RFA appear to have good content and construct validity and were produced on the basis of evidence and expert opinion. The analysis shows agreement on scores between expert assessors, which strengthens the case for their adoption into clinical practice.
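The Generalizability-theory analysis described above can be illustrated with a minimal simulation. This is a sketch only, not the study's analysis: it assumes a simplified fully crossed operator × assessor design with one score per cell, and the "true" variances and the simulated scores are invented for illustration (only the 15 operators, 7 assessors, and mean score of 3.4 echo the abstract).

```python
import numpy as np

# Hedged sketch: variance-component estimation for a fully crossed
# operator x assessor design, in the spirit of Generalizability theory.
# All data here are simulated; the real study also included case-level
# variation, which dominated the observed variance.
rng = np.random.default_rng(0)

n_ops, n_ass = 15, 7                              # study values
true_op, true_ass, true_err = 0.05, 0.02, 0.30    # assumed variances

op_eff = rng.normal(0.0, np.sqrt(true_op), n_ops)
ass_eff = rng.normal(0.0, np.sqrt(true_ass), n_ass)
scores = (3.4 + op_eff[:, None] + ass_eff[None, :]
          + rng.normal(0.0, np.sqrt(true_err), (n_ops, n_ass)))

grand = scores.mean()
row_means = scores.mean(axis=1)   # per-operator means
col_means = scores.mean(axis=0)   # per-assessor means

# Two-way ANOVA mean squares (no replication within cells)
ms_op = n_ass * np.sum((row_means - grand) ** 2) / (n_ops - 1)
ms_ass = n_ops * np.sum((col_means - grand) ** 2) / (n_ass - 1)
resid = scores - row_means[:, None] - col_means[None, :] + grand
ms_res = np.sum(resid ** 2) / ((n_ops - 1) * (n_ass - 1))

# Expected-mean-square solutions for the variance components
var_res = ms_res
var_op = max((ms_op - ms_res) / n_ass, 0.0)
var_ass = max((ms_ass - ms_res) / n_ops, 0.0)

total = var_op + var_ass + var_res
shares = {"operator": var_op, "assessor": var_ass, "residual": var_res}
print({k: round(v / total, 3) for k, v in shares.items()})
```

Small shares for the operator and assessor components, relative to the residual, correspond to the abstract's finding that those sources accounted for <10% of total variation.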
Citation
McGoran, J., de Caestecker, J., Sweis, R., et al. (2021). P235 Development and validation of the direct observation of Barrett’s imaging/endotherapy skills (DOBES) assessment tools. Gut, 70(Suppl 1), A163–A164.