Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS): usability and reliability when used by novice raters

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS) : usability and reliability when used by novice raters. / Prydz, Katrine; Dieckmann, Peter; Fagertun, Hans; Musson, David; Wisborg, Torben.

In: BMC Medical Education, Vol. 23, No. 1, 865, 2023.


Harvard

Prydz, K, Dieckmann, P, Fagertun, H, Musson, D & Wisborg, T 2023, 'Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS): usability and reliability when used by novice raters', BMC Medical Education, vol. 23, no. 1, 865. https://doi.org/10.1186/s12909-023-04837-6

APA

Prydz, K., Dieckmann, P., Fagertun, H., Musson, D., & Wisborg, T. (2023). Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS): usability and reliability when used by novice raters. BMC Medical Education, 23(1), [865]. https://doi.org/10.1186/s12909-023-04837-6

Vancouver

Prydz K, Dieckmann P, Fagertun H, Musson D, Wisborg T. Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS): usability and reliability when used by novice raters. BMC Medical Education. 2023;23(1):865. https://doi.org/10.1186/s12909-023-04837-6

Author

Prydz, Katrine ; Dieckmann, Peter ; Fagertun, Hans ; Musson, David ; Wisborg, Torben. / Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS) : usability and reliability when used by novice raters. In: BMC Medical Education. 2023 ; Vol. 23, No. 1.

Bibtex

@article{5095c99bd79545f7bfecc0fece9630a2,
title = "Collecting evidence of validity for an assessment tool for Norwegian medical students{\textquoteright} non-technical skills (NorMS-NTS): usability and reliability when used by novice raters",
abstract = "Background: The NorMS-NTS tool is an assessment tool for assessing Norwegian medical students{\textquoteright} non-technical skills (NTS). The NorMS-NTS was designed to provide student feedback, training evaluations, and skill-level comparisons among students at different study sites. Rather than requiring extensive rater training, the tool should capably suit the needs of busy doctors as near-peer educators. The aim of this study was to examine the usability and preliminary assess validity of the NorMS-NTS tool when used by novice raters. Methods: This study focused on the usability of the assessment tool and its internal structure. Three raters used the NorMS-NTS tool to individually rate the team leader, a medical student, in 20 video-recorded multi-professional simulation-based team trainings. Based on these ratings, we examined the tools{\textquoteright} internal structure by calculating the intraclass correlation coefficient (ICC) (version 3.1) interrater reliability, internal consistency, and observability. After the rating process was completed, the raters answered a questionnaire about the tool{\textquoteright}s usability. Results: The ICC agreement and the sum of the overall global scores for all raters were fair: ICC (3,1) = 0.53. The correlation coefficients for the pooled raters were in the range of 0.77–0.91. Cronbach{\textquoteright}s alpha for elements, categories and global score were mostly above 0.90. The observability was high (95%-100%). All the raters found the tool easy to use, none of the elements were redundant, and the written instructions were helpful. The raters also found the tool easier to use once they had acclimated to it. All the raters stated that they could use the tool for both training and teaching. Conclusions: The observed ICC agreement was 0.08 below the suggested ICC level for formative assessment (above 0.60). However, we know that the suggestion is based on the average ICC, which is always higher than a single-measure ICC. There are currently no suggested levels for single-measure ICC, but other validated NTS tools have single-measure ICC in the same range. We consider NorMS-NTS as a usable tool for formative assessment of Norwegian medical students{\textquoteright} non-technical skills during multi-professional team training by raters who are new to the tool. It is necessary to further examine validity and the consequences of the tool to fully validate it for formative assessments.",
keywords = "Assessment, Assessment tools, Medical students, Nontechnical skills, NorMS-NTS, Simulation-based training, Validation",
author = "Katrine Prydz and Peter Dieckmann and Hans Fagertun and David Musson and Torben Wisborg",
note = "Publisher Copyright: {\textcopyright} 2023, The Author(s).",
year = "2023",
doi = "10.1186/s12909-023-04837-6",
language = "English",
volume = "23",
journal = "BMC Medical Education",
issn = "1472-6920",
publisher = "BioMed Central Ltd.",
number = "1",

}
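
For readers unfamiliar with the statistics named in the abstract above, the following is a minimal, hypothetical sketch of how a two-way mixed-effects, consistency, single-measure ICC(3,1) and Cronbach's alpha can be computed from a ratings matrix. It is not the authors' analysis code: the function names are illustrative, and the example data (20 rated videos, 3 raters, 1–5 scores) are invented to mirror only the shape of the study design.

```python
import numpy as np


def icc_3_1(ratings):
    """Two-way mixed-effects, consistency, single-measure ICC(3,1).

    ratings: array of shape (n_targets, n_raters).
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    # Two-way ANOVA decomposition without an interaction term.
    ss_total = ((ratings - grand_mean) ** 2).sum()
    ss_rows = k * ((ratings.mean(axis=1) - grand_mean) ** 2).sum()   # targets
    ss_cols = n * ((ratings.mean(axis=0) - grand_mean) ** 2).sum()   # raters
    ss_error = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)


def cronbach_alpha(items):
    """Cronbach's alpha for an (n_observations, n_items) matrix.

    Here the columns of the example matrix (raters) are treated as items;
    in the study, alpha was computed over the tool's elements and categories.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)


# Invented example data: 20 videos, 3 raters, scores clipped to a 1-5 scale.
rng = np.random.default_rng(0)
true_skill = rng.normal(3.0, 0.7, size=(20, 1))
ratings = np.clip(true_skill + rng.normal(0.0, 0.5, size=(20, 3)), 1, 5)

print(f"ICC(3,1)         = {icc_3_1(ratings):.2f}")
print(f"Cronbach's alpha = {cronbach_alpha(ratings):.2f}")
```
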

RIS

TY - JOUR

T1 - Collecting evidence of validity for an assessment tool for Norwegian medical students’ non-technical skills (NorMS-NTS)

T2 - usability and reliability when used by novice raters

AU - Prydz, Katrine

AU - Dieckmann, Peter

AU - Fagertun, Hans

AU - Musson, David

AU - Wisborg, Torben

N1 - Publisher Copyright: © 2023, The Author(s).

PY - 2023

Y1 - 2023

N2 - Background: The NorMS-NTS tool is an assessment tool for assessing Norwegian medical students’ non-technical skills (NTS). The NorMS-NTS was designed to provide student feedback, training evaluations, and skill-level comparisons among students at different study sites. Rather than requiring extensive rater training, the tool should suit the needs of busy doctors acting as near-peer educators. The aim of this study was to examine the usability and preliminarily assess the validity of the NorMS-NTS tool when used by novice raters. Methods: This study focused on the usability of the assessment tool and its internal structure. Three raters used the NorMS-NTS tool to individually rate the team leader, a medical student, in 20 video-recorded multi-professional simulation-based team trainings. Based on these ratings, we examined the tool’s internal structure by calculating interrater reliability (intraclass correlation coefficient, ICC (3,1)), internal consistency, and observability. After the rating process was completed, the raters answered a questionnaire about the tool’s usability. Results: The ICC agreement for the sum of the overall global scores across all raters was fair: ICC (3,1) = 0.53. The correlation coefficients for the pooled raters were in the range of 0.77–0.91. Cronbach’s alpha values for the elements, categories, and global score were mostly above 0.90. Observability was high (95–100%). All the raters found the tool easy to use, none of the elements were redundant, and the written instructions were helpful. The raters also found the tool easier to use once they had acclimated to it. All the raters stated that they could use the tool for both training and teaching. Conclusions: The observed ICC agreement was 0.08 below the suggested ICC level for formative assessment (above 0.60). However, this suggested level is based on the average-measure ICC, which is always higher than the single-measure ICC. There are currently no suggested levels for the single-measure ICC, but other validated NTS tools have single-measure ICCs in the same range. We consider NorMS-NTS a usable tool for formative assessment of Norwegian medical students’ non-technical skills during multi-professional team training by raters who are new to the tool. Further examination of the tool’s validity and consequences is needed to fully validate it for formative assessment.

AB - Background: The NorMS-NTS tool is an assessment tool for assessing Norwegian medical students’ non-technical skills (NTS). The NorMS-NTS was designed to provide student feedback, training evaluations, and skill-level comparisons among students at different study sites. Rather than requiring extensive rater training, the tool should suit the needs of busy doctors acting as near-peer educators. The aim of this study was to examine the usability and preliminarily assess the validity of the NorMS-NTS tool when used by novice raters. Methods: This study focused on the usability of the assessment tool and its internal structure. Three raters used the NorMS-NTS tool to individually rate the team leader, a medical student, in 20 video-recorded multi-professional simulation-based team trainings. Based on these ratings, we examined the tool’s internal structure by calculating interrater reliability (intraclass correlation coefficient, ICC (3,1)), internal consistency, and observability. After the rating process was completed, the raters answered a questionnaire about the tool’s usability. Results: The ICC agreement for the sum of the overall global scores across all raters was fair: ICC (3,1) = 0.53. The correlation coefficients for the pooled raters were in the range of 0.77–0.91. Cronbach’s alpha values for the elements, categories, and global score were mostly above 0.90. Observability was high (95–100%). All the raters found the tool easy to use, none of the elements were redundant, and the written instructions were helpful. The raters also found the tool easier to use once they had acclimated to it. All the raters stated that they could use the tool for both training and teaching. Conclusions: The observed ICC agreement was 0.08 below the suggested ICC level for formative assessment (above 0.60). However, this suggested level is based on the average-measure ICC, which is always higher than the single-measure ICC. There are currently no suggested levels for the single-measure ICC, but other validated NTS tools have single-measure ICCs in the same range. We consider NorMS-NTS a usable tool for formative assessment of Norwegian medical students’ non-technical skills during multi-professional team training by raters who are new to the tool. Further examination of the tool’s validity and consequences is needed to fully validate it for formative assessment.

KW - Assessment

KW - Assessment tools

KW - Medical students

KW - Nontechnical skills

KW - NorMS-NTS

KW - Simulation-based training

KW - Validation

U2 - 10.1186/s12909-023-04837-6

DO - 10.1186/s12909-023-04837-6

M3 - Journal article

C2 - 37968662

AN - SCOPUS:85176543582

VL - 23

JO - BMC Medical Education

JF - BMC Medical Education

SN - 1472-6920

IS - 1

M1 - 865

ER -
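
The conclusions in the abstract note that the suggested 0.60 level refers to the average-measure ICC, which is always higher than the single-measure ICC. Under the standard Spearman-Brown relationship, an average-measure ICC(3,k) follows directly from the single-measure ICC(3,1). The short sketch below is purely illustrative of that relationship, using the reported single-measure value of 0.53 and the study's three raters; it is not a result reported by the authors.

```python
# Spearman-Brown relationship between the single-measure ICC(3,1) and the
# average-measure ICC(3,k) for k raters. Illustrative only: 0.53 is the
# single-measure value reported in the abstract; 3 is the number of raters.
def icc_average(icc_single: float, k: int) -> float:
    return k * icc_single / (1 + (k - 1) * icc_single)


print(round(icc_average(0.53, 3), 2))  # -> 0.77, above the 0.60 average-measure level
```

This makes concrete why an average-measure threshold cannot be applied directly to a single-measure estimate.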
