142 - Q-methodology for Quality Assurance and Informed Improvements of the AI OSPE Tool: Insights from Experienced Anatomy Students, the Teaching Assistants
Monday, March 25, 2024
10:15am – 12:15pm US EDT
Location: Sheraton Hall
Poster Board Number: 142
There are separate poster presentation times for odd and even posters.
Odd poster #s – first hour
Even poster #s – second hour
Co-authors:
Layla Rahimpour - McMaster University; Philip Yu - McMaster University; Ethan Michalenko - McMaster University; Trinity Stodola - McMaster University; Siyuan Huang - McMaster University; Abeerah Murtaza - McMaster University; Tracy Wang - McMaster University; Lana Amoudi - McMaster University; Josh Mitchell - McMaster University; Jason Bernard - McMaster University; Noori Akhtar-Danesh - McMaster University; Bruce Wainman - McMaster University; Yasmeen Mezil - McMaster University; Kristina Durham - McMaster University
Undergraduate Student, McMaster University, Hamilton, Ontario, Canada
Abstract Body: Introduction & Objective: Objective structured practical examinations (OSPEs) are a standard method for testing anatomical knowledge, but practice OSPEs are time- and resource-intensive to create and grade. To address these challenges, a virtual OSPE study tool graded by artificial intelligence (AI), the AI OSPE, was developed. The aim is to develop the tool to meet the standards of educators and to encourage its adoption as a course learning resource. To do so, Q-methodology was used to collect perceptions of the tool from successful anatomy students who are now teaching assistants (TAs) of McMaster's undergraduate anatomy and physiology course. The results of the statement analysis will be used to evaluate the efficacy of the tool from an educator's perspective, leveraging participants' feedback to inform its iterative development process.
Materials & Methods: A concourse of 36 statements relating to accessibility, AI quality, user experience, and specific features of the AI OSPE was developed by team consensus. The tool was shared with 40 TAs, who were encouraged to judge its practicality as a study resource. At the time of release, the tool contained 29 'cards', each depicting an OSPE station on the musculoskeletal system. After testing the tool, TAs were asked to complete a Q-survey, in which they ranked the Q-statements across a quasi-normal grid by degree of agreement (strongly disagree, -4, to strongly agree, +4). To justify their most extreme rankings, TAs were asked to provide qualitative feedback. The rankings were analyzed using by-person factor analysis, which categorized participants into distinct 'factors' based on shared perceptions.
Results: Two factors emerged from the pilot Q-survey data. Perspectives distinguishing the factors included strong support for spaced repetition and virtual OSPEs (Factor 1, n=6) and a preference for in-person cadaveric specimens (Factor 2, n=3). Consensus statements indicated that both factors appreciated the user-friendliness and academic potential of the tool for their students.
Conclusion: The findings indicate that TAs support the tool's ability to prepare their students for course assessments. TAs hold varying opinions on the tool's efficacy compared with in-person OSPE practice using cadaveric specimens. TAs emphasized the perceived importance of the spaced repetition feature, which distinguishes this tool from other commonly adopted anatomy study tools.
Significance: The findings from this study will ensure that educators' perspectives are captured and implemented in future quality improvements. Integrating these perspectives promotes educator approval, encouraging adoption of the tool in anatomy education.