147 - Evaluating Undergraduate Anatomy Student Perceptions of the AI OSPE Tool Using Q-methodology
Monday, March 25, 2024
10:15am – 12:15pm US EDT
Location: Sheraton Hall
Poster Board Number: 147
There are separate poster presentation times for odd and even posters.
Odd poster #s – first hour
Even poster #s – second hour
Co-authors:
Philip Yu - Faculty of Health Sciences - McMaster University; Sarada Sriya Rajyam - Faculty of Health Sciences - McMaster University; Ethan Michalenko - Faculty of Health Sciences - McMaster University; Trinity Stodola - Faculty of Health Sciences - McMaster University; Siyuan Huang - Faculty of Health Sciences - McMaster University; Tracy Wang - Faculty of Health Sciences - McMaster University; Lana Amoudi - Faculty of Health Sciences - McMaster University; Abeerah Murtaza - Faculty of Health Sciences - McMaster University; Josh Mitchell - Faculty of Health Sciences - McMaster University; Jason Bernard - Vector Institute; Noori Akhtar-Danesh - McMaster University; Bruce Wainman - Faculty of Health Sciences - McMaster University; Yasmeen Mezil - Faculty of Health Sciences - McMaster University; Kristina Durham - Faculty of Health Sciences - McMaster University
Abstract Body: Introduction & Objective: Objective structured practical examinations (OSPEs) are a standard method of testing anatomical knowledge. However, student preparedness is limited by scarce access to cadaveric specimens and practice OSPEs. Additionally, facilitating and grading OSPEs is time- and resource-intensive. To address these challenges, we developed a virtual OSPE study tool graded by artificial intelligence (AI OSPE) that provides immediate feedback to users. To ensure the tool meets users' needs and preferences, Q-methodology was used to assess student perceptions of the AI OSPE as a study tool.
Materials & Methods: A concourse of 36 statements relating to the accessibility, AI quality, user interface, user experience, and specific features of the AI OSPE tool was developed by team consensus. The AI OSPE was introduced to students enrolled in a McMaster introductory anatomy course six days prior to their term test. On release, the tool contained 29 'cards', each depicting an OSPE station on the musculoskeletal system. Faculty and teaching assistants encouraged, but did not incentivize, students to use the tool to prepare for their written OSPE and multiple-choice test. After the test, students were asked to complete a Q-survey. Survey results were analyzed via by-person factor analysis, which categorized feedback into groups based on shared perceptions.
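The by-person factor analysis described above can be sketched as follows. This is a minimal illustration using only NumPy with simulated Q-sorts; the respondent data, the use of unrotated principal components, and the retention of three factors are assumptions for demonstration, not the study's actual analysis (dedicated Q-methodology software would normally handle rotation and factor selection).

```python
# Minimal sketch of by-person (Q) factor analysis, using only NumPy.
# The Q-sort data below are simulated for illustration; only the design
# dimensions (36 statements, 23 respondents) follow the abstract.
import numpy as np

rng = np.random.default_rng(0)
n_statements, n_respondents = 36, 23
# Each column is one respondent's Q-sort: ranks from -4 (most disagree)
# to +4 (most agree) assigned to the 36 concourse statements.
sorts = rng.integers(-4, 5, size=(n_statements, n_respondents)).astype(float)

# Q-methodology correlates PERSONS rather than statements (hence
# "by-person"): correlate the columns, giving a 23 x 23 matrix.
R = np.corrcoef(sorts, rowvar=False)

# Extract factors from the person-correlation matrix via its
# eigendecomposition (unrotated principal components).
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:3]] * np.sqrt(eigvals[order[:3]])

# Assign each respondent to the factor they load on most strongly;
# respondents sharing a factor share a perspective on the tool.
groups = np.argmax(np.abs(loadings), axis=1)
print(groups)
```

In practice, factors are varimax-rotated and retained by criteria such as eigenvalues above 1.0 before respondents are flagged onto factors; the sketch keeps only the core person-by-person step that distinguishes Q-methodology from conventional (by-statement) factor analysis.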
Results: By-person factor analysis of the 23 Q-survey respondents identified three salient factors (groups) with shared perspectives: a preference for in-person learning (Factor 1, n=10), an appreciation for virtual studying (Factor 2, n=7), and an emphasis on the tool's aesthetics (Factor 3, n=3). Feedback from Factor 3 indicated a desire for higher-resolution specimen images. There was consensus across factors that the tool was of great academic benefit and user-friendly.
Conclusion: Although the factors emerging from the pilot data differed in their preference for in-person versus virtual studying, the consensus statements demonstrate support for the AI OSPE's features, both in aiding perceived learning outcomes in anatomy and in ease of navigating the tool.
Significance: These findings on student attitudes are essential for directing efficient, user-friendly development of the tool.