Background: Generative artificial intelligence (AI) programs such as the Chat Generative Pre-trained Transformer (ChatGPT) are becoming more widespread in educational settings, with mounting ethical and reliability concerns regarding their usage. This paper explores the experiences, perceptions, and usability of ChatGPT among undergraduate health sciences students.
Methods: Twenty-seven students at Carleton University (Canada) were recruited from a Health Sciences course and enrolled in a crossover randomized controlled trial during the Fall 2023 academic term. The intervention condition involved the use of ChatGPT-3.5, whereas the control condition involved the use of conventional web-based tools. Technology usability was compared between ChatGPT-3.5 and the conventional tools using questionnaires. Focus group discussions were conducted with seven students to further elaborate on student perceptions and experiences. Reflexive thematic analysis was employed to identify themes from the focus group data.
Results: On the System Usability Scale, students rated ChatGPT-3.5 significantly higher than conventional online tools on ease of learning for personal use and perceived speed of learning. Qualitative results highlighted strong benefits of ChatGPT-3.5, such as serving as a tool for increased overall productivity and brainstorming. However, students identified challenges associated with reliability and accuracy, as well as concerns about academic integrity.
Conclusions: Despite the benefits and positive usability of ChatGPT-3.5 identified by students, an explicit need remains for the development of policies, procedures, and regulations. An established framework of best practices for the use of AI within health sciences education is necessary. This will ensure user accountability and lead to a more effective integration of AI technologies into academic settings.
Keywords: AI; ChatGPT; artificial intelligence; crossover RCT; education; health sciences; learning outcomes; perceptions; randomized controlled trial; usability.
© The Author(s) 2024.