Katharine E. Caldwell, MD, MSCI; Kevin Y. Pei, MD, MHSEd, FACS; Karen J. Dickinson MBBS, BSc, MD, MEd, FRCS
At the beginning of the COVID-19 pandemic, many people believed the transition to virtual learning would be temporary, with a post-pandemic return to full in-person education. However, with each additional variant and wave, demand persists for educational modalities that allow for asynchronous and remote options. Virtual learning enables training to continue during this time of educational disruption. Online platforms for surgical education have offered many benefits, including the use of novel technologies, the ability for asynchronous or self-paced learning, and exploration of new assessment and teaching methods.1,2 Engagement has been identified as the most important variable in learning through online modalities, and appropriately, significant concerns have been raised about a learner’s ability to stay focused during virtual education.3,4 Online learning has been shown to lead to decreased levels of learner engagement.5 Many contributors to this decline have been proposed, including Zoom fatigue, increased demands at home, and the ease of clicking over to another window.6–10
Surgical educators have faced challenges with learner engagement during both the transition to virtual learning and the continuation of this modality as the pandemic persists. Recent discourse within the online Association for Surgical Education (ASE) “DocMatters” community, a discussion forum for surgical educators to share experiences, has elucidated that many educators are discouraged by a wall of “black boxes” on Zoom when delivering education online. Educators report struggling to engage learners, with examples including all learners participating with cameras off, learners not responding to direct questions, and the “awkward pause” after questions are asked. Further, many surgical educators have faced personal challenges due to the demands placed on them by virtual education during the pandemic. Examples include a greater number of out-of-hours educational sessions, the increased effort required to engage when teaching online, and the increased cognitive load of managing Q&A or chat functions while delivering education. Crowdsourcing educator opinion on platforms such as the ASE discussion forum is important because virtual educational strategies can be variably successful, and it can be difficult for educators to determine which strategies are most effective for learner engagement and learning.
Educators have long understood that learner engagement and focus are the keys to information transfer, but work in this area has been challenged by the lack of a common definition of an engaged learner. A good working definition is “someone who has good learning strategies, a will to learn and who is involved behaviorally, emotionally and intellectually with their task.”11,12 The ability to effectively assess learner engagement is important to determine efficacy of educational strategies. Initial tools to assess learner engagement were developed for elementary and secondary education learning environments.13,14 At the level of graduate medical education, the STROBE tool has been used to measure student engagement in the health professions classroom.15 Observational engagement tools can be valuable in determining the external behavior of a learner, but self-assessment is crucial to any effective engagement measure. It has previously been demonstrated that learners are able to “fake” engagement when necessary, and formal learner self-assessment must be incorporated in order to gauge a learner’s internal behavior.16
It is more challenging to assess learner engagement in the virtual environment than in person, and educators may struggle to utilize engagement tools designed for an in-person classroom in the virtual sphere due to differences in learner behavior. To combat these challenges, our team developed the Virtual In-class Engagement Measure (VIEM) to assess learner engagement in online surgical education events.17 This tool is an adaptation of the STROBE tool, consisting of two parts: (1) assessment of an online learning event by a trained observer and (2) learner self-assessment. As expected, engagement scores as measured by the VIEM were higher in activities that required learner participation (e.g., mock orals, journal clubs) and lower in activities in which learners were passive observers (e.g., lecture-based). Additionally, VIEM scores demonstrated a positive correlation with the number of questions asked during sessions and participation in the chat. Interestingly, on learner self-assessment, we found that up to 20% of learners across multiple educational activities reported that they pretended to participate.15 On the online platform, fake participation continues to be the most significant limitation in accurate assessment of learner engagement.16,17
While no single method to increase engagement will be successful with all learner groups, several strategies, anchored in active learning principles, have been utilized (see Table 1). Some—such as the use of case-based learning or flipped classroom models—may be familiar to surgical educators due to their previous use in in-person learning spaces, but others may require more preparation on the part of the educator. Each strategy offers benefits and challenges, with some being more suitable for certain learner types or learning events, and we recommend experimentation combined with evaluation of success through learner engagement tools, with revision of tactics as required.
Table 1. Strategies to increase learner engagement in virtual education

| Strategy | Examples | Benefits | Challenges |
| --- | --- | --- | --- |
| Gamification | Gamification software (TopHat, Kahoot); Jeopardy | | |
| Small Group Sessions | Breakout rooms; pre-classroom small groups | | |
| Flipped Classroom | PDF or video review prior to the learning event | | |
| Q&A-Based Sessions | Case-based presentations | | |
| Polling Software | PollEverywhere | | |
| Expectation Setting | “Cameras on” | | |
| Assistance | Moderators | | |
| Unique Set-Ups | Role play | | |
Virtual learning is likely to be a permanent fixture in medical education. Soon, a generation of medical students, the majority of whose medical school training was completed virtually, will be entering the clinical environment. It is imperative that residency educators are ready to meet the needs of these learners. Understanding the engagement of learners is essential to understanding both whether information is transmitted and the efficacy of learning; there is much left to study in terms of the impact of virtual engagement on competency. We suggest that formal assessment of learner engagement is even more important for virtual learning events, where it may be challenging for an educator to evaluate the engagement of their learners in real time. Formal assessment of learner engagement can be used to evaluate the effectiveness of certain educational strategies and to help determine which learning points are unclear, in order to optimize the experience for both students and educators. As we enter the post-pandemic educational landscape, evaluating the efficacy of virtual and hybrid educational strategies will be essential to determine the best pedagogical approach for surgical education in the future.
Katharine E. Caldwell, MD, MSCI, Washington University in St. Louis, Department of General Surgery, St. Louis, MO; Kevin Y. Pei, MD, MHSEd, FACS, Department of Graduate Medical Education, Parkview Health, Fort Wayne, IN; Karen J. Dickinson, MBBS, BSc, MD, MEd, FRCS, University of Arkansas for Medical Sciences, Department of Surgery, Little Rock, AR