Study design
This was a prospective, randomized study comparing two training methodologies: the traditional in-person Basic Life Support (BLS) course of the American Heart Association (T-course) and a distance-based course with asynchronous feedback through an online platform (D-course). The institutional review board (Comité de Ética en Investigación, Facultad de Medicina, Pontificia Universidad Católica de Chile) approved this study and waived the need for informed consent (approval number 221118007). After approval by the ethics committee, 200 non-medical personnel were recruited to participate in this protocol.
Participants and recruitment
Non-healthcare personnel such as teachers, psychologists, security guards, coaches, and athletes were recruited from five educational and sports institutions that had an automated external defibrillator (AED) on their premises. The principal investigator contacted each institution's director, explained the project, and outlined the inclusion and exclusion criteria. The inclusion criterion was being non-healthcare personnel; the exclusion criteria were being a healthcare professional or having previously completed a CPR course. Each director provided the number of potentially eligible participants from their institution. Participants received an invitation email explaining the protocol and could choose whether to participate. After eligibility was assessed, a second email was sent describing the assigned group.
Participants were randomly assigned to two different training methods.
Randomization
Before the protocol started, we performed randomization using the GraphPad Prism calculator. One of the researchers responsible for statistical analysis generated the random allocation sequence, which was then sent to the coordinator at the simulation center. We then emailed participants to inform them of their assigned intervention group and the next steps.
Because the institutions reported the number of participants who would take the course before the protocol began, the entire list of participants was randomized at the outset. Therefore, no allocation concealment process was performed in this study.
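The complete randomization of a known participant list, as described above, can be sketched as follows. This is an illustrative Python equivalent, not the actual procedure (the study used the GraphPad Prism calculator); the arm labels, participant IDs, and fixed seed are our own assumptions.

```python
import random

def randomize(participants, seed=None):
    """Completely randomize a full participant list into two equal arms.

    Illustrative sketch only; the study used the GraphPad Prism calculator.
    """
    rng = random.Random(seed)       # fixed seed makes the allocation reproducible
    shuffled = participants[:]      # copy so the original list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"T-course": shuffled[:half], "D-course": shuffled[half:]}

# 200 hypothetical participant IDs, split 100/100 between the two arms
allocation = randomize(list(range(1, 201)), seed=42)
print(len(allocation["T-course"]), len(allocation["D-course"]))  # 100 100
```

Because the full list is shuffled once before the protocol begins, every participant's assignment is fixed up front, which is why (as noted above) no separate allocation concealment step applies.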
Interventions
For the traditional in-person BLS course of the American Heart Association, an AHA-certified simulation center was contacted to train the participants. Participants were scheduled for specific dates and times, according to the agenda of the Simulation Center at Los Andes University in Chile, to complete the Heartsaver certification. Offered by the AHA, the Heartsaver certification provides training in CPR, AED use, and first aid for individuals with little or no medical experience. Participants received an email from Los Andes University with the instructional materials and instructions for arriving on the day of their course. Those who did not attend their scheduled session were rescheduled once to reduce dropouts.
For the distance-based course, materials for practice—including a pad, resuscitation torso, and automated external defibrillator—were delivered by our simulation center staff to each institution, and a workstation was set up (Fig. 1). Depending on the number of participants from each institution, one or two workstations were established, allowing participants to practice at their workplaces during their free time.
To complete the course, the online platform C1DO1 (https://c1do1.ai/) was used. This platform is a desktop and mobile application accessible from any location by participants and instructors with internet access (15). Participants received an email to create a user account and gain access to the platform. The course was organized on the platform into stages with both theoretical and practical components. It consisted of nine stages: five theoretical stages, three practical exercises, and a final stage with a multiple-choice assessment. The theoretical and instructional material was delivered through short video capsules. During practical stages, participants practiced and uploaded a video of their performance to the platform. Participants were required to complete each stage before moving on to the next. The course was designed so that its objectives and training duration matched those of the BLS course. Each stage is briefly described in Table 1.
Table 1
Structure of the stages of the CPR course on the online platform
| Stage | Content |
|---|---|
| Stage 1 | Introduction |
| Stage 2 | Recognizing cardiorespiratory arrest |
| Stage 3 | Quality of chest compressions |
| Stage 4 | Practice stage: quality chest compressions; the student uploads a video to the platform |
| Stage 5 | Use of the automated external defibrillator (AED) |
| Stage 6 | Practice stage: use of the AED; the student uploads a video to the platform |
| Stage 7 | Complete resuscitation sequence |
| Stage 8 | Practice stage: complete resuscitation sequence; the student uploads a video to the platform |
| Stage 9 | Multiple-choice test; intended to reinforce knowledge, not used for research assessment |
The methodology used for distance-based courses has been successfully applied in prior studies (15–17). As mentioned above, the platform enabled students to access didactic materials on cardiopulmonary resuscitation and procedure-specific instructional videos for practice. Participants practiced, recorded themselves performing the procedure with a smartphone, and uploaded their videos to the platform without supervision. In this study, participants had the practice station at their workplace. Instructors provided remote, asynchronous feedback within a 72-hour window. Participants reviewed the feedback and practiced again until they received approval for the stage. The methodology for participant practice and instructor feedback through the platform is illustrated in Fig. 2. Feedback was provided by three instructors, all emergency medicine physicians with expertise in healthcare simulation and feedback. The platform allowed them to give audio, written, and drawn (annotation) feedback.
Finally, an online satisfaction survey was sent to participants to evaluate their perceptions after completing each training program.
Outcome measures
Both groups completed a pre-training assessment (PRE) and a post-training assessment (POST). Our simulation center staff visited each institution to conduct the assessments for both groups. During each assessment, participants were videotaped by our staff during a CPR scenario. To ensure consistency, all participants watched a short video illustrating a situation at their workplace where a colleague becomes unconscious and needs medical attention. Later, two independent, blinded reviewers evaluated the videos, rating the participants' performance using the AHA CPR and AED Skills Testing Checklist. These reviewers were different from the instructors providing feedback in the D-course. Additionally, the quality of chest compressions (CC) was assessed using a Prestan® mannequin with a skill reporter connected to an application on a tablet device, which recorded the rate and depth of chest compressions.
Definition of non-inferiority margins
Based on a pilot distance-based CPR course, the main outcome (AHA CPR and AED Skills Testing Checklist score) was expected to improve from 10–15% to 80–85% (18). Previous studies testing clinical learning outcomes indicate that a non-inferiority limit of 10–15 percentage points is reasonable (19). Work on non-inferiority concepts in medical education, including comparisons of Basic Life Support trainings, supports these limits (20). Based on these data, a difference of 2 points on the checklist (a 10% performance difference between trainings) was established as the non-inferiority margin for this study.
Sample size calculation
A sample size of 156 subjects was calculated to detect a non-inferiority margin of 10 percentage points on the checklist, with a significance level of 0.05 and a power of 0.8. The calculation was performed using the web calculator for non-inferiority trials provided by Sealed Envelope (21). Because this was a non-captive study population (participants were not students of our institution), a 30% dropout rate was assumed. In a previous study with distance-based courses, the percentage of participants who finished and passed ranged from 20 to 60%, depending on the course (16). To ensure the adequacy of the calculated sample size, we ultimately assessed 200 participants for eligibility.
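A figure of 156 subjects is consistent with the standard formula for a non-inferiority comparison of two means, n per group = 2[(z₁₋α + z₁₋β)·σ/δ]². The exact inputs used with the Sealed Envelope calculator are not stated, so in the sketch below the one-sided α of 0.05 and the assumed standard deviation of 5 checklist points are illustrative assumptions, not values from the study.

```python
from math import ceil
from statistics import NormalDist

def noninferiority_n_per_group(sd, margin, alpha=0.05, power=0.80):
    """Per-group n for a non-inferiority comparison of two means,
    assuming equal variances and a true between-group difference of zero.

    alpha is one-sided, as is conventional for non-inferiority designs.
    """
    z_a = NormalDist().inv_cdf(1 - alpha)   # 1.645 for one-sided alpha = 0.05
    z_b = NormalDist().inv_cdf(power)       # 0.842 for power = 0.80
    return ceil(2 * ((z_a + z_b) * sd / margin) ** 2)

# Assumed SD of 5 checklist points (not stated in the text);
# margin of 2 checklist points, the study's stated non-inferiority margin
n = noninferiority_n_per_group(sd=5, margin=2)
print(n, 2 * n)  # 78 per group, 156 in total
```

With these assumed inputs the formula returns 78 per group (156 total), matching the reported sample size before the 30% dropout inflation to 200.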
Statistical Analysis
The data were analyzed using JASP software version 0.19.3 for MacOS. Demographic data are presented as mean and standard deviation (SD). Checklist scores and chest compression (CC) data are shown as median and interquartile range (IQR). The intraclass correlation coefficient (ICC) was calculated to evaluate interobserver agreement.
Two-sided 95% confidence intervals (CI) were used to assess non-inferiority. The Mann-Whitney U test and the Wilcoxon signed-rank test were used to compare independent and paired groups, respectively. A p-value below 0.05 was considered statistically significant.
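For readers unfamiliar with the rank-based tests named above, the Mann-Whitney U statistic can be sketched in pure Python as follows. This uses the large-sample normal approximation with average ranks for ties and no tie or continuity correction, so it is illustrative only; the study's analyses were run in JASP.

```python
from statistics import NormalDist

def mann_whitney_u(x, y):
    """Two-sided Mann-Whitney U test via the normal approximation.

    Average ranks are used for ties; no tie or continuity correction,
    so this is a teaching sketch, not a substitute for a stats package.
    """
    combined = sorted((v, 0 if i < len(x) else 1) for i, v in enumerate(x + y))
    # Assign 1-based average ranks across runs of tied values
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        for k in range(i, j + 1):
            ranks[k] = (i + j) / 2 + 1
        i = j + 1
    r1 = sum(r for r, (_, g) in zip(ranks, combined) if g == 0)
    n1, n2 = len(x), len(y)
    u1 = r1 - n1 * (n1 + 1) / 2             # U statistic for the first sample
    mu = n1 * n2 / 2                        # mean of U under H0
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    z = (u1 - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return u1, p

# Fully separated samples give U = 0 for the first group
u, p = mann_whitney_u([1, 2, 3], [4, 5, 6])
print(u, round(p, 4))
```

Because the test operates on ranks rather than raw values, it matches the paper's choice of medians and IQRs for the checklist and chest-compression data.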
Only participants who fully adhered to the study protocol, that is, those who finished the assigned course and completed the POST assessment, were included in the analysis (per-protocol analysis).