Educational standards call for the introduction of simulation courses into the curriculum that develop practical skills and abilities in all categories of students, build teamwork skills, foster clinical thinking, and form professional competencies. It is this area of medical education that has been chosen for evaluation.
Modern medical simulation includes several categories of simulators. Commercially available patient simulators differ in their degree of fidelity to clinical practice. The most sophisticated possess qualities that allow students to work with the simulator much as they would with a real patient. Debriefing typically follows the practical work with the simulator: participants are interviewed by specially trained observers, who use the scenario as a tool to consolidate what has been learned. Such simulators (for example, SimMan© (Laerdal Medical, Stavanger, Norway) and iStan© (CAE, Sarasota, FL, USA)) offer variable levels of programmable physiology that can either be assessed directly by participants (for example, pulse, respiration, and blood pressure) or displayed on physiological monitors (e.g., heart rate, invasive hemodynamic monitoring, and oxygen saturation) (Labrague et al., 2019). These technologies enable venous catheterization, restoration of airway patency, administration of drugs, management of labor, defibrillation, electrical impulse therapy, and many other interventions.
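For illustration only, a scenario driving such programmable physiology can be thought of as a timed script of physiological states. The Python sketch below is a minimal, hypothetical example; the `VitalSigns` class, the state values, and the five-minute intervals are assumptions for illustration and do not reflect any vendor's actual scenario-authoring interface.

```python
from dataclasses import dataclass


@dataclass
class VitalSigns:
    """One programmable physiological state (hypothetical values)."""
    heart_rate: int     # beats per minute
    resp_rate: int      # breaths per minute
    systolic_bp: int    # mmHg
    diastolic_bp: int   # mmHg
    spo2: int           # oxygen saturation, %


# A simple scenario script: elapsed time (minutes) -> target state.
# Instructors typically program transitions like these so learners can
# recognize deterioration and respond appropriately.
scenario = {
    0:  VitalSigns(heart_rate=80,  resp_rate=14, systolic_bp=120, diastolic_bp=80, spo2=98),
    5:  VitalSigns(heart_rate=115, resp_rate=24, systolic_bp=95,  diastolic_bp=60, spo2=91),
    10: VitalSigns(heart_rate=140, resp_rate=30, systolic_bp=75,  diastolic_bp=45, spo2=84),
}

for minute, state in scenario.items():
    print(f"t={minute:>2} min: {state}")
```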
Individual skill trainers usually allow participants to perform one procedure or a limited set of procedures. Examples include venous catheterization simulators, airway management simulators, intercourse simulators, and simulators for ultrasound-guided procedures. In surgery, the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) has developed and validated simulators on which, using real laparoscopic instruments, a trainee can demonstrate a variety of required laparoscopic skills (for example, intracorporeal and extracorporeal knot tying) (Van Sickle et al., 2007). These simulators are used both for training and for competency assessment.
Virtual reality (VR) simulators take full advantage of video game technology, presenting the user with a natural interface even though the clinical environment is a digital reproduction. Some simulators provide haptic (tactile) feedback, a computer-generated effect that gives the user the same touch sensations (e.g., resistance, tension) that would occur in an actual procedure (Willaert et al., 2012). Some newer VR simulators can import patient images, allowing the user to rehearse a procedure before performing it on the live patient.
A variety of screen-based simulators is also in use today. The first widespread simulator of this type was developed for the American Heart Association to support training in its cardiovascular emergency management program (Wayne et al., 2008). In addition to teaching content, these programs can present and debrief simulated cases displayed on the screen. Another type of screen-based simulator has been developed to give teams hands-on practice in communication and interaction during various simulated events. Some screen-based simulators reproduce highly complex, multi-level scenarios used for instruction, independent practice, and certification.
The primary purpose of creating medical simulation centers is to improve the professional training of medical specialists and develop the health care workforce in order to raise the quality, efficiency, and safety of medical care. To date, considerable experience has been accumulated in applying simulation methods in many high-risk areas of human activity.
A literature review shows that simulation as a teaching tool has reasonable validity, since the learner (and the patient) are placed in a position where no harm can be done. It allows learners to act deliberately during educational activities and to practice until a defined level of technical skill is achieved. Nevertheless, using simulation as a learning tool requires additional investment, and not only monetary: physical space, a level of technical readiness, and additional faculty time. Because of this investment, simulation has had to prove its worth far more than any other educational innovation introduced into medical education (Masiello, 2012). Its utility is also not easy to measure; in medical education it may encompass the price and acceptability of the model, the rationality and effectiveness of training, the duration of training, and the impact of using the model on clinical outcomes (Kothari et al., 2017).
As simulation developed, so did the number of researchers willing to address these questions. At the earliest stage, studies were mainly qualitative and aimed primarily to substantiate the acceptability of simulation as a teaching tool (Aebersold, 2016). The rationality and efficiency of knowledge and skill acquisition became the main topic of subsequent research. Later, close attention was paid to establishing the optimal duration of simulation-based training and the most appropriate timing for refresher training. The most recent and most challenging studies have sought to correlate simulation results with clinical outcomes.
A pilot study by Gordon et al. (2006) assessed the effect of medical simulation on the acquisition of knowledge in cardiovascular physiology. First-year medical students were randomly assigned to two groups: a standard form of practical training (control group) and the same standard training supplemented by work with a patient simulator (intervention group). They were assessed with a six-figure test immediately after the semester in which the simulator was used and again one year later. In the testing conducted immediately after the session, there was a significant difference between the groups in mean scores [4.0 (control) vs. 4.7 (intervention), p = .005]. After one year, the difference persisted [4.1 (control) vs. 4.7 (intervention), p = .045]. Based on the overall analysis, the addition of simulation proved to be a statistically significant determinant of performance.
In 2012, Bonrath et al. assessed the effect of simulation on the acquisition and retention of laparoscopic skills in novices (Bonrath et al., 2012). Thirty-six medical students were trained in 6 skills of increasing difficulty. Accuracy and speed of execution were assessed before the training course and immediately after its completion. The participants were then divided into two groups of 18, one reassessed after six weeks and the other after 11 weeks. Immediately after the training period, all participants showed excellent mastery of the skills. The group reassessed after six weeks retained 8 of 9 skills at the level recorded immediately after completing the course, whereas the group assessed after 11 weeks retained only 4 of 9 skills. The authors concluded that by 11 weeks the participants' skills had deteriorated significantly, requiring additional practice. A longer period of simulation-based work is evidently required for skills and knowledge to be consolidated, a point examined in detail in the following study.
Palter et al. (2011) evaluated the impact of technical skills learned beforehand through simulation on the acquisition of cognitive knowledge in the operating room. They randomized 18 surgical residents into intervention and control groups. All participants were shown how to perform fascial suturing and were then allowed to perform the procedure on a simulator. The intervention group was allowed to continue practicing on the simulator until technical proficiency was attained. Both groups then performed fascial suturing on a real patient in the operating room. Both execution technique and cognitive outcomes were statistically better in the intervention group. This highlights the potential benefit of acquiring skills in a simulated setting beforehand, which also enhances cognitive learning in the operating room.
An example of a translational study using simulation techniques is the elegant work of Barsuk et al. (2009), which examined the effect of a simulator-based educational program on the prevention of catheter-related bloodstream infections (CRBSIs). The intervention was a 4-hour simulator-based course in ultrasound-guided central venous catheterization. The incidence of CRBSIs in patients treated by trained physicians was 0.5 infections per 1,000 catheter-days, versus 3.2 infections per 1,000 catheter-days among patients treated before the physicians' training (retrospective data). In the matched-group comparison, the results were even more striking: 0.5 versus 5.03 infections per 1,000 catheter-days, a roughly 10-fold reduction after training. The findings imply that 140 bed-days were saved through staff training. Moreover, the researchers assessed the financial benefit of training physicians, estimating savings of about $800,000 annually, a benefit-to-cost ratio of roughly 7:1.
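The arithmetic behind these figures can be reproduced directly, as in the short Python sketch below. Note that the annual training cost is not reported in the source; it is derived here only from the stated 7:1 ratio and should be read as an illustration.

```python
# Reported infection rates (Barsuk et al., 2009), per 1,000 catheter-days
rate_before = 5.03   # matched group, before simulator-based training
rate_after = 0.50    # after training

reduction_factor = rate_before / rate_after
print(f"Relative reduction: about {reduction_factor:.0f}-fold")  # ~10-fold

# Reported economics: ~$800,000 saved annually at a benefit-to-cost ratio of 7:1.
annual_savings = 800_000
benefit_cost_ratio = 7
implied_training_cost = annual_savings / benefit_cost_ratio  # derived, not reported
print(f"Implied annual training cost: ~${implied_training_cost:,.0f}")
```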
Purpose of Evaluation
The acceptance of medical simulation implies a significant overhaul of medical education. Since 2000, there has been an impressive increase in the use of medical simulation in the US and worldwide (Scalese et al., 2008). This growth is occurring despite significant obstacles: funding, technology, and fear of change. It is therefore necessary to introduce comprehensive assessment of medical education programs that include medical simulation as a teaching tool, which is why this area of medical education has become the subject of evaluation.
The most important stakeholder here is the general public. Along with the widespread adoption of simulators in the United States, there has been growing interest in patient safety. American healthcare has begun to change as a result of efforts to tackle the problem of medical errors. Operational measures include clinical decision support systems for prescribing medications and ordering tests, electronic patient health records, checklists, and much more (Argani et al., 2012). Medical education has not escaped this process, as many believed that simulation could be helpful in analyzing medical errors and their causes. The views of patients and educators, together with common sense, have led to the introduction of simulation to make various procedures safer. In the past it was not uncommon for inexperienced clinicians to perform the first procedures of their careers on real patients. This had consequences for both sides: fewer and fewer patients are willing to be used as “guinea pigs” for training novice clinicians.
According to one study of patients' attitudes toward trainees performing procedures on them, only 49% of patients were comfortable being the first patient on whom a trainee placed sutures (Argani et al., 2012). The figure dropped to 29% for probing and to 15% for lumbar puncture (Argani et al., 2012). Novices, too, do not always gain a positive experience, especially when the treatment outcome is poor. Supervised simulated practice in a relaxed atmosphere allows a beginner to gain a degree of competence and confidence away from the stressful environment of the clinic: one can make a mistake and correct it without having to explain what happened to the patient or an inconsolable relative. Over time, mastery can be achieved by combining simulation with real work at the bedside.
Medical organizations are another major stakeholder. Health care facilities are under increasing pressure to reduce costs while delivering care at the level patients deserve and demand today, and this must be achieved in an environment that demands greater efficiency at lower cost, often with reduced staffing. Faced with a shortage of teaching time, many educational programs are looking for new approaches that would optimize the educational process under such constrained circumstances (Hippe et al., 2020). The traditional lecture, in which a single speaker can convey content to an almost unlimited number of listeners, is probably the most economical, albeit least effective, way of teaching adult learners. Lectures are beginning to be supplanted by active small-group formats, of which simulation is just one example. It has also become evident that the clinical environment is not the best place to hold classes, especially for newcomers (Hippe et al., 2020). Beyond the unpredictability of the practical experience gained, certain tensions create an atmosphere that is not conducive to learning, including the stress of practical work with an actual patient, often in critical situations.
With the focus on clinical quality, the additional time needed for students' hands-on training cuts directly into the efficiency required of clinic staff. For example, if a procedure usually takes an experienced clinician 60 minutes and including a trainee adds 20 minutes, then over an 8-hour shift two additional procedures could be performed by excluding the trainee from the process (Hippe et al., 2020), as the sketch below illustrates. In the era of health care financial reform, hospital administrators take a keen interest in these issues. As the amount of information and the number of skills a physician is expected to possess increase, special attention is being paid to determining the best and most effective approach to developing them in practice.
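The calculation is simple enough to check directly; the Python sketch below reproduces it under the stated assumptions (60-minute procedure, 20-minute trainee overhead, one 8-hour shift).

```python
shift_minutes = 8 * 60                # one 8-hour shift
solo_procedure = 60                   # experienced clinician working alone, minutes
with_trainee = solo_procedure + 20    # supervising a trainee adds ~20 minutes

procedures_solo = shift_minutes // solo_procedure        # 8 procedures per shift
procedures_with_trainee = shift_minutes // with_trainee  # 6 procedures per shift

print(f"Without trainee: {procedures_solo} procedures per shift")
print(f"With trainee:    {procedures_with_trainee} procedures per shift")
print(f"Lost throughput: {procedures_solo - procedures_with_trainee} procedures")  # 2
```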
Assessing a specialist's level of training after a simulation course makes it possible to determine the effectiveness of the course and to plan further simulation-based training until the required quality of task performance is achieved.
To assess the effectiveness of simulation training for practical health care, the following closely interrelated indicators should be taken into account (a simple illustrative scoring sketch follows the list):
- availability and individualization of training;
- correspondence of the structure and content of training to the current needs and trends of healthcare;
- the level of technical equipment of the educational process;
- implementation of a multi- and interdisciplinary approach;
- the quality of methodological support of the educational process;
- indicators characterizing the results of control and evaluation activities;
- expected positive changes in the field of practical healthcare.
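As a purely illustrative sketch, these indicators could be gathered into a simple weighted checklist, as in the Python example below. The weights and the 0-5 scores are hypothetical assumptions, not values prescribed by the source or by any standard.

```python
# Hypothetical weighted checklist for the indicators listed above.
# Weights sum to 1.0; scores use an illustrative 0-5 scale.
indicators = {
    "availability and individualization of training":       (0.15, 4),
    "fit of structure/content to current healthcare needs": (0.20, 3),
    "technical equipment of the educational process":       (0.15, 5),
    "multi- and interdisciplinary approach":                (0.10, 3),
    "quality of methodological support":                    (0.15, 4),
    "results of control and evaluation activities":         (0.15, 4),
    "expected positive changes in practical healthcare":    (0.10, 3),
}

weighted_score = sum(weight * score for weight, score in indicators.values())
max_score = 5 * sum(weight for weight, _ in indicators.values())
print(f"Overall rating: {weighted_score:.2f} / {max_score:.2f}")
```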
References
Aebersold, M. (2016). The history of simulation and its impact on the future. AACN Advanced Critical Care, 27(1), 56-61.
Argani, C. H., Eichelberger, M., Deering, S., & Satin, A. J. (2012). The case for simulation as part of a comprehensive patient safety program. American Journal of Obstetrics and Gynecology, 206(6), 451-455.
Barsuk, J. H., Cohen, E. R., Feinglass, J., McGaghie, W. C., & Wayne, D. B. (2009). Use of simulation-based education to reduce catheter-related bloodstream infections. Archives of Internal Medicine, 169(15), 1420-1423.
Bonrath, E. M., Weber, B. K., Fritz, M., Mees, S. T., Wolters, H. H., Senninger, N., & Rijcken, E. (2012). Laparoscopic simulation training: testing for skill acquisition and retention. Surgery, 152(1), 12-20.
Gordon, J. A., Brown, D. F., & Armstrong, E. G. (2006). Can a simulated critical care encounter accelerate basic science learning among preclinical medical students? A pilot study. Simulation in Healthcare, 1(Inaugural), 13-17.
Hippe, D. S., Umoren, R. A., McGee, A., Bucher, S. L., & Bresnahan, B. W. (2020). A targeted systematic review of cost analyses for implementation of simulation-based education in healthcare. SAGE Open Medicine, 8, 2050312120913451.
Kothari, L. G., Shah, K., & Barach, P. (2017). Simulation-based medical education in graduate medical education training and assessment programs. Progress in Pediatric Cardiology, 44, 33-42.
Labrague, L. J., McEnroe-Petitte, D. M., Bowling, A. M., Nwafor, C. E., & Tsaras, K. (2019). High-fidelity simulation and nursing students’ anxiety and self-confidence: A systematic review. Nursing Forum, 54(3), 358-368.
Masiello, I. (2012). Why simulation-based team training has not been used effectively and what can be done about it. Advances in Health Sciences Education, 17(2), 279-288.
Palter, V. N., Grantcharov, T., Harvey, A., & MacRae, H. M. (2011). Ex vivo technical skills training transfers to the operating room and enhances cognitive learning: a randomized controlled trial. Annals of Surgery, 253(5), 886-889.
Scalese, R. J., Obeso, V. T., & Issenberg, S. B. (2008). Simulation technology for skills training and competency assessment in medical education. Journal of General Internal Medicine, 23(1), 46-49.
Van Sickle, K. R., Ritter, E. M., McClusky, D. A., Lederman, A., Baghai, M., Gallagher, A. G., & Smith, C. D. (2007). Attempted establishment of proficiency levels for laparoscopic performance on a national scale using simulation: the results from the 2004 SAGES Minimally Invasive Surgical Trainer—Virtual Reality (MIST-VR) learning center study. Surgical endoscopy, 21(1), 5-10.
Wayne, D. B., Didwania, A., Feinglass, J., Fudala, M. J., Barsuk, J. H., & McGaghie, W. C. (2008). Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest, 133(1), 56-61.
Willaert, W. I., Aggarwal, R., Van Herzeele, I., Cheshire, N. J., & Vermassen, F. E. (2012). Recent advancements in medical simulation: patient-specific virtual reality simulation. World Journal of Surgery, 36(7), 1703-1712.