Medical Simulation as Specialist Training

Introduction

The program to be evaluated

Educational standards require the introduction of simulation courses into the educational process so that all categories of students develop practical skills and abilities, teamwork skills, clinical thinking, and professional competencies. It is this area of medical education that has been chosen for evaluation.

The program and its various components

Modern medical simulation includes several categories of simulators. Manufactured patient simulators differ in their degree of fidelity to clinical practice. The most sophisticated of them allow students to imitate work with a real patient in conditions close to reality. Training usually concludes with a debriefing after the hands-on work: participants are interviewed by specially trained observers who use the scenario as a prompt to consolidate what has been learned. Such simulators (for example, SimMan© (Laerdal Medical, Stavanger, Norway) and iStan© (CAE, Sarasota, FL, USA)) have variable levels of programmable physiology that can either be assessed directly by participants (for example, pulse, respiration, and blood pressure) or displayed on physiological monitors (e.g., heart rate, invasive hemodynamic monitoring, and oxygen saturation) (Labrague et al., 2019). These technologies enable venous catheterization, restoration of airway patency, administration of drugs, management of labor, defibrillation, electrical cardioversion, and many other interventions.

Individual skill trainers usually allow participants to perform one procedure or a limited set of procedures. Examples include venous catheterization simulators, airway management simulators, pelvic examination simulators, and simulators for ultrasound-guided procedures. In surgery, the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) has developed and validated simulators on which, using real laparoscopic instruments, a trainee can demonstrate a variety of required laparoscopic skills (for example, intracorporeal and extracorporeal knot tying) (Van Sickle et al., 2007). These simulators are used for both training and competency assessment.

Virtual reality (VR) simulators take full advantage of video game technology, allowing the user to work through a natural interface (although the clinical interface is a digital reproduction). Some simulators provide haptic feedback, a computer-generated sensation of touch (e.g., resistance, tension) that mimics what the user would feel in an actual procedure (Willaert et al., 2012). Some newer VR simulators can import patient images, allowing the user to rehearse a procedure before performing it on a live patient.

A variety of screen-based simulators is also in use today. The first widespread simulator of this type was developed for the American Heart Association to support training in cardiovascular emergency management (Wayne et al., 2008). In addition to teaching content, these programs can walk learners through simulated cases displayed on the screen. Another type of screen-based simulator has been developed to give teams hands-on practice in communication and interaction during various simulated events. Some screen-based simulators reproduce highly complex multi-level scenarios used for education, independent practice, and certification.

The program’s overall purpose and goals

The primary purpose of creating medical simulation centers is to improve the professional training system for medical specialists and to develop the human resources of health care, thereby improving the quality, efficiency, and safety of medical care. Considerable experience has already been accumulated in applying simulation (imitation) methods in many areas of human activity that involve high risks and threats.

Comparing the program to recognized best practices and other related research

A literature review shows that simulation as a teaching tool has reasonable validity, since neither the learner nor the patient can come to harm. It allows learners to act deliberately during educational activities and to practice until a certain level of technical skill is achieved. Nevertheless, when using simulation as a learning tool, one should consider the additional investments required, which are not always monetary: physical space, the level of technical readiness, and the additional time spent by faculty members. Because of this investment, simulation has to prove its worth far more than most other educational innovations introduced into medical education (Masiello, 2012). Utility is also difficult to measure; in medical education it encompasses the cost and acceptability of the model, the rationality and effectiveness of training, its duration, and the impact of using the model on clinical outcomes (Kothari et al., 2017).

As simulation developed, so did the number of researchers willing to answer these questions. At the earliest stage, studies were mainly qualitative and aimed primarily to substantiate the acceptability of simulation as a teaching tool (Aebersold, 2016). The rationality and efficiency of knowledge and skill acquisition became the main topic of subsequent research. Later, close attention was paid to determining the optimal duration of simulation-based training and the most appropriate timing for refresher training. The most recent and most challenging line of research involves correlating simulation results with clinical outcomes.

A pilot study by Gordon et al. (2006) assessed the effect of medical simulation on the acquisition of knowledge in cardiovascular physiology. First-year medical students were randomly assigned to two groups: a standard form of practical training (control group) or standard practical training supplemented by work with a patient simulator (intervention group). They were assessed immediately after the semester in which the simulator was used and again after one year, using a test based on six figures. In the test conducted immediately after the session, there was a significant difference between the groups’ mean scores [mean 4.0 (control group) vs. 4.7 (intervention group), p = .005]. After one year, the difference remained [mean 4.1 (control group) vs. 4.7 (intervention group), p = .045]. Based on the comprehensive analysis, the addition of simulation proved to be a statistically significant factor in performance.
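For illustration only, the sketch below shows how such a two-group comparison of mean test scores could be carried out. The score vectors are synthetic stand-ins chosen to match the reported group means; they are not data from Gordon et al. (2006).

```python
# Illustrative only: synthetic scores standing in for the six-figure test
# results; the real data belong to Gordon et al. (2006) and are not shown here.
from scipy import stats

control = [3.5, 4.0, 4.2, 3.8, 4.1, 4.4, 3.9, 4.1]       # hypothetical control-group scores (mean 4.0)
intervention = [4.5, 4.8, 4.6, 4.9, 4.7, 4.6, 4.8, 4.7]  # hypothetical intervention-group scores (mean 4.7)

# Independent-samples t-test comparing the two group means.
t_stat, p_value = stats.ttest_ind(intervention, control)
print(f"Mean control = {sum(control) / len(control):.2f}, "
      f"mean intervention = {sum(intervention) / len(intervention):.2f}, p = {p_value:.3f}")
```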

In 2012, Bonrath et al. assessed the effect of simulation on the acquisition and retention of laparoscopic skills in novices (Bonrath et al., 2012). Thirty-six medical students were trained in six skills of increasing difficulty. Accuracy and speed of execution were assessed before the training course and immediately after its completion. The participants were divided into two groups of 18, one reassessed after six weeks and the other after 11 weeks. Immediately after the training period, all participants showed excellent mastery of the skills. The group assessed after six weeks retained 8 of 9 skills at a level corresponding to that recorded immediately after the preparatory course, whereas the group assessed after 11 weeks retained only 4 of 9 skills. The authors concluded that after 11 weeks the participants’ skills had deteriorated significantly, requiring additional practice. Evidently, a longer period of simulation-based practice is needed to consolidate skills and knowledge, a point discussed in more detail in the following study.

Palter et al. (2011) evaluated the impact of prior simulation-based technical skills training on the acquisition of cognitive knowledge in the operating room. They randomized 18 surgical residents into intervention and control groups. All participants were shown how to perform fascial suturing and were then allowed to perform the procedure on a simulator. The intervention group was allowed to practice on the simulator until a proficient level of technical skill was attained. Both groups then performed fascial suturing on a real patient in the operating room. Both execution technique and cognitive outcomes were statistically better in the intervention group. This highlights the potential benefit of acquiring technical skills in a simulation setting beforehand, as well as its role in enhancing cognitive learning in the operating room.

An example of a translational study using simulation techniques is the elegant study by Barsuk et al. (2009), which examined the effect of a simulator-based educational program on preventing catheter-related bloodstream infections (CRBSIs). The intervention was a 4-hour simulator-based training in ultrasound-guided central venous catheterization. The incidence of CRBSIs in the trained group was 0.5 infections per 1,000 catheter-days versus 3.2 infections per 1,000 catheter-days among patients treated before the physicians’ training (retrospective data). In the paired-group comparison, the results were even more striking: 0.5 versus 5.03 infections per 1,000 catheter-days, roughly a tenfold reduction after training. The findings imply that 140 bed-days were saved through staff training. Moreover, the researchers assessed the financial benefit of training physicians, estimating annual savings of about $800,000, a benefit-to-cost ratio of roughly 7:1.
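As a rough check of the arithmetic reported above, the following sketch recomputes the rate reduction and the benefit-to-cost figures from the numbers cited in the text; the implied training cost is back-calculated from the 7:1 ratio and is therefore an assumption, not a value reported by Barsuk et al. (2009).

```python
# Recomputing the headline figures cited above. The rates and annual savings
# come from the text; the implied training cost is a back-calculated assumption.
rate_before = 5.03   # infections per 1,000 catheter-days (paired comparison, before training)
rate_after = 0.5     # infections per 1,000 catheter-days (after training)
reduction_factor = rate_before / rate_after
print(f"Infection rate reduced roughly {reduction_factor:.1f}-fold")

annual_savings = 800_000   # USD, as reported in the text
benefit_cost_ratio = 7     # reported ratio of benefits to training investment
implied_training_cost = annual_savings / benefit_cost_ratio
print(f"Implied annual training cost: about ${implied_training_cost:,.0f}")
```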

Purpose of Evaluation

Why the program was selected for an evaluation

The acceptance of medical simulation implies a significant overhaul of medical education. Since 2000, there has been an impressive increase in the use of medical simulation in the US and worldwide (Scalese et al., 2008). This growth has occurred despite significant obstacles: funding, technology, and fear of change. It is therefore necessary to introduce comprehensive assessment of medical education programs that include medical simulation as a teaching tool. That is why this area of medical education has become the subject of evaluation.

The program’s stakeholders and their needs

The most important stakeholder here is the general public. Alongside the widespread adoption of simulators in the United States, there has been growing interest in patient safety. American healthcare has begun to change as a result of efforts to tackle the problem of medical errors. Operational measures taken include implementing clinical decision support systems for prescribing medications or ordering tests, developing electronic patient health records, using checklists, and much more (Argani et al., 2012). Medical education has not escaped this process: many believed that simulation could be helpful in analyzing medical errors and their causes. The opinions of patients and educators, as well as common sense, have led to the introduction of simulation to ensure the safety of various procedures. In the past, it was not uncommon for inexperienced clinical staff to perform the first procedures of their careers on real patients. This had consequences for both parties: fewer and fewer patients were willing to serve as “guinea pigs” for training novice clinicians.

According to one study examining patients’ attitudes towards trainees performing procedures on them, only 49% of patients were comfortable being the first patient on whom a trainee placed sutures (Argani et al., 2012). The figure dropped to 29% for probing and to 15% for lumbar puncture (Argani et al., 2012). Novices also fail to gain a positive experience, especially if the treatment outcome is negative. Supervised simulated practice in a relaxed atmosphere allows the beginner to gain a degree of competence and confidence away from the stressful environment of the clinic. One can make a mistake and correct it without having to explain what happened to the patient or an inconsolable relative. Over time, mastery can be achieved by combining simulation with real work on the ward.

Medical organizations are another major stakeholder. Health care facilities are under increasing pressure to reduce costs and improve services to the level that patients deserve and demand today. This must be achieved in an environment that requires greater efficiency at lower cost, often with reduced staffing. Facing a shortage of study time, many educational programs are looking for new approaches that would optimize the educational process under such constrained circumstances (Hippe et al., 2020).

The traditional lecture, in which a single speaker conveys content to an almost unlimited number of listeners, is probably the most economical, albeit least effective, method of teaching adult learners. Lectures are beginning to be supplanted by active small-group activities, and simulation is just one example of this. It has become evident that the clinical environment is not the best place to conduct classes, especially with newcomers (Hippe et al., 2020). In addition to the unpredictability of the practical experience gained, the tension of working with an actual patient, often in critical situations, creates an atmosphere that is not conducive to learning.

With the focus on clinical quality, the additional time needed for students’ hands-on training detracts from the efficiency required of clinic staff. For example, if a procedure performed by an experienced clinician usually takes 60 minutes in total, and including a trainee adds 20 minutes, then over an 8-hour shift two additional procedures could be performed by excluding the trainee from the process (Hippe et al., 2020). In the era of health care financial reform, hospital administrators take a keen interest in these issues. As the amount of information and the number of skills a physician is expected to possess increase, special attention is being paid to determining the best and most effective approach to their practical development.
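The throughput arithmetic in the example above can be made explicit; the sketch below uses only the figures given in the text (a 60-minute procedure, a 20-minute trainee overhead, and an 8-hour shift).

```python
# Throughput comparison for the worked example in the text.
shift_minutes = 8 * 60            # 8-hour shift
procedure_alone = 60              # minutes per procedure without a trainee
procedure_with_trainee = 60 + 20  # minutes per procedure when a trainee participates

procedures_alone = shift_minutes // procedure_alone                 # 8 procedures
procedures_with_trainee = shift_minutes // procedure_with_trainee   # 6 procedures
print(f"Without trainee: {procedures_alone} procedures per shift")
print(f"With trainee:    {procedures_with_trainee} procedures per shift")
print(f"Difference:      {procedures_alone - procedures_with_trainee} procedures")
```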

The evaluation’s overall purpose and goals

Assessing a specialist’s level of training after simulation-based instruction makes it possible to determine the effectiveness of the simulation course and to plan further training with simulation technologies in order to reach target levels of quality in the tasks performed.

To assess the effectiveness of simulation training for practical health care, it is necessary to take into account the following closely interrelated indicators:

  • availability and individualization of training;
  • correspondence of the structure and content of training to the current needs and trends of healthcare;
  • the level of technical equipment of the educational process;
  • implementation of a multi- and interdisciplinary approach;
  • the quality of methodological support of the educational process;
  • indicators characterizing the results of control and evaluation activities;
  • expected positive changes in the field of practical healthcare.

Evaluation Plan

An evaluation model

The Ralph Tyler model, developed in 1942, was selected for the evaluation. Tyler’s model is often called the “objectives model” or “target approach” and consists of four questions that must be answered in order to assess the effectiveness of training (Anh, 2018):

  1. What educational goals need to be achieved?
  2. What teaching methods should be chosen to achieve these goals?
  3. How will the training be organized to achieve effectiveness?
  4. How can one evaluate the effectiveness of training?

In answering these questions, Tyler presented ideas for measuring learning outcomes. Results are measured by compiling a list of training tasks that indicate how the learner’s behavior should change after training and practice (Bhuttah et al., 2019). Upon completion of training, data on changes in productivity and quality of work are analyzed and compared with the indicators of achievement of the training goals. Tyler’s model does not assess return on investment when analyzing training effectiveness, nor does it assess the impact of other factors on changes in productivity and work behavior. Because the indicators highlighted above imply an objective assessment of the content, methods, practical tasks, and goals of teaching, the target approach is the most appropriate.

Evaluation questions

The proximity to reality of the simulated processes influences the transfer of knowledge and skills. The general “realism of perception” of the simulated processes depends on various mechanical, environmental, physiological, and temporal factors (Rystedt & Sjöblom, 2012).

While simulation of isolated manipulations is of limited value, significant success can be achieved when simulation is integrated into the curriculum and both students and teachers perceive it as a means of delivering a particular part of that curriculum. Although one can assume that many factors increase the effectiveness of simulation-based learning, the evidence suggests that effective feedback is decisive. Researchers have yet to demonstrate how the enhanced cognitive and psychomotor skills gained in simulated settings translate into real-life clinical activities and improve patient safety and outcomes. Simulation can also serve as a powerful means of continuously assessing training performance. Applying Bayesian updating to continuously measure the effectiveness of actions preserves the value of intermediate knowledge assessments for the final evaluation.
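As a minimal sketch of what such continuous Bayesian assessment might look like, the snippet below applies a simple Beta-Binomial update to a trainee’s success rate across simulation sessions. The prior parameters and session counts are illustrative assumptions, not data from the evaluated program.

```python
# A minimal sketch of Bayesian (Beta-Binomial) updating for continuous
# assessment of a trainee's success probability across simulation sessions.
# The prior and the session data below are illustrative assumptions.

def update_beta(alpha, beta, successes, failures):
    """Conjugate update of a Beta(alpha, beta) prior with Binomial evidence."""
    return alpha + successes, beta + failures

# Weakly informative prior (roughly 50% expected success).
alpha, beta = 1.0, 1.0

# Each tuple: (successful attempts, failed attempts) in one simulation session.
sessions = [(3, 2), (4, 1), (5, 0)]

for i, (s, f) in enumerate(sessions, start=1):
    alpha, beta = update_beta(alpha, beta, s, f)
    mean = alpha / (alpha + beta)  # posterior mean success probability
    print(f"After session {i}: estimated proficiency = {mean:.2f}")
```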

At a time when political and social forces demanding improved patient safety advocate simulation-based medical education, it is crucial that educators use simulation in ways that maximize its positive impact. It is clear that simulation, when applied appropriately, has high educational value. Any pedagogical activity that reproduces a clinical setting for the purpose of teaching, training, evaluation, review, or research can be classified as a simulation program. Mechanical equipment, computers, or simulated patients can be used for this purpose. As with any training program, simulations should be clearly focused on the needs of the participants and the requirements of the environment; simulation for its own sake has very little value. Before a simulation program is developed, the evaluation considers the following essential questions:

  • Is simulation necessary?
  • Can the same problems be solved using other pedagogical approaches?
  • Who will the participants be?
  • How easy will it be to organize simulation sessions for the group?
  • What educational goals will the simulations achieve?
  • How will the simulations fit into the current teaching system and curriculum?
  • What faculty, financial, and infrastructure resources are needed, and are they available?

Sampling techniques

The panel sampling technique will be used. Panel research consists of multiple surveys of the same sample from the general population at different points in time; this repeatedly surveyed sample is called the panel (Chauvenet et al., 2020). The ability to assess the “net effect” and the magnitude of observed changes is a great advantage of the panel design. Panel studies are indispensable for testing causal hypotheses, especially when there is no “natural” criterion for separating the independent and dependent variables over time (Wang et al., 2017). They are also indispensable for analyzing more complex causal models with delayed effects (lags), feedback loops, and other important analytical categories (Wang et al., 2017). The main advantage of the panel design from a purely statistical point of view is the ability to separate actual changes in indicators from the variance associated with sampling error.

Data collection

The general population under study consists of students of medical educational institutions who train in simulation centers. Over three years, students in each cohort will be assessed on the knowledge gained during the training program and will provide feedback through a survey form. The units of analysis are the educational institutions that have included simulation education in their curricula.

Data analysis

Panel data combine cross-sectional data and time series and draw on the strengths of each. This makes it possible to build more adequate and meaningful models of the true causal relationships between variables, which is not possible with purely temporal or purely cross-sectional data. Thus, the relevant indicators of student training, their feedback, the compliance of the curriculum with the goals and objectives of medical education, and the local characteristics of the educational institutions will be compared over time. The time dimension will reveal the dynamics of the effectiveness and validity of including simulation training in the medical education curriculum.

Panel data contain a large number of observations and thus provide the researcher with more information; they are characterized by greater variation and less collinearity among the explanatory variables, give more degrees of freedom, and provide more efficient estimates. When analyzing time series alone, researchers often encounter multicollinearity of factors. The use of panel data is intended to reduce this collinearity, since the cross-sectional dimension increases the variation in the factors and makes the data more informative. Indeed, variation in the data can be decomposed into two components: variation between units of different sizes and characteristics and variation within units, with the latter usually larger (Wang et al., 2017). In addition, more informative data can lead to more reliable parameter estimates.
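To make the panel logic concrete, the sketch below estimates a within-institution (fixed-effects) slope on a tiny synthetic panel. The column names (institution, year, sim_hours, assessment_score) and all values are hypothetical and serve only to illustrate how between-unit differences are removed before estimation.

```python
# A minimal fixed-effects (within) estimate on a long-format panel.
# Columns and data are hypothetical, for illustration only.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "institution": ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "year":        [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "sim_hours":   [10, 20, 30, 5, 10, 15, 0, 10, 25],
    "assessment_score": [70, 75, 82, 65, 68, 71, 60, 66, 74],
})

# Within transformation: subtract each institution's mean to remove
# time-invariant institutional characteristics (entity fixed effects).
cols = ["sim_hours", "assessment_score"]
demeaned = df[cols] - df.groupby("institution")[cols].transform("mean")

# Slope of the demeaned outcome on the demeaned exposure = within estimate.
x = demeaned["sim_hours"].to_numpy()
y = demeaned["assessment_score"].to_numpy()
beta = (x @ y) / (x @ x)
print(f"Estimated within-institution effect per simulation hour: {beta:.3f}")
```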

Possible ethical issues

Since the evaluation of the discussed training program involves assessing students’ skills, issues of privacy, anonymity, and autonomy must be addressed first. Researchers should explain the study’s objectives and describe what is expected of the subject (e.g., filling out a questionnaire, answering questions verbally). In addition, the researcher should describe the potential risks and benefits associated with participation and emphasize that participation is voluntary and confidential. Consent to participate is most often given by reading and signing an “informed consent” form. Informed consent includes a description of the following components: the purpose of the study; the participation procedure; the procedure for withdrawing from the study; possible risks and inconveniences for the participant (for example, discomfort); the benefits of the research to the community and/or the participant; the duration of the study; compensation for participation (if applicable); contact information; confidentiality of information; and a statement that participation is voluntary and that withdrawal will not lead to any negative consequences.

Recommendations

Findings summary

A new simulation program or any change to an existing program can be related to the model described above. The distribution of resources, infrastructure, and the place of simulations in the learning process will depend on this. At the same time, it is important to determine whether a simulation program is the best method to achieve the set goals or whether they can be achieved using alternative teaching aids.

A well-equipped simulation center is a valuable educational resource. An appropriate structure improves simulation accuracy and feedback quality, which in turn improves training efficiency. A well-organized center also makes it possible to raise the level of simulation through the use of new technologies. Before setting up a simulation center, the following factors need to be considered to help select the appropriate structure and maximize the effectiveness of simulation-based medical education.

Architecture: There are basic principles of organization and planning that are common to all simulation centers. The architecture and design of the simulation center should allow the following:

  1. Play different scenarios in realistic conditions. Ideally, this is achieved through a dedicated room that can flexibly transform into different clinical settings depending on the needs.
  2. Operate the simulators out of the students’ sight. As a rule, control is carried out from an adjacent control room; ideally, it should be possible to observe the action directly from the control room through one-way glass.
  3. Allow students to observe, in real time, what is happening in the main room from an adjacent classroom through one-way glass or via video link.
  4. Accommodate on-screen simulators, virtual reality, and standardized patients where planned, which require additional space and conditions. The simulation center should also include rooms for equipment maintenance and servicing as well as storage facilities.

Gas supply: As a rule, a supply of oxygen and compressed air is necessary for the operation of the simulators. Some mannequins require a nitrogen line. If the simulation is to evaluate end-tidal carbon dioxide, a CO2 supply is required. Some mannequins allow the use and measurement of volatile anesthetics (through various modifications); in this case, a gas scavenging system is needed.

It is also essential to keep future improvements to the center in mind. All simulation centers require either a centralized hospital gas supply or a separate supply from cylinders. In most cases, computerized mannequins cannot function without gases, except for the latest generations of portable simulators with built-in compressors. To use modern Laerdal mannequins, gas must be supplied directly to the classroom; if a medical educational institution plans to upgrade to METI™ HPS mannequins, it will need to provide gas piping to the control room.

Audio and video equipment: This equipment varies significantly from center to center.

There are two options for such equipment. The first is to equip the center with proprietary hardware and software from Laerdal™ or METI™; the second is to use equipment from independent companies adapted for specific needs. In either case, the equipment should be capable of recording in real time, followed by playback and discussion. The advantages of the simulator manufacturers’ proprietary software include the ability to register actions and events electronically. In both cases, one can record from multiple cameras and capture the readings of the simulator monitor. Using such programs to record and replay students’ actions increases the effectiveness of the ensuing discussion.

In addition, during the playback of various scenarios, it is necessary to ensure the possibility of confidential communication between teachers in the classroom and the control room using wireless headsets. It is also mandatory to have built-in microphones in the rooms, allowing students and teachers to hear each other from different center rooms. The advantage of a local area network in a simulation center is that it reduces the need for multiple audio and video adapters because the software allows the recording to be played back for discussion over the network. It also allows any simulated scenario to be demonstrated over the Internet to students elsewhere.

In modern mannequins, embedded software allows the instructor to operate them wirelessly.

Potential problems: Simulation centers are often created by re-equipping existing premises, which does not always provide an optimal structure and reduces the effectiveness of simulation training. As a rule, this approach limits the options for audio and video equipment, as well as the gas supply. Creating simulation centers with optimal conditions requires careful prior planning. That is why, before work begins, it is necessary to agree on the budget required to purchase equipment and create the appropriate conditions. If these requirements are not met, shortcomings may appear during the construction phase that will affect the quality of training in the future. Such planning flaws can prevent scenarios from being played out with high realism and effective feedback.

Recommendations based upon the findings

There are several important factors to consider before designing a simulation center. Architecture, audio and video equipment, and the gas supply must be carefully planned. It is vital that the center’s structure meets educational requirements, ensuring the highest possible simulation accuracy and quality feedback. An irrational structure can significantly limit the center’s use and reduce the quality of training. The structure of multimodal simulation centers, which can reproduce patient rooms and employ interactive simulation and other cutting-edge computer technologies, requires careful analysis of many additional factors at the planning stage.

There are several stakeholders: specialist educators, training funds, strategic health authorities, and individual clinics. For educators, the emphasis when communicating findings and recommendations should be on the learning content, since teaching staff are responsible for transferring practical standards to students through the delivery of that content. For the other stakeholders, the emphasis should be on the benefits of contributing to the improvement of simulation training programs by identifying lagging criteria and highlighting the gains their development brings to the national healthcare system and to specific clinics. Above all, emphasis is needed on practical effectiveness: in the medical environment, a practical result means positive dynamics in indicators, productivity, and the overall efficiency of institutions in improving the quality of medical care. The need for interaction to solve these problems is thus relevant both for each individual player and for society as a whole.

References

Aebersold, M. (2016). The history of simulation and its impact on the future. AACN advanced critical care, 27(1), 56-61.

Anh, V. T. K. (2018). Evaluation models in educational program: Strengths and weaknesses. VNU Journal of Foreign Studies, 34(2), 3-29.

Argani, C. H., Eichelberger, M., Deering, S., & Satin, A. J. (2012). The case for simulation as part of a comprehensive patient safety program. American journal of obstetrics and gynecology, 206(6), 451-455.

Barsuk, J. H., Cohen, E. R., Feinglass, J., McGaghie, W. C., & Wayne, D. B. (2009). Use of simulation-based education to reduce catheter-related bloodstream infections. Archives of internal medicine, 169(15), 1420-1423.

Bhuttah, T. M., Xiaoduan, C., Ullah, H., & Javed, S. (2019). Analysis of curriculum development stages from the perspective of Tyler, Taba and Wheeler. European Journal of Social Sciences, 58(1), 14-22.

Bonrath, E. M., Weber, B. K., Fritz, M., Mees, S. T., Wolters, H. H., Senninger, N., & Rijcken, E. (2012). Laparoscopic simulation training: testing for skill acquisition and retention. Surgery, 152(1), 12-20.

Chauvenet, A., Buckley, R., Hague, L., Fleming, C., & Brough, P. (2020). Panel sampling in health research. The Lancet Psychiatry, 7(10), 840-841.

Gordon, J. A., Brown, D. F., & Armstrong, E. G. (2006). Can a simulated critical care encounter accelerate basic science learning among preclinical medical students? A pilot study. Simulation in healthcare, 1(Inaugural), 13-17.

Hippe, D. S., Umoren, R. A., McGee, A., Bucher, S. L., & Bresnahan, B. W. (2020). A targeted systematic review of cost analyses for implementation of simulation-based education in healthcare. SAGE open medicine, 8, 2050312120913451.

Kothari, L. G., Shah, K., & Barach, P. (2017). Simulation-based medical education in graduate medical education training and assessment programs. Progress in Pediatric Cardiology, 44, 33-42.

Labrague, L. J., McEnroe‐Petitte, D. M., Bowling, A. M., Nwafor, C. E., & Tsaras, K. (2019). High‐fidelity simulation and nursing students’ anxiety and self‐confidence: A systematic review. Nursing Forum, 54(3), 358-368.

Masiello, I. (2012). Why simulation-based team training has not been used effectively and what can be done about it. Advances in health sciences education, 17(2), 279-288.

Palter, V. N., Grantcharov, T., Harvey, A., & MacRae, H. M. (2011). Ex vivo technical skills training transfers to the operating room and enhances cognitive learning: a randomized controlled trial. Annals of Surgery, 253(5), 886-889.

Rystedt, H., & Sjöblom, B. (2012). Realism, authenticity, and learning in healthcare simulations: rules of relevance and irrelevance as interactive achievements. Instructional science, 40(5), 785-798.

Scalese, R. J., Obeso, V. T., & Issenberg, S. B. (2008). Simulation technology for skills training and competency assessment in medical education. Journal of general internal medicine, 23(1), 46-49.

Van Sickle, K. R., Ritter, E. M., McClusky, D. A., Lederman, A., Baghai, M., Gallagher, A. G., & Smith, C. D. (2007). Attempted establishment of proficiency levels for laparoscopic performance on a national scale using simulation: the results from the 2004 SAGES Minimally Invasive Surgical Trainer—Virtual Reality (MIST-VR) learning center study. Surgical endoscopy, 21(1), 5-10.

Wang, M., Beal, D. J., Chan, D., Newman, D. A., Vancouver, J. B., & Vandenberg, R. J. (2017). Longitudinal research: A panel discussion on conceptual issues, research design, and statistical techniques. Work, Aging and Retirement, 3(1), 1-24.

Wayne, D. B., Didwania, A., Feinglass, J., Fudala, M. J., Barsuk, J. H., & McGaghie, W. C. (2008). Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: a case-control study. Chest, 133(1), 56-61.

Willaert, W. I., Aggarwal, R., Van Herzeele, I., Cheshire, N. J., & Vermassen, F. E. (2012). Recent advancements in medical simulation: patient-specific virtual reality simulation. World journal of surgery, 36(7), 1703-1712.
