
REVIEW ARTICLE
Year : 2022  |  Volume : 11  |  Issue : 2  |  Page : 95-103

Defining medical simulators for simulation-based education in EUS: Theoretical approach and a narrative review


1 CAMES Engineering, Copenhagen Academy for Medical Education and Simulation, Centre for Human Resources and Education, The Capital region of Denmark, Copenhagen, Denmark
2 Department of Surgery and Transplantation, Copenhagen University Hospital Rigshospitalet, Copenhagen, Denmark

Date of Submission: 11-May-2021
Date of Acceptance: 30-Nov-2021
Date of Web Publication: 23-Apr-2022

Correspondence Address:
Morten Bo Søndergaard Svendsen
CAMES Engineering, København Ø
Denmark

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/EUS-D-21-00123

  Abstract 


Choosing the right simulator for tasks in simulation-based education in medicine affects the skills trainees acquire. However, there is a shortage of shared vocabulary for describing medical simulators and the contexts in which they are used. We propose methods for approaching the task of choosing and defining the simulators needed, regardless of whether this is an acquisition or a development process. We advocate defining the simulator's requirements before making any choices regarding the development process. Keeping the simulator simple brings multiple advantages, both educationally and in terms of development. Issues in validating simulators are discussed and highlighted as points where interprofessional communication is likely to fail. The following conventional terms in medical education are problematic for establishing clear communication: virtual reality, fidelity, validation, and simulation. The text closes with a short discussion on applying the methods in an EUS/endobronchial ultrasound (EBUS) context. The work is the authors' interpretation of an invitation with the title “Development of EUS and EBUS training models and simulators.”

Keywords: simulator design, simulator choice, curriculum design


How to cite this article:
Svendsen MB, Achiam MP. Defining medical simulators for simulation-based education in EUS: Theoretical approach and a narrative review. Endosc Ultrasound 2022;11:95-103

How to cite this URL:
Svendsen MB, Achiam MP. Defining medical simulators for simulation-based education in EUS: Theoretical approach and a narrative review. Endosc Ultrasound [serial online] 2022 [cited 2022 May 16];11:95-103. Available from: http://www.eusjournal.com/text.asp?2022/11/2/95/343771




  Introduction


This text was written in response to an invitation to write an article on “Development of EUS and endobronchial ultrasound (EBUS) training models and simulators.” It covers the subject of when and how to make a medical simulator. It does not address whether simulation before EUS practice is needed,[1] nor does it compare existing simulators.[2] The text is an expert opinion/narrative review on making decisions regarding medical simulators in connection with the curriculum. The only assumption is that medical simulators are designed for simulation-based education (SBE). The specifics of simulators for EUS, as a combination of skill trainers for endoscopy and ultrasound, are covered in the final part.

Making medical simulators, regardless of type, is often a polytechnical venture in which creators have to decide on subjects that fall within a wide range of fields, e.g., materials science, computer science, biometry, physics, electronics, educational research, and medicine. Thus, there is an apparent need to acknowledge that cross-professional communication can be challenging, especially as one word might have different connotations, or even different literal meanings, between professions. In the authors' experience, the terms simulation,[3] fidelity,[4] virtual reality, and validation particularly confuse the development of medical simulators. These terms are either imprecise (simulation, fidelity), context-dependent in their connotations (e.g., fidelity, virtual reality, simulation), or carry a different meaning altogether (e.g., validation, simulation). To make sure the text is read as intended, we do not refer to immersive virtual reality when mentioning the term but refer to virtual reality as per medical education conventions.[5],[6]

Hence, suppose a physician can no longer improve a simulator and needs professional assistance. In that case, it is difficult to convey to those outside the sector what the modality does at its present level of development and what it still needs to do, e.g., “the pancreas needs to have higher fidelity.” Likewise, the engineering students who have developed a simulator might present it as “ensuring patient safety,” or the data scientist may be unable to get traction for her algorithm that provides “high confidence feedback to endoscopists.” When these statements are addressed to a person of a different occupation, they may, at best, sound excessive or, at worst, unintelligible. More frequently, they are merely expressed in one sense and heard in another, causing uncertainty rather than certainty. What is lacking is a lingua franca, a common language.

This text tries to help the reader avoid such confusion by describing general considerations when making medical simulators. For many of the decisions and considerations, there is no need to reinvent the wheel. The text can be read from the perspective of wanting to develop a simulator or of having to choose between simulators; the challenge is the same.


  General Considerations Regarding Medical Simulators


Prerequisites: What and how are you going to teach?

Deciding on how and what to teach interacts with the desired properties of the medical simulator(s) used, cf. the processes in writing problem-based learning (PBL) scenarios.[7] In a perfect world, SBE curricula and plans are ready before developing or choosing the simulator(s).[8] Naturally, however, during the process of writing PBL scenarios,[7] the uses and limitations of simulators are likely to change with the scenario development. When the curriculum determines the properties of the simulators, establishing requirements is somewhat straightforward. However, that is not always the case. Often the SBE is determined by the available simulators and not the other way around; then, simulator development starts by focusing on features lacking from other simulators. There is readily available evidence of best practice in SBE.[9] Designing medical simulators for SBE involves cognitive interactivity, applying various learning techniques, and adjusting complexity.[9] The difficulty lies in producing what makes sense, finding the core of the skills needed, and developing something that facilitates learning these skills.

What to make?

By definition, a simulator represents real conditions (Cambridge Dictionary, 2020); i.e., a simulator is not the real thing. Often, the discussion then arrives at fidelity. However, as mentioned in the introduction and as discussed by Hamstra et al.,[4] fidelity is an unfortunate term because it is context-dependent: the same simulator can be high-fidelity in one context and low-fidelity in another. Fidelity is not a constant property, and even the literal meaning of the word makes it difficult to interpret. People do not see the same properties when presented with a simulator. The Umwelt, the perceptional awareness (vision, touch, hearing, smell, taste),[10],[11] of an expert endoscopist and that of a novice cannot be considered the same.

A medical simulator's overall properties can be divided into physical resemblance and functional task alignment, i.e., the simulator's functional correspondence to the context in which it is used.[4] When considering people's perception, most simulators need properties that look or feel right; sound is often included but rarely of specific interest. The context in which the simulator will be used is paramount. There is no established language to describe the simulators themselves. However, Alinier has described levels of technological simulation, with their advantages and disadvantages,[3] which assist in determining the level of usage connected to the correspondence of teaching sessions to real-world scenarios.[3] The classification ranges from level 0, written simulation, to level 5, computer-driven, trainee-led interactive simulators.

Functional task alignment/functional correspondence

First, consider the simulator's usage: is it for examination or teaching, testing or building competence? In the context of building competence, a decomposition of the skills needed will help set a direction. Starting simple and building each specific skill separately is preferable (see the following sections) [Figure 1].
Figure 1: Theoretical progression and change in tasks. Changing the simulated tasks as competence in a single subject is acquired



Physical resemblance

High levels of physical resemblance are commonly sought[12] and seen as positive. However, in the case of SBE, this is likely not the case: higher resemblance adds to the complexity and imposes a higher cognitive load on the trainees.[13],[14] Stated another way, more manageable tasks enable better performance,[15] and better performance is the goal of SBE. Once competent at one level of complexity, the trainee can advance. Thus, it is advantageous to have different levels of difficulty[9] in mind from the start when deciding on medical simulator requirements. See a conceptual graph in [Figure 2].
Figure 2: Optimal simulator realism. Depending on the skill level of the trainee, different realism and difficulty of the simulator and related tasks are required



A relevant example, with available evidence, is laparoscopic camera navigation. At face value, the apparent difficulties in navigating a laparoscopic camera are the oblique view, the limited workspace, and the hinged motion via the trocar, all contributing to many unintuitive degrees of freedom. The highest resemblance, rationally, would be to practice camera navigation on patients, yet it has been suggested that using a simulator is not only as good but also more time-efficient.[16] Concerning task correspondence and procedural resemblance, when practicing camera navigation, trainees gain competence regardless of whether they practice the specific skill or the more complex whole procedure.[17] Finally, a conventional (in medical education terms, i.e., nonimmersive) virtual reality simulator with high visual complexity is equal to a simple box trainer for gaining camera navigation skills.[18] All of this suggests that simple suffices; the requirement for the simulator's functional correspondence is to facilitate the skill of navigating a camera with an oblique view, limited workspace, and hinged motion. Everything beyond that is only relevant to higher skill levels or to completely different skills altogether. For laparoscopic surgery, a review even concludes that, for basic skills, there is no benefit in using conventional virtual reality simulators compared to simple task trainers,[19] and continues that advanced procedures in virtual reality still need to demonstrate educational value. Developing simple things is much more straightforward, less resource-intensive, and seemingly more educationally rational. Discern and define what one needs (need to have) and what one wants (nice to have).


  A Product Development Approach: The Minimum Viable Simulator


A minimum viable simulator could be defined as a simulator with only the core features needed for practicing a specific task. The title of this section is inspired by the minimum viable product (MVP) idea: the minimum set of features that is marketable. The minimum viable approach is itself a fuzzy definition with many connotations (Lenarduzzi and Taibi, 2016). However, utilizing this way of thinking supports product design, facilitates communication, and supports validated knowledge and experience generation about the product's (read: simulator's) features (Duc and Abrahamsson, 2016). It is an approach that enables a learning process regarding medical simulator features and their usage.

Different approaches to fulfilling the product requirements will provide different properties, good and bad. Development takes time, and the more details specified, the more resources, effort, and thus time required. Even though a given feature's intention is good, its effect does not necessarily equal its intention, e.g., tutoring guidance[20],[21] or feedback.[22] Virtual reality simulators are sometimes claimed to be better because they also provide metrics (scores) to aid motivation. Nevertheless, the metrics are not necessarily useful in assessment.[23],[24] If a metric does not vary with skill, is it rational to emphasize it as a beneficial feature and spend resources on it? Avoid falling victim to the sunk cost fallacy.[25],[26] By testing and re-iterating the simulator, one is less likely to spend resources on unachievable features with only illusory effects.
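Whether a candidate metric actually varies with skill can be checked cheaply before resources are committed to it. A minimal sketch, using hypothetical scores and a rough effect size (Cohen's d with pooled standard deviation) between a novice and an expert group:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Rough effect size between two groups, using the pooled sample SD."""
    na, nb = len(group_a), len(group_b)
    pooled_sd = (((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)) ** 0.5
    return (mean(group_b) - mean(group_a)) / pooled_sd

# Hypothetical values of a simulator metric for two skill groups
novices = [42, 48, 45, 50, 44, 47]
experts = [71, 68, 75, 70, 73, 69]

d = cohens_d(novices, experts)
# A large |d| suggests the metric discriminates by skill;
# a value near zero suggests it is not worth emphasizing.
print(f"Cohen's d = {d:.2f}")
```

The data and threshold here are purely illustrative; in practice, the groups would come from a pilot trial with the simulator, and a formal test would accompany the effect size.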

In developing a medical simulator for SBE, the step that takes the project furthest toward completion is defining the minimally effective platform for the skills taught. Once version zero is built, simulate its usage, test it, and determine what works and what does not. Development is iterative, and keeping it simple is an advantage. After gaining experience with the first versions, the simulator converges toward being specifically designed and optimized for teaching the specific curriculum of skills. Some peer-reviewed literature on simulator development documents this journey through versions.[27],[28] The goal is to reach a minimum viable simulator, use it, learn from the experience, and improve what needs to be improved concerning functional correspondence,[4] other functionalities, design, reliability, and usability, bearing in mind that some essential features, or unintended effects, will only be discovered through the usage of the simulator(s).

As the simulator traverses versions, more complexity is likely to be added [cf. [Figure 2]], and experience in using the simulator in its specific context(s) grows. Keep this in mind, and the realization of even simpler simulators becomes likely, emphasizing skill-building of some subschema of the defined skills. Thus, one may invent a simulator that precedes the initial idea, has functional correspondences that allow the transfer of skill, and allows practice at a lower level of cognitive load, enabling better performance.


  Validation


In medical education, discussions of validity usually refer to the validity of the assessment process. The primary take-home message is that validation is a process that evaluates the appropriateness of interpretations of assessment results.[29] In the context of medical simulators, Frameworks for Assessment in Medical Education (FAME), specifically those of Messick and Kane,[29] apply to simulator usage contexts via the assessment tools used. However, outside the context of assessment, evaluating medical simulators using these frameworks falls short.

As mentioned in the introduction, however, validation has different connotations within different professions. Different professions have different references, and validity concepts are no exception.

Software and hardware developers have standards they can choose to follow. Validation in a technical context regards conformance to predefined conditions: it concerns whether the product meets the customer's needs, whereas verification concerns whether the product meets the specifications (IEEE-STD610, 1990).[30] As opposed to FAME, validation and verification, when completed, are endpoints for different versions of a product, scrutinized in the scope of predefined specifications and requirements. Validation is at the core of development; it is inherently difficult to identify the customer's real needs (or one's own)[31],[32] and to fulfill them. Consider “I could use some coffee now”: it could mean coffee, but if it is the context of drinking coffee that is sought (social contact, a break, the feeling of a warm drink when cold), tea would likely suffice. Establishing user needs is a field of research in itself, with frameworks for approaching the task.[33] Now consider an educator asking, “I would like a simulator for EUS.” It is a spoken user need but gives no clues to the requirements of the simulator. In the context of SBE, the overall need could probably be fulfilled by simulators (plural) that can assist in educating physicians in EUS. As previously stated, the process of simulator development should be preceded by a definition of the requirements; know what the needs are before beginning, otherwise validation is impossible.

As suggested in the previous section, the MVP approach allows for validated development. A standard model for approaching technical validation is the V-model, which consists of a V-shaped flow of processes. The descending part (from the left) consists of decomposition and definition, from user needs down to detailed specifications. The ascending part consists of integrative steps to inspect, verify, and validate the technical aspects, from the build and specifications up to user needs.[34],[35] The application of the V-model can be highly controlled.[35] However, the primary take-home message is the process of decomposing the simulator and skills to allow definitions.

For development purposes, the user needs themselves can be validated, in combination with updating requirements, by pilot trials, e.g., is a low physical resemblance sufficient for ultrasound imaging of the pancreas; which details need to be improved, and which removed (e.g., Exhibits 3 and 5 in[35])? At the bottom of the V-model, the validation of functionality can be at the level of checking (yes/no) whether the required output is returned for given inputs, e.g., add(2, 3), add(1, 4), and add(4, 1) should all return 5.
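The yes/no functional checks at the bottom of the V-model can be expressed as simple automated tests. A minimal sketch, using the add() example from the text (the function and checks are illustrative only):

```python
def add(a, b):
    """Function under validation: returns the sum of two numbers."""
    return a + b

# Each functional check is a yes/no comparison of the actual output
# against the required output for a given input.
checks = [((2, 3), 5), ((1, 4), 5), ((4, 1), 5)]

for args, expected in checks:
    result = add(*args)
    assert result == expected, f"add{args} returned {result}, expected {expected}"
print("All functional checks passed")
```

In a simulator project, the same pattern applies to any well-specified component: a sensor reading, a scoring routine, a rendered image dimension, each checked against its predefined specification.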

Once the simulator is ready for use, more complex aspects can be validated. There is, as such, no framework for validating a simulator itself. The simulator's effect can be validated by comparison, as is done with medical device products. For example, an effect can be validated via a hypothesis such as “Does simulator X, used in context Y, increase the competency of trainees in ultrasound?” Simulator usage can also be assessed in combination with a FAME for an assessment tool, e.g., “Does OSAUS provide a reliable assessment tool in context Z (using simulator X)?” [Table 1].
Table 1: Validation and its different meanings

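An effect hypothesis of the kind above can be evaluated by comparing competency scores between a group that trained on the simulator and one that did not. A minimal sketch with hypothetical scores, using a one-sided permutation test on the difference in means (standard library only):

```python
import random
from statistics import mean

def permutation_p_value(control, treated, n_perm=10000, seed=0):
    """One-sided permutation test: probability of seeing a mean
    difference at least as large as the observed one by chance."""
    rng = random.Random(seed)
    observed = mean(treated) - mean(control)
    pooled = control + treated
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # random re-assignment to the two groups
        perm_treated = pooled[:len(treated)]
        perm_control = pooled[len(treated):]
        if mean(perm_treated) - mean(perm_control) >= observed:
            count += 1
    return count / n_perm

# Hypothetical competency scores (e.g., an OSAUS-like rating)
control = [12, 14, 11, 15, 13, 12, 14]   # no practice on simulator X
treated = [18, 17, 20, 16, 19, 18, 21]   # practiced on simulator X

p = permutation_p_value(control, treated)
print(f"p = {p:.4f}")  # a small p supports the effect hypothesis
```

The data here are invented for illustration; a real validation study would of course also need adequate group sizes, randomization, and blinded assessors.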


Finally, it is essential to note, in connection with FAME and assessing skills, that a simulator intended for assessment differs from one intended for learning. The details that make a simulator sufficient for testing trainees are substantially different from those optimized for learning (see the previous sections). In a test setting, the simulator should enable expert performance at a high level of task difficulty or detail. A simulator for assessment does not allow novices to perform perfectly, the opposite of the optimal simulator for learning.


  EUS


“Please help me make training models and simulators for EUS and EBUS training.”

Definition of needs

At the time of writing, there are some criteria for competence in EUS.[36] In the same journal issue as this text, there is a needs assessment describing what needs to be taught.[37] Concerning the definition of a simulator, few cues exist as to what is required, and there is no well-defined curriculum.[8] Focusing on the technical skills, a common need from both of the sources mentioned earlier is intubation of the esophagus or duodenum, focused on procedural points rather than specific skills. By contacting the authors of the needs assessment, we had the privilege of early access to the data. In the first round of the needs assessment, 84% of the first 45 respondents mentioned items related to “endoscope,” 62% tasks related to “anatomy,” and 6% points related to “ultrasound”; one respondent mentioned handling an “oblique view endoscope” and one mentioned “fine motor movements.” An interview conducted by author MBSS (engineer) to understand the challenges of EUS confirmed that the main difficulty, compared to conventional endoscopy, is the oblique view of the EUS scope (Pers. Comm., Prof. L. B. Svendsen). In our thought experiment, this information points in the direction that the first prerequisite is handling the endoscope safely; second, mastering ultrasound; third, interpreting the ultrasound image; and last, performing EUS procedures such as fine needle aspiration (FNA). Further, we focus on the points “Endoscope insertion duodenum”[37] and “image and identify anatomy close to the duodenum.”[36]

Curriculum focus

Different simulators and different technological simulation levels can be utilized to obtain the competencies previously described.[36] Considering requirements for technical skills [Figure 3] and interpretative skills [Figure 4], one will not end up with the same simulator(s), nor the same approach to teaching the competencies. Thus, it is important again to remember the learning objectives and the progression of learning exemplified in [Figure 1].
Figure 3: Relations between technical skills and medical simulator. The left column lists the required technical skills; the middle section shows the broader terms that the specific skills relate to. The rightmost box illustrates whether the broad-term subjects can be practiced on physical or digital simulators

Figure 4: Relations between interpretational skills and medical simulation medium. The left column lists the required interpretational skills; the middle section shows the broader terms that the specific skills relate to. The rightmost box illustrates whether the broad-term subjects can be practiced on physical simulators, digital simulators, or media at TSL0 (reading, etc.)



If the intended program targets already competent endoscopists, they need to be acquainted with the echoendoscope.[36] The simplest way, utilizing SBE, would be to practice on a conventional upper gastrointestinal endoscopy model, perhaps rebuilt with hidden structures visualized with ultrasonography only, but using the new equipment. Is there any need to tire trainees by having them reach the duodenum before they can practice imaging and identifying the anatomy close to the duodenum? Is there any need for the simulated anatomy to have a high physical resemblance when practicing imaging and handling the echoendoscope, or will something equivalent to imaging squares and circles in a cavity work for the basic steps? Such a model could also function as a phantom for practicing needle handling, e.g., FNA or lumen-apposing metal stents (LAMS).

Hence, our small curriculum requires one simulator (a physical upper gastrointestinal endoscopy model) for perfecting navigation of the oblique view endoscope; one simulator (physical, anatomical resemblance not necessary, ultrasound-compatible material such as PVA or ballistic gel) only for practicing ultrasound imaging with the endoscope; and one simulator (physical, anatomical resemblance not necessary, but with physical properties simulating tissue) only for practicing ultrasound-guided invasive procedures, e.g., FNA or LAMS, with the endoscope. Finally, in combination with simulation at level 0 (knowledge via reading and studying),[3] a simulator with a high resemblance concerning anatomy and its appearance on ultrasound is needed to establish competence in recognizing and identifying anatomy on ultrasound [Figure 3] and [Figure 4]. These last steps are similar to a recent demonstration that multistage training was beneficial in the acquisition of skills in EUS.[38]

Specific simulator considerations

Different models and approaches for training EUS are available and have been described,[2] many of which would be directly utilizable in the contexts described above, though only in the specific regime of endoscopy and ultrasound combined. In this last section, we will widen the field of view to endoscopy in general.

High-quality endoscopy is multidimensional.[39] Initial training for EUS could likely utilize conventional criteria for competence in, e.g., colonoscopy,[40] just performed with an oblique viewing endoscope. However, one cannot assume that digital simulators will provide meaningful insight,[24] or even competence:[41] virtual/digital simulation is not equally suited to all endoscopic procedures,[41] likely because contextual use is not considered. Digital simulators have their strength in the possibility of varying the visuals and producing resemblant imaging, and their weakness in a lack of haptic congruence (feel). A still technically contemporary overview of digital endoscopy simulators and their usage is available in the literature.[42] It is unknown, however, whether any of these simulators allow practicing with an oblique viewing endoscope. Training the visual skills alone has been demonstrated to provide competence in evaluating normal anatomy and to increase efficiency in an EUS program.[43]

Considering EUS and EBUS as advanced endoscopic procedures, the trainees are likely already skilled in endoscopy before training, allowing for high physical resemblance and high cognitive load simulators. This has been stated in other words by a proposal to utilize animals for advanced endoscopic techniques only;[44] e.g., porcine models do find usage for FNA.[45],[46] However, using animals for training poses extra considerations regarding regulatory demands (e.g., FELASA, IACUC), the dedication of equipment for animal use only, and trainees' concern for animal welfare, all warranting inanimate simulators that compromise some physical resemblance.[28],[47] Live animals (high resemblance) should be limited to advanced procedures to ensure trainees benefit from the experience.[44] Concerning the Three R's tenet of animal experimentation (Reduction, Refinement, Replacement),[48],[49] prepractice on inanimate simulators (initial replacement) would ensure that trainees have a higher level of competence, enabling more skill acquisition (combined, leading to refinement) when performing on animal models, ultimately leading to a reduction in the number of animals used.


  Summary


Regardless of whether one stands before an acquisition or a development process, it is recommended to treat the definition of the needs as a prerequisite. Answering what the simulator's real usage will be (learning or examination) takes you a long way.

Further, evidence from both educational and product development research suggests starting simple. For educational purposes, simplicity enables perfecting performance on specific subtasks of a procedure, mitigating the cognitive load and increasing learning. From the product side, keeping it simple helps in learning the curriculum's real needs without exhausting all resources.

When communicating about simulators, be aware of the pitfalls that follow from the choice of words. Words do not convey the same meaning to different professions (e.g., validation, virtual reality, fidelity), to the extent that the usage of some words, like fidelity, should perhaps be abandoned in the context of medical simulators.

Acknowledgments

We would like to acknowledge the referees for the process, and the editors for the invitation.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.



 
  References

1.
Gromski MA, Matthes K. Simulation in advanced endoscopy: State of the art and the next generation. Tech Gastrointest Endosc 2011;13:203-8.  Back to cited text no. 1
    
2.
Kefalides PT, Gress F. Simulator training for endoscopic ultrasound. Gastrointest Endosc Clin N Am 2006;16:543-52.  Back to cited text no. 2
    
3.
Alinier G. A typology of educationally focused medical simulation tools. Med Teach 2007;29:e243-50.  Back to cited text no. 3
    
4.
Hamstra SJ, Brydges R, Hatala R, et al. Reconsidering fidelity in simulation-based training. Acad Med 2014;89:387-92.  Back to cited text no. 4
    
5.
Frederiksen JG, Sørensen SM, Konge L, et al. Cognitive load and performance in immersive virtual reality versus conventional virtual reality simulation training of laparoscopic surgery: A randomized trial. Surg Endosc 2020;34:1244-52.  Back to cited text no. 5
    
6.
Mah E, Yu J, Deck M, et al. Immersive video modeling versus traditional video modeling for teaching central venous catheter insertion to medical residents. Cureus 2021;13:e13661.  Back to cited text no. 6
    
7.
Wood DF. Problem based learning. BMJ 2003;326:328-30.  Back to cited text no. 7
    
8.
Pai DR, Minh CP, Svendsen MB. Process of medical simulator development: An approach based on personal experience. Med Teach 2018;40:690-6.  Back to cited text no. 8
    
9.
Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Med Teach 2013;35:e867-98.  Back to cited text no. 9
    
10.
Koenderink JJ. World, environment, Umwelt, and innerworld: a biological perspective on visual awareness, Proc. SPIE 8651, Human Vision and Electronic Imaging XVIII, 865103. 2013. https://doi.org/10.1117/12.2011874.  Back to cited text no. 10
    
11.
Tønnessen M. Umwelt Trajectories. Semiotica 2014;134:695-9.  Back to cited text no. 11
    
12.
Issenberg SB, McGaghie WC, Petrusa ER, et al. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach 2005;27:10-28.  Back to cited text no. 12
    
13.
Andersen SA, Mikkelsen PT, Konge L, et al. Cognitive load in mastoidectomy skills training: Virtual reality simulation and traditional dissection compared. J Surg Educ 2016;73:45-50.  Back to cited text no. 13
    
14.
14. Frithioff A, Frendø M, Mikkelsen PT, et al. Ultra-high-fidelity virtual reality mastoidectomy simulation training: A randomized, controlled trial. Eur Arch Otorhinolaryngol 2020;277:1335-41.
15. Andersen SA, Caye-Thomasen P, Sørensen MS. Novices perform better in virtual reality simulation than in traditional cadaveric dissection training of mastoidectomy. J Surg Simul 2015;2:68-75. [doi: 10.1102/2051-7726.2015.0014].
16. Franzeck FM, Rosenthal R, Muller MK, et al. Prospective randomized controlled trial of simulator-based versus traditional in-surgery laparoscopic camera navigation training. Surg Endosc 2012;26:235-41.
17. Nilsson C, Sorensen JL, Konge L, et al. Simulation-based camera navigation training in laparoscopy – A randomized trial. Surg Endosc 2017;31:2131-9.
18. Diesen DL, Erhunmwunsee L, Bennett KM, et al. Effectiveness of laparoscopic computer simulator versus usage of box trainer for endoscopic surgery training of novices. J Surg Educ 2011;68:282-9.
19. Beyer-Berjot L, Aggarwal R. Toward technology-supported surgical training: The potential of virtual simulators in laparoscopic surgery. Scand J Surg 2013;102:221-6.
20. Andersen SA, Mikkelsen PT, Sørensen MS. The effect of simulator-integrated tutoring for guidance in virtual reality simulation training. Simul Healthc 2020;15:147-53.
21. Andersen SA, Frendø M, Sørensen MS. Effects on cognitive load of tutoring in virtual reality simulation training. MedEdPublish 2020;9:51. [doi: 10.15694/mep.2020.000051.1].
22. Rölfing JD, Nørskov JK, Paltved C, et al. Failure affects subjective estimates of cognitive load through a negative carry-over effect in virtual reality simulation of hip fracture surgery. Adv Simul (Lond) 2019;4:26.
23. Jensen K, Bjerrum F, Hansen HJ, et al. A new possibility in thoracoscopic virtual reality simulation training: Development and testing of a novel virtual reality simulator for video-assisted thoracoscopic surgery lobectomy. Interact Cardiovasc Thorac Surg 2015;21:420-6.
24. McConnell RA, Kim S, Ahmad NA, et al. Poor discriminatory function for endoscopic skills on a computer-based simulator. Gastrointest Endosc 2012;76:993-1002.
25. Arkes HR, Ayton P. The sunk cost and Concorde effects: Are humans less rational than lower animals? Psychol Bull 1999;125:591-600. [doi: 10.1037/0033-2909.125.5.591].
26. Arkes HR, Blumer C. The psychology of sunk cost. Organ Behav Hum Decis Process 1985;35:124-40.
27. Gromski MA, Ahn W, Matthes K, De S. Pre-clinical training for new NOTES procedures: From ex-vivo models to virtual reality simulators. Gastrointest Endosc Clin N Am 2016;26:401-12.
28. Morikawa T, Yamashita M, Odaka M, et al. A step-by-step development of real-size chest model for simulation of thoracoscopic surgery. Interact Cardiovasc Thorac Surg 2017;25:173-6.
29. Cook DA, Hatala R. Validation of educational assessments: A primer for simulation and beyond. Adv Simul (Lond) 2016;1:31.
30. ISO/IEC/IEEE International Standard – Systems and software engineering – Vocabulary, in ISO/IEC/IEEE 24765:2017(E). 2017:1-541. [doi: 10.1109/IEEESTD.2017.8016712].
31. Privitera MB, Design M, Murray DL. Applied ergonomics: Determining user needs in medical device design. Annu Int Conf IEEE Eng Med Biol Soc 2009;2009:5606-8.
32. Lee Y, Cho S, Choi J. Determining user needs through abnormality detection and heterogeneous embedding of usage sequence. Electron Commer Res 2019;21:245-61. [doi: 10.1007/s10660-019-09347-6].
33. Decker R, Trusov M. Estimating aggregate consumer preferences from online product reviews. Int J Res Mark 2010;27:293-307.
34. Aughenbaugh JM, Paredis CJ. The Role and Limitations of Modeling and Simulation in Systems Design. Anaheim, California, USA: Computers and Information in Engineering; 2004. p. 13-22. [doi: 10.1115/IMECE2004-59813].
35. Forsberg K, Mooz H. The relationship of systems engineering to the project cycle. Eng Manag J 1992;436-43.
36. Hoffman BJ, Hawes RH. Endoscopic ultrasound and clinical competence. Gastrointest Endosc Clin N Am 1995;5:879-84.
37. Karstensen J, Nayahangan LJ, Delphi Panel EU, et al. A core curriculum for basic EUS skills – An international consensus using the Delphi methodology. EUS J 2022;11:122-32.
38. Han C, Nie C, Shen X, et al. Exploration of an effective training system for the diagnosis of pancreatobiliary diseases with EUS: A prospective study. Endosc Ultrasound 2020;9:308-18.
39. Gurudu SR, Ramirez FC. Quality metrics in endoscopy. Gastroenterol Hepatol (N Y) 2013;9:228-33.
40. Preisler L, Svendsen MB, Nerup N, et al. Simulation-based training for colonoscopy: Establishing criteria for competency. Medicine (Baltimore) 2015;94:e440.
41. Qiao W, Bai Y, Lv R, et al. The effect of virtual endoscopy simulator training on novices: A systematic review. PLoS One 2014;9:e89224.
42. Triantafyllou K, Lazaridis LD, Dimitriadis GD. Virtual reality simulators for gastrointestinal endoscopy training. World J Gastrointest Endosc 2014;6:6-12.
43. Gao J, Fang J, Jin Z, et al. Use of simulator for EUS training in the diagnosis of pancreatobiliary diseases. Endosc Ultrasound 2019;8:25-30.
44. Fernandez-Sordo JO, Madrigal-Hoyos E, Waxman I. The role of live animal models for teaching endoscopy. Tech Gastrointest Endosc 2011;13:113-8.
45. Fritscher-Ravens A, Cuming T, Dhar S, et al. Endoscopic ultrasound-guided fine needle aspiration training: Evaluation of a new porcine lymphadenopathy model for in vivo hands-on teaching and training, and review of the literature. Endoscopy 2013;45:114-20.
46. Li J, Yao J, Li S, et al. Validation of a novel swine model for training in EUS-FNA (with videos). Endosc Ultrasound 2020;9:232-7.
47. Sankaranarayanan G, Matthes K, Nemani A, et al. Needs analysis for developing a virtual-reality NOTES simulator. Surg Endosc 2013;27:1607-16.
48. Fenwick N, Griffin G, Gauthier C. The welfare of animals used in science: How the “Three Rs” ethic guides improvements. Can Vet J 2009;50:523-30.
49. Russell WM. The development of the three Rs concept. Altern Lab Anim 1995;23:298-304.
    


Figures: [Figure 1], [Figure 2], [Figure 3], [Figure 4]
Tables: [Table 1]



 
