
What’s the big deal about full physics simulation?

July 3, 2012
Anthony G Gallagher, Ph.D.

Mentice welcomes Anthony G Gallagher, Ph.D., one of the leading exponents and international experts on the design, application and validation of VR in medicine. He currently works in the School of Medicine, University College Cork (UCC), and this week he is our guest blogger.


Link to a recent review article in the European Heart Journal by Anthony G Gallagher.


Göran Malmberg, CEO/President

The last 20 years have seen enormous changes in the way doctors, particularly surgeons, interventional cardiologists and radiologists, are trained. Traditionally, after successful completion of medical school, doctors acquired a considerable portion of their procedural skills through long hours in clinical service under the guidance of a more experienced mentor. This apprenticeship-style learning was based on an approach developed at the Johns Hopkins Hospital by Dr. William Stewart Halsted (Professor of Surgery) at the end of the 19th and beginning of the 20th century [1,2]. Changing work practices [3], high-profile cases of medical error [4-7] and reduced work hours [8,9] forced medicine to consider new ways of preparing doctors for the challenges of a career in healthcare. The consensus view was that virtual reality (VR) simulation training represented a viable solution. Although VR simulation training was first proposed for training laparoscopic surgeons in the early 1990s by Dr. Richard M Satava [10], quantitative evidence relating to its viability took a decade to accumulate.

It is now a decade since:

  • a closed-door meeting hosted by Dr. Gerry Healy and the American College of Surgeons (ACS), at which the decision was taken for the ACS to ‘champion surgical simulation’ [11]
  • the presentation of the first prospective, randomized, double-blinded trial of virtual reality training for the operating room to the American Surgical Association at The Homestead, Hot Springs, VA
  • the publication of the results of that study in Annals of Surgery [12]

In many respects a lot has changed since 2002. True to its word, the ACS has championed simulation-based training, which has led to the establishment of Accredited Education Institutes (within the USA and globally) in which simulation is an important pedagogical component. The FDA in the US and the Department of Health in the UK have also recognised the potential of simulation technology as a powerful adjunct to patient safety efforts [13,14]. Thus, everything seems to be moving in the right direction. So what’s all the fuss about medical simulation and full physics?

In 1993 Satava wrote: “The virtual-reality surgical simulator signals the beginning of an era of computer simulation for surgery. The surgical resident of the future will learn new perspectives on surgical anatomy and repeatedly practice surgical procedures until they are perfect before performing surgery on patients” [10]. He also suggested that two generic requirements for any surgical simulator are that it must have accurate detail and be highly interactive. Specifically, the image must be anatomically precise and the organs must have natural properties such as the ability to change shape with pressure and to behave appropriately in gravity. These aspirations and expectations are still valid today, but they also need to be expanded on in light of what we now know about the application of simulation for effective and efficient training of procedural skills in medicine.

The Halstedian approach developed at Johns Hopkins, of one-on-one learning by ‘repeated practice’ under the supervision of a master surgeon, was challenged by a sea change in duty hours and by new minimally invasive approaches to interventional procedures, which did not fit well with this way of learning. Traditionally, procedural skills in medicine had been acquired and honed through long hours in the clinical care of patients and the sheer volume of operative procedures performed during a lengthy apprenticeship. On closer scrutiny, it became apparent that the assumption of a ‘sheer volume’ and variety of operative procedures was no longer valid. When Dr. Richard Bell and colleagues evaluated the number and breadth of surgical procedures performed in US surgical residency training programs, they found that surgical residents received considerably less operative experience than previously thought [15]. These findings reinforced the growing conviction that the traditional approach to training was, on its own, inadequate.

Simulation-based training has always offered a viable skill acquisition tool, and the sophistication of trainers using simulation as a training tool continues to mature. Some users believe that it is sufficient to purchase or acquire a simulator for the skill acquisition process to commence. Others believe that the skill acquisition process only truly commences when the simulator is housed in a bespoke skills laboratory. Still others have the unreasonable expectation that the simulator must actually be unboxed and used for training sessions. The true zealots actually argue that simulation-based training should only be a constituent component of a broader curriculum.

The reality of simulation use by the vast majority of users lies somewhere along the caricature continuum described above. Simulation is a very powerful tool for training, but its benefits can only be accrued from appropriate use within a well thought out curriculum. In 1993 Satava construed simulation as a tool for repeated practice until skills had been perfected before applying them on patients. In 2012 this account has advanced considerably on the basis of empirical research. Repeated practice on a simulator will undoubtedly improve the efficiency and flow of procedural skills. The main problem with this approach is that trainees left to their own devices can learn bad habits really well, which, if enacted during the performance of the same procedure on a patient, could compromise patient safety. In the traditional apprenticeship approach to learning procedural skills, the master surgeon would have unambiguously instructed the trainee about appropriate and inappropriate performance. Thus repeated practice alone is an insufficient attribute of a simulation model for optimal training to occur. These types of simulations can most certainly function as a demonstration; however, they have limited training value.

Ericsson and colleagues [16] have reported on the importance of performance feedback during the lengthy training of musicians. They developed the concept of ‘deliberate practice’ and argued that expert performance is the result of individuals’ prolonged efforts to improve performance while negotiating motivational and external constraints. Attained levels of expert performance, even among elite performers (e.g., musicians, golfers), were found to be closely related to assessed amounts of deliberate practice, a regimen of effortful activities designed to optimize improvement, frequently sustained for a minimum of 10 years. Medicine latched on to the idea of deliberate practice, but sometimes viewed it through the optic of repeated practice, arguing that this approach was being compromised by reduced work hours [17].

When I first heard of deliberate practice I had the mental image of a very serious-faced trainee starting to practice their procedural skills on a simulator. It was only much later that I actually understood why deliberate practice was such an effective approach to training. The extended, structured and motivated practice by the trainees that Ericsson, Krampe and Tesch-Römer described was no doubt important, but one crucial aspect of this approach to training was missing from that picture, i.e., feedback, particularly immediate feedback. In addition to engaging in deliberate practice, the trainee musicians studied by Ericsson and colleagues also had access to immediate feedback from their tutor. They would have been informed by their tutor (or recognised themselves) when they played a wrong note. Engaging in frequent practice sessions meant that the trainee had ample opportunity to practice and rehearse their playing, with equivalent opportunities to correct and improve performance. Implicit in this process and the presumed performance improvement (i.e., learning) is the concept of performance feedback. This realization led me to a considerably better understanding of the importance of simulation fidelity.

Satava (1993) alluded to the idea of simulation fidelity when he proposed that a simulation should accurately depict the anatomy of the procedure that is to be learned. This can, however, be taken to mean simply how a simulation looks and appears to behave in response to user interaction. When combined with a process of performance feedback, simulation appearance and behaviour become integral but insufficient simulation characteristics for effective and efficient learning. To afford trainees performance feedback, the simulation should be able to model the procedure, allowing the trainee to perform it with the same devices, in the same order and in the same way as they would be used during the procedure on a real patient. However, the simulation also needs to facilitate the timely reporting to the trainee of aspects of their procedural performance that deviate from optimal performance. This means that replicating what the procedure and interventional devices look like will be insufficient for optimal training. The simulation must be capable of simulating how the anatomy and devices behave when they interact. This device-tissue behaviour must also replicate the way they would behave inside a real patient, in real time. It is the capacity to reliably and validly duplicate device-tissue interaction in a real patient that allows procedural experts to define (not merely describe) optimal and sub-optimal performance.

Metrics are simply quantitative summaries of procedural performance and of deviations from optimal performance, operationalised within a simulation model [18]. The fidelity of the simulation model, which is determined by the underlying physics engine, dictates the degree of performance feedback that can be given. Summative feedback at the end of a training session, on whether the right device was used in the correct sequence and how long it took to perform the procedure, will certainly facilitate learning. More detailed performance feedback proximate to actual performance will accelerate learning further. For example, choosing the correct catheter and wire is important, but so also is what is done with them. Advancing the catheter without a wire in front of it at a particular phase of the procedure is not something that can be accurately captured (for learning purposes) in a report delivered after the procedure has been completed. A better approach is to inform the trainee of this deviation from optimal performance while they are doing it, i.e., proximate to performance. This is a far superior approach to training, which inevitably leads to more effective and efficient learning.
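To make the idea of a proximate-feedback metric concrete, here is a minimal sketch in Python of how such a check might be operationalised. It is purely illustrative: it assumes a hypothetical physics engine that reports catheter-tip and guidewire-tip positions on every simulation frame, and the names used (DeviceState, MetricEngine, on_tick) are my own invention rather than any vendor's actual software.

# Hypothetical sketch of a proximate-feedback metric, not any vendor's actual API.
# On every simulation tick it checks whether the catheter tip has been advanced
# beyond the guidewire tip and, if so, flags the deviation immediately rather than
# only recording it for an end-of-session report.

from dataclasses import dataclass, field
from typing import List


@dataclass
class DeviceState:
    """Positions (in mm along the vessel centreline) reported by the physics engine."""
    catheter_tip: float
    guidewire_tip: float
    timestamp: float  # seconds since the start of the training session


@dataclass
class MetricEngine:
    deviations: List[str] = field(default_factory=list)

    def on_tick(self, state: DeviceState) -> None:
        """Called once per simulation frame with the latest device positions."""
        if state.catheter_tip > state.guidewire_tip:
            message = (
                f"t={state.timestamp:.1f}s: catheter advanced "
                f"{state.catheter_tip - state.guidewire_tip:.1f} mm beyond the guidewire"
            )
            self.deviations.append(message)        # kept for the summative report
            self.give_proximate_feedback(message)  # delivered while the error is happening

    def give_proximate_feedback(self, message: str) -> None:
        # In a real simulator this might be an on-screen prompt or a haptic cue;
        # here it is simply printed.
        print("DEVIATION:", message)

    def summative_report(self) -> str:
        """End-of-session summary of all recorded deviations."""
        return "\n".join(self.deviations) or "No deviations recorded."


# Example: two frames, the second of which triggers immediate feedback.
engine = MetricEngine()
engine.on_tick(DeviceState(catheter_tip=120.0, guidewire_tip=135.0, timestamp=41.2))
engine.on_tick(DeviceState(catheter_tip=142.0, guidewire_tip=135.0, timestamp=42.9))
print(engine.summative_report())

The same measurement feeds both the end-of-session report and the immediate prompt; it is the prompt delivered while the error is being made that does the real training work.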

The pre-requisites for this approach to simulation training are the identification and definition of optimal and sub-optimal performance by procedural experts, and the capability of the simulation to model the procedure and operator-performance interaction in real time. The authenticity, realism, accuracy, validity and reliability (or fidelity) of the simulation model are fundamental benchmarks against which a simulation should be evaluated. But so also is the capability of the simulation to model, detect and report (if called upon) detailed, second-by-second intra-operative deviations from optimal performance during training sessions, so that immediate feedback can be given to the trainee. This type of immediate-feedback-based training is a more effective and efficient approach to simulation-based training. Physical simulation models have (in theory) the same capability, e.g., training in an animal lab. However, the capacity to implement, detect and give feedback on subtle but important deviations from optimal performance relies on supervisor vigilance, which may vary considerably, as does the reliability with which these deviations are detected. Computers are much more dependable administrators of these types of tasks. The ‘sensitivity’ of the simulation model for the detection of trainee deviations from optimal performance therefore has important and very real learning implications. To correct these deviations, trainees must first be made aware of them, preferably as soon as they have been enacted. The nice thing about simulation is that it affords the trainee the opportunity to modify, rehearse and practice their procedural performance in a way they could not if they were operating on a patient.

The revolution in training procedural skills is not so much the virtual reality simulation technology itself but the opportunities it now affords to doctors. Procedural skills which have traditionally been learned in the operating room and catheterization laboratory can now be learned in the skills laboratory. They can be practiced repeatedly until a skills benchmark has been reached before ever operating on a patient. This means that trainees are much better prepared to perform procedures on real patients, which in turn means more effective, efficient and safer operations, irrespective of the experience of the trainee.

References

1. Cameron JL. William Stewart Halsted: our surgical heritage. Annals of Surgery 1997;225(5):445-458.

2. Halsted WS. The training of the surgeon. Bulletin of Johns Hopkins Hospital 1904;xv:267-275.

3. Beall DP. The ACGME Institutional Requirements: What Residents Need to Know. Journal of the American Medical Association 1999;281(24):2352.

4. Bristol Royal Infirmary Inquiry. Learning from Bristol: the report of the public inquiry into children’s heart surgery at the Bristol Royal Infirmary 1984–1995. London: Stationery Office 2001.

5. Kohn LT, Corrigan JM, Donaldson MS. To err is human: building a safer health system. Washington DC 2000:196–7.

6. Van Der Weyden MB. The Bundaberg Hospital scandal: the need for reform in Queensland and beyond. The Medical Journal of Australia 2005;183(6):284-285.

7. Harding Clark M. The Lourdes Hospital Inquiry–An Inquiry into peripartum hysterectomy at Our Lady of Lourdes Hospital, Drogheda. Report of Judge Maureen Harding Clark SC. The Stationery Office, Dublin 2006.

8. Wallack MK, Chao L. Resident work hours: the evolution of a revolution. Archives of Surgery 2001;136(12):1426-1431.

9. MacDonald R. Implementing the European working time directive. BMJ Career Focus 2003;327(7406).

10. Satava RM. Virtual reality surgical simulator. The first steps. Surgical Endoscopy 1993;7(3):203-5.

11. Healy GB. The college should be instrumental in adapting simulators to education. Bulletin of the American College of Surgeons 2002;87(11):10-11.

12. Seymour NE, Gallagher AG, Roman SA, O’Brien MK, Bansal VK, Andersen DK, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Annals of Surgery 2002;236(4):458-63; discussion 463-4.

13. FDA. Draft Guidance for Industry and Food and Drug Administration Staff; Applying Human Factors and Usability Engineering to Optimize Medical Device Design. In: Center for Devices and Radiological Health, editor. Silver Spring, MD: U.S. Department of Health and Human Services, 2011.

14. Department of Health. A Framework for Technology Enhanced Learning. In: Workforce, editor. London: Crown, 2011.

15. Bell Jr RH, Biester TW, Tabuenca A, Rhodes RS, Cofer JB, Britt LD, et al. Operative experience of residents in US general surgery programs: a gap between expectation and experience. Annals of Surgery 2009;249(5):719-724.

16. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychological Review 1993;100(3):363-406.

17. Purcell Jackson G, Tarpley JL. How long does it take to train a surgeon? British Medical Journal 2009;339:b4260.

18. Gallagher AG, O’Sullivan GC. Fundamentals of Surgical Simulation: Principles and Practice. London: Springer Verlag, 2011.
