Assessment of the accuracy of the Medical Response to Major Incidents (MRMI) course for interactive training of the response to major incidents and disasters
Keywords: curriculum evaluation, educational validation, simulation, training, major incident, disaster medicine, mass-casualty training
Background and aims: The benefit of simulation models for interactive training of the response to major incidents and disasters has been increasingly recognized in recent years, and a variety of such models have been reported. However, reviews of this literature show that the majority of these reports have significant limitations regarding validation of the accuracy of the training against its stated objectives. In this study, precourse and postcourse self-assessment surveys related to the specific training objectives, an established method for curriculum validation, were used to validate the accuracy of a course in Medical Response to Major Incidents (MRMI) developed and organized by an international group of experts under the auspices of the European Society for Trauma and Emergency Surgery.
Methods: The studied course was an interactive course in which all trainees acted in their normal roles during two full-day simulation exercises run in real time, with simultaneous training of the whole chain of response: scene, transport, the different functions in the hospital, communication, coordination, and command. The key component of the system was a bank of magnetized casualty cards giving all the information normally available as a basis for decisions on triage and primary management. All treatments were indicated with attachments on the cards and consumed time and resources as in reality. The trainees' performance was recorded according to prepared protocols, and a measurable result of the response could be registered. This study was based on five MRMI courses in four different countries, with altogether 235 participants from 23 different countries. In addition to conventional course evaluations and recording of performance during the two exercise days, the trainees' perceived competencies related to the specific objectives of the training for the different categories of staff were registered on a continuous scale of 1-10 in self-assessment protocols immediately before and after the course. The results were compared as an indicator of the extent to which the training fulfilled the given objectives. These objectives were set by an experienced international faculty and based on experiences from recent major incidents and disasters.
Results: Comparison of precourse and postcourse self-assessments of the trainees' perceived knowledge and skills related to the given objectives for the training showed a significant increase in all registered parameters for all categories of participating staff. The average increase was 74 percent for prehospital staff (p < 0.001), 65 percent for hospital staff (p < 0.001), and 81 percent for staff in coordinating/administrative functions (p < 0.001).
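The comparison above rests on the average percent increase in mean self-assessment scores and a paired significance test on the pre/post differences. As a minimal sketch of that arithmetic, assuming entirely hypothetical scores on the 1-10 self-assessment scale (not data from the study, whose actual test procedure is not specified in this abstract):

```python
from statistics import mean, stdev
from math import sqrt

def percent_increase(pre, post):
    """Average percent increase from precourse to postcourse mean score."""
    return 100 * (mean(post) - mean(pre)) / mean(pre)

def paired_t(pre, post):
    """Paired-samples t statistic for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical self-assessment scores for one staff category (illustration only)
pre = [3.0, 4.5, 2.5, 3.5, 4.0, 3.0, 2.0, 3.5]
post = [6.5, 7.5, 5.0, 6.0, 7.5, 5.5, 4.5, 6.0]

print(f"average increase: {percent_increase(pre, post):.1f}%")
print(f"paired t statistic: {paired_t(pre, post):.2f}")
```

The t statistic would then be compared against the t distribution with n - 1 degrees of freedom to obtain a p-value; in practice this is done with a statistics package rather than by hand.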
Conclusions: The significant differences in the trainees' self-assessed perceived competencies between the precourse and postcourse surveys indicate that the methodology of the studied course model accurately addressed the specific objectives for the different categories of staff.