Commentary | Medical Education

Medical Education Research As Translational Science


Science Translational Medicine  17 Feb 2010:
Vol. 2, Issue 19, pp. 19cm8
DOI: 10.1126/scitranslmed.3000679


Research on medical education is translational science when rigorous studies on trainee clinical skill and knowledge acquisition address key health care problems and measure outcomes in controlled laboratory settings (T1 translational research); when these outcomes transfer to clinics, wards, and offices where better health care is delivered (T2); and when patient or public health improves as a result of educational practices (T3). This Commentary covers features of medical education interventions and environments that contribute to translational outcomes, reviews selected research studies that advance translational science in medical education at all three levels, and presents pathways to improve medical education translational science.


Medical education at all levels aims to equip physicians with the knowledge, skills, and attributes of professionalism needed to deliver quality patient care. Research on medical education aims to make the enterprise more effective, efficient, and economical. Downstream goals of medical education research are to demonstrate that educational interventions contribute to physician competence measured in the classroom, educational laboratory, and patient care setting. Improved patient outcomes linked directly to educational events are the ultimate goal of medical education research.

Translational science progresses from bench to bedside in at least three seamless phases. T1 science seeks to move basic laboratory discoveries in the biomedical sciences to clinical research. T2 science aims to produce evidence of clinical effectiveness at the level of the patient; compare the success of different treatments to identify “the right treatment for the right patient in the right way at the right time”; and translate these results into practice guidelines for patients, clinicians, and policy-makers (1). T3 science addresses health care delivery, community engagement, and preventive services that yield measurable improvements in the health of individuals and society (1).

A growing body of medical education research expands the definition of T1 to T3 translational science. These investigations address (i) skill and knowledge acquisition and maintenance by individual physicians and health care teams, chiefly in simulation laboratories used for postgraduate training and in continuing medical education (CME) programs; (ii) the application and transfer of acquired skill and knowledge to patient care; and (iii) improved patient outcomes that occur as a result of educational practices (Table 1). This Commentary describes educational interventions that contribute to translational science, presents a selected set of T1 to T3 medical education research reports, and concludes with suggestions about ways to increase the quality of medical education translational science.

Table 1. Contributions of medical education interventions to T1, T2, and T3 outcomes.




Medical education interventions that contribute to T1 to T3 clinical translational science have several key features. (i) These interventions focus on learning objectives grounded in behavioral descriptions of expert clinical performance derived from experience or laboratory research (2). (ii) They also use live or multiple media and multiple educational techniques and exposures (3), combined with opportunities for intense, deliberate practice of skills and knowledge to promote their acquisition (4). Deliberate practice derives from information processing and behavioral theories of skill acquisition and maintenance, is very demanding of learners, relies on rigorous measurements that permit performance feedback from several sources, and aims for constant improvement (4). (iii) The educational interventions use a medical simulation center or laboratory to allow deliberate practice with feedback in a controlled, forgiving environment using devices that vary in fidelity depending on target learning objectives (5, 6). (iv) Finally, they use the mastery learning model, in which all learners are held to high and uniform competence standards even though the amount of deliberate practice time needed to reach the standards may vary (7–9).

Features of the educational, professional, and clinical context of these interventions are also essential for training effectiveness. Active components of effective training include institution-level incentives for training and a culture that promotes safety; relevant, “in-house” training; nonthreatening assessment and training for the entire workforce; self-directed infrastructural changes, which seek local solutions to national problems; realistic training tools that are high fidelity rather than high tech; and multiperson professional teams to provide diverse points of view and experience in solving problems as a group (10).


Medical education research at the T1 level—the most common form of such research—involves the design and delivery of training protocols and measurement of educational progress to study outcomes in controlled laboratory settings (11–13). An example is demonstration research by Murray and colleagues, in which the clinical performance of anesthesiologists in training and practice was evaluated rigorously in a series of simulated intraoperative procedures. Inexperienced anesthesiology residents performed significantly worse than more advanced residents and board-certified anesthesiologists. The research also revealed wide variation in clinical performance scores obtained by individuals in each group. The investigators concluded that the simulation-based assessment was a legitimate method of differentiating the skills of more experienced anesthesia residents and anesthesiologists from those of residents with less training (14).

Medical education research at the T2 level stretches the endpoint beyond the T1 laboratory setting. A sustained T1 to T2 research program implemented by Wayne and colleagues, aimed at strengthening internal medicine residents’ skills in advanced cardiac life support (ACLS), provides an example. In this program, second-year residents undergo rigorous simulation-based education (SBE) (Fig. 1) designed to enhance their recognition and response to six disorders seen in the ACLS setting: (i) asystole, the lack of cardiac electrical activity; (ii) ventricular fibrillation, the uncoordinated contraction of the ventricles; (iii) supraventricular tachycardia, a fast heart rhythm originating somewhere other than the ventricles; (iv) ventricular tachycardia, a fast heart rhythm originating in one of the ventricles; (v) symptomatic bradycardia, a very slow heart rate; and (vi) pulseless cardiac electrical activity, heart rhythm that does not produce meaningful contractions. Their study consisted of a randomized crossover trial with a wait-list control condition, so that some of the 38 residents participating did not receive the specialized training at first (and served as the control group), but later (at crossover) did receive training. A standardized 8-hour ACLS curriculum featuring deliberate practice with feedback produced a 38% increase in measured clinical skill, replicated at crossover (6), a T1 outcome. Clinical experience alone, however, had no effect on skill improvement (6). A subsequent ACLS mastery learning study with a fresh group of 41 residents yielded a 24% skill improvement over baseline. Mastery outcomes—in which residents met or surpassed high ACLS competency standards for all six scenarios (7)—were achieved by 80% of the residents in the standard 8-hour curriculum; 20% needed more time (1 or 2 hours). These results represent another T1 outcome. 
However, a later case-control study compared cardiac arrest responses to 48 actual in-hospital “codes” (that is, emergencies involving cardiac arrest) by resident teams that had or had not received prior ACLS SBE. Code responses by resident teams that received SBE were significantly more adherent to American Heart Association quality indicators than were responses by traditionally trained residents (odds ratio = 7.1; 95% confidence interval, 1.8 to 28.6; controlling for patient age, ventilator, and telemetry status, indicating that the patient was being monitored remotely for abnormal cardiac activity) (15). In this research program, early ACLS skill-acquisition training outcomes (T1) transferred to improved patient care practices (T2).
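The adjusted odds ratio above comes from a regression model controlling for patient covariates, but the underlying arithmetic of an unadjusted odds ratio and its Wald 95% confidence interval can be illustrated with a 2×2 table. The counts below are hypothetical, chosen only to demonstrate the calculation; the study's raw counts are not reproduced here.

```python
import math

# Hypothetical 2x2 table (NOT the study's actual counts):
# rows = training condition, columns = adherent vs. non-adherent code responses
sbe_adherent, sbe_nonadherent = 20, 4        # simulation-trained resident teams
trad_adherent, trad_nonadherent = 10, 14     # traditionally trained teams

# Unadjusted odds ratio: odds of guideline adherence with SBE vs. without
odds_ratio = (sbe_adherent / sbe_nonadherent) / (trad_adherent / trad_nonadherent)

# Wald 95% confidence interval, computed on the log-odds scale
se_log_or = math.sqrt(1 / sbe_adherent + 1 / sbe_nonadherent +
                      1 / trad_adherent + 1 / trad_nonadherent)
low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR = {odds_ratio:.1f}, 95% CI {low:.1f} to {high:.1f}")
```

The wide confidence interval reported in the study (1.8 to 28.6) reflects the small number of observed codes; small cell counts inflate the standard error of the log odds ratio.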

Fig. 1. Simulation-based education.

Internal medicine residents improve their skills in advanced cardiac life support.


Edelson and colleagues evaluated the outcomes of a cardiopulmonary resuscitation (CPR) education intervention targeting in-hospital performance by internal medicine residents. The training protocol featured resuscitation with actual performance–integrated debriefing (known as the RAPID protocol), enhanced by objective data from a CPR-sensing and feedback-enabled defibrillator. The in-hospital CPR performance of residents who received the intervention was found to be better on a variety of clinically meaningful measures (such as the return of spontaneous circulation) than the performance of a historical resident cohort (16). This study also illustrates T2 medical education research, because the measurement endpoint extends beyond trainee skill acquisition to its clinical application.

A UK research group led by J. F. Crofts trains and evaluates obstetricians and midwives in the management of complicated deliveries involving shoulder dystocia, during which delivery of an infant’s shoulder is obstructed by the mother’s pelvis. In an initial CME multisite randomized trial involving 45 doctors and 95 midwives that compared low- and high-fidelity mannequins for management of shoulder dystocia, Crofts et al. found that both training devices improved pre- to post-test performance in the simulation laboratory, as measured by the use of basic maneuvers, successful deliveries, and good patient communication (17), a T1 outcome. Trainees who used the high-fidelity mannequin also achieved a higher successful simulation laboratory (T1) delivery rate than those who used the low-fidelity device. The authors concluded that SBE on handling shoulder dystocia is effective, and medical education research results can be used to inform health policy (17).

A subsequent retrospective, observational study at one of the UK hospitals that participated in the previous study (17) compared the management and neonatal outcome of births complicated by shoulder dystocia before and after the introduction of SBE (18). Outcomes were evaluated using pre- and post-SBE intrapartum and postpartum birth records in which difficulty with the infant’s shoulders was recorded. Although shoulder dystocia rates in the two periods were similar, post-training clinical management showed statistically significant and clinically meaningful improvement on six measured clinical performance variables: (i) the use of McRoberts’ position, in which the mother’s legs are pressed to her abdomen to widen the pelvis; (ii) the application of pressure to the lower abdomen (suprapubic pressure); (iii) the use of an internal rotational maneuver; (iv) delivery of the posterior arm; (v) no recognized maneuvers performed; and (vi) documented excessive traction on the infant’s head (T2 outcomes). The authors also reported that the fraction of babies born with an obstetric brachial palsy injury, in which the baby’s arm is paralyzed, was significantly reduced after training (18). This follow-up observational study demonstrates T3 medical education research, because better patient outcomes stemmed directly from the T2 results of the prior educational intervention.

A companion report from the UK obstetrical research group presents the results of a simulation-based obstetric emergency team-training study on the incidence of low Apgar scores (≤6 on a scale of 0 to 10) 5 min after birth and of neonatal hypoxic-ischemic encephalopathy (HIE) (19), a brain injury caused by lack of oxygen. The investigators note that the risk for cerebral palsy and cognitive disability is high after moderate or severe HIE occurs. Training involved a mandatory annual 1-day course for midwifery and obstetric medical staff. Training objectives and methods for participants included the following: (i) improved interpretation of cardiotocographic tracings that show the fetal heartbeat and maternal uterine contractions (electronic fetal monitoring), (ii) participation in simulated obstetric emergency scenarios, and (iii) practice in documenting clinical procedures. Detailed debriefing and feedback followed the simulated obstetric emergencies. Clinical research outcomes with actual patients showed that after the training courses were introduced, the number of infants with 5-min Apgar scores ≤6 decreased from 86.6 to 44.6 per 10,000 births (P < 0.001), and the number of infants with HIE decreased from 27.3 to 13.6 per 10,000 births (P = 0.032) (19). This result is another example of T3 medical education research in which educational interventions result in better health care practices that improve patient outcomes.
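The reported changes can be restated as relative reductions with simple rate arithmetic; a minimal sketch using the published rates (per 10,000 births):

```python
def relative_reduction(rate_before, rate_after):
    """Percent reduction in an event rate (both rates share the same denominator)."""
    return 100 * (rate_before - rate_after) / rate_before

# Published rates per 10,000 births, before vs. after team training (19)
apgar_reduction = relative_reduction(86.6, 44.6)  # 5-min Apgar score <= 6
hie_reduction = relative_reduction(27.3, 13.6)    # hypoxic-ischemic encephalopathy

print(f"Low Apgar: {apgar_reduction:.0f}% reduction")  # ~48%
print(f"HIE: {hie_reduction:.0f}% reduction")          # ~50%
```

In other words, the incidence of both adverse outcomes was roughly halved after the training courses were introduced.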

Barsuk and colleagues introduced a mastery learning SBE program to increase internal medicine and emergency medicine residents’ skills at central venous catheter (CVC) insertion (9). Program outcomes were evaluated in a cohort study that compared internal jugular (IJ) and subclavian (SC) CVC insertion skills of SBE-trained versus traditionally trained residents in a medical intensive care unit (MICU). T2 outcomes showed that residents who received IJ and SC SBE inserted CVCs in the MICU with significantly fewer needle passes, arterial punctures, and line adjustments, and with higher success rates than traditionally trained residents (20). In a before/after observational study in the MICU on the incidence of catheter-related bloodstream infections (CRBSIs) over 32 months, Barsuk et al. reported that significantly fewer CRBSIs occurred after the simulator-trained residents entered the intervention ICU (0.50 infections per 1000 catheter days) as compared to both the same unit before the intervention (3.20 per 1000 catheter days, P = 0.001) and to a comparison ICU in the same hospital throughout the study period (5.03 per 1000 catheter days, P = 0.001) (21). The educational intervention also resulted in significant medical care cost savings, establishing its cost-effectiveness (22). These T3 medical education research results link improved patient outcomes and health care cost savings to the mastery learning educational intervention.
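CRBSI rates of this kind are incidence densities: infection counts divided by total catheter-days of exposure, scaled to 1000. A minimal sketch with hypothetical exposure counts (the paper reports only the resulting rates, not the raw counts used here):

```python
def infections_per_1000_catheter_days(n_infections, catheter_days):
    """Incidence density: events per 1000 device-days of exposure."""
    return 1000 * n_infections / catheter_days

# Hypothetical counts chosen only to reproduce the reported rates (21)
post_intervention = infections_per_1000_catheter_days(1, 2000)  # 0.50 per 1000
pre_intervention = infections_per_1000_catheter_days(8, 2500)   # 3.20 per 1000

print(post_intervention, pre_intervention)
```

Normalizing by catheter-days rather than by patient count lets units with different catheter utilization be compared on a common scale.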


Medical education research is T1 translational science when its results show trainee skill and knowledge improvement in laboratory settings. Such research is T2 translational science when its results yield measurable improvements in clinical skill and knowledge of physicians at all levels, which are transferred and used in patient care settings. T3 medical education research demonstrates measured improvement in the health of individuals and populations as a result of education and training. The quality of translational research in medical education will increase when rigorous educational interventions and outcome measures address key patient care problems; research endpoints are stretched to measure improved patient care practices and improved patient and public health outcomes reliably; and studies employ rigorous, comparative, experimental, and quasi-experimental research designs that have the power to detect the effects of educational intervention.


  • Citation: W. C. McGaghie, Medical education research as translational science. Sci. Transl. Med. 2, 19cm8 (2010).

References and Notes

  1. Supported in part by grant UL1RR025741 from the National Center for Research Resources, National Institutes of Health (NIH). NIH had no role in the preparation, review, or approval of the manuscript. P. Greenland, S. B. Issenberg, and J. Vozenilek provided helpful comments on an earlier draft of the manuscript.