The Educational Benefits of Plastic Surgery Rotations for Off-Service Residents
Authors Diffley M, Hall JMD, Tepper D, Siddiqui A
Received 12 June 2024
Accepted for publication 12 June 2024
Published 22 October 2024, Volume 2024:15, Pages 999–1004
DOI https://doi.org/10.2147/AMEP.S482437
Editor who approved publication: Dr Md Anwarul Azim Majumder
Michael Diffley,1 Jamie MD Hall,2 Donna Tepper,2 Aamir Siddiqui3
1Department of General Surgery, Henry Ford Health, Detroit, MI, USA; 2Division of Plastic Surgery, Henry Ford Health, Detroit, MI, USA; 3Michigan State University, College of Human Medicine, Henry Ford Hospital, Detroit, MI, USA
Correspondence: Aamir Siddiqui, Henry Ford Hospital, K-16, 2799 West Grand Boulevard, Detroit, MI, 48282-2689, USA, Tel +1 313 916-2683, Fax +1 313 916-1155, Email [email protected]
Purpose: With increasing specialization among surgical divisions, a well-rounded education during surgical residency is often accomplished by rotating among different subspecialties. Inclusion of a specific rotation in the resident curriculum can be considered a cost–benefit calculation, balancing the value of exposure to a subspecialty against the opportunity cost of potential learning from another rotation. We find that these decisions are often based on anecdotal feedback. Our goal is to supplement these reports with a quantifiable metric of learning achieved on the plastic surgery rotation. Our hypothesis in this prospective study was that residents would demonstrate improved performance on a post-rotation test after their 1-month rotation on plastic surgery compared to the pre-rotation test.
Methods: A question bank was developed to reflect institutional curriculum objectives and clinical scenarios commonly seen on the service. The questions were developed, validated, and vetted in collaboration with medical educators and attending plastic surgeons, yielding 20 questions available for use. Postgraduate year 1 residents were given a 10-question test before and after their plastic surgery rotation. A one-tailed paired t-test was used to assess improvement from the pre-rotation test to the post-rotation test.
Results: A total of 378 tests were administered; 228 (60%) completed pre- and post-rotation tests met inclusion criteria. The average percentage of correct answers was 29% on the pre-rotation test and 88% on the post-rotation test, a differential improvement of 58% (p < 0.001).
Conclusion: Surgical trainee time is a limited commodity. Each clinical rotation must demonstrate consistent benefit for trainees. We developed a questionnaire that documents the improvement in clinical knowledge after a one-month rotation on plastic surgery. The test results were consistent even when comparing trainees who did the rotation early versus late in the PGY-1 year. Clinical exposure reinforces and solidifies specialty learning.
Keywords: education, plastic surgery, curriculum, off-service rotations, general surgery training
Introduction
As the house of surgery has become increasingly specialized, providing adequate exposure to different subspecialties and pathologies during training continues to be challenging. General surgery residency programs must prepare physicians to encounter a wide spectrum of diseases and conditions. This requires general surgery residents to rotate among the surgical subspecialties. Each rotation carries a cost–benefit calculation: the value of any off-service rotation is weighed against the opportunity cost of what residents are not seeing or learning. The goal is to develop well-trained residents with exposure to all facets of surgery that they may eventually be consulted about, treat directly, or triage for subspecialty care. Direct learning research has shown that experiential learning, feedback, effective relationships with peers, diverse educational methods, and developing intrinsic motivation are the key factors in quality medical education.1–4 Simply reading about a disease process does not provide the same lasting educational benefit as clinical exposure and practical experience for both cognitive and technical skills.
We believe plastic surgery holds a special position in the allocation of trainee assigned time. Much of the soft tissue and chronic wound management principles and practice are only accessible on the plastic surgery services. These skills are important for all surgical trainees to understand and adopt into their own practice. Previous studies have shown that even 2–3 weeks of condition-specific focused plastic surgery education results in significant improvement in plastic surgery technical skills and overall core competency development in surgery residents.5–7 Plastic surgery exposure allows trainees to participate in complex wound debridement, skin lesion and tumor excisions, skin grafting, scar revision, contracture release, and soft tissue coverage in complex trauma patients. Many of these skills would benefit other surgical specialties and emergency department trainees.
At our institution, the plastic surgery service trains general surgery, orthopedics, neurosurgery, otorhinolaryngology, emergency medicine, and podiatry residents. The decision to participate in the service involves a cost–benefit discussion as outlined above. In most cases, these decisions are based on feedback from trainees regarding their anecdotal experience on the service. Our goal is to supplement this with data on what they learned by working on the service. We developed and administered a test of multiple-choice items to provide a quantifiable metric of learning. Multiple-choice items were chosen as our test format to reduce subjectivity in evaluator scoring while maintaining the ability to test higher cognitive level capabilities.8,9 We hypothesized that residents would demonstrate improved performance on a post-rotation test after their 1-month rotation on plastic surgery compared to a pre-rotation test.
Materials and Methods
We developed and validated a bank of single-best-answer multiple choice questions. The questions were developed to encompass both the topics outlined in the curriculum and clinical scenarios commonly seen by house officers on the plastic surgery rotation (Appendices 1 and 2). Special emphasis was placed on clinical scenarios that trainees might see on call and principles that may provide corollaries to the broader practice of surgery. Examples of this include postoperative complications, problem wounds, and trauma.
With input from the graduate medical education department, we reviewed each question for its ability to promote critical thinking, discriminate between knowledge levels, minimize ambiguity, and present subject matter of moderate difficulty.10–12 This step required rewriting many of the questions and responses. Exclusion criteria included item-writing flaws such as repetition of part of the stem, use of qualifiers within an option, a complicated or ambiguous stem, use of double negatives, heterogeneous option lengths, and use of absolute options. We used a modified Bloom’s taxonomy scale to assess the cognitive level of the questions, with level 1 equating to “remembering”, level 2 to “understanding”, and level 3 to “applying and predicting”. Fifty-six questions prepared in this manner went forward to consensus testing, which involved reviewing all questions with a group of board-certified plastic surgeons (N = 8).13 Each question had to achieve 100% consensus as written or with minor wordsmithing. Some required changes to the associated images (x-rays and patient photographs) to achieve consensus. The list was reduced to 35 questions after consensus testing.
Beta testing of these 35 questions involved giving a random set of 10 questions to each resident rotating on the service before the rotation began. This was sent as an online test to be taken prior to starting the service. Any item with an average correct response rate greater than 70% on the pre-rotation test was eliminated, the rationale being that such questions were not specific enough to discriminate between knowledge gained before and after the rotation or were poorly worded. All questions were reviewed in this manner by at least 5 different rotating residents. After completing this first round of pre-rotation assessment, we switched to post-rotation assessment. In a similar manner, residents were sent a request to complete a post-rotation questionnaire 4 weeks after completing the rotation. Any question with an average correct response rate below 30% was eliminated, the rationale being that the question was poorly worded and needed revision. This development phase similarly continued until at least 5 different residents had answered each question.
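As an illustration only, the elimination rules described above can be expressed as a simple filter. The sketch below is ours rather than the authors' actual process (which ran the pre- and post-rotation rounds separately); the function name and data layout are assumptions, and only the 70% and 30% cutoffs and the five-respondent minimum come from the text.

```python
# Hedged sketch of the beta-testing elimination logic described above.
# Rules from the text: drop an item answered correctly by >70% of residents on
# the pre-rotation test, or by <30% on the post-rotation test, once at least
# five different residents have answered it in each round.

def item_survives(pre_rate, post_rate, n_pre, n_post, min_respondents=5):
    """Return True if the item stays in the bank, False if eliminated,
    None if too few residents have answered it to decide."""
    if n_pre < min_respondents or n_post < min_respondents:
        return None                     # keep collecting responses
    if pre_rate > 0.70:
        return False                    # too easy before the rotation; poor discriminator
    if post_rate < 0.30:
        return False                    # still missed after the rotation; likely poorly worded
    return True

# Hypothetical example: 20% correct before, 85% correct after, 6 respondents per round.
print(item_survives(0.20, 0.85, 6, 6))  # True
```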
Following completion of the beta testing, a total of 20 items were available in the question bank. We then established a pilot program for the off-service residents on plastic surgery. The cadre of residents included mostly general surgery but also neurosurgery, dermatology, orthopedics, otorhinolaryngology, emergency medicine, and podiatry residents. All residents were expected to take both the pre-rotation test and the post-rotation test, resulting in approximately 36 participants annually. Participating postgraduate year 1 residents were sent a copy of the curriculum and a link to the pretest (Appendix 2) four weeks prior to the rotation. The post-rotation test was sent 1 month after the rotation was completed.
The testing was performed on a platform used systemwide for employee education, training, and testing. The site administrator was given all 20 items and asked to randomly assign them in groups of 10 to serve as the pre-rotation and post-rotation tests (Appendix 1). Residents could potentially get the same question on the pre- and post-rotation tests. Items were scored dichotomously with equal weight. Pre-rotation test scores and keyed responses were not shared with residents to avoid a potential confounding factor. Tests were excluded from our data set if either test was unfinished, not submitted, or required more than 24 hours to complete. A one-tailed paired t-test, calculated in Microsoft Excel (Microsoft Corporation, Redmond, WA), was then used to assess improvement. We collected data over 6 years. Institutional review board policies were followed.
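The analysis itself was run in Microsoft Excel; purely for illustration, an equivalent dichotomous scoring and one-tailed paired t-test could be sketched in Python as below. The score arrays and resident counts are invented placeholders, not study data, and SciPy is used here in place of Excel's t-test function.

```python
# Illustrative only: equally weighted dichotomous scoring of a 10-item test and
# a one-tailed paired t-test on pre-/post-rotation percent-correct scores.
# The values below are made-up placeholders; the study's analysis used Excel.
import numpy as np
from scipy import stats

def percent_correct(responses, answer_key):
    """Score items dichotomously with equal weight and return percent correct."""
    correct = sum(r == k for r, k in zip(responses, answer_key))
    return 100.0 * correct / len(answer_key)

# Hypothetical 10-item test: 9 of 10 responses match the key.
print(percent_correct(list("ABCDABCDAB"), list("ABCDABCDAA")))  # 90.0

# Hypothetical paired percent-correct scores for five residents.
pre  = np.array([30.0, 20.0, 40.0, 30.0, 20.0])
post = np.array([90.0, 80.0, 90.0, 100.0, 80.0])

# One-tailed paired t-test: does the post-rotation score exceed the pre-rotation score?
t_stat, p_value = stats.ttest_rel(post, pre, alternative="greater")
print(f"mean improvement = {np.mean(post - pre):.1f} points, p = {p_value:.4g}")
```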
Results
A total of 378 tests, or 189 test pairs, were given over a 6-year period to 189 residents. Based on inclusion criteria, 228 completed tests (60% of total participants) were included in our study. The cohort who completed both tests comprised 63 general surgery (54%), 24 emergency medicine, 10 orthopedic, 5 podiatry, 4 neurosurgery, 4 dermatology, and 4 otorhinolaryngology residents. The average modified Bloom’s taxonomy level of the questions was 2.04. The average score on the pre-rotation test was 29.5% correct (standard deviation 11), and the average score on the post-rotation test was 88% correct (standard deviation 9) (Table 1), demonstrating a statistically significant positive score differential of 58% (p < 0.001). There was no difference in test performance between general surgery and non-general surgery trainees. No differences were noted between scores for residents taking the pre-rotation test or post-rotation test during the first quarter versus the final quarter of the academic year.
Table 1 Average Pre-Rotation and Post-Rotation Percent Correct Scores by Question
Discussion
We have shown that rotating on plastic surgery for 1 month as a postgraduate year 1 resident improves understanding of key concepts and relevant clinical scenarios based on a pre-test/post-test model. The test was designed to provide objective information on clinical skills learned on the rotation by comparing scores on a group of standardized questions before and after the rotation. This improvement after real-life exposure to the specialty held for surgical and nonsurgical trainees, as well as for those rotating early or late in the academic year.
In this study, we set out to confirm our hypothesis that clinical exposure to plastic surgery during postgraduate year 1 provides measurable benefits in line with the institutional curriculum. The standardized exam we administered confirms this. The skills we tested align with the trainee goals and objectives. They include understanding chronic and surgical wound assessment and wound care. This is a skill for all physicians and is easily transferable to many clinical settings, including hospital-acquired conditions such as pressure injuries. It is encompassed in the concepts of systems-based practice and patient safety, which are part of the milestone assessments outlined by the Accreditation Council for Graduate Medical Education.14 Other concepts include soft tissue management for trauma patients and in the operating room. These skills also overlap with trainees’ milestones in surgery, podiatry, and dermatology.15 There are questions about the management of fractures in the hand and face, which are particularly relevant for orthopedic and otorhinolaryngology residents. These topics overlap with many disciplines and are necessary for triaging care and disposition of patients presenting for emergent and urgent care.
It is noteworthy that the scheduling of the plastic surgery rotation did not impact score or score differential. Comparing first-quarter and fourth-quarter test takers did not show a statistically significant difference; the impact of the rotation was the same regardless. The fourth-quarter test takers had taken care of plastic surgery patients on cross-coverage nights, interacted with plastic surgery fellows and staff, attended relevant didactics, and presumably studied for the in-service training examination conducted in January of each year. None of these factors appears to have made as significant an impact as the 1-month rotation.
Our study has several limitations. While the 20 items were designed to be representative of the rotation curriculum, having a greater number of questions could have allowed for more detailed assessments of learning and potential areas where we can improve the education provided. Our assessment of resident learning is limited by relatively short-term follow-up, so we cannot comment on long-term retention of knowledge. Only 60% of tests administered during the study interval met inclusion criteria. Although the study design was prospective, there may be unintended selection bias.
This work is important in demonstrating a method for evaluating the education provided to trainees on a surgical rotation. Plastic surgery is a unique field as its scope often overlaps other specialties. By providing trainees with another specialty’s perspective on a condition that also falls in their primary specialty’s domain, we can help establish a broad-based, academically well-rounded management approach and facilitate knowledge sharing among specialties.
Conclusions
We have developed a questionnaire that has a twofold benefit. It allows us to confirm that trainee experience is aligned with our agreed-upon curriculum, and it reinforces to us and the medical education community the importance of real exposure and education in plastic surgery. Further work is needed to extend these principles to other surgical and nonsurgical rotators.
Data Sharing Statement
The datasets generated during and/or analyzed during the current study are available from the corresponding author on reasonable request.
Ethics Approval
This is an observational study. The Henry Ford Hospital Research Ethics Committee has confirmed that no ethical approval is required.
Consent to Participate
Informed consent was obtained from all individual participants included in the study.
Author Contributions
All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.
Funding
No funding was received to assist with the preparation of this manuscript.
Disclosure
The authors have no relevant financial or non-financial interests to declare for this work.
References
1. Bancroft GN, Basu CB, Leong M, Mateo C, Hollier LH, Stal S. Outcome-based residency education: teaching and evaluating the core competencies in plastic surgery. Plast Reconstr Surg. 2008;121(6):441e–448e. doi:10.1097/PRS.0b013e318170a778
2. Liang JC, Chen YY, Hsu HY, Chu TS, Tsai CC. The relationships between the medical learners’ motivations and strategies to learning medicine and learning outcomes. Med Educ Online. 2018;23(1):1497373. doi:10.1080/10872981.2018.1497373
3. Reed S, Shell R, Kassis K, et al. Applying adult learning practices in medical education. Curr Probl Pediatr Adolesc Health Care. 2014;44(6):170–181.
4. Kong Y. The role of experiential learning on students’ motivation and classroom engagement. Front Psychol. 2021;12:771272. doi:10.3389/fpsyg.2021.771272
5. Vallino LD, Brown AS. Assessing third-year medical students’ knowledge of and exposure to cleft palate before and after plastic surgery rotation. Ann Plast Surg. 1996;36(4):380–387. doi:10.1097/00000637-199604000-00009
6. Munabi NCO, Durnwald L, Nagengast E, Auslander A, Ntirenganya F, Magee WP. Pilot evaluation of the impact of a mission-based surgical training rotation on the plastic surgery skills and competencies development of general surgery residents in Rwanda. J Surg Educ. 2019;76(6):1579–1587. doi:10.1016/j.jsurg.2019.05.001
7. Munabi NCO, Durnwald L, Nagengast ES, Ntirenganya F, Magee WP. Long-term impact of a mission-based surgical training rotation on plastic surgery capacity building in Rwanda. J Surg Educ. 2020;77(1):124–130. doi:10.1016/j.jsurg.2019.08.009
8. Tractenberg RE, Gushta MM, Mulroney SE, Weissinger PA. Multiple choice questions can be designed or revised to challenge learners’ critical thinking. Adv Health Sci Educ Theory Pract. 2013;18(5):945–961. doi:10.1007/s10459-012-9434-4
9. Pham H, Trigg M, Wu ST, et al. Choosing medical assessments: does the multiple-choice question make the grade? Educ Health. 2018;31(2):65–71. doi:10.4103/efh.EfH_229_17
10. Boland RJ, Lester NA, Williams E. Writing multiple-choice questions. Acad Psychiatry. 2010;34(4):310–316. doi:10.1176/appi.ap.34.4.310
11. Morrison S, Walsh Free K. Writing multiple-choice test items that promote and measure critical thinking. J Nurs Educ. 2001;40(1):17–24. doi:10.3928/0148-4834-20010101-06
12. Kehoe J. Writing multiple-choice test items. Pract Assess Res Eval. 1995;4(1):9.
13. Nielsen A, Hansen J, Skorupinski B, et al. Consensus conference manual. The Hague: LEI; 2006. Available from: https://estframe.net/uploads/qyEJ2dPN/et4_manual_cc_binnenwerk_40p.pdf.
14. Accreditation Council for Graduate Medical Education. Surgery Milestones. Chicago: ACGME; 2019. Available from: https://www.acgme.org/globalassets/PDFs/Milestones/SurgeryMilestones.pdf.
15. Accreditation Council for Graduate Medical Education. Milestones by specialty. Chicago: ACGME; 2023. Available from: https://www.acgme.org/milestones/milestones-by-specialty/.
© 2024 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution-NonCommercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.