Exploring Differences in Clinical Decisions Between Medical Students and Expert Clinicians
Authors: Rojas M, Price A, Kim CJ, Chen SF, Gutierrez K, Wieman C, Salehi S
Received 20 August 2024
Accepted for publication 11 December 2024
Published 24 December 2024; Volume 2024:15, Pages 1285–1297
DOI https://doi.org/10.2147/AMEP.S492302
Marcos Rojas,1,* Argenta Price,2,* Candice Jeehae Kim,1,3 Sharon F Chen,3 Kathleen Gutierrez,3 Carl Wieman,1,4 Shima Salehi1
1Graduate School of Education, Stanford University, Stanford, California, USA; 2Doerr School of Sustainability, Stanford University, Stanford, California, USA; 3School of Medicine, Stanford University, Stanford, California, USA; 4Department of Physics, Stanford University, Stanford, California, USA
*These authors contributed equally to this work
Correspondence: Marcos Rojas, Graduate School of Education, Stanford University, 482 Galvez Mall, Stanford, California, 94305, USA, Tel +1 (650) 723-2109, Email [email protected]
Background: Despite an abundance of theoretical models, numerous challenges remain in bridging theory and practice in the teaching and assessment of clinical reasoning. This study compares clinical reasoning practices and decisions between medical students and expert clinicians using a problem-solving framework from the learning sciences, which treats clinical reasoning as a set of distinct, observable actions in clinical case solving. We examined students at various training stages alongside expert clinicians to address the research question: How do expert clinicians and medical students differ in their practices and decisions during the diagnostic process?
Methods: We developed a questionnaire about a pediatric infectious disease case based on the problem-solving framework from the learning sciences to probe clinical reasoning decisions. The questionnaire had four sections: medical history, physical examination, medical tests, and working diagnosis. The questionnaire was administered at Stanford University between January 2019 and June 2023 to collect data from 10 experts and 74 medical students. We recruited participants through maximum variation sampling. We applied deductive content analysis to systematically code responses to identify patterns in the execution of the practices and decisions across the questionnaire.
Results: This research introduces a detailed, empirically developed framework with the potential to bridge theory and practice, offering practical insights for medical instructors teaching clinical reasoning to students across various stages of training. The framework comprises nine practices, with a total of twenty-nine decisions that must be made when carrying out these practices. Differences between experts and students centered on ten decisions within four practices: Differential diagnosis formulation, Diagnostic plan and execution, Clinical data reassessment, and Clinical solution review.
Conclusion: We were able to identify nuanced differences in clinical reasoning between students and expert physicians under one comprehensive problem-solving framework from the learning sciences. Identifying key clinical reasoning practices and decision differences could help develop targeted instructional materials and assessment tools, aiding instructors in fostering clinical reasoning in students.
Keywords: clinical reasoning, clinical reasoning practice, clinical decisions, decision making, clinical problem-solving
Introduction
Clinical reasoning is a crucial skill that medical students must develop throughout their professional training.1–3 Physicians employ clinical reasoning during medical consultation while taking the clinical history, performing the physical examination, and selecting medical tests to decide on a working diagnosis. The emphasis on clinical reasoning as a core skill is based on both the need for doctors to be able to reason through complex clinical cases and the recognition that diagnostic errors can seriously impact patients’ lives.4–6 The extensive research on clinical reasoning has yielded varying conceptualizations and frameworks for clinical reasoning and its teaching methods.7,8
Research on clinical reasoning over the past five decades has resulted in various theories and models that have provided a better understanding of this complex process.1,8 Clinical reasoning can be conceptualized as a cognitive, contextually situated, or socially mediated activity.9 In the 1970s, researchers introduced clinical reasoning as a cognitive activity, focusing on problem-solving skills through the hypothetico-deductive method, where expert physicians generate and test hypotheses.10,11 However, this approach was too general and lacked a strong connection to expertise, suggesting that correct diagnoses were more linked to content knowledge than to a universal process. In the 1980s, the focus shifted to expert knowledge, with theories emphasizing the accumulation of representative cases to solve new problems. The main limitation of this decade's theories was the lack of emphasis on how knowledge is organized, which became apparent in the 1990s when mental representations emerged as a key concept.12 During this period, the illness script theory was prominent, highlighting that experts not only possess more knowledge but also have it better organized and more accessible than novices. This led to the observation that experts often reach potential diagnoses rapidly in simple cases, reserving thorough analysis for more complex scenarios.13 In the 2000s, dual processing and cognitive continuum theories became central, explaining clinical reasoning through the interaction of fast, intuitive (system 1) and slow, analytical (system 2) processes. These theories also introduced the importance of metacognition, perception, intuition, and emotions in clinical reasoning.3,14 While these theories effectively explain how expert physicians approach clinical cases, they largely overlook the developmental path medical students must follow to reach this expert level.
The goal of this work is to contribute to understanding and facilitating the developmental path medical students must follow to become expert clinicians by adapting a framework from science education that breaks the cognitive processes of clinical reasoning into distinct sets of practices and decisions that can be explicitly taught and practiced. With this adaptation, we hope to support medical educators in developing practical instructional strategies for teaching clinical reasoning.5,7
The Cognitive Practices and Decisions Framework from the Learning Sciences
Applying research from the learning sciences can advance our understanding of the development and teaching of the complex skill of clinical reasoning.15,16 Research in the learning sciences has led to an empirical framework that conceptualizes competency in solving complex problems as effective engagement in distinct and measurable practices and decisions. In applying this framework to clinical reasoning, we define clinical reasoning competency as reaching a mastery level in a set of distinct practices and decisions. The specificity of this framework will aid in creating effective instructional materials and tools to scaffold the development of these practices and decisions to foster clinical reasoning.
The problem-solving framework from the learning sciences, initially targeted at undergraduate science and engineering education, involves nine practices, with a total of twenty-nine decisions that need to be made when carrying out these practices.15,16 Table 1 describes the practices as they translate to the context of clinical reasoning. The decisions are the specific choices that must be made to carry out each practice (see Supplemental Appendix 1 for a detailed list of all twenty-nine decisions and related practices). Engaging in these practices is a dynamic, iterative cognitive process that does not follow a linear pathway. In situations where a clear solution is not apparent, experts skillfully solve the problem by iteratively engaging in these practices and decisions.
Table 1 Problem-Solving Practices in the Context of Clinical Reasoning and Their Definitions
We identify clear connections between the diverse theories and models aimed at understanding clinical reasoning and the problem-solving practices and decisions framework. For example, both the illness script theory, a widely used approach to teaching and assessing clinical reasoning,13,17–22 and the problem-solving framework acknowledge the crucial role of discipline-specific previous knowledge, organized as mental models.14 In the illness script literature, expertise in clinical reasoning is developed by iteratively engaging in clinical practice and building these illness scripts. The problem-solving framework also includes selecting a mental model as a crucial decision in the process of solving a clinical case but introduces additional essential practices that facilitate the learner’s journey toward becoming an expert clinician.
The problem-solving framework also draws a clear connection with dual-processing theory, where the selection of a mental model or pattern recognition forms the core of the intuitive system, and the problem-solving skill forms the core of the analytic system.23 Similarly, the problem-solving framework entails both analytical and non-analytical processes, transitioning from rigid knowledge structures to a fluid interaction between knowledge and problem-solving abilities. In contrast to dual processing, the problem-solving framework incorporates more essential cognitive components, including reflection, and entails specific practices and decisions within both analytical and non-analytical processes, addressing a significant gap in the dual-processing theory, which lacks provision for distinct, actionable steps and practices.24
The development of the problem-solving framework was based on the expertise of medical doctors, scientists, and engineers, hence confirming that expert physicians engage in these particular problem-solving practices and decisions while diagnosing clinical cases.16 However, varying levels of mastery in the execution of these practices between medical students and expert physicians remain unexplored. Understanding how the execution of these distinct practices and decisions differs between these two groups would enable instructors to create targeted instructional methods. Such methods would focus on specific clinical reasoning practices and underlying decisions, thereby helping medical students build competency in the targeted practices and decisions. This approach is aligned with competency-based medical education25 and with the principles of deliberate practice from research on the development of expertise.26,27 While several theories and models in clinical reasoning elucidate how experts approach clinical cases, they often fall short in guiding novice students toward expert performance. The question remains: How do we facilitate students' transition to expert-level performance? The problem-solving framework addresses this by breaking down clinical reasoning into distinct practices and decisions. This decomposition allows for precise feedback on specific skills essential for clinical reasoning, offering learners a clear, actionable path toward achieving expertise.
Research Question
Our research question is: How do expert clinicians and medical students differ during the diagnostic process in different clinical reasoning practices and decisions, as identified in the problem-solving framework? Insights from this work will contribute to the development of teaching and assessment instruments that accurately measure competency in these practices and decisions and distinguish different learner levels. With such instruments, it will be possible to design instructional practices to support gaining competency fundamental to doctors’ diagnostic reasoning.
Methods
Research Design
Our study employs Salehi and Price’s problem-solving practices and decisions framework15,16 as a basis for designing a qualitative questionnaire that captures the execution of clinical reasoning practices and decisions among expert clinicians and medical students. This questionnaire is based on templates from prior research that investigated how clinicians navigate complex cases, informing the necessary steps to teach or guide students in clinical scenarios.28
Setting and Participants
We administered the questionnaire to 10 expert clinicians and 74 medical students from Stanford University between January 2019 and June 2023. We recruited participants through maximum variation sampling.29 We chose five of the 10 experts from a select group of faculty who teach clinical reasoning to medical students. This faculty group is highly selective and annually reviewed, and all continue to care for patients while teaching. We contacted the other five experts based on their clinical experience and the expected likelihood of participation. The expert sample included five pediatricians and five internal medicine doctors. Since the questionnaire was based on a pediatric case, this sample allowed for a comprehensive review of both specialized and general medical problem-solving practices. Seventy-four medical students participated: 50 were pre-clinical students (first and second years), and 24 were clinical students (third, fourth, and fifth years), which allowed us to examine how clinical experience and stage of medical training relate to students' ability to solve the pediatric case.
Data Collection Instruments and Methods
Based on previous research, we created a questionnaire to capture problem-solving practices and decisions.28 It contains a pediatric case about an infant for whom the correct diagnosis is a urinary tract infection (UTI), together with fifteen questions (thirteen open-ended, two multiple-choice) that mimic real-life clinical reasoning processes (see Supplemental Appendix 2, Figure 1 "Clinical Reasoning Questionnaire" for the complete version of the questionnaire). We set the difficulty of the case so that both pre-clinical and clinical medical students should have the necessary content knowledge to diagnose it correctly. The questions probe decisions required for five of the problem-solving practices. We left out clinical symptom stratification and two of the reflection practices because they were either less relevant for this relatively simple case or difficult to measure in a 30-minute questionnaire. Respondents also provided demographic data, including medical specialty for experts and current year in the medical program for students.
The questionnaire had four sections: Medical History, Physical Examination, Medical Tests, and Working Diagnosis. These sections presented a clinical case and related history, provided a comprehensive patient physical exam, inquired about necessary medical tests, and asked respondents to utilize the provided information to choose a working diagnosis and suggest next steps.
Pilot of the Questionnaire
We conducted thorough pilot testing of the questionnaire through expert review and "think-aloud" interviews with five experts (three pediatricians, two internal medicine doctors) and five students (four pre-clinical, one clinical).30 Both pediatric and non-pediatric faculty had consistent responses, indicating that specialized content knowledge did not significantly affect their diagnostic process. Faculty reasoning was notably different from students'. This process confirmed that the questions were capturing the clinical reasoning process as intended and not just the recall of specific knowledge. The interviews with students confirmed the clarity of the questionnaire's wording and the relevance of the close-ended questions. The pilot testing led to only minor wording changes, so data from the pilot tests were included in the results.
Administration of the Questionnaire to Participants
We administered the questionnaire virtually via Qualtrics to 69 additional medical students, intentionally including both pre-clinical and clinical stages, along with another five expert clinicians. We recruited student volunteers through a departmental mailing list and compensated them with $10 gift cards. We advised participants to complete the questionnaire independently, in one sitting, and without using external resources. Students completed the questionnaire in 27 minutes on average, while experts took 54 minutes on average; the difference arose primarily because five of the experts completed the questionnaire during think-aloud interviews, elaborating on their reasoning as they progressed, which naturally extended their completion time.
Data Processing and Data Analysis
The coding process began with an iterative review of responses by team members M.R. and A.P., using a sample of 5 experts and a random sample consisting of 10% of students across both stages. Using this sample, we developed a codebook by deductive content analysis, from the perspective of the problem-solving practices and decisions framework, through a series of iterative meetings for robustness and consistency.31,32 The resulting codebook contained 44 unique codes, each capturing content that reveals differences in how practices and decisions are executed. We based this codebook on the questionnaire’s 13 substantive questions, excluding questions three and seven, which focused on measuring medical knowledge (see Supplemental Appendix 3 for the codebook). We established inter-rater agreement through multiple further coding rounds, achieving agreement above 80% in each code when coding 13.5% of the data.33 Table 2 outlines the 13 questions from the questionnaire that were coded and analyzed, along with their corresponding problem-solving practices and decisions.
Table 2 Problem-Solving Practices and Decisions Across the Four Sections of the Questionnaire
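The inter-rater agreement check described above (agreement above 80% for each code on 13.5% of the data) amounts to a simple per-code tally. The sketch below is a minimal illustration in Python, using hypothetical coder data and variable names rather than the study's actual analysis, of how percent agreement could be computed for each code:

```python
# Minimal sketch (hypothetical data): per-code percent agreement between two coders.
# Each coder's output maps a response ID to the set of codes they applied to it.
from typing import Dict, Set

def per_code_agreement(coder_a: Dict[str, Set[str]],
                       coder_b: Dict[str, Set[str]],
                       codes: Set[str]) -> Dict[str, float]:
    """For each code, return the fraction of responses on which the coders agree
    (either both applied the code or both did not)."""
    shared_ids = coder_a.keys() & coder_b.keys()
    agreement = {}
    for code in codes:
        matches = sum(
            (code in coder_a[rid]) == (code in coder_b[rid]) for rid in shared_ids
        )
        agreement[code] = matches / len(shared_ids)
    return agreement

# Hypothetical example: three responses, two codes.
a = {"r1": {"key_features"}, "r2": {"key_features", "mental_model"}, "r3": set()}
b = {"r1": {"key_features"}, "r2": {"mental_model"}, "r3": set()}
print(per_code_agreement(a, b, {"key_features", "mental_model"}))
# Codes whose agreement falls below the 80% threshold would prompt further coding rounds.
```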
Finally, author M.R. coded the complete data set of 10 experts and 74 medical students using the finalized codebook. The coder was not blind to participant demographics or study group assignment. We then analyzed the coded data using content analysis to compare problem-solving execution between the expert clinicians and medical students.34
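As an illustration of this group comparison step, the following minimal sketch (again with hypothetical code names and data, not the study's results) tallies how often each code was applied in each participant group and reports counts with percentages, the form in which differences are summarized in the Results:

```python
# Minimal sketch (hypothetical data): frequency of each code by participant group.
from collections import Counter, defaultdict

# Each record: (group, set of codes applied to that participant's response)
coded_responses = [
    ("expert",       {"organizes_general_to_focal", "links_data_to_diagnosis"}),
    ("pre_clinical", {"links_data_to_diagnosis"}),
    ("clinical",     {"organizes_general_to_focal"}),
]

group_sizes = Counter(group for group, _ in coded_responses)
code_counts = defaultdict(Counter)
for group, codes in coded_responses:
    for code in codes:
        code_counts[code][group] += 1

for code, counts in code_counts.items():
    summary = ", ".join(
        f"{group}: {counts[group]} ({100 * counts[group] / n:.0f}%)"
        for group, n in group_sizes.items()
    )
    print(f"{code} -> {summary}")
```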
Ethical Considerations
The Institutional Review Board (IRB) of Stanford University reviewed and approved the study protocol (protocol no. 48785). Participants provided informed consent before participating in the study. We confidentially stored data with unique identifiers on encrypted devices.
Reflexivity and Trustworthiness
In this study, a diverse research team, including medical doctors, educational scientists, and doctoral students, developed and administered the questionnaire to medical students and experts using a methodological approach that emphasized reflexivity and trustworthiness. To maintain research integrity and minimize bias, we employed strategies such as member checking, peer debriefing, and meticulously maintained audit trails.35 Regular team meetings ensured that data interpretations and assumptions were valid, enhancing the trustworthiness of the findings. This approach strengthened the research's robustness and credibility, aiding the understanding of complex problem-solving practices and decisions.
Results
Detailed results for all questions mapped to related practices and decisions are available in Supplemental Appendix 4. Table 3 and the following sections present a summary of results, organized by the main differences found in problem-solving practices between experts and students. We did not differentiate clinical students based on whether they had completed a pediatric rotation due to the small sample size within the clinical group. Additionally, preliminary analyses showed that the results were qualitatively similar when considering all clinical students as a single group.
Table 3 Differences in Problem-Solving Practices and Decisions Between Experts and Medical Students
Differential Diagnosis Formulation: Differences in Deciding on Key Features, Predictive Framework, and Potential Solutions
Experts and students demonstrated differences in the practice of Differential diagnosis formulation, which was examined in the Medical history, Physical examination, and Working diagnosis sections. When deciding on key features, experts tended to identify more features and showed higher agreement on those features compared to students. Experts, for instance, identified an average of 13 features in the Medical history section, surpassing the 8.8 and 9.5 averages of pre-clinical and clinical students, respectively. More than 50% of experts concurred on all 13 key features, whereas, at the same level of agreement, both student groups concurred on only 5 features. Experts also displayed more structured organization of the key features: 7 (70%) experts arranged features of the physical examination from general to focal examination, compared to 16 (32%) pre-clinical and 9 (37.5%) clinical students.
Another difference in this practice was deciding on mental models, as seen in the Medical history section. Experts articulated a clearer relationship between selected key features and proposed mental models to explain the data; 9 (90%) experts explained this relationship, in contrast to 31 (62%) pre-clinical and 17 (70.8%) clinical students.
The last difference in this practice was deciding on potential solutions in the Medical history and Physical examination sections. After analyzing key features, experts chose different possible diagnoses than students, with little variability among experts. For example, after analyzing the patient's medical history, 9 (90%) experts chose a respiratory disease as their first option. In contrast, students predominantly opted for gastrointestinal disease: 24 (48%) pre-clinical students and 21 (87.5%) clinical students. Compared to pre-clinical students, experts also described more specific diagnoses (eg, "pneumonia" instead of "respiratory disease"). In the Medical history section, 100% of experts and clinical students (10 and 24, respectively) mentioned specific potential diagnoses, compared to 50% of pre-clinical students (25 out of 50).
Diagnostic Plan and Execution: Nuanced Differences in the Information Needed and Data Prioritization
Differences between experts and students in the diagnostic plan and execution practice were apparent in their decisions about the information needed and prioritization, measured in the Medical history and Medical tests sections. Experts and students differed in their choices of exams and tests to order. For the physical exam, while most students selected abdominal (43 (86%) pre-clinical, 22 (91.7%) clinical), pulmonary (40 (80%) pre-clinical, 22 (91.7%) clinical), and head and neck (32 (64%) pre-clinical, 23 (95.8%) clinical), all experts selected general, head and neck, and pulmonary, with 9 (90%) choosing abdominal and neurological. Experts also demonstrated a more structured approach to data collection, organizing the information requested from the physical exam from general to focal examination (10 (100%) experts, 43 (86%) pre-clinical, 18 (75%) clinical). Finally, experts specified more precisely the data they wanted to collect.
Another difference between experts and students in this practice was deciding on priorities. Experts prioritized data more explicitly, focusing on the most important information to specify the differential diagnosis or what needs immediate attention (6 (60%) experts, 5 (10%) pre-clinical, 4 (16.7%) clinical). Experts also frequently described the relationships between prioritized data, potential solutions, and key features (for example, 5 (50%) of experts on data requested from the physical exam), unlike students (8 (16%) pre-clinical, 2 (8.3%) clinical) who mostly connected the data requested to potential solutions alone.
Clinical Data Reassessment: Differences When Reflecting on New Data
The clinical data reassessment practice was examined in the Physical examination, Medical tests, and Working diagnosis sections. Experts and clinical students were similar in deciding how collected information related to their predictions. After receiving new data, they converged on similar diagnoses and related those to the new information. However, pre-clinical students differed in their considerations of solutions compared to experts. For instance, in the Physical Examination section, after receiving the patient’s physical exam, more than 50% of pre-clinical students considered sepsis, while more than 50% of clinical students and experts chose both UTI/Pyelonephritis and sepsis.
The decision of narrowing down the problem to a more specific problem in the Medical tests section also differed across groups. Experts more frequently described how the selected information connected with potential diagnoses, allowing them to narrow down the problem to testing for those diagnoses. Eight (80%) experts explained the relationships between the information requested and diagnosis, as opposed to 29 (58%) of pre-clinical and 18 (75%) of clinical students.
Clinical Solution Review: Differences When Reflecting on the Working Diagnosis
The practice of clinical solution review was examined in the Working diagnosis section. Nine out of 10 experts identified UTI/Pyelonephritis as the final working diagnosis, compared to 27 (54%) pre-clinical students and 15 (62.5%) clinical students. Experts were more organized in their decision about presentation: 7 (70%) organized their assessment of the patient according to Medical history, Physical exam, Medical tests, and then Working diagnosis. Only 21 (42%) pre-clinical and 9 (37.5%) clinical students used the same organization.
Finally, there were differences in considering broader implications. Experts and clinical students generally agreed on treatment as a next step (35 (70%) pre-clinical, 22 (91.7%) clinical, 10 (100%) experts), but more experts also considered additional confirmatory medical tests (8 (16%) pre-clinical, 3 (12.5%) clinical, 5 (50%) experts).
Differences in Problem-Solving Practices and Decisions Between Experts and Students
In summary, the data revealed that students vary in their initial diagnostic approach and prioritization of key features, but as they progress in the questionnaire and gain more information, their final diagnoses tend to converge towards those of expert clinicians. Our questionnaire revealed differences across student stages in Diagnostic plan and execution, with clinical students more closely aligning with experts.
Discussion
In this study, we adopted the empirically developed problem-solving framework from the learning sciences to conceptualize clinical reasoning as a set of distinct practices and their entailing decisions. We designed a questionnaire based on this framework to examine how expert physicians and medical students at different stages of medical training vary in their mastery of these clinical reasoning practices and decisions. We found specific disparities between medical students and expert physicians. Although many of our results are consistent with prior medical education research, our examination of clinical reasoning is both broader in scope and more nuanced in each of the practice and decision elements. Our hope is that this work can inform the development of an assessment tool that parsimoniously yet comprehensively measures students' levels of mastery in different clinical reasoning practices and decisions and offers instructors ready-to-use, concrete instructional strategies to support students' development of clinical reasoning mastery.
Clinical Reasoning from the Point of View of the Problem-Solving Practices and Decisions Framework
Although the problem-solving framework we adopted in this work comes from outside medical education, it “translated” well to clinical reasoning and helped us identify several nuanced differences between experts and students under one framework. For example, in the practice of formulating differential diagnoses, our results highlighted clear distinctions in the identification of important features, the organization of information, and the choice of mental models to apply to the case. Experts not only selected a broader set of features but also with greater consistency and organization than students. This systematic approach implies well-established mental models, likely developed over years of clinical experience. This is in line with Marshall, who emphasized the adeptness of expert clinicians in making diagnoses with essential and systematic data.36 Furthermore, when deciding on the appropriate mental model for the case, experts demonstrated a firm understanding of how data correlates with specific mental models, especially at the beginning of the case. This aligns with our previous study emphasizing the critical role of mental models in clinical case resolution.16 Similarly, Kassirer posits that experts utilize a concatenation of concepts early in their diagnostic process to guide their reasoning through the case.37 Our finding also resonates with Nendaz et al, who observed that some experts tend to collect as much information as possible early in the case to test several hypotheses before refocusing on the specific complaint of the patient or the main goal of the case.38 The implication of these insights for teaching is that students need additional practice at identifying and organizing key features and selecting and building appropriate mental models of features and associated illnesses, especially at the start of a case.
In the practice of diagnostic plan and execution, our analyses identified differences between levels of student experience, in addition to the differences between students and experts. Clinical students were more like experts in selecting information to collect from the physical exam and medical tests compared to pre-clinical students. This is consistent with Kern and Doherty’s finding that sometimes medical students seek diagnostically irrelevant information.39 In data collection, experts were more organized and methodical than both student groups (eg, differential diagnosis formulation). Also, they prioritized the requested information by correlating it to potential diagnoses and key features of the case. This result is consistent with observations from Audétat et al and Higgs et al, who reported that medical students often struggle with prioritizing information and may miss or incorrectly select pieces of information.3,40 We theorize that this structured approach of experts during diagnostic planning and execution stems from comprehensive mental models developed over extensive clinical practice, which gives them clarity and precision during data collection. Our finding that clinical students were more aligned with experts than pre-clinical students on which information to collect, but not in prioritizing, suggests that clinical students are developing more comprehensive mental models but still need more practice applying them. Instructional activities such as structured reflection on mental models in the context of diagnostic planning could help students develop this problem-solving practice.
In the practice of clinical data reassessment, after acquiring new data, both experts and clinical students converged on similar potential diagnoses and associated them with the new information. Intriguingly, experts delineated the relationship between data and potential diagnoses with greater frequency than both student groups. This deviation emerges after reflection on new data. The difference may be rooted in the depth and breadth of expert clinical knowledge, based on which they can better make predictions and compare the new information to those predictions. Groves et al similarly argue that the proficiency of experts in diagnosis depends not merely on data collection but on the adept integration of clinical data.41 This finding emphasizes the pivotal role of experience in sharpening diagnostic accuracy. Students can learn this integration by explicitly reflecting on how new information connects to their mental models and predictions about the case.
In the practice of clinical solution review, nearly all experts identified the correct diagnosis, compared to roughly two-thirds of clinical and half of pre-clinical students. Additionally, experts, unlike students, systematically structured their patient assessment and sought additional tests. The quality and accuracy of diagnostic hypotheses have been shown to improve with clinical experience, a notion echoed by Neufeld et al.11 Our findings thus support the notion that increased clinical practice improves students' diagnostic accuracy. Our data also emphasize the importance of students practicing identifying and organizing key features, selecting information to collect, and reflecting on how information relates to their predictions.
Across the questionnaire, clinical students were closer to experts in the execution of practices and decisions. This matches other studies, such as Jackson et al, and Elstein et al, that have found that experience and knowledge matter in the development of expertise.10,42
Our results are consistent with prior work in medical education research but encompass a broader range and more precise characteristics of clinical reasoning practices based on one unifying framework. This could facilitate the creation of easy-to-implement assessments and support effective instructional practices for teaching diagnostic clinical reasoning.
Implications
Medical education stands at the intersection of theoretical knowledge and practical application, and bridging theory and practice is essential. This work aims to develop approaches that enhance this integration. Furthermore, it contributes to recognizing the nuances in clinical reasoning across varying levels of expertise to support effective instructional practices. Our study used one encompassing problem-solving framework to bridge theory and practice by 1) conceptualizing clinical reasoning as a set of practices and their entailing decisions and 2) capturing nuanced differences in clinical reasoning practices and decisions between expert doctors and medical students. Using this framework, instructors can identify specific student deficiencies and offer actionable steps for improvement.
Strengths, Limitations, and Future Research
A strength of this study is that it introduces a novel approach to capturing different aspects of clinical reasoning using an empirical framework. Additionally, the framework can better characterize the reasoning processes of students relative to experts and hence indicate which elements to concentrate on in teaching.
This study has several limitations. The focus of the questionnaire on a single pediatric case limits the range of clinical scenarios examined. We also did not account for demographic background. The questionnaire format may also have limited the engagement in some practices and hence might have limited the depth of insights. Some aspects, like problem decomposition and certain reflection practices, were not explored due to the case’s simplicity and time constraints. Additionally, our study was limited to Stanford University with a purposive sample for participants.
Drawing on Higgs' assertion that clinical reasoning is deeply influenced by culture and context, there is a need to explore how these clinical reasoning practices vary across different medical scenarios and diverse study participants.3 Building on these insights, our next steps will involve validating the inferences drawn from this questionnaire regarding students' clinical reasoning practices and decisions to develop practical teaching and assessment tools for medical educators. Additionally, we plan to explore further the reflective practices embedded within this framework.
Conclusion
This study used a problem-solving framework from the learning sciences to analyze clinical reasoning as a set of distinct practices and decisions, revealing specific, observable differences between expert clinicians and medical students in key areas of diagnostic reasoning. Our findings demonstrate nuanced distinctions in Differential diagnosis formulation, Diagnostic plan and execution, Clinical data reassessment, and Clinical solution review, confirming both established and new insights aligned with our research objectives. These results provide a foundation for developing precise teaching and assessment tools that can effectively measure and support competency in clinical reasoning across different learner levels, advancing educators’ ability to guide students toward expertise in diagnostic practice.
Abbreviation
UTI, Urinary Tract Infection.
Ethics Approval and Informed Consent
The study protocol (protocol no. 48785) was reviewed and approved by Stanford University's Institutional Review Board (IRB). All participants provided informed consent before enrollment in this study.
Acknowledgments
Special thanks to all the participants who generously shared their time by completing our questionnaire; your invaluable contributions have been instrumental in enriching the depth and quality of this research.
Funding
This work was funded by the Howard Hughes Medical Institute through an HHMI Professor grant to C.E.W. The sponsor had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Disclosure
The authors report no conflicts of interest in this work.
References
1. Norman G. Research in clinical reasoning: past history and current trends. Med Educ. 2005;39(4):418–427. doi:10.1111/j.1365-2929.2005.02127.x
2. Connor DM, Durning SJ, Rencic JJ. Clinical Reasoning as a Core Competency. Acad Med. 2020;95(8):1166–1171. doi:10.1097/ACM.0000000000003027
3. Higgs J, Jensen GM, Loftus S, Christensen N, editors. Clinical Reasoning in the Health Professions.
4. Newman-Toker DE, Peterson SM, Badihian S, et al. Diagnostic errors in the emergency department: a systematic review. Agency for Healthcare Research and Quality (AHRQ). 2022. doi:10.23970/AHRQEPCCER258
5. Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med. 2017;92(1):23–30. doi:10.1097/ACM.0000000000001421
6. Auerbach AD, Lee TM, Hubbard CC, et al. Diagnostic errors in hospitalized adults who died or were transferred to intensive care. JAMA Intern Med. 2024;184(2):164. doi:10.1001/jamainternmed.2023.7347
7. Young M, Thomas A, Lubarsky S, et al. Drawing boundaries: the difficulty in defining clinical reasoning. Acad Med. 2018;93(7):990–995. doi:10.1097/ACM.0000000000002142
8. Yazdani S, Hoseini Abardeh M. Five decades of research and theorization on clinical reasoning: a critical review. Adv Med Educ Pract. 2019;10:703–716. doi:10.2147/AMEP.S213492
9. Koufidis C, Manninen K, Nieminen J, Wohlin M, Silén C. Unravelling the polyphony in clinical reasoning research in medical education. J Eval Clin Pract. 2021;27(2):438–450. doi:10.1111/jep.13432
10. Elstein AS, Shulman LS, Sprafka SA. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, MA: Harvard University Press; 1978.
11. Neufeld VR, Norman GR, Feightner JW, Barrows HS. Clinical problem-solving by medical students: a cross-sectional and longitudinal analysis. Med Educ. 1981;15(5):315–322. doi:10.1111/j.1365-2923.1981.tb02495.x
12. Schmidt HG, Boshuizen HPA. On the origin of intermediate effects in clinical case recall. Mem Cognit. 1993;21(3):338–351. doi:10.3758/BF03208266
13. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: theory and implication [published erratum appears in Acad Med 1992 Apr;67(4):287]. Acad Med. 1990;65(10):611–621. doi:10.1097/00001888-199010000-00001
14. Croskerry P. A universal model of diagnostic reasoning. Acad Med. 2009;84(8):1022–1028. doi:10.1097/ACM.0b013e3181ace703
15. Salehi S. Improving problem-solving through reflection. Published online 2018.
16. Price AM, Kim CJ, Burkholder EW, Fritz AV, Wieman CE. A detailed characterization of the expert problem-solving process in science and engineering: guidance for teaching and assessment. CBE Life Sci Educ. 2021;20(3):ar43. doi:10.1187/cbe.20-12-0276
17. Feltovich PJ, Barrows HS. Issues of generality in medical problem solving. In: Schmidt HG, de Volder ML, editors. Tutorials in Problem-Based Learning; 1984.
18. Hornos EH, Pleguezuelos EM, Brailovsky CA, Harillo LD, Dory V, Charlin B. The practicum script concordance test: an online continuing professional development format to foster reflection on clinical practice. J Contin Educ Health Prof. 2013;33(1):59–66. doi:10.1002/chp.21166
19. Lubarsky S, Dory V, Duggan P, Gagnon R, Charlin B. Script concordance testing: from theory to practice: AMEE Guide No. 75. Med Teach. 2013;35(3):184–193. doi:10.3109/0142159X.2013.760036
20. Charlin B, Roy L, Brailovsky C, Goulet F, van der Vleuten C. The script concordance test: a tool to assess the reflective clinician. Teach Learn Med. 2000;12(4):189–195. doi:10.1207/S15328015TLM1204_5
21. Rencic J. Twelve tips for teaching expertise in clinical reasoning. Med Teach. 2011;33(11):887–892. doi:10.3109/0142159X.2011.558142
22. Cooke S, Lemay JF. Transforming medical assessment: integrating uncertainty into the evaluation of clinical reasoning in medical education. Acad Med. 2017;92(6):746–751. doi:10.1097/ACM.0000000000001559
23. Pelaccia T, Tardif J, Triby E, Charlin B. An analysis of clinical reasoning through a recent and comprehensive approach: the dual-process theory. Med Educ Online. 2011;16(1):5890. doi:10.3402/meo.v16i0.5890
24. Evans JBT, Stanovich KE. Dual-process theories of higher cognition: advancing the debate. Perspect Psychol Sci. 2013;8(3):223–241. doi:10.1177/1745691612460685
25. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645. doi:10.3109/0142159X.2010.501190
26. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100(3):363–406. doi:10.1037/0033-295X.100.3.363
27. Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance: clinical performance. Med Educ. 2007;41(12):1124–1130. doi:10.1111/j.1365-2923.2007.02946.x
28. Price A, Salehi S, Burkholder E, et al. An accurate and practical method for assessing science and engineering problem-solving expertise. Int J Sci Educ. 2022;44(13):2061–2084. doi:10.1080/09500693.2022.2111668
29. Gill SL. Qualitative Sampling Methods. J Hum Lact. 2020;36(4):579–581. doi:10.1177/0890334420949218
30. Boynton PM. Administering, analysing, and reporting your questionnaire. BMJ. 2004;328(7452):1372–1375. doi:10.1136/bmj.328.7452.1372
31. Ando H, Cousins R, Young C. Achieving saturation in thematic analysis: development and refinement of a codebook. Compr Psychol. 2014;3:03.CP.3.4. doi:10.2466/03.CP.3.4
32. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15(9):1277–1288. doi:10.1177/1049732305276687
33. McAlister A, Lee D, Ehlert K, Kajfez R, Faber C, Kennedy M. Qualitative coding: an approach to assess inter-rater reliability. In: 2017 ASEE Annual Conference & Exposition Proceedings; 2017.
34. Vaismoradi M, Turunen H, Bondas T. Content analysis and thematic analysis: implications for conducting a qualitative descriptive study. Nurs Health Sci. 2013;15(3):398–405. doi:10.1111/nhs.12048
35. Maxwell J. Designing a Qualitative Study. In: The SAGE Handbook of Applied Social Research Methods. SAGE Publications, Inc.; 2009:214–253. doi:10.4135/9781483348858.n7
36. Marshall J. Assessment of problem-solving ability. Med Educ. 1977;11(5):329–334. doi:10.1111/j.1365-2923.1977.tb00623.x
37. Kassirer JP. Clinical problem solving: a behavioral analysis. Ann Intern Med. 1978;89(2):245. doi:10.7326/0003-4819-89-2-245
38. Nendaz MR, Gut AM, Perrier A, et al. Common strategies in clinical data collection displayed by experienced clinician-teachers in internal medicine. Med Teach. 2005;27(5):415–421. doi:10.1080/01421590500084818
39. Kern L, Doherty ME. Pseudodiagnosticity’ in an idealized medical problem-solving environment. Acad Med. 1982;57(2):100–104. doi:10.1097/00001888-198202000-00004
40. Audétat MC, Laurin S, Sanche G, et al. Clinical reasoning difficulties: a taxonomy for clinical teachers. Med Teach. 2013;35(3):e984–e989. doi:10.3109/0142159X.2012.733041
41. Groves M, O’Rourke P, Alexander H. The clinical reasoning characteristics of diagnostic experts. Med Teach. 2003;25(3):308–313. doi:10.1080/0142159031000100427
42. Jackson JM, Skelton JA, Peters TR. Medical students’ clinical reasoning during a simulated viral pandemic: evidence of cognitive integration and insights on novices’ approach to diagnostic reasoning. Med Sci Educ. 2020;30(2):767–774. doi:10.1007/s40670-020-00946-9