Advances in Medical Education and Practice, Volume 16
Artificial Intelligence in Medical Education: Promise, Pitfalls, and Practical Pathways
Author: Saroha S
Received 8 March 2025
Accepted for publication 2 June 2025
Published 14 June 2025 Volume 2025:16 Pages 1039–1046
DOI https://doi.org/10.2147/AMEP.S523255
Editor who approved publication: Dr Md Anwarul Azim Majumder
Sarup Saroha
University College London Medical School, London, WC1E 6BT, UK
Correspondence: Sarup Saroha, University College London Medical School, Gower Street, London, WC1E 6BT, UK, Email [email protected]
Abstract: Artificial intelligence (AI) is transforming healthcare, yet its integration into medical education remains limited. As AI-powered tools increasingly assist with diagnostics, administrative tasks, and clinical decision-making, future doctors must have the knowledge and skills to use them effectively. This article explores the role of AI in medical education, highlighting its potential to enhance efficiency, improve patient care, and foster innovation while addressing ethical and safety concerns. The widespread adoption of AI presents both opportunities and challenges. While AI-driven transcription tools reduce administrative burdens and machine learning algorithms enhance diagnostic accuracy, the risks of over-reliance, algorithmic bias, and patient data security remain critical concerns. To navigate these complexities, medical schools must incorporate AI-focused training into their curricula, ensuring graduates can critically assess and safely apply AI technologies in clinical practice. However, AI should not be seen as the only solution; non-technological improvements to clinical workflows must also be considered in parallel. This article proposes practical solutions, including optional AI modules, hands-on training with AI-powered diagnostic tools, and interdisciplinary collaboration through innovation laboratories. By embedding AI education into medical training, institutions can prepare students for a rapidly evolving healthcare landscape, ensuring AI is a tool for improved patient outcomes, not a source of unintended harm. As AI reshapes medicine, equipping future doctors with the skills to use it responsibly is essential for fostering a healthcare system that is efficient, ethical, and patient-centred.
Keywords: medical education, healthcare innovation, clinical training, medical technology, diagnostic AI, medical decision support
Introduction
On placement, it is common to hear our junior doctor colleagues express frustration at their substantial documentation obligations. Clinicians remain overburdened by laborious record-keeping practices, with tick-box proformas and defensive notetaking consuming nearly one-third of their working day and contributing to burnout.1,2 Combined with extensive waiting lists and stripped-back services,3 this heavy administrative burden raises concerns for budding clinicians about whether their future careers align with their younger selves’ vision of patient-centred care. Clearly, there is a need for solutions that alleviate this burden, one emerging solution being artificial intelligence (AI).
AI refers to computer systems that mimic specific aspects of human intelligence, such as learning, reasoning, and problem-solving, using a range of computational models and algorithms.4 AI is broadly classified into weak AI, which performs specific tasks without true understanding (eg, Siri, Alexa, self-driving cars),5,6 and strong AI, which would match or surpass human intelligence but currently remains theoretical.7,8
AI-powered scribes are beginning to automatically transcribe and summarise clinician–patient conversations in real time, helping to reduce administrative burden by streamlining medical documentation.9,10 This frees up time for doctors to connect better with patients even in short GP appointments.11 These tools also capture critical information from consultations that might otherwise be overlooked, ensuring a more comprehensive record.
Integrating new technologies into medical practice has a long and continuous history. In the mid-20th century, clinicians began using Dictaphones to record notes, which were later transcribed by typing pools, an early form of documentation support that is now obsolete. Over time, healthcare settings adopted pagers, electronic prescribing systems, and electronic health records, followed by handheld diagnostic tools and mobile apps, each reflecting the field’s responsiveness to technological advances. These innovations have consistently aimed to improve efficiency, communication, and safety, but have also introduced new challenges, such as increased screen time, reduced face-to-face interaction, and administrative burden.12–14 In this context, AI “scribes” represent not a radical departure but rather the latest evolution of established practices, offering a modern, scalable, and flexible approach to clinical documentation support.11 However, as with previous technological shifts, the integration of AI must be thoughtful and evidence-based, balancing the benefits of automation and insight generation with concerns around data privacy, over-reliance, and patient engagement to ensure responsible adoption in clinical education and practice.
The General Medical Council (GMC) states that doctors “are responsible for the decisions they make when using new technologies like AI, and should work only within their competence.”15 This coincides with the World Medical Association calling for reviewing medical curricula and education for all healthcare stakeholders to improve understanding of the risks and benefits of AI in healthcare.16 It follows then that in fostering good medical practice, medical schools must prepare students for the clinical environment that awaits them through building competence and familiarity in this evolving domain.
With 2 in 3 physicians using AI in their clinical practice, an increase of 78% from 2023,17 enthusiasm for the technology is rapidly growing. Yet, despite this uptake, a 2024 international survey of over 4500 students across 192 medical, dental, and veterinary faculties found that over 75% reported no formal AI education in their curriculum, highlighting a critical gap between technological advancement and medical training.18 This discrepancy underscores the urgency for medical schools to proactively incorporate AI teaching to ensure graduates are ready for the realities of modern clinical practice.
This article explores the need for AI education in medical schools, highlighting its potential to enhance efficiency, improve patient care, and foster innovation while addressing ethical and safety concerns.
Why AI Education is Essential
While AI may seem most relevant to data-driven fields like radiology, its use now spans many specialities, from GP notetaking to triage in emergency medicine.11,19,20 However, engagement with AI will vary. For example, clinicians in hands-on or communication-focused roles may need only a foundational understanding, while others will work closely with AI systems. This variability raises the question of whether AI education should be mandatory. A core foundation in digital literacy and responsible use would likely benefit all future doctors. Moreover, higher digital literacy has been associated with better academic performance and reduced procrastination.21,22
Beyond its role as a technical tool, AI carries broader implications for the doctor–patient relationship. While AI may enhance efficiency and support clinical decision-making, over-reliance on automated outputs could depersonalise care or erode the human connection central to medicine. Trust, empathy, and individualised judgement are core components of patient-centred care that cannot be replicated by algorithms. Excessive use of algorithmic outputs may discourage clinical reasoning, while opaque “black-box” systems, AI tools whose internal decision-making processes are not easily interpretable, risk undermining transparency and trust.23 Medical education must emphasise the importance of using AI to complement, rather than replace, human interaction. Clinicians should be free to use or reject AI tools without penalty, prioritising patient interests and retaining the right to disagree with AI outputs.
Several publications and national medical bodies, including the NHS, AMA, and Royal College of Physicians and Surgeons of Canada, have called for the integration of AI education at all levels of medical training.24–32 However, there remains a gap in both understanding and availability of structured AI curriculum frameworks tailored to medical education, which are essential for guiding effective teaching and learning.7,33
Mastery of key principles is essential for students to use AI tools effectively. Large Language Models (LLMs) such as ChatGPT34 are advanced AI systems trained on extremely large corpora of text data to learn statistical patterns and relationships between words, phrases, and structures in natural language.35 They generate human-like text by predicting likely continuations, and although they do not “think” like humans, they can simulate logical reasoning through in-context learning. Machine learning (ML) is a subset of AI that enables systems to learn from data and improve their performance on specific tasks without being explicitly programmed.36 ML has demonstrated utility in diagnosis and outcome prediction, with recent examples surpassing radiologists in detecting breast cancer,37 and outperforming dermatologists in diagnosing malignant melanoma,38 with AI now acting as a “second pair of eyes.”
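To make the idea of “learning from data” concrete, the short Python sketch below trains a toy classifier with the open-source scikit-learn library on its bundled breast cancer dataset: no diagnostic rules are written by hand; the model estimates them from labelled examples. This is a teaching illustration only and bears no relation to the clinical-grade systems cited above.

```python
# Minimal supervised machine learning sketch: the model infers a decision
# rule from labelled examples rather than being explicitly programmed.
# Teaching illustration only; not a clinical tool.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Public dataset of tumour measurements labelled benign/malignant.
X, y = load_breast_cancer(return_X_y=True)

# Hold out unseen cases to test whether the learned rule generalises.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit the model: the algorithm estimates the decision rule from data.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

print(f"Accuracy on unseen cases: {model.score(X_test, y_test):.2f}")
```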
Education can help students identify and manage the inherent pitfalls of using AI applications, such as becoming over-reliant on them or breaching data privacy. Contemporary studies have shown that medical students feel inadequately equipped to address patients’ concerns about the use of AI in their care.39–41 Without formal training in proper oversight, the unique flaws of AI may go unnoticed, such as “hallucination” errors, in which LLMs generate plausible but incorrect clinical details that could compromise safety.42
Furthermore, as with discharge summaries once typed by hospital transcriptionists, AI-generated notes require careful proofreading and verification. Whether this responsibility lies with the clinician who was recorded or with a trained third party, clearly defined processes for reviewing AI outputs are essential to ensure accuracy and maintain patient safety. These concerns are compounded by questions of confidentiality and consent, particularly in cases where transcripts contain sensitive information. It is not always clear whether third-party reviewers should have access to such content or whether patients have explicitly consented to this use of their data.
As with traditional medical records, questions remain regarding the ownership of AI-generated transcripts, whether they belong to the clinician, the hospital, or the cloud-based AI provider, and how such data is stored, accessed, and governed. Although existing regulations like GDPR offer some protection, the rapid deployment of cloud-based AI tools highlights the urgent need for clear institutional policies on data ownership and governance. As such, educating students about clinical accountability, data governance, and ethical oversight is crucial for the safe integration of AI into healthcare settings. Rather than dehumanising medicine, AI should be a tool that complements healthcare professionals’ expertise and maintains trust from the outset. As stewards of sensitive patient information, graduates must also remain vigilant about the data privacy implications of AI use in order to preserve patient confidentiality.43
While global AI adoption holds promise for reducing disparities in resource-limited settings by optimising care delivery,44 it carries the risk of exacerbating existing inequities. Unequal access to digital infrastructure, variation in institutional capacity, and the underrepresentation of certain populations, such as ethnic groups or rare disease cohorts, in AI training datasets may widen gaps in care. Algorithmic bias,45,46 a systematic error arising from imbalanced data or flawed algorithm design, can perpetuate or even amplify healthcare inequalities, potentially resulting in poorer outcomes for already marginalised groups. Moreover, AI-integrated medical curricula may disproportionately benefit well-resourced institutions in high-income settings, leaving underfunded schools struggling to keep pace. Therefore, it is essential that medical education not only equips students with the technical skills to engage with AI but also fosters critical awareness of its potential to reinforce or mitigate global health disparities.
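The following sketch, built entirely on invented synthetic data, illustrates one mechanism of algorithmic bias: when a minority group is underrepresented in training data and differs systematically from the majority, the learned decision rule fits the majority group far better.

```python
# Sketch of algorithmic bias from imbalanced training data. Entirely
# synthetic: disease depends on a marker, but the decision threshold
# differs by group, and group B contributes far fewer training examples.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, threshold):
    """Generate n patients whose disease status follows a group-specific threshold."""
    marker = rng.normal(50, 10, size=n)
    disease = (marker > threshold).astype(int)
    return marker.reshape(-1, 1), disease

# Group A dominates the training set; group B is underrepresented.
Xa, ya = make_group(1000, threshold=55)
Xb, yb = make_group(50, threshold=45)

model = LogisticRegression()
model.fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# The learned cut-off sits near group A's threshold, so fresh cases from
# group B are misclassified far more often.
for name, thr in [("A (majority)", 55), ("B (minority)", 45)]:
    X_test, y_test = make_group(500, threshold=thr)
    print(f"Group {name}: accuracy {model.score(X_test, y_test):.2f}")
```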
When AI is used in healthcare, vast amounts of sensitive patient data are exposed to the technology and thereby placed at risk.47,48 It is therefore essential to ensure that data collection, storage, and usage are conducted responsibly, with explicit patient consent obtained at each stage.
Positioned between academic and clinical environments, medical students are uniquely placed to identify inefficiencies and propose creative solutions. Their dual exposure, combined with fewer entrenched habits and greater openness to emerging technologies like AI, encourages fresh thinking. While all healthcare professionals contribute to innovation, students often bring a distinct perspective, reflected in the growing trend of the “doctorpreneur.”49 With thorough education in AI applications, students can drive systemic improvements across longstanding challenges.
AI tools that reduce administrative burdens, such as AI-powered scribes or documentation support, represent one of several technological responses to clinician overload. Current non-AI solutions in this space, to be considered in parallel, include structured electronic health records (EHRs), voice recognition software, and template-based note systems,19,50–52 each offering varying degrees of efficiency but often at the expense of flexibility or clinician satisfaction. AI-enhanced tools differ in their ability to adapt to natural speech, personalise content, and learn from user feedback. However, they are complementary rather than definitive solutions. Like all technologies, their integration requires thoughtful evaluation of cost, accuracy, and impact on clinical workflow. Medical education must therefore position AI not as a universal fix but as one tool among many, requiring a critical understanding of when, how, and whether to use it in specific settings.
Proposed Solutions
Medical schools could begin by developing AI-focused modules, initially optional and targeted at interested students. These modules would cover foundational topics such as the principles of ML, ethical considerations like algorithmic bias and data privacy, and practical clinical applications (Figure 1). Notably, several medical schools, including Dartmouth, Harvard, and institutions in Germany, have already begun integrating formal AI training into their curricula,53–56 demonstrating the growing global momentum for AI education in medicine. Offering these modules on an elective basis would allow institutions to pilot content, gather feedback, and iteratively refine delivery methods. Following successful implementation and evaluation, these modules could be embedded into the core curriculum to ensure that all graduates attain essential competencies in AI. Incorporating hands-on training sessions using AI tools in diagnostics, scribing, and decision support would allow students to build confidence in their implementation. Platforms like PassMedicine already simulate history-taking scenarios with virtual AI patients.57 Medical schools could develop this further by creating tailored cases where symptoms adapt to students’ decisions in real time, helping refine clinical reasoning without subjecting patients to potential harm.
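As a minimal sketch of how such adaptive cases might be structured, the Python fragment below encodes an invented branching scenario in which the findings a student sees depend on the actions they choose; the clinical content and design are hypothetical and not drawn from PassMedicine or any real platform.

```python
# A deliberately simplified branching virtual-patient case: the findings
# presented depend on the student's decisions, so the scenario evolves
# with their reasoning. All clinical content is invented.
CASE = {
    "start": {
        "prompt": "55-year-old with chest pain. Options: ecg | history | discharge",
        "ecg": ("ST elevation in leads II, III, aVF.", "post_ecg"),
        "history": ("Pain is crushing and radiates to the left arm.", "start"),
        "discharge": ("Unsafe: the patient deteriorates. Case failed.", None),
    },
    "post_ecg": {
        "prompt": "ECG suggests inferior MI. Options: troponin | activate_cath_lab",
        "troponin": ("Troponin markedly elevated.", "post_ecg"),
        "activate_cath_lab": ("Correct: primary PCI arranged. Case passed.", None),
    },
}

def run_case(actions):
    """Replay a sequence of student decisions through the case graph."""
    state = "start"
    for action in actions:
        node = CASE[state]
        if action not in node:
            print(f"'{action}' unavailable. {node['prompt']}")
            continue
        finding, state = node[action]
        print(finding)
        if state is None:  # a terminal outcome has been reached
            break

run_case(["history", "ecg", "activate_cath_lab"])
```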
Optional advanced tracks could allow interested students to explore deeper integrations relevant to their intended specialities or leadership ambitions. Medical schools could establish AI innovation laboratories to foster collaboration between medical students, data scientists, and AI developers. These labs could host events such as hackathons (intensive, time-limited gatherings focused on rapidly developing and prototyping novel solutions)58 and design sprints (more structured, five-phase processes aimed at solving specific problems through ideation, prototyping, and user testing).59 While hackathons are well-suited to encouraging creativity and producing proof-of-concept tools quickly, design sprints are more appropriate for refining targeted challenges, such as improving clinical communication or streamlining documentation workflows, based on iterative feedback. Both formats offer valuable opportunities for students to apply their clinical knowledge to real-world innovation, fostering interdisciplinary thinking and experiential learning.
Medical schools should employ clear evaluation metrics to assess the effectiveness of AI integration into the curriculum. These could include pre- and post-module assessments to measure gains in conceptual understanding, simulation-based tasks to evaluate practical application, and reflective portfolios to gauge students’ ethical reasoning around AI use. Engagement metrics such as attendance, completion rates of optional modules, and participation in innovation activities like hackathons can provide insight into interest and accessibility. Longer-term outcomes, such as students’ confidence in using AI tools during clinical placements or performance in AI-assisted diagnostic tasks, could be tracked to inform curricular refinement. These metrics should not only capture technical competence but also students’ critical thinking, digital literacy, and readiness to engage with evolving technologies in real-world settings.
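As one hypothetical illustration of a pre/post metric, scores could be summarised with the normalised gain g = (post - pre) / (100 - pre), which compares learning across students who start from different baselines; the Python sketch below uses invented scores.

```python
# Sketch: summarising pre/post AI-module assessments with normalised gain.
# All scores are invented percentages for illustration.
def normalised_gain(pre: float, post: float) -> float:
    """Fraction of the available headroom (100 - pre) actually gained."""
    if pre >= 100:  # already at ceiling; no headroom left to gain
        return 0.0
    return (post - pre) / (100 - pre)

# Hypothetical (pre, post) percentage scores for five students.
scores = [(40, 70), (55, 80), (62, 75), (30, 65), (75, 90)]

gains = [normalised_gain(pre, post) for pre, post in scores]
print(f"Mean normalised gain: {sum(gains) / len(gains):.2f}")
```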
Notably, integrating AI into the medical curriculum presents logistical and structural challenges. Medical programmes are already dense, and adding new content requires either streamlining existing material or extending instructional time. Faculty readiness is another barrier, as many educators may lack AI expertise and require upskilling or support from interdisciplinary collaborators. Additionally, access to up-to-date software tools, real-world case datasets, and computational resources may be limited in underfunded institutions.
Medical schools may need to develop targeted funding strategies, such as digital education grants, partnerships with technology providers, or government-backed innovation funds, to support infrastructure, training, and curriculum development. To reduce financial and logistical barriers, particularly for under-resourced institutions, medical schools could leverage widely available, low-cost, or free AI education resources as foundational material for optional modules. Courses such as “AI for Everyone” (Coursera),60 “Elements of AI” (University of Helsinki),61 and Google’s “Machine Learning Crash Course”62 offer accessible introductions to AI concepts tailored for non-technical audiences, serving as effective primers for more advanced or contextualised training. Healthcare-specific offerings like NHS AI Lab webinars and Stanford University’s AI in Medicine and Imaging lecture series63,64 can expose students to real-world clinical applications without added costs. Scalability will depend on adaptable content delivery and resource-efficient models: modular curricula enable phased implementation based on institutional capacity, while collaboration with tech firms or national digital health bodies can provide shared infrastructure and expertise. Crucially, core content should remain flexible, updatable, and not overly dependent on specialist faculty to support sustainable, system-wide integration.
Furthermore, standardising AI education across diverse medical schools could prove difficult given regional variability in infrastructure, priorities, and regulatory guidance. Establishing minimum competency frameworks, similar to digital literacy or evidence-based medicine, may help ensure consistency while allowing flexible content delivery. Close collaboration with regulatory bodies, such as the GMC, and investment in educator training will be essential to overcoming these barriers and ensuring sustainable implementation.
It is important to recognise that this space remains highly dynamic. Numerous AI tools are emerging, yet many may not undergo rigorous evaluation through randomised controlled studies before entering the clinical setting. In such a volatile landscape, established providers of EHR systems may increasingly seek to acquire and integrate these technologies, reflecting broader consolidation trends and rapid evolution within the healthcare technology sector. Given the rapid pace of AI development, ongoing curriculum review and adaptation will be essential to ensure relevant and effective training.
Conclusions
By preparing students to engage thoughtfully and collaboratively with AI, medical schools have an unparalleled opportunity to shape the future of healthcare, one that is smarter, fairer, and more effective for both patients and practitioners. This includes equipping graduates with the skills to navigate ethical and clinical complexities and encouraging awareness of broader issues such as algorithmic bias and global disparities in access to AI tools.
With thoughtful integration into medical curricula, AI can enhance care, reduce inefficiencies, and serve as a force for innovation without compromising the human connection at the heart of medicine. By embedding thoughtful AI education today, medical schools can ensure that the doctors of tomorrow lead a healthcare system that is both technologically advanced and deeply human.
Disclosure
The author reports no conflicts of interest in this work.
References
1. Academy of Medical Educators. National Training Survey 2024 [Internet]. London: Academy of Medical Educators; 2024 [cited April 24, 2025]. Available from: https://medicaleducators.org/write/MediaUploads/News%20Articles/Nationaltrainingsurveypressreleasefinal.pdf.
2. European Junior Doctors Association. Policy on burnout and psychosocial wellbeing [Internet]. Brussels: European Junior Doctors Association; 2023 [cited April 24, 2025]. Available from: https://www.juniordoctors.eu/sites/default/files/2023-06/EJD-2023-015_P_Policy%20on%20burnout%20and%20psychosocial%20wellbeing.pdf.
3. UK Government. NHS waiting times and pressures: current challenges [Internet]. London: GOV.UK; 2024 [cited January 12, 2025]. Available from: https://www.gov.uk/missions/nhs.
4. Chen M, Decary M. Artificial intelligence in healthcare: an essential guide for health leaders. Healthcare Management Forum. 2019;33(1):10–18. doi:10.1177/0840470419873123
5. Waithira N, Kestelyn E, Chotthanawathit K, et al. Investigating the secondary use of clinical research data: protocol for a mixed methods study. JMIR Res Protoc. 2023;12:e44875. doi:10.2196/44875
6. Hoy MB. Alexa, Siri, Cortana, and More: an introduction to voice assistants. Med Ref Serv Q. 2018;37(1):81–88. doi:10.1080/02763869.2018.1404391 PMID: 29327988.
7. Sun L, Yin C, Xu Q, Zhao W. Artificial intelligence for healthcare and medical education: a systematic review. Am J Transl Res. 2023;15(7):4820–4828. PMID: 37560249; PMCID: PMC10408516.
8. Singh H, Yadav G, Mallaiah R, et al. iNICU - integrated neonatal care unit: capturing neonatal journey in an intelligent data way. J Med Syst. 2017;41:132. doi:10.1007/s10916-017-0774-8
9. Tierney AA, Gayre G, Hoberman B, et al. Ambient artificial intelligence scribes to alleviate the burden of clinical documentation. NEJM Catal Innov Care Deliv. 2024;5(3).
10. Mess SA, Mackey AJ, Yarowsky DE. Artificial intelligence scribe and large language model technology in healthcare documentation: advantages, limitations, and recommendations. Plast Reconstr Surg Glob Open. 2025;13(1):e6450. doi:10.1097/GOX.0000000000006450 PMID: 39823022; PMCID: PMC11737491.
11. Murgia M. Healthcare turns to AI for medical note-taking ‘scribes’ [Internet]. Financial Times; 2025 [cited January 12, 2025]. Available from: https://www.ft.com/content/5c356658-6db4-47c1-940b-b2e3cf3a51f3?shareType=nongift.
12. Medical Group Management Association. 2022 MGMA regulatory burden report: measuring the mounting regulatory burden in healthcare [Internet]. Englewood, CO: MGMA; 2022 [cited April 22, 2025]. Available from: https://www.mgma.com/getmedia/4bfd2489-6099-49e5-837f-f787d6d0a30f/2022-MGMA-Regulatory-Burden-Report-FINAL.pdf.aspx?ext=.pdf.
13. Rotenstein LS, Holmgren AJ, Horn DM, et al. System-level factors and time spent on electronic health records by primary care physicians. JAMA Network Open. 2023;6(11):e2344713. doi:10.1001/jamanetworkopen.2023.44713
14. Arndt BG, Micek MA, Rule A, Shafer CM, Baltus JJ, Sinsky CA. More tethered to the EHR: EHR workload trends among academic primary care physicians, 2019–2023. Ann Fam Med. 2024;22(1):12–18. doi:10.1370/afm.3047 PMID: 38253499; PMCID: PMC11233089.
15. General Medical Council. Artificial intelligence and innovative technologies. [cited January 12, 2025]. Available from: https://www.gmc-uk.org/professional-standards/learning-materials/artificial-intelligence-and-innovative-technologies.
16. World Medical Association. WMA statement on augmented intelligence in medical care [Internet]. Ferney-Voltaire: WMA; 2019 [cited April 24, 2025]. Available from: https://www.wma.net/policies-post/wma-statement-on-augmented-intelligence-in-medical-care/.
17. Albert Henry T. 2 in 3 physicians are using health AI—up 78% from 2023. American Medical Association; 2025. Available from: https://www.ama-assn.org/practice-management/digital/2-3-physicians-are-using-health-ai-78-2023.
18. Busch F, Hoffmann L, Truhn D, et al. Global cross-sectional student survey on AI in medical, dental, and veterinary education and practice at 192 faculties. BMC Med Educ. 2024;24:1066. doi:10.1186/s12909-024-06035-4
19. Lin SY, Shanafelt TD, Asch SM. Reimagining clinical documentation with artificial intelligence. Mayo Clin Proc. 2018;93(5):563–565. doi:10.1016/j.mayocp.2018.02.016
20. Colakca C, Ergın M, Ozensoy HS, et al. Emergency department triaging using ChatGPT based on emergency severity index principles: a cross-sectional study. Sci Rep. 2024;14:22106. doi:10.1038/s41598-024-73229-7
21. Yuan X, Rehman S, Altalbe A, et al. Digital literacy as a catalyst for academic confidence: exploring the interplay between academic self-efficacy and academic procrastination among medical students. BMC Med Educ. 2024;24:1317. doi:10.1186/s12909-024-06329-7
22. Saha A, Chunder R, Majumdar S, et al. Cross‐sectional investigation of digital literacy and its impact on learning outcomes among medical students. Res J Med Sci. 2024;18:357–361. doi:10.36478/makrjms.2024.1.357.361
23. Shortliffe EH, Sepúlveda MJ. Clinical decision support in the era of artificial intelligence. JAMA. 2018;320(21):2199–2200. doi:10.1001/jama.2018.17163
24. Topol E. The Topol review: preparing the health care work- force to deliver the digital future. National Health Service, UK. 2019. Available from: https://topol.hee.nhs.uk/wp-content/uploads/HEE-Topol-Review-2019.pdf.
25. Paranjape K, Schinkel M, Nannan Panday R, Car J, Nanayakkara P. Introducing artificial intelligence training in medical education. JMIR Med Educ. 2019;5(2):e16048. doi:10.2196/16048
26. Wartman SA, Combs CD. Medical education must move from the information age to the age of artificial intelligence. Acad Med. 2018;93(8):1107–1109. doi:10.1097/ACM.0000000000002044
27. Pucchio A, Papa JD, de Moraes FY. Artificial intelligence in the medical profession: ready or not, here AI comes. Clinics. 2022;77:100010. doi:10.1016/j.clinsp.2022.100010
28. Kolachalama VB, Garg PS. Machine learning and medical education. NPJ Digit Med. 2018;1(1):54. doi:10.1038/s41746-018-0061-1
29. Mehta S, Vieira D, Quintero S, et al. Redefining medical education by boosting curriculum with artificial intelligence knowledge. J Cardiol Curr Res. 2020;13(5):124–129. doi:10.15406/jccr.2020.13.00490
30. Abdulhussein H, Turnbull R, Dodkin L, Mitchell P. Towards a national capability framework for artificial intelligence and digital medicine tools – a learning needs approach. Intell Based Med. 2021;5:100047. doi:10.1016/j.ibmed.2021.100047
31. James CA, Wheelock KM, Woolliscroft JO. Machine learning: the next paradigm shift in medical education. Acad Med. 2021;96(7):954–957. doi:10.1097/ACM.0000000000003943
32. Lomis K, Jeffries P, Palatta A, et al. Artificial intelligence for health professions educators. NAM Perspect. 2021;2021:202109a. doi:10.31478/202109a
33. Ejaz H, McGrath H, Wong BL, Guise A, Vercauteren T, Shapey J. Artificial intelligence and medical education: a global mixed-methods study of medical students’ perspectives. DIGITAL HEALTH. 2022;8:205520762210890. doi:10.1177/20552076221089099
34. Fütterer T, Fischer C, Alekseeva A, et al. ChatGPT in education: global reactions to AI innovations. Sci Rep. 2023;13(1):15310. doi:10.1038/s41598-023-42227-6 PMID: 37714915; PMCID: PMC10504368.
35. Wang Y, Liu S, Zhang Y, et al. Large language models in biomedicine and health: current research and future directions. J Med Internet Res. 2023;25:e46924. doi:10.2196/46924
36. Jordan MI, Mitchell TM. Machine learning: trends, perspectives, and prospects. Science. 2015;349(6245):255–260. doi:10.1126/science.aaa8415
37. McKinney SM, Sieniek M, Godbole V, et al. International evaluation of an AI system for breast cancer screening. Nature. 2020;577(7788):89–94. doi:10.1038/s41586-019-1799-6
38. Haenssle HA, Fink C, Schneiderbauer R, et al. Man against machine: diagnostic performance of a deep learning convolutional neural network for dermoscopic melanoma recognition in comparison to 58 dermatologists. Ann Oncol. 2018;29(8):1836–1842. doi:10.1093/annonc/mdy166
39. Civaner MM, Uncu Y, Bulut F, et al. Artificial intelligence in medical education: a cross-sectional needs assessment. BMC Med Educ. 2022;22:772. doi:10.1186/s12909-022-03852-3
40. Sit C, Srinivasan R, Amlani A, et al. Attitudes and perceptions of UK medical students towards artificial intelligence and radiology: a multicentre survey. Insights Imaging. 2020;11(1):14. doi:10.1186/s13244-019-0830-7
41. Blease C, Kharko A, Bernstein M, et al. Machine learning in medical education: a survey of the experiences and opinions of medical students in Ireland. BMJ Health Care Inform. 2022;29(1):e100480. doi:10.1136/bmjhci-2021-100480
42. Huang L, Yu W, Ma W, et al. A survey on hallucination in large language models: principles, taxonomy, challenges, and open questions. arXiv preprint arXiv:2311.05232. 2023. doi:10.1145/3703155
43. Malin B, Goodman K. Section editors for the IMIA yearbook special section. Between access and privacy: challenges in sharing health data. Yearb Med Inform. 2018;27:55–59. doi:10.1055/s-0038-1641216
44. Wahl B, Cossy-Gantner A, Germann S, Schwalbe NR. Artificial intelligence (AI) and global health: how can AI contribute to health in resource-poor settings? BMJ Global Health. 2018;3:e000798. doi:10.1136/bmjgh-2018-000798
45. Obermeyer Z, Powers B, Vogeli C, Mullainathan S. Dissecting racial bias in an algorithm used to manage the health of populations. Science. 2019;366(6464):447–453. doi:10.1126/science.aax2342
46. World Health Organization. WHO calls for safe and ethical AI for health [Internet]. Geneva: WHO; 2023 [cited April 24, 2025]. Available from: https://www.who.int/news/item/16-05-2023-who-calls-for-safe-and-ethical-ai-for-health.
47. El Emam K, Arbuckle L. Anonymizing Health Data: Case Studies and Methods to Get You Started. O’Reilly Media; 2013.
48. Darby A, Strum MW, Holmes E, Gatwood J. A review of nutritional tracking mobile applications for diabetes patient use. Diabetes Technol Ther. 2016;18(3):200–212. doi:10.1089/dia.2015.0299
49. Kanetkar R. Doctors are building tech startups to help fix a healthcare system reeling from crisis after crisis [Internet]. Business Insider. 2023 [cited January 12, 2025]. Available from: https://www.businessinsider.com/meet-the-doctorpreneurs-working-to-transform-healthcare-with-tech-2023-3.
50. Wachter RM. The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age. McGraw-Hill Education; 2015.
51. Bernaert P, Van Hoof C, Evers G. The impact of voice technology on physicians’ documentation practices in EHRs: a systematic review. Int J Med Inform. 2021;154:104547. doi:10.1016/j.ijmedinf.2021.104547
52. Khairat S, Burke G, Archambault H, Schwartz T, Larson J, Ratwani R. Perceived burden of EHR templates among emergency physicians and advanced practice providers. West J Emerg Med. 2020;21(1):93–99. doi:10.5811/westjem.2020.9.48170
53. Dartmouth Health and Geisel School of Medicine. Geisel launches AI-focused medical curriculum to train digital health leaders [Internet]. New Hampshire: Dartmouth Health; 2024.
54. Harvard Medical School. How generative AI is transforming medical education [Internet]. Boston (MA): Harvard Medical School; 2024 [cited April 24, 2025]. Available from: https://magazine.hms.harvard.edu/articles/how-generative-ai-transforming-medical-education.
55. The University of Texas Health Science Center at San Antonio. Nation’s first dual degree in medicine and AI aims to prepare the next generation of health care providers [Internet]. San Antonio: UT Health San Antonio; 2023 [cited April 24, 2025]. Available from: https://news.uthscsa.edu/nations-first-dual-degree-in-medicine-and-ai-aims-to-prepare-the-next-generation-of-health-care-providers-2/.
56. Mosch L, Agha-Mir-Salim L, Sarica MM, Balzer F, Poncette AS. Artificial intelligence in undergraduate medical education. In: Challenges of Trustable AI and Added-Value on Health [e-Book]. Amsterdam: IOS Press; 2022:821–822. doi:10.3233/SHTI220597
57. PassMedicine. [cited January 12, 2025]. Available from: https://www.passmedicine.com/.
58. Briscoe G, Mulligan C. Digital Innovation: The Hackathon Phenomenon. Creativeworks London; 2014.
59. Knapp J, Zeratsky J, Kowitz B. Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days. Simon & Schuster; 2016.
60. Ng A. AI for everyone [Internet]. Coursera; [cited April 22, 2025]. Available from: https://www.coursera.org/learn/ai-for-everyone.
61. Helbing K, Saramäki J. Elements of AI: introduction to AI [Internet]. University of Helsinki; [cited April 22, 2025]. Available from: https://studies.helsinki.fi/courses/course-implementation/otm-ff82eeb8-5bd0-43e4-808e-03a1b1c1e22e/_Elements_of_AI_Introduction_to_AI.
62. Google. Machine learning crash course [Internet]. Google Developers; [cited April 22, 2025]. Available from: https://developers.google.com/machine-learning/crash-course.
63. NHS England. NHS AI Lab [Internet]. London: NHS England; [cited April 25, 2025]. Available from: https://transform.england.nhs.uk/ai-lab/.
64. Stanford Center for Artificial Intelligence in Medicine & Imaging. AIMI grand rounds [Internet]. Stanford (CA): Stanford University;