Biases in Artificial Intelligence Application in Pain Medicine

Authors Jumreornvong O, Perez AM, Malave B, Mozawalla F, Kia A, Nwaneshiudu CA

Received 3 October 2024

Accepted for publication 13 December 2024

Published 28 February 2025 Volume 2025:18 Pages 1021–1033

DOI https://doi.org/10.2147/JPR.S495934

Editor who approved publication: Professor E Alfonso Romero-Sandoval



Oranicha Jumreornvong,1 Aliza M Perez,1 Brian Malave,1 Fatimah Mozawalla,1 Arash Kia,2 Chinwe A Nwaneshiudu2,3

1Department of Human Performance and Rehabilitation, Icahn School of Medicine at Mount Sinai, New York, NY, USA; 2Department of Anesthesiology, Perioperative and Pain Medicine, Icahn School of Medicine at Mount Sinai, New York, NY, USA; 3Center for Disease Neurogenomics, Icahn School of Medicine at Mount Sinai, New York, NY, USA

Correspondence: Oranicha Jumreornvong, Email [email protected]

Abstract: Artificial Intelligence (AI) has the potential to optimize personalized treatment tools and enhance clinical decision-making. However, biases in AI, arising from sex, race, socioeconomic status (SES), and statistical methods, can exacerbate disparities in pain management. This narrative review examines these biases and proposes strategies to mitigate them. A comprehensive literature search across databases such as PubMed, Google Scholar, and PsycINFO focused on AI applications in pain management and sources of biases. Sex and racial biases often stem from societal stereotypes, underrepresentation of females, overrepresentation of European ancestry patients in clinical trials, and unequal access to treatment caused by systemic racism, leading to inaccurate pain assessments and misrepresentation in clinical data. SES biases reflect differential access to healthcare resources and incomplete data for lower SES individuals, resulting in larger prediction errors. Statistical biases, including sampling and measurement biases, further affect the reliability of AI algorithms. To ensure equitable healthcare delivery, this review recommends employing specific fairness-aware techniques such as reweighting algorithms, adversarial debiasing, and other methods that adjust training data to minimize bias. Additionally, leveraging diverse perspectives—including insights from patients, clinicians, policymakers, and interdisciplinary collaborators—can enhance the development of fair and interpretable AI systems. Continuous monitoring and inclusive collaboration are essential for addressing biases and harnessing AI’s potential to improve pain management outcomes across diverse populations.

Keywords: pain, artificial intelligence, biases, race, gender, socioeconomic status, statistical biases

Introduction

Artificial intelligence (AI) is revolutionizing pain management by enabling personalized care and enhancing clinical decision-making. AI systems, designed to perform tasks such as data processing, pattern recognition, and natural language comprehension, rely on algorithms—step-by-step problem-solving methods—and models, which are mathematical representations of real-world processes. Machine learning (ML), a subset of AI, utilizes techniques such as supervised learning, where labeled input and output training data are used, and unsupervised learning, which analyzes raw, unlabeled data. Deep learning, an advanced form of ML inspired by the human brain, is particularly impactful in recognizing complex patterns in data, such as medical images, text, and audio, enabling more accurate insights and predictions.1 These tools have found extensive applications in healthcare, particularly in analyzing medical imaging and processing complex datasets critical to pain research. For instance, deep learning models have been employed alongside techniques such as random forests and support vector machines for the analysis of medical images,2 which is essential in diagnosing and treating pain-related conditions. However, a major limitation is the variability in dataset quality and size, as large, high-quality datasets are often required to achieve meaningful results in medical imaging and related analyses.3
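
To make these terms concrete, the short Python sketch below contrasts supervised learning (a classifier fit to labeled outcomes) with unsupervised learning (clustering of the same records without labels). It is purely illustrative; the toy features, the "high pain" threshold, and the sample size are our assumptions and do not come from any study cited in this review.

```python
# Illustrative sketch: supervised vs. unsupervised learning on a toy pain dataset.
# All features, labels, and thresholds are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Toy features per patient: [age, baseline pain score (0-10), opioid use flag]
X = rng.random((200, 3)) * np.array([80, 10, 1])
y = (X[:, 1] > 6).astype(int)  # label: "high pain" if baseline score exceeds 6

# Supervised learning: labeled inputs (X) and outputs (y) train a classifier.
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("Predicted high-pain probability, first patient:", clf.predict_proba(X[:1])[0, 1])

# Unsupervised learning: the same features, without labels, grouped into clusters.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("Cluster sizes:", np.bincount(clusters))
```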

The application of AI in pain management encompasses assessing pain, predicting treatment responses, and optimizing interventions. Pain management addresses diverse populations with unique needs, including elderly adults, middle-aged adults, children, and newborns. The clinical steps involved include accurate pain assessment, determination of underlying causes, formulation of treatment plans, monitoring for efficacy, and timely adjustment of interventions. AI-driven systems have the potential to improve each step by providing more precise diagnostic insights, predicting individual responses to therapies, and tailoring interventions for specific demographics.

Despite these advantages, biases in AI systems threaten to perpetuate or exacerbate existing disparities in pain management. Biases may arise from several sources, including the algorithms themselves, the data used to train them, or statistical factors. For example, sampling bias occurs when training data fails to represent broader populations, while measurement bias reflects differences in how pain is assessed across demographic groups. Algorithmic bias, meanwhile, stems from inequities embedded in algorithmic decision-making, resulting in unequal outcomes for specific populations. These biases could disproportionately affect vulnerable groups, such as female patients, racial and ethnic minorities, and individuals from lower socioeconomic statuses, ultimately restricting equitable access to care.
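
As a simple illustration of how sampling bias can be flagged before a model is trained, the sketch below compares each subgroup's share of a training cohort with its share of a reference population; ratios well below 1 indicate underrepresentation. The counts and reference shares are hypothetical and chosen only for demonstration.

```python
# Hypothetical sampling-bias check: compare subgroup shares in training data
# against reference population shares. All numbers are illustrative.
import pandas as pd

training_counts = pd.Series({"female": 310, "male": 690})     # records in the training cohort
population_share = pd.Series({"female": 0.51, "male": 0.49})  # reference population shares

training_share = training_counts / training_counts.sum()
representation_ratio = training_share / population_share
print(representation_ratio.round(2))  # values well below 1.0 flag underrepresented groups
```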

To address these challenges, this narrative review explores the origins and implications of biases in AI-driven pain management, with a particular focus on sex, race, ethnicity, and socioeconomic status. By incorporating real-world examples, such as studies where AI tools underestimated pain levels in specific demographic groups, this review aims to promote awareness and advocate for clinical and policy measures that ensure fairness, inclusivity, and personalization in pain management. This review also outlines actionable strategies to address bias, such as developing culturally sensitive pain assessment tools, incorporating socioeconomic variables into predictive models, and ensuring sex-specific evaluations of treatment efficacy. By informing future research directions and equipping clinicians, researchers, and policymakers with tools to create equitable AI-supported approaches, this work seeks to maximize the benefits of AI in pain management while minimizing its potential harms.

Methods

We evaluated studies from databases including PubMed, Google Scholar, and PsycINFO. As seen in Figure 1, the searches focused on research addressing the applications of AI in pain management and their impacts on sex, race, ethnicity, and socioeconomic status, as well as strategies to mitigate biases in AI algorithms. A total of 4207 non-duplicate records were initially identified through database searches specifically mentioning AI and pain in either the title or abstract. Records were reduced to 903 after preliminary screening for clinical trials, meta-analyses, randomized controlled trials, reviews, and systematic reviews. Keywords used to further refine the search were “Pain”, “Sex”, “Socioeconomic Status”, “Race”, “Ethnicity”, “Statistical Bias”, “Artificial Intelligence”, “Deep learning”, “Algorithmic bias”, “Bias mitigation strategy”, “Data-driven AI system”, “Fairness in data mining”, and/or “Fairness management”.

Figure 1 Flow Diagram for Study Selection in Narrative Review of AI, Pain Management, and Biases (2010–2024).

Inclusion criteria specified studies published in peer-reviewed journals between 2010 and 2024, focusing on applications of AI in pain management or bias mitigation strategies relevant to sex, race, ethnicity, or socioeconomic status. Studies were required to provide original data or systematic insights into AI biases, fairness, or mitigation techniques. Exclusion criteria included non-peer-reviewed studies, conference abstracts without sufficient methodological detail, and papers unrelated to AI applications in pain management or bias mitigation.

To ensure a rigorous and unbiased selection process, at least two reviewers screened the studies. Disagreements during the screening process were resolved through discussion, and when necessary, an independent reviewer was consulted to reach consensus. Following the initial screening, reviewers evaluated the remaining studies for their contribution to the thematic development and overall argument of the review. This resulted in the inclusion of 48 studies that offered critical insights into biases in pain management and strategies for their mitigation.

Results

Overview of Artificial Intelligence and Pain Management

Pain affects billions of people worldwide, impacting both physical and psychosocial aspects of health.4 Scientists have utilized AI to improve pain assessment, diagnosis, and treatment.5 AI and ML have the potential to elucidate pain mechanisms, assist with drug and intervention discovery, and facilitate fairer and more diverse clinical trial recruitment processes.6,7 AI and ML techniques are used to analyze both self-reported pain scores and scale-based clinical pain assessments, as well as pain-related physiological measurements, to improve pain assessment, prediction, and treatment outcomes.8 For example, the Patient-Reported Outcomes Measurement Information System (PROMIS-29) score evaluates a patient’s pain intensity, interference, function, and psychological health. In addition to self-reported pain, scale-based pain assessments, such as the Visual Analog Scale (VAS) or Numeric Rating Scale (NRS), are commonly used in clinical settings. AI-driven pain management systems often integrate multimodal data and use these scales as pain labels, enabling more comprehensive and accurate predictions of pain severity and treatment responses. These scales may be monitored with the help of AI, as has been done for other patient-reported outcome measures.9 AI-powered chatbots may also improve treatment plans and offer immediate access to information.10 Utilizing AI in pain management can improve patient care by alleviating anxieties, promoting adherence, and strengthening the relationship between patients and physicians.

AI can potentially improve pain diagnosis, prediction, and self-management.11 Its ability to rapidly analyze large patient datasets and identify patterns in acute pain symptoms can offer real-time feedback and tailored management for patients with chronic pain. AI could integrate data from medical records, wearable devices such as the Apple Watch, or any patient-reported outcome data to provide a comprehensive description and prediction of pain onset, palliating factors, quality, radiation, severity, temporality, and inciting incident.12–14 AI can be a powerful tool to help physicians anticipate and manage complex pain effectively.15 Research reveals that AI prediction models could utilize data on patients’ knee symptoms, Kellgren-Lawrence grading, and their sex, race, and ethnicity to predict outcomes for total knee arthroplasty (TKA). In one study, AI was 80% accurate in predicting which patients would undergo TKA within the next two years.16 This technology could also be used to predict which patients would have successful spinal cord stimulator trials, kyphoplasty, or other advanced interventional pain procedures. However, more data are needed to validate its efficacy in large clinical trials.17 By leveraging extensive datasets available through medical record systems and the remote patient monitoring systems offered by various medical device companies, we have the potential to personalize pain management, increase quality of life, and improve long-term outcomes for patients with chronic pain.

Utilizing AI tools in clinical pain settings could help forecast an individual patient’s response, or lack of response, to treatment during acute pain flare-ups. AI’s ability to predict symptoms in chronic pain patients could enable better coordination between primary care providers and pain specialists, facilitating earlier pain prevention and possible intervention when dangerous red-flag symptoms, such as spinal cord injury, active infection, or tumor, are likely to arise.18,19 Despite these positive features, concerns about data literacy, algorithmic bias, and the need for uniform standards within the healthcare profession must be resolved if we are to fully utilize AI in pain treatment.7 Important measures include teaching providers the application of AI tools and the ethical usage of patient data.20 AI can also help reduce administrative burden, such as coordinating appointments, scheduling meetings, welcoming visitors, dispatching timely reminders, and writing notes, and thus improve patient care despite potential biases.21

Regardless of the benefits of AI, there are ethical issues that need to be addressed. Among the major challenges are the transparency and accessibility of AI systems, which may be viewed as “black boxes” with limited insight into how they make decisions.22 If an AI system does not clearly provide a rationale to both physicians and patients as to why certain clinical predictions or recommendations are made, these AI models may detrimentally affect the patient–doctor relationship and foster mistrust. When accessing medical records for AI data analysis, researchers should also be wary of potential data breaches.23 Furthermore, many remain worried that AI biases will aggravate existing health inequalities.24 Biases in AI-based pain management algorithms might result in differences in treatment depending on sex, race, socioeconomic status, and the statistical representativeness of the data used to train these AI models.

Sex Biases in AI and Pain Management

For this paper, gender refers to the identities that society ascribes to individuals, while sex refers to biological differences such as chromosomes, hormone levels, and reproductive/sexual anatomy. Because the studies identified in our literature review examine sex differences and do not account for varying gender identities, this paper focuses on sex biases, for which more evidence currently exists. Future research and AI applications should include a broader variety of gender identities to ensure more diverse and comprehensive data. As seen in Table 1, research shows disparities in AI algorithms’ accuracy rates for diagnosing and treating pain between male and female patients.25 Female underrepresentation in clinical trials can lead to inaccurate data being used to train AI models that evaluate pain.26 Advocating for more equitable sex representation in clinical trial data could help offset potential disparities. Such bias is seen in research on an AI chatbot that demonstrated variations in clinical recommendations based on patients’ sex.27 If left unchecked, these biases can have significant implications for pain management, affecting clinical care and health outcomes of both male and female patients.28

Table 1 Summary of Sex-Related Biases in AI-Driven Pain Management Models

For example, the use of AI in clinical care can introduce biases in perioperative settings.30 Clinical decision support AI systems could unknowingly propagate sex stereotypes. Research shows that training data dominated by male patients result in algorithms that can be less reliable for females, especially in detecting pain disorders that present differently between the sexes.29 Furthermore, the algorithms developed often lack diverse data and well-established clinical guidance. These biases are also seen in the informed consent process for surgeries and for clinical research, where researchers communicate differently with male and female patients, affecting how patients of different sexes interpret information.31 Female participants are less likely than male participants to receive detailed explanations. They are also more likely to have their concerns about clinical trials dismissed, which could impair female participants’ understanding of a study’s risks and benefits and thus may contribute to their reluctance to participate in clinical trials.37 In addition, the risk modeling used in AI may neglect sex-specific elements due to lack of data and thus distort risk estimates for female patients experiencing pain.32 Sex-sensitive models that focus on sex-based variations in data can produce further biased results.38

In addition, sex-specific responses to therapeutic interventions emphasize the need to account for sex in medicine and clinical research to improve the data produced for AI systems. Studies on knee pain, for example, indicate that female patients might respond differently than male patients to pain treatments such as intra-articular knee injections.33 AI studies of cardiovascular-related pain conditions also show sex differences.34 Conventional research practices often fail either to analyze results by sex or to include adequate representation of both sexes. Researchers observe disparities between males and females in the severity of osteoid metaplasia and low back pain.35,36 Strategies to mitigate these sex biases include, but are not limited to, ensuring more equitable data representation for both males and females, increasing understanding of the biological basis of sex differences in pain treatment responses through more sex-specific research, and incorporating sex-specific considerations in AI algorithms that provide clinical predictions and recommendations. Researching and implementing such strategies could help ensure more equitable healthcare delivery between the sexes when utilizing AI.39

Racial Biases in AI and Pain Management

Racial biases in AI and pain management may arise from systemic healthcare issues and the overrepresentation of patients of European heritage in clinical trials. As seen in Table 2, this may be due to unequal access to pain clinics and tertiary care centers, where the data used for AI models are often collected.40 A study examining AI chatbot responses to clinical scenarios reveals biases such as different recommendations based on patients’ demographics.27 These biases can therefore have significant implications for pain management in marginalized populations.28 Particularly in diseases like osteoarthritis, research using deep learning techniques has revealed significant racial inequalities in pain rating.41 These methods can account for discrepancies when predicting patients’ experienced pain from knee X-rays. AI-assisted CT and MRI image analysis also reveals racial bias that guides pain management. Studies indicate that lower socioeconomic status, absence of health insurance, and limited availability of imaging facilities in underdeveloped areas may contribute to minorities’ often unequal access to such important imaging technology.42 More diverse training datasets may improve the use of AI in pain management for patients of color. In addition, by analyzing facial expressions, body language, and vocal signs, AI may be used to automate pain assessment.43 However, racial biases in facial recognition software may be problematic, as current datasets mostly contain images of lighter-skinned individuals. This lack of diverse training data could reduce accuracy for those with darker skin.44 While 62% of the population identifies solely as white, 38% identify as non-white or multiracial, a demographic growing rapidly due to global migration. Relying exclusively on data from white patients can lead to inaccurate outcomes, such as misdiagnosing conditions like melanoma in Black individuals when the AI has only been trained on white patient data.19

Table 2 Summary of Racial Biases in AI-Driven Pain Management Models

Socioeconomic Biases in AI and Pain Management

Socioeconomic status (SES) can also contribute to biases in AI and pain management because data from lower-SES patients may be excluded from the AI models used for pain management. As seen in Table 3, electronic health record (EHR) data from lower-SES populations may be incomplete. Important data from lower-SES patients, such as pain scores, ICD-10 coding, and records of pain prescriptions and therapies, may therefore be missing from AI models.45,46 Studies show that lower-SES patients have a greater proportion of missing data, potentially leading to disparities in AI models that affect patient care.47 Because of the incomplete datasets underlying AI decision-making models, lower-SES patients are at risk of receiving poorer treatments and outcomes.48 The missing EHR data suggest a lack of equitable pain-related data, which may further compound biases in pain management. Research examining AI chatbot responses to clinical scenarios also supports that recommendations vary with patients’ socioeconomic status.27

Table 3 Summary of Socioeconomic Biases in AI-Driven Pain Management Models

In addition, the correlation between socioeconomic position and assessments of pain sensitivity underscores the important role of bias in medical decision-making processes.49 AI algorithms may perpetuate disparities in pain management by assuming that those with lower socioeconomic status experience less pain. More robust pain-related data that incorporate equitable SES representation could help improve AI chatbots integrated into healthcare systems, helping to fill the gap in low-resource settings that lack pain specialists.10 Although there are concerns about SES biases in AI and pain management, clinician-approved and secure multilingual AI chatbots can increase access to medical information through smartphone applications and clinic kiosks.

Statistical Biases in AI and Pain Management

The use of AI in pain management may also contain statistical biases,50 as seen in Table 4. Datasets often exclude underrepresented demographics, and biased datasets can limit the ability to apply algorithms to a diverse patient population. The heterogeneity in size and the limited datasets in pain research may make AI tools more susceptible to sampling bias, measurement or classification bias, and label bias.51 Sampling bias may influence the reliability of these AI tools because the datasets do not accurately represent the total population. Measurement or classification bias can also arise from discrepancies in healthcare provision. Label bias during the data-gathering process may likewise lead to inaccurate conclusions being drawn across demographic groups. Furthermore, bias caused by incomplete data limits the AI’s capacity to consider a wide range of patient attributes, potentially resulting in the exclusion of specific groups from receiving certain interventions.52 Publication bias can exacerbate these challenges by skewing the data available for AI model development, favoring positive or statistically significant results over null or negative findings. This selective availability of findings may limit the diversity and representativeness of datasets, further entrenching disparities in pain management. To promote equitable and effective AI-based tools for pain assessment and management, it is important to increase collaboration among clinicians, researchers, policymakers, and computer scientists.
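
One concrete way such statistical biases surface is as unequal prediction error across groups. The sketch below audits a pain-score model's mean absolute error by subgroup, the kind of check that can reveal when underrepresented groups receive systematically less accurate predictions. The data are synthetic, and the group labels, group sizes, and noise levels are illustrative assumptions only.

```python
# Hypothetical audit of per-group prediction error for a pain-score model.
# Synthetic data; group labels, sizes, and noise levels are illustrative only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 1000
group = rng.choice(["higher_SES", "lower_SES"], size=n, p=[0.8, 0.2])
x = rng.normal(size=(n, 1))
# Simulate noisier (less complete) pain labels for the underrepresented group.
noise = np.where(group == "lower_SES", 2.0, 0.5)
y = 5 + 2 * x[:, 0] + rng.normal(scale=noise)

model = LinearRegression().fit(x, y)
pred = model.predict(x)

audit = pd.DataFrame({"group": group, "abs_error": np.abs(y - pred)})
print(audit.groupby("group")["abs_error"].agg(["count", "mean"]).round(2))
```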

Table 4 Summary of Statistical Biases in AI-Driven Pain Management Models

Strategies and Future Directions for Mitigating Biases in AI and Pain Management

Sex differences, race/ethnicity, socioeconomic status, and statistical biases can affect AI models used in pain treatment. We explore the complex issue of bias in pain management by addressing five critical dimensions: identifying clinical areas most vulnerable to bias; examining the potential of AI-driven interventions to reduce disparities; analyzing populations disproportionately affected by bias, including distinctions by sex, race, ethnicity, and socioeconomic status, alongside strategies for AI to address these inequities; integrating advanced tools in pain assessment to further mitigate bias; and outlining future directions to enhance equity, transparency, and inclusivity in AI-supported pain management approaches. We formulate various strategies to minimize these potential biases.53 For example, interpretable AI approaches, such as optimal classification trees, could identify and reduce racial inequalities in the treatment of post-rehabilitation injuries and pain.40 These approaches prioritize fairness and accountability in algorithm design. The use of varied and inclusive datasets can enhance the applicability, transparency, and fairness of AI models. Moreover, engaging with a wide range of viewpoints from diverse clinicians, scientists, and patients can help to recognize and resolve possible biases.54 Understanding and addressing publication bias requires fair sampling and robust certification procedures that encourage the inclusion of diverse study outcomes.55 Continuously monitoring and improving AI detection systems can help supervise bias in AI models, advancing a more equitable pain management system.56
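
As one fairness-aware technique named in the abstract, reweighting can be sketched as follows: each (group, outcome) combination receives a weight that makes group membership and outcome statistically independent in the training data before a model is fit. The dataset and variable names below are hypothetical, and the sketch is only one of several reweighting schemes rather than a definitive implementation.

```python
# Sketch of a reweighting (reweighing) step: weight each (group, label) pair so
# that group and outcome are independent in the training data. Data are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
df = pd.DataFrame({
    "group": np.repeat(["A", "B"], [800, 200]),                      # imbalanced groups
    "label": np.r_[rng.binomial(1, 0.3, 800), rng.binomial(1, 0.6, 200)],
    "feature": rng.normal(size=1000),
})

p_group = df["group"].value_counts(normalize=True)
p_label = df["label"].value_counts(normalize=True)
p_joint = df.groupby(["group", "label"]).size() / len(df)

# weight = P(group) * P(label) / P(group, label): up-weights under-observed combinations
weights = df.apply(
    lambda r: p_group[r["group"]] * p_label[r["label"]] / p_joint[(r["group"], r["label"])],
    axis=1,
)

model = LogisticRegression().fit(df[["feature"]], df["label"], sample_weight=weights)
```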

In addition to post-injury rehabilitation, efforts to reduce racial inequities must address vulnerable clinical areas such as perioperative and obstetric care, where biases are more pronounced. For instance, racial disparities during the perioperative phase are evident, with patients of non-European descent often experiencing slower recovery due to unequal access to effective post-operative care.57,58 AI offers a promising solution by identifying patterns of inequality in large datasets on pain management and rehabilitation outcomes across racial groups; it can also remotely monitor post-operative patients experiencing acute pain and tailor treatment regimens for patients of color to mitigate disparities.59 Similarly, in feto-maternal health, AI-assisted early diagnosis and ongoing monitoring can address inequities, reducing the higher prevalence of maternal and newborn complications in minority groups. This demonstrates how AI can monitor a wide range of patient groups and help ensure that all patients, regardless of their socioeconomic or ethnic background, receive timely therapies.60 By employing AI models designed to “police” other AI systems, healthcare providers can detect and correct biases, ensuring equitable access to timely and effective therapies for all patients. These interventions underscore the potential of AI not only to mitigate existing biases but also to create a more inclusive and fair healthcare landscape.

Sex biases in artificial intelligence algorithms can lead to inconsistencies in the identification and treatment of pain, disproportionately affecting females. Similarly, racial biases can negatively impact how patients of color are diagnosed and treated for pain conditions. Patients from lower socioeconomic status (SES) backgrounds face additional disadvantages, as they are often underrepresented in AI training datasets. These gaps highlight the importance of building and utilizing diverse datasets that include females, patients of color, and individuals from lower SES backgrounds to reduce the perpetuation of biases in AI models. AI-based interventions can address these disparities by employing sex-agnostic models to analyze clinical data, ensuring that treatment recommendations are not influenced by inherent biases in the data.61 For example, AI systems trained on diverse and representative data can better predict pain management needs across vulnerable populations, promoting equitable care outcomes. Furthermore, incorporating transparency and fairness-focused practices at every stage of AI development can minimize bias, such as requiring clear disclosures about training data sources, ensuring appropriate labeling, and adopting rigorous bias audits. Effective collaborations and diverse perspectives in AI pain research are also essential for identifying and addressing these disparities.62,63 Finally, equitable communication strategies tailored to vulnerable populations can enhance patient care, while machine learning models trained on diverse data can improve the accuracy of pain diagnoses and treatment recommendations, reducing disparities among those most affected by bias.64 These interventions demonstrate how AI can serve as a powerful tool to mitigate inequities and create a more inclusive approach to pain management.
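
A rigorous bias audit of the kind described above can be made operational with a few lines of code. The sketch below compares the rate at which a model recommends a treatment across demographic groups, one common fairness check; large gaps would prompt further investigation rather than proving bias on their own. The model outputs and group labels are synthetic placeholders, not data from any cited study.

```python
# Hypothetical bias audit: compare treatment-recommendation rates across groups.
# Predictions and group labels are synthetic placeholders for real model outputs.
import numpy as np
import pandas as pd

def recommendation_rate_by_group(recommended: np.ndarray, groups: np.ndarray) -> pd.Series:
    """Share of patients recommended for treatment, within each demographic group."""
    return pd.Series(recommended).groupby(pd.Series(groups)).mean()

rng = np.random.default_rng(3)
groups = rng.choice(["female", "male"], size=500)
recommended = rng.binomial(1, 0.4, size=500)      # stand-in for a model's binary output

rates = recommendation_rate_by_group(recommended, groups)
print(rates.round(2))
print("Demographic parity difference:", round(rates.max() - rates.min(), 2))
```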

Our review, as seen in Table 5, recommends minimizing bias in AI pain management by creating pain assessment tools that are culturally sensitive, incorporating lower-SES data into predictive pain models, investigating the effectiveness of pain treatments in both sexes, and further examining biases in AI algorithms.24,65,66 Strategies to advance a more equitable AI-based model in pain management entail understanding the diverse forms of bias and applying mitigation strategies at different phases of development. Scientists creating these models should collaborate with pain specialists to map out the life cycle of these software tools and apply machine learning methods appropriately with diverse datasets.67 Researchers have advocated for fairness sampling and certification to create precise and effective AI pain management tools.68 Standardizing reporting procedures, collecting electronic health record data in diverse patient populations, and real-world testing could minimize bias and create more precise and equitable AI tools.69

Table 5 Strategies and Impact on Pain Management

Future directions in pain management research should address the unique challenges of special populations, including the elderly, children, and newborns, alongside diverse pain contexts such as acute, chronic, intraoperative, and postoperative pain. In elderly patients, considerations like comorbidities, polypharmacy, cognitive decline, and quality-of-life goals necessitate tailored approaches, while pediatric populations require models sensitive to developmental stages and caregiver-reported data. Poorly designed AI systems risk amplifying biases by underrepresenting these groups, but well-designed models can integrate multimodal datasets—such as physiological signals, behavioral cues, and caregiver input—to enhance accuracy and equity. By stratifying data and validating models across these populations, AI can address demographic and clinical complexities while complementing efforts to mitigate cultural and socioeconomic biases, ensuring inclusive and effective pain management for all.

Conclusion

Our study highlights that the use of AI in pain management may exacerbate biases related to sex, race, ethnicity, socioeconomic status (SES), and statistical methodologies, as summarized in Table 6. For example, sex biases within AI tools could perpetuate inequalities in pain evaluation and treatment due to the underrepresentation of women in the clinical research and medical records used to train AI models. Similarly, imbalanced training datasets that overrepresent patients of European descent, together with systemic racism, could amplify racial disparities when AI tools are applied. Individuals with lower SES are particularly at risk of not fully benefiting from AI-driven pain management, as algorithm performance often varies significantly with SES, resulting in unjust treatment and limited access to equitable care. Furthermore, statistical concerns such as sampling bias, measurement or classification bias, and label bias limit the generalizability and reliability of AI systems. Sampling bias caused by unrepresentative datasets, inaccuracies in clinical measurements, and insufficient data from marginalized populations deprive these groups of the benefits of AI-driven pain management solutions.

Table 6 Summary of Biases in Artificial Intelligence Pain Management

To address these biases, collaborative efforts are crucial. Policymakers can enforce guidelines for equitable data collection and AI governance, ensuring diverse population representation in datasets. Healthcare providers can advocate for inclusive clinical trial designs and identify gaps in algorithmic fairness during implementation. Researchers and AI developers can focus on creating explainable AI models, prioritizing fairness-aware machine learning techniques, and actively testing for bias during the model validation phase. By fostering such collaborations, stakeholders can reduce systemic biases, enhance inclusiveness, and ensure that AI-driven pain management solutions are both equitable and patient-centered. This narrative review builds upon existing research on the ethical stewardship of AI in chronic pain by providing a more targeted focus on addressing specific biases in AI applications.70 As the first study to propose an actionable framework for mitigating these biases, it invites future researchers to expand on this foundation, explore additional dimensions of AI fairness, and further refine methodologies to optimize equitable and inclusive pain management solutions.

Ethics

Institutional review board (IRB) approval was not required, given that this is a narrative review.

Funding

There is no funding to report.

Disclosure

The authors declare no conflict of interest.

References

1. Russell SJ, Norvig P. Artificial Intelligence: A Modern Approach. 4th ed. Pearson; 2020.

2. Mills SEE, Nicolson KP, Smith BH. Chronic pain: a review of its epidemiology and associated factors in population-based studies. Br J Anaesth. 2019;123(2):e273–e283. doi:10.1016/j.bja.2019.03.023 Epub 2019 May 10. PMID: 31079836; PMCID: PMC6676152.

3. Lötsch J, Ultsch A, Mayer B, Kringel D. Artificial intelligence and machine learning in pain research: a data scientometric analysis. Pain Rep. 2022;7(6):e1044. doi:10.1097/PR9.0000000000001044 PMID: 36348668; PMCID: PMC9635040.

4. Yong RJ, Mullins PM, Bhattacharyya N. Prevalence of chronic pain among adults in the United States. Pain. 2022;163(2):e328–e332. doi:10.1097/j.pain.0000000000002291 PMID: 33990113.

5. Abd-Elsayed A, Robinson CL, Marshall Z, Diwan S, Peters T. Applications of artificial intelligence in pain medicine. Curr Pain Headache Rep. 2024;28(4):229–238. doi:10.1007/s11916-024-01224-8 Epub 2024 Feb 12. PMID: 38345695.

6. Cerda IH, Zhang E, Dominguez M, et al. Artificial intelligence and virtual reality in headache disorder diagnosis, classification, and management. Curr Pain Headache Rep. 2024;28:869–880. doi:10.1007/s11916-024-01279-7 Epub ahead of print. PMID: 38836996.

7. Mazzolenis MV, Mourra GN, Moreau S, et al. The role of virtual reality and artificial intelligence in cognitive pain therapy: a narrative review. Curr Pain Headache Rep. 2024. doi:10.1007/s11916-024-01270-2

8. Adams MCB, Nelson AM, Narouze S. Daring discourse: artificial intelligence in pain medicine, opportunities and challenges. Reg Anesth Pain Med. 2023;48(9):439–442. doi:10.1136/rapm-2023-104526 Epub 2023 May 11. PMID: 37169486; PMCID: PMC10525018.

9. Hays RD, Herman PM, Rodriguez A, Slaughter M, Zeng C, Edelen MO. The PROMIS-16 reproduces the PROMIS-29 physical and mental health summary scores accurately in a probability-based internet panel. Qual Life Res. 2024. Epub ahead of print. PMID: 38652369. doi:10.1007/s11136-024-03662-8.

10. Bigman YE, Chi Yam K, Marciano D, Reynolds SJ, Gray K. Threat of racial and economic inequality increases preference for algorithm decision-making. Computers Human Behav. 2021;122:106859. doi:10.1016/j.chb.2021.106859

11. El-Tallawy SN, Pergolizzi JV, Vasiliu-Feltes I, et al. Incorporation of ”artificial intelligence” for objective pain assessment: a comprehensive review. Pain Ther. 2024;13:293–317. doi:10.1007/s40122-024-00584-8 Epub ahead of print. PMID: 38430433.

12. Mohsen F, Ali H, El Hajj N, Shah Z. Artificial intelligence-based methods for fusion of electronic health records and imaging data. Sci Rep. 2022;12(1):17981. doi:10.1038/s41598-022-22514-4 PMID: 36289266; PMCID: PMC9605975.

13. Shajari S, Kuruvinashetti K, Komeili A, Sundararaj U. The emergence of AI-based wearable sensors for digital health technology: a review. Sensors. 2023;23(23):9498. doi:10.3390/s23239498 PMID: 38067871; PMCID: PMC10708748.

14. Hagedorn JM, George TK, Aiyer R, Schmidt K, Halamka J, D’Souza RS. Artificial intelligence and pain medicine: an introduction. J Pain Res. 2024;17:509–518. doi:10.2147/JPR.S429594 PMID: 38328019; PMCID: PMC10848920.

15. Meier TA, Refahi MS, Hearne G, et al. The role and applications of artificial intelligence in the treatment of chronic pain. Curr Pain Headache Rep. 2024;28:769–784. doi:10.1007/s11916-024-01264-0

16. Heisinger S, Hitzl W, Hobusch GM, Windhager R, Cotofana S. Predicting total knee replacement from symptomology and radiographic structural change using artificial neural networks-data from the osteoarthritis initiative (OAI). J Clin Med. 2020;9(5):1298. doi:10.3390/jcm9051298 PMID: 32369985; PMCID: PMC7288322.

17. Ramírez C, José. AI in healthcare: revolutionizing patient care with predictive analytics and decision support systems. J Artificial Intelligence Gen Sci. 2024. doi:10.60087/jaigs.v1i1.p37

18. Piette JD, Newman S, Krein SL, et al. Patient-centered pain care using artificial intelligence and mobile health tools: a randomized comparative effectiveness trial. JAMA Intern Med. 2022;182(9):975–983. doi:10.1001/jamainternmed.2022.3178 PMID: 35939288; PMCID: PMC9361183.

19. Robinson CL, D’Souza RS, Yazdi C, et al. Reviewing the potential role of artificial intelligence in delivering personalized and interactive pain medicine education for chronic pain patients. J Pain Res. 2024;17:923–929. doi:10.2147/JPR.S439452 PMID: 38464902; PMCID: PMC10924768.

20. Astărăstoae V, Rogozea LM, Leaşu F, Ioan BG. Ethical dilemmas of using artificial intelligence in medicine. Am J Ther. 2024;31:e388–e397. doi:10.1097/MJT.0000000000001693 Epub ahead of print. PMID: 38662923.

21. Gandhi TK, Classen D, Sinsky CA, et al. How can artificial intelligence decrease cognitive and work burden for front line practitioners? JAMIA Open. 2023;6(3):ooad079. doi:10.1093/jamiaopen/ooad079 PMID: 37655124; PMCID: PMC10466077.

22. Poon AIF, Sung JJY. Opening the black box of AI-medicine. J Gastroenterol Hepatol. 2021;36(3):581–584. doi:10.1111/jgh.15384 PMID: 33709609.

23. Li J. Security implications of AI chatbots in health care. J Med Internet Res. 2023;25:e47551. doi:10.2196/47551 PMID: 38015597; PMCID: PMC10716748.

24. d’Elia A, Gabbay M, Rodgers S, et al. Artificial intelligence and health inequities in primary care: a systematic scoping review and framework. Fam Med Comm Health. 2022;10(Suppl 1):e001670. doi:10.1136/fmch-2022-001670 PMID: 36450391; PMCID: PMC9716837.

25. Buslón N, Cortés A, Catuara-Solarz S, Cirillo D, Rementeria MJ. Raising awareness of sex and gender bias in artificial intelligence and health. Front Glob Women's Health. 2023;4:970312. doi:10.3389/fgwh.2023.970312 PMID: 37746321; PMCID: PMC10512182.

26. Bierer BE, Meloney LG, Ahmed HR, White SA. Advancing the inclusion of underrepresented women in clinical research. Cell Rep Med. 2022;3(4):100553. doi:10.1016/j.xcrm.2022.100553 PMID: 35492242; PMCID: PMC9043984.

27. Kim J, Cai ZR, Chen ML, Simard JF, Linos E. Assessing biases in medical decisions via clinician and AI chatbot responses to patient vignettes. JAMA Network Open. 2023;6(10):e2338050. doi:10.1001/jamanetworkopen.2023.38050 PMID: 37847506; PMCID: PMC10582782.

28. O’Reilly-Shah VN, Gentry KR, Walters AM, Zivot J, Anderson CT, Tighe PJ. Bias and ethical considerations in machine learning and the automation of perioperative risk assessment. Br J Anaesth. 2020;125(6):843–846. doi:10.1016/j.bja.2020.07.040 Epub 2020 Aug 21. PMID: 32838979; PMCID: PMC7442146.

29. Weber AM, Gupta R, Abdalla S, Cislaghi B, Meausoone V, Darmstadt GL. Gender-related data missingness, imbalance and bias in global health surveys. BMJ Glob Health. 2021;6(11):e007405. doi:10.1136/bmjgh-2021-007405 PMID: 34836912; PMCID: PMC8628344.

30. Maheshwari K, Cywinski JB, Papay F, Khanna AK, Mathur P. Artificial intelligence for perioperative medicine: perioperative intelligence. Anesth Analg. 2023;136(4):637–645. doi:10.1213/ANE.0000000000005952 Epub 2023 Mar 16. PMID: 35203086.

31. Borkhoff CM, Hawker GA, Kreder HJ, Glazier RH, Mahomed NN, Wright JG. Influence of patients’ gender on informed decision making regarding total knee arthroplasty. Arthritis Care Res. 2013;65(8):1281–1290. doi:10.1002/acr.21970 PMID: 23401380.

32. Tam V, Tong B, Gorawara-Bhat R, Liao C, Ferguson MK. Gender bias affects assessment of frailty and recommendations for surgery. Ann Thorac Surg. 2020;109(3):938–944. doi:10.1016/j.athoracsur.2019.06.066 Epub 2019 Aug 10. PMID: 31408644.

33. Tan SHS, Kripesh A, Chan CX, Krishna L. Gender differences in intra-articular and extra-articular injuries associated with acute anterior cruciate ligament ruptures. J Knee Surg. 2019;32(7):616–619. doi:10.1055/s-0038-1666828 PMID: 30068011.

34. Karnib S, Chinnaiyan KM. Coronary computed tomography angiography: enhancing risk stratification and diagnosis of cardiovascular disease in women. Curr Treat Options Cardiovasc Med. 2019;21(10):62. doi:10.1007/s11936-019-0760-1 PMID: 31584125.

35. Borg TM, Heidari N, Noorani A, et al. Gender-specific response in pain and function to biologic treatment of knee osteoarthritis: a gender-bias-mitigated, observational, intention-to-treat study at two years. Stem Cells Int. 2021;2021:6648437. doi:10.1155/2021/6648437 PMID: 33727933; PMCID: PMC7935570.

36. Schilter LV, Le Boudec JA, Hugli O, et al. Gender-based differential management of acute low back pain in the emergency department: a survey based on a clinical vignette. Women's Health. 2024;20:17455057231222405. doi:10.1177/17455057231222405 PMID: 38282544; PMCID: PMC10826390.

37. Gitanjali B, Raveendran R, Pandian DG, Sujindra S. Recruitment of subjects for clinical trials after informed consent: does gender and educational status make a difference? J Postgrad Med. 2003;49(2):109–113. PMID: 12867683.

38. Muylle KM, Cornu P, Cools W, Barbé K, Buyl R, Van Laere S. Optimization of performance by combining most sensitive and specific models in data science results in majority voting ensemble. Stud Health Technol Inform. 2022;294:435–439. doi:10.3233/SHTI220496 PMID: 35612117.

39. Norori N, Hu Q, Aellen FM, Faraci FD, Tzovara A. Addressing bias in big data and AI for health care: a call for open science. Patterns. 2021;2(10):100347. doi:10.1016/j.patter.2021.100347 PMID: 34693373; PMCID: PMC8515002.

40. Gebran A, Thakur SS, Maurer LR, et al. Development of a machine learning-based prescriptive tool to address racial disparities in access to care after penetrating trauma. JAMA Surg. 2023;158(10):1088–1095. doi:10.1001/jamasurg.2023.2293 PMID: 37610746; PMCID: PMC10448365.

41. Lee LS, Chan PK, Wen C, et al. Artificial intelligence in diagnosis of knee osteoarthritis and prediction of arthroplasty outcomes: a review. Arthroplasty. 2022;4(1):16. doi:10.1186/s42836-022-00118-7 PMID: 35246270; PMCID: PMC8897859.

42. Tejani AS, Ng YS, Xi Y, Rayan JC. Understanding and mitigating bias in imaging artificial intelligence. Radiographics. 2024;44(5):e230067. doi:10.1148/rg.230067 PMID: 38635456.

43. Fontaine D, Vielzeuf V, Genestier P, et al. DEFI study group. Artificial intelligence to evaluate postoperative pain based on facial expression recognition. Eur J Pain. 2022;26(6):1282–1291. doi:10.1002/ejp.1948 Epub 2022 Apr 6. PMID: 35352426.

44. Bacchini F, Lorusso L. Race, again. How face recognition technology reinforces racial discrimination. J Inf CommunEthics Soc. 2019;17:321–335. doi:10.1108/JICES-05-2018-0050

45. Gianfrancesco MA, Goldstein ND. A narrative review on the validity of electronic health record-based research in epidemiology. BMC Med Res Methodol. 2021;21:234. doi:10.1186/s12874-021-01416-5

46. Clarke H, Fitzcharles MA. Are electronic health records sufficiently accurate to phenotype rheumatology patients with chronic pain? J Rheumatol. 2024;51(3):218–220. doi:10.3899/jrheum.2023-1227 PMID: 38224990.

47. Nazer LH, Zatarah R, Waldrip S, et al. Bias in artificial intelligence algorithms and recommendations for mitigation. PLOS Digit Health. 2023;2(6):e0000278. doi:10.1371/journal.pdig.0000278 PMID: 37347721; PMCID: PMC10287014.

48. Summers KM, Deska JC, Almaraz SM, Hugenberg K, Lloyd EP. Poverty and pain: low-SES people are believed to be insensitive to pain. J Exp Soc Psychol. 2021;95:104116. doi:10.1016/j.jesp.2021.104116

49. Dorner TE, Muckenhuber J, Stronegger WJ, Ràsky E, Gustorff B, Freidl W. The impact of socio-economic status on pain and the perception of disability due to pain. Eur J Pain. 2011;15(1):103–109. doi:10.1016/j.ejpain.2010.05.013 Epub 2010 Jun 16. PMID: 20558096.

50. Chin MH, Afsar-Manesh N, Bierman AS, et al. Principles to address the impact of algorithm bias on racial and ethnic disparities in health and health care. JAMA Network Open. 2023;6(12):e2345050. doi:10.1001/jamanetworkopen.2023.45050 PMID: 38100101.

51. Faes L, Sim DA, van Smeden M, Held U, Bossuyt PM, Bachmann LM. Artificial intelligence and statistics: just the old wine in new wineskins? Front Digit Health. 2022;4:833912. doi:10.3389/fdgth.2022.833912 PMID: 35156082; PMCID: PMC8825497.

52. Bekbolatova M, Mayer J, Ong CW, Toma M. Transformative potential of AI in healthcare: definitions, applications, and navigating the ethical landscape and public perspectives. Healthcare. 2024;12(2):125. doi:10.3390/healthcare12020125 PMID: 38255014; PMCID: PMC10815906.

53. Mittermaier M, Raza MM, Kvedar JC. Bias in AI-based models for medical applications: challenges and mitigation strategies. NPJ Digit Med. 2023;6(1):113. doi:10.1038/s41746-023-00858-z PMID: 37311802; PMCID: PMC10264403.

54. Gray M, Samala R, Liu Q, et al. Measurement and mitigation of bias in artificial intelligence: a narrative literature review for regulatory science. Clin Pharmacol Ther. 2024;115(4):687–697. doi:10.1002/cpt.3117 Epub 2023 Dec 12. PMID: 38018360.

55. Rana SA, Azizul ZH, Awan AA. A step toward building a unified framework for managing AI bias. PeerJ Comput Sci. 2023;9:e1630. doi:10.7717/peerj-cs.1630 PMID: 38077542; PMCID: PMC10702934.

56. Yen CP, Hung TW. Achieving equity with predictive policing algorithms: a social safety net perspective. Sci Eng Ethics. 2021;27(3):36. doi:10.1007/s11948-021-00312-x PMID: 34075448.

57. Cyr ME, Etchin AG, Guthrie BJ, et al. Access to specialty healthcare in urban versus rural US populations: a systematic literature review. BMC Health Serv Res. 2019;19:974. doi:10.1186/s12913-019-4815-5

58. Pierson E, Cutler DM, Leskovec J, Mullainathan S, Obermeyer Z. An algorithmic approach to reducing unexplained pain disparities in underserved populations. Nat Med. 2021;27:136–140. doi:10.1038/s41591-020-01192-7

59. Halamka J, Bydon M, Cerrato P, Bhagra A. Addressing racial disparities in surgical care with machine learning. NPJ Digit Med. 2022;5(1):152. doi:10.1038/s41746-022-00695-6 PMID: 36180724; PMCID: PMC9525720.

60. Yaseen I, Rather RA. A theoretical exploration of artificial intelligence’s impact on feto-maternal health from conception to delivery. Int J Women's Health. 2024;16:903–915. doi:10.2147/IJWH.S454127 PMID: 38800118; PMCID: PMC11128252.

61. Molnar C, König G, Herbinger J, et al. General pitfalls of model-agnostic interpretation methods for machine learning models. In: Holzinger A, Goebel R, Fong R, Moon T, Müller KR, Samek W, editors. xxAI - Beyond Explainable AI. xxAI 2020. Lecture Notes in Computer Science, Vol 13200. Cham: Springer; 2022:39–68.

62. Chen F, Wang L, Hong J, Jiang J, Zhou L. Unmasking bias in artificial intelligence: a systematic review of bias detection and mitigation strategies in electronic health record-based models. J Am Med Inform Assoc. 2024:ocae060. Epub ahead of print. PMID: 38520723. doi:10.1093/jamia/ocae060

63. Abràmoff MD, Tarver ME, Loyo-Berrios N, et al.; Foundational Principles of Ophthalmic Imaging and Algorithmic Interpretation Working Group of the Collaborative Community for Ophthalmic Imaging Foundation, Washington, D.C.; Maisel WH. Considerations for addressing bias in artificial intelligence for health equity. NPJ Digit Med. 2023;6(1):170. PMID: 37700029; PMCID: PMC10497548. doi:10.1038/s41746-023-00913-9

64. Linardatos P, Papastefanopoulos V, Kotsiantis S. Explainable AI: a review of machine learning interpretability methods. Entropy. 2020;23(1):18. doi:10.3390/e23010018 PMID: 33375658; PMCID: PMC7824368.

65. Solans Noguero D, Ramírez-Cifuentes D, Ríssola EA, Freire A. Gender bias when using artificial intelligence to assess anorexia nervosa on social media: data-driven study. J Med Internet Res. 2023;25:e45184. doi:10.2196/45184 PMID: 37289496; PMCID: PMC10288345.

66. Flores L, Kim S, Young SD. Addressing bias in artificial intelligence for public health surveillance. J Med Ethics. 2024;50(3):190–194. doi:10.1136/jme-2022-108875 PMID: 37130756.

67. Mensah GB. Artificial intelligence and ethics: a comprehensive review of bias mitigation, transparency, and accountability in AI systems. 2023. doi:10.13140/RG.2.2.23381.19685/1

68. Wana Y, Wang L, Zhou Z, et al. Assessing fairness in machine learning models: a study of racial bias using matched counterparts in mortality prediction for patients with chronic diseases. J Biomed Inform. 2024;12:104677. doi:10.1016/j.jbi.2024.104677 Epub ahead of print. PMID: 38876453.

69. Perets O, Stagno E, Yehuda EB, et al. Inherent bias in electronic health records: a scoping review of sources of bias. medRxiv [Preprint]. 2024. doi:10.1101/2024.04.09.24305594. PMID: 38680842; PMCID: PMC11046491.

70. Mazzolenis ME, Bulat E, Schatman ME, Gumb C, Gilligan CJ, Yong RJ. The ethical stewardship of artificial intelligence in chronic pain and headache: a narrative review. Curr Pain Headache Rep. 2024;28(8):785–792. doi:10.1007/s11916-024-01272-0 Epub 2024 May 29. PMID: 38809404.

Creative Commons License © 2025 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, 3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.