CC BY-NC-ND 4.0 · Yearb Med Inform 2022; 31(01): 094-099
DOI: 10.1055/s-0042-1742504
Special Section: Inclusive Digital Health
Working Group Contributions

The Untapped Potential of Nursing and Allied Health Data for Improved Representation of Social Determinants of Health and Intersectionality in Artificial Intelligence Applications: A Rapid Review

IMIA Student and Emerging Professionals Group
Charlene Esteban Ronquillo
1   School of Nursing, University of British Columbia Okanagan, Kelowna, Canada
,
James Mitchell
2   School of Computing and Mathematics, Keele University, UK
,
Dari Alhuwail
3   Information Science Department, Kuwait University, Kuwait and Health Informatics Unit, Dasman Diabetes Institute, Kuwait
,
Laura-Maria Peltonen
4   Department of Nursing Science, University of Turku, Finland
,
Maxim Topaz
5   School of Nursing, Columbia University, New York, USA
,
Lorraine J. Block
6   School of Nursing, University of British Columbia, Vancouver, BC, Canada
 

Summary

Objectives: The objective of this paper is to draw attention to the currently underused potential of clinical documentation by nursing and allied health professions to improve the representation of social determinants of health (SDoH) and intersectionality data in electronic health records (EHRs), towards the development of equitable artificial intelligence (AI) technologies.

Methods: We conducted a rapid review of the literature on the inclusion of nursing and allied health data, and on the nature of health equity information representation, in the development and/or use of artificial intelligence approaches, complemented by expert perspectives from the International Medical Informatics Association (IMIA) Student and Emerging Professionals Working Group.

Results: Consideration of social determinants of health and intersectionality data is limited in both the medical AI literature and the nursing and allied health AI literature. Because intersectionality is only newly discussed in the context of AI, the lack of discussion of intersectionality in the literature was unsurprising. However, the limited consideration of social determinants of health was surprising, given their relatively longstanding recognition and the importance of representing the features of diverse populations as a key requirement for equitable AI.

Conclusions: Leveraging the rich contextual data collected by nursing and allied health professions has the potential to improve the capture and representation of social determinants of health and intersectionality. This will require addressing issues related to valuing AI goals (e.g., diagnostics versus supporting care delivery) and improved EHR infrastructure to facilitate documentation of data beyond medicine. Leveraging nursing and allied health data to support equitable AI development represents a current open question for further exploration and research.



1 Introduction

Inadequate representation of diverse populations in health system datasets is an important challenge in the development of fair and equitable artificial intelligence (AI) technologies for health. Adverse impacts of AI technologies, such as racially biased algorithms resulting from a lack of dataset diversity, have been spotlighted by prominent recent examples: an AI algorithm that generated clear pictures from pixelated images reconstructed the faces of President Obama and other people of colour as white men and women [[1]]. Improved representation of the features of underrepresented population subgroups in datasets used for AI development is one component of AI fairness. It is commonly suggested that improved and more complete capture of sensitive features of underrepresented groups (e.g., demographic characteristics) and social determinants of health (SDoH) in AI datasets is necessary for the development of less biased algorithms [[2], [3]]. Beyond providing a foundation for less biased algorithm development at the outset, diverse datasets are also necessary to enable computational approaches to bias mitigation in AI; these approaches rely, to a large extent, on data quality. For instance, approaches to de-biasing algorithms include reducing the difference in model performance between predefined privileged and under-privileged groups relative to a chosen set of sensitive features (e.g., age or sex) [[4] [5] [6]]. Notably, many de-biasing approaches assume that a sufficient volume and type of data are already present in the dataset to enable computational bias mitigation. We argue that this assumption is often false in the context of health systems. Data captured in electronic health records (EHRs) are notoriously difficult to access, messy, incomplete, and exist in formats not easily amenable to use in AI development and refinement (e.g., narrative clinical notes).
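To make the notion of a group performance difference concrete, the hypothetical sketch below computes the gap in true-positive rate between two groups defined by a single binary sensitive feature. The function names and toy data are our own illustration, not drawn from any of the de-biasing methods cited above; real approaches operate on far richer data and model internals.

```python
# Illustrative sketch only: measuring the performance gap between two
# groups on one binary sensitive feature. All data below are fabricated.

def true_positive_rate(y_true, y_pred):
    """Fraction of actual positives the model correctly flagged."""
    positives = [(t, p) for t, p in zip(y_true, y_pred) if t == 1]
    if not positives:
        return 0.0
    return sum(p for _, p in positives) / len(positives)

def tpr_gap(y_true, y_pred, group):
    """True-positive-rate difference between group 1 and group 0.
    A gap near zero is one (narrow) notion of group fairness."""
    def subset_tpr(g):
        yt = [t for t, gg in zip(y_true, group) if gg == g]
        yp = [p for p, gg in zip(y_pred, group) if gg == g]
        return true_positive_rate(yt, yp)
    return subset_tpr(1) - subset_tpr(0)

# Fabricated toy labels: outcome, model prediction, sensitive feature.
y_true = [1, 1, 0, 1, 1, 0, 1, 0]
y_pred = [1, 0, 0, 1, 1, 0, 0, 0]
group  = [0, 0, 0, 0, 1, 1, 1, 1]

print(round(tpr_gap(y_true, y_pred, group), 3))  # → -0.167
```

The point the paper makes is visible even in this toy: the gap can only be computed at all if the sensitive feature (`group`) is actually recorded for each patient, which is exactly the data EHRs often lack.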
As AI algorithms rely on historical data, it is also important to note the ways in which the historical contexts of institutions and broader societal values shape the diversity of datasets [[7]]. For example, in many nations, efforts towards the improved and routine collection of sensitive data such as race, ethnicity, gender identity, and sexual orientation (among others) are only relatively recent [[8]].

As the aim of health equity is to reduce and eliminate disparities in health and mitigate determinants of health that adversely affect excluded or marginalized groups [[9]], representation of health equity in the context of EHRs is conceptually and operationally complex. Calls for better inclusion and routine capture of SDoH data towards improving AI development are encouraging [[2]] and signal a recognition of the necessity of these data for developing more equitable AI. However, the inclusion of SDoH is only a beginning step. To meaningfully represent the complexity of health equity in datasets, we argue that there is a need to capture sensitive features of populations beyond SDoH and extend to the consideration of intersectionality. Briefly, intersectionality theory orients towards the consideration of social power and the compounding impacts of multiple and concurrent experiences of advantage/disadvantage, power/oppression, and privilege/marginalization that may result from one's intersecting social locations [[10], [11]]. Given the complexity of health equity and the known influence of intersectionality on health and wellbeing [[12] [13] [14] [15] [16]], we propose that improved representation of intersectional data in datasets in addition to SDoH is a necessary development towards developing equitable AI.

Attention to the lived impacts of SDoH and intersectionality, and communication of the patient story, are among the key foci of many nursing and allied health professionals' work, requiring a deep understanding of the patient's context and environment. In particular, community nursing, public health nursing, home care nursing, social work, and other allied health professionals are optimally situated to witness the manifestations and lived experiences of health equity for patients; they must be especially attuned to the patient story and the contextual factors that influence health and wellbeing. Indeed, this attunement to health equity issues can be assumed, given the scopes of practice and responsibilities within these professions. We have some understanding of the nature of documentation of the patient story within EHRs: these data are often not amenable to capture as structured data, and narrative clinical notes are important in capturing the patient story. Important questions remain outstanding regarding the capture of intersectional data within the EHR (i.e., capturing the complexity of health equity), including who documents intersectional data, where in the EHR these data are collected, and whether/how intersectional data are re-used for AI development.



2 Objectives

The objective of this paper is to draw attention to the currently underused potential of clinical documentation by nursing and allied health professions as data sources to improve the representation of SDoH and intersectionality data in EHRs, and to the limited representation of these data in the medical informatics AI literature. We discuss the potential of nursing and allied health data towards the broader aim of improving AI fairness and developing health equity-considerate AI. Through a review of the health professions' informatics literature, we examine the incorporation of SDoH and intersectionality in the biomedical informatics literature and demonstrate the under-exploration of nursing and allied health data in AI development in the nursing and allied health informatics literature. In this paper, we seek to develop insight into the question, “How are SDoH and intersectionality concepts considered in AI health professional research?”.



3 Methods

This rapid literature review sought to identify published articles that were related to AI, different health care professional groups (e.g., Nursing, Pharmacy, and Medicine), intersectionality and SDoH [[17]]. The intent was not to perform an exhaustive review but to offer an illustration of the representation of research published in this topic area as a beginning step to better understand the current state of the evidence.

Authors CR, LB, and JM created the initial list of search terms, which was expanded and validated by the research team. Using the Medical Subject Headings (MeSH) browser hosted by the U.S. National Library of Medicine (https://meshb.nlm.nih.gov/search; Version: MeSH 2022 Preview), the selected terms were manually mapped to their MeSH term representation ([Table 1]). Then, using PubMed as the database, we created lists of articles that used our selected MeSH terms (searched November 6, 2021). When searching, we included [mh:noexp] to turn off the automatic explosion of MeSH headings, restricting the search to the specific MeSH heading term indicated in the search strategy. This also ensured the exclusion of any citations that had yet to be indexed, citations that were out of scope, and articles that did not have MeSH terms applied [[18]].
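The unexploded-MeSH search described above can be illustrated with a small sketch. The query-building helper and the specific terms below are our own illustration of the [mh:noexp] syntax, not the paper's exact search strings (which are reported in Table 1).

```python
# Hypothetical illustration of building a PubMed query that pairs one
# profession MeSH heading with AI-related MeSH headings, using the
# [mh:noexp] tag so PubMed does not automatically explode each heading
# into its narrower terms. Terms here are examples, not the study's
# actual strategy.

def build_query(profession_term, ai_terms):
    """Join one profession MeSH heading with AI MeSH headings, unexploded."""
    ai_clause = " OR ".join(f'"{t}"[mh:noexp]' for t in ai_terms)
    return f'"{profession_term}"[mh:noexp] AND ({ai_clause})'

query = build_query("Nursing", ["Artificial Intelligence", "Machine Learning"])
print(query)
# "Nursing"[mh:noexp] AND ("Artificial Intelligence"[mh:noexp] OR "Machine Learning"[mh:noexp])
```

Because [mh:noexp] matches only records that carry the exact heading, a query like this also silently drops citations not yet indexed with MeSH terms, which is the exclusion behaviour the methods rely on.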

Table 1 Health Care Professional and Artificial Intelligence MeSH Term and Literature Representation in PubMed

The results for each search were uploaded into EndNote X9 software, then exported to Covidence software for title and abstract screening. Duplicates were reviewed and removed. Each article was screened for inclusion and exclusion by one reviewer. Inclusion criteria were articles written in English and related to AI and the professional group. Exclusion criteria were articles with no abstract, editorials, commentaries, conference reports, keynotes, articles not written in English, and articles with no relationship to AI and the professional group ([Table 1]). Once the articles were screened, full-text data extraction was completed by one researcher, who considered the type of research and whether any intersectionality or SDoH concepts were represented. These concepts, drawn from the literature, included: geography, gender, sex, ethnicity, race, disability, class, age, discrimination, privilege, social exclusion/disadvantage, economic status, education, and citizenship [[2] [19] [20] [21] [22] [23] [24]]. Data extraction results were exported to an Excel spreadsheet, where descriptive statistics and syntheses of concept representation were completed.



4 Results

The searches returned a cumulative total of 136 nursing and allied health papers (two duplicates) and 534 medicine papers (nine duplicates). After screening for inclusion, 49 nursing and allied health papers went through to full review and data extraction; similarly, 45 medicine papers went through to full review and data extraction. Given the almost 1:4 ratio of nursing and allied health papers to medicine papers in the search results, and the almost 1:1 ratio of papers going through full review and extraction, we chose to report our results on the representation of AI, intersectionality, and SDoH for nursing and allied health separately from medicine.

4.1 The Nature of Nursing and Allied Health Informatics and Artificial Intelligence Studies

Using the search and screening methods described above, the results were: Allied Health Personnel, 9/11 relevant articles; Dentistry, 21/32; Midwifery, 0/2; Nurse Practitioners, 1/6; Nursing, 15/42; Occupational Therapy, 1/2; Pharmacology, 1/32; Physical Therapy, 0; Physician Assistants, 0/4; Social Work, 0/3; and Speech Therapy, 1/2. No MeSH term was found for Psychiatric Nursing, which was therefore excluded from the article search. Of the 49 papers evaluated, the most common types were text/opinion/discussion papers (21/49; 43%), followed by surveys (10/49; 20%), case/prototype reports (7/49; 14%), literature reviews (6/49; 12%), diagnostic test accuracy studies (2/49; 4%), randomized controlled trials (1/49; 2%), descriptive studies (1/49; 2%), and qualitative studies (1/49; 2%).

None of the evaluated articles were explicitly written about intersectionality or SDoH in the context of the health professions' use or development of AI. However, some papers described using some SDoH or intersectionality concepts as data points in the application or development of AI. Two papers described data related to sex [[25], [26]], two described data/AI related to disability [[25], [27]], one described data/AI related to race and ethnicity [[26]], and one described data related to education and employment [[28]]. For example, Jain et al. [[26]] performed a diagnostic study in which physicians and nurse practitioners diagnosed dermatologic conditions with or without the assistance of an AI-enabled prediction model. The authors noted that the data set used to develop and test the AI included information related to age, sex, race, ethnicity, Fitzpatrick skin type (a classification of human skin colour based on reaction to ultraviolet light [[29]]), and skin condition. Notably, the authors also reported that the groups underrepresented in the AI training and testing cases were those with Fitzpatrick skin type VI (dark brown or black skin; 0; 0%), type I (pale white skin; 2; 0.2%), and type V (brown skin; 25; 2.4%); those with type III (darker white skin; 668; 63.8%) were overrepresented. As another example, Taylor et al. [[25]] piloted a case-based reasoning tool for matching people with assistive smart home technologies using occupational therapist clinical assessment data. These data included (in part) sex, mobility, cognitive abilities, and housing. The authors noted that the piloted reasoning tool used instrumental reasoning (a means-ends method). However, they reflected that for many clients needing occupational therapy interventions, there is not always a clear or defined ‘end' (i.e., a technology recommendation based on input data); for many, the subtlety of matching a client to a meaningful intervention still required expert clinician knowledge.



4.2 The Nature of Medical Informatics and Artificial Intelligence Studies

Using the search and screening methods described above, relevant articles totalled 45 of the 362 initially identified as related to AI + Medicine. Of the 45 papers extracted, the most common types were discussion papers (10/45; 22%), diagnostic test accuracy studies (9/45; 20%), cross-sectional studies (5/45; 11%), systematic reviews (4/45; 9%), qualitative research studies (4/45; 9%), case reports (3/45; 7%), cohort studies (2/45; 4%), clinical prediction rules (1/45; 2%), and non-randomised experimental studies (1/45; 2%). A further 6 of the 45 studies (13%) were classified as other, representing studies such as discussions or evaluations outside of the categories described.

None of the evaluated articles explicitly discussed intersectionality or SDoH in the context of the health professions' use or development of AI. However, several papers described using some SDoH or intersectionality concepts as data points in the application or development of AI. Takamine et al. [[30]] mention the lack of socioeconomic status consideration in risk prediction models as a reason for the non-adoption of AI; no other studies discussed or mentioned economic status. Duron et al. [[31]] used an AI training set with similar female-to-male ratios. Similar studies, such as Arnold et al. [[32]], Anand et al. [[33]], and Canales et al. [[34]], noted that data sets used to develop and test AI should include information related to age, sex, race, ethnicity, etc. However, few of the studies demonstrated the use of SDoH, and discussions were largely around recommendations for inclusion; none mentioned intersectionality. Studies also failed to discuss the impact of excluding aspects of intersectionality or SDoH, nor did they offer discussion of bias. However, Clark et al. [[35]] state that aspects such as race, ethnicity, gender, and other sociodemographic factors should be introduced to ensure AI systems do not widen inequities via algorithmic and analytic biases.



5 Discussion

This rapid review of the literature provides a first snapshot of the current state of evidence on the representation of SDoH and intersectionality data in the biomedical informatics literature. Our findings serve to catalyze discussion of the challenges of conceptually and operationally incorporating health equity considerations into the development of AI. Given the complexity of intersectionality and the only nascent discussions of its relevance for equitable AI [[20]], it was unsurprising to find no discussion of intersectionality in our review of the medical informatics and AI literature. It is likely that little attention has yet been paid to intersectionality in the context of AI in health, one potential reason being that intersectionality has its roots in critical race theory, law, and the social sciences, fields that are not yet closely linked to AI development in the context of health.

While the limited representation of intersectionality is unsurprising, we expected representation and discussion of SDoH in the medical AI literature to be more common; our findings did not align with this expectation. Compared to intersectionality, recognition of the importance of SDoH across the health literature is substantial, as noted by the World Health Organization [[36]] and numerous national and international organizations. We also found that the term ‘Social Determinants of Health' exists in MeSH, facilitating the searchability and coding of related papers, while the concept of intersectionality could not be identified as a MeSH term. Nevertheless, very few articles in the medical informatics and AI literature mention any aspects of SDoH in the context of AI fairness and bias, suggesting that SDoH considerations are not yet routine in the development of AI for health. Potential contributors to the lack of SDoH representation in datasets are limitations posed by EHR designs, poor data quality, and, as with intersectionality, challenges in operationalizing the complex concept of SDoH into a format amenable to documentation in EHRs [[3], [37]]. Fluid definitions of SDoH and the lack of standards for the capture and representation of SDoH stand as substantial challenges [[37]]. There are important arguments for and against some reductionism to facilitate data re-use, weighed against the risk of losing meaningful, nuanced, and complete representation of the patient story. Until established standards and consensus emerge around the capture of SDoH, and eventually intersectionality, to support equitable AI development, looking to alternative data sources to improve the representation of these data may be a way forward in the interim.

The scope and nature of practice of nurses [[38]] and allied health professionals, particularly those who work in community and public health, require rich clinical documentation of patient contexts and environments, as these inform care plans and goals of care and guide decision making for care delivery and service referrals. This documentation aims to capture patient context and social needs as they relate to health and wellbeing, and likely captures details of SDoH [[39]] and, potentially, intersectionality. To leverage the rich data collected by nursing and allied health, however, a number of issues must be addressed. One is a question of prioritization of AI development targets, with a greater focus of AI on supporting diagnostics (in the sphere of medicine) than on supporting care management and delivery (in the sphere of nursing and allied health). These trends suggest that implicit values may be attributed to some AI goals over others, fuelling the current focus of AI developments on medical goals. A related issue is the question of data availability and quality between medical data and allied health data. In many settings, EHRs are geared towards the capture and reporting of medical data, manifested as the dominant, and sometimes exclusive, use of International Classification of Diseases codes (for medicine). As such, the EHR infrastructure in the majority of health systems lends itself to producing the largest volumes of structured medical data, providing the datasets necessary for AI development. Meanwhile, there is evidence of the insufficiency of EHR designs for nursing [[40]], which likely relates to the tendency to capture rich nursing and allied health data (which can include details of SDoH and intersectionality) within narrative free-text clinical notes, a data source only beginning to be explored for use in AI development [[39], [41], [42]].
While the results of our review highlight limited discussion of SDoH and intersectionality in the nursing and allied health AI literature, similar to what was found in medicine, this limitation may be linked to the aforementioned insufficiency of EHRs to support clinical documentation outside of medicine, as well as the likelihood that documentation by these health professions takes the form of narrative free text within EHRs. As such, we offer that the potential to leverage nursing and allied health data to improve the capture and representation of SDoH and intersectionality in health system datasets remains an outstanding question that requires further research and exploration. There is a particular opportunity to explore synergies in the form of increased engagement with AI developments in nursing [[43] [44] [45]] and allied health [[46] [47] [48] [49]].

Limitations

While we did not intend to fully examine the depth of literature on this topic, our findings are limited by the rapid literature review method used. We excluded papers based on the abstract's lack of mention of the application of AI for use by a specific health care professional group; this decision could have missed papers whose authors assumed applicability or use rather than mentioning it explicitly in the abstract. Bias may also have been introduced by our decision to use a single reviewer to screen the literature and perform data extraction. A deeper review is necessary to further advance our understanding of how health care professional AI incorporates and impacts intersectionality and SDoH.

6 Conclusions

This paper sought to explore the topic of health equity and AI for different health care professional groups. Using a rapid literature review approach, we evaluated related papers for any representation of intersectionality and SDoH concepts. We found that none were explicitly written on the subject, and very few mentioned using or applying these concepts in AI design, development, or implementation. This suggests that equity-related concepts are lacking in AI, but opportunities to address this limitation could be found through better inclusion of nursing and allied health professional data. Overall, the improved capture and representation of intersectionality data in health datasets, and AI bias mitigation using these data, will require interdisciplinary efforts and perspectives.



No conflict of interest has been declared by the author(s).

  • References

  • 1 Hamilton IA. An AI tool which reconstructed a pixelated picture of Barack Obama to look like a white man perfectly illustrates racial bias in algorithms. Business Insider; 2020 Jun 22. Available from: https://www.businessinsider.com/depixelator-turned-obama-white-illustrates-racial-bias-in-ai-2020-6
  • 2 Kino S, Hsu Y-T, Shiba K, Chien Y-S, Mita C, Kawachi I, et al. A scoping review on the use of machine learning in research on social determinants of health: Trends and research prospects. SSM Popul Health 2021 Jun 5;15:100836.
  • 3 Cook LA, Sachs J, Weiskopf NG. The quality of social determinants data in the electronic health record: a systematic review. J Am Med Inform Assoc 2021 Dec 28;29(1):187-96.
  • 4 Wang T, Zhao J, Yatskar M, Chang K-W, Ordonez V. Balanced datasets are not enough: Estimating and mitigating gender bias in deep image representations. Proceedings of the IEEE/CVF International Conference on Computer Vision; 2019. p. 5310-9.
  • 5 Zhang BH, Lemoine B, Mitchell M. Mitigating unwanted biases with adversarial learning. Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society; 2018. p. 335-40.
  • 6 Zhao Q, Adeli E, Pohl KM. Training confounder-free deep learning models for medical applications. Nat Commun 2020 Nov 26;11(1):6010.
  • 7 Benjamin R. Assessing risk, automating racism. Science 2019;366(6464):421-2.
  • 8 Thompson E, Ejdjoc R, Atchessi N, Striha M, Gabrani-Juma I, Dawson T. COVID-19: A case for the collection of race data in Canada and abroad. Can Commun Dis Rep 2021 Jul 8;47(7-8):300-4.
  • 9 Braveman P. What is Health Equity? And What Difference Does a Definition Make? National Collaborating Centre for Determinants of Health; 2017.
  • 10 Crenshaw K. Demarginalizing the intersection of race and sex: A Black feminist critique of antidiscrimination doctrine, feminist theory and antiracist politics. University of Chicago Legal Forum; 1989. p139-67.
  • 11 Collins PH. Black Feminist Thought: Knowledge, Consciousness, and the Politics of Empowerment. New York: Routledge; 1991.
  • 12 Cyrus K. Multiple minorities as multiply marginalized: Applying the minority stress theory to LGBTQ people of color. J Gay Lesbian Ment Health 2017;21(3):194-202.
  • 13 Vu M, Li J, Haardörfer R, Windle M, Berg CJ. Mental health and substance use among women and men at the intersections of identities and experiences of discrimination: insights from the intersectionality framework. BMC Public Health 2019 Jan 23;19(1):108.
  • 14 Allana S, Ski CF, Thompson DR, Clark AM. Intersectionality and heart failure: what clinicians and researchers should know and do. Curr Opin Support Palliat Care 2021 Jun 1;15(2):141-6.
  • 15 Naqvi JB, Helgeson VS, Gary-Webb TL, Korytkowski MT, Seltman HJ. Sex, race, and the role of relationships in diabetes health: intersectionality matters. J Behav Med 2020 Feb;43(1):69-79.
  • 16 Viruell-Fuentes EA, Miranda PY, Abdulrahim S. More than culture: Structural racism, intersectionality theory, and immigrant health. Soc Sci Med 2012 Dec;75(12):2099-106.
  • 17 Dobbins M. Rapid review guidebook. Natl Collab Cent Method Tools 2017;13:25.
  • 18 NIH National Library of Medicine. Searching PubMed Using MeSH Search Tags: National Institutes of Health; 2020 [updated 2019]. Available from: https://www.nlm.nih.gov/bsd/disted/meshtutorial/searchingpubmedusingmeshtags/index.html
  • 19 Campbell KA, Mackinnon K, Dobbins M, Jack SM. Nurse-Family Partnership and Geography: An Intersectional Perspective. Glob Qual Nurs Res 2020 Jan 21;7:2333393619900888.
  • 20 Bauer GR, Lizotte DJ. Artificial Intelligence, Intersectionality, and the Future of Public Health. Am J Public Health 2021 Jan;111(1):98-100.
  • 21 Nixon SA. The coin model of privilege and critical allyship: implications for health. BMC Public Health 2019 Dec 5;19(1):1637.
  • 22 Reeves RM, Christensen L, Brown JR, Conway M, Levis M, Gobbel GT, et al. Adaptation of an NLP system to a new healthcare environment to identify social determinants of health. J Biomed Inform 2021 Aug;120:103851.
  • 23 Hancock A-M. When Multiplication Doesn't Equal Quick Addition: Examining Intersectionality as a Research Paradigm. Perspectives on Politics 2007;5(1):63-79.
  • 24 McCall L. The Complexity of Intersectionality. Signs 2005;30(3):1771-800.
  • 25 Taylor B, Robertson D, Wiratunga N, Craw S, Mitchell D, Stewart E. Using computer aided case based reasoning to support clinical reasoning in community occupational therapy. Comput Methods Programs Biomed 2007 Aug;87(2):170-9.
  • 26 Jain A, Way D, Gupta V, Gao Y, de Oliveira Marinho G, Hartford J, et al. Development and Assessment of an Artificial Intelligence-Based Tool for Skin Condition Diagnosis by Primary Care Physicians and Nurse Practitioners in Teledermatology Practices. JAMA Netw Open 2021 Apr 1;4(4):e217249.
  • 27 Parker M, Cunningham S, Enderby P, Hawley M, Green P. Automatic speech recognition and training for severely dysarthric users of assistive technology: The STARDUST project. Clin Linguist Phon 2006 Apr-May;20(2-3):149-56.
  • 28 Kokol P, Brumec V, Habjani A, Turk DM, Procter P, Nicklin L. Intelligent systems for nursing education. Stud Health Technol Inform 2001;84(Pt 2):1047-51.
  • 29 Fitzpatrick TB. The validity and practicality of sun-reactive skin types I through VI. Arch Dermatol 1988 Jun;124(6):869-71.

Correspondence to:

Charlene Esteban Ronquillo
University of British Columbia Okanagan
1147 Research Road, Kelowna BC, V1V 1V7
Canada   
Phone: +1 250 807 8332   

Publication History

Article published online:
02 June 2022

© 2022. IMIA and Thieme. This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonDerivative-NonCommercial License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

  • References

  • 1 Hamilton IA. An AI tool which reconstructed a pixelated picture of Barack Obama to look like a white man perfectly illustrates racial bias in algorithms. Business Insider; 2020 Jun 22. Available from: https://www.businessinsider.com/depixelator-turned-obama-white-illustrates-racial-bias-in-ai-2020-6
  • 2 Kino S, Hsu Y-T, Shiba K, Chien Y-S, Mita C, Kawachi I, et al. A scoping review on the use of machine learning in research on social determinants of health: Trends and research prospects. SSM Popul Health 2021 Jun 5;15:100836.
  • 3 Cook LA, Sachs J, Weiskopf NG. The quality of social determinants data in the electronic health record: a systematic review. J Am Med Inform Assoc 2021 Dec 28;29(1):187-96.
  • 4 Wang, T, Zhao J, Yatskar M, Chang K-W, Ordonez V. Balanced datasets are not enough: Estimating and mitigating gender bias in deep image representations. Proceedings of the IEEE/CVF International Conference on Computer Vision; 2019. p. 5310-9.
  • 5 Zhang BH, Lemoine B, Mitchell M. Mitigating unwanted biases with adversarial learning. Proceedings of the 2018 AAAI/ACM Conference on AI, Ethics, and Society; 2018. p. 335-40.
  • 6 Zhao Q, Adeli E, Pohl KM. Training confounder-free deep learning models for medical applications. Nat Commun 2020 Nov 26;11(1):6010.
  • 7 Benjamin R. Assessing risk, automating racism. Science 2019;366(6464):421-2.
  • 8 Thompson E, Ejdjoc R, Atchessi N, Striha M, Gabrani-Juma I, Dawson T. COVID-19: A case for the collection of race data in Canada and abroad. Can Commun Dis Rep 2021 Jul 8;47(7-8):300-4.
  • 9 Braveman P. What is Health Equity? And What Difference Does a Definition Make? National Collaborating Centre for Determinants of Health; 2017.
  • 10 Crenshaw K. Demarginalizing the intersection of race and sex: A Black feminist critique of antidiscrimination doctrine, feminist theory and antiracist politics. University of Chicago Legal Forum; 1989. p. 139-67.
  • 11 Collins PH. Black Feminist Thought: Knowledge, Consciousness, and the Politics of Empowerment. New York: Routledge; 1991.
  • 12 Cyrus K. Multiple minorities as multiply marginalized: Applying the minority stress theory to LGBTQ people of color. J Gay Lesbian Ment Health 2017;21(3):194-202.
  • 13 Vu M, Li J, Haardörfer R, Windle M, Berg CJ. Mental health and substance use among women and men at the intersections of identities and experiences of discrimination: insights from the intersectionality framework. BMC Public Health 2019 Jan 23;19(1):108.
  • 14 Allana S, Ski CF, Thompson DR, Clark AM. Intersectionality and heart failure: what clinicians and researchers should know and do. Curr Opin Support Palliat Care 2021 Jun 1;15(2):141-6.
  • 15 Naqvi JB, Helgeson VS, Gary-Webb TL, Korytkowski MT, Seltman HJ. Sex, race, and the role of relationships in diabetes health: intersectionality matters. J Behav Med 2020 Feb;43(1):69-79.
  • 16 Viruell-Fuentes EA, Miranda PY, Abdulrahim S. More than culture: Structural racism, intersectionality theory, and immigrant health. Soc Sci Med 2012 Dec;75(12):2099-106.
  • 17 Dobbins M. Rapid review guidebook. Natl Collab Cent Method Tools 2017;13:25.
  • 18 NIH National Library of Medicine. Searching PubMed Using MeSH Search Tags: National Institutes of Health; 2020 [updated 2019]. Available from: https://www.nlm.nih.gov/bsd/disted/meshtutorial/searchingpubmedusingmeshtags/index.html
  • 19 Campbell KA, Mackinnon K, Dobbins M, Jack SM. Nurse-Family Partnership and Geography: An Intersectional Perspective. Glob Qual Nurs Res 2020 Jan 21;7:2333393619900888.
  • 20 Bauer GR, Lizotte DJ. Artificial Intelligence, Intersectionality, and the Future of Public Health. Am J Public Health 2021 Jan;111(1):98-100.
  • 21 Nixon SA. The coin model of privilege and critical allyship: implications for health. BMC Public Health 2019 Dec 5;19(1):1637.
  • 22 Reeves RM, Christensen L, Brown JR, Conway M, Levis M, Gobbel GT, et al. Adaptation of an NLP system to a new healthcare environment to identify social determinants of health. J Biomed Inform 2021 Aug;120:103851.
  • 23 Hancock A-M. When Multiplication Doesn't Equal Quick Addition: Examining Intersectionality as a Research Paradigm. Perspectives on Politics 2007;5(1):63-79.
  • 24 McCall L. The Complexity of Intersectionality. Signs 2005;30(3):1771-800.
  • 25 Taylor B, Robertson D, Wiratunga N, Craw S, Mitchell D, Stewart E. Using computer aided case based reasoning to support clinical reasoning in community occupational therapy. Comput Methods Programs Biomed 2007 Aug;87(2):170-9.
  • 26 Jain A, Way D, Gupta V, Gao Y, de Oliveira Marinho G, Hartford J, et al. Development and Assessment of an Artificial Intelligence-Based Tool for Skin Condition Diagnosis by Primary Care Physicians and Nurse Practitioners in Teledermatology Practices. JAMA Netw Open 2021 Apr 1;4(4):e217249.
  • 27 Parker M, Cunningham S, Enderby P, Hawley M, Green P. Automatic speech recognition and training for severely dysarthric users of assistive technology: The STARDUST project. Clin Linguist Phon 2006 Apr-May;20(2-3):149-56.
  • 28 Kokol P, Brumec V, Habjani A, Turk DM, Procter P, Nicklin L. Intelligent systems for nursing education. Stud Health Technol Inform 2001;84(Pt 2):1047-51.
  • 29 Fitzpatrick TB. The validity and practicality of sun-reactive skin types I through VI. Arch Dermatol 1988 Jun;124(6):869-71.
  • 30 Takamine L, Forman J, Damschroder LJ, Youles B, Sussman J. Understanding providers' attitudes and key concerns toward incorporating CVD risk prediction into clinical practice: a qualitative study. BMC Health Serv Res 2021 Jun 7;21(1):561.
  • 31 Duron L, Ducarouge A, Gillibert A, Lainé J, Allouche C, Cherel N, et al. Assessment of an AI Aid in Detection of Adult Appendicular Skeletal Fractures by Emergency Physicians and Radiologists: A Multicenter Cross-sectional Diagnostic Study. Radiology 2021 Jul;300(1):120-9.
  • 32 Arnold J, Davis A, Fischhoff B, Yecies E, Grace J, Klobuka A, et al. Comparing the predictive ability of a commercial artificial intelligence early warning system with physician judgement for clinical deterioration in hospitalised general internal medicine patients: a prospective observational study. BMJ Open 2019 Oct 10;9(10):e032187.
  • 33 Anand V, Carroll AE, Biondich PG, Dugan TM, Downs SM. Pediatric decision support using adapted Arden Syntax. Artif Intell Med 2018 Nov;92:15-23.
  • 34 Canales C, Lee C, Cannesson M. Science without conscience is but the ruin of the soul: the ethics of big data and artificial intelligence in perioperative medicine. Anesth Analg 2020 May;130(5):1234-43.
  • 35 Clark CR, Wilkins CH, Rodriguez JA, Preininger AM, Harris J, DesAutels S, et al. Health Care Equity in the Use of Advanced Analytics and Artificial Intelligence Technologies in Primary Care. J Gen Intern Med 2021 Oct;36(10):3188-93.
  • 36 Marmot M, Allen J, Bell R, Bloomer E, Goldblatt P; Consortium for the European Review of Social Determinants of Health and the Health Divide. WHO European review of social determinants of health and the health divide. Lancet 2012 Sep 15;380(9846):1011-29.
  • 37 Cantor MN, Thorpe L. Integrating data on social determinants of health into electronic health records. Health Aff (Millwood) 2018 Apr;37(4):585-90.
  • 38 Haupeltshofer A, Egerer V, Seeling S. Promoting health literacy: What potential does nursing informatics offer to support older adults in the use of technology? A scoping review. Health Informatics J 2020 Dec;26(4):2707-21.
  • 39 Wang M, Pantell MS, Gottlieb LM, Adler-Milstein J. Documentation and review of social determinants of health data in the EHR: measures and associated insights. J Am Med Inform Assoc 2021 Nov 25;28(12):2608-16.
  • 40 Topaz M, Ronquillo C, Peltonen LM, Pruinelli L, Sarmiento RF, Badger MK, et al. Nurse Informaticians Report Low Satisfaction and Multi-level Concerns with Electronic Health Records: Results from an International Survey. AMIA Annu Symp Proc 2017 Feb 10;2016:2016-25.
  • 41 Bako AT, Taylor HL, Wiley K, Jr, Zheng J, Walter-McCabe H, Kasthurirathne SN, et al. Using natural language processing to classify social work interventions. Am J Manag Care 2021 Jan 1;27(1):e24-e31.
  • 42 Topaz M, Lai K, Dowding D, Lei VJ, Zisberg A, Bowles KH, et al. Automated identification of wound information in clinical notes of patients with heart diseases: Developing and validating a natural language processing application. Int J Nurs Stud 2016 Dec;64:25-31.
  • 43 Ronquillo CE, Peltonen LM, Pruinelli L, Chu CH, Bakken S, Beduschi A, et al. Artificial intelligence in nursing: Priorities and opportunities from an international invitational think-tank of the Nursing and Artificial Intelligence Leadership Collaborative. J Adv Nurs 2021 Sep;77(9):3707-17.
  • 44 Barrera A, Gee C, Wood A, Gibson O, Bayley D, Geddes J. Introducing artificial intelligence in acute psychiatric inpatient care: qualitative study of its use to conduct nursing observations. Evid Based Ment Health 2020 Feb;23(1):34-38. Erratum in: Evid Based Ment Health 2021 May;24(2):ebmental-2019-300136corr1.
  • 45 Liao P-H, Hsu P-T, Chu W, Chu W-C. Applying artificial intelligence technology to support decision-making in nursing: A case study in Taiwan. Health Informatics J 2015 Jun;21(2):137-48.
  • 46 Berzin SC, Singer J, Chan C. Practice innovation through technology in the digital age: A grand challenge for social work. American Academy of Social Work & Social Welfare; 2015.
  • 47 Yadav R. Conversation with the Twenty-First Century Social Work: Some 'Post(s)' Perspectives. The British Journal of Social Work 2021.
  • 48 Liu L. Occupational therapy in the fourth industrial revolution. Canadian Journal of Occupational Therapy 2018;85(4):272-83.
  • 49 Huang L, Liu G. Functional motion detection based on artificial intelligence. The Journal of Supercomputing 2021:1-40.

Table 1 Health Care Professional and Artificial Intelligence MeSH Term and Literature Representation in PubMed