The Evolving Dyad: Navigating the Impact of Artificial Intelligence on the Doctor-Patient Relationship
Published: April 10, 2025
I. Introduction
Artificial intelligence (AI) – broadly defined as the capability of machines to imitate intelligent human behavior, including learning, reasoning, and problem-solving[1] – is rapidly emerging as a transformative force within healthcare. Powered by the increasing availability of vast healthcare datasets and significant progress in analytical techniques like machine learning (ML), deep learning, and natural language processing (NLP)[2], AI promises to revolutionize numerous aspects of medical practice. Applications include enhancing diagnostic accuracy in fields like radiology and pathology[1], streamlining administrative workflows[6], optimizing treatment planning[5], accelerating drug discovery[5], and improving patient monitoring and engagement.[2]
The potential benefits are substantial, including improved efficiency, reduced medical errors, enhanced patient safety, and the possibility of more personalized medicine.[10] A key hope is that AI can alleviate the significant administrative burdens and documentation demands that contribute to clinician burnout, thereby freeing up valuable time for more direct, meaningful interaction with patients.[6] This raises the prospect of strengthening the very foundation of effective healthcare: the doctor-patient relationship. This relationship, characterized by mutual trust, respect, and commitment, is crucial for positive health outcomes and patient experiences.[1]
However, the integration of AI into the clinical encounter is not without significant challenges and potential downsides. Concerns abound regarding the potential for AI to erode empathy, introduce biases that exacerbate health inequities, compromise patient privacy, create new liability quandaries, and fundamentally alter the dynamics of communication and decision-making.[9] The introduction of AI can be seen as adding a "third actor" to the traditional dyad, potentially disrupting established roles and responsibilities.[18] Public and professional opinions are mixed, reflecting both enthusiasm for AI's potential and significant apprehension about its risks.[21]
This report synthesizes current evidence on the multifaceted impact of AI on the doctor-patient relationship. It examines how AI is currently being implemented in clinical settings affecting this interaction, analyzes the potential positive and negative consequences, explores patient and professional perspectives, delves into the critical ethical considerations, assesses the influence on communication dynamics, and considers future trends. The central aim is to provide a nuanced understanding of how this powerful technology is reshaping a fundamental human connection in healthcare and to identify pathways for ensuring its responsible and beneficial integration.
II. Current AI Applications Impacting the Doctor-Patient Interaction
AI is manifesting in clinical settings through various applications that directly or indirectly influence the interaction between doctors and patients. These tools range from sophisticated diagnostic aids to administrative automation systems, each carrying implications for how care is delivered and experienced.
A. Diagnostic Assistance
AI algorithms, particularly deep learning models, are increasingly used to analyze medical images (radiology, pathology) and other diagnostic data.[1] Systems can highlight abnormalities on scans, potentially improving detection rates for conditions like cancer or diabetic retinopathy.[1] For example, AI can analyze retinal fundus photographs to diagnose diabetic retinopathy or predict patient characteristics beyond human capability.[4] In breast imaging, AI surpasses traditional computer-aided detection (CAD) and shows potential for automating screening mammography interpretation.[14] AI is also applied to analyze EHR data, sound patterns (like coughs for COVID-19 detection), and other inputs to assist in formulating differential diagnoses or predicting disease risk.[5] These tools can act as a "second opinion" for clinicians, potentially enhancing accuracy and identifying conditions earlier.[10] This directly impacts the diagnostic process shared between doctor and patient, influencing the information presented and the confidence in the diagnosis.
B. Treatment Recommendations & Personalized Medicine
AI systems can analyze vast amounts of scientific literature, clinical trial data, and individual patient data (including genomics) to suggest personalized treatment options.[1] For instance, AI can identify specific mutations in a patient's tumor and recommend targeted therapies.[1] Platforms like CURATE.AI create individual patient profiles to predict treatment outcomes based on intervention intensity.[5] Clinical Decision Support Systems (CDSSs) powered by AI can provide evidence-based recommendations at the point of care, assisting clinicians in treatment planning and medication management.[4] AI can also predict patient responses to therapies or identify patients at high risk for adverse events or readmission, allowing for proactive interventions.[9] This capability enhances the potential for shared decision-making by providing more tailored information but also raises questions about how these complex recommendations are communicated and integrated with patient preferences.[1]
C. Patient Monitoring & Engagement
AI facilitates remote patient monitoring through wearable devices and sensors, analyzing real-time data to detect changes in a patient's condition (e.g., cardiac activity, blood sugar, mobility) and alerting clinicians to potential issues.[9] AI-powered chatbots and virtual assistants are used for patient communication, answering common questions, providing health information, monitoring symptoms (e.g., post-surgery or during chemotherapy), scheduling appointments, and even triaging patients.[9] These tools aim to improve patient engagement, provide convenient access to information, and support self-management of chronic conditions.[2] For example, Northwell Health uses customized chatbots to monitor high-risk postpartum individuals and patients with chronic conditions like heart failure.[33] While potentially improving access and convenience, these tools mediate the communication traditionally handled directly by clinicians, changing the nature and frequency of interaction.
D. Administrative Task Automation
A significant driver for AI adoption is its potential to reduce the administrative burden on clinicians.[6] AI tools, particularly those using NLP and generative AI, can automate tasks like:
- Clinical Documentation: Ambient listening systems can record doctor-patient conversations and automatically generate clinical notes, summaries, and billing codes, reducing the need for manual data entry.[6] This promises to free clinicians from being "typists" and allow more face-to-face engagement.[6]
- EHR Management: AI can help organize and summarize information within complex EHRs.[23]
- Scheduling and Billing: Automation of routine administrative processes.[34]
- Prior Authorization: AI can streamline the often cumbersome process of obtaining insurance pre-approvals.[21]
- Patient Messaging: Generative AI can draft responses to patient messages in online portals, which physicians can then review and personalize.[31]
By reducing time spent on these tasks, AI aims to decrease burnout and create more time for direct patient care and relationship building.[6]
E. Table 1: AI Applications & Potential Relationship Impact
AI Application Category | Examples | Potential Positive Relationship Impact | Potential Negative Relationship Impact | Key Sources |
---|---|---|---|---|
Diagnostic Assistance | Image analysis (radiology, pathology), risk prediction, differential diagnosis support | Increased diagnostic accuracy leading to greater patient trust; Faster diagnosis allowing more time for discussion; Facilitates shared understanding of condition. | Over-reliance leading to reduced clinician engagement; "Black box" nature hindering explanation and trust; Potential for bias leading to misdiagnosis and mistrust; Patient anxiety if AI findings contradict clinician assessment. | [1], [10] |
Treatment Recommendations | Personalized therapy suggestions (genomics), CDSS providing evidence-based options, outcome prediction | Enables more personalized care plans aligned with patient specifics; Provides richer data for shared decision-making (SDM); Increases patient confidence in treatment plan. | AI recommendations overriding patient values/preferences (digital paternalism); Difficulty explaining complex AI rationale; Reduced clinician autonomy impacting trust; Bias in recommendations affecting equity. | [1], [14] |
Patient Monitoring/Engagement | Remote monitoring via wearables, AI chatbots for communication/triage, symptom checkers, adherence tools | Improved convenience and access for patients; Proactive identification of issues enhancing safety; Increased patient involvement in self-management. | Reduced face-to-face interaction leading to depersonalization; Communication barriers with chatbots; Privacy concerns with continuous monitoring; Potential for anxiety due to constant data/alerts. | [9], [2], [33] |
Administrative Automation | Ambient documentation, automated notes/billing, prior authorization streamlining, AI-drafted messages | Frees up clinician time for direct patient interaction and empathy; Reduces clinician burnout, improving presence during visits; Potentially improves communication quality. | Concerns about whether patient data recorded for documentation remains secure and private; AI-generated notes may lack nuance or contain errors; Efficiency gains may not translate into more patient time if used to increase throughput. | [6], [21], [31] |
V. Stakeholder Perspectives on AI's Role
Understanding the attitudes and experiences of both patients and healthcare professionals is crucial for navigating the integration of AI into the clinical relationship. Perspectives vary significantly, influenced by factors like familiarity with AI, specific use cases, demographic background, and professional roles.
A. Patient Attitudes and Experiences
Public and patient perspectives on AI in healthcare are characterized by a mix of cautious optimism and significant apprehension.
- General Comfort Levels: Surveys indicate widespread discomfort. A 2022 Pew Research Center survey found 60% of US adults would be uncomfortable if their provider relied on AI for diagnosis and treatment, with only 39% feeling comfortable[22]. This discomfort is more pronounced among women (66% uncomfortable) and older adults[22]. A 2024 survey showed 60% of patients felt uncomfortable[59].
- Expectations of Benefit: Expectations are generally low but linked to trust. A 2024/2025 survey found only about 20% of US adults expected AI to improve their relationship with their doctor or make care more affordable, though 30% expected improved access[54].
- Impact on Relationship and Trust: A major concern is the potential negative impact on the patient-provider relationship; 57% of US adults believe AI use would make the relationship worse[22]. Qualitative studies echo this, highlighting the perceived importance of the human relationship and fears of impersonality[56].
- Concerns: Key concerns include privacy and data security[22], potential for errors[56], lack of empathy[56], and algorithmic bias[22].
- Bias Perception: While acknowledging that bias exists, many patients believe AI could reduce racial/ethnic bias in care rather than worsen it, perhaps because they perceive AI as more objective[22]. However, Black adults are more likely to see such bias as a major problem[22].
Overall, patients recognize potential benefits but harbor significant reservations, particularly regarding the relational and ethical aspects of AI integration. Building patient trust appears contingent on demonstrating clear benefits, ensuring transparency, protecting privacy, and maintaining the human element in care[54].
B. Healthcare Professional Perspectives (Physicians, Nurses, AHPs)
Healthcare professionals exhibit a blend of enthusiasm, particularly for efficiency gains, and significant concerns regarding AI's impact on practice, patient safety, and their relationship with patients.
- Physician Perspectives: Adoption is rising rapidly, from 38% in 2023 to 66% in 2024[36]. Enthusiasm is growing, especially for AI's potential to reduce administrative burdens[6]. Major concerns include the impact on the patient relationship, privacy, accuracy, liability, and the risk of AI overriding clinical judgment[21]. Physicians want transparency, proof of efficacy, and training[21].
- Nursing Perspectives: Current use and knowledge are generally low[64], and ambivalence is widespread[42]. Nurses see potential efficiency gains but voice profound concerns about patient safety and care quality (the top concern, cited by 73%[42]), loss of human intuition, accuracy, accountability, deskilling, and damage to the nurse-patient relationship[27]. They strongly desire education[64].
- Allied Health Professionals (AHPs): Current knowledge is low and skills gaps are a concern, but AHPs want training and see potential efficiency improvements[64].
AI implementation strategies must explicitly address the deep-seated concerns of nurses regarding care integrity and the human element, rather than solely focusing on physician efficiency or diagnostic accuracy.
Feature | Physicians | Nurses | Allied Health Professionals (AHPs) |
---|---|---|---|
Adoption Rate/ Sentiment | Rapidly increasing adoption (38% '23 -> 66% '24)[36]; Guarded enthusiasm, growing optimism[21]; Positive attitude despite low knowledge in some groups.[63] | Generally low current use/knowledge[64]; Ambivalence common (78% neutral/slight inclination)[42]; Early/mid-career more receptive[42]; Positive attitude despite low knowledge.[63] | Low current knowledge/preparedness.[64] |
Perceived Benefits | Primary: Reduce admin burden (docs, prior auth)[6]; Diagnostic support[26]; Improve efficiency, care coordination, safety.[21] | Primary: Improve efficiency, reduce admin burden (docs, scheduling)[27]; Augment decision support (deterioration, meds)[27]; Increase time for direct patient care.[64] | Improve efficiency via non-clinical/admin tasks; Support patient assessment.[64] |
Major Concerns | Relationship impact[21]; Privacy/Security[21]; Accuracy/Reliability[16]; Liability[21]; Overriding judgment[21]; Workflow integration[26]; Bias.[26] | Primary: Patient safety/care quality impact (73% top concern)[42]; Lack of human intuition/holistic view[42]; Accuracy/Bias/Accountability[42]; Erosion of profession (agency, skills, connection)[27]; Increased workload justification.[42] | Workforce knowledge/skills deficit[64]; Losing valued tasks.[64] |
Key Needs | Transparency (how decisions made)[21]; Validation/Efficacy proof[21]; Regulation/Oversight[36]; Training[21]; Privacy assurances[36]; Liability limits[21]; Human intervention points.[21] | Strong desire for education/training[64]; AI Literacy[65]; Ethical frameworks/Data protection policies[27]; Nurse involvement in governance/design[27]; Preference for admin automation.[42] | Desire for training.[64] |
Key Sources | [6], [16], [21], [26], [36], [63] | [27], [42], [63], [64], [65] | [64] |
IX. Conclusion and Recommendations
A. Synthesis of Findings
The integration of Artificial Intelligence into healthcare presents a complex and paradoxical potential for the doctor-patient relationship. On one hand, AI offers powerful tools capable of enhancing diagnostic accuracy, enabling highly personalized treatment plans, improving patient safety, streamlining burdensome administrative tasks, and potentially freeing up clinician time for more meaningful interaction. Success stories are emerging, particularly in diagnostics and communication support.[23]
On the other hand, AI introduces significant risks that could fundamentally undermine the relationship. These include the potential for reduced empathy and dehumanization of care, serious threats to patient data privacy and security, the danger of algorithmic bias exacerbating health inequities, and the erosion of patient and professional trust due to opaque decision-making or errors. The introduction of AI as a "third actor" alters communication dynamics and complicates shared decision-making, with potential to both facilitate and hinder collaboration.
Stakeholder perspectives reflect this duality. Patients express significant discomfort and concern, particularly regarding the potential loss of human connection and data privacy, although some see potential benefits in efficiency and bias reduction. Healthcare professionals show growing adoption and enthusiasm, especially physicians regarding administrative relief, but harbor concerns about liability, workflow integration, accuracy, and relationship impacts. Nurses, while seeing efficiency potential, express profound concerns about AI's impact on patient safety, care quality, and the integrity of their professional practice, including the essential human connection. Across the board, the need for robust ethical governance, transparency, accountability, and comprehensive training is paramount, yet these areas often lag behind technological development.
B. Overarching Conclusion
Artificial Intelligence will inevitably reshape the landscape of healthcare and, consequently, the doctor-patient relationship. The technology holds the potential to augment clinical capabilities significantly, potentially leading to a future where clinicians, freed from routine tasks, can dedicate more time and cognitive energy to the humanistic aspects of care – fostering deeper trust, empathy, and collaborative decision-making. However, this positive outcome is not guaranteed. Without careful, deliberate, and ethically grounded implementation, AI also risks creating a healthcare system that is more impersonal, potentially biased, less transparent, and ultimately less trusted by both patients and providers. The future trajectory is not technologically determined; it depends critically on the proactive choices and actions taken today by all stakeholders to prioritize human values alongside technological advancement.
C. Recommendations
To navigate the integration of AI in a way that strengthens, rather than weakens, the doctor-patient relationship, a concerted effort is required from all involved parties:
- For Technology Developers: Prioritize Transparency (XAI)[17]; Design for Augmentation[13]; Mitigate Bias Actively[45]; Embed Privacy & Security[7]; Co-design with End-Users[26]; Focus on Patient Benefit[54].
- For Healthcare Organizations & Leaders: Strategic & Phased Implementation[34]; Invest in Workforce Training[13]; Adapt Workflows Purposefully[1]; Establish Clear Governance[36]; Foster Collaboration[34]; Engage Patients[72]; Communicate Clearly with Staff[42].
- For Clinicians (Doctors, Nurses, and other HCPs): Embrace Lifelong Learning[68]; Maintain Critical Judgment[13]; Communicate Transparently with Patients[7]; Hone Humanistic Skills[1]; Advocate for Ethical AI.
- For Policymakers & Regulators: Develop Agile Regulatory Frameworks[9]; Establish Standards[45]; Clarify Liability[16]; Incentivize Value and Equity[21]; Support Research.
- For Patients & the Public: Engage and Inquire[72]; Advocate for Rights.
D. Final Thought
The successful integration of artificial intelligence into healthcare ultimately depends on recognizing and upholding the irreplaceable value of the human relationship at the heart of healing. Technology must be employed as a means to enhance this connection and support compassionate, equitable, and trustworthy care, ensuring that efficiency and data do not overshadow empathy and human judgment.
References
- Paranjape, K., Fiske, A., & Bhavnani, S. P. (2020). How Will Artificial Intelligence Affect Patient-Clinician Relationships?. AMA Journal of Ethics, 22(5), E409-415. https://journalofethics.ama-assn.org/article/how-will-artificial-intelligence-affect-patient-clinician-relationships/2020-05
- Javaid, M., Haleem, A., & Singh, R. P. (2023). A Review of the Role of Artificial Intelligence in Healthcare. International Journal of Environmental Research and Public Health, 20(13), 6291. https://pmc.ncbi.nlm.nih.gov/articles/PMC10301994/
- Jiang, F., Jiang, Y., Zhi, H., et al. (2017). Artificial intelligence in healthcare: past, present and future. Stroke and Vascular Neurology, 2(4), 230-243. https://pubmed.ncbi.nlm.nih.gov/29507784/
- Asan, O., Bayrak, A. E., & Choudhury, A. (2024). The Effect of Artificial Intelligence on Patient-Physician Trust: Cross-Sectional Vignette Study. Journal of Medical Internet Research, 26, e50853. https://www.jmir.org/2024/1/e50853/
- Javaid, M., Haleem, A., Singh, R. P., et al. (2023). Advancing Patient Care: How Artificial Intelligence Is Transforming Healthcare. Healthcare (Basel), 11(17), 2405. https://pmc.ncbi.nlm.nih.gov/articles/PMC10455458/
- LSVP. (n.d.). How generative AI can improve the doctor-patient relationship. Retrieved April 10, 2025, from https://lsvp.com/stories/how-generative-ai-can-improve-the-doctor-patient-relationship/
- Forbes Business Council. (2023, September 21). AI And The Doctor-Patient Relationship: What Physicians Should Consider. Forbes. Retrieved April 10, 2025, from https://www.forbes.com/councils/forbesbusinesscouncil/2023/09/21/ai-and-the-doctor-patient-relationship-what-physicians-should-consider/
- Liberati, E., Rognoni, C., et al. (2024). Artificial Intelligence and Decision-Making in Healthcare: A Thematic Analysis of a Systematic Review of Reviews. International Journal of Environmental Research and Public Health, 21(3), 281. https://pmc.ncbi.nlm.nih.gov/articles/PMC10916499/
- Council of Europe. (2022). Report on the Impact of Artificial Intelligence on the Doctor-Patient Relationship (INF(2022)5). https://rm.coe.int/inf-2022-5-report-impact-of-ai-on-doctor-patient-relations-e/1680a68859
- Powell, A. (2025, March 4). How AI is transforming medicine. Harvard Gazette. https://news.harvard.edu/gazette/story/2025/03/how-ai-is-transforming-medicine-healthcare/
- Panch, T., Pearson-Stuttard, J., Greaves, F., et al. (2024). Benefits and Risks of AI in Health Care: Narrative Review. Journal of Medical Internet Research, 26, e57322. https://pmc.ncbi.nlm.nih.gov/articles/PMC11612599/
- Agency for Healthcare Research and Quality (AHRQ). (n.d.). Artificial Intelligence and Patient Safety: Promise and Challenges. PSNet. Retrieved April 10, 2025, from https://psnet.ahrq.gov/perspective/artificial-intelligence-and-patient-safety-promise-and-challenges
- Carter, S. M., Rogers, W., Entwistle, V., et al. (2023). The impact of artificial intelligence on the person-centred, doctor-patient relationship: some problems and solutions. Journal of Medical Ethics, 49(4), 245-253. https://pmc.ncbi.nlm.nih.gov/articles/PMC10116477/
- Pesapane, F., Codari, M., & Sardanelli, F. (2018). The Doctor-Patient Relationship With Artificial Intelligence. American Journal of Roentgenology, 211(6), W314. https://ajronline.org/doi/10.2214/AJR.18.20509
- Carter, S. M., Rogers, W., Entwistle, V., et al. (2023). The impact of artificial intelligence on the person-centred, doctor-patient relationship: some problems and solutions. ResearchGate. [Preprint or published version]. https://www.researchgate.net/publication/370152290_The_impact_of_artificial_intelligence_on_the_person-centred_doctor-patient_relationship_some_problems_and_solutions
- Antoniou, G. A., Antoniou, C. Z., & Georgakarakos, E. I. (2024). Risks of Artificial Intelligence (AI) in Medicine. Pneumon, 37(1). https://www.pneumon.org/Risks-of-Artificial-Intelligence-AI-in-Medicine,191736,0,2.html
- McCradden, M. D., Joshi, S., Anderson, J. A., et al. (2020). Trust and medical AI: the challenges we face and the expertise needed to overcome them. BMJ Health & Care Informatics, 27(1), e100109. https://pmc.ncbi.nlm.nih.gov/articles/PMC7973477/
- University of Oxford. (2022, June 7). AI standards essential to protect doctor-patient relationships and human rights. https://www.ox.ac.uk/news/2022-06-07-ai-standards-essential-protect-doctor-patient-relationships-and-human-rights
- Fiske, A., Prainsack, B., & Buyx, A. (2022). Understanding the impact of Artificial Intelligence on physician-patient relationship: a revisitation of conventional relationship models in the light of new technological frontiers. Medicina Historica, 6(3), e2022030. https://mattioli1885journals.com/index.php/MedHistor/article/view/13601
- Pelaccia, T., & Forestier, G. (2020). A “Third Wheel” Effect in Health Decision Making Involving Artificial Entities: A Psychological Perspective. Frontiers in Public Health, 8, 117. https://www.frontiersin.org/journals/public-health/articles/10.3389/fpubh.2020.00117/full
- American Medical Association (AMA). (n.d.). Big majority of doctors see upsides to using health care AI. Retrieved April 10, 2025, from https://www.ama-assn.org/practice-management/digital/big-majority-doctors-see-upsides-using-health-care-ai
- Funk, C., & Tyson, A. (2023, February 22). 60% of Americans Would Be Uncomfortable With Provider Relying on AI in Their Own Health Care. Pew Research Center. https://www.pewresearch.org/science/2023/02/22/60-of-americans-would-be-uncomfortable-with-provider-relying-on-ai-in-their-own-health-care/
- Yu, K. H., Beam, A. L., & Kohane, I. S. (2018). Artificial intelligence in healthcare: transforming the practice of medicine. Genome Medicine, 10(1), 92. https://pmc.ncbi.nlm.nih.gov/articles/PMC8285156/
- Fiske, A., Prainsack, B., & Buyx, A. (2023). Doctor-patient interactions in the age of AI: navigating innovation and expertise. Frontiers in Medicine, 10, 1241508. https://pmc.ncbi.nlm.nih.gov/articles/PMC10498385/
- Kelly, C. J., Karthikesalingam, A., Suleyman, M., et al. (2019). Opportunities and challenges of artificial intelligence in the medical field: current application, emerging problems, and problem-solving strategies. BMJ, 366, l4885. https://pmc.ncbi.nlm.nih.gov/articles/PMC8165857/
- Fiske, A., Paltz, M. C., Aeschbacher, S., et al. (2024). Navigating the doctor-patient-AI relationship - a mixed-methods study of physician attitudes toward artificial intelligence in primary care. BMC Primary Care, 25(1), 30. https://pmc.ncbi.nlm.nih.gov/articles/PMC10821550/
- Ronquillo, C. E., Peltonen, L. M., Pruinelli, L., et al. (2024). The integration of AI in nursing: addressing current applications ... BMC Nursing, 23(1), 113. https://pmc.ncbi.nlm.nih.gov/articles/PMC11850350/
- Malik, J., Anwaar, I., Gillani, S. A., et al. (2023). Revolutionizing healthcare: the role of artificial intelligence in clinical practice. BMC Medical Education, 23(1), 674. https://pmc.ncbi.nlm.nih.gov/articles/PMC10517477/
- Légaré, F., Stacey, D., Forest, P. G., et al. (2022). Application of Artificial Intelligence in Shared Decision Making: Scoping Review. Journal of Medical Internet Research, 24(8), e38576. https://pmc.ncbi.nlm.nih.gov/articles/PMC9399841/
- Boston Consulting Group (BCG). (n.d.). How Digital & AI Will Reshape Health Care in 2025. Retrieved April 10, 2025, from https://www.bcg.com/publications/2025/digital-ai-solutions-reshape-health-care-2025
- Healthcare Executive. (2024, March/April). Improving Patient Care and Safety With AI. Healthcare Executive. https://healthcareexecutive.org/archives/march-april-2024/improving-patient-care--and-safety-with-ai
- Fiske, A., Paltz, M. C., Buyx, A., et al. (2023). Critical analysis of the AI impact on the patient–physician relationship: A multi-stakeholder qualitative study. PLOS Digital Health, 2(12), e0000403. https://pmc.ncbi.nlm.nih.gov/articles/PMC10734361/
- Berg, S. (2024, August 16). How AI is helping doctors communicate with patients. AAMC. https://www.aamc.org/news/how-ai-helping-doctors-communicate-patients
- Holt Law. (n.d.). AI in Healthcare: Early Success Stories and Lessons Learned from Leading Health Systems. Retrieved April 10, 2025, from https://djholtlaw.com/ai-in-healthcare-early-success-stories-and-lessons-learned-from-leading-health-systems/
- Lumen. (n.d.). Transforming Healthcare: The Role Of AI In Enhancing The Patient Experience. Retrieved April 10, 2025, from https://blog.lumen.com/transforming-healthcare-the-role-of-ai-in-enhancing-the-patient-experience/
- Robeznieks, A. (2024, September 10). 2 in 3 physicians are using health AI—up 78% from 2023. American Medical Association. https://www.ama-assn.org/practice-management/digital/2-3-physicians-are-using-health-ai-78-2023
- McKinsey & Company. (2024, May 22). Tackling healthcare's biggest burdens with generative AI. https://www.mckinsey.com/industries/healthcare/our-insights/tackling-healthcares-biggest-burdens-with-generative-ai
- Performance Health Partners. (n.d.). How AI Enhances Doctor-Patient Communication. Retrieved April 10, 2025, from https://www.performancehealthus.com/blog/how-ai-enhances-doctor-patient-communication
- Gille, F., Jobin, A., & Biller-Andorno, N. (2024). Do patients prefer a human doctor, artificial intelligence, or a blend, and is this preference dependent on medical discipline? Empirical evidence and implications for medical practice. Frontiers in Psychology, 15, 1422177. https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2024.1422177/full
- Richens, J. G., Lee, C. M., & Johri, S. (2020). The downsides of artificial intelligence in healthcare. BMJ, 371, m4492. https://pmc.ncbi.nlm.nih.gov/articles/PMC10764219/
- Reddy, S., Allan, S., Coghlan, S., et al. (2019). Ethical Issues of Artificial Intelligence in Medicine and Healthcare. Journal of Medical Internet Research, 22(11), e16823. https://pmc.ncbi.nlm.nih.gov/articles/PMC8826344/
- The Health Management Academy. (n.d.). What RNs Think about AI—And How Health System Leaders Should Communicate. Retrieved April 10, 2025, from https://hmacademy.com/insights/all-access/health-technology/what-rns-think-about-ai-and-how-health-system-leaders-should-communicate
- Nadarzynski, T., Miles, O., Cowie, A., et al. (2021). Patients' Perceptions Toward Human–Artificial Intelligence Interaction in Health Care: Experimental Study. Journal of Medical Internet Research, 23(11), e25856. https://www.jmir.org/2021/11/e25856/
- HITRUST Alliance. (n.d.). The Pros and Cons of AI in Healthcare. Retrieved April 10, 2025, from https://hitrustalliance.net/blog/the-pros-and-cons-of-ai-in-healthcare
- Javaid, M., Haleem, A., & Singh, R. P. (2023). Ethical implications of AI and robotics in healthcare: A review. Journal of Clinical Medicine, 12(24), 7698. https://pmc.ncbi.nlm.nih.gov/articles/PMC10727550/
- PRS Global. (n.d.). 6 Common Healthcare AI Mistakes. Retrieved April 10, 2025, from https://prsglobal.com/blog/6-common-healthcare-ai-mistakes
- Cascella, M., Montomoli, J., Bellini, V., et al. (2023). Ethical Considerations of Using ChatGPT in Health Care. Journal of Personalized Medicine, 13(8), 1266. https://pmc.ncbi.nlm.nih.gov/articles/PMC10457697/
- Stanford HAI. (n.d.). Who's at Fault when AI Fails in Health Care?. Retrieved April 10, 2025, from https://hai.stanford.edu/news/whos-fault-when-ai-fails-health-care
- Haleem, A., Javaid, M., Singh, R. P., et al. (2024). Ethical Considerations in the Use of Artificial Intelligence and Machine Learning in Health Care: A Comprehensive Review. Journal of Clinical Medicine, 13(13), 3863. https://pmc.ncbi.nlm.nih.gov/articles/PMC11249277/
- Council of Europe. (n.d.). V. Potential impact of ai on the doctor-patient relationship. Human Rights and Biomedicine. Retrieved April 10, 2025, from https://www.coe.int/en/web/human-rights-and-biomedicine/potential-impact-of-ai-on-the-doctor-patient-relationship
- Brookings Institution. (2023, October 17). Risks and remedies for artificial intelligence in health care. https://www.brookings.edu/articles/risks-and-remedies-for-artificial-intelligence-in-health-care/
- Oatmeal Health. (n.d.). Why AI in Healthcare Has Failed in 2024. Retrieved April 10, 2025, from https://oatmealhealth.com/why-has-ai-failed-so-far-in-healthcare-despite-billions-of-investment/
- Chen, M., Pourhomayoun, M., & Feyisa, D. (2023). Fairness of artificial intelligence in healthcare: review and recommendations. BMC Medical Informatics and Decision Making, 23(1), 288. https://pmc.ncbi.nlm.nih.gov/articles/PMC10764412/
- McCradden, M. D., Anderson, J. A., & Stephenson, E. A. (2025). Expectations of healthcare AI and the role of trust: understanding patient views on how AI will impact cost, access, and patient-provider relationships. Journal of the American Medical Informatics Association, ocaf031. [Advance article]. https://academic.oup.com/jamia/advance-article/doi/10.1093/jamia/ocaf031/8046745
- McCradden, M. D., Harrison, R., Anderson, J. A., et al. (2024). Addressing ethical issues in healthcare artificial intelligence using a lifecycle-informed process. JAMIA Open, 7(4), ooae108. https://academic.oup.com/jamiaopen/article/7/4/ooae108/7901079
- Blease, C., Kharko, A., Annoni, M., et al. (2025). Artificial Intelligence in Medical Care – Patients' Perceptions on Caregiving Relationships and Ethics: A Qualitative Study. Journal of General Internal Medicine. https://pmc.ncbi.nlm.nih.gov/articles/PMC11911933/
- Gerke, S., Babic, B., Evgeniou, T., et al. (2023). Artificial intelligence and the doctor-patient relationship expanding the paradigm of shared decision making. ResearchGate. [Preprint or published version]. https://www.researchgate.net/publication/369531267_Artificial_intelligence_and_the_doctor-patient_relationship_expanding_the_paradigm_of_shared_decision_making
- Kerasidou, A., Bærøe, K., Biller-Andorno, N., et al. (2025). Attitudes toward artificial intelligence and robots in healthcare in the general population: a qualitative study. BMC Medical Ethics, 26(1), 23. https://pmc.ncbi.nlm.nih.gov/articles/PMC11808042/
- AIPRM. (n.d.). 50+ AI in Healthcare Statistics 2024. Retrieved April 10, 2025, from https://www.aiprm.com/ai-in-healthcare-statistics/
- Gille, F., Jobin, A., & Biller-Andorno, N. (2025). Changes in public perception of AI in healthcare after exposure to ChatGPT. medRxiv. [Preprint]. https://www.medrxiv.org/content/10.1101/2025.01.23.25321048v1.full-text
- Wolters Kluwer. (n.d.). The future of generative AI in healthcare is driven by consumer trust. Retrieved April 10, 2025, from https://www.wolterskluwer.com/en/expert-insights/the-future-of-generative-ai-in-healthcare-is-driven-by-consumer-trust
- RevSpring. (n.d.). Patients Welcome AI in Healthcare: Survey Shows Preference for Faster, More Convenient Service. Retrieved April 10, 2025, from https://revspringinc.com/resources/blog/patients-welcome-ai-in-healthcare/
- Fiske, A., Paltz, M. C., Aeschbacher, S., et al. (2025). Attitudes and expectations toward artificial intelligence among Swiss primary care physicians: a cross-sectional survey study 2024. medRxiv. [Preprint]. https://www.medrxiv.org/content/10.1101/2025.03.22.25324458v1
- Gibson, R., Boutlis, C. S., Livergant, J., et al. (2024). Allied Health Professionals' Perceptions of Artificial Intelligence in the Clinical Setting: Cross-Sectional Survey. JMIR Formative Research, 8, e57204. https://formative.jmir.org/2024/1/e57204
- Ronquillo, C. E., Peltonen, L. M., Pruinelli, L., et al. (2025). Artificial intelligence in nursing: an integrative review of clinical and operational impacts. Frontiers in Digital Health, 7, 1552372. https://www.frontiersin.org/journals/digital-health/articles/10.3389/fdgth.2025.1552372/full
- Panch, T., Duralde, E. R., Kotecha, G., et al. (2024). Navigating ethical considerations in the use of artificial intelligence for patient care: A systematic review. Journal of Medical Ethics. [Online ahead of print]. https://pubmed.ncbi.nlm.nih.gov/39545614/
- Taylor & Francis Online. (2025). Clinical perspectives on AI integration: assessing readiness and training needs among healthcare practitioners. Informatics for Health and Social Care. [Online ahead of print]. https://www.tandfonline.com/doi/full/10.1080/12460125.2025.2458874
- Alshammari, R., Alanazi, A., Alshammari, N., et al. (2024). Artificial Intelligence in Primary Care Decision-Making: Survey of Healthcare Professionals in Saudi Arabia. Cureus, 16(11), e69176. https://www.cureus.com/articles/361066-artificial-intelligence-in-primary-care-decision-making-survey-of-healthcare-professionals-in-saudi-arabia
- Centers for Disease Control and Prevention (CDC). (2024). Health Equity and Ethical Considerations in Using Artificial Intelligence in Public Health and Medicine. Preventing Chronic Disease, 21, E91. https://pmc.ncbi.nlm.nih.gov/articles/PMC11364282/
- Kim, B., Nyholm, S., & Hyun, I. (2023). Ethical considerations for the use of artificial intelligence in medical decision-making capacity assessments. Journal of Medical Ethics, 49(10), 699-704. https://pubmed.ncbi.nlm.nih.gov/37717548/
- Saxena, A., & Miller, J. A. (2022). Legal and Ethical Consideration in Artificial Intelligence in Healthcare: Who Takes Responsibility?. Cureus, 14(3), e23638. https://pmc.ncbi.nlm.nih.gov/articles/PMC8963864/
- The Academy of Medical Sciences. (n.d.). AI in healthcare: learning from success stories. Retrieved April 10, 2025, from https://acmedsci.ac.uk/more/news/ai-in-healthcare-learning-from-success-stories
- Fiske, A., Paltz, M. C., Buyx, A., et al. (2024). Prospectively investigating the impact of AI on shared decision-making in post kidney transplant care (PRIMA-AI): protocol for a longitudinal qualitative study among patients, their support persons and treating physicians at a tertiary care centre. BMJ Open, 14(10), e081318. https://bmjopen.bmj.com/content/14/10/e081318
- UCSF School of Medicine. (n.d.). Harnessing Generative AI to Improve Patient Communication and Reduce Disparities. Retrieved April 10, 2025, from https://medschool.ucsf.edu/news/harnessing-generative-ai-improve-patient-communication-and-reduce-disparities
- YouTube. (n.d.). Transforming Patient Communication with AI: A Small Practice Success Story. [Video]. Retrieved April 10, 2025, from https://www.youtube.com/watch?v=JjunWVRy61Y
- Fiske, A., Prainsack, B., & Buyx, A. (2023). Doctor-patient interactions in the age of AI: navigating innovation and expertise. Frontiers in Medicine, 10, 1241508. https://www.frontiersin.org/journals/medicine/articles/10.3389/fmed.2023.1241508/full
- WillowTree Apps. (n.d.). Generative AI in Healthcare: 3 Real-World Examples. Retrieved April 10, 2025, from https://www.willowtreeapps.com/insights/generative-ai-in-healthcare
- University of Nevada, Reno. (2025, January 23). AI-driven research uncovers how physician media choice shapes online patient experience. Nevada Today. https://www.unr.edu/nevada-today/news/2025/ai-physician-communication
- McKinsey & Company. (2024, February 27). Harnessing AI to reshape consumer experiences in healthcare. https://www.mckinsey.com/industries/healthcare/our-insights/harnessing-ai-to-reshape-consumer-experiences-in-healthcare
- McKinsey & Company. (2024, May 22). Generative AI in healthcare: Current trends and future outlook. https://www.mckinsey.com/industries/healthcare/our-insights/generative-ai-in-healthcare-current-trends-and-future-outlook
- California Health Care Foundation (CHCF). (n.d.). AI and the Future of Health Care. Retrieved April 10, 2025, from https://www.chcf.org/blog/ai-future-health-care/
- Javaid, M., Haleem, A., Singh, R. P., et al. (2024). The Role of AI in Hospitals and Clinics: Transforming Healthcare in the 21st Century. Healthcare (Basel), 12(8), 871. https://pmc.ncbi.nlm.nih.gov/articles/PMC11047988/
- Press Ganey. (n.d.). Patient experience in 2024: Bridging the gap in patient care journeys. Retrieved April 10, 2025, from https://info.pressganey.com/press-ganey-blog-healthcare-experience-insights/patient-experience-in-2024-bridging-the-gap