Navigating the doctor-patient-AI relationship - a mixed-methods study of physician attitudes toward artificial intelligence in primary care

Abstract

Background

Artificial intelligence (AI) is a rapidly advancing field that is beginning to enter the practice of medicine. Primary care is a cornerstone of medicine and faces challenges such as physician shortages and burnout, which impact patient care. AI and its application via digital health are increasingly presented as a possible solution. However, there is a scarcity of research focusing on primary care physician (PCP) attitudes toward AI. This study examines PCP views on AI in primary care. We explore its potential impact on topics pertinent to primary care such as the doctor-patient relationship and clinical workflow. By doing so, we aim to inform primary care stakeholders and encourage successful, equitable uptake of future AI tools. To our knowledge, our study is the first to explore PCP attitudes using specific primary care AI use cases rather than discussing AI in medicine in general terms.

Methods

From June to August 2023, we conducted a survey among 47 primary care physicians affiliated with a large academic health system in Southern California. The survey quantified attitudes toward AI in general as well as concerning two specific AI use cases. Additionally, we conducted interviews with 15 survey respondents.

Results

Our findings suggest that PCPs have largely positive views of AI. However, attitudes often hinged on the context of adoption. While some concerns reported by PCPs regarding AI in primary care focused on technology (accuracy, safety, bias), many focused on people-and-process factors (workflow, equity, reimbursement, doctor-patient relationship).

Conclusion

Our study offers nuanced insights into PCP attitudes towards AI in primary care and highlights the need for primary care stakeholder alignment on key issues raised by PCPs. AI initiatives that fail to address both the technological and people-and-process concerns raised by PCPs may struggle to make an impact.

Background

While the potential impact of AI in medicine has long been discussed, real-life, clinician-facing applications of AI have only recently become a reality [1,2,3]. AI-assisted chronic disease management, diagnostic support, and administrative work (such as documentation, billing, and patient messaging) have significant potential to improve medicine and to take some burden off physicians, allowing them to focus on physician-level patient care [4]. Further, use of clinical AI is part of a broader shift in medicine toward “digital health,” where many aspects of medical care are conducted remotely, mediated by a technological intermediary, leading to potential improvements in efficiency and access [5, 6]. These developments stand to make a substantial impact in primary care, a field currently grappling with high rates of physician burnout, inadequate compensation, and a growing shortage of physicians [7,8,9]. However, there is concern that if AI is poorly integrated it could exacerbate the “disconnect between professional values and the realities of primary care practice” [10, 11]. For example, despite the crucial role of the doctor-patient relationship in medicine, the impact of AI and digital health on this essential component of primary care remains underexplored [12,13,14].

With so much on the line, there is limited literature on PCP views toward AI [15, 16]. Much of the research that does exist has taken place in a purely theoretical context, exploring AI in general terms with physicians who had no experience using AI-powered systems. We propose that more end-user engagement with clinicians, discussing tangible, specific use cases of clinical AI, is needed [17]. By highlighting specific AI use cases, we hope to elicit concerns and attitudes that would remain hidden when discussing AI in general. Failure to engage end users in the design of AI-powered digital health tools leads to inefficient or unsuccessful integration of these tools into clinical workflow, added clinician burnout, and even patient harm [10, 18, 19].

Primary care, technology and health equity

Our study recognizes the potential of technology to exacerbate or ameliorate existing inequalities in healthcare [20,21,22,23,24]. AI systems are particularly at risk of worsening health inequity due to factors such as biased data becoming ingrained in AI systems or unequal distribution of newly developed AI tools [25]. Equity considerations are especially vital in the context of primary care. PCPs are often the first point of contact for patients and are central in providing healthcare to communities with limited access due to geographical, economic, or social factors [26,27,28,29,30]. PCPs also make up the largest potential group of AI end users among health professionals [9]. Despite the foundational nature of primary care, this field has long endured a lack of attention, resources, and recognition compared to other medical specialties [31,32,33]. This has contributed to a comparative lack of AI progress and implementation in primary care in spite of huge need and potential [9, 22, 34,35,36]. Accordingly, equity is a key consideration for AI in primary care.

Objective

In pursuit of user-centric design, we employed a mixed-methods approach to explore PCP attitudes regarding the potentially transformative influence of AI and the broader shift toward digitalization in primary care. Our initial aim is to inform primary care stakeholders of PCP apprehensions regarding potential adverse effects of AI in primary care. Our findings reveal pivotal factors that can either facilitate or hinder the integration of AI systems in primary care. Our long-term aim is to use these findings to develop the AI tools outlined in this manuscript with the goal of improving patient care. While this is not the first exploratory investigation of PCP attitudes about AI, to our knowledge it is the first study that extends beyond the theoretical realm, weaving in specific AI use cases and input from PCPs with real-world experience using primary care AI and digital health tools.

Methods

Participant engagement with AI and digital health

Our study participants are affiliated with an academic medical center (AMC) actively engaged in the development, pilot testing, or implementation of several AI applications within its healthcare system. Here, we spotlight specific use cases relevant to primary care, highlighting their pivotal role in the study. These use cases, characterized by their remote and asynchronous elements, also fall under the broader category of digital health [37].

AI-enhanced disease screening: obstructive sleep apnea

Acknowledging the growing prevalence of Obstructive Sleep Apnea (OSA) and its often-undetected status, a research team at our AMC identified OSA as a suitable target for AI-based disease screening [38]. This initiative builds upon prior research employing electronic health record (EHR) data to identify individuals at high risk of OSA [39, 40]. This use case is an archetype for multiple types of disease screening in primary care and raises important questions such as what to do with positive screening results from an AI tool run on a patient panel.
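To make concrete what running an AI screen on a patient panel entails, the following is a minimal illustrative sketch in Python. The feature names, scoring rule, and threshold are hypothetical stand-ins for the EHR-trained model referenced above, not the actual algorithm.

```python
# Hypothetical OSA screening sketch: a toy rule-based score standing in for
# the EHR-trained risk model. Features and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    patient_id: str
    bmi: float
    age: int
    hypertension: bool
    snoring_documented: bool

def osa_risk_flag(p: PatientRecord) -> bool:
    """Hypothetical high-risk rule; a real tool would use a trained model."""
    score = (p.bmi >= 35) + (p.age >= 50) + p.hypertension + p.snoring_documented
    return score >= 3

def screen_panel(panel):
    """Run the screen across a whole patient panel and return flagged IDs."""
    return [p.patient_id for p in panel if osa_risk_flag(p)]

panel = [
    PatientRecord("A1", bmi=38.0, age=61, hypertension=True, snoring_documented=True),
    PatientRecord("A2", bmi=24.0, age=34, hypertension=False, snoring_documented=False),
]
flagged = screen_panel(panel)  # flagged == ["A1"]
```

The flagged list is exactly the output that raises the workflow question above: who reviews it, and what happens next for those patients?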

AI-facilitated disease management: hypertension

Our institution is exploring a digital health strategy for hypertension management, integrating home blood pressure measurements and AI-powered clinical decision support through a panel-level registry [41, 42]. This approach has potential to help primary care physicians give precise hypertension care based on unique patient characteristics while also giving them tools and efficiencies to do so at a population level [42,43,44]. Additionally, our institution has introduced a population health service enabling PCPs to refer patients with hypertension to digital medication management facilitated remotely by nurses and pharmacists. This use case is an archetype for chronic disease management in primary care and raises questions such as: how can AI augment PCP abilities or coordinate care between different primary care team members?
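As a sketch of how such a panel-level registry might surface patients for team action, consider the following illustrative Python fragment. The 135/85 mmHg home blood pressure target is a commonly cited cut-off used here for illustration; the registry structure and function names are hypothetical, not our institution's implementation.

```python
# Illustrative panel-level hypertension registry sketch (hypothetical logic).
from statistics import mean

def uncontrolled(readings, sys_limit=135, dia_limit=85):
    """readings: list of (systolic, diastolic) home blood pressure measurements."""
    avg_sys = mean(r[0] for r in readings)
    avg_dia = mean(r[1] for r in readings)
    return avg_sys >= sys_limit or avg_dia >= dia_limit

def registry_worklist(registry):
    """Return patient IDs whose averaged home readings exceed target."""
    return [pid for pid, readings in registry.items() if uncontrolled(readings)]

registry = {
    "P1": [(148, 92), (142, 88)],  # averages above target: flagged
    "P2": [(124, 78), (118, 74)],  # at goal: not flagged
}
worklist = registry_worklist(registry)  # worklist == ["P1"]
```

A worklist like this is what would let nurses and pharmacists in the remote medication management service act first, at the population level, rather than waiting for each patient's next visit.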

AI-facilitated administrative tasks: patient messaging

Inbox overload, which was exacerbated during the Covid-19 pandemic, contributes to burnout and “pajama time” in primary care [45]. To mitigate this challenge, our AMC is currently piloting the use of large language models (LLMs) such as ChatGPT for drafting patient message responses within the EHR [45]. This use case is an archetype for AI assisting with administrative tasks and raises questions including potential impacts on the doctor-patient relationship.
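The pilot's safety-relevant property, that no AI-drafted text reaches a patient without physician review, can be sketched as a simple human-in-the-loop pattern. The function names below are hypothetical, and the LLM call is stubbed rather than real.

```python
# Human-in-the-loop sketch for AI-drafted patient messages (hypothetical names).

def draft_reply(patient_message: str) -> str:
    # Stub standing in for the LLM call used in the pilot described above.
    return f"DRAFT: Thank you for your message about '{patient_message}'."

def process_inbox(messages, physician_review):
    """physician_review(draft) returns approved text, or None to discard.
    Only physician-approved text is ever sent."""
    sent = []
    for msg in messages:
        draft = draft_reply(msg)
        approved = physician_review(draft)
        if approved is not None:
            sent.append(approved)
    return sent
```

For example, a reviewer that edits out the draft marker before approving: `process_inbox(["refill request"], lambda d: d.replace("DRAFT: ", ""))` returns the approved message only; a reviewer returning `None` sends nothing.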

Digital survey

As the first step in our mixed-methods approach, we employed a digital survey (appendix) specifically developed for our project to quantify PCP attitudes. Given the novel nature of our research focus and the absence of pre-existing validated questionnaires, members of our research team with qualitative research expertise led the creation of the survey instrument, which ensured impartiality, methodological rigor, and the ability to capture nuanced insight. Our survey instrument was primarily descriptive in nature and served as a valuable data source for understanding the frequency of responses and providing a framework for the subsequent interviews. Using Likert scales, the survey explored participants’ comfort levels and perceptions about AI in healthcare. We also gathered deidentified demographic data to contextualize perspectives. The survey captured responses from a diverse group of primary care physicians (N = 47), providing perspectives from different primary care specialties and practice settings. The sample included physicians from AMC Faculty Internal Medicine (n = 6), AMC Faculty Family Medicine (n = 36), AMC Clinical Internal Medicine (n = 1), and AMC Clinical Family Medicine (n = 4).

The respondents’ ages spanned a wide range: 25–34 years old (n = 7), 35–44 years old (n = 19), 45–54 years old (n = 12), 55–64 years old (n = 9). Gender diversity was evident, with 46.8% male (n = 22) and 53.2% female (n = 25) respondents. Years of experience in practice varied (n = 47): 1–5 years (n = 15), 6–10 years (n = 6), 11–15 years (n = 8), 16–20 years (n = 5), 21–25 years (n = 5), 26–30 years (n = 4), 31–35 years (n = 3). This spectrum provides broad insight into PCP perspectives across different demographics and career stages.
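The reported proportions are consistent with the stated sample size, as a quick arithmetic check shows. Note that the respondent count behind the 76.6% optimism figure reported in Results is inferred from N = 47, not stated directly in the text.

```python
# Quick check of reported survey proportions (N = 47), rounded to one decimal.
N = 47

def pct(n: int, total: int = N) -> float:
    return round(100 * n / total, 1)

assert pct(22) == 46.8  # male respondents
assert pct(25) == 53.2  # female respondents
assert pct(36) == 76.6  # optimistic respondents (count inferred from N = 47)
```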

Semi-structured interview

In the survey, respondents were asked if they would be willing to engage in a follow-up interview. Employing a semi-structured interview format with an interview guide (appendix) iteratively developed for our project in conjunction with qualitative research experts, we provided participants with an open and adaptable platform to share their perspectives, which we then examined using thematic analysis. Interviews were conducted in a confidential environment via remote teleconferencing software (Zoom). Automated transcription software (Otter.ai) generated the transcripts, and the collected interview data underwent rigorous thematic analysis using Quirkos qualitative analysis software. This method involved a systematic process of coding and categorizing responses to identify recurring patterns, insights, and emerging themes. Through iterative refinement, we extracted meaningful themes that captured the essence of PCPs’ views regarding AI in primary care. After our initial interviews (n = 6) highlighted PCPs’ concerns about increased workload and expectations, we added a question to delve deeper into how AI could affect the doctor-patient relationship. Our final number of interviews was 15.

Results

General perceptions of AI in medicine

The majority of survey respondents (76.6%) held an optimistic perspective regarding the potential of AI in medicine. Comfort levels in integrating AI-based technologies into clinical practice varied across different domains (Table 1).

Table 1 Table percentages represent the proportion of PCPs reporting varying levels of comfort with AI involvement in different domains as reported via the digital survey

While some physicians reported feeling comfortable communicating the role of AI-based tools to patients (very comfortable: 6.4%; somewhat comfortable: 36.2%), a sizeable percentage did not (somewhat uncomfortable: 23.4%; very uncomfortable: 12.8%). Importantly, 70.2% of surveyed physicians described their approach to learning about AI in medicine as “passively learning via popular news sources or casual conversation,” with only 25.5% “actively seeking education through established organizations, coursework, lectures, professional journals, or books.”

Concerns about AI in primary care

Despite the general positivity quantified in the survey, interview participants expressed numerous concerns about AI—especially when discussing specific AI use cases. We subjected our interview data on concerns regarding AI to a thematic coding analysis and identified the following themes which have been categorized as concerns regarding technology or people-and-processes (Fig. 1) [46].

Fig. 1

PCP concerns about AI. Description: Emergent themes from interviews with primary care physicians regarding AI divided into concerns about AI technology itself and concerns about the context and manner of AI implementation

Technological concerns included algorithmic bias (1 participant) and accuracy and safety (7 participants).

The thing I’m apprehensive about is, how are we teaching AI these things because some of those biases could leak in. [Participant C]

My concerns around AI in medicine have most to do with the space of accuracy. And a tool that I feel is reliable. [Participant G]

Concern about external validity and the ability of AI algorithms to appreciate the nuances of specific patients (8 participants) was an important consideration for PCPs.

I’ve known a lot of my patients now for 30 years and know a lot about them. That can’t really be put into a data set that AI can draw upon. [Participant F]

Interviewed PCPs reported differing opinions on whether they prioritized explainability [47] in AI models (3 participants),

At this point, I want to be able to get a logical explanation. [Participant L]

or whether this was not imperative for them (10 participants).

I’m perfectly okay if my own experience with time goes better and better and I feel like, you know, it works. Don’t ask me how it works, but it works. [Participant A]

I almost think that the tool adds more to the decision-making process if it’s operating outside of that human accessible reasoning process. [Participant B]

However, many reported concerns centered on systemic issues rather than technological ones. One common concern (5 participants) was the medicolegal implications of acting, or failing to act, on AI guidance.

If the system is saying, ‘Hey, this person has severe sleep apnea’, and what if they get in a car accident tomorrow and we had that data today? [Participant O]

When discussing the potential of algorithmic detection of OSA, PCPs (5 participants) pointed out that without augmenting the health system’s capacity to definitively diagnose and treat more patients with OSA, the AI tool would not be helpful.

When we are loading our system from this side, we need to have the resources on the other side. [Participant E]

I think we need to be better equipped before we start telling people this because, you know, it’s like, Hey, you might have sleep apnea. Wait six months for your sleep study… [Participant C]

Additionally, 10 participants reported concern that the integration of AI into primary care could potentially lead to increased workload and physician burnout.

My concern is that like everything else that we have tried to do to make things better in medicine is that it actually makes things harder on the physician and creates more work for us instead of less work. [Participant D]

Interviewees reported multiple ways in which this could happen including AI tools delegating work to physicians that could potentially be handled by other team members,

Is it really the physicians that should deal with this in the first place? [Participant K]

a need to constantly verify or redo work done by AI,

It’s like having a student with me all the time, where I’ve got to just double check everything. [Participant D]

or an excessive focus on productivity.

We are going to add two extra patients per session because now we have help there. So unfortunately, sometimes more help is used in a negative way. [Participant E]

I’ve always seen that the system wants productivity, and the way productivity is defined is based on the number of patients seen. [Participant L]

Not all PCPs shared this concern of increased workload due to AI, with one participant expressing that increased efficiency due to AI would be welcome even if it meant seeing more patients.

If I could see 30 patients in a day, and actually close out my charts by 6pm, smiling, and get home for dinner, I’d be happy. [Participant G]

Physicians also reported concerns about how AI might impact the doctor-patient relationship. Some expressed hope that AI could improve the doctor-patient relationship (10 participants) by, for example, alleviating clinician burden or improving patient engagement.

Maybe you are actually then more compassionate in an encounter, because you haven’t had to do all of that mental lifting. [Participant M]

But many, including some who had also expressed positive sentiment, worried that AI could harm the doctor-patient relationship (12 participants) through factors such as warping patient expectations or prioritizing patient needs over physician well-being.

Patients may end up feeling that, you know, if the AI can tell me that then why did I bother to come to you? [Participant L]

The more we sort of train patients to expect things quickly and efficiently, the more expectations are on the doctor to then produce in the same way. [Participant D]

Participating PCPs also lamented a lack of focus on physician well-being when implementing new technologies (9 participants).

I feel like right when I get efficient, something new gets introduced. [Participant M]

The system is all about the patient’s satisfaction. Is there any of that focus on physician satisfaction? [Participant E]

One key point was the concern that the current way healthcare is paid for does not encourage innovative ways of care delivery such as AI-powered digital health tools.

In essence, we’re providing a bunch of free care, which, you know, is not sustainable. [Participant H]

Thus, it appears that PCPs see a disconnect between care innovation and the way they are forced to practice due to how care is reimbursed.

They’re told to do both things. So they’re really there to crank it out, crank out these RVUs while also doing value based medicine and population based medicine. [Participant J]

Dedicated time for digital health, as well as alternative reimbursement models, was frequently voiced (11 participants) as a key determinant of the uptake and success of AI tools.

I’d like dedicated time daily or at least weekly to review. Otherwise, I might only see it if I see the patient. [Participant F]

I think that this system really needs to rethink how it employs physicians and providers. [Participant L]

Data from our survey corroborate this finding that PCPs are unsure of where digital health tools fit into their workflow. When asked when they would prefer to receive communication from an AI tool regarding a patient screening positive for OSA, responses varied widely, with 3 respondents using the free response to indicate that they would not want to be notified at all. When asked whether they were aware of a pre-existing EHR registry of patients with hypertension, nearly half (45.83%) responded no. Of those who were aware of the registry, more than half reported using it “infrequently” (30.77%) or “never” (23.08%), with the most commonly reported reason being a lack of time.

Discussion

AI as a double-edged sword

Our findings reveal the dual nature of AI in healthcare, uncovering its potential to alleviate or exacerbate challenges in primary care. Some of our identified concerns about AI adoption in healthcare, including lack of external validity, potential for bias, and safety issues, have been well documented in the literature [48,49,50]. Our study expands upon these concerns, highlighting that, for clinicians, the mechanics of AI itself may take a back seat to its potential impact on their professional lives, personal well-being, and relationships with patients [51]. We argue that concerns such as apprehension about increased workload stem from a broader sentiment among PCPs that advancements in healthcare often prioritize productivity over physician well-being or put financial considerations over human relationships [52,53,54]. Accordingly, some PCP concerns about AI may reflect disillusionment with the evolving landscape of medicine in general. In this context, the introduction of AI is perceived as yet another instance where physician interests may be subjugated to organizational efficiency. These concerns are not unfounded: previous literature has proposed using technology to add capacity as one solution to the growing physician shortage [55, 56]. Further, PCP panel sizes are already felt to be excessive, and fear that AI will be used to justify adding patients may be rational [57,58,59]. Digital health and AI in primary care must be applied thoughtfully to avoid further ostracizing PCPs from their professional values.

I think we all worry that more work is what things are aimed at. [Participant F]

These are not selfish concerns, as the well-being of physicians is intrinsically linked to patient outcomes and is aligned with the Quadruple Aim of healthcare [60, 61]. This concern for physician well-being is especially pertinent in the context of primary care, a cornerstone of healthcare critical for providing access to underserved populations that is chronically undervalued by the healthcare system [62, 63]. PCPs should be able to share in benefits such as time or cost savings produced by the implementation of AI systems. Our findings suggest that if all benefit goes to the organization, physician appetite for uptake will remain low.

Navigating the doctor-patient-AI relationship

Our work also highlights the evolving role of the primary care physician [64]. Once commonly viewed as the source of medical truth, physicians now coexist with “Dr. Google,” a digital repository of health information that empowers patients to engage proactively in their own care [65]. This change has had mixed effects on the doctor-patient relationship but can be positive if both parties engage in proper communication and shared decision-making [66,67,68]. These positive effects also hinge on factors such as strong patient health information literacy and adequate doctor-patient communication time [69]. Against this backdrop, the impending integration of AI into healthcare will similarly reshape the doctor-patient dynamic (Fig. 2).

Fig. 2

The evolution of the doctor-patient relationship. Description: The advent of the internet had significant impacts on the doctor-patient relationship. Primary care physicians have a mix of concern and optimism about how AI may do the same

Much of the same research that was done to explore the impact of internet health information on the doctor-patient relationship needs to be repeated and expanded upon in the context of AI. Unlike static online information, interactive AI systems are poised to assume a more active role in shaping the interactions between patients and physicians. Ensuring a positive impact of AI on the doctor-patient relationship is essential to maintaining medicine’s social contract with society [70].

AI’s role in providing patients with information, be it accurate or misleading, could confound the physician-patient dynamic. Patients arriving with AI-generated information could complicate collaborative decision-making [71]. Possible consequences include rigid adherence to AI-driven advice without consideration of individual medical history, or difficulties for physicians attempting to reconcile their expertise with AI suggestions. This not only risks eroding the PCP’s role but could also reshape the doctor-patient relationship into a consumer-provider model.

I’m worried about AI, introducing a dynamic where misinformation is enhanced… if a patient comes in and they’re like, hey, like, you know, WebMD.GPT told me that I need an MRI then there’s another powerful thing that I’m arguing against. [Participant B]

A key part of ensuring a positive impact of AI on the doctor-patient relationship is promoting realistic and aligned expectations regarding AI via education for doctors and patients before implementation of AI tools.

Instead of just letting the cat out of the bag and seeing what happens, you want to make sure that everyone that is going to be interacting with it has accurate expectations and has been educated on what role this is supposed to play. [Participant J]

Previous literature has highlighted concerns that unequal knowledge or differing backgrounds in the doctor-patient relationship could exacerbate health inequity [72, 73]. Thoughtful and equitable implementation of AI could encourage increased patient engagement and understanding leading to more effective doctor-patient communication and increased equity.

The future of primary care workflow

Our findings call for a reconsideration of fundamental questions regarding primary care workflow. If PCPs are going to be active participants in new forms of healthcare delivery, including AI-powered digital health, when are they supposed to do that work? This shift toward digital health is already occurring albeit in an unscheduled and uncompensated way [45, 74, 75].

Most of our care is delivered in MyChart. Like let’s just be honest, that’s how it’s getting delivered. [Participant G]

Inbox burden is a well-known problem, but it is only the beginning of the conflict between asynchronous digital and synchronous in-person workload in primary care [76]. The failure of health systems to identify proper ways of allocating time, resources, and standards to these new ways of interacting with patients has already had substantial consequences [77]. AI-powered digital tools for chronic disease management and disease screening, augmented by remote patient monitoring systems, will likely become increasingly common [78, 79]. Accordingly, consideration needs to be given to how best to allocate physician time to support digital health. Succeeding in digital health means more than solving inbox overload or alarm fatigue; it means realizing a fundamental shift in how primary care interacts with and takes care of patients [6]. We propose that experimenting with hybrid in-person and virtual work schedules could empower physicians to actualize the potential of digital health [80].

Another consideration is: how can AI be integrated into patient-centered, team-based primary care [81]? A population health approach to primary care consists of a physician acting as a “healthcare quarterback” who is responsible for the health of an entire patient panel, regardless of how, where, or by whom each component of care is delivered [81, 82]. Our interview participants frequently indicated that physicians need not always be the primary point of contact for an AI recommendation. Identifying when other team members can review and act upon AI-produced guidance while maintaining the PCP in the loop could mitigate concerns around the possibility of AI and digital health creating more work for physicians [82, 83]. This must be done carefully in a manner that enhances—rather than erodes—the core doctor-patient relationship [11, 12].

Fig. 3

AI as a member of the healthcare team. Description: AI becoming a member of the primary care team

In a more general sense, care coordination is a core challenge of primary care. More research should be done on using AI as a facilitator of task follow-up, delegation, and other components of care coordination (Fig. 3) [84]. Designing standardized primary care workflow is challenging [85]. However, enabling PCPs with the time, team members, and incentives to use AI-powered digital tools that handle some aspects of care remotely and asynchronously could facilitate more meaningful, effective, and focused in-person clinic visits [86,87,88]. More time spent on digital avenues of care has also been shown to improve quality-of-care metrics [89]. Currently, PCPs already engage in this sort of work, but they do it at the expense of time with their patients or their own families [75, 90]. Organizations should be wary of relying too heavily on physician altruism to find time for digital tools, forcing physicians to choose between their personal well-being and that of their patients [91,92,93].
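One way to operationalize the team-based routing participants described, letting other team members act first while keeping the PCP in the loop, is sketched below. The roles and routing table are hypothetical examples, not a validated workflow.

```python
# Hypothetical routing of AI-generated recommendations to primary care team
# members, always copying the PCP so they stay in the loop.

ROUTING = {
    "medication_titration": "pharmacist",  # e.g., hypertension dose adjustment
    "home_bp_follow_up": "nurse",
    "new_diagnosis_flag": "physician",     # e.g., positive OSA screen
}

def route(recommendation_type: str) -> dict:
    """Assign a recommendation to a team member; default to the physician."""
    assignee = ROUTING.get(recommendation_type, "physician")
    return {"assignee": assignee, "cc_pcp": True}
```

For example, `route("medication_titration")` assigns the task to a pharmacist with the PCP copied, while an unrecognized recommendation type falls back to the physician by default.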

Considerations for primary care stakeholders

As a first step, stakeholders need to ensure that primary care AI systems are rigorously evaluated from an accuracy, safety, and bias standpoint. However, subsequent attention must be given to workflow integration and impact on physician well-being (Table 2) [86, 94]. Previous literature has shown that such non-technical factors are essential to promote uptake of new technologies [95]. Caution must be taken to ensure that AI does not result in “less doctor work and more office work” leading to PCPs who feel exploited by the healthcare system [10]. Rather, AI should be used as an opportunity to address primary care challenges such as helping deliver better care while preventing physician burnout [96].

Additionally, the way in which healthcare is compensated has a substantial impact on the behavior and time allocation of PCPs [97]. Some attempts, such as charging a user fee for patient messages and billing payers for e-visits (i.e., responding to patient messages), have been made to reimburse digital health services within a fee-for-service model [98]. However, digital health is likely better suited to value-based primary care, which would incentivize and provide flexibility for physicians to engage with asynchronous, population-level digital health tools [99]. Organizations wishing to reap the benefits of AI in primary care must tackle this challenge head on and be willing to reimagine how care is delivered and paid for rather than further ingraining legacy systems and approaches. This is especially pertinent given the nature of healthcare reimbursement in the United States. Efforts at payment modernization are underway [100], but failure to quickly advance and innovate our payment models could leave our systems lagging behind non-fee-for-service nations in terms of AI and digital health innovation.

Table 2 Recommendations for primary care stakeholders

AI and equity

The people who will get it are the people who can pay for the compute. And so that’s my biggest fear is that we will leave out the poorest people from getting the best care. [Participant G]

This sentiment, expressed by multiple study participants, underscores the importance of ensuring equitable access to AI-powered healthcare solutions for patients across the spectrum of socioeconomic backgrounds. The transformation brought by AI should not inadvertently reinforce existing disparities but rather serve as a tool to alleviate them [101]. For example, AI must not alleviate physician burnout and improve patient outcomes only at large AMCs with adequate resources for AI development and implementation; it should be broadly accessible and applicable across diverse healthcare settings and institutions. Safety-net health systems, federally qualified health centers, rural areas, and other practice settings that could be left out need to be included in the primary care AI revolution [102,103,104].

The reality is that our patients have much more healthcare that they need delivered than we can ever deliver, so they’re going to need AI tools. – Participant G.

Strengths, limitations, and directions for future research

In contrast to much of the existing literature, all of our study participants had actual experience with digital health, and some had actual experience with medical AI. Because these technological shifts are just beginning, clinicians whose perspectives are informed by firsthand experience are rare, which makes our findings more valuable. However, a limitation is that terms such as “AI” and “digital health” have evolving definitions and may not mean the same thing to different individuals. While we tried to ameliorate this effect by grounding our discussions in tangible use cases and examples, participants’ differing preconceptions of AI may have affected their responses. Further, focusing on certain use cases over others may have influenced reported PCP views on AI in general. Future efforts should evaluate perceptions of AI in primary care more comprehensively to ensure that reported PCP attitudes are not overly influenced by any particular use case. Additionally, while our survey and interview guides were developed rigorously in collaboration with qualitative methods experts, future research should attempt to develop validated qualitative tools for assessing PCP attitudes toward AI in primary care.

Our research focused only on internal medicine and family medicine physician attitudes toward AI and digital health. Future research should include other primary care team members, including primary care pediatricians, nurses, and advanced practice providers such as nurse practitioners and physician assistants. In addition, greater research is needed on how patients, especially those who may be marginalized, are experiencing the shift toward AI and digital health in primary care [105, 106]. Finally, future research should also expand beyond our selected AI use cases to incorporate other AI applications pertinent to primary care.

Our relatively small sample size limits the generalizability of our findings to larger populations. Additionally, all of our respondents came from the same organization. While we attempted to assess the attitudes of both academic and non-academic physicians, this organizational homogeneity might further limit generalizability. Moreover, the potential for response bias in self-reported data should be acknowledged.

In light of these limitations, we emphasize the need for future research to employ quantitative methods to explore questions regarding AI in primary care and to incorporate larger, more diverse samples from various healthcare settings. Including multiple organizations, especially those that are not well-funded AMCs in urban environments, can provide a broader perspective on the adoption of AI and digital health in primary care.

Conclusion

This study was, to our knowledge, the first to investigate PCP attitudes toward AI in primary care using specific AI use cases. Reported attitudes varied, but PCP responses showed general optimism around AI in primary care tempered by certain concerns. While some concerns focused on technological factors like algorithmic accuracy, safety, and bias, others focused on people-and-process factors such as effects on physician workflow, equity, reimbursement, and the doctor-patient relationship. These findings suggest that AI initiatives that fail to address both the technological and people-and-process concerns raised by PCPs may struggle to make an impact. Primary care stakeholders should use these findings to inform the development and implementation of AI in primary care.

Data availability

The digital survey questions and interview guide question stems will be included as an appendix to the manuscript. Complete interview and survey data will be made available upon reasonable request.

Abbreviations

AI:

Artificial Intelligence

PCP:

Primary Care Physician

AMC:

Academic Medical Center

OSA:

Obstructive Sleep Apnea

EHR:

Electronic Health Record

References

  1. Strohm L, Hehakaya C, Ranschaert ER, Boon WPC, Moors EHM. Implementation of artificial intelligence (AI) applications in radiology: hindering and facilitating factors. Eur Radiol. 2020;30(10):5525–32.

  2. Park CJ, Yi PH, Siegel EL. Medical Student perspectives on the impact of Artificial Intelligence on the practice of Medicine. Curr Probl Diagn Radiol. 2021;50(5):614–9.

  3. Briganti G, Le Moine O. Artificial Intelligence in Medicine: Today and Tomorrow. Front Med. 2020;7:27. Available from: https://doi.org/10.3389/fmed.2020.00027.

  4. Seneviratne MG, Shah NH, Chu L. Bridging the implementation gap of machine learning in healthcare. BMJ Innov. 2020;6(2):45–7.

  5. Zimlichman E, Nicklin W, Aggarwal R, Bates DW. Health Care 2030: The Coming Transformation. NEJM Catal Innov Care Deliv. 2021 Mar 1. Available from: https://doi.org/10.1056/CAT.20.0569.

  6. Pagliari C. Digital health and primary care: past, pandemic and prospects. J Glob Health. 2021;11:01005.

  7. Rabatin J, Williams E, Baier Manwell L, Schwartz MD, Brown RL, Linzer M. Predictors and outcomes of Burnout in Primary Care Physicians. J Prim Care Community Health. 2016;7(1):41–3.

  8. Petterson SM, Liaw WR, Tran C, Bazemore AW. Estimating the Residency Expansion required to avoid projected primary care physician shortages by 2035. Ann Fam Med. 2015;13(2):107–14.

  9. Lin S. A Clinician’s Guide to Artificial Intelligence (AI): why and how primary care should lead the Health Care AI revolution. J Am Board Fam Med. 2022;35(1):175–84.

  10. Agarwal SD, Pabo E, Rozenblum R, Sherritt KM. Professional Dissonance and Burnout in Primary Care: a qualitative study. JAMA Intern Med. 2020;180(3):395.

  11. Amano A, Brown-Johnson CG, Winget M, Sinha A, Shah S, Sinsky CA, et al. Perspectives on the Intersection of Electronic Health Records and Health Care Team Communication, function, and well-being. JAMA Netw Open. 2023;6(5):e2313178.

  12. Kearley KE, Freeman GK, Heath A. An exploration of the value of the personal doctor-patient relationship in general practice. Br J Gen Pract J R Coll Gen Pract. 2001;51(470):712–8.

  13. Aminololama-Shakeri S, López JE. The Doctor-Patient Relationship with Artificial Intelligence. Am J Roentgenol. 2019;212(2):308–10.

  14. Nagy M, Sisk B. How will Artificial Intelligence Affect Patient-Clinician relationships? AMA J Ethics. 2020;22(5):E395–400.

  15. Blease C, Kaptchuk TJ, Bernstein MH, Mandl KD, Halamka JD, DesRoches CM. Artificial Intelligence and the future of primary care: exploratory qualitative study of UK General practitioners’ views. J Med Internet Res. 2019;21(3):e12802.

  16. Buck C, Doctor E, Hennrich J, Jöhnk J, Eymann T. General practitioners’ attitudes toward Artificial intelligence–enabled systems: interview study. J Med Internet Res. 2022;24(1):e28916.

  17. Kueper JK, Terry AL, Zwarenstein M, Lizotte DJ. Artificial Intelligence and Primary Care Research: a scoping review. Ann Fam Med. 2020;18(3):250–8.

  18. Marwaha JS, Landman AB, Brat GA, Dunn T, Gordon WJ. Deploying digital health tools within large, complex health systems: key considerations for adoption and implementation. Npj Digit Med. 2022;5(1):1–7.

  19. Downing NL, Bates DW, Longhurst CA. Physician burnout in the Electronic Health Record Era: are we ignoring the Real cause? Ann Intern Med. 2018;169(1):50.

  20. López L, Green AR, Tan-McGrory A, King RS, Betancourt JR. Bridging the Digital divide in Health Care: the role of Health Information Technology in addressing racial and ethnic disparities. Jt Comm J Qual Patient Saf. 2011;37(10):437–45.

  21. Latulippe K, Hamel C, Giroux D. Social Health Inequalities and eHealth: A literature review with qualitative synthesis of theoretical and empirical studies. J Med Internet Res. 2017;19(4):e136.

  22. Weiss D, Rydland HT, Øversveen E, Jensen MR, Solhaug S, Krokstad S. Innovative technologies and social inequalities in health: a scoping review of the literature. Virgili G, editor. PLoS ONE. 2018;13(4):e0195447.

  23. Ramsetty A, Adams C. Impact of the digital divide in the age of COVID-19. J Am Med Inform Assoc. 2020;27(7):1147–8.

  24. Yao R, Zhang W, Evans R, Cao G, Rui T, Shen L. Inequities in Health Care services caused by the Adoption of Digital Health Technologies: scoping review. J Med Internet Res. 2022;24(3):e34144.

  25. Joyce K, Smith-Doerr L, Alegria S, Bell S, Cruz T, Hoffman SG, et al. Toward a sociology of Artificial Intelligence: a call for Research on inequalities and Structural Change. Socius Sociol Res Dyn World. 2021;7:237802312199958.

  26. Salhi RA, Dupati A, Burkhardt JC. Interest in serving the Underserved: role of race, gender, and Medical Specialty Plans. Health Equity. 2022;6(1):933–41.

  27. Jetty A, Hyppolite J, Eden AR, Taylor MK, Jabbarpour Y. Underrepresented Minority Family Physicians more likely to care for vulnerable populations. J Am Board Fam Med. 2022;35(2):223–4.

  28. Starfield B, Shi L, Macinko J. Contribution of primary care to health systems and health. Milbank Q. 2005;83(3):457–502.

  29. Blumenthal D, Mort E, Edwards J. The efficacy of primary care for vulnerable population groups. Health Serv Res. 1995;30(1 Pt 2):253–73.

  30. Marrast LM, Zallman L, Woolhandler S, Bor DH, McCormick D. Minority Physicians’ role in the Care of Underserved patients: diversifying the physician workforce may be key in addressing Health disparities. JAMA Intern Med. 2014;174(2):289.

  31. Lambert SI, Madi M, Sopka S, Lenes A, Stange H, Buszello CP, et al. An integrative review on the acceptance of artificial intelligence among healthcare professionals in hospitals. Npj Digit Med. 2023;6(1):1–14.

  32. Stange KC, Ferrer RL. The Paradox of Primary Care. Ann Fam Med. 2009;7(4):293–9.

  33. Shi L. The impact of primary care: a focused review. Scientifica. 2012;2012:1–22.

  34. Liaw W, Kakadiaris IA. Primary care Artificial Intelligence: a branch hiding in Plain Sight. Ann Fam Med. 2020;18(3):194–5.

  35. Liaw W, Kakadiaris IA. Artificial Intelligence and Family Medicine: better together. Fam Med. 2020;52(1):8–10.

  36. Lin SY, Mahoney MR, Sinsky CA. Ten ways Artificial Intelligence Will Transform Primary Care. J Gen Intern Med. 2019;34(8):1626–30.

  37. Fatehi F, Samadbeik M, Kazemi A. What is Digital Health? Review of definitions. Stud Health Technol Inform. 2020;275:67–71.

  38. Motamedi KK, McClary AC, Amedee RG. Obstructive sleep apnea: a growing problem. Ochsner J. 2009;9(3):149–53.

  39. Ramesh J, Keeran N, Sagahyroon A, Aloul F. Towards validating the effectiveness of obstructive sleep apnea classification from Electronic Health Records Using Machine Learning. Healthc Basel Switz. 2021;9(11):1450.

  40. Maniaci A, Riela PM, Iannella G, Lechien JR, La Mantia I, De Vincentiis M, et al. Machine learning identification of obstructive sleep apnea severity through the patient clinical features: a retrospective study. Life Basel Switz. 2023;13(3):702.

  41. Molina-Ortiz EI, Vega AC, Calman NS. Patient registries in primary care: essential element for quality improvement. Mt Sinai J Med. 2012;79(4):475–80.

  42. Hu Y, Huerta J, Cordella N, Mishuris RG, Paschalidis IC. Personalized hypertension treatment recommendations by a data-driven model. BMC Med Inform Decis Mak. 2023;23(1):44.

  43. Fisher NDL, Fera LE, Dunning JR, Desai S, Matta L, Liquori V, et al. Development of an entirely remote, non-physician led hypertension management program. Clin Cardiol. 2019;42(2):285–91.

  44. Visco V, Izzo C, Mancusi C, Rispoli A, Tedeschi M, Virtuoso N, et al. Artificial Intelligence in Hypertension Management: an Ace up your sleeve. J Cardiovasc Dev Dis. 2023;10(2):74.

  45. Holmgren AJ, Downing NL, Tang M, Sharp C, Longhurst C, Huckman RS. Assessing the impact of the COVID-19 pandemic on clinician ambulatory electronic health record use. J Am Med Inform Assoc. 2022;29(3):453–60.

  46. Payton F, Paré G, Le Rouge C, et al. Health Care IT: Process, People, Patients and Interdisciplinary Considerations. J Assoc Inf Syst. 2011;12(2):I–XIII.

  47. the Precise4Q consortium, Amann J, Blasimme A, Vayena E, Frey D, Madai VI. Explainability for artificial intelligence in healthcare: a multidisciplinary perspective. BMC Med Inform Decis Mak. 2020;20(1):310.

  48. Challen R, Denny J, Pitt M, Gompels L, Edwards T, Tsaneva-Atanasova K. Artificial intelligence, bias and clinical safety. BMJ Qual Saf. 2019;28(3):231–7.

  49. Wong A, Otles E, Donnelly JP, Krumm A, McCullough J, DeTroyer-Cooley O, et al. External validation of a widely implemented proprietary Sepsis prediction model in hospitalized patients. JAMA Intern Med. 2021;181(8):1065.

  50. Yu KH, Kohane IS. Framing the challenges of artificial intelligence in medicine. BMJ Qual Saf. 2019;28(3):238–41.

  51. Terry AL, Kueper JK, Beleno R, Brown JB, Cejic S, Dang J, et al. Is primary health care ready for artificial intelligence? What do primary health care stakeholders say? BMC Med Inform Decis Mak. 2022;22(1):237.

  52. Phillips RL, Bazemore AW, Newton WP. Pursuing practical professionalism: form follows function. Ann Fam Med. 2019;17(5):472–5.

  53. Relman AS. The New Medical-Industrial Complex. N Engl J Med. 1980;303(17):963–70.

  54. Tai-Seale M, Baxter S, Millen M, Cheung M, Zisook S, Çelebi J et al. Association of physician burnout with perceived EHR work stress and potentially actionable factors. J Am Med Inform Assoc. 2023;ocad136.

  55. Bodenheimer TS, Smith MD. Primary care: proposed solutions to the physician shortage without training more Physicians. Health Aff (Millwood). 2013;32(11):1881–6.

  56. Green LV, Savin S, Lu Y. Primary care physician shortages could be eliminated through use of teams, nonphysicians, and Electronic Communication. Health Aff (Millwood). 2013;32(1):11–9.

  57. Raffoul M, Moore M, Kamerow D, Bazemore A. A primary care panel size of 2500 is neither Accurate nor reasonable. J Am Board Fam Med JABFM. 2016;29(4):496–9.

  58. Harrington C. Considerations for patient panel size. Del J Public Health. 2022;8(5):154–7.

  59. Salisbury C, Murphy M, Duncan P. The impact of Digital-First Consultations on workload in General Practice: modeling study. J Med Internet Res. 2020;22(6):e18203.

  60. Halbesleben JRB, Rathert C. Linking physician burnout and patient outcomes: exploring the dyadic relationship between physicians and patients. Health Care Manage Rev. 2008;33(1):29–39.

  61. Sikka R, Morath JM, Leape L. The Quadruple Aim: care, health, cost and meaning in work. BMJ Qual Saf. 2015;24(10):608–10.

  62. Streeter RA, Snyder JE, Kepley H, Stahl AL, Li T, Washko MM. The geographic alignment of primary care Health Professional Shortage Areas with markers for social determinants of health. Shah TI, editor. PLOS ONE. 2020;15(4):e0231443.

  63. Liu JJ. Health professional shortage and health status and health care access. J Health Care Poor Underserved. 2007;18(3):590–8.

  64. Ahuja AS. The impact of artificial intelligence in medicine on the future role of the physician. PeerJ. 2019;7:e7702.

  65. Van Riel N, Auwerx K, Debbaut P, Van Hees S, Schoenmakers B. The effect of Dr Google on doctor–patient encounters in primary care: a quantitative, observational, cross-sectional study. BJGP Open. 2017;1(2):bjgpopen17X100833.

  66. Murray E, Lo B, Pollack L, Donelan K, Catania J, White M, et al. The impact of Health Information on the internet on the physician-patient relationship: patient perceptions. Arch Intern Med. 2003;163(14):1727.

  67. Tan SSL, Goonawardene N. Internet Health Information seeking and the patient-physician relationship: a systematic review. J Med Internet Res. 2017;19(1):e9.

  68. Liaw W, Kakadiaris IA, Yang Z. Is Artificial Intelligence the Key to Reclaiming relationships in Primary Care? Am Fam Physician. 2021;104(6):558–9.

  69. Luo A, Qin L, Yuan Y, Yang Z, Liu F, Huang P, et al. The Effect of Online Health Information seeking on Physician-Patient relationships: systematic review. J Med Internet Res. 2022;24(2):e23354.

  70. Cruess SR. Professionalism and Medicine’s Social Contract with Society. Clin Orthop. 2006;449:170–6.

  71. Resnicow K, Catley D, Goggin K, Hawley S, Williams GC. Shared decision making in Health Care: theoretical perspectives for why it works and for whom. Med Decis Mak Int J Soc Med Decis Mak. 2022;42(6):755–64.

  72. Verlinde E, De Laender N, De Maesschalck S, Deveugele M, Willems S. The social gradient in doctor-patient communication. Int J Equity Health. 2012;11(1):12.

  73. Willems S, De Maesschalck S, Deveugele M, Derese A, De Maeseneer J. Socio-economic status of the patient and doctor–patient communication: does it make a difference? Patient Educ Couns. 2005;56(2):139–46.

  74. Arndt BG, Beasley JW, Watkinson MD, Temte JL, Tuan WJ, Sinsky CA, et al. Tethered to the EHR: Primary Care Physician Workload Assessment using EHR Event Log Data and Time-Motion observations. Ann Fam Med. 2017;15(5):419–26.

  75. Tai-Seale M, Olson CW, Li J, Chan AS, Morikawa C, Durbin M, et al. Electronic Health Record logs Indicate that Physicians Split Time evenly between seeing patients and Desktop Medicine. Health Aff (Millwood). 2017;36(4):655–62.

  76. Murphy DR, Meyer AND, Russo E, Sittig DF, Wei L, Singh H. The Burden of Inbox Notifications in Commercial Electronic Health Records. JAMA Intern Med. 2016;176(4):559.

  77. Tai-Seale M, Dillon EC, Yang Y, Nordgren R, Steinberg RL, Nauenberg T, et al. Physicians’ well-being linked to In-Basket messages generated by Algorithms in Electronic Health Records. Health Aff (Millwood). 2019;38(7):1073–8.

  78. Muller AE, Berg RC, Jardim PSJ, Johansen TB, Ormstad SS. Can remote patient monitoring be the New Standard in Primary Care of Chronic Diseases, Post-COVID-19? Telemed E-Health. 2022;28(7):942–69.

  79. Willis VC, Thomas Craig KJ, Jabbarpour Y, Scheufele EL, Arriaga YE, Ajinkya M, et al. Digital Health Interventions To Enhance Prevention in Primary Care: scoping review. JMIR Med Inform. 2022;10(1):e33518.

  80. Nochomovitz M, Sharma R. Is it time for a New Medical Specialty? The Medical Virtualist. JAMA. 2018;319(5):437.

  81. Goldberg DG, Beeson T, Kuzel AJ, Love LE, Carver MC. Team-Based Care: a critical element of Primary Care Practice Transformation. Popul Health Manag. 2013;16(3):150–6.

  82. Smith CD, Balatbat C, Corbridge S, et al. Implementing Optimal Team-Based Care to Reduce Clinician Burnout. NAM Perspect. 2018;8(9). Available from: https://nam.edu/implementing-optimal-team-based-care-to-reduce-clinician-burnout.

  83. Loeb DF, Bayliss EA, Candrian C, deGruy FV, Binswanger IA. Primary care providers’ experiences caring for complex patients in primary care: a qualitative study. BMC Fam Pract. 2016;17(1):34.

  84. Wagner EH, Sandhu N, Coleman K, Phillips KE, Sugarman JR. Improving care coordination in primary care. Med Care. 2014;52(11 Suppl 4):33–8.

  85. Holman GT, Beasley JW, Karsh BT, Stone JA, Smith PD, Wetterneck TB. The myth of standardized workflow in primary care. J Am Med Inform Assoc JAMIA. 2016;23(1):29–37.

  86. Committee on Implementing High-Quality Primary Care, Board on Health Care Services, Health and Medicine Division, National Academies of Sciences, Engineering, and Medicine. Implementing High-Quality Primary Care: Rebuilding the Foundation of Health Care. McCauley L, Phillips RL, Meisnere M, Robinson SK, editors. Washington, D.C.: National Academies Press; 2021. Available from: https://www.nap.edu/catalog/25983.

  87. Tai-Seale M, McGuire TG, Zhang W. Time allocation in primary care office visits. Health Serv Res. 2007;42(5):1871–94.

  88. Hilty DM, Torous J, Parish MB, Chan SR, Xiong G, Scher L, et al. A literature review comparing clinicians’ approaches and skills to In-Person, Synchronous, and Asynchronous Care: moving toward competencies to ensure Quality Care. Telemed E-Health. 2021;27(4):356–73.

  89. Rotenstein LS, Holmgren AJ, Healey MJ, Horn DM, Ting DY, Lipsitz S, et al. Association between Electronic Health Record Time and Quality of Care Metrics in Primary Care. JAMA Netw Open. 2022;5(10):e2237086.

  90. Saag HS, Shah K, Jones SA, Testa PA, Horwitz LI. Pajama Time: Working after Work in the Electronic Health Record. J Gen Intern Med. 2019;34(9):1695–6.

  91. Pellegrino ED. Altruism, self-interest, and medical ethics. JAMA. 1987;258(14):1939.

  92. Sajjad M, Qayyum S, Iltaf S, Khan RA. The best interest of patients, not self-interest: how clinicians understand altruism. BMC Med Educ. 2021;21(1):477.

  93. Jones R. Declining altruism in medicine. BMJ. 2002;324(7338):624–5.

  94. Yin J, Ngiam KY, Teo HH. Role of Artificial Intelligence Applications in Real-Life Clinical Practice: systematic review. J Med Internet Res. 2021;23(4):e25759.

  95. Choudhury A, Asan O, Medow JE. Clinicians’ perceptions of an Artificial Intelligence-based blood utilization calculator: qualitative exploratory study. JMIR Hum Factors. 2022;9(4):e38411.

  96. Thomas Craig KJ, Willis VC, Gruen D, Rhee K, Jackson GP. The burden of the digital environment: a systematic review on organization-directed workplace interventions to mitigate physician burnout. J Am Med Inform Assoc JAMIA. 2021;28(5):985–97.

  97. Gosden T, Forland F, Kristiansen I, Sutton M, Leese B, Giuffrida A et al. Capitation, salary, fee-for-service and mixed systems of payment: effects on the behaviour of primary care physicians. Cochrane Effective Practice and Organisation of Care Group, editor. Cochrane Database Syst Rev [Internet]. 2000 Jul 24 [cited 2023 Jul 31];2011(10). https://doi.org/10.1002/14651858.CD002215.

  98. Diaz N. More health systems charging for MyChart messages. Becker’s Health IT [Internet]. 2022 Aug 28 [cited 2023 Aug 17]; Available from: https://www.beckershospitalreview.com/ehrs/more-health-systems-charging-for-mychart-messages.html.

  99. Porter ME, Pabo EA, Lee TH. Redesigning primary care: a Strategic Vision to improve Value by Organizing around patients’ needs. Health Aff (Millwood). 2013;32(3):516–25.

  100. AMA Future of Health issue brief: Commercial Payer Coverage for Digital Medicine Codes. American Medical Association [Internet]. 2023 Sep 18 [cited 2023 Sep 26]; Available from: https://www.ama-assn.org/practice-management/digital/ama-future-health-issue-brief-commercial-payer-coverage-digital.

  101. Richardson S, Lawrence K, Schoenthaler AM, Mann D. A framework for digital health equity. Npj Digit Med. 2022;5(1):119.

  102. Frimpong JA, Jackson BE, Stewart LM, Singh KP, Rivers PA, Bae S. Health information technology capacity at federally qualified health centers: a mechanism for improving quality of care. BMC Health Serv Res. 2013;13(1):35.

  103. Guo J, Li B. The application of Medical Artificial Intelligence Technology in Rural areas of developing countries. Health Equity. 2018;2(1):174–81.

  104. Davlyatov G, Borkowski N, Feldman S, Qu H, Burke D, Bronstein J, et al. Health Information Technology Adoption and Clinical Performance in Federally Qualified Health Centers. J Healthc Qual. 2020;42(5):287–93.

  105. Iyanna S, Kaur P, Ractham P, Talwar S, Najmul Islam AKM. Digital transformation of healthcare sector. What is impeding adoption and continued usage of technology-driven innovations by end-users? J Bus Res. 2022;153:150–61.

  106. Richardson JP, Smith C, Curtis S, Watson S, Zhu X, Barry B, et al. Patient apprehensions about the use of artificial intelligence in healthcare. Npj Digit Med. 2021;4(1):140.

Acknowledgements

Not applicable.

Funding

This work was supported by the American Board of Family Medicine Artificial Intelligence/Machine Learning grant and the UCSD School of Medicine Research Fellowship. The funders played no role in any elements of the study.

Author information

Authors and Affiliations

Authors

Contributions

MRA, AM, and GK devised the initial study design. MTS provided substantive contributions to study design and survey instrument refinement. MRA and SW carried out the study and drafted the initial version of the manuscript. MRA coded the interview transcripts and SW verified the interview codes. MF, AM, MTS, and GK aided in manuscript revision, data analysis, and contributed intellectual value to the manuscript. All authors approve of the final version of the manuscript.

Corresponding author

Correspondence to Matthew R. Allen.

Ethics declarations

Ethics approval and consent to participate

Study methods and protocols were approved by the Aligning and Coordinating QUality Improvement, Research, and Evaluation Committee of the University of California San Diego (ACQUIRE Project #751) acting under the authority of the Institutional Review Board of the University of California San Diego. Informed consent was obtained from all subjects. All methods were carried out in accordance with relevant guidelines and regulations, including the Declaration of Helsinki.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1:

Question stems from the digital survey

Supplementary Material 2:

Question stems from the semi-structured interview

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Allen, M.R., Webb, S., Mandvi, A. et al. Navigating the doctor-patient-AI relationship - a mixed-methods study of physician attitudes toward artificial intelligence in primary care. BMC Prim. Care 25, 42 (2024). https://doi.org/10.1186/s12875-024-02282-y

  • DOI: https://doi.org/10.1186/s12875-024-02282-y

Keywords