Doctors have always had mixed feelings about patients who show up informed. For the last two decades, as millions of people have turned to the Internet for answers, the worry has been “Dr. Google.” Now, it’s AI, particularly tools like ChatGPT.
I’ve seen a growing number of clinicians claim that patients who use AI tools before their appointments make visits longer and more difficult. One physician on LinkedIn went so far as to call it an “AI tax.” While some agreed with him, others noted that AI generally upskills patients, making conversations more focused rather than more difficult.
This concern is understandable in a system where clinicians are overwhelmed and pressed for time. But the claim itself isn’t supported by evidence, and limiting patient access to AI, even implicitly, moves care in the wrong direction.
The “AI tax” idea does not hold up when you look at the research, and withholding high-quality AI tools from patients only reinforces a more paternalistic model of medicine. As a longtime healthcare investor and public health advocate, I’d argue that a better approach is to use AI to help both clinicians and patients prepare, communicate, and make decisions together more effectively.
The “AI tax” has no evidence behind it
As of 2025, there are no peer-reviewed studies showing that AI-prepared patients extend visits or make them less productive. None.
What we do have are early evaluations of digital tools that help patients prepare before their visits: for example, electronic pre-visit questionnaires and intake systems that ask patients to list priorities or symptoms in advance. A systematic review of 49 studies found that 38 reported these tools as effective in improving patient-centered care and patient–provider communication. And a recent qualitative study found that patients who used a digital pre-visit tool felt better prepared for their appointments and believed it helped shape the conversation. More rigorous research is needed, but these findings challenge the idea that informed, prepared patients inherently take up extra clinician time.
If anything, patients who take the time to understand their condition are often easier to treat. They ask clearer questions. They have better recall of prior symptoms. They’re more likely to follow through on treatment. Clinicians who already embrace shared decision-making tend to welcome this preparation.
The idea of an “AI tax” is less about the technology and more about the long-standing pressure to fit meaningful care into too little time. A well-run primary care visit is expected to fit within a 15-minute window, including chart review, history, physical exam, documentation, counseling, ordering tests, updating medications, and answering questions. When that system is stretched to its limit, anything new can feel like a burden. But that is a problem with system design, not with informed patients.
