January 25, 2024

Talking to Your Clients About Generative AI in Healthcare

If it feels like artificial intelligence (AI) has been dominating the news lately, you aren’t imagining it. Much optimism surrounds AI’s promise to tackle healthcare’s biggest burdens. Generative AI in healthcare has made headlines in the Wall Street Journal, Forbes, and the world’s most-read scientific journal, Nature.

Despite the excitement, it’s understandable that some people remain unsure about generative AI in healthcare. About 75% of Americans worry that healthcare providers will rush into using AI without fully grasping the potential dangers for patients.

This doesn’t mean you can’t tap into AI to improve your workflows and lighten your administrative load. But it does mean you should talk to your clients about how and when you use technology to provide their care.

This article will guide you through those critical conversations.

Understanding patient perspectives

Recent AI headlines reveal sharply contrasting takes on generative AI, and people’s attitudes toward its use in healthcare vary just as widely.

From the New York Times: A.I. Poses 'Risk of Extinction,' Industry Leaders Warn.

From Fortune: Intelligence for good: Using artificial intelligence to accelerate social impact

People’s own experiences with AI also influence their attitudes towards generative AI in healthcare. Deloitte’s 2023 Health Care Consumer Survey found that people who had experience using AI tools were more optimistic about its potential impact on healthcare:

  • Slightly more than 70% of generative AI users thought the technology could transform care delivery (vs. 50% of non-users)
  • 69% of users thought it could improve access (vs. 53% of all respondents)
  • 63% of users said it had the potential to make healthcare more affordable (vs. 46% of all)

Transparency matters, too: 80% of consumers think it’s important that their healthcare provider disclose when they are using generative AI in healthcare delivery.

Informed consent and documentation are critical when it comes to using AI tools. Access a handy AI Charting Assistant Consent Form built right into Practice Better, or download a PDF version.

If you want to use AI tools, offering a supportive environment that encourages open dialogue is key. 

Tailoring the conversation to the individual

Making all clients feel more comfortable with the presence of AI in your practice starts with meeting them where they are. 

  • Understand them. Take time to ask questions about their specific fears, beliefs, and biases regarding AI. What do they think AI is? Are there any definitions they need to clear up?
  • Gauge their awareness. Are they keeping up with advancements in AI that could affect everyday life? Do they know AI is already used in things they rely on daily, like social media, streaming, and online shopping?
  • Address resistance. People worry about AI’s impact on data privacy, accuracy, and the loss of human touch in care. You can ease those worries by explaining how AI solutions are made safer and how you use them in your own work.
  • Customize for comfort. Shift your communication style depending on the demographics of the clients you work with. For example, younger adults tend to express more comfort with AI-led healthcare.

Start with an honest conversation, share examples of how AI in healthcare can improve care, and keep communication open. 

Establishing trust and transparency

The people you care for must trust that AI tools won't affect your ability to treat them with empathy and compassion.

Healthcare has a distinctive set of ethical considerations regarding AI. Protecting patient health data, preventing biases in care, and reducing medical errors are all paramount, and AI can make each of them harder to guarantee.

The AMA Journal of Ethics recommends that physicians and other healthcare professionals be responsible for acquiring a basic understanding of the AI devices they use.

Luckily, many organizations demonstrate a strong commitment to AI transparency. It’s becoming commonplace to create and even publish guidelines around intentions and activities involving AI. For example:

  • Cedars-Sinai has created a framework for the ethical development and use of AI. 
  • At Practice Better, we are clear about our commitment to the responsible and ethical development of AI in our platform. 

Education and information sharing

AI has been around in some form since the 1950s and has worked its way into many everyday tasks. Some of the things you do every day are powered by AI without you even knowing it. For example, AI suggests the terms that appear as you type in Google’s search bar.

Generative AI is the latest evolution of artificial intelligence, and its popularity is driving many of those viral headlines we discussed earlier. This newer breed of AI is dubbed “generative” because it goes beyond simply following rules or classifying what it’s shown. Generative AI can actually produce new things, like text, images, and other useful content.

Here are some other common AI-related terms that are helpful to understand: 

  • Machine Learning. This involves teaching a computer to make predictions or decisions by showing it lots of relevant examples. For example, Netflix uses machine learning to keep improving the personalized viewing recommendations it feeds you.  
  • Neural Networks. This falls under the umbrella of machine learning. Just as our brains have neurons that help us think and process information, a neural network in AI consists of interconnected "nodes" that process and analyze data. Each node contributes a little bit to figuring out the answer, and together, they solve complex problems.

In the healthcare field, researchers used machine learning to train a convolutional neural network (CNN) to identify skin cancer by showing it more than 100,000 images of malignant melanomas and benign moles. When its performance was compared with that of 58 international dermatologists, the CNN missed fewer melanomas and less often misdiagnosed benign moles as malignant.

  • Training data. Data scientists use vast amounts of data to teach AI algorithms to make the right decisions. The data can take many forms, including text, image, video, audio or speech, biometric, and even simulated data. 

Researchers at MIT are developing an AI model named “Sybil” to detect the risk of future lung cancer using low-dose computed tomography (LDCT) scans. The team is training Sybil using real CT scans – both with and without visible cancerous tumors. 

  • Natural Language Processing (NLP). This is how computers interpret human language. NLP helps computers understand what you're saying or writing so they can respond accurately or perform tasks based on your commands. If you’ve ever asked Siri or Alexa a question, you’re using NLP. 

In healthcare, NLP plays a significant role in AI-powered charting and note-taking. NLP algorithms can accurately transcribe spoken words into text during live conversations. NLP can also quickly summarize lengthy notes into more concise formats and extract crucial information about diagnoses, medications, and other relevant details from unstructured text notes.  

  • LLM (large language model). An LLM is a type of AI model that specializes in understanding and generating “human” language, built by training a neural network on a huge amount of text. LLMs power many NLP applications. If you’ve ever asked ChatGPT a question and received an answer, you’ve interacted with an LLM.

There are endless applications for AI in healthcare – from answering patient questions 24/7 through chatbots to generating post-appointment instructions to increasing billing and coding accuracy. And this is just a small sampling of what’s possible. 

Giving your clients or patients a glimpse of what’s possible with AI can help them see its potential to improve the quality, speed, and cost of their healthcare.

Demonstrating the human element

It may be tempting for some people to picture a future where robotic experts provide healthcare to patients with a cool, detached efficiency. In reality, AI will never replace human judgment, empathy, and experience. 

AI tools are meant to enhance human care, not replace it. By trusting AI to do the things it’s really good at, healthcare providers can make smarter decisions, improve admin efficiency, and help patients feel safely supported.

Need to document your AI tools conversation? Access a handy AI Charting Assistant Consent Form built right into Practice Better, or download a PDF version.

Here’s a glimpse of what AI-enhanced care could look like in a wellness practice:

  • Virtual Health Assistants. AI-powered chatbots provide 24/7 access to health information, answering client questions and offering guidance on best practices or next steps. They offer a reliable channel to support clients after hours and while you’re busy offering higher-value care.
  • Behavioral Coaching. AI can track habits, provide feedback on lifestyle choices, and motivate people to adopt healthier behaviors. It can also handle the small nudges, like reminders to drink water, take breaks, and do stress-relieving activities.
  • Appointment Scheduling. AI optimizes appointment scheduling by weighing factors such as patient preferences, practitioner availability, and urgency. This reduces wait times and improves patient satisfaction while freeing up your time to help more clients.
  • Data-Driven Insights. AI can aggregate and analyze patient data, surfacing insights about broader health trends, effective treatments, and interventions. You can use these insights to better tailor care plans.
  • Note-taking and charting. AI automates session note-taking with recorded transcriptions, summaries, and clear action items. This allows you to be fully present, focusing on your patient rather than on taking notes.

Check out Practice Better’s AI Charting Assistant in action.

Informed consent in AI implementation

When using generative AI in healthcare, it is crucial to tell clients which AI application you’re using and to explain its potential effects, both positive and negative, before obtaining their consent.

The person giving consent must also be able to voluntarily decide whether they wish to proceed.  

If your practice management platform offers forms, you can get consent during your intake process or even before a client is entered into the system.

Now, let’s imagine you are planning to incorporate Practice Better’s AI Charting Assistant into your practice. Here’s a summary of some steps you should take: 

  • Expectation: You must understand how your AI technology works so you can explain it to your clients. Any tech provider should offer useful information about how their products work.
    Action: We created an 8-minute video demonstrating the functionality of AI Charting Assistant. Additionally, our CTO provided a detailed discussion on privacy and security in AI.
  • Expectation: You should clearly distinguish between the role AI will play vs. the human element.
    Action: Reinforce that the AI helps you do your job better and won’t fully replace you. AI Charting Assistant transcribes sessions, creates clear session summaries, and captures action items to support clients between sessions. But you still review the output for completeness and accuracy.
  • Expectation: You should be able to clearly state the risks, benefits, and alternatives to using AI.
    Action: Privacy is always a concern for clients. In the case of AI Charting Assistant, your clients can rest assured that their data is used only for its intended purpose (healthcare) and is never sold to third parties. The tool is also fully compliant with all relevant healthcare regulations, including HIPAA. And since it provides a reliable record of each conversation, it frees you to focus more on client conversations. If a client still isn’t comfortable, you can turn AI Charting Assistant off and revert to manually taking notes during a call.
  • Expectation: Allow your clients to ask clarifying questions to feel more comfortable.
    Action: Make sure to offer a supportive environment for open dialogue around AI charting in your practice.

Practice Better customers on using AI

Practitioners who use AI Charting Assistant love that it allows them to truly be present for their clients instead of focusing on their notes. See how our customers approach consent, concerns, and communicating benefits. 

Get a handy AI Charting Assistant Consent Form built right into Practice Better, or download a PDF version.

Obtaining consent

“The use of AI has always been in my contract, so technically my clients should know about it. Since some of them don't read the contract carefully, I also let them know I use AI. Some of my clients are not comfortable with it at first. When I tell them about the benefits – for them and me – they are more than willing to have the recording available.”

  • Maria Manrique, Holistic Nutritionist @ The EAT Method

“I ask permission from the client first to make sure they're okay with recording. I haven't had any clients say no so far. I frame it as a resource that they can go back to. That way they can relax during the session itself and know that, if they forget anything, the information is there to refer back to.”

  • Candice Esposito, Naturopathic and Functional Medicine Doctor

Addressing concerns

“I let my client know that the AI Charting Assistant doesn't capture the video part of our calls. That’s one thing that they were very particular about, because some of them check in with me wearing their pajamas. I let them know that the output is a transcript, and we use it as an anchor to make sure we both know what happened during a session.” 

  • Maria Manrique, Holistic Nutritionist @ The EAT Method

Highlighting benefits

“The AI allows me to make sure I have the information captured without the need to type as we're talking. I'm actually able to pay more attention to my clients. Even though I'm not physically taking notes during the session, I have them at my fingertips afterwards.”

  • Maria Manrique, Holistic Nutritionist @ The EAT Method

“I've been modifying the AI-generated transcript a little bit, but I find it’s been helpful to speed up my notes. I still take notes, but the focus isn't as much on the note-taking as in the past. There’s a more relaxed feel to a session knowing that I can refer to a backup transcript.” 

  • Candice Esposito, Naturopathic and Functional Medicine Doctor

Get your patients and your practice ready for AI 

The AI in healthcare market is projected to grow from $14.6 billion to $102.7 billion by 2028. Big tech changes are coming as healthcare providers increasingly explore applications of generative AI in both administrative and clinical settings.

Increased exposure to generative AI in other areas of life will also increase client familiarity and, hopefully, comfort with the technology. By practicing transparency and encouraging ongoing communication, you can get your patients on board with the many benefits of generative AI in healthcare. 

Want to dip your toe in the AI waters? You can try our AI Charting Assistant today. The first 600 minutes are free! Existing Practice Better users can simply add it to their subscriptions and add-ons.


See what Practice Better’s AI Charting Assistant can do