The use of artificial intelligence (AI) in business has drawn considerable media attention lately. More and more businesses are using AI to analyze large datasets, simplify complex tasks, enhance customer interactions, and streamline operations.
So, how can healthcare providers, such as pharmacies, benefit from using artificial intelligence and specific tools such as ChatGPT?
AI, and more specifically generative AI, holds promise for use in healthcare. Generative AI is a type of artificial intelligence that can write code, power customer-service chatbots, and create new content such as text or images. It can ‘learn’ and adapt without further explicit instructions, using algorithms that identify patterns in the large volumes of data it accesses.
The most talked-about generative AI tool is OpenAI’s ChatGPT. Since its launch in November 2022, ChatGPT has dominated the AI tool market, with more than 60% market share and more than 14 billion visits to its website in its first year.
The University of Waterloo’s School of Pharmacy in Canada began using generative AI in early 2018 to bolster experiential learning for its pharmacy students and nurse practitioners. The school designed an AI-assisted course on warfarin management, a topic that typically requires weeks of experiential learning. Using the tool, clinical associate professor Dr. Jeff Nagge created AI versions of patients presenting the most common scenarios a pharmacist faces when managing warfarin.
“I was initially a skeptic,” says Nagge. But using AI to teach students proved to be “just as good or better than traditional learning,” he says.
The school’s experience shows that generative AI, when implemented with appropriate guidelines, can be an invaluable tool for preparing pharmacy students for real-world situations.
ChatGPT and similar tools can support a range of healthcare tasks. In a community pharmacy setting, for example, generative models can help improve medication adherence and streamline a pharmacist’s workflow.
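For illustration only, one common adherence intervention, automated refill follow-up, can be sketched in a few lines of Python. The `Fill` record, the `flag_overdue` helper, and the three-day grace period below are all assumptions for this sketch, not part of any specific pharmacy product:

```python
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class Fill:
    """One dispensing event: when it was filled and how many days it covers."""
    fill_date: date
    days_supply: int


def flag_overdue(fills: list[Fill], today: date, grace_days: int = 3) -> bool:
    """Flag a patient for follow-up when the most recent fill has run out
    and a grace period has passed without a refill."""
    last = max(fills, key=lambda f: f.fill_date)
    run_out = last.fill_date + timedelta(days=last.days_supply)
    return today > run_out + timedelta(days=grace_days)


# A 30-day fill on Jan 1 runs out Jan 31; by Feb 10 the patient is overdue.
print(flag_overdue([Fill(date(2024, 1, 1), 30)], date(2024, 2, 10)))  # True
```

In practice, a pharmacy system would layer the outreach message (possibly drafted by a generative model) on top of a simple, auditable rule like this one, keeping the clinical logic outside the AI itself.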
Given all this potential, what is holding healthcare companies back from fully embracing AI?
There are a number of reasons.
In a May and June 2023 study of Jordanian pharmacists who used ChatGPT-powered chatbots in a pharmacy, 70% of the participants recognized the advantages of ChatGPT as a patient education tool, for example in producing marketing, social media, or educational content.
However, respondents were concerned about ChatGPT’s accuracy when providing medicine-related information. More than 55% of participants worried about bias and ‘discriminatory patterns’ the tool could pick up from the data it was trained on.
Experts caution against using ChatGPT or any AI tool to generate a diagnosis. The tool itself cautions users: “ChatGPT can make mistakes. Consider checking important information.” And when asked about medical conditions, it stresses the importance of “consulting with a healthcare professional.”
At the end of the day, AI tools are just that: tools. They are only as effective as the input you give them. For now, the consensus is that AI is safe for answering general queries, summarizing large amounts of data or text, and other non-critical tasks.
The American Institute of Healthcare Compliance (AIHC) notes the tool’s potential but cautions against unguarded use: safeguards must be in place to prevent confidentiality breaches, because AI tools such as chatbots can expose user personal information, including protected health information (PHI) and personally identifiable information (PII). The AIHC also recognizes that while the Health Insurance Portability and Accountability Act (HIPAA) has been updated since its implementation in 1996, technology has far outpaced it.
Worries about security and breaches do not seem unfounded.
In early 2023, OpenAI, the creators of ChatGPT, identified a bug in the code that leaked user chat histories, billing information, and partial credit card information, highlighting the need for careful planning when dealing with sensitive data.
At present, the two biggest hurdles to fully utilizing ChatGPT, or any other general-purpose AI tool, in a healthcare setting are the tools’ lack of HIPAA compliance and concerns about their accuracy in generating medical information.
Unlike general-purpose generative AI tools such as ChatGPT, Google Bard, and YouChat, which can only competently perform general tasks, Google’s Med-PaLM 2 was developed specifically for medical tasks. And unlike the other AI tools presently offered, Med-PaLM 2 supports HIPAA compliance.
Med-PaLM 2 is also notable for scoring an unprecedented 86.5% on United States Medical Licensing Examination (USMLE)-style questions, a standard benchmark for evaluating medical question-answering performance.
In a review comparing answer quality on medical questions, clinicians often preferred Med-PaLM 2’s answers to real physicians’ answers when evaluated against criteria including scientific factuality, precision, reasoning, bias, and likelihood of possible harm.
While still in development, Med-PaLM 2 is currently being tested at some hospitals, including the Mayo Clinic. Other organizations testing it include pharmaceutical maker Bayer, hospital operator HCA Healthcare, and health informatics company MEDITECH.
As AI tools continue to rapidly evolve, the complexities of deploying the technology to ensure HIPAA compliance are becoming better understood.
When using AI in a healthcare setting, protecting PHI and PII should be a central consideration, starting with limiting what patient information is ever sent to the tool.
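As one illustration of such a safeguard (a minimal sketch, not a compliant de-identification system), obvious identifiers can be replaced with placeholder tokens before any text leaves the pharmacy’s systems. The patterns and token names below are assumptions for this sketch; a production system would rely on a vetted de-identification library rather than ad-hoc regexes:

```python
import re

# Hypothetical patterns for a few common identifiers. Real PHI takes many
# more forms (names, dates, addresses), so this is only a first line of defense.
PATTERNS = {
    "[EMAIL]": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "[SSN]": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "[PHONE]": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
}


def redact(text: str) -> str:
    """Replace matching identifiers with placeholder tokens so the raw
    values are never included in a prompt sent to an external AI tool."""
    for token, pattern in PATTERNS.items():
        text = pattern.sub(token, text)
    return text


print(redact("Call 555-123-4567 or email jane@example.com"))
# Call [PHONE] or email [EMAIL]
```

Redaction of this kind addresses only one risk; HIPAA compliance also depends on contracts, access controls, and audit trails that no code snippet can supply.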
In summary, while using AI for virtual patient consultations in a pharmacy setting can enhance service quality and accessibility, it must be done with a strong emphasis on security and compliance with healthcare privacy laws. Careful planning and adherence to best practices for data protection are essential to avoid violating HIPAA and to maintain patient confidentiality.