by Steve Waldon
Editor’s Note: Madeleine McDowell, MD, Senior Principal, and Trine Tsouderos, Senior Vice President, Intelligence, contributed to this blog post.
Generative artificial intelligence (AI) has seized the public consciousness, signaling a potential disruption that could be one of the most profound in generations. Large language models, such as ChatGPT, are advancing rapidly, demonstrating an uncanny ability to process natural language and write compellingly (if not always accurately) by predicting word sequences.
As more AI tools become available and researchers work to understand their effects, a significant and lasting boost in productivity may be in the offing. Any major change, however, brings unanticipated consequences, and safeguards should be employed to mitigate unforeseen dangers as well as more obvious misuse of the technology. This is especially true in health care, so we expect the industry to start modestly, handing off low-level administrative duties that relieve paperwork burdens—and improve staff morale—without threatening patient health.
While health care companies should move deliberately and continue to follow the rapidly evolving research on AI, executives should prepare now to develop system-wide strategies. In today’s era of exponential change, no one can afford to take a wait-and-see approach.
Here are seven steps to get ready for generative AI.
- Reassure workers of job security. If the technology works as promised, it can be harnessed to do what most humans hate to do: paperwork. Physicians and other staff spend hours every day on paperwork, which stokes job burnout and cuts into physician time with patients. AI systems churn through mountains of paperwork 24/7, never tiring, never complaining. But AI will not run itself; humans still need to be in charge to ensure safety and accuracy. Emphasize to workers that while AI could handle some tasks, it also will help them be more efficient, eliminate unnecessary work, and potentially ease strain caused by existing workforce shortages. Consider upskilling employees who are relieved of some paperwork duties.
- Set clear decision-making authority. Form a multidisciplinary committee, including data-driven clinicians, to thoroughly vet AI technology before it is used. This committee should be involved in developing safeguards and protocols, educating clinical staff, and measuring whether AI works as promised. It should make plans for scaling and improving systems or scrapping them if they don’t deliver. Keep in mind, however, that ChatGPT and its rivals are improving rapidly; something that is not ready for prime time now may be in six months or a year.
- Invest in change management. Health systems will need to be agile to align organizational structure, workforce duties and new roles with AI’s growing role in the workplace. Physicians already can use generative AI to take medical notes, create visit summaries, produce documentation for insurance authorization, draft responses to patient questions, find billing codes for visits, and identify drug interactions, among other tasks. Eventually it could scan electronic health records, summarize patient histories in preparation for visits and help with shift changeovers in hospitals.
- Plan to hire. Systems will need data scientists, engineers and managers to create and oversee these new systems. One key task: convert information from text (progress notes, visit summaries, discharge summaries) into discrete data.
- Start looking for partners. Many hospitals will seek ready-made solutions rather than invest in development on their own. The decision to buy, build or partner may be unique for each institution and project. If you don’t have the in-house capabilities or resources, partnering with an established AI company with a track record of success can speed delivery of a solution.
- Enhance security. Devise an enterprise-wide data governance plan to bolster security and data integrity. For example, it should guard against biases stemming from data sets lacking in diversity. Ensure platform interoperability and limit bolt-on solutions that could exacerbate security issues.
- Power up. Assess power and storage capabilities. AI systems rely on massive amounts of data. A cloud-based data storage plan with adequate capacity and processing power is a must.
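To make the note-to-discrete-data task above concrete: long before a large language model is involved, teams often start with simple pattern extraction that turns free-text notes into structured fields. Here is a minimal sketch of that idea; the function name, field names, and note format are illustrative assumptions, not any EHR vendor’s API.

```python
import re

def extract_vitals(note: str) -> dict:
    """Pull a few discrete fields out of a free-text clinical note.

    Hypothetical example: the field names and note layout are
    illustrative, not tied to any specific EHR system.
    """
    patterns = {
        "blood_pressure": r"BP[:\s]+(\d{2,3}/\d{2,3})",
        "heart_rate": r"HR[:\s]+(\d{2,3})",
        "temperature_f": r"Temp[:\s]+(\d{2,3}(?:\.\d)?)",
    }
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, note, flags=re.IGNORECASE)
        if match:
            # Store the captured value as a discrete, queryable field.
            fields[name] = match.group(1)
    return fields

note = "Follow-up visit. BP: 128/82, HR: 71, Temp: 98.6. Patient reports improvement."
print(extract_vitals(note))
```

Real clinical text is far messier than this sketch, which is precisely why the hiring step calls for data scientists and engineers: production pipelines layer natural-language processing and validation on top of rules like these.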
Above all, health leaders need to establish a culture that embraces innovation and technological change. With these new AI tools, clinicians will be best positioned to make the case to their peers about the benefits of balancing autonomy and standardization. Still, health leaders should expect healthy skepticism from staffers until these systems prove their abilities—and their safety—beyond a doubt.