Prioritizing security is one key consideration when using generative artificial intelligence (AI) clinically. Make sure you also consider the following:
Sensitive, proprietary, and personally identifiable information (PII) should not be entered into generative AI tools. Remember that if you are a covered entity under the Health Insurance Portability and Accountability Act (HIPAA), you must safeguard protected health information (PHI) for your patients.
Many generative AI tools save the information entered into the platform as a way of gathering more data for training. Therefore, assume that a generative AI tool is not HIPAA compliant, and do not share protected health information, unless the tool has been vetted by your facility's privacy, legal, or compliance team. This may include generative AI tools embedded within an otherwise HIPAA-compliant program, such as an electronic medical record (EMR).
Confidentiality must remain a top priority—regardless of what tools or technology the clinician uses. Many products that use generative AI list terms of use and privacy information for consumers. In your role as a clinician, you should not upload PHI to a generative AI application without express direction from your facility (e.g., AI tools integrated into a vetted electronic health record).
Suppose, for example, that you are using a publicly available generative AI tool such as ChatGPT. Even if you have a paid account, do not share clients' or staff members' PII.
The clinician needs to analyze which pieces of information may be considered "personally identifiable."
Example: If your client has a rare disorder or may be identifiable by their treatment protocol, then avoid naming the disorder in your AI prompt; doing so could reveal the person's PHI if they are the only patient in the practice with that disorder.
The same goes for the names of academic programs, clinics, private practices, and other organizations: Treat them as you would an individual patient's name, and do not upload them to any generative AI tool.
In summary, as a rule, do NOT give generative AI tools access to PHI or other sensitive, proprietary, or personally identifiable information. This applies to individuals' names as well as the names of organizations.
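For teams that build or vet their own integrations, one practical safeguard is to screen text for known identifiers before it ever reaches a generative AI service. The sketch below is a minimal, hypothetical illustration in Python; the identifier list, function name, and workflow are assumptions for this example, not features of any particular product or EMR.

```python
# Hypothetical illustration only: a minimal pre-prompt screen that blocks text
# containing known identifiers before it is sent to any generative AI service.
# The identifier list and function name are assumptions made for this sketch.

# Identifiers your facility has flagged (e.g., patient names, clinic names).
BLOCKED_IDENTIFIERS = {"Jane Doe", "Smalltown Pediatric Clinic"}


def screen_prompt(prompt: str) -> str:
    """Raise an error if the prompt contains any flagged identifier."""
    found = [term for term in BLOCKED_IDENTIFIERS if term.lower() in prompt.lower()]
    if found:
        raise ValueError(f"Prompt blocked: contains potential PHI/PII: {found}")
    return prompt


if __name__ == "__main__":
    safe = screen_prompt("Draft a home practice plan for /r/ articulation goals.")
    print(safe)  # passes the screen

    try:
        screen_prompt("Summarize Jane Doe's progress at Smalltown Pediatric Clinic.")
    except ValueError as err:
        print(err)  # blocked before any data leaves the local system
```

A screen like this is only a backstop for human judgment: it catches listed terms, not indirect identifiers such as a rare diagnosis or a unique treatment protocol.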
Before you invest time, money, and energy in any new tool, do your research. Before purchasing, downloading, or using generative AI software or tools for clinical work, consider taking these precautionary steps:
See the following use cases of audiologists and SLPs prioritizing security when using generative AI tools.
An audiology team is supervising two audiology students in their final year of training. The two students are progressing nicely, but the team notes that
The primary supervisor wants to design a 6-month milestone-based learning path for each student. Using the provided rubric from each student’s university, the audiologist creates a timeline for skill development:
A school-based SLP has received a free trial of an individualized education program (IEP) goal-writing tool. They reach out to their leadership team to determine how their school can trial this tool.
After discussing with their administration how to proceed with using the tool, and getting the go-ahead, the SLP takes the following steps: