Artificial Intelligence (AI) Considerations: Prioritizing Security

Prioritizing security is one key consideration when using generative artificial intelligence (AI) clinically.

Sensitive, proprietary, and personally identifiable information (PII) should not be entered into generative AI tools. Remember that if you are a covered entity under the Health Insurance Portability and Accountability Act (HIPAA), you must safeguard your patients' protected health information (PHI).

Generative AI tools typically save the information entered into the platform as a way of gathering more data for learning. Such tools are therefore not HIPAA compliant, and protected health information should not be shared with them. Assume that a generative AI tool is not HIPAA compliant unless it has been vetted by your facility's privacy, legal, or compliance team. This caution extends to generative AI tools embedded within an otherwise HIPAA-compliant program, such as an electronic medical record (EMR).

Confidentiality must remain a top priority—regardless of what tools or technology the clinician uses. Many products that use generative AI list terms of use and privacy information for consumers. In your role as a clinician, you should not upload PHI to a generative AI application without express direction from your facility (e.g., AI tools integrated into a vetted electronic health record).

Suppose, for example, you are using a publicly available generative AI tool—ChatGPT, for instance. Even if you have a paid account, do not share clients' or staff members' PII.

Clinicians need to analyze which pieces of information may be considered "personally identifiable."

Example: If your client has a rare disorder or could be identified by their treatment protocol, then avoid naming the disorder in your AI prompt. Doing so could reveal the person's PHI if they are the only patient in the practice with that rare disorder.

The same goes for the names of academic programs, clinics, private practices, and other organizations: Treat them as you would an individual patient's name, and do not upload them to any generative AI tool.

In summary, as a rule, do NOT give generative AI access to PHI or to other sensitive, proprietary, or personally identifiable information. This applies to individuals' names as well as to the names of organizations.

Do your research before you invest time, money, and energy in any new tool. Before purchasing, downloading, or using generative AI software or tools for clinical work, consider taking these precautionary steps:

  • Connect with your employer’s technology and digital security team. If you own your practice, consult with legal counsel, product developers, or other consultants to understand your business and software security needs.
  • Consult with your attorney or employer to ensure that you’re in compliance with state and local laws and with payer and facility regulations.
  • Turn off software or in-app permissions that grant access to your contacts or other applications.
  • Review settings and in-app permissions that allow you to save data.

Examples

See the following use cases of audiologists and SLPs prioritizing security when using generative AI tools.

An audiology team is supervising two audiology students in their final year of training. The two students are progressing nicely, but the team notes that

  • each student has strengths in different areas and
  • each student would benefit from additional support in their respective area.

The primary supervisor wants to design a 6-month milestone-based learning path for each student. Using the provided rubric from each student’s university, the audiologist creates a timeline for skill development:

  • The audiologist uses a generative AI platform to create an individualized schedule for each student that provides more time working with audiologists who specialize in each student's respective weaker area.
  • The audiologist avoids entering any information that would identify the students or their graduate programs.
  • The audiology team uses the timeline and schedule to set expectations for the students and for their supervising audiologists.

A school-based SLP has received a free trial of an individualized education program (IEP) goal-writing tool. They reach out to their leadership team to determine how their school can trial this tool.

After discussing with their administration how to proceed with using the tool, and getting the go-ahead, the SLP takes the following steps:

  • deidentifies the individuals and organizations involved
  • inputs the case history—including areas of need, areas of strength, frequency, and location of services
  • uses the AI tool to generate possible IEP goals and short-term objectives for the student
  • reviews, edits, and individualizes the generated goals for the student
  • presents the completed goals at the student’s next IEP meeting
