Artificial Intelligence and the Health Insurance Portability and Accountability Act of 1996 (HIPAA)
Artificial intelligence offers significant promise for the health care industry. For an individual patient, AI can repeatedly examine health data over time and recommend adjustments as the data changes to improve the patient's health. On a large scale, AI can examine data from numerous patients and resources to better analyze which treatments work best, when treatments should be changed, and what new treatments may become available. AI works by continually reexamining health data to learn. Unlike static software programs, AI uses what it has learned to adjust its own model, fine-tuning and adapting as new information is received and reviewed.
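For readers who want a concrete picture of that loop, here is a minimal Python sketch using scikit-learn's SGDClassifier, whose partial_fit method updates a model incrementally as new data arrives. The vital-sign features and risk labels are invented purely for illustration; this is a sketch of incremental learning, not a clinical model.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Hypothetical features: [heart_rate, systolic_bp, glucose]
# Hypothetical labels: 1 = elevated risk, 0 = normal
model = SGDClassifier(loss="log_loss", random_state=0)

# Initial training on an early batch of patient readings.
X_initial = np.array([[72, 118, 95], [95, 150, 180], [68, 110, 88]])
y_initial = np.array([0, 1, 0])
model.partial_fit(X_initial, y_initial, classes=[0, 1])

# As new readings arrive, the model updates its parameters in place
# rather than being rewritten by a programmer -- the continual
# reexamination described above.
X_new = np.array([[88, 145, 160]])
y_new = np.array([1])
model.partial_fit(X_new, y_new)

print(model.predict([[90, 148, 170]]))  # e.g., [1]
```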
HIPAA essentially protects patient health information in the form of electronic records. Typically, insurance companies, hospitals, and health practices are covered by HIPAA. Developers may be bound by HIPAA depending on their relationships with the covered health companies and whether electronic patient health information is being accessed.
Basic HIPAA requirements
HIPAA generally applies to health insurance companies and to the health providers (hospitals, doctors, nursing homes, psychologists, chiropractors, pharmacies, and others) that bill those insurers. It also applies to:
- Healthcare clearinghouses
- Companies that administer the health plans
- IT professionals who work with electronic protected health information (ePHI) records, and other people and entities with access to ePHI records
HIPAA was enacted in 1996 and will likely need to be updated to address AI concerns. Electronic patient health records are more commonly called electronic health records (EHRs).
Protected information includes:
- Information that medical professionals put into your medical records
- Conversations doctors have with nurses and other people about your treatment
- Billing information
Health insurance companies, medical providers, and others covered by HIPAA must take the following compliance steps to help secure your EHRs:
- Safeguards must be put in place to protect your EHRs
- Disclosures of the information must be kept to a minimum
- Protocols need to be created to limit who has access to the records (a minimal sketch of such an access protocol follows this list)
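As a rough illustration of the access-limiting protocols in the last bullet, here is a minimal role-based access check in Python. The roles, record fields, and policy are hypothetical; real HIPAA access controls also require authentication, audit trails, and a documented minimum-necessary analysis.

```python
import logging

logging.basicConfig(level=logging.INFO)

# Hypothetical policy: each role sees only the fields it needs
# (the "minimum necessary" idea), and every access is logged.
ALLOWED_FIELDS = {
    "physician": {"name", "diagnosis", "medications", "lab_results"},
    "billing": {"name", "insurance_id", "procedure_codes"},
    "researcher": set(),  # no identified data without authorization
}

def read_record(role: str, record: dict) -> dict:
    """Return only the fields the given role is permitted to see."""
    allowed = ALLOWED_FIELDS.get(role, set())
    logging.info("access attempt: role=%s fields=%s", role, sorted(allowed))
    return {k: v for k, v in record.items() if k in allowed}

record = {"name": "Jane Doe", "diagnosis": "hypertension",
          "insurance_id": "XYZ-123", "lab_results": "A1C 5.6"}
print(read_record("billing", record))
# {'name': 'Jane Doe', 'insurance_id': 'XYZ-123'}
```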
Generally, there are privacy limits on who can have access to your electronic health records. Some qualified exceptions are allowed so that health providers can share information with other doctors and other people (such as a spouse the patient has authorized to see the records) to ensure patients are getting the best health care possible.
Some of the ways AI is helping patients even though it has access to PHI
Some of the many benefits of AI that may involve access to health records of patients include:
- Algorithms that review MRI scans to find signs of illness that even a trained clinician might miss.
- The use of supercomputers to research new drugs by running simulations based on routinely updated patient information.
- Scanning DNA data to help track cancer and other diseases before they become life threatening.
- Robotic equipment that examines patient information to help doctors perform surgeries, such as those that require more precision than human hands and eyes allow.
All of these involve PHI and implicate HIPAA concerns.
Some of the factors that complicate determining whether HIPAA applies to AI
All of these procedures and new software devices should be reviewed with an experienced healthcare lawyer to determine whether they implicate HIPAA and what steps the makers and users of these products can take to demonstrate compliance with the law.
- AI technology is hard to analyze. “As healthcare AIs grow more complex and their decision-making patterns more opaque to human care providers, it will grow increasingly difficult to determine when, where, how, and even if they are doing anything that might fall under the HIPAA umbrella.”
Part of the solution will lie in the software design itself, so that the doctors, nurses, and others who use the AI are alerted to HIPAA issues. This has the drawback of requiring those doctors, nurses, and others to become tech-savvy.
- Potential misuse of ePHI. A recent article in USA Today explained that another part of the problem with using AI is the potential for misuse of the data. For example, employers who give their workers Fitbits to analyze their health and performance may be violating the workers' health care rights if the information is shared with researchers, other workers, or company staff without the workers' consent. This information could be used to terminate a worker with certain disorders, or by marketing companies to direct marketing toward people with certain conditions.
- Reverse uses. Even when information about the identity of a patient is stripped from the data the AI software analyzes, AI can essentially figure out which patient's data is being examined through back-door methods that show, for example, that a patient with certain medical characteristics must be John Smith. The sketch below illustrates the idea.
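To make that re-identification risk concrete, here is a minimal Python sketch of a linkage attack: "de-identified" clinical records are joined with a public dataset on shared quasi-identifiers, re-attaching names to diagnoses. All names, values, and column labels are invented for illustration.

```python
import pandas as pd

# Hypothetical "de-identified" clinical records: names and IDs removed,
# but quasi-identifiers (ZIP code, birth date, sex) left in place.
clinical = pd.DataFrame({
    "zip": ["02138", "02139"],
    "birth_date": ["1954-07-31", "1961-02-13"],
    "sex": ["F", "M"],
    "diagnosis": ["hypertension", "diabetes"],
})

# Hypothetical public record (e.g., a voter roll) containing names
# alongside the same quasi-identifiers.
public = pd.DataFrame({
    "name": ["Jane Doe", "John Smith"],
    "zip": ["02138", "02139"],
    "birth_date": ["1954-07-31", "1961-02-13"],
    "sex": ["F", "M"],
})

# Joining on the shared quasi-identifiers re-attaches names to
# "anonymous" diagnoses -- the classic linkage attack.
reidentified = clinical.merge(public, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])
```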
AI developers and HIPAA compliance issues
Due to the broad range of uses of AI, manufacturers and sellers of AI software will likely be subject to future regulations if they are not already covered by HIPAA.
Current complicated legal issues include:
- Is the developer a business associate? Whether AI developers are already covered depends, in large part, on whether they are considered a business associate of the hospital, medical practice, or insurance company. The more directly the developer works with the healthcare industry, the more likely the developer is to be considered a business associate or partner, and thus subject to HIPAA compliance rules.
- Can contracts be used to determine business associate status? Contracts between developers (researchers and manufacturers, for example) and covered health companies may need to address who is paying for the development of the software (the health company, the government through research grants, the private sector, or some private organization). The authorized uses of the AI software, along with other issues, should also be addressed. The more the contract indicates that a company or entity is a business associate of the hospital, insurance company, or health provider, the more likely it will need to ensure HIPAA compliance.
- Having the covered health company strip the ePHI from the data. In one scenario, the health care provider "de-identifies" the EHR/PHI so that the developer doesn't know who the patient is. Once the key patient information is stripped out, then, in theory, the developer should be able to use it without violating HIPAA, even if the developer is considered a business associate. De-identifying sounds like a good solution, but many health providers don't have the time, finances, or technical know-how to strip out the patient health information. Even when they do have the necessary tools, they may make mistakes that breach HIPAA. (A minimal de-identification sketch follows this list.)
- Having the developer strip out the ePHI records. The developer could agree to de-identify the patient health information if, in essence, it accepted the status of business associate and thus agreed to be bound by HIPAA, including HIPAA's Security Rule and other data protection requirements. Even if the developer doesn't start by using PHI records, if the AI software accesses any healthcare information once it is installed, the developer may find itself in the role of a business associate bound by HIPAA.
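As a concrete, if simplified, picture of what "stripping out" identifiers involves, the following Python sketch removes direct identifiers and generalizes quasi-identifiers in the spirit of HIPAA's Safe Harbor method. The field names are hypothetical, and a production system would have to address all eighteen Safe Harbor identifier categories, not the handful shown here.

```python
# Direct identifiers to remove (a small subset of HIPAA's Safe Harbor
# list; real systems must cover all 18 identifier categories).
DIRECT_IDENTIFIERS = {"name", "ssn", "mrn", "phone", "email", "address"}

def deidentify(record: dict) -> dict:
    """Return a copy of a patient record with direct identifiers
    dropped and quasi-identifiers generalized."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Generalize ZIP code to its first three digits, as Safe Harbor
    # permits for most ZIP codes.
    if "zip" in clean:
        clean["zip"] = clean["zip"][:3] + "**"
    # Replace the exact birth date with the year only; Safe Harbor also
    # requires ages over 89 to be aggregated into a single category.
    if "birth_date" in clean:
        clean["birth_year"] = clean.pop("birth_date")[:4]
    return clean

record = {"name": "John Smith", "ssn": "123-45-6789", "mrn": "A-1001",
          "zip": "02139", "birth_date": "1961-02-13", "diagnosis": "diabetes"}
print(deidentify(record))
# {'zip': '021**', 'diagnosis': 'diabetes', 'birth_year': '1961'}
```

As the bullet on reverse uses above shows, even a careful scrub like this can leave enough quasi-identifying detail for re-identification, which is why who performs the de-identification, and under what agreement, matters legally.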
HIPAA and software apps
Recently, the Department of Health and Human Services, through the Office for Civil Rights (OCR), released five questions (with corresponding answers) that discuss software apps and HIPAA. While these discussions don't specifically mention artificial intelligence, companies and researchers that use AI need to understand how they affect their HIPAA requirements.
- Does a HIPAA covered entity that fulfills an individual’s request to transmit electronic protected health information (ePHI) to an application or other software (collectively “app”) bear liability under the HIPAA Privacy, Security, or Breach Notification Rules (HIPAA Rules) for the app’s use or disclosure of the health information it received?
- What liability does a covered entity face if it fulfills an individual’s request to send their ePHI using an unsecure method to an app?
- Where an individual directs a covered entity to send ePHI to a designated app, does a covered entity’s electronic health record (EHR) system developer bear HIPAA liability after completing the transmission of ePHI to the app on behalf of the covered entity?
- Can a covered entity refuse to disclose ePHI to an app chosen by an individual because of concerns about how the app will use or disclose the ePHI it receives?
- Does HIPAA require a covered entity or its EHR system developer to enter into a business associate agreement with an app designated by the individual in order to transmit ePHI to the app?
HIPAA violations
Any developer and any medical entity covered by HIPAA can be subject to both civil and criminal penalties. Developers who are covered need to work with experienced healthcare compliance lawyers to determine what security protocols are required, how patient ePHI records should be kept private, and what steps should be taken if they become aware of any breaches.
In addition to HIPAA, other data protection laws in America and abroad may apply.
You think you've got HIPAA compliance handled, hoping to stay ahead of steep federal penalties, and then you learn that is just the beginning of the story. HIPAA compliance itself is thorny. […]