

AI Security and Privacy Risks in Healthcare




Artificial Intelligence (AI) holds great potential for the healthcare sector. Machine learning software already plays a big role in statistical prediction and data science, and its impact on treatment, diagnosis, operations, and patient monitoring is immense.

Interestingly, AI adoption in healthcare lags behind other industries. Job postings illustrate the gap: a paltry 1 out of 1,250 healthcare jobs requires AI skills. One reason for the low uptake is the difficulty of understanding how the algorithms work; the time-consuming nature of collecting data presents another challenge.

Fear of job loss among top management has also affected AI adoption. Finally, there are data privacy, liability, and regulatory compliance concerns. These last stumbling blocks form the basis of this article. Let's see what we can uncover.

AI in Healthcare

The most critical aspect of AI in healthcare is data. A great deal of information changes hands between healthcare providers, patients, and other stakeholders, and with it comes another huge challenge: data privacy.

How can the industry safeguard and secure the very data that AI technologies depend on? For machine learning models to perform well, they need large volumes of data.

What makes it more difficult is that the onus is not only on the users of AI technologies; there are regulatory compliance guidelines they must adhere to as well. The most significant is the Health Insurance Portability and Accountability Act (HIPAA), which covers protected health information (PHI).

PHI is any health-related information that any party can use to identify an individual. Without authorization, healthcare practitioners cannot disclose a patient's health condition or the services or payments rendered.

Two groups must follow HIPAA rules:

  • The first is covered entities, who create, collect, or share PHI. These are insurance companies, doctors, hospitals, and clinics.
  • The second is business associates, who work on behalf of covered entities. These include cloud and email providers, billing services, law firms, and more.

HIPAA lists specific identifiers that healthcare professionals must protect, and it is organized around a few core rules:

  • The Privacy Rule covers the what, how, when, and why of PHI sharing.
  • The Security Rule covers electronic PHI, including technical, administrative, and physical safeguards.
  • The Breach Notification Rule sets out breach notification and reporting protocols.
  • The Omnibus Rule updates all previous HIPAA rules.

Please note that other state, regional, and country-specific data protection laws also exist. The General Data Protection Regulation (GDPR), which protects EU citizens' data, is a prominent example of a law that has seen its fair share of success.

Privacy Violations in the Healthcare Sector

Patient data can leak in various ways, violating HIPAA and other regulatory laws.

  • Information can leak due to recklessness on the part of healthcare providers.
  • Data breaches can happen due to insufficient cybersecurity measures. There is a need to check and vet external access points to limit any hacker attacks.
  • Third-party AI vendor risks are a significant problem that healthcare adopters must be aware of. There is a need to establish rigorous data protection standards that apply to the vendors. Only then can healthcare providers trust AI vendors with their private information.
  • Data location and movement are also an issue. In 2016, DeepMind (a Google subsidiary) partnered with the Royal Free London NHS Foundation Trust. At some point, Google transferred patient data from the UK to the US. The move drew concerns about patient data moving from one jurisdiction to another, and it highlighted the questions raised when such commercial healthcare AI is implemented.

Shortcomings around Privacy Issues

There are some shortcomings in data generation and management.

Regulatory loopholes: Some major loopholes are worth noting. HIPAA does not cover third-party vendors such as AI technology companies. Unless these companies are business associates of covered entities, they can access sensitive patient data without adhering to the compliance guidelines.

Patient consent: Patients may not even know that companies are collecting health-related information. Facebook launched a campaign in 2017 to raise awareness around suicide. The company used an AI suicide detection algorithm: the software would gather data without users' consent and use it to assess a user's mental state. Facebook is not a HIPAA-covered entity, so it was not breaking any laws. The social media giant was also unclear about how long it would store the collected data.

Data de-identification: De-identification aims to remove PHI from existing data. But AI algorithms have proved able to re-identify it. When users add new data to the software, the algorithms create linkages between datasets and can eventually identify the original data source. This further exposes the vulnerability of AI-era data privacy.
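The linkage problem described above can be sketched in a few lines. This is an illustrative toy example, not a real attack: the datasets, names, and field choices are all hypothetical. Even after direct identifiers (names, SSNs) are stripped, quasi-identifiers such as ZIP code, birth date, and sex can be joined against a public dataset to re-identify a record.

```python
# Toy sketch of a linkage attack on "de-identified" records.
# All data and field names here are hypothetical.

deidentified_health = [
    # direct identifiers removed, but quasi-identifiers remain
    {"zip": "02138", "birth_date": "1945-07-31", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birth_date": "1960-02-14", "sex": "M", "diagnosis": "diabetes"},
]

public_voter_roll = [
    # an openly available dataset that still carries names
    {"name": "J. Doe", "zip": "02138", "birth_date": "1945-07-31", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "birth_date", "sex")

def reidentify(health_rows, public_rows):
    """Join the two datasets on shared quasi-identifiers."""
    matches = []
    for h in health_rows:
        for p in public_rows:
            if all(h[k] == p[k] for k in QUASI_IDENTIFIERS):
                matches.append({"name": p["name"], "diagnosis": h["diagnosis"]})
    return matches

print(reidentify(deidentified_health, public_voter_roll))
# A single match links J. Doe to a hypertension diagnosis.
```

This is why simply deleting names is not enough: effective de-identification must also generalize or suppress quasi-identifiers.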

Sale of patient data: Players who do not fall under HIPAA sell patient data to other companies. Genetic testing companies in particular have come under scrutiny for selling patient data to biotech and pharmaceutical firms.

Safeguarding Healthcare Data

Healthcare providers must take proactive steps to enhance data privacy and security. This becomes critical in adopting AI technologies that need big data to work. Some workable steps include:

  • Routine auditing of data and information systems
  • Establishing proper access controls on who can access the data
  • Proper training for all stakeholders in the use of AI in healthcare. Particular focus should be on PHI privacy rules, breach notifications, and security obligations.
  • Better communication to patients about how the companies will use their data
  • A stronger legal framework for the collection and use of patient data, including sealing any loopholes that some companies can wiggle through. Self-regulation by healthcare professionals does not seem to be a workable solution.
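The first two steps above, routine auditing and access controls, often go hand in hand in practice. Below is a minimal sketch of role-based access control with an audit trail for PHI reads, written under assumed roles, users, and record IDs (all hypothetical); real systems would use an identity provider and tamper-evident log storage.

```python
# Minimal sketch: role-based access control plus an audit trail for PHI reads.
# Roles, users, and record IDs are hypothetical.
import datetime

ROLE_PERMISSIONS = {
    "physician": {"read_phi", "write_phi"},
    "billing": {"read_billing"},
}

audit_log = []

def access_phi(user, role, record_id):
    """Check the role's permissions, log the attempt, then allow or deny."""
    allowed = "read_phi" in ROLE_PERMISSIONS.get(role, set())
    # every attempt, allowed or denied, is recorded for routine auditing
    audit_log.append({
        "user": user,
        "record": record_id,
        "allowed": allowed,
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    if not allowed:
        raise PermissionError(f"{user} ({role}) may not read PHI record {record_id}")
    return f"PHI record {record_id}"
```

Logging denials as well as grants matters: a spike of denied attempts against one record is exactly the kind of signal a routine audit should surface.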

Final Thoughts

It would be hard to argue against the benefits of AI in healthcare. This technology can be a game changer in disease diagnosis, treatment, and management. But we also can't ignore the challenges that come with big data management.

Generating, collecting, and handling tons of data is tricky enough, and on top of it sits the pressing issue of ensuring data security and privacy. Regulatory laws like the GDPR and HIPAA have been a big help.

But the industry players also need to take a more active role. Patients should also have a greater say about where their data ends up.
