Encryption issues and health care non-compliance audit situations to avoid
THE DATA SECURITY AND PRIVACY OF PROTECTED HEALTH INFORMATION (PHI) has been a dominant concern in the health care sector since the enactment of HIPAA in 1996. Over that time, encryption has gradually grown in importance: from aspirational goal in the 1990s, to the foundation of a breach-notification safe harbor in the 2000s via the HITECH Act, to effective requirement in the 2010s.
Technology catches up
The shift began with HIPAA, whose Security Rule sets out the data security and privacy requirements applicable to health care providers (“covered entities”). Notably, the regulations do not mandate the use of encryption by those providers.
However, the Security Rule does mandate the implementation and maintenance of “reasonable and appropriate administrative, technical and physical safeguards” to protect electronic PHI (“ePHI”). Significantly, the rule does not mandate the use of any given tools or approaches. HHS recognizes that covered entities range from the smallest provider to the largest multi-state health plan.
The Security Rule is therefore flexible and scalable, allowing covered entities to analyze their own needs and implement solutions appropriate for their specific environments. When a covered entity is deciding which security measures to use, the rule does not dictate those measures but rather requires the entity to consider:
- Its size, complexity and capabilities;
- Its technical, hardware and software infrastructure;
- The costs of security measures; and
- The likelihood and possible impact of potential risks to ePHI.
Not required…but required
As such, covered entities must review and modify their security measures to address evolving threats to ePHI in an ever-changing environment. Encryption and decryption, for example, are “addressable” standards: their use is not strictly required, but a covered entity must either implement the addressable specification (i.e., encryption/decryption), or document why implementation would not be reasonable and appropriate and then adopt alternative or compensating safeguards as reasonable and appropriate.
With respect to encryption/decryption, then, providers are not required to implement them; instead, as part of their regularly conducted risk assessments, they must determine and document why non-use in favor of some alternative approach is reasonable and appropriate given the considerations facing that particular provider. In the event of a breach absent the use of encryption, the provider would then have to justify to HHS why encryption’s absence was “reasonable and appropriate” for its situation.
Encryption comes of age
Under this approach, encryption was rarely implemented, because in most situations providers could demonstrate that encryption tools were too burdensome for the typical systems of the 1990s and 2000s: the tools themselves were cumbersome to use in real time, and the hardware was not powerful enough to permit seamless real-time usage. When balancing costs, alternative options and threat levels, the decision not to use encryption was therefore deemed reasonable.
Over the last 20 years or so, however, two things gradually happened: encryption/decryption tools improved enormously, making real-time use of encrypted data an achievable goal, and systems became more powerful, allowing those improved tools to be used effectively. The variables relevant to the reasonability of encryption/decryption thus shifted, making its use both more clearly “reasonable and appropriate” and far less burdensome. The bar for declining to encrypt has risen accordingly, and the ever-growing cyber threats targeting health care providers have raised the level of urgency.
As noted above, a covered entity that opts not to use encryption is deciding to implement an equivalent alternative method of protection, and it must document the basis for that decision.
As such, given today’s technical environment, covered entities put themselves at risk when they rely on alternative controls in place of encryption. Those risks include:
- Failing to document the reasons why encryption cannot be used;
- Failing to document the particular hardships that encryption creates;
- Failing to implement a reasonable alternative to encryption; and
- Failing to implement an equivalent method of protection.
In short, when a covered entity now suffers a data breach, the fact that the data was not encrypted and that the alternative method of protection (assuming there was one) did not prevent the data breach will likely result in the commencement of a compliance action. It will be very difficult to argue that a protection method equivalent to encryption was used when it failed to prevent the loss of patient data.
The alternative-method approach is illustrated by recent HHS enforcement actions:
On Feb. 1, 2018, the Office for Civil Rights (“OCR”) at HHS entered into a no-fault settlement with Fresenius Medical Care North America on behalf of several of its covered entities. At one such entity, Magnolia, a USB drive containing the ePHI of 245 individuals was stolen from the car of a workforce member. Separately, at another, Augusta, an unencrypted laptop containing the ePHI of 10 individuals was stolen from an employee’s car. In both instances, OCR specifically determined that Magnolia and Augusta had failed to implement a mechanism to encrypt and decrypt ePHI. The total settlement amounted to $3.5 million.
One year earlier, on Feb. 1, 2017, OCR issued a notice of final determination to Children’s Medical Center of Dallas (“Children’s”) in response to a series of incidents involving unencrypted devices. An unencrypted, non-password-protected BlackBerry containing the ePHI of approximately 3,800 individuals was lost at Dallas/Fort Worth International Airport. Later, Children’s reported to OCR the theft from its premises of an unencrypted laptop containing the ePHI of nearly 2,500 individuals. OCR’s investigation concluded that, despite its knowledge of the risks, Children’s continued to issue unencrypted devices to its workforce members. As a result, Children’s was assessed a civil money penalty of approximately $3.2 million.
Health care providers should routinely analyze their data-protection risks to select the most effective security solutions. The reality is that advances in software and system hardware have made encryption increasingly viable, particularly in the face of the ever-growing threat of data breach in the health care field, leaving providers that opt against encryption with little room for acceptable explanations.
The risks facing health care providers who forgo data encryption are therefore rising, as the justifications for not encrypting are dissipating. Given this landscape, plus the size of the penalties being imposed by OCR, encryption appears to be the most viable technical solution available to most health care providers at this time.
KENNETH K. DORT is a partner in the intellectual property practice group of Drinker Biddle & Reath LLP, working out of its Chicago office. He focuses on various aspects of information technology and cybersecurity issues in numerous business sectors, including health care.
SUMAYA M. NOUSH is a health care associate at Drinker Biddle & Reath LLP, where she helps her clients navigate the daily challenges of running their operations while identifying opportunities where others see obstacles. Sumaya is an active contributor to the DBR on Data blog, where she regularly publishes material on a wide range of health care information privacy issues, including HIPAA enforcement efforts and government-issued cybersecurity guidance.