Susanne Fortunato

HIPAA Is Not the Villain

Every time someone in tech decides healthcare is too hard to touch, HIPAA is usually the reason they give. The regulation is intimidating. The compliance burden is enormous. The penalties are severe. It’s just not worth the risk.

Most of that is wrong, and the people saying it usually haven’t actually read the law.

HIPAA — the Health Insurance Portability and Accountability Act — was designed to have a long shelf life. It’s principle-based rather than prescriptive on purpose: the people who wrote it knew that technology would change massively and they didn’t want the regulation to become obsolete. That design choice is also why it looks so big and intimidating on paper. When a regulation is written to last across decades of technological change, it uses a lot of words to describe a small number of durable principles.

The actual technical requirements of HIPAA come down to three things: data must be encrypted at rest, encrypted in transit, and there must be an audit trail of who has accessed what information. That’s it. If you can name a modern web application that doesn’t already meet those three requirements, you’ve named a product you shouldn’t use. Any serious software product built in the last decade, and probably longer, handles encryption and access logging as a matter of course. HIPAA didn’t invent those requirements — it just wrote them into law for healthcare.
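To make the audit-trail requirement concrete, here is a minimal sketch in Python of the kind of access logging any competent application already does. Every name here (`AuditLog`, `record_access`, the record IDs) is hypothetical and for illustration only; a real system would use its framework's logging facilities and write to an encrypted, tamper-evident store rather than an in-memory list.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AccessEvent:
    # Who accessed what, when: the core of an audit trail.
    user_id: str
    patient_record_id: str
    action: str          # e.g. "read", "update"
    timestamp: str

class AuditLog:
    """Append-only log of record accesses (illustrative, in-memory)."""

    def __init__(self):
        self._events = []

    def record_access(self, user_id, patient_record_id, action):
        event = AccessEvent(
            user_id=user_id,
            patient_record_id=patient_record_id,
            action=action,
            timestamp=datetime.now(timezone.utc).isoformat(),
        )
        self._events.append(event)
        return event

    def events_for_record(self, patient_record_id):
        # Answer the auditor's question: who touched this record?
        return [e for e in self._events
                if e.patient_record_id == patient_record_id]

log = AuditLog()
log.record_access("dr_chen", "rec-001", "read")
log.record_access("billing_7", "rec-001", "read")
log.record_access("dr_chen", "rec-002", "update")

history = log.events_for_record("rec-001")
print(json.dumps([asdict(e) for e in history], indent=2))
```

The point isn't the implementation, which is trivial; it's that "who accessed what, and when" is a question most production systems can already answer from the logs they keep anyway.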

Where things get more complicated is on the operational side. HIPAA requires a compliance officer, a security officer, and documented policies and procedures for what happens when there are data breaches. It also requires conformance to specific reporting standards when breaches occur. That’s the layer that actually trips organizations up, and it’s also where the real penalties live. Entities get in trouble for data breaches, and they get in more trouble for improperly reporting on those breaches. There is no such thing as being officially certified as HIPAA compliant — that’s not a real designation. But there is such a thing as getting in trouble for getting it wrong or hiding a data breach.

The other thing most people don’t know: HIPAA regulates who has the right to access or share health information, not the information itself. You are allowed to share your own medical records with anyone you want, in any way you want. You could print all of your medical records and paper them across every tree in the largest park near your house and there would be no HIPAA problem with that. The law applies when someone accesses information they don’t have a right to, or shares information they don’t have a right to share. A patient sharing their own information is neither.

Which makes it worth sitting with this: the P in HIPAA stands for Portability. Sharing records for treatment purposes isn’t just permitted under the law — it’s deeply encouraged. When an organization tells a patient that they can’t transfer records to another hospital because of HIPAA, that is almost never the real reason. The real reason is usually that nobody wants to do the work, or nobody is sure whose job it is, or somebody is afraid of doing the wrong thing and doesn’t want to stick their neck out.

That last piece is underrated as an explanation for how healthcare actually operates. Outside of the formal chain of command around patient care, it’s often genuinely unclear inside a health system who owns what decision. A front desk person doesn’t want to give wrong information. A mid-level administrator doesn’t want to be the one who caused a compliance incident. The existence of regulations with large scary consequences — even when those consequences are much narrower than people assume — makes organizations broadly risk averse and therefore action averse. HIPAA becomes a convenient answer because it ends the conversation without anyone having to make a call.

The real problem underneath all of this isn’t regulation. It’s incentives. What one organization wants is not always what’s good for a patient. What’s efficient for a health system doesn’t always align with what’s good for the person receiving care. Hospitals are a good example: the systems and schedules that keep a large institution running are optimized for the institution, not for any one patient’s moment-to-moment wellbeing. That misalignment runs through data sharing too. Organizations don’t share data freely not because HIPAA prohibits it, but because sharing data takes effort, coordination, and money — and they don’t have internal incentive to do it unless they’re forced to.

HIPAA is a reasonable law that solved a real problem reasonably well. It kept up with decades of technological change. Its core requirements are things any competent engineering team already does. Blaming it for healthcare’s data problems is a way of not looking at what’s actually in the way — which is harder to fix, because it’s not a law you can just read and comply with. It’s the structure of the system itself.