A Stolen Laptop, a Missing Exception, and a $4.3 Million Problem

In 2018, OCR imposed a $4.3 million penalty on the University of Texas MD Anderson Cancer Center after the theft of an unencrypted laptop and the loss of two unencrypted USB drives containing patient data. MD Anderson argued the events weren't really breaches. OCR disagreed. The case turned on one thing: did the events meet the definition of HIPAA breach under federal law? They did, and MD Anderson's failure to encrypt ePHI sealed the outcome.

If you work at a covered entity or business associate, you need to know this definition cold. Not roughly. Not conceptually. The actual regulatory language, because that's what OCR uses when it comes knocking.

I've spent years helping organizations sort real breaches from near-misses. The line between the two is more specific — and more consequential — than most people realize.

The Exact Definition of HIPAA Breach Under Federal Law

Here's the regulatory definition straight from 45 CFR § 164.402: a breach is the acquisition, access, use, or disclosure of protected health information (PHI) in a manner not permitted by the HIPAA Privacy Rule that compromises the security or privacy of the PHI.

That last clause — "compromises the security or privacy" — is where organizations get tripped up. It doesn't mean the data has to end up on the dark web. It means any impermissible use or disclosure is presumed to be a breach unless you can demonstrate a low probability that PHI was actually compromised.

The burden of proof sits on your organization. Not on the patient. Not on OCR. On you. That's a detail I've watched catch dozens of compliance officers off guard.

What "Compromises the Security or Privacy" Really Means

The Breach Notification Rule lays out a four-factor risk assessment for determining whether an impermissible use or disclosure crosses the line into breach territory. You'll find these factors at 45 CFR § 164.402(2) and in HHS's official breach notification guidance:

  • The nature and extent of the PHI involved. Did it include names, Social Security numbers, diagnoses, or treatment records? The more identifiable and sensitive, the higher the risk.
  • The unauthorized person who used the PHI or to whom it was disclosed. A nurse at the same hospital who accidentally opens the wrong chart is different from a stranger finding records in a dumpster.
  • Whether the PHI was actually acquired or viewed. If you can prove the information was never actually seen or retained, your risk drops significantly.
  • The extent to which the risk has been mitigated. Did you recover the data? Get a signed destruction confirmation? The faster and more thoroughly you act, the better your assessment looks.

If your risk assessment, weighing all four factors together, demonstrates a low probability of compromise, you can document the incident as not a reportable breach. But you must conduct, and retain, that analysis every single time.
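The four-factor analysis above can be captured as a simple documentation aid. This is an illustrative sketch, not legal logic: the class and field names are my own, and a real assessment is a qualitative judgment made in the aggregate by compliance staff, not a set of booleans.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative only: real assessments are qualitative judgments made by
# compliance counsel. All names below are assumptions, not HHS terminology.
@dataclass
class RiskAssessment:
    incident_id: str
    assessed_on: date
    # Factor 1: nature and extent of the PHI involved
    phi_sensitive_or_identifiable: bool
    # Factor 2: the unauthorized person who received or used the PHI
    recipient_high_risk: bool
    # Factor 3: whether the PHI was actually acquired or viewed
    phi_acquired_or_viewed: bool
    # Factor 4: the extent to which the risk has been mitigated
    risk_mitigated: bool
    notes: str = ""

    def low_probability_of_compromise(self) -> bool:
        """Aggregate judgment: every factor must point toward low risk
        to overcome the presumption of a reportable breach."""
        return (not self.phi_sensitive_or_identifiable
                and not self.recipient_high_risk
                and not self.phi_acquired_or_viewed
                and self.risk_mitigated)

assessment = RiskAssessment(
    incident_id="2024-017",
    assessed_on=date(2024, 3, 4),
    phi_sensitive_or_identifiable=False,  # appointment times only, no SSNs
    recipient_high_risk=False,            # misdirected to another clinic
    phi_acquired_or_viewed=False,         # fax confirmed destroyed unread
    risk_mitigated=True,                  # signed destruction confirmation
    notes="Misdirected fax, recovered same day.",
)
print(assessment.low_probability_of_compromise())  # True -> document, don't report
```

Whatever form your documentation takes, the point is the same: a retained record of all four factors, per incident, every time.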

Three Exceptions That Save Organizations Every Day

Not every impermissible disclosure qualifies as a breach. The Breach Notification Rule carves out three specific exceptions under 45 CFR § 164.402:

Exception 1: Unintentional Access by a Workforce Member

A billing specialist accidentally pulls up the wrong patient record, realizes it immediately, and closes it. If the access was made in good faith, within the scope of their job, and the information isn't further used or disclosed — it's not a breach. I've seen this scenario play out hundreds of times in hospital settings. It's the most commonly applied exception.

Exception 2: Inadvertent Disclosure Between Authorized Persons

A doctor at your covered entity accidentally sends a patient's lab results to a colleague at the same organization who's also authorized to access PHI. As long as the information isn't further used or disclosed improperly, this falls outside the definition of HIPAA breach.

Exception 3: Good Faith Belief That PHI Can't Be Retained

You mail a billing statement and it goes to the wrong address, but it's returned unopened. Or an email containing PHI bounces back. If you have a good faith belief that the unintended recipient couldn't reasonably retain the information, you're in exception territory.

Here's my warning: these exceptions are narrow. I've reviewed cases where organizations tried to stretch Exception 1 to cover situations that clearly didn't qualify — and ended up in worse shape than if they'd just reported the breach upfront.

Breach vs. Security Incident: A Distinction That Matters

Every breach is a security incident, but not every security incident is a breach. A security incident under HIPAA is any attempted or successful unauthorized access, use, disclosure, modification, or destruction of information — or interference with system operations.

Your firewall blocks 10,000 unauthorized access attempts a day. Those are security incidents. They're not breaches because no PHI was actually acquired, accessed, used, or disclosed.

The confusion between these two terms costs organizations time and credibility. I've walked into compliance audits where teams had been logging every phishing email as a "breach" in their tracking system, which made their breach history look catastrophic when it was actually clean.

Train your workforce to understand this distinction. Our HIPAA training catalog covers incident classification in practical, scenario-based modules that stick.

What Happens When You Confirm a Breach

Once your risk assessment confirms a breach, the clock starts. The Breach Notification Rule under 45 CFR §§ 164.404–164.408 requires three things:

Individual Notification

You must notify each affected individual in writing without unreasonable delay, and in no case later than 60 days after discovering the breach. Not 60 days from when it happened; 60 days from discovery. The letter must describe what happened, what PHI was involved, steps the individual should take, and what your organization is doing about it.

HHS Notification

If the breach affects 500 or more individuals, you notify HHS contemporaneously with the individual notices, and the breach goes on the OCR Breach Portal, sometimes called the "Wall of Shame." Breaches affecting fewer than 500 individuals get reported to HHS annually, within 60 days of the end of the calendar year in which they were discovered.

Media Notification

Breaches affecting 500 or more residents of a single state or jurisdiction trigger a media notification requirement. You must alert prominent media outlets in that area within the same 60-day window.
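The three notification tracks above can be summarized in a small deadline calculator. This is a sketch under stated assumptions: the 60-day outer limits come from the rule, but the function and key names are mine, and "without unreasonable delay" can require acting well before day 60.

```python
from datetime import date, timedelta

def notification_plan(discovered: date, affected: int,
                      max_in_one_state: int) -> dict:
    """Sketch of the 45 CFR §§ 164.404-164.408 notification tracks.
    Dates are outer limits; notice must go out without unreasonable delay."""
    plan = {
        # Individual notice: no later than 60 days after discovery
        "individual_notice_by": discovered + timedelta(days=60),
    }
    if affected >= 500:
        # 500+ individuals: HHS notified contemporaneously with individuals
        plan["hhs_notice_by"] = plan["individual_notice_by"]
    else:
        # Under 500: annual report, within 60 days of calendar year end
        plan["hhs_notice_by"] = date(discovered.year, 12, 31) + timedelta(days=60)
    if max_in_one_state >= 500:
        # 500+ residents of one state/jurisdiction: prominent media outlets
        plan["media_notice_by"] = plan["individual_notice_by"]
    return plan

plan = notification_plan(date(2024, 1, 10), affected=620, max_in_one_state=620)
print(plan["individual_notice_by"])  # 2024-03-10
```

Note how a single large breach collapses all three deadlines onto the same date: that's why breach response plans should assume the worst case from day one.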

Miss any of these timelines and you're stacking violations. Anthem Inc. paid $16 million in 2018 — the largest HIPAA settlement in history at that time — partly because of how they handled breach response after a massive cyberattack exposed nearly 79 million records.

The Presumption That Catches People Off Guard

Here's what makes the definition of HIPAA breach so aggressive: every impermissible use or disclosure is presumed to be a reportable breach. You start at "guilty" and work backward.

Your only path out is the four-factor risk assessment or one of the three exceptions. If you can't document your way into one of those safe harbors, you report. Period.

This presumption flips the traditional approach most organizations take. They want to investigate first and decide later. HIPAA says the opposite: it's a breach unless you can prove otherwise, and you'd better prove it fast.

Encryption: The Breach You Never Have to Report

There's one more escape hatch worth knowing. If the PHI involved was encrypted to NIST standards and the encryption key wasn't compromised, the data doesn't count as "unsecured PHI," which means the breach notification rules don't apply. It's not that the event didn't happen. It's that the PHI is rendered unusable, unreadable, and indecipherable to unauthorized individuals.

This is why I push every organization I work with to encrypt ePHI at rest and in transit. It's the single most effective way to turn a potential $4.3 million disaster into a documented non-event.
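As an illustration of what encryption at rest looks like in code, here's a minimal sketch using the third-party `cryptography` package's Fernet recipe, an authenticated-encryption construction built on AES and HMAC. This is a sketch, not a compliance program: the hard part in practice is key management, and the key must live somewhere separate from the data (a KMS or HSM), or the escape hatch evaporates.

```python
# Minimal sketch using the third-party `cryptography` package
# (pip install cryptography). Fernet = AES + HMAC authenticated encryption.
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # store in a KMS/HSM, never beside the data
cipher = Fernet(key)

record = b"MRN 482913 | Dx: E11.9 | Smith, Jane"
token = cipher.encrypt(record)  # safe to persist: unreadable without the key

assert cipher.decrypt(token) == record
print(len(token) > len(record))  # ciphertext is opaque, not the record
```

A stolen disk full of tokens like this, with the key held elsewhere and uncompromised, is a documented non-event rather than a reportable breach.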

Building a Workforce That Recognizes Breaches Before They Escalate

Your staff are your first line of detection. If a medical assistant doesn't understand what qualifies as a breach, they won't report the fax that went to the wrong number. If your IT team doesn't know the difference between a security incident and a breach, they'll misclassify events in your logs.

Workforce training isn't optional — it's required under 45 CFR § 164.530(b). And it needs to go beyond definitions. Your people need scenario-based practice. Explore the HIPAA compliance training courses at HIPAACertify.com to get your entire team aligned on breach identification and response.

Quick Reference: Is It a HIPAA Breach?

Ask these questions in order:

  • Was PHI involved? If no, stop — it's not a HIPAA breach.
  • Was the use or disclosure permitted under the Privacy Rule? If yes, stop — no breach.
  • Does one of the three exceptions apply? If yes, document it and stop.
  • Does a four-factor risk assessment show low probability of compromise? If yes, document your analysis thoroughly.
  • If none of the above apply — you have a reportable breach. Start the notification clock.

Keep this framework posted in your compliance office. I've seen it prevent panic and bad decisions in real time.
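The checklist maps directly onto a short decision routine. This is an illustrative sketch with assumed parameter names; in reality the exception and risk-assessment determinations are documented legal judgments, reduced here to booleans for clarity.

```python
def classify_incident(phi_involved: bool,
                      disclosure_permitted: bool,
                      exception_applies: bool,
                      low_probability_of_compromise: bool) -> str:
    """Walk the quick-reference questions in order (illustrative only)."""
    if not phi_involved:
        return "not a HIPAA breach"
    if disclosure_permitted:
        return "no breach: permitted under the Privacy Rule"
    if exception_applies:
        return "exception applies: document it and stop"
    if low_probability_of_compromise:
        return "not reportable: document the four-factor analysis"
    return "reportable breach: start the notification clock"

# Lab results inadvertently sent to an authorized colleague (Exception 2):
print(classify_incident(True, False, True, False))
# -> exception applies: document it and stop
```

The order matters: the exceptions and the risk assessment are only reached after you've confirmed PHI was involved and the disclosure wasn't permitted, which mirrors how OCR will walk the same facts.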

The Bottom Line on Breach Definitions

The definition of HIPAA breach isn't academic. It's the trigger for a cascade of federal obligations — notification letters, HHS reports, potential media attention, and OCR scrutiny. Every covered entity and business associate needs a workforce that understands this definition, applies the risk assessment correctly, and documents everything.

Get it right, and a potential breach stays a documented incident in a file drawer. Get it wrong, and your organization's name ends up on the OCR breach portal for the world to see.