Protecting the Digital You
As personal data becomes more vulnerable online, DU law professor Zahra Takhshid says privacy tort law can help strengthen legal protections.

It’s 7:00 a.m. Your daily alarm buzzes, reverberating on the bedside table. You reach for your phone, and your face unlocks the screen, even though you barely recognize yourself. By 7:30, Alexa has greeted you by name and delivered the weather. By 8:00, your phone has already predicted how long your commute will take.
These routines, along with your clicks, searches, and late-night scrolling, are quietly building something else: a digital version of you. And it’s increasingly being used by companies for purposes you never agreed to. Zahra Takhshid, an assistant professor at the University of Denver’s Sturm College of Law, believes the law needs to catch up with this new reality.
Takhshid, who teaches tort law, privacy, and technology, is pushing for a reinterpretation of the tort related to name and likeness—called the appropriation tort—in her latest paper. A tort is a civil wrong, and appropriation is one of four types of privacy torts; the others are intrusion, publication of private facts, and false light.
A little background: At the federal level, privacy is governed by a patchwork of sector-specific laws—like the Health Insurance Portability and Accountability Act (HIPAA), which protects medical data; the Fair Credit Reporting Act (FCRA), which regulates credit information; and the Family Educational Rights and Privacy Act (FERPA), which protects student education records. But there is no single comprehensive federal privacy law, and state-level protections vary widely.
“Everyone is trying to figure out how to address this new privacy harm—digital harm. While the legislature is the best way to do this, I thought—why don’t we benefit from the protection that tort law has offered for years?” Takhshid says.
Name, image, and likeness
Among the four privacy torts, Takhshid says, appropriation is the most relevant in the age of digital identity. It focuses on the unauthorized use of someone’s name, image, or likeness. Over the years, the courts have interpreted it more broadly. It’s not just your photo—it could be voice, style, or even mannerisms.
In a landmark 1988 case, the singer Bette Midler sued Ford Motor Co. for hiring a backup singer to imitate her voice in a TV commercial. The court sided with Midler, holding that using a voice that deliberately mimicked hers was a violation of her likeness.
That legal logic is what Takhshid wants to apply to today’s digital environment. But what other kinds of data could count as part of your digital likeness?
“We have to narrow it down to personally identifiable data that is unique to you,” she says. “For example, if a company uses biometric data—like facial recognition—to create a deep fake, it could be considered a misuse of your digital persona.”
Even though users often agree to consent forms when signing up for services, Takhshid and other scholars question whether those agreements are legally valid.
“Sometimes, you have companies that collect personal data, but they use it outside the consent you’ve given,” Takhshid says.
Until legislators catch up, tort law offers an underutilized path forward—one that empowers individuals to take legal action when their likeness is misused. In the meantime, Takhshid urges people to be mindful of the technology they use.
“At a time where it seems like the legislature is failing to live up to protecting privacy rights, some of that responsibility is falling to citizens themselves,” Takhshid says. “I’d advise schools to offer introductory courses on privacy and technology and for colleges training future programmers and engineers to include coursework on privacy, so they graduate with a deeper awareness of these issues.”