Why Most Data Privacy Laws Are Deeply Flawed

Do you ever wonder where all your data goes? 

New York Times technology reporter Kashmir Hill has tried to answer that question more than a few times, with mixed results. She shared her story in a keynote address to audience members participating in The Privacy + Security Academy’s Fall Forum. 

In 2019, Hill exercised her right to access her data profile from Sift, a vetting software company that shares “consumer scores” with companies like Airbnb and Yelp. These scores, designed to determine whether consumers can be trusted to use certain online services, are largely inaccessible to the public.

Hill got much more than a score in return. “They sent me a hundred-page long document that went back years,” she says. It had Yelp orders from five years ago, a log of her Coinbase log-ins and, most shockingly, all of her private Airbnb messages.

She shared her experience in an article and urged readers to get their files too. 

Thousands of users tried to follow her lead, but to no avail. It was then that Hill came to a realization: Sift had misinterpreted the law when handing over her data. For future requests, Sift directed consumers to its clients, like Airbnb and Yelp, if they wanted their information.

“It showed how difficult it is for businesses to interpret these laws and how frustrating it is, as a consumer, to take advantage of these things,” says Hill. “It’s too much data, or it’s not enough.”

This is a common concern for consumers. When you exercise your data rights, all this data gets dumped on you. What are you supposed to do with it? How does it impact your life? Is there any way you can protect your privacy without cutting off the Internet?

“As a privacy journalist, my solution was often: we just need more transparency, we need more access, we need to understand where our data is going so that we can make more informed decisions,” says Hill. After this experience, however, she realized that transparency doesn’t always bring clarity.

For consumers, her conclusion is grim. “There’s not much you can do as an individual to protect your privacy. There’s just so many places this data is flowing through. It’s this big ecosystem.”

The solution seems simple: If you don’t want Google or Facebook or Amazon collecting your data, just don’t use them.

Hill put this suggestion to the test by building a VPN that blocked any IP addresses controlled by Amazon, Apple, Facebook, Google, or Microsoft. She found that using the Internet became practically impossible. Since so much of the Internet is built on infrastructure like Amazon Web Services, the only way to cut out big tech is to go offline.
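The core of such a filter is easy to illustrate. Below is a minimal Python sketch of the idea, assuming a couple of placeholder CIDR ranges; a real blocklist would load the thousands of address blocks these providers actually control (AWS, for example, publishes its ranges as a downloadable JSON file).

```python
import ipaddress

# Illustrative placeholder ranges only -- a real blocklist would contain
# thousands of CIDR blocks published by the providers themselves.
BLOCKED_NETWORKS = [
    ipaddress.ip_network("52.0.0.0/11"),     # hypothetical AWS-style block
    ipaddress.ip_network("157.240.0.0/16"),  # hypothetical Facebook-style block
]

def is_blocked(ip: str) -> bool:
    """Return True if the address falls inside any blocked network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_NETWORKS)

# A VPN-style filter would drop traffic to any destination that matches.
print(is_blocked("52.1.2.3"))  # True  -- falls in a blocked range
print(is_blocked("1.1.1.1"))   # False -- allowed through
```

A filter like this checks every outgoing packet’s destination against the ranges. Because so many sites sit on cloud infrastructure owned by these five companies, most of the web ends up unreachable, which is exactly what Hill found.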

Even if cutting out these companies were possible, it wouldn’t fully solve the problem. “The big tech companies get much of the criticism, but there are also many bad actors you haven’t heard of that are collecting your data in ways that are almost unavoidable,” says Hill. “It’s really challenging to go out there and single-handedly protect yourself.”

On a micro level, a company knowing what movie you like or when you order food might not be such a big deal. As the data accumulates, however, algorithms can build a sophisticated profile of your behavior that gets used in ways you didn’t see coming.

This underscores how much data is collected beyond consumers’ expectations. We use Airbnb to find convenient lodging, but our messages get turned into a score. We buy smart vacuums to clean our homes, but the layouts of those homes get sold to tech companies so they know which products to advertise.

Hill points to the example of Clearview AI, a facial recognition company that built a database by scraping the Internet for publicly accessible images.

“All these people who posted their photos online didn’t realize one day someone’s going to scrape all this to build a universal face recognition app that can identify me,” she says. 

It’s become a classic story. Feeling exploited by a company that used their data in ways they weren’t aware of, consumers take legal action. In its defense, the company cites the First Amendment. Because the data was public, it had a right to use it.

This argument holds up in America, with few exceptions. Thanks to its Biometric Information Privacy Act (BIPA), Illinois requires consent from consumers before companies can collect or disclose their facial geometry. The law was powerful enough to get Clearview to pull its services out of Illinois and take steps to remove Illinois residents from its database.

In Hill’s eyes, this is why data regulation is so important. Data goes places consumers couldn’t even imagine when they first consented. Without laws to protect them, it’s a free-for-all.

However, current legislation may have a long way to go. “So much regulation [right now] is just put on the shoulders of the consumer,” says privacy expert Daniel Solove, one of the Fall Forum’s organizers and a guest on ADCG’s podcast.

Indeed, notice-and-choice legislation mandates that companies obtain consent from consumers on the basis of transparent privacy policies. Yet consumers lack the bandwidth and information they would need to effectively manage their privacy, and they are forced to shoulder the burden anyway.

According to Solove, a more appropriate approach would be to regulate uses: certain uses of data would be prohibited regardless of whether the consumer consents. “Everyone’s afraid to go down that road because it involves making some tough, substantive judgments,” said Solove. However, it may be a necessary line to draw to ensure a true right to privacy.

Even if the principles of the California Consumer Privacy Act are organized into a coherent federal law, these issues may remain unresolved. Requiring consent doesn’t teach people where their data goes or how it moves through the system. It won’t stop algorithms like Sift’s from silently judging them and denying them opportunities based on information they understood to be private. It won’t stop law enforcement from using facial recognition to arrest a man for a crime he did not commit.

“I don’t think we’ve solved some of the building blocks of privacy problems,” says Hill. “I feel like I could go back to a story I worked on ten years ago, and just write it again now. It would probably be the same story.”

Max Totsky

Max Totsky is a journalist based in Chicago. His writing can be found at Inc.com, PopMatters, and ADCG.org.
