California is gearing up to enforce its new California Privacy Rights Act (CPRA). Regulators, led by California Privacy Protection Agency Executive Director Ashkan Soltani, are preparing rules to guide that enforcement, and those rules could address the new forms of identity technology that advertisers and publishers are currently testing.

  • Soltani criticized identifiers that use email addresses as the foundation for recognizing people online, saying that these identifiers are “more privacy-invasive than even cookies.” Alluding to the CPRA, he added that “regulators will not stand for” data transfers from publishers that include emails, even encrypted emails, that enable tracking after people have opted out of it.

  • Email-based IDs, even when the emails are hashed for privacy purposes, are considered personal information under both the existing California Consumer Privacy Act (CCPA) and the CPRA, which will subsume it. Under these laws, transferring an email to a third party is prohibited once a person has opted out of the sharing or sale of their information for purposes such as targeted advertising (see the sketch after this list).

  • But the complications lie in the particulars: whether the use of an email-based identifier constitutes data sharing or a data sale in the eyes of enforcers. Because ad technologies use emails to recognize people who have asked not to have their data shared, some systems require an email address in order to act on consumers’ privacy preferences. California’s requirements are not entirely clear on how publishers may use emails to suppress targeting of people who have opted out.

  • Tech companies will need to accommodate a variety of tracking methods, including email-based IDs, especially as third-party cookies are phased out. The current definition of a unique identifier under the CCPA and CPRA covers IP addresses, phone numbers, mobile ad identifiers, and even “other forms of persistent or probabilistic identifiers,” but it does not mention email-based identifiers specifically.
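
The hashing at issue is worth making concrete. Below is a minimal sketch, assuming Node.js and its built-in crypto module, of why a hashed email still behaves as a persistent identifier, and of the tension noted above: honoring an opt-out can itself require keeping a (hashed) copy of the email. All names here are illustrative, not any vendor’s actual API.

```typescript
import { createHash } from "node:crypto";

// Hashing is deterministic: the same normalized address always yields the
// same digest, so any two parties hashing the same email derive the same
// ID and can link activity across contexts without exchanging raw emails.
function emailToId(email: string): string {
  const normalized = email.trim().toLowerCase();
  return createHash("sha256").update(normalized).digest("hex");
}

// Hypothetical opt-out suppression list, keyed by the same hashed ID.
// Note the irony: acting on the opt-out requires retaining a hash of the
// very address the consumer asked not to have shared.
const optedOut = new Set<string>([emailToId("reader@example.com")]);

function mayShareWithAdTech(email: string): boolean {
  return !optedOut.has(emailToId(email));
}

console.log(mayShareWithAdTech("Reader@Example.com")); // false: opted out
```

Because the mapping is deterministic, hashing does not anonymize: the digest is exactly as linkable as the address itself, which is why the hashed form still counts as personal information.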

How FLoC Will Replace Third-party Cookies

The third-party cookie has been the backbone of a multibillion-dollar advertising-surveillance economy on the Internet. Google is spearheading the effort to replace third-party cookies with a new set of technologies for online ad targeting.

FLoC (Federated Learning of Cohorts) is a new way for the browser itself to do the profiling that third-party trackers used to do: distilling recent browsing activity into a behavioral label and sharing it with websites and advertisers. FLoC addresses the privacy concerns posed by third-party cookies, but it creates new ones in the process. It may also worsen many of behavioral advertising’s greatest harms, such as discrimination and predatory targeting.

FLoC cohorts are recalculated on a weekly basis, each time using data from the previous week’s browsing. This short duration makes any single cohort a weaker long-term identifier, but it makes the sequence of a user’s cohorts a more potent measure of how their behavior changes over time.
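
To ground the mechanics, here is a sketch of how a page (or an embedded tracker) would have read a visitor’s cohort during Chrome’s FLoC origin trial, via the document.interestCohort() API. The { id, version } shape shown matches the origin-trial proposal; the API was never part of a shipped standard.

```typescript
// Feature-detected read of the FLoC cohort as exposed in Chrome's origin
// trial. The API returned a promise resolving to { id, version }.
interface InterestCohort {
  id: string;      // the cohort label, shared with every site that asks
  version: string; // which clustering configuration produced it
}

async function readCohort(): Promise<InterestCohort | null> {
  const doc = document as Document & {
    interestCohort?: () => Promise<InterestCohort>;
  };
  if (typeof doc.interestCohort !== "function") return null; // not in trial
  try {
    return await doc.interestCohort();
  } catch {
    return null; // the promise rejected when FLoC was disabled
  }
}

// Because cohorts rotate weekly, a site logging the id on each visit
// accumulates a week-by-week series of behavioral summaries.
readCohort().then((c) => {
  if (c) console.log(`cohort ${c.id} (version ${c.version})`);
});
```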

Google’s Privacy Pitch

Google’s pitch to privacy advocates is that a world with FLoC (and the other components of the “privacy sandbox”) will be better than the one we have now, in which data brokers and ad-tech companies track and profile people without consequence. But that framing rests on the erroneous assumption that we must choose between “old tracking” and “new tracking.” It’s not a binary choice. Rather than reinventing the tracking wheel, we should imagine a better world without the numerous problems of targeted advertising.

New Privacy Problems

Browser fingerprinting is the practice of gathering many discrete pieces of information from a user’s browser to create a unique, stable identifier for that user. Google has promised that the vast majority of FLoC cohorts will comprise thousands of users each, so a cohort ID alone shouldn’t distinguish a user from more than a few thousand other people like them. But that still gives fingerprinters a massive head start: a tracker that starts from a cohort ID only has to distinguish a browser from a few thousand others, rather than a few hundred million. The cohort is an even more potent signal because it is unlikely to be correlated with the other information the browser exposes, which will make fingerprinting harder to stop. Browsers like Safari and Tor Browser have waged years-long wars of attrition against trackers, sacrificing large swaths of their own feature sets to shrink fingerprinting attack surfaces. Fingerprinting mitigation generally means trimming away or restricting unnecessary sources of entropy, and FLoC is exactly such a source. Google should not create new fingerprinting risks until it has figured out how to deal with existing ones.
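
The size of that head start is easy to estimate. The numbers below are illustrative assumptions, not measurements; the point is the arithmetic, that a signal splitting N users into groups of size k leaks roughly log2(N / k) bits of identifying information.

```typescript
// Back-of-the-envelope fingerprinting arithmetic with assumed numbers.
// A signal that splits N users into groups of size k reveals roughly
// log2(N / k) bits of identifying information about each of them.
function bitsRevealed(population: number, groupSize: number): number {
  return Math.log2(population / groupSize);
}

const browserPopulation = 3e9; // assumed: ~3 billion browser users
const cohortSize = 3e3;        // "thousands of users" per cohort

// Starting from a cohort ID, a fingerprinter has already gained:
console.log(bitsRevealed(browserPopulation, cohortSize).toFixed(1)); // ~19.9 bits

// ...and only needs this much more to single out one browser in a cohort:
console.log(Math.log2(cohortSize).toFixed(1)); // ~11.6 bits

// Uncorrelated signals add together, so a handful of coarse attributes
// (language, timezone, screen size) can close an ~11-bit gap quickly.
```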

Another major problem with FLoC is that it shares new personal data with trackers that can already identify users. For FLoC to be useful to advertisers, a user’s cohort must necessarily reveal information about their behavior. A cohort shouldn’t work as an identifier by itself. However, any company able to identify a user in other ways, say by offering “log in with Google” services to sites around the Internet, will be able to tie the information it learns from FLoC to the user’s profile, as sketched after the list below.

Two categories of information may be exposed in this way:

  1. Specific information about browsing history.

  2. General information about demographics or interests.
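
Here is a hypothetical sketch of that join: a service that already knows who you are, for example through a site-wide login, can simply record the weekly cohort next to the account, turning a nominally k-anonymous label into one more column of an individual profile. Every name below is invented for illustration.

```typescript
// Hypothetical join of FLoC cohorts to known identities. Invented names.
interface ProfileEntry {
  userId: string;   // stable identity from the login system
  cohortId: string; // behavioral summary FLoC derived from browsing
  weekOf: string;   // cohorts rotate weekly, so entries accumulate
}

const profiles: ProfileEntry[] = [];

function recordCohort(userId: string, cohortId: string): void {
  const weekOf = new Date().toISOString().slice(0, 10);
  // On its own the cohort names only a crowd of thousands; joined to a
  // stable userId it becomes a weekly snapshot of one person's browsing.
  profiles.push({ userId, cohortId, weekOf });
}

recordCohort("user-123", "21354"); // cohort id obtained as sketched above
```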

Consumers should have a right to present different aspects of their identity in different contexts. If you visit a site for medical information, you might trust it with information about your health, but there’s no reason it needs to know what your politics are. Likewise, if you visit a retail website, that site shouldn’t need to know whether you’ve recently read up on treatment for depression. FLoC erodes this separation of contexts and instead presents the same behavioral summary to everyone you interact with.

Beyond Privacy

FLoC is designed to prevent a very specific threat: the kind of individualized profiling that is enabled by cross-context identifiers today. The goal of FLoC and other proposals is to avoid letting trackers access specific pieces of information that they can tie to specific people. FLoC may actually help trackers in many contexts, but even if Google is able to iterate on its design and prevent these risks, the harms of targeted advertising are not limited to violations of privacy. FLoC’s core objective is at odds with other civil liberties.

The power to target is the power to discriminate. By definition, targeted ads allow advertisers to reach some kinds of people while excluding others. A targeting system can be used to decide who gets to see job postings or loan offers just as easily as it is used to advertise shoes. Google, Facebook, and many other ad platforms already try to rein in certain uses of their targeting tools for exactly this reason.
