How to Follow CPRA’s Rules for ‘Dark Patterns’
When the California Privacy Rights Act (CPRA) was passed into law last year, one of the most groundbreaking precedents it established was its regulation of “dark patterns”: insidious user-interface design choices intended to subconsciously influence user behavior.
Even if you’ve never heard the term before, you’ve likely experienced dark patterns in some form: think ads disguised as regular content, or a “buy” button that looks like a “close” button. Just last year, New York State’s Department of Financial Services released a report highlighting the dark patterns employed by several tax preparation companies, including TurboTax and H&R Block.
But under CPRA, and perhaps eventually the Washington Privacy Act, purposefully tricking users into acting the way you want them to is becoming illegal, at least in certain cases. Here’s what you need to know about dark pattern regulation.
What Is Strictly Illegal?
It’s logical to ask why a data privacy law is so concerned with tricky UI design. Well, CPRA doesn’t ban dark patterns outright, nor does it regulate specific manipulative behaviors.
Instead, it focuses on how dark patterns can be used in the process of obtaining consent. In short, a data subject cannot give valid consent for the sale or sharing of their personal data if a company uses dark patterns to get it.
This is in line with the common principle that consent must be voluntary. However, unlike its older sibling, the California Consumer Privacy Act (CCPA), CPRA redefines consent to explicitly exclude “agreement obtained through the use of dark patterns.”
CPRA defines a dark pattern as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.” That said, the statute leaves it unclear who decides whether an interface was designed to impair user autonomy.
Fight Dark Patterns With Privacy by Design
CPRA’s vagueness invites broad interpretation, and the newly minted California Privacy Protection Agency will have the power to interpret it just that broadly. To protect your organization from enforcement, it’s not enough to avoid intentionally manipulative methods; you also need to make sure your design choices can’t be interpreted as manipulative.
That’s easier said than done, due in part to the sheer size and noise of the internet. User attention is scarce, so website designers have learned to grab it and to subtly influence decision-making. There’s a reason unsubscribe buttons are usually hard to find, and a reason ads are designed to blend right into users’ Instagram feeds. When it comes to data rights, however, that type of design must be avoided at all costs.
This starts with overhauling the way you think about user data privacy laws: instead of viewing them as an obstacle to bypass, treat them as guidelines for creating a more secure user experience. The opposite instinct may be ingrained in many of your UI designers, so communicating with them about data privacy regulations is essential. Make sure they know what kinds of design constitute a violation, and stress that the goal of consent forms and privacy policies is not to manipulate the user into handing over their data.
In the end, it comes down to privacy by design. Design, or redesign, your interface with a genuine commitment to clarity and user autonomy. Field-test your platforms and look out for any subtle elements that appear to nudge users toward a specific choice, such as:
Language that shames users for not consenting to data collection
“No” being smaller or harder to find than “yes”
An outline of the reasons users should consent
Deceptively worded options (like “No, I do consent” or “I don’t not consent”)
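To make that checklist concrete, here is a minimal sketch, in TypeScript, of how a team might encode a few of these checks as an automated lint step for consent prompts. Everything here is hypothetical: the ConsentPrompt shape, the lintConsentPrompt function, and the rules themselves are illustrative assumptions, not anything CPRA or the CPPA prescribes.

```typescript
// Hypothetical shape of a consent prompt as your UI renders it.
interface ConsentPrompt {
  question: string;      // e.g. "May we share your data with partners?"
  acceptLabel: string;   // text on the opt-in control
  declineLabel: string;  // text on the opt-out control
  acceptFontPx: number;  // rendered font size of the opt-in control
  declineFontPx: number; // rendered font size of the opt-out control
}

// Illustrative checks mirroring the list above; returns human-readable issues.
function lintConsentPrompt(p: ConsentPrompt): string[] {
  const issues: string[] = [];

  // 1. Language that shames users for declining ("confirmshaming").
  if (/no thanks, i (hate|don't care about)/i.test(p.declineLabel)) {
    issues.push("Decline label shames the user for refusing.");
  }

  // 2. "No" rendered smaller than "Yes".
  if (p.declineFontPx < p.acceptFontPx) {
    issues.push("Decline control is visually smaller than the accept control.");
  }

  // 3. Deceptively worded options (double negatives and the like).
  const confusing = /\b(don't|do not)\s+not\b|no, i do consent/i;
  if (confusing.test(p.acceptLabel) || confusing.test(p.declineLabel)) {
    issues.push("Option wording uses a confusing or contradictory phrase.");
  }

  return issues;
}

// A prompt that would fail two of the checks above.
console.log(lintConsentPrompt({
  question: "Can we share your browsing data with advertising partners?",
  acceptLabel: "Yes, share my data",
  declineLabel: "No, I do consent",
  acceptFontPx: 16,
  declineFontPx: 11,
}));
```

A script like this can’t prove compliance, and it says nothing about layout tricks a text check can’t see, but running something similar over every consent flow makes it harder for a manipulative pattern to slip through a redesign unnoticed.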
Of course, the ambiguity around what can be construed as a “dark pattern” makes compliance tough. Err on the side of caution: state the user’s options explicitly and stay neutral about which action they take in the consent process. If you can show an evident and consistent commitment to user privacy, you are less likely to find yourself at risk of non-compliance.