Data Accountability and Transparency Act of 2020 looks beyond consent
In the United States, data privacy is hard work—particularly for the American people. But one US Senator believes it shouldn’t have to be.
In June, Democratic Senator Sherrod Brown of Ohio released a discussion draft of a new data privacy bill to improve Americans’ data privacy rights and their relationship with the countless companies that collect, store, and share their personal data. While the proposed federal bill includes data rights for the public and data restrictions for organizations that align with many previous data privacy bills, its primary thrust is somewhat novel: Consent is unmanageable at today’s scale.
Instead of having to click “Yes” to innumerable, unknown data collection practices, Sen. Brown said, Americans should be able to trust that their online privacy remains intact, no clicking necessary.
As the Senator wrote in his opinion piece published in Wired: “Privacy isn’t a right you can click away.”
The Data Accountability and Transparency Act
In mid-June, Sen. Brown introduced the discussion draft of the Data Accountability and Transparency Act (which does not appear to have an official acronym, and which bears a perhaps confusing similarity in title to the 2014 law, the Digital Accountability and Transparency Act).
Broadly, the bill attempts to wrangle better data privacy protections in three ways. First, it grants now-commonly proposed data privacy rights to Americans, including the rights of data access, portability, transparency, deletion, and accuracy and correction. Second, it places new restrictions on how companies and organizations can collect, store, share, and sell Americans’ personal data. The bill’s restrictions are tighter than many other bills, and they include strict rules on how long a company can keep a person’s data. Finally, the bill would create a new data privacy agency that would enforce the rules of the bill and manage consumer complaints.
Buried deeper in the bill, though, are two less common proposals. The bill proposes an outright ban on facial recognition technology, and it extends what is called a “private right of action” to the American public, meaning that, if a company were to violate the data privacy rights of an everyday consumer, that consumer could, on their own, bring legal action against the company.
Frustratingly, that is not how it works today. Instead, Americans must often rely on government agencies or their own state Attorney General to get any legal recourse in the case of, for example, a harmful data breach.
If Americans don’t like the end results of the government’s enforcement attempts? Tough luck. Many Americans faced this unfortunate truth last year, when the US Federal Trade Commission reached a settlement agreement with Equifax, following the credit reporting agency’s enormous data breach which affected 147 million Americans.
With some premature fanfare online, the FTC announced that Americans affected by the data breach could apply for up to $125 each. The problem? If every affected American actually opted for a cash payment, the real money each would see would be 21 cents. Cents.
That’s what happens for one of the largest data breaches in recent history. But what about for smaller data breaches that don’t get national or statewide attention? That’s where a private right of action might come into play.
As we wrote last year, some privacy experts see a private right of action as the cornerstone of an effective, meaningful data privacy bill. Speaking then with Malwarebytes Labs, Purism founder and chief executive Todd Weaver said:
“If you can’t sue or do anything to go after these companies that are committing these atrocities, where does that leave us?”
For many Americans, it could leave them with a couple of dimes in their pocket.
Casting away consent management in the Data Accountability and Transparency Act
Today, the bargain that most Americans strike when using online platforms is tilted against them. First, they are told that, to use a certain platform, they must create an account, and in creating that account, they must agree to having their data used in ways that only a lawyer could understand, described in a paragraph buried deep within a thousand-page end-user license agreement. If a consumer disagrees with the way their data will be used, they are often told they cannot access the platform at all. Better luck next time.
But under the Data Accountability and Transparency Act, there would be no opportunity for a consumer’s data to be used in ways they do not anticipate, because the bill would prohibit many uses of personal data that are not necessary for the basic operation of a company. And the bill’s broad applicability affects many companies today.
Sen. Brown’s bill targets what it calls “data aggregators,” a term that covers any individual, government entity, company, corporation, or organization that collects personal data in anything more than an insignificant way. Individuals who collect, use, and share personal data for purely personal reasons, however, are exempt from the bill’s provisions.
The bill’s wide net thus includes all of today’s most popular tech companies, from Facebook to Google to Airbnb to Lyft to Pinterest. It also includes the countless data brokers who help power today’s data economy, packaging Americans’ personal data and online behavior and selling it to the highest bidders.
The restrictions on these companies are concise and firm.
According to the bill, data aggregators “shall not collect, use, or share, or cause to be collected, used, or shared any personal data,” except for “strictly necessary” purposes. Those purposes are laid out in the bill, and they include providing a good, service, or specific feature requested by an individual in an intentional interaction, engaging in journalism, conducting scientific research, employing workers and paying them, and complying with laws and with legal inquiries. In some cases, the bill allows for delivering advertisements, too.
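To make the model concrete for developers, here is a minimal, hypothetical Python sketch of what purpose limitation looks like in practice: data handling is denied by default and allowed only for an explicit list of permitted purposes. The purpose names below loosely paraphrase the draft bill; the function and constant names are invented for illustration, not drawn from the bill or any real compliance library.

```python
# Hypothetical sketch of purpose limitation: processing is denied by default
# and allowed only for purposes the draft bill treats as "strictly necessary."
# All names here are illustrative, not from any real library or statute.

STRICTLY_NECESSARY_PURPOSES = {
    "provide_requested_service",  # a good, service, or feature the person asked for
    "journalism",
    "scientific_research",
    "employment_and_payroll",
    "legal_compliance",
    "advertising",                # permitted only in some cases under the draft
}

def may_process(purpose: str) -> bool:
    """Return True only if the stated purpose is on the allowlist.

    Under a consent-based regime, the default answer is "yes, the user
    clicked agree"; under the bill's model, the default flips to "no."
    """
    return purpose in STRICTLY_NECESSARY_PURPOSES

# A marketing use that a consent screen might have smuggled in is simply denied.
assert may_process("provide_requested_service")
assert not may_process("marketing_analytics")
```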
The purpose of these restrictions, Sen. Brown explained, is to prevent the cascade of worrying data practices that affect Americans every day. Invariably, Sen. Brown said, when an American consumer agrees to have their data used in one obvious way, that data ends up being used in a multitude of other, unseen ways.
Under the Data Accountability and Transparency Act, that wouldn’t happen, Sen. Brown said.
“For example, signing up for a credit card online won’t give the bank the right to use your data for anything else—not marketing, and certainly not to use that data to sign you up for five more accounts you didn’t ask for (we’re looking at you, Wells Fargo),” Sen. Brown said in Wired. “It’s not only the specific companies you sign away your data to that profit off it—they sell it to other companies you’ve never heard of, without your knowledge.”
Thus, Sen. Brown’s bill proposes a different data ecosystem: Perhaps data, at its outset, should be restricted.
Are data restrictions enough?
Doing away with consent in tomorrow’s data privacy regime is not a unique idea—the Center for Democracy and Technology released its own draft data privacy bill in 2018 that extended a set of digital civil rights that cannot be signed away.
But what if consent were not something to be replaced, but rather something to be built on?
That’s the theory proposed by the Electronic Frontier Foundation, said Adam Schwartz, a senior staff attorney for the digital rights nonprofit.
Schwartz said that Sen. Brown’s bill follows a “kind of philosophical view that we see in some corners of the privacy discourse, which is that consent is just too hard—that consumers are being overwhelmed by screens that say ‘Do you consent?’”
Therefore, Schwartz said, for a bill like the Data Accountability and Transparency Act, “in lieu of consent, you see data minimization,” a term describing the set of practices that require companies to collect only what they need, store only what is necessary, and share as little as possible while giving the consumer what they asked for.
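As a rough illustration of data minimization at the point of collection, consider this short Python sketch: it keeps only the fields needed to fulfill what the user actually requested and discards everything else before storage. The field names and the required-field list are hypothetical, chosen purely for this example.

```python
# Illustrative data-minimization sketch: keep only the fields required to
# fulfill the user's request, and drop everything else before storing it.
# Field names and the required-field set are invented for this example.

REQUIRED_FIELDS = {"email", "shipping_address"}  # enough to ship an order

def minimize(submitted: dict) -> dict:
    """Drop any submitted field that is not required for the transaction."""
    return {k: v for k, v in submitted.items() if k in REQUIRED_FIELDS}

form_data = {
    "email": "user@example.com",
    "shipping_address": "123 Main St",
    "birthday": "1990-01-01",        # not needed to ship an order
    "browsing_history": ["..."],     # definitely not needed
}

print(minimize(form_data))
# {'email': 'user@example.com', 'shipping_address': '123 Main St'}
```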
But rather than relying on data minimization alone, Schwartz said, EFF takes what he called a “belt-and-suspenders” approach that includes consent. In other words, the more support systems for consumers, the better.
“We concede there are problems with consent—confusing click-throughs, yes—but think that if you do consent plus two other things, it can become meaningful.”
To make a consent model more meaningful, Schwartz said, consumers should receive two other protections. First, any screens or agreements that ask for a user’s consent should not use “dark patterns,” a term for user-experience design techniques that push a consumer toward a decision that does not benefit them. For example, a company could ask for a user’s consent to use their data in myriad, imperceptible ways, presenting the option it prefers as a bright, bold green button and the alternative as small, pale gray text.
The practice is popular—and despised—enough to warrant a sort of watchdog Twitter account.
Second, Schwartz said, a consent model should include a ban on “pay for privacy” schemes, in which organizations and companies retaliate against a consumer who opts to protect their own privacy. That could mean consumers pay a literal price to exercise their privacy rights, or it could mean a company withholds a discount or feature that it offers to those who waive those rights.
Sen. Brown’s bill does prohibit “pay for privacy” schemes—a move that we are glad to see, as we have reported on the potential dangers of these frameworks in the past.
What’s next?
With Congress attempting, and so far failing, to address the homelessness crisis expected to begin this month, as the cratering American economy collides with the expiration of eviction protections across the country, an issue like data privacy is probably not top of mind.
That said, the introduction of more data privacy bills over the past two years has pushed the legislative discussion into more substantial territory. Little more than two years ago, data privacy bills took piecemeal approaches, focusing, for example, on the “clarity” of end-user license agreements.
Today, the conversation has advanced to the point that a bill like the Data Accountability and Transparency Act does not merely seek “clarity”; it seeks to do away with the entire consent infrastructure built around us.
It’s not a bad start.