With the pandemic moving much of life online, we've never had less of a chance to escape the corporate Eye of Sauron. (If you want to break out in hives, read Business Insider's nightmare list of all the data that companies and governments collect from you on an hourly basis.)
And now, as Steven Soderbergh's forthcoming Seattle-based Zoë Kravitz vehicle correctly warns us, our "smart" devices are gathering up an alarming amount of information about what we do in real life. To give one creepy example, Cambridge researcher Alina Utrata recently tweeted out some of the data Amazon keeps on her as a former Alexa user and Whole Foods shopper:
I downloaded all the data Amazon has on me, and honestly the creepiest thing about it is that they sent me the *actual audio files* of every time I spoke* to Amazon Alexa
*years ago when I was young and foolish about surveillance pic.twitter.com/XH4Lp4bDob
— Alina Utrata (@AlinaUtrata) January 23, 2022
All this personal data that companies capture leaves us vulnerable to breaches, and exploitation abounds. To name a few more examples: The military keeps tabs on Muslims via prayer apps, cops tap the cell phones of protesters, tech companies scoop up data on students engaging in remote learning, and the people who run menstruation apps sell your cycle-tracking data to third parties.
For the last three years, lawmakers in Olympia have thrown around ideas to protect Washingtonians from the dangers and indignities of all this online corporate surveillance. In general, the Senate has favored industry-friendly legislation designed to give some people some control over some data while shielding tech companies from lawsuits. To support their approach, they fearmonger about "frivolous litigation" and "unworkable" opt-in frameworks.
Until this year, the House remained opposed to those efforts and countered with the People's Privacy Act (PPA), a gold-standard data privacy bill sponsored by Rep. Shelley Kloba (D-Kirkland) and written with input from the ACLU of Washington and the Tech Equity Coalition. Among other provisions, Kloba's bill prevents companies from collecting, using, and selling personal information without permission. And, unlike the proposals Microsoft and Google prefer, the bill would allow people to sue companies who violate the law. If enacted, it'd be the strongest data privacy bill in the country.
This session, however, there have been some new developments. Last week a Senate committee heard a new weird Frankenstein data privacy bill from Seattle Sen. Reuven Carlyle, who's driven the conversation in that chamber from the beginning. Meanwhile, State House Rep. Vandana Slatter dropped a bill that would compete with the PPA. Though Kloba's bill has been around longer, Slatter's bill will get a hearing this Tuesday while a hearing on Kloba's PPA remains TBD. (Over the phone, Kloba said she had "a few different ideas cooking" for changes to it, and so a hearing would only come after those amendments are ready to serve.) That said, the clock is ticking: The cut-off date for policy bills to pass committees comes next Thursday.
Overall, rather than simply rehearse old arguments again this year, lawmakers appear to be flooding the field with new legislation and hoping something breaks through. Unfortunately, at the moment, every bill but Kloba's contains loopholes the size of hula-hoops. Let's run them down real quick.
An Old Bad Senate Bill Waits in the Wings
For the last three years, Sen. Carlyle has pushed the Washington Privacy Act, which Microsoft lobbyists heavily influenced and other tech firms supported. Last year Virginia passed a "virtually identical" version of it, to borrow a phrase from Washington State Attorney General Bob Ferguson, that "was considered a 'huge victory' inside Amazon," according to Reuters.
This year, the latest version of that bill was automatically reintroduced to the Senate Rules committee after passing the chamber last year 48 to 1, so it's floating around and ready for a floor vote any time Carlyle wants to use it as a political chip, as he did last year.
Matt and I have written extensively about the drawbacks in the various iterations of this proposal, but they more or less boil down to a few main issues. Since most of the bad stuff in the other bills stems from the bad stuff in this bill, it's worth a brief review:
• Though the bill allows everyone to "access, correct, and delete personal data," it applies only to data collected by large companies, as if your right to your data should depend on the size of a company.
• Carlyle structures the bill using an opt-out framework, which means the onus is on you — rather than on the giant tech firms — to tell companies you don't want them to use or to make a profit from your data. Imagine having to "opt out" before accessing any of your other civil rights, and the absurdity of this framework will become clear.
• The bill that passed the Senate forbids a "private right of action," and it also gives the companies a 30-day "right to cure" that's enforceable only by the Attorney General's office. (In this case, a "right to cure" means that a company the AG caught violating our rights would have 30 days to say "oopsie," fix the issue, and then face little-to-no consequences.) So, the former provision prevents us from suing big tech firms when they violate our rights, and the latter basically gives those big tech firms our lawyer (i.e., the AG).
New Weird Frankenstein Senate Bill
Carlyle's latest submission takes an even more incremental, patchwork approach to data privacy, which he basically admits. During the hearing last week, he said he proposed the bill to "just elevate the dialogue" on the issue, not as a signal of his willingness to abandon a more comprehensive bill. The result is a weird, three-part bill that pleased no one and offered no real new ideas.
In Part I, rather than preventing all businesses from collecting and selling all data without consent from all people, the bill only prevents all businesses from collecting and selling "personal data or sensitive data" from "known" children and adolescents (people aged 13 to 17) without parental permission.
In Part II, rather than regulate all companies that collect data (such as Google, Amazon, Facebook, or your menstruation app), this bill regulates "data brokers," which are basically companies that only exist to buy and sell data. This section establishes an opt-in model for processing "sensitive data" but an opt-out model for "personal data" (a hair-splitting exercise I will spare you), and allows people to see, correct, and delete data only from these companies.
In Part III, rather than empowering all people with the ability to control all of their data, this bill only gives people the ability to opt out from businesses using personal data for targeted marketing campaigns or selling it to data brokers.
Issues with this bill abound:
• During testimony, a representative for the AG's office flagged the bill as inconsistent with the federal Children's Online Privacy Protection Act (COPPA), which could render it unenforceable.
• Several qualifiers throughout the bill render the protections vague. For instance, the bill's definition of "biometric data" excludes video and voice recordings, so there would be some cases where companies could use your voice recordings without your permission but not your voice print, another hair-splitting exercise I will spare you. As a representative for the ACLU pointed out during testimony, the definition of "sale" in the data-tracking section excludes "affiliates," so Facebook could still sell your data to Instagram or WhatsApp or any "affiliate" it suddenly wants to acquire for some reason.
• In all sections, the bill lets people sue, but not to recover damages or attorney's fees, which renders the protection meaningless, as it would make it hard to find a lawyer to take the case. The bill also gives companies a "right to cure," which, again, lets them off the hook at our expense. During the hearing last week, tech industry people whined about the bill allowing people to sue even a little bit, and consumer advocates slammed the bill's right to cure. Same old heads buttin'.
A New Idea in the House!
Rep. Slatter's Washington Foundational Data Privacy Act (WFDPA) runs into many of the same issues as the original bad Senate bill, but it does present a fun new enforcement option.
The proposal allows people to see, correct, and delete "personal data" collected by large companies, and it allows people to opt out of data processing for targeted advertising, “sharing” data, and creating profiles that would have legal consequences for users. The bill also forces those companies to adopt data protection practices, which would be cool.
To enforce all that, the bill sets up a three-person, Governor-appointed commission with subpoena power. The commission could perform audits of companies and investigate complaints. If a company violates your rights under this legislation, the commission could send out a useless letter, make them pay a fine of up to "$2,500 for each violation, or up to $7,500 for each intentional violation and each violation involving the personal data of a minor." (I imagine companies will have many reasons for why their violation was "unintentional.") The commission is also tasked with promoting data privacy rights to let people know they have them, which is also cool.
Aside from that commission, the bill allows people to sue for "actual damages" and "reputational harm," but not explicitly for attorney's fees, which, again, might make it difficult for people to find attorneys. Moreover, companies would have a "right to cure" for 30 days, so a company could say "oops haha," fix the issue, and avoid suit. The dreaded right-to-cure provision sunsets in a year, so it operates more as an onramp to get them used to obeying the law.
Though the bill is better than some Senate versions, issues here abound as well:
• The bill only applies to large companies, and it exempts nonprofits, airlines, and a bunch of other entities. The bill also mostly operates under an opt-out framework, which puts the onus on us to make sure some opaque companies are following complex data law.
• The bill preempts all city laws, so Seattle couldn't pass a stronger version even if anyone cared to.
• According to the ACLU, loopholes in the language allow tech firms such as Google and Facebook "to track and profile consumers without their consent, though consumers may opt out of seeing the targeted ads," which gives you a good idea about the kinds of hair-splitting going on in all these bills.
Over the phone, Kloba expressed interest in all the new bills and a willingness to find compromise. "I fully accept and support the notion of some kind of reasonable onramp, and I support the idea you need to give smaller, less well-resourced businesses more time," she said.
In the meantime, the Attorney General's office argues, Washingtonians will enjoy spillover benefits from companies switching their data practices to comply with California's privacy law. And our grievances will continue to add up alongside class action lawsuits filed against companies who violate privacy rights. Just this week, Ferguson joined other AGs in suing Google for allegedly continuing to track location data after people turn off their location history.
All that said, we still have an opportunity to pass the strongest data privacy laws in the country. Would be a shame not to do it just because big tech firms fear a few lawsuits.