"By The Editorial Board
In the past year, Congress has been happy to drag tech C.E.O.s into hearings and question them about how they vacuum up and exploit personal information about their users. But so far those hearings haven’t amounted to much more than talk. Lawmakers have yet to do their job and rewrite the law to ensure that such abuses don’t continue.
Americans have been far too vulnerable for far too long when they venture online. Companies are free today to monitor Americans’ behavior and collect information about them from across the web and the real world to do everything from selling them cars to influencing their votes to setting their life insurance rates — all usually without users’ knowledge of the collection and manipulation taking place behind the scenes. It’s taken more than a decade of shocking revelations — of data breaches and other privacy abuses — to get to this moment, when there finally seems to be enough momentum to pass a federal law. Congress is considering several pieces of legislation that would strengthen Americans’ privacy rights, and alongside them, a few bills that would make it easier for tech companies to strip away what few privacy rights we now enjoy.
American lawmakers are late to the party. Europe has already set what amounts to a global privacy standard with its General Data Protection Regulation, which went into effect in 2018. G.D.P.R. establishes several privacy rights that do not exist in the United States — including a requirement for companies to inform users about their data practices and receive explicit permission before collecting any personal information. Although Americans cannot legally avail themselves of specific rights under G.D.P.R., the fact that the biggest global tech companies are complying everywhere with the new European rules means that the technocrats in Brussels are doing more for Americans’ digital privacy rights than their own Congress.
The toughest privacy law in the United States today is the California Consumer Privacy Act, which is set to go into effect on Jan. 1, 2020. Just like G.D.P.R., it requires companies to take adequate security measures to protect data and also offers consumers the right to request access to the data that has been collected about them. Under the California law, consumers not only have a right to know whether their data is being sold or handed off to third parties, they also have a right to block that sale. And the opt-out can’t be a false choice — Facebook and Google would not be able to refuse service just because a user didn’t want their data sold.
While the California Legislature is still working out the precise details of the law and its implementation, other states — including New York — are hard at work on their own privacy legislation. The prospect of a patchwork of state-level rules explains why tech companies are suddenly eager for Washington to step in to set a national standard.
If a weak federal privacy law pre-empts state law, it would roll back the protections that Californians are supposed to get — and it would make it impossible for other states to set the bar even higher. That’s exactly what’s going on with privacy bills introduced by Senator Marco Rubio (the American Data Dissemination Act) and Senator Marsha Blackburn (the Balancing the Rights of Web Surfers Equally and Responsibly Act). Both offer weak privacy protections bundled with federal pre-emption. If passed, they would gut the California law. In the House, Representative Suzan DelBene’s Information Transparency and Personal Data Control Act also pre-empts state law, while offering a respectable amount of privacy protection, like a requirement for companies to secure opt-in consent before collecting user data. Still, even that bill lacks some rights that the California law provides.
The Senate bills that take privacy seriously do not contain pre-emption clauses. Senator Catherine Cortez Masto’s DATA Privacy Act, for instance, bears similarities to the California law and to the G.D.P.R., as does Senator Ed Markey’s significantly more ambitious Privacy Bill of Rights Act. Although Ms. Cortez Masto’s bill does not create a private right of action — that is, the ability for consumers to sue tech companies for privacy violations — Mr. Markey’s does, and invalidates arbitration clauses that could otherwise shield companies from individual lawsuits. Consumer lawsuits are a hot-button issue — in the California law, the private right of action exists only in a limited form thanks in part to corporate lobbying. Most interestingly, Mr. Markey’s bill requires the creation of a public list of data brokers in the United States — third-party companies that buy and sell your data.
Not all bills on the table take an omnibus approach. Some appear to be highly specific swipes at Facebook. For example, a social media privacy bill introduced by Senators Amy Klobuchar and John Kennedy does not add very much to consumer privacy, but each of its provisions — like one that forbids a change to a product that “overrides the privacy preferences of a user” — seems to be a reference to something Facebook has done in the past. Senators Mark Warner and Deb Fischer have introduced a bill circumscribing experimentation on users without their consent. It might seem shocking that any company would do such a thing, but, in fact, Facebook tinkered with its News Feed in 2014 to test whether it could alter its users’ emotions. (The bill also bars designing sites targeted at children under the age of 13 “with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user” — a provision aimed at YouTube and its effect on children.)
Where the Warner/Fischer bill looks to alleviate the harmful effects of data collection on consumers, Senator Josh Hawley’s Do Not Track Act seeks to stop the problem much closer to the source, by creating a Do Not Track system administered by the Federal Trade Commission. Commercial websites would be required by law not to harvest unnecessary data from consumers who have Do Not Track turned on.
A similar idea appeared in a more comprehensive draft bill circulated last year by Senator Ron Wyden, but Mr. Wyden has yet to introduce that bill this session. Instead, like Mr. Warner, he seems to have turned his attention to downstream effects — for the time being, at least. This year, he is sponsoring a bill for algorithmic accountability, requiring the largest tech companies to test their artificial intelligence systems for biases, such as racial discrimination, and to fix those biases that are found.
A grand bargain privacy bill is said to be in the works, with a handful of lawmakers from both parties haggling privately over the details. Forward-thinking legislation — and the public hearings that would inform its passage — are urgently needed. Americans deserve a robust discussion of what privacy rights they are entitled to and strong privacy laws to protect them.
Congress’s earliest attempts to regulate computing in the 1980s and 1990s were embarrassing. The Congressional Record shows that the Computer Fraud and Abuse Act of 1984, for instance, was prompted by a fantastical Hollywood film about a boy hacker. The Communications Decency Act of 1996 — many sections of which were deemed unconstitutional by the Supreme Court in the following year — had its origins in a moral panic about internet pornography touched off by questionable research. All this lent support to the received wisdom that the tech industry is best left to its own devices without the interference of a clueless legislature. More recent attempts, like the abortive Stop Online Piracy Act, an overbroad piece of copyright enforcement legislation that was killed in 2012 after furious backlash from internet users, have not instilled much confidence in Capitol Hill’s understanding of technology. But encouragingly, many of the privacy bills introduced this session show a sophisticated understanding of the market for personal information, the nation’s woefully inadequate cybersecurity and the many dangers posed by a sector of the economy that has proved itself incapable of self-regulation. Legislators have stepped up their game.
A single bill is of course not the end of government’s responsibilities to its citizens. Any regulation must evolve alongside technology to safeguard fundamental freedoms. But a strong law would be a welcome start. The California privacy law will go into effect in less than seven months. Congress should seize the moment and the public momentum to enshrine digital privacy rights into federal law."
Opinion | Why Is America So Far Behind Europe on Digital Privacy?
"So which choice is the least bad? Well, it depends whom you ask.
In April, Quartz published an article titled “Your cotton tote is pretty much the worst replacement for a plastic bag.” The story was based on a 2018 life cycle assessment of grocery bags from the Danish Environmental Protection Agency, which found that single-use plastic was less detrimental than cotton totes or even paper bags when it comes to how their manufacturing affects climate change, ozone depletion, water use, air pollution and toxicity for humans.
“Cotton bags must be reused thousands of times before they meet the environmental performance of plastic bags — and, the Denmark researchers write, organic cotton is worse than conventional cotton when it comes to overall environmental impact,” according to Quartz."
Are Plastic, Paper Or Reusable Bags Better For The Environment?