Data privacy

The big new law that might finally shake up digital privacy

The Health Insurance Portability and Accountability Act of 1996—commonly known as HIPAA—created a uniform legal framework for protecting medical records across the entire healthcare sector. Basically, one set of pro-consumer rules applicable to everyone. And as a result, patients don’t have to perform privacy-policy due diligence just to select a doctor. 

If HIPAA had not been passed into law, the process for selecting a new doctor might look quite different than it does today. You would need to figure out how the doctor intended to use your medical records by struggling through a lengthy, legalese-heavy disclosure. You would need to determine who other than the doctor’s office could access your records, and whether they were authorized to sell health records to other parties. And you would need to continually monitor all of this, because the doctor could change policies at any time and, say, sell the practice’s entire medical record database wholesale. 

Now contrast healthcare with the technology industry, which operates without a HIPAA-like charter. As it stands today, the technology companies that provide the most popular digital services do business without enforceable mandates to protect personally identifiable information. And so consumers are forced to manage their own digital privacy even as a shadowy, multi-billion-dollar data brokerage industry operates largely without regulation, selling data about you without your knowledge or approval. This is online “privacy” in the U.S. circa 2017.

Privacy isn’t an abstract concept. How it is defined and legislated determines what companies can and cannot do with your personal information. The primary obstacle to the U.S. enacting any sort of overarching privacy law is—wait for it—money. Personal data drives advertising networks, and advertising revenues are the lifeblood of technology and media businesses the world over. 

If you’re concerned about personal privacy and the pervasiveness of digital surveillance, the near-total absence of discussion about online privacy rights in the U.S. is unsatisfying. Instead, it’s a lot of “Nothing to see here, carry on” from the titans of tech.

In sharp contrast: Europe’s right to privacy

It is enlightening to compare digital privacy rights in the U.S. and Europe. Or, more accurately, rules in the U.S. and rights in the EU. Europeans treat data protection with the same devotion Americans reserve for the Constitution and Bill of Rights. And “Rights” is the right word here. Article 7 of the European Union Charter of Fundamental Rights, the document that defines the political, social, and economic rights of EU citizens, succinctly outlines a right to respect for private and family life, home, and communications. Article 8 then builds on that foundation to further expand the right to privacy:

1. Everyone has the right to the protection of personal data concerning him or her.

2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.

3. Compliance with these rules shall be subject to control by an independent authority.

The protection of these principles has been the responsibility of individual EU member states, which initially resulted in a patchwork of data protection regulations. That is, until the passage of the General Data Protection Regulation (GDPR) in 2016. The GDPR will be enforced across all EU member states, including the UK, beginning in May 2018.

This means the GDPR will operate EU-wide as the core privacy regulation, replacing the patchwork of national laws that preceded it.

Big picture, GDPR means that a single privacy law will apply across all the countries of the EU. Viewed from another angle, the new law will effectively safeguard any and all data collected about any EU citizen, including everything from data generated from using a digital service to purely technical data about an individual’s devices. 

The broad scope of GDPR

One of the most striking aspects of GDPR is just how many types of data are categorized as protected personal data. The law states that “personal data is any information relating to an individual, whether it relates to his or her private, professional or public life. It can be anything from a name, a photo, an email address, bank details, your posts on social networking websites, your medical information, or your computer’s IP address.” 

Under GDPR, EU citizens benefit from new or stronger privacy protections, such as:

• Consent. Users must be informed exactly what data will be collected and how it will be used. This consent is revocable. 

• Data portability. Users can freely transfer their personal data from one service to another. 

• Erasure. Users can require a data collector to erase their personal data. 

• Corrections. Users have the right to correct inaccurate or incomplete information.
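
To make those four protections concrete, here is a minimal sketch, in Python, of the obligations they might translate into on the service side. Everything here is hypothetical: the class, method names, and in-memory store are illustrative, since GDPR mandates outcomes rather than any particular API.

```python
from datetime import datetime, timezone
import json


class DataSubjectRights:
    """Hypothetical service-side helper for honoring GDPR-style user rights.

    `store` is assumed to be a dict-like object mapping user IDs to their
    personal data records; a real system would use a database.
    """

    def __init__(self, store):
        self.store = store
        self.consents = {}  # user_id -> list of consent records

    def record_consent(self, user_id, purpose):
        """Consent: record an explicit, purpose-specific, revocable opt-in."""
        self.consents.setdefault(user_id, []).append({
            "purpose": purpose,
            "granted_at": datetime.now(timezone.utc).isoformat(),
            "revoked_at": None,
        })

    def revoke_consent(self, user_id, purpose):
        """Consent is revocable: mark matching grants as withdrawn."""
        for record in self.consents.get(user_id, []):
            if record["purpose"] == purpose and record["revoked_at"] is None:
                record["revoked_at"] = datetime.now(timezone.utc).isoformat()

    def export_data(self, user_id):
        """Portability: return the user's data in a common, machine-readable format."""
        return json.dumps(self.store.get(user_id, {}), indent=2)

    def erase(self, user_id):
        """Erasure: remove the user's personal data entirely."""
        self.store.pop(user_id, None)
        self.consents.pop(user_id, None)

    def rectify(self, user_id, field, value):
        """Correction: fix inaccurate or incomplete information."""
        self.store.setdefault(user_id, {})[field] = value
```

A production system would also need authentication, audit trails, and propagation of erasure to backups and third-party processors, but the shape of the obligations is roughly this simple.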

Nice and simple, and a far cry from the situation in the U.S., where even resources meant to help you delete yourself from the Internet end up conceding that you can’t, not really.

There’s an interesting back story here. Many of the ideas behind the GDPR emerged in the United States as far back as the 1970s. But by the time the Internet era dawned, the U.S. had already lost its lead in privacy protections to the EU.

Playing online privacy “what if” 

GDPR is an expansive law that applies to any company that collects data on EU citizens, which means it reaches companies with no physical presence, servers, or personnel within the EU or UK. So long as a company processes the personal data of EU citizens—including simply operating a website with traffic from the EU—it is required to comply with the provisions of GDPR.

This definition includes pretty much every multinational corporation and organization of any size doing business in Europe. So it’s not surprising that GDPR compliance is a top data-privacy and security priority of 92% of U.S. companies. Individual companies report compliance costs in the millions of dollars.

GDPR has enormous practical implications for U.S. companies that have grown comfortable selling their customers’ personal data, defaulting users into opted-in status through byzantine Terms of Service agreements, and implicitly treating consumer data privacy as a privilege rather than a right.

Let’s play “what if” and imagine for a moment a U.S. GDPR. The relationship between users and digital service providers would change substantially. Here are three examples.

1. Companies would have to clean house before signing up even one user 

The first change under a U.S. version of GDPR would be what a user finds when “arriving” to do business with a digital service. Today, signing up for a new service tends to look the same across services: set up an account by providing some personal data; click “Agree” after not reading the nearly incomprehensible Terms of Service and Privacy Policy statements; accept that being tracked and having your data monetized is the “price of free” that pays for the service.

But GDPR-compliant companies must “clean house” and organize themselves in a privacy-friendly way before users even show up. The content of their Terms of Service agreements, for example, must describe how the service safeguards and protects personal data, not how it monitors and collects it.

As we described at the outset, you can get a glimpse of this approach in the U.S. when you visit a HIPAA-compliant doctor’s office. You can visit that doctor knowing your data will be protected in specific and comprehensive ways. You don’t have to evaluate the “terms” of working with each and every doctor to make sure they will behave the way you want them to.

2. Users must freely opt in to a service after receiving transparent descriptions of how the service collects and handles their data

GDPR defines a true opt-in, in which consumers knowingly agree to have their personal data collected. Transparency is critical under GDPR, which means the opt-in request must be clearly disclosed and explained to the user or website visitor.

This is very different from how things work in the U.S., where consumers are opted in by default. Digital services grudgingly offer an opt-out only to those who take proactive steps to keep their personal data from being processed and retained.
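
As a rough sketch of the difference, assuming a hypothetical sign-up flow: under a GDPR-style regime the default is no consent, each purpose is requested separately in plain language, and nothing is processed until the user affirmatively agrees.

```python
# Hypothetical sign-up flow illustrating opt-in by default (GDPR-style)
# versus opt-out by default (typical U.S. practice today).

PURPOSES = {
    "service_emails": "Send emails needed to operate your account",
    "personalized_ads": "Use your browsing history to target advertising",
    "data_sharing": "Share your profile with third-party partners",
}


def gdpr_style_signup(ask_user):
    """Start with no consent; process data only for purposes the user
    explicitly and knowingly approves. `ask_user(purpose, description)`
    stands in for a clear, plain-language prompt returning True or False."""
    consents = {purpose: False for purpose in PURPOSES}  # default: opted out
    for purpose, description in PURPOSES.items():
        consents[purpose] = ask_user(purpose, description)
    return consents


def us_style_signup():
    """Typical U.S. default today: everything on unless the user later
    hunts down the opt-out settings."""
    return {purpose: True for purpose in PURPOSES}  # default: opted in
```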

3. Users are granted the right to erasure and data portability

As it stands today, Americans can’t completely break up with a digital service. You can request that an account be closed, but you have no right to have your personal data scrubbed from the service’s servers completely. That photo you posted to social media 10 years ago will remain on the site’s servers indefinitely.

The GDPR introduced two new rights for data subjects. The first is the right to erasure, which allows individuals to request the deletion of their personal data. The GDPR also gives users the right to receive their personal data in a common format that allows them to transfer the data to another service. 

Taken together, this means a user can move everything they ever posted on one social media site to another site, and require the former site to delete everything about them. The U.S. today has nothing like this.
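
The regulation does not dictate what “a common format” must look like, but a structured export along these lines, paired with an erasure request, is one plausible sketch; the field names and values are purely illustrative.

```python
import json

# Hypothetical export a social service might hand back under the right to
# data portability: structured, machine-readable, and usable by another service.
portability_export = {
    "user_id": "example-user-123",
    "exported_at": "2018-05-25T00:00:00Z",
    "profile": {"display_name": "Example User", "email": "user@example.com"},
    "posts": [
        {
            "posted_at": "2008-03-14T09:30:00Z",
            "text": "My first post",
            "photos": ["photo_001.jpg"],
        },
    ],
}

print(json.dumps(portability_export, indent=2))

# The companion erasure request: once the export is delivered, the user can
# require the original service to delete the underlying records entirely.
erasure_request = {"user_id": "example-user-123", "action": "erase_all_personal_data"}
```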

We may thank Europe for new privacy safeguards

It is easy to get lost in the technical and regulatory details of privacy law and lose sight of its significance. This discussion is ultimately about the tension between an individual’s right to protect and control information by and about them, and an entity’s interest in benefitting from such information. 

The next chapter in the history of online privacy in the U.S. will likely be influenced by the EU’s GDPR. Come May 2018, much of corporate America will need to comply with the GDPR or find itself subject to fines and sanctions.

Perhaps the discussion in the U.S. should start with an idea that many people support: data privacy is a right, not a privilege. Seen in this light, much of the digital surveillance we accept today may suddenly seem far less acceptable.