Welcome back to our series on cookies: how they work, how they're used, and how they're changing. It's a pretty big series - last time around we talked about how data collection took off and became a big business, and how that business was publicly exposed, setting off privacy concerns among consumers.
Although the Wall Street Journal published their article on cookies and data trafficking in 2010, it would take almost ten more years for real legislation around data privacy to hit the books. Which isn't to say there wasn't an immediate response: Advertisers banded together with trade orgs to create some new data collection best practices, in part to stave off legislation. The United States Congress has for some reason been pretty reluctant to go after advertising practices and data collection, despite the fact that hating ads is a bipartisan issue.
Groups like the IAB and the 4A's released new guidelines around data collection, PII, and what constituted "sensitive information," while data collectors like BlueKai stopped gathering certain types of data entirely - targetable segments for P13-17 (persons aged 13 to 17) were removed outright. But real change around data collection would have to come from Europe, where lawmakers had a much greater appetite for punishing American tech companies.
Data Scandals Re-ignite the Debate
In the early days of Facebook's IPO, the running joke was that everyone knew how to monetize the platform except Facebook - the company was still struggling to generate ad revenue while third parties like advertisers and game developers were mining a treasure trove of data from the platform's users and monetizing it. But following the IPO, Facebook would move quickly to close off those data collection avenues, bringing full control of its user data in-house. No longer was it possible for developers and advertisers to extract that data - at least, not legally, anyway.
Enter Cambridge Analytica, a British firm specializing in political advertising. Using a custom app, the firm illegally harvested data from nearly ninety million Facebook profiles to build psychological profiles that could be used for online targeting, primarily for the 2016 presidential campaigns of Donald Trump and Ted Cruz. That data was used to serve customized messages about candidates to US voters across different social platforms.
The story of Cambridge Analytica's involvement in the election was published by both The Guardian and The New York Times in March 2018. It set off a new wave of public concern around data collection and, coupled with a number of high-profile data breaches such as a Wells Fargo breach involving millions of customer records, would lead to new demand for legislative solutions.
GDPR
In 2018 the EU put into effect its General Data Protection Regulation (GDPR), a sweeping piece of legislation governing the collection and processing of data on EU citizens and laying out harsh penalties for violating user privacy. And those penalties are real - designed to be big enough to give companies like Google and Apple pause, with fines for the most serious violations of up to 20 million euros or 4% of a company's global annual revenue, whichever is higher. That's no joke, and it's designed explicitly to act as a real deterrent to companies who might otherwise feel they could just eat the hit. The largest single fine to date has been against Meta, which in May 2023 was hit with a 1.2 billion euro fine by the Irish Data Protection Commission.
GDPR sets up strict rules around the collection and use of personal data. One of the biggest changes was clarifying the meaning of consent, requiring explicit permission from the consumer before their data could be collected. The law also established a sort of online bill of rights, allowing consumers to request access to their data, have it corrected or deleted, and object to its use elsewhere. It also laid out strict rules about what could be collected, how long it could be stored, and when and how it could be transferred across borders.
Under GDPR, companies are split into two categories with regard to data: data controllers and data processors. Controllers determine why and how data is processed and are responsible for making sure everyone involved is GDPR compliant, while processors do the actual collecting and processing on the controller's behalf. In most digital advertising cases, the website serving ads is the controller and the data partners it works with for ad delivery are processors. In practical terms, this means the website is responsible for showing you the opt-in message telling you what will be collected on the site and which third-party companies it works with. In effect, this is how ad tech firms passed the buck on GDPR - by positioning themselves as processors, they left responsibility for privacy management and opt-in with the publishers.
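To make that split concrete, here's a minimal sketch of how a publisher site, acting as the controller, might hold back its processors' scripts until the visitor has opted in. It's written in TypeScript against a hypothetical consent record and made-up vendor URLs (getVisitorConsent, ads.example.com, and data.example.com are placeholders, not any real CMP API); in practice most sites delegate this to a consent management platform rather than hand-rolling it.

```typescript
// Minimal sketch of consent-gated tag loading on a publisher site.
// The vendor list and getVisitorConsent are hypothetical placeholders.

type Vendor = { name: string; scriptUrl: string };

// Hypothetical processor scripts the site (the controller) works with.
const adVendors: Vendor[] = [
  { name: "ExampleDSP", scriptUrl: "https://ads.example.com/pixel.js" },
  { name: "ExampleDMP", scriptUrl: "https://data.example.com/collect.js" },
];

// Stand-in for whatever the site's consent banner records.
// Defaults to "no consent" if the visitor hasn't made a choice yet.
function getVisitorConsent(): Promise<{ advertising: boolean }> {
  const stored = localStorage.getItem("consent");
  return Promise.resolve(stored ? JSON.parse(stored) : { advertising: false });
}

async function loadAdVendors(): Promise<void> {
  const consent = await getVisitorConsent();

  // No explicit opt-in means no third-party scripts: the processors never run.
  if (!consent.advertising) return;

  for (const vendor of adVendors) {
    const script = document.createElement("script");
    script.src = vendor.scriptUrl;
    script.async = true;
    document.head.appendChild(script);
  }
}

void loadAdVendors();
```

The key design point is simply ordering: nothing from the processors loads until the controller has a recorded, affirmative opt-in.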
But what was the impact on advertising? In the months leading up to GDPR's implementation, the ad industry was awash in speculation that opt-outs would skyrocket, conversions would go down, and CPAs would go up. And after GDPR went into effect, that did happen... for about three weeks. It turns out people were willing to opt in at embarrassingly high rates, in part because websites were clever about gating content - when it's more of a pain to opt out, people will just mash "accept" to get to their content. We'll come back to this later.
The groups hit hardest by GDPR were mobile advertisers and data collection platforms. Under GDPR most location tracking became difficult or impossible, leading a number of location-based and mobile data services to simply stop operating in Europe. GDPR made European apps noticeably less intrusive, but it also sharply reduced the introduction of new apps and led to the withdrawal of many existing ones.
While GDPR only governs the rights of people in the EU, it does so in a broad way, forcing US companies to comply with GDPR when those users visit US websites. This would lead US and Canadian companies to build the framework for future privacy regulation internally and roll it out piecemeal, something that would come in handy as privacy regulation hit the United States.
CCPA
A year and a half later, in January 2020, the California Consumer Privacy Act (CCPA) would take effect and become the first US law to address modern data privacy, focusing on giving consumers insight into the data being collected on them and control over how that data could be sold and traded between companies. While not nearly as wide-ranging as GDPR in its scope (and focused more on the sale of data than its collection), CCPA would set the tone for other privacy legislation in the United States, establishing similar guidelines and forcing US companies to build compliance around the strictest common set of laws. Maine would follow suit in July 2020 with An Act to Protect the Privacy of Online Consumer Information, with Colorado and other states close behind.
Since CCPA, more than a dozen other states have signed privacy laws covering PII, and nearly as many are currently considering laws of their own. Data privacy is a bipartisan issue, and as more states enact legislation, the burden on advertisers and ad tech firms grows: they have to comply with the harshest shared set of standards across all active laws.
Next Time: The Cookie Dies... or Does It?
Next week we'll wrap up our series by looking at Google's decision to deprecate cookies, how they attempted to create replacements, and why they ultimately ended up postponing that decision indefinitely. We'll also talk about what it means and why it does - or doesn't - matter.