Ann Cavoukian, the former Information and Privacy Commissioner of Ontario, originally developed Privacy by Design as an approach to building any technology system with privacy considered throughout the entire engineering process.

The concept entered the collective consciousness of the digital advertising industry in recent years after the European Union unleashed the General Data Protection Regulation (GDPR) in 2018. It has garnered renewed focus with the impending deprecation of third-party cookies in Google’s Chrome browser and Apple’s new restrictions on the iOS IDFA advertising identifier.

First introduced in 1995, and then codified into an official framework in 2009, Privacy by Design consists of seven foundational principles. One principle from that framework may sound familiar:

Privacy as a default setting: If an individual does nothing, their privacy still remains intact. No action is required on the part of the individual to protect their privacy; it is built into the system by default.

Privacy as a default setting materializes in some form in Article 25 of the GDPR as “Data Protection by Design and by Default.” While the names sound similar, the two concepts are quite different, and it’s crucial to understand the distinction.

“Data protection by design and by default” assumes that a system will collect personal data and should take care to protect this data. “By default” is the idea that a system should only collect what is necessary for a specific purpose and no more.

In contrast, with “privacy by design,” there is no need to consider the protection of personal data because no personal data is collected or used in the first place. Platforms built around privacy by design would not drop cookies, share identifiers, or store personal information to serve digital advertisements.
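To make the distinction concrete, here is a minimal Python sketch contrasting the two approaches. The field names and request shapes are illustrative assumptions, not drawn from any real platform.

```python
# Illustrative contrast between "data protection by design and by default"
# (collect personal data, but only the minimum needed) and "privacy by design"
# (collect no personal data at all). All field names are hypothetical.

def build_request_data_protection_by_default(user):
    # Personal data is still collected, but limited to the stated purpose.
    return {
        "user_id": user["cookie_id"],  # still a personal identifier
        "country": user["country"],    # needed, say, for geo-compliant serving
        # no precise location, no browsing history, no email
    }

def build_request_privacy_by_design(page):
    # Nothing about the user is collected; the request describes only context.
    return {
        "page_url": page["url"],
        "page_category": page["category"],
    }
```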

This concept could sound shocking to anybody who has worked in ad tech. Pervasive tracking and user identification form the bedrock of modern-day digital advertising. Understanding users’ behavior and interests is critical to powering the detailed measurement and audience targeting capabilities that make digital such an attractive alternative to traditional forms of advertising.

But what if there was a way to retain the advanced targeting and measurement benefits of digital advertising while adhering to privacy by design? There are political, social, and business undercurrents that may carry privacy by design from a fringe concept to a core tenet of any advertising platform.

Companies like Amazon believe that privacy by design is important enough to hire individuals solely focused on the concept. Amazon recently posted Product Management and Engineering roles focused wholly on privacy in advertising. The engineering job posting specifically mentions that “The team will focus on engineering solutions that will support privacy by design.”

The third most valuable company in the world is taking this idea seriously. But why?

The Need for Privacy by Design Ad Tech

Antitrust scrutiny of Google and its possible anticompetitive behavior in the online advertising market may drag some proverbial skeletons out of the closet, like the unprecedented amount of user data collection and tracking it performs. If these concerns enter the news cycle during the inevitable reporting on the upcoming DOJ case against Google, then our less technically savvy friends may start paying attention.

Consumers around the world are waking up to privacy concerns with the help of the most valuable public company in the world. Apple’s recent national advertising push focused on privacy may prompt users to take their privacy more seriously, and Apple is eager to help them do so.

GDPR shook the ad industry, but European ad spend is a fraction of those sweet American advertising budgets. Regulations on the home front of the large advertising companies, like the California Consumer Privacy Act (CCPA), are more concerning. Even though CCPA is much less restrictive than its European cousin, it may signal continued focus on privacy regulation in the United States.

Even though GDPR originated on a separate continent, users around the world bore the brunt of the regulation through the now prevalent cookie consent banners. These are the banners that ask, sometimes without the ability to decline, if a website can drop cookies on your browser.

Even though cookie consent banners proactively inform users of data collection, their implementation may not qualify, to some, as privacy by design. Banners that display only a single “Accept” option disingenuously ask for consent to all uses of cookies and often give users no simple way to decline, or no way to decline at all.

A 2018 research paper entitled “(Un)informed Consent: Studying GDPR Consent Notices in the Field” found that 86% of the cookie consent dialogs on the most popular websites in the European Union offered either no consent option at all or only a single confirmation covering all cookies.

The research paper also details an experiment conducted on 82,000 unique visitors to a German e-commerce website, measuring the opt-in rate based on the type of consent banner displayed. One group of users saw a “categories” version of the consent banner with all categories pre-checked.

Desktop users accepted all selections only 11.9% of the time, and when researchers removed the pre-checked options, users never opted into all categories. When presented with no category options and a simple “Accept” or “Decline” dialog, the acceptance rate rose to 21%.

So when given a clear choice, users overwhelmingly choose to maintain their complete privacy. What would happen if a similar consent dialog appeared in native apps on Android or iOS? We will find out in early 2021, when Apple introduces its new opt-in framework that asks the user for permission before an app can read the device IDFA.

Similar dialogs have already started to pop up on smart TVs.

Automatic Content Recognition can track everything displayed on a smart TV, even if the source of the content is not internet-enabled. Does the average user understand the implications of accepting such terms? If smart TVs explained how they use data, users might not be so quick to agree.

The Interactive Advertising Bureau and the advertising industry often hold the view that consumers are delighted by personalized advertising experiences. The IAB even touted this in a deck titled “The Value of Targeted Advertising to Consumers,” which cites an Adlucent study that found “71% of respondents prefer ads tailored to interests and shopping habits.” But this does not tell the whole story.

When users become aware of how their advertising experiences are personalized, sentiment degrades sharply. A study performed by the cybersecurity company RSA found that when it comes to tracking online activity to tailor advertisements, “17% of those surveyed viewed it as ethical, and 68% find it unethical.”

Users want personalized experiences, but when given a clear choice, they overwhelmingly opt out of the mechanisms required to facilitate those experiences. The majority also consider the tracking that powers advertising unethical. These factors paint a grim picture for the future of digital advertising, but several initiatives may satisfy both users and advertisers.

The Importance of Identifiers

It is common for advertisers to use data to target audiences by behavior, interest, or demographics. Data-driven targeting helps them narrow down the audience their ad is displayed to and ensures that ads are seen only by those with whom the message is most likely to resonate.

The issue is that to power audience targeting, advertisers either purchase third-party data sets or collect user data through opaque and widespread tracking methods.

Data brokers enter into agreements with credit card companies, websites, and app owners to surreptitiously track what a user purchases, the websites they visit, and the apps they download. The right for companies to sell user data is often buried deep within a privacy policy or terms & conditions.

Companies like Criteo help advertisers retarget users based on their browsing history or behavior. Criteo often plays a role in “creepy” ads that display products a user previously looked at on a website or app. The Criteo stock price may reflect investors’ perception of the longevity of these retargeting methods.

The Wild West days of nearly universal data collection are coming to a close, and not just because of regulations like GDPR and CCPA. The companies that make the smartphones and browsers are being swept up in the newfound focus on privacy, depriving data companies of access to the individual identifiers used to tie all their user profiles together.

Starting in early 2021, Apple will require apps to ask users for consent before tracking them, and if a user opts out, developers will not have access to that user’s advertising identifier, the IDFA.

Google has pledged to remove third-party cookies from Chrome by 2022, following in the footsteps of privacy-focused browsers like Apple’s Safari and Mozilla’s Firefox.

Both of these unprecedented shifts in policy adhere directly to the “privacy as a default setting” principle, and will render third-party data collection in its current form useless. These impending changes have the digital advertising industry reeling and scrambling for answers.

The IAB’s answer is called “Project Rearc,” an effort to shift from cookies and IFAs to user emails as the identifier of choice. The idea is that users will willingly share their email in exchange for access to free content, which is a strong assumption. But does Project Rearc genuinely consider privacy by design?
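In practice, email-based identifiers are usually derived by normalizing and hashing the address; the Python sketch below shows that common pattern. The function name and the use of SHA-256 are illustrative assumptions, not details taken from the Project Rearc specification.

```python
import hashlib

def email_to_ad_id(email: str) -> str:
    """Derive a pseudonymous identifier from an email address.

    Lowercasing plus SHA-256 is a common industry pattern for email-based
    IDs; Project Rearc's actual mechanics may differ.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same email always yields the same 64-character hex ID on every site
# that collects it, which is exactly what makes it useful as a cross-site
# key, and why it remains a privacy-sensitive identifier.
print(email_to_ad_id("Jane.Doe@example.com"))
```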

Technically, Project Rearc can follow privacy by design if publishers are transparent about how they will use a user’s email in the dialogs that prompt for its collection.

Philosophically, some may find Project Rearc at odds with the essence of privacy by design because of the principle of “privacy as a default setting.” If publishers deny access to their content unless a user provides an email, they are maintaining user privacy but refusing service to anyone who will not sacrifice it.

Many users do not understand or do not care to think about the value exchange powering the free and open web they have used for decades. Digital ads, for better or worse, have empowered individuals and media companies around the world to create and provide access to free content for billions of people. If users had to pay for content, the breadth of information they access could be limited to a few gatekeepers, and individuals who couldn’t afford it may not have access to any information or entertainment at all.

So what are the alternatives?

Contextual - The Boring Answer

The boring answer to this is contextual advertising. The answer is boring because ad platforms like Google’s AdSense have been delivering contextual ads for years.

The general idea is that the advertisement is targeted based on the context of the content displayed alongside it. Companies like Peer39 scan the content of a page, categorize the URL in a database, and assign keywords to it. Peer39’s partner DSPs then use the URL passed in an ad request to query Peer39’s database and power category and keyword targeting.
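The lookup itself is simple in principle: the page URL is the key into a database of categories and keywords, and no user identifier is involved. The Python sketch below is a toy model of that flow; the table contents, function names, and matching rule are hypothetical, not Peer39’s actual API.

```python
# Toy model of a contextual-targeting lookup: the only signal in the ad
# request is the page URL, never a user identifier. The category database
# and the matching rule are hypothetical.

CONTEXT_DB = {
    "https://example.com/reviews/best-running-shoes": {
        "categories": ["sports", "footwear"],
        "keywords": ["running", "marathon", "shoes"],
    },
}

def eligible_campaigns(page_url, campaigns):
    """Return the campaigns whose category targeting matches the page."""
    page = CONTEXT_DB.get(page_url, {"categories": [], "keywords": []})
    return [
        c for c in campaigns
        if set(c["target_categories"]) & set(page["categories"])
    ]

campaigns = [
    {"name": "TrailShoe Spring Sale", "target_categories": ["footwear"]},
    {"name": "Luxury Sedan Launch", "target_categories": ["autos"]},
]

# Only the footwear campaign matches; no user data was involved.
print(eligible_campaigns("https://example.com/reviews/best-running-shoes", campaigns))
```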

If advertisers only use “context” as a signal to power targeting, and do not use any individual identifiers or personal information, then this adheres to privacy by design. But contextual advertising lacks the specificity and power advertisers have enjoyed through tracking.

Advertisers employing contextual targeting today often use it in conjunction with powerful tools that rely on tracking. Advertisers would not use contextual targeting alone unless forced to, which may happen very soon with the impending removal of third-party cookies from Chrome and the new restrictions on the IDFA in iOS.

Privacy by Design Ad Tech

Google and Apple are already well on their way to designing the future of data-driven targeting and measurement while maintaining user privacy. Google’s Chrome Privacy Sandbox and Apple’s SKAdNetwork reveal a glimpse of a possible future of digital advertising. Both of these efforts present solutions that genuinely consider privacy by design.

Privacy by Design in Chrome

The Chrome Privacy Sandbox presents a series of proposals, each meant to address a facet of digital advertising that would otherwise be rendered useless in a future, privacy-supercharged version of Chrome.

In Google’s TURTLEDOVE proposal, a privacy-focused version of Chrome would allow advertisers to store behavioral targeting information locally in a user’s browser. The browser would then be responsible for conducting all of the auction logic for behaviorally targeted ads client-side, with no information about the user ever leaving their device.
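A highly simplified model of that client-side flow might look like the sketch below. The interest-group structure, the contextual candidates, and the winner-picking rule are assumptions made for illustration; they are not the API or auction logic defined in the TURTLEDOVE explainer.

```python
# Simplified model of an on-device, TURTLEDOVE-style auction. Interest-group
# membership lives only in the browser's local storage; the winning ad is
# chosen client-side, so no user data leaves the device. All data shapes and
# the scoring rule are illustrative assumptions.

local_interest_groups = [
    # Written earlier by an advertiser's site the user visited.
    {"owner": "shoes-advertiser.example", "name": "running-shoes", "base_bid": 2.50},
]

contextual_candidates = [
    # Returned by the ad server using only page context, no identifiers.
    {"campaign": "generic-sports-drink", "bid": 1.10},
]

def run_on_device_auction(interest_groups, contextual_ads):
    """Pick the highest-bidding ad entirely on the client."""
    candidates = [
        {"campaign": g["name"], "bid": g["base_bid"]} for g in interest_groups
    ] + contextual_ads
    return max(candidates, key=lambda ad: ad["bid"])

# The retargeted ad wins, but the ad server never learned who the user is.
print(run_on_device_auction(local_interest_groups, contextual_candidates))
```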

In the separate FLoC (Federated Learning of Cohorts) proposal, Google lays out a vision for demographic and interest targeting in a privacy-compliant manner. The browser would monitor browsing behavior and place users into “flocks,” groups of users with similar browsing habits. Each flock would contain thousands of users, ensuring anonymity, and Chrome would make flock membership identifiers available through a request header.

Advertisers could observe that certain flocks tend to buy a particular product and target those flocks with ads for similar products. Ad platforms could build models that optimize for goals like click-through rate at the flock level, noticing, for example, that some groups click certain ads more than others.
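As a rough illustration of the cohort-assignment step, the Python sketch below reduces a browsing history to a coarse numeric cohort ID, which is the only value an ad server would ever see. The real FLoC design used a locality-sensitive hashing scheme so that similar users land in the same flock; the simple hash-and-bucket approach here only illustrates the shape of the output, not the actual algorithm, and the cohort count is an arbitrary assumption.

```python
import hashlib

# Toy model of flock assignment: the browser maps a user's browsing history
# to a coarse cohort ID shared with thousands of other users. This is NOT
# FLoC's real algorithm (which used locality-sensitive hashing); it only
# shows that the output is a single, non-identifying group number.

NUM_COHORTS = 2048  # arbitrary; coarse enough that each cohort holds many users

def assign_cohort(visited_domains):
    fingerprint = "|".join(sorted(set(visited_domains)))
    digest = hashlib.sha256(fingerprint.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_COHORTS

history = ["news.example", "runningshoes.example", "recipes.example"]
print(assign_cohort(history))  # a number in [0, 2047]: the only signal exposed to ad servers
```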

Both of these proposals offer an alternative to third-party cookies and pervasive tracking of users across the web. The clear advantage of both is that a user’s information either stays local to their browser or is anonymized.

TURTLEDOVE and FLoC are the two main and most fully developed proposals, but Google has also released additional privacy-preserving plans. These put forth ideas for cookieless ways to count unique users, prevent fraud, and hide IP addresses.

Privacy by Design in iOS

The new tracking consent requirements from Apple will seriously restrict access to the device IDFA (Identifier for Advertisers) on iOS. Opting out of tracking prevents an app from accessing the IDFA, rendering IDFA-based conversion tracking impossible.

The new consent requirements sent shockwaves through the digital advertising industry. The announced changes left companies scrambling to adopt Apple’s SKAdNetwork, since it provides the only way to measure the conversion rate of advertised app installs without an IDFA.

SKAdNetwork is a privacy by design solution for tracking whether a user installed an app after clicking an ad for it. The framework measures app install conversions without tracking the user or their IDFA, instead triggering anonymous conversion notifications from the user’s phone or tablet. This method fits the ethos of privacy Apple is establishing in the minds of consumers worldwide.
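The measurement side of that flow can be modeled roughly as follows: each install triggers a postback carrying a campaign ID and a coarse conversion value but no user or device identifier, and the ad network aggregates those anonymous signals into campaign-level install counts. The field names and the conversion-rate helper below are simplified assumptions, not Apple’s exact postback payload or API.

```python
from collections import Counter

# Rough model of SKAdNetwork-style measurement: postbacks are anonymous and
# carry only a campaign ID plus a coarse conversion value. Field names are
# simplified; Apple's actual postback payload differs in detail.

postbacks = [
    {"campaign_id": 7, "conversion_value": 3},
    {"campaign_id": 7, "conversion_value": 0},
    {"campaign_id": 12, "conversion_value": 5},
]

def installs_per_campaign(postbacks):
    """Aggregate anonymous postbacks into install counts per campaign."""
    return Counter(p["campaign_id"] for p in postbacks)

def install_rate(campaign_id, installs, ad_clicks):
    """Campaign-level conversion rate: installs divided by recorded ad clicks."""
    return installs[campaign_id] / ad_clicks[campaign_id]

installs = installs_per_campaign(postbacks)
print(install_rate(7, installs, ad_clicks={7: 50, 12: 40}))  # 0.04
```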

A Few Control The Future for All

The actions by Google and Apple are clearly a response to their users’ growing desire for privacy. The initiatives presented provide privacy by design methods for conducting digital advertising by not requiring a user to share any data with an advertiser in the first place, underscoring the distinction between “data protection by design and by default” and “privacy by design.”

Google still has a lot of work to do on its Privacy Sandbox proposals, and Apple’s SKAdNetwork doesn’t address crucial advertising needs like frequency capping.

Privacy by design advertising technology presents a significant challenge for any ad tech company. Moving away from cookies and identifiers will require the people taking on the challenge to master entirely new disciplines.

The IAB’s Project Rearc could offer an alternative path but may ultimately run counter to users’ desire for privacy by design advertising experiences. Adapting existing platforms and building new privacy by design tools will be a considerable undertaking for ad tech providers. But the companies that own the most popular browsers and smartphone operating systems will ultimately set the rules everyone must play by.