4 critical tips for creating and implementing a privacy plan

Businesses must account for situations that can put the data of their customers, employees, and partners at risk. Some things are easily overlooked when implementing an actual privacy plan. Here is a checklist to make sure you get it right.

You don’t need an ivory-tower lawyer to tell you that most user-facing privacy policies are made up of mumbo jumbo that essentially says, "We can do whatever we want, neener neener." But what about aspects of user and customer privacy that the organization really does care about—for brand reasons, compliance risk reasons, or both? You probably genuinely want to protect the data of your customers, business partners, and employees. Here’s how to start with a sincere, workable privacy plan.

Security frameworks get a lot of press, and so do user-facing, disclaimer-ridden privacy policies. But when it comes to reducing privacy-related risk while leveraging privacy as a value-added organizational feature, businesses need to plan their work and work their plan.

First, let’s distinguish the concept of a privacy plan from other popular privacy documents and statements. The term "accountability framework," for example, has come into common parlance as an umbrella term for everything the European Union's General Data Protection Regulation (GDPR) requires in order to demonstrate compliance to data protection authorities. An accountability framework encompasses a variety of data practices, data governance factors, assessments, and procedures. The rubric of "privacy by design," meanwhile, represents an incorporation of privacy into the engineering and product development process from the beginning.

A privacy plan is broader than that. It demands that privacy be incorporated throughout the entirety of each and every business process. Its goals are loftier than mere check-the-box compliance. To wit, a business privacy plan accounts for (or attempts to account for) all manner of contingencies that can put the data of its customers, customers' customers, employees, and partners at risk—even if nobody does anything wrong per se. Herein, the basic starting points.

Don't just pay lip service to the privacy officer

Last year, Sidewalk Labs, a subsidiary of Google parent Alphabet, was given the go-ahead to give an entire district in Toronto a high-tech makeover. It was meant to be a model for the smart city of the future. Sidewalk Labs hired privacy consultant Ann Cavoukian, renowned for originally formulating the aforementioned privacy-by-design concept. As part of her work, Cavoukian incorporated privacy by design into Sidewalk Labs' privacy planning to ensure the protection of residents' personal data. A crucial component of privacy by design, in this case, hinged upon anonymizing data at all collection sources.

Toward the end of October 2018, Cavoukian quit on the spot after reportedly being told that anonymization protocols for collected data were not guaranteed—something that was counter to her understanding when she began her work. Despite the purported commitment by Sidewalk Labs itself to data anonymization, Waterfront Toronto, the organization that recruited Sidewalk Labs for the smart city project, declined to commit itself or the third parties it worked with to data anonymization.

Sidewalk Labs should have worked with Waterfront Toronto to hash out its privacy plan, including data anonymization agreements, to begin with. Instead, someone either took the assumptions for granted or miscommunicated the ground rules. If the details had been worked out early on, the project would not have lost a talented privacy team member and it wouldn't have received a bunch of bad press over Cavoukian's departure—piled on top of the heaps of criticism it already faced.

That's the thing about privacy professionals, though: You have to do it for the right reasons. Too often, the fundamental reason companies hire privacy officers is so the business can still do whatever it wants (or close to it) and remain in substantial compliance with applicable laws, regulations, and industry standards.


Sometimes, however, keeping the privacy team happy comes down to one of the reasons its members got into privacy in the first place: their passion for it. And even in groups as greedily data-promiscuous as, say, certain marketing professionals, the organization that fails to thoroughly put its privacy officers' advice into practice is only begging for trouble, as regulators and politicians become bloodthirsty when something—inevitably—goes wrong.

Remember the third parties

Sidewalk Labs could not effectively put into practice either Cavoukian's advice or its own assurances to her, because it apparently had neither a plan nor a mechanism (nor, at least, the wherewithal) for establishing bright-line rules with third parties such as Waterfront Toronto and any additional partners Waterfront Toronto might choose to engage.

In any situation involving data (including privacy and security), keep in mind that vendors, outside developers, and other third parties may seek—and, perhaps, obtain—your customers' or users' data.

"'Yes' is nothing without 'how,'" noted negotiation consultant Chris Voss in his 2016 book, "Never Split the Difference." "It only takes one bit player to screw up a deal."

At a GDPR-focused event put on last year by the Massachusetts Technology Leadership Council, Harriet Pearson, a partner at law firm Hogan Lovells, urged attendees to update their vendor and partner agreements. Otherwise, she pointed out, they might get stuck with whatever insufficient or inappropriate boilerplate the vendors draw up.

Common third-party data collection and data handling practices can lag behind current data protection legal frameworks and fail to account for the subtleties of new ones. Marketing vendors may be particularly prone to this. One example is some vendors' standard, overzealous collection of website users' session-replay data, which often includes personally identifiable information. Under GDPR, that data may even constitute more strictly regulated biometric data, because those vendors can identify individuals from it using "something you do" authentication methodology.

Indeed, organizations, subject to GDPR or not, need policies and draft agreements in place that are informed by an overarching data privacy plan. These serve not only to revamp existing third-party agreements but also to inform future ones. Examine and create those policies to eliminate or mitigate both known risks and still-unknown ones.

Great idea. Now, how? Start with security scenarios.

Use threat-risk modeling to identify, categorize, and assess the risks

Businesses should apply the same kinds of threat-risk models, and their attendant rating systems, that they use for cybersecurity to their work on data privacy and compliance. The crucial benefit is that such models account for both the severity of a given threat and its likelihood. This allows the privacy and compliance team to categorize the different kinds of risks to privacy. From there, the team can document and promulgate a process for each privacy risk category.
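To make the severity-and-likelihood idea concrete, here is a minimal sketch in Python. The 1-to-5 scales, the multiplicative score, and the category thresholds are all illustrative assumptions, not part of any particular threat-risk model; a real rating system would come from whichever model your organization adopts.

```python
# Illustrative sketch of a severity-times-likelihood privacy risk rating.
# The 1-5 scales and category thresholds below are assumptions for
# demonstration purposes only.

def risk_score(severity: int, likelihood: int) -> int:
    """Combine severity and likelihood (each rated 1-5) into one score."""
    return severity * likelihood

def risk_category(score: int) -> str:
    """Bucket a score into a category that maps to a documented process."""
    if score >= 15:
        return "critical"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# Hypothetical privacy risks: (description, severity, likelihood)
privacy_risks = [
    ("Vendor session-replay tool captures PII", 4, 4),
    ("Partner declines to anonymize collected data", 5, 3),
    ("Stale boilerplate in a third-party agreement", 3, 2),
]

for description, severity, likelihood in privacy_risks:
    score = risk_score(severity, likelihood)
    print(f"{description}: score {score} ({risk_category(score)})")
```

Each resulting category would then map to its own documented response process, which is the point of the categorization exercise.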

In particular, OCTAVE Allegro is probably a good threat-risk model to build upon. In its original and pure form, OCTAVE (which stands for the Operationally Critical Threat, Asset, and Vulnerability Evaluation) was designed with general business and organizational risk in mind instead of technology risk (which most of the popular threat-risk models focus on). OCTAVE Allegro represents a standardized adaptation of OCTAVE principles that is geared specifically toward information security.

But no one threat-risk model is for everybody. The ideal threat-risk model depends on your organization's particular requirements, taking into account its need for legal compliance, brand protection, IT systems and software, and all other aspects of the business.

Actually follow your processes

Once a business determines how to properly evaluate privacy-threat risks, it must then establish and implement clear, step-by-step, "no regrets" processes. These need to address incidents internally (whether or not they arise from third-party actions) and direct the response at the appropriate level. And, of course, the business has to follow through on those processes when the time comes.

None of these tasks is quick, especially process planning where third parties are involved, notes Pearson. It requires time, commitment, and patience. Many organizations forget to budget for these elements, or they throw them out the window—whether because of panic, apathy, or simply never having fully fleshed out their processes and procedures to begin with.

Still, the reward for all the work these privacy plans require at the outset is that they save a lot of effort, headaches, and possible penalties in the long run—as long as you actually follow the process you establish.

Facebook, for example, recently learned this lesson the hard way. The U.K.'s Information Commissioner's Office (ICO) slapped it with the maximum (pre-GDPR) fine of £500,000 for data privacy violations after finding that the company didn't react quickly or effectively enough against third parties that had abused user data. The ICO particularly faulted Facebook for neither auditing the third parties nor suspending them from the platform. This was especially significant because the ICO had only once before (last year, in the case of Equifax) levied anywhere close to its pre-GDPR maximum fine.

To underline the point, the ICO made clear that it would have fined Facebook even more if it could have done so. And under GDPR, which carries maximum fines (depending upon the nature of any given violation) of the greater of either €20 million or 4 percent of an organization's annual revenues, it now does have that authority.
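That GDPR ceiling is simple arithmetic: the greater of €20 million or 4 percent of annual revenues (strictly, worldwide annual turnover). A quick illustrative sketch, with a hypothetical revenue figure:

```python
# GDPR's maximum fine for the most serious violations is the greater of
# EUR 20 million or 4 percent of annual revenues (worldwide turnover).

def gdpr_max_fine_eur(annual_revenue_eur: float) -> float:
    """Return the GDPR fine ceiling for a given annual revenue, in euros."""
    return max(20_000_000.0, 0.04 * annual_revenue_eur)

# For a hypothetical company with EUR 1 billion in annual revenue,
# the 4 percent prong (EUR 40 million) exceeds the EUR 20 million floor.
ceiling = gdpr_max_fine_eur(1_000_000_000)
```

For smaller companies, the €20 million floor dominates; the 4 percent prong only takes over once annual revenues exceed €500 million.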

The upshot is that, for all of the ways an organization can get into trouble—whether on its own or with the help of third parties—it is always how the organization handles things when something goes wrong that matters, more than the original sin.

Consequently, think of a privacy plan like a business continuity or disaster recovery plan. After all, a privacy incident or privacy threat is its own kind of disaster.

When a business has a document that clearly outlines, "This is what you do if…," it can reduce both organizational risk and everybody's stress levels should a privacy threat arise.

Creating and implementing a privacy plan: Lessons for leaders

  • The roughest points in data privacy are at the edges, where information is shared with third parties.
  • Use data security analyses as a starting point for identifying risks and frightening outcomes to be avoided.
  • The only thing more painful than doing a data privacy plan right is the consequences of not doing it right.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.