
Insights on governance, risk and compliance

Can privacy really be protected anymore?

Privacy trends 2016


Contents

Introduction
Privacy demands more rigorous accountability
The view today: Building on the key trends from 2015
Moving forward: Actions to help improve accountability in 2016
Conclusion

Introduction

As a concept, privacy, and the need to protect it, has been around for decades. Yet the programs and governance structures in place to turn the concept of privacy protection into reality remain, if not in their infancy, then certainly in their adolescence.

In EY’s Global Information Security Survey (GISS) 2015 privacy questionnaire, 38% of respondents admit that they address security in new business processes and technologies, but not privacy specifically.1

However, more telling, and perhaps more concerning for organizations and individuals alike when it comes to managing privacy, is that for nearly half (46%) of survey respondents, their number one or two concern is not having a clear picture of where personal information is stored or processed outside of their main systems and servers.2 This is exacerbated by the fact that for 40% of respondents, their number one or two concern is that there are simply not enough people to support their privacy program.3

In a world where laws and regulations cannot keep pace with digital change, the question many are asking is: can privacy really be protected anymore? As the onus of accountability shifts from regulators to organizations, organizations need to take heed of where they are in terms of their privacy maturity and what they need to do to make privacy protection a part of everything in an organization.


1 EY’s Global Information Security Survey 2015 asked approximately 630 privacy professionals for their views on a number of critical privacy issues.

2 Ibid.

3 Ibid.

EY’s Global Information Security Survey 2015 privacy questionnaire asked 630 privacy professionals what concerned them most when it comes to how privacy is managed in their organization. (Rate each of the following concerns from 1 to 5; 1 = most important, 5 = least important)

Percentage of respondents selecting each rating, from 1 (most important) to 5 (least important):

• Lack of tone from the top: 15% / 18% / 25% / 18% / 23%
• There is a disconnect between the corporate program and its level of implementation in the business units: 8% / 20% / 33% / 23% / 15%
• We don’t control what our vendors/suppliers are doing with the personal information they have access to: 12% / 24% / 33% / 19% / 12%
• Insufficient funds to address known gaps: 11% / 24% / 34% / 20% / 11%
• We don’t control what employees are doing with the personal information they have access to: 14% / 26% / 30% / 19% / 11%
• Overreliance on policy and training with limited technical controls to guide correct behavior: 12% / 23% / 36% / 20% / 9%
• We don’t have a full picture of where our personal information is stored and processed outside of our main systems and servers: 23% / 23% / 24% / 14% / 15%
• Insufficient people assigned to support our privacy program: 15% / 25% / 33% / 16% / 10%
• Other (please specify): 13% / 12% / 42% / 13% / 20%


Privacy demands more rigorous accountability

We see an increasing overlap of four areas that were once viewed as distinctly separate: financial reporting, cybercrime, national security and the use of personal information.

The merging of some of these areas is not new. Some, such as cybercrime and national security, have had intersecting points for some time now. However, during 2016, we expect these intersections to become more frequent and more pronounced.

As organizations increasingly monitor employees in an effort to track cyber threats, for example, they need to be doing so in a way that balances security with employee privacy. Similarly, the Securities and Exchange Commission in the US is considering requiring organizations to report on cybercrime or cybersecurity risks as part of their financial reporting because of the calamitous financial impact a major cyber breach can have.

In the context of this mashup, organizations need to be mindful of the impact these overlaps have and how they address the associated privacy risks. Specifically, organizations should be thinking about three trends that will improve the level of privacy accountability that all stakeholders, from customers and clients to vendors to governments themselves, expect.

1. Governance

Given the increasing complexity of the cyber world, organizations can no longer rely solely on ad hoc privacy processes to protect personal data. Rather, they need a robust privacy program that demonstrates accountability throughout the privacy life cycle and effectively interacts with other parts of the organization that process personal information.


Report from the field: top-level engagement in cybersecurity

Each year, EY conducts its Global Information Security Survey (GISS), asking information security executives around the world about the key cybersecurity trends and emerging issues dominating their focus. Although our research reveals great strides in the highest organizational levels taking ownership of information security, much more needs to be done. The results of this cybersecurity survey of 1,755 respondents were published in our GISS 2015 report, “Creating trust in the digital world.”

As an essential part of the GISS 2015, we asked 630 participants to answer questions focused on the privacy issues facing organizations today: many of the findings quoted in this report are taken from both sets of results.

We would like to thank all our clients who took part and are especially grateful to those who contributed the insights provided in this report.

Dr. Sagi Leizerov
EY Global Privacy Leader
[email protected]

2. Rigor

Promises are fine, but stakeholders are also demanding proof of accountability, adopting the motto “trust, but verify.” To meet the rigors of verification, and in response to increasing cybersecurity risks, auditors acknowledge that they are serving as independent verifiers not only of financial reporting, but also of cybersecurity reporting through Reports on Controls at a Service Organization Relevant to Security, Availability, Processing Integrity, Confidentiality and Privacy (SOC 2).

3. Trust in service providers

Just as customers are looking to trust organizations to be accountable for their privacy, organizations are looking to trust third-party providers with their data. Following the financial crisis in 2008, the demand for cloud services spiked, primarily as an effective means of cost-cutting — even as organizations worried about the cybersecurity risks. Now, organizations are moving to the cloud because cloud services providers are seen to have more sophisticated cybersecurity programs to prevent advanced attacks than in-house resources can provide. Moving forward, trust in third-party providers will become the critical foundation upon which organizations build their businesses for a digital era.

Looking forward, expect these trends in privacy management to become ever more critical as the lines among financial reporting, cybercrime, national security and use of personal information blur into a single overriding issue for which organizations will be accountable.


Creating trust in the digital world: EY’s Global Information Security Survey 2015

www.ey.com/GISS2015


The view today: Building on the key trends from 2015

As more aspects of an individual’s life move online, their propensity to share personally identifiable information, whether intentionally or inadvertently, has risen exponentially. The digital age abounds with opportunities to connect and grow and expand in every direction.

Yet, as individuals, organizations and governments explore these opportunities, someone has to be responsible for monitoring — and managing — the privacy risks. Given the speed of change, regulators and individuals alike are increasingly looking to organizations to assume this responsibility. And they are demanding proof.

Digital disruption, government-mandated organizational accountability for privacy, and the concept of “trust but verify” are the 2015 trends we explore.

4 European Commission, “Progress on EU data protection reform now irreversible following European Parliament vote,” 12 March 2014, http://europa.eu/rapid/press-release_MEMO-14-186_en.htm.

5 EY’s Global Information Security Survey 2015 privacy questionnaire.

When I think about the Internet of Things, I actually refer to it as the “Internet of Everything”: 50 billion sensors, devices and information gathering technology that are becoming increasingly integrated into our world. Within this context, strategically I see four aspects — four “Es” — of the Internet of Things that have created both a qualitative and a quantitative shift in how companies think about privacy.

1. Internet of Everything. The Internet of Everything looks at the Internet of Things as a whole. This includes factory sensors, global positioning systems, wearable devices (Google Glass, Fitbit, etc.), and “smart” technology that tells you when it’s time to order food or when it’s time for a medical checkup. Right now, there’s a huge market out there to build cheap sensors to connect ordinary things to the internet, and to add content and capacity into traditional telecommunications type networks to enable that connection.

2. Internet of Everyone. This is where privacy as a notion of the authorized processing of personally identifiable information according to fair legal and moral principles lives. In the process of the Internet of Everything gathering and transferring information, the Internet of Everyone has to have a say. If the Internet of Everything and the data associated with it offer a quantitative distinction, the Internet of Everyone identifies a qualitative difference. We need to be designing the Internet of Things so that everyone can feel as if they have the right level of control and management over their data.

3. Internet of Ethics. The Internet of Ethics challenges us to ask what our ethics are in a world of devices that do not recognize our cultural barriers. Around the world, cultural norming varies vastly from household to household. We need to have a really robust, cross-cultural and cross-generational discussion around the Internet of Ethics that considers all kinds of different perspectives, as well as the quantitative and qualitative differences when we think about exponential data flow.

4. Internet of Experience. People like the experience of being able to communicate on their smartphones and tablets. However, the Internet of Experience isn’t only about what the technology can do for us, but also what experience we want to have, and who gets to control these experiences. As well, we need to consider whether all of these billions of micro-experiences need to be recorded, tracked and saved. We need to do a better job of documenting what’s important and deleting the rest.

For the most part, we tend to see the Internet of Things as technology that gathers data for the purposes of making our lives easier or more enlightened. But the missing ingredient within all of this quantitative data is the human element — the qualitative differences that explain what the data can’t. Ultimately, in the age of the Internet, the age of the Internet of Things, the most important thing we need to remember is to be human.

Privacy in the age of the Internet of Everything

Michelle Dennedy
Vice President and Chief Privacy Officer
Cisco Systems, San Jose, CA, USA

Digital disruption threatens privacy

Digital technology continues to disrupt everything and everyone in every industry around the globe. Social, mobile, big data, cloud, Internet of Things: all of these technology advances are fundamentally altering how companies do business. They are creating massive opportunities for companies to develop new products and services, strengthen relationships with customers and employees, and expand into new territories.

Social media enables organizations to interact directly with customers and employees. Mobile is becoming the primary channel for commerce and an indispensable tool for work. Cloud is increasingly being adopted as the go-to data storage option for anywhere, anytime access. Big data enables organizations to understand the individual preferences of their customers and tailor their products and services accordingly. And the Internet of Things makes everything smarter: refrigerators that can tell homeowners when their milk is going bad; thermostats that can automatically adjust based on peak loads and rates; and smoke detectors that can let you know if there’s a problem when you’re not at home.

Yet for all the opportunities digital technologies present, they also come with an abundance of new and unforeseen risks and complexity when it comes to dealing with personal information. Who, for example, is monitoring — and protecting — all of the data associated with these “smart” technologies? Likewise, as individuals and organizations entrust their data to third-party cloud providers, how can they make sure providers are worthy of that trust? And what about all that big data that organizations are collecting to personalize customer experiences, improve products and generally learn more about their target audiences? Who’s making sure that all that personally identifiable data is both safe and scrubbed of anything that may jeopardize an individual’s privacy? The answers to some of these questions, according to EY’s GISS 2015, are less than satisfying.

In this year’s GISS survey, 54% of privacy questionnaire respondents say that their organization has no formalized requirements for using big data while addressing its privacy obligations. Similarly, although 46% of respondents plan to increase their use of social media in the coming year, 37% have no formalized requirements to address privacy concerns related to social media. As for minimizing personal identification of data, 61% of respondents say that they either have no mandate to minimize or de-identify personal information, or they only do it in unique circumstances.

Innovation, convenience, customer satisfaction and profitable growth are all important, but they cannot come at the expense of personal privacy. Increasingly, organizations are compelled to determine not only what information they collect, but how they collect it, store it, use it, maintain it, and protect it.

Current privacy regulatory regimes may not be enough to address privacy issues in digital ecosystems where data flows across regions and continents, and where different players are subject to different laws and users can access the services anywhere. As such, regulators and individuals are looking to organizations to assume the mantle.

Message from regulators to organizations: take responsibility

In 2010, at the 32nd International Conference of Data Protection and Privacy Commissioners in Jerusalem, Privacy by Design (PbD) gained international recognition with the signing of the Privacy by Design Resolution. Five years later, PbD, which advocates that all new business processes and technologies be created with privacy already embedded into them, has become the international privacy standard, translated into multiple languages. Further, the United Nations, the European Parliament and the US government have all indicated that PbD is absolutely critical in maintaining personal privacy.

Yet, despite its international reputation as the leading privacy standard, only 18% of the 630 GISS privacy questionnaire respondents use PbD as part of how they create new processes and technologies.5 Another 24% plan on including PbD in the near future. However, a full 38% don’t specifically address privacy at all.

For organizations all over the world, this laissez-faire approach to managing privacy has to change, particularly as regulators mandate that organizations do so.

In the EU, data protection reform produced MEMO/13/923 and MEMO/14/60, which improve data protection rules and put citizens back in control of their data. Specifically, PbD became part of the fundamental principles in EU data protection rules that put privacy front and center, rather than treating it as an afterthought.

In October 2014 in Australia, the Commissioner for Privacy and Data Protection (CPDP) formally adopted PbD as a core policy to strengthen information privacy management in the Victorian public sector.


“ ‘Privacy by Design’ and ‘Privacy by Default,’ will become essential principles in EU data protection rules — this means that data protection safeguards should be built into products and services from the earliest stage of development, and that privacy-friendly default settings should be the norm — for example, on social networks.”

European Commission4


In 2012 in the US, a Federal Trade Commission (FTC) report, Protecting Consumer Privacy in an Era of Rapid Change, proposed a framework that made PbD a core value for business and policymakers. Then in January 2015 the FTC issued a staff report that addresses privacy and security related to the Internet of Things, and makes recommendations based on PbD.6

Also in the US in 2015, the White House released a draft of the proposed Consumer Privacy Bill of Rights. The bill is intended to govern the collection and dissemination of consumer data. It would also require organizations to disclose data breaches in a timely manner to mitigate the risk of identity theft. Although the title suggests the bill of rights is for consumers, it has the potential to help both consumers and businesses.

6 FTC Staff Report, Internet of Things: Privacy & Security in a Connected World, January 2015, https://www.ftc.gov/system/files/documents/reports/federal-trade-commission-staff-report-november-2013-workshop-entitled-internet-things-privacy/150127iotrpt.pdf.

7 Rich, Jessica, “Beyond Cookies: Privacy Lessons for Online Advertising,” AdExchanger Industry Preview 2015, 21 January 2015, https://www.ftc.gov/system/files/documents/public_statements/620061/150121beyondcookies.pdf.

8 EY, Navigating the European Court of Justice’s data protection ruling, 2015.

In the automotive industry, the notion of the connected car is not new. The emergency “e-call” feature in BMW cars has been available since 1999. What is new is the level of connectedness in vehicles as digital technology revolutionizes the driving experience.

This is a challenge for the automotive industry because the development cycle of a vehicle is seven or eight years, whereas the development cycle for technology can be as short as seven to 14 days.

BMW has a Privacy by Design (PbD) policy that we maintain and enhance. We also spend time training the appropriate resources within the business to adhere to the principles outlined in the policy. Part of that policy states that at a very early stage, our privacy professionals need to be involved in the development of new features and apps.

More importantly, we have defined methods for privacy that consider and synchronize the traditional long-term development cycle of classic car production with the extremely short-term IT development life cycle.

We have also achieved BCR certification to establish and maintain consistent and adequately protective data privacy policies across all of our legal entities around the world.

We believe that PbD and BCR offer robust frameworks for establishing privacy programs. However, today’s and tomorrow’s connected cars offer automated driving features. The questions automotive companies are asking are: who should be controlling these features and who should control the data? The e-call feature is a good example. If the driver has the option to deactivate the feature, but the next driver doesn’t reengage it, experiences an accident and no one comes to the rescue, who is responsible? One person’s choice is impacting another’s driving experience. Similarly, there is the issue of who controls the data that a vehicle collects. Does the driver or owner of the vehicle control it? Or does the original equipment manufacturer (OEM)? The connected car creates conflicts between consumer choice and the OEM. Much like the health care industry, consumers and OEMs alike may look to regulators to resolve these conflicts.

Ultimately, privacy may become a feature that influences a consumer’s car-buying decision so it’s important that automotive manufacturers get it right early. Privacy professionals add that level of value.

The conundrum of privacy in the connected car

Werner Bednarsch
Head of Group Data Protection
BMW Group, Munich, Germany


Proof of privacy programs emphasizes accountability

BCR in the EU and CBPR in Asia-Pacific come of age, while Safe Harbor in the US faces a setback

In today’s global and increasingly digital economy, processing and transferring personal information across borders is a fundamental part of an organization’s daily activities. That’s why standard-setters in the US, EU and Asia-Pacific regions established frameworks and guidelines that set the tone for cross-border privacy protection. There are three recognized programs for transferring personal data: the US-EU Safe Harbor Framework, Binding Corporate Rules (BCR) and the Asia-Pacific Economic Cooperation (APEC) Cross-Border Privacy Rules (CBPR).

To date, more than 5,000 entities globally have certified under the US-EU Safe Harbor Framework. However, on 6 October 2015, the European Court of Justice (ECJ) declared the US Safe Harbor agreement to be invalid because it does not protect data transferred from the EU to the US against access by US intelligence agencies.8 Although we have yet to understand the full impact, we do know that Safe Harbor will have to change. EU and US authorities are working on a Safe Harbor 2.0 framework to meet a January 2016 deadline imposed by the ECJ. However, it remains a work in progress. In the meantime, Safe Harbor cannot be relied on as a certified framework for transferring data from the EU to the US in compliance with EU data protection laws. Organizations using the Safe Harbor framework will need to review their data protection policies, contracts, and terms and conditions as a result of this ruling. They will also want to consider other mechanisms, such as BCR, to meet EU requirements.

Approximately 60 companies have successfully applied for approval of their BCRs — 20 more companies than the number approved when we reported on BCR in 2013. Adopted by APEC in 2011, CBPR has gained traction beyond the Asia-Pacific, with Mexico joining in 2013, Japan in 2014 and Canada in 2015. These certifications push more responsibility from regulators to companies themselves, requiring companies to have relatively mature privacy programs in place, from having a privacy official, to training, to maintaining the program.

Until the October 2015 ruling, there was every indication that all three programs were reaching or had already reached a tipping point. Yet, despite their increasing adoption across the globe, according to EY’s GISS 2015 privacy questionnaire, of the 43% of respondents who plan on transferring personal information out of the EU, only 10% are in the process of or have already obtained BCR approval. Similarly, of the 42% of respondents intending to transfer personal information out of regulated APEC countries, only 6% are in the process of or have already obtained CBPR approval.

As accountability for privacy protection continues to shift to corporations, companies who are on the fence about implementing a certifiable privacy framework will need to make a move sooner rather than later.

“Companies should build-in privacy protections at every stage as they develop their products and services…Privacy protections are most effective when they are part of a company’s fundamental business model and not overlooked or added later as an afterthought.” 7

Jessica Rich, Director, Bureau of Consumer Protection

EY’s GISS privacy questionnaire asked 630 respondents “when it comes to adopting BCR as a means of legitimizing those transfers out of the EU, which of the following statements best describes your organization?” (Select one)

Numbers may not sum to 100% due to rounding.

• We don’t transfer personal information out of the EU countries: 57%
• Undecided as to whether to take it or not: 17%
• We are in the process of, or have already obtained, BCR approval: 10%
• BCR is not a path we intend to take: 9%
• It is very likely that we will apply for BCR but don’t know when: 7%


At Corning Incorporated (Corning), we take the protection of personal data seriously, particularly when it is being transferred across borders among the Corning group of companies.

Four and a half years ago, Corning implemented policies that would enable us to adhere to the principles associated with US-EU and US-Swiss Safe Harbor Frameworks. These certify that we comply with appropriate standards for privacy protection when transferring data between Corning’s US and EU entities. When we started working on complying with Safe Harbor, we made a conscious effort to address as many BCR requirements as possible. We have since gone on to receive BCR certification, making it possible to transfer personal data between Corning entities throughout the world. Formally adopting the BCR requirements was easier mainly because we already had many of the basic principles in place — a data privacy office, a corporate privacy policy, privacy impact assessments that identified current practices and any gaps, and an inventory system for third parties.

Furthermore, we had embedded privacy into our culture. The principles that certify us under BCR and Safe Harbor are the same principles we apply globally when it comes to protecting personal information.

BCR, Safe Harbor and other certifications serve as motivational compliance tools for organizations to be accountable for the personal information in their care. We found at Corning that compliance is much easier if organizations implement a strong privacy foundation and accountability mindset from the outset.

A strong foundation for privacy accountability makes compliance easier

Kevin Murphy, CISM, CIPM, CIPT, CIPP/E/U
Information Security and Data Privacy Officer
Corning International, Corning, NY, USA


More countries adopting breach notification regulations

Despite the financial, operational and reputational damage a privacy breach can cause, breach notification remains a tactical rather than a strategic imperative. Executives continue to be more interested in compliance than in understanding the risks that precipitated the breach.

As we have stated in previous Privacy trends reports, this leaves individuals increasingly vulnerable both to aggressive organizations seeking to create competitive advantage and to nefarious actors looking to profit from unauthorized access to personally identifiable information.

In the US, in the absence of federal regulatory action to produce standardized breach notification requirements, the states are going it alone. In 2015, at least 32 states introduced or are considering security breach notification bills or resolutions. Many of these bills would amend existing data breach laws to require:

• Companies to report breaches to state attorneys general or other central state agency and expand the definition of “personal information” to include medical, insurance or biometric data (fingerprints)

• Businesses and governments to implement security plans or security measures

• Educational institutions to notify parents or government entities if a breach occurs.

This patchwork of data breach laws across 47 states makes it hard for organizations to implement consistent breach notification programs that address every state’s compliance requirements. Moreover, there is no single standard reporting mechanism.

In June 2015, Canada’s mandatory federal security breach notification law received royal assent. Organizations are now required to notify the Privacy Commissioner and affected individuals of any security breach involving personal information where it is reasonable to expect that the breach will create a real risk of significant harm to an individual.

Most recently, in December 2015, the EU issued a Network and Information Security Directive (NIS Directive), the first EU-wide legislation for cybersecurity. In addition to improving cybersecurity capabilities in EU Member States and enhancing member state cooperation on cybersecurity, it requires “operators of essential services in the energy, transport, banking and healthcare sectors, and providers of key digital services like search engines and cloud computing, to take appropriate security measures and report incidents to national authorities.” 9 Once the text of the NIS Directive is formally approved, Member States will have 21 months to implement the directive into national laws. This new law will have particular implications for organizations deemed digital service providers — online retailers, cloud computing services and search engines.

9 “Commission welcomes agreement to make EU online environment more secure,” European Commission, 8 December 2015, http://europa.eu/rapid/press-release_IP-15-6270_en.htm.

36% of EY’s GISS cybersecurity respondents (more than a third) say that it is unlikely that they would be able to detect a sophisticated attack.

EY’s GISS privacy questionnaire asked 630 respondents “when it comes to adopting CBPR as a means of legitimizing transfers out of regulated APEC countries, which of the following statements best describes your organization?” (Select one)

Numbers may not sum to 100% due to rounding.

• We don’t transfer personal information out of regulated APEC countries: 58%
• Undecided as to whether to take it or not: 21%
• CBPR is not a path we intend to take: 8%
• It is very likely that we will apply for CBPR but don’t know when: 7%
• We are in the process of, or already obtained, CBPR approval: 6%


How can companies improve their accountability and become trend leaders?

Moving forward: Actions to help improve accountability in 2016

The pace at which new and emerging technologies are penetrating and disrupting every aspect of our lives is far outstripping the pace at which lawmakers can keep up. As a result, regulators — and consumers — are looking to companies to assume accountability for privacy.

There are eight actions companies can take to help improve their accountability and become trend leaders:

1. Develop KPIs for privacy
2. Build privacy impact assessments into the system development life cycle
3. Prepare a robust incident response plan — and prepare to respond
4. Monitor for insider threats
5. Know the assurance options
6. Implement identity and access management for privacy
7. Consider right-on-time notice of privacy policy
8. Get consensus on an approach to de-identification

1. Develop KPIs for privacy

In EY’s GISS privacy questionnaire, we asked 630 respondents to rank which KPIs they think are most important to track, as well as the KPIs that executive leadership thinks are most important. Many are the same. Where the results deviate is where it gets interesting. Both rate progress in addressing compliance obligations, tracking the number and impact of breaches and progress in addressing privacy risks as their top three priorities. However, management appears to be far more concerned about priorities two and three than their executive counterparts. Attitudes deviate even further on management’s priorities around measuring the company’s ability to close known gaps and its ability to complete and maintain inventory of personal information. For executives, tracking complaints received trumped closing known gaps as their number four priority. And maintaining an inventory of personal information fell to number seven on their list, below closing known gaps (number five) and collaborating with the business (number six).

For privacy accountability to be successful, leadership and management need to be aligned in terms of both priorities and culture when it comes to managing privacy.

Organizational leaders need to understand that it’s no longer enough to know what they are tracking. They have to know why they are tracking it and which KPIs will enable them to develop a robust privacy program that zeros in on accountability, within and outside the organization.

Companies will want to consider adopting KPIs for privacy in the same way they do for any other performance-based program. These KPIs can be tied to the company’s existing GRC program. In a recent survey conducted by the International Association of Privacy Professionals (IAPP) and sponsored by EY, IAPP-EY Annual Privacy Governance Report 2015, 31% of respondents indicate that they plan to increase their use of GRC tools in the coming year, with more than half ranking data protection controls as their most used GRC tool.10

Automating KPIs helps enable companies to gather and analyze accurate privacy data that they can then use to develop, implement, monitor and maintain robust privacy programs that help increase compliance with regulations and meet increasing consumer demands.
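To make the idea of automated privacy KPIs concrete, the sketch below computes a few of the indicators named above (progress against compliance obligations, breach counts and impact, and regulator notification) from hypothetical records. The record structures, field names and sample figures are illustrative assumptions, not a prescribed schema; real figures would be pulled from an organization's GRC or incident-tracking tooling.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical records; in practice these would come from a GRC or incident-tracking tool.
@dataclass
class Obligation:
    name: str
    addressed: bool

@dataclass
class Incident:
    occurred: date
    records_affected: int
    notified_regulator: bool

obligations = [
    Obligation("Breach notification procedure", True),
    Obligation("Data inventory maintained", False),
    Obligation("Vendor contracts reviewed", True),
]

incidents = [
    Incident(date(2015, 3, 2), 1200, True),
    Incident(date(2015, 9, 14), 40, False),
]

# KPI 1: progress in addressing compliance obligations.
compliance_progress = sum(o.addressed for o in obligations) / len(obligations)

# KPI 2: number and impact of breaches and incidents.
incident_count = len(incidents)
records_exposed = sum(i.records_affected for i in incidents)

# KPI 3: share of incidents reported to the regulator.
notification_rate = sum(i.notified_regulator for i in incidents) / incident_count

print(f"Compliance obligations addressed: {compliance_progress:.0%}")
print(f"Incidents this period: {incident_count} ({records_exposed} records affected)")
print(f"Incidents notified to regulator: {notification_rate:.0%}")
```

Once indicators like these are produced automatically each reporting period, they can be trended over time and reconciled against the priorities that management and executive leadership say matter most.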


10 IAPP-EY, IAPP-EY Annual Privacy Governance Report 2015.


KPI: top priority for management / top priority for executive leadership

• Progress in addressing compliance obligations: 42% / 41%
• Number and impact of breaches and incidents: 42% / 32%
• Progress in addressing risks to privacy: 39% / 23%
• Closing known gaps: 32% / 19%
• Complete and maintain inventory of personal information: 31% / 12%
• Complaints received: 22% / 22%
• Degree of collaboration with the business: 14% / 23%


2. Build privacy impact assessments into the system development life cycle

Privacy impact assessments (PIAs) analyze how personally identifiable information is collected, used, shared and maintained. They aid organizations in identifying and mitigating privacy risks within projects and across the entire enterprise. PIAs are not new. But where they were once optional, today they are leading practice. In the recent IAPP-EY survey, 59% of privacy professionals indicate that they use PIAs, with approximately 50% saying that they are part of their organization’s system development life cycle process.11

One of the PIA leaders is the UK, the first country in Europe to develop and disseminate a privacy impact assessment methodology. In 2012, the EC’s proposed Data Protection Regulation made PIAs mandatory for both public and private sector organizations throughout Europe.

We see 2016 as the year that organizations build PIAs into their system development life cycle so that they are conducted consistently for every project, every time.
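As an illustration of how a PIA check might be wired into a development life cycle, the sketch below evaluates a minimal, hypothetical PIA questionnaire for a project and blocks release when high-risk answers lack a documented mitigation. The questions, weights and threshold are assumptions for illustration only; a real PIA would follow the organization's own methodology.

```python
# Minimal sketch of a PIA gate that a release process could call for every project.
# The questionnaire and scoring rules below are illustrative assumptions.

PIA_QUESTIONS = {
    "collects_personal_data": 3,          # weight applied if answered "yes"
    "shares_data_with_third_parties": 2,
    "transfers_data_across_borders": 2,
    "retains_data_beyond_one_year": 1,
}

def pia_risk_score(answers: dict) -> int:
    """Sum the weights of all questions answered 'yes'."""
    return sum(weight for q, weight in PIA_QUESTIONS.items() if answers.get(q, False))

def pia_gate(answers: dict, mitigations: list, threshold: int = 4) -> bool:
    """Allow the project to proceed if risk is low, or if risks are documented with mitigations."""
    if pia_risk_score(answers) < threshold:
        return True
    return len(mitigations) > 0

project = {
    "collects_personal_data": True,
    "shares_data_with_third_parties": True,
    "transfers_data_across_borders": False,
    "retains_data_beyond_one_year": True,
}

if pia_gate(project, mitigations=[]):
    print("PIA gate passed: project may proceed.")
else:
    print("PIA gate failed: document mitigations or rework the design before release.")
```

Embedding a check like this at a fixed stage of the development process is one way to make PbD operational rather than optional.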

3. Prepare a robust incident response plan — and prepare to respond

In EY’s GISS, only 7% of the 1,755 respondents to the cybersecurity questionnaire claim to have incident response programs for cyber attacks that include third parties and law enforcement and are integrated with their broader threat and vulnerability management function — a percentage that remains unchanged from last year. The next question becomes: of the 7% that have incident response programs in place for cyber attacks, how many have incorporated responses to privacy breaches as part of their program?

In April 2015 in the US, the cybersecurity unit of the Department of Justice released new guidance on leading practices for responding to data breaches. The guidance is divided into four sections that address what steps to take before a cyber attack, how to respond when a cyber attack occurs, what not to do following a cyber incident and what to do in the aftermath.12

First and foremost, the guidance recommends that before a breach occurs, organizations should conduct risk assessments to identify and prioritize critical assets, data and services. From a privacy perspective these risk assessments should particularly focus on identifying and flagging any personally identifiable information. The guidance goes on to suggest that organizations develop a robust incident response plan that outlines the concrete steps they need to take in the event of a cyber breach.13
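A hedged sketch of that first step: scanning a hypothetical data inventory for fields likely to contain personally identifiable information so the affected assets can be prioritized in the incident response plan. The inventory, field names and matching heuristics are assumptions; a real program would rely on its own data classification scheme.

```python
import re

# Hypothetical inventory of systems and the fields they store.
DATA_INVENTORY = {
    "crm_db": ["customer_name", "email", "purchase_history"],
    "hr_system": ["employee_id", "salary", "home_address"],
    "web_analytics": ["page_views", "session_length"],
}

# Simple heuristics for field names that often hold personally identifiable information.
PII_PATTERNS = [r"name", r"email", r"address", r"ssn", r"phone", r"salary", r"_id$"]

def flag_pii_assets(inventory: dict) -> dict:
    """Return, per system, the fields whose names match a PII heuristic."""
    flagged = {}
    for system, fields in inventory.items():
        hits = [f for f in fields if any(re.search(p, f) for p in PII_PATTERNS)]
        if hits:
            flagged[system] = hits
    return flagged

# Systems with flagged fields would be ranked first in the incident response plan.
for system, fields in flag_pii_assets(DATA_INVENTORY).items():
    print(f"{system}: prioritize for breach response (PII fields: {', '.join(fields)})")
```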

Governments in Canada, the EU and the UK all have similar advice regarding incident response. Yet, for global organizations, a country-by-country response is a poor option. Instead, multinational organizations may want to consider developing a robust, consistent, unified and clearly articulated approach to addressing incidents related to privacy breaches before they occur.

Given that more than a third of EY’s GISS cybersecurity respondents say that it is unlikely that they would be able to detect a sophisticated attack, there’s a good chance that one or more readers of this report is currently experiencing a breach and doesn’t even know it yet.

In early December 2015, the Australian government proposed a new law that will require organizations operating in Australia to disclose serious breaches of people’s information within 30 days if the breach has the potential to cause serious harm. The concept of mandatory data breach notification has been around for a long time. In fact, the previous government introduced similar legislation, but it was not put to parliament for a vote.

Broadly speaking, Telstra is supportive of a mandatory breach notification law because we believe customers expect that they should be notified if there is a significant breach that can cause them harm. We haven’t seen the final details of the legislation. However, we believe that organizations should be doing everything they can to notify, remediate and manage to the best of their ability any such lapses. Certainly, this is what Telstra does.

However, we need to make sure that laws such as these do not produce notification fatigue. A lack of harmonization in breach notification laws around the world means organizations need to understand the different laws in different jurisdictions. Does a breach require telling the regulator before the consumer? When contacting the consumer, which form must the notification take — verbal, written? And then there’s the additional requirement of notification based on the size of the breach. We’re conscious of over notifying customers and creating levels of anxiety for breaches that will have no material impact. That said, we are fully in favor of legislative instruments for instances where serious harm can occur.

In recognition of the anxiety customers feel around breach notification, the Australian Privacy Commissioner has released guidance on what he expects should happen if there is a data breach, but at the moment, without a law, it’s entirely voluntary.

Telstra already has a framework in place that ensures we not only meet our legal obligations, but also our customers’ expectations on privacy and data protection. As a large organization, we have large numbers of resources invested in compliance and management of regulatory requirements. We will be well-positioned to meet any compliance obligations the new breach notification law imposes.

The same cannot be said for smaller organizations with fewer resources, less experience, and no back office IT systems to support the requirements of the new law. For them, this new law will likely pose significant challenges — ones worth resolving to keep customers informed and to remediate and mitigate the impact today’s data breaches can cause.

Proposed breach notification law eases customer anxieties

Ben Carr
Chief Privacy Officer
Telstra, Melbourne, Australia

4. Monitor for insider threats

In EY’s GISS 2015, when we asked the cybersecurity respondents what they consider to be the most likely source of a cyber attack, 56% indicated their employees, a 12 percentage point increase from last year, and second only to criminal syndicates.14

When asked whether they monitor their employees’ use of data, 42% indicate they have formalized requirements for monitoring employees while balancing their privacy obligations. That said, a majority of respondents admit that protecting proprietary information is more important than protecting employee privacy.

Although a healthy minority monitor their employees’ use of data, few organizations assess their employees’ adherence to data protection requirements through their performance evaluation process. Instead, organizations rely on more traditional, administrative mechanisms, such as computer-based education, emails, posters and agreements. Many of these mechanisms focus on communicating expectations rather than emphasizing deterrence.

Understandably, organizations want to use monitoring tools to keep an eye on their data. However, these tools can also end up monitoring an employee’s personal information. This is particularly evident in the case of bring your own device (BYOD), which is now ubiquitous across organizations.


11 IAPP-EY, IAPP-EY Annual Privacy Governance Report 2015.

12 “Best Practices for Victim Response and Reporting of Cyber Incidents,” Cybersecurity Unit, Computer Crime & Intellectual Property Section, Criminal Division, U.S. Department of Justice, April 2015, http://www.justice.gov/sites/default/files/opa/speeches/attachments/2015/04/29/criminal_division_guidance_on_best_practices_for_victim_response_and_reporting_cyber_incidents.pdf.

13 Ibid.

14 EY’s Global Information Security Survey 2015 “Creating trust in the digital world.”


7% of GISS cybersecurity respondents have incident response programs for cyber attacks that include third parties and law enforcement and are integrated with their broader threat and vulnerability management function.

56% of GISS cybersecurity respondents consider their employees to be the most likely source of attack — second only to criminal syndicates.


Rather than monitoring employees, which is a less than ideal option for all parties, organizations may want to consider some or all of the following options to better balance privacy and regulation with the need to monitor for insider threats:

• Partitioning. This would help to alleviate at least some of the privacy issues associated with dual-use devices (laptops, tablets and smartphones). Partitioning would provide the device with two different desktops — one for work and one for personal — located on two separate components of the device’s hard drive.

• Guest network. A guest network separate from the main network allows employees to use their personal device to gain access to the web directly, perhaps even through a work-only email account.

• Sandboxes. Using third-party services or an organization’s own coding to create “sandboxes” enables organizations to separate company data and company-issued applications from any interaction with personal data, applications or online services.

As the workforce becomes increasingly mobile and technologies continue to evolve, finding the right balance between personal privacy and corporate security is more important than ever.

5. Know the assurance options

Service providers are often asked to obtain an independent assessment of their privacy and data security practices. Previously, organizations used Statement on Auditing Standards No. 70 reports (SAS 70 reports). However, these reports are not intended to address privacy or security.

In 2011, AICPA issued a new framework — Reports on Controls at a Service Organization Relevant to Security, Availability, Processing Integrity, Confidentiality and Privacy (SOC 2). SOC 2 reports provide independent assurance on a wider range of service provisions than simply financial reporting. They enable service providers to be transparent and accountable to their clients. Additionally, organizations that outsource can use the SOC 2 reports to be accountable to their shareholders and other stakeholders.


With a market capitalization of $124b, revenues of approximately $50b, and more than 400 diverse brands globally, Unilever is a large organization with unique challenges in terms of data privacy. When we embarked on a cross-enterprise, cross-geography privacy program in 2013, we were interested not only in developing and implementing a robust global privacy program, but also in monitoring and measuring its success.

When we officially launched our privacy program, we broke it down into four or five main components: 1) governance, which encompassed accountability, ownership and responsibility, as well as establishing a data protection network; 2) policy, which related to standards, guidance and legal requirements; 3) privacy awareness and training; and 4) operationalizing privacy and introducing privacy monitoring, which included establishing KPIs.

With regard to the last component, we’ve been implementing a privacy tracking and monitoring tool, which to date has been rolled out to more than 50% of our data protection officer network. Using this self-assessment tool, data protection officers are responsible for checking that the controls we’ve stipulated in our global privacy standard are operating effectively. We’ve also been working with corporate audit to educate and train data protection officers on the types of issues our organization faces, including where data is stored and how long we retain it. Additionally, we’ve trained our procurement people in terms of what model clauses should look like as well as responses.

Based on the standards, policies and tools that we’ve implemented, we are getting to a place where we can monitor and measure the effectiveness of our program. So far, we have been able to test the monitoring across our European countries to determine how they are doing against our KPIs. We report the results of our progress on a quarterly basis to the Chief Legal Officer and the Quarterly Information Protection Council.

Our next step is to test the awareness level of our employees through surveying, again, to measure the effectiveness of our controls, to raise awareness, and to identify and remediate any issues that arise. As an unregulated industry, our organization is unaccustomed to adhering to compliance requirements. This has presented some unique challenges, as well as some opportunities to shift our corporate culture and embed accountability for privacy into the senior management levels of the organization. It makes our efforts as much a process as a program, helping the business to understand its responsibilities through awareness and KPIs.

Monitoring program effectiveness through KPIs

Steve Wright
Chief Privacy Officer
Unilever, London, UK

SOC 2 audits are conducted based on the AICPA’s five trust services principles:

1. Security. The system is protected against unauthorized access (both physical and logical).

2. Availability. The system is available for operation and use as committed.

3. Processing integrity. System processing is complete, accurate, timely and authorized.

4. Confidentiality. Information designated as confidential is protected as committed.

5. Privacy. Personal information is collected, used, retained, disclosed and destroyed in conformity with the commitments in the entity’s privacy notice and with criteria set forth in Generally Accepted Privacy Principles issued by the AICPA and CICA. The principles of security, availability and processing integrity are used to evaluate whether a system is reliable.15

SOC 2 reports play a different role in giving consumers, commercial customers, shareholders and the market at large the confidence that an organization is meeting a comprehensive standard when protecting their data.

6. Implement identity and access management for privacy

Identity and access management for privacy differs from traditional identity and access management programs because it’s about privacy, not security. Organizations can control access by modifying or de-identifying the data.

Identity and access management for security focuses on determining what level of detail a user needs, which is determined, in part, by role. Identity and access management for privacy needs to be tailored not by role, but by customizing data elements with privacy in mind.

Organizations can be smarter about how they handle privacy by providing users with the least-specific data based on a user’s need. For example, a company provides reports to 1,000 users over the course of a year. Currently, every one of those 1,000 people gets access to the complete set of data. However, not everyone needs complete access. Identity and access management for privacy helps companies to evolve their privacy programs so that only 100 of the 1,000 get a complete set of data. The rest of the people get a subset of the data based on their need.
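To illustrate the idea of serving the least-specific data a user actually needs, the sketch below returns either a full record or a reduced view depending on a declared purpose. The purposes, field groupings and sample record are hypothetical assumptions, not a reference design.

```python
# Sketch of need-based data minimization: the same record is released at different
# levels of detail depending on the requester's declared purpose.
# Purposes and field groupings are illustrative assumptions.

FIELDS_BY_PURPOSE = {
    "fraud_investigation": ["name", "email", "birth_date", "last_login_ip"],  # full detail
    "marketing_analytics": ["age_band", "region"],                            # generalized only
    "service_reporting": ["region"],                                          # minimal
}

def minimized_view(record: dict, purpose: str) -> dict:
    """Return only the fields permitted for the given purpose."""
    allowed = FIELDS_BY_PURPOSE.get(purpose, [])
    return {field: record[field] for field in allowed if field in record}

customer = {
    "name": "Jane Example",
    "email": "jane@example.com",
    "birth_date": "1980-04-12",
    "age_band": "35-44",
    "region": "EU-West",
    "last_login_ip": "203.0.113.7",
}

print(minimized_view(customer, "marketing_analytics"))  # {'age_band': '35-44', 'region': 'EU-West'}
print(minimized_view(customer, "service_reporting"))    # {'region': 'EU-West'}
```

The design choice is that access decisions key off the purpose of use rather than the requester's role, so most users see only the generalized subset they need.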

The benefits of this approach as organizations move to implement it in the coming year are twofold: fewer vulnerabilities related to insider threats; and greater rigor for organizational accountability.

15 “Trust Services Principles and Criteria,” AICPA, www.aicpa.org, ©2006-2015 American Institute of CPAs, http://www.aicpa.org/interestareas/informationtechnology/resources/soc/trustservices/pages/trust%20services%20principles%E2%80%94an%20overview.aspx.




61% of GISS privacy questionnaire respondents say that they either have no mandate to minimize or de-identify personal information, or they only do it in unique circumstances.

37% of GISS cybersecurity respondents have no formalized requirements to address privacy concerns related to social media.


In 2012, Colombia’s congress issued a new statutory data protection law. Part of these regulations requires that the data controller within an organization must be in charge of the company’s data protection compliance program. As the Chief Information Security Officer for my organization, I was delegated that operational responsibility, along with my team. I now serve as the Chief Information Security and Privacy Officer.

Combining these two roles provides a unique perspective for me to look at the issue of data protection from both a security and a privacy standpoint. My team and I need to know what information needs to be protected, where it needs to be protected and how it needs to be protected. It’s not good enough to protect the information itself. We also need a better understanding of where the privacy issues are within the various functions of the organization, and to identify and implement policies and procedures to mitigate the associated risks.

When we talk about protecting information, we are, of course, talking about customer data and employee data and how to protect both: how to protect the privacy of our employees, but also how to ensure that our employees respect and protect our customers’ privacy too.

One of the biggest advantages I see to having dual responsibility for information security and privacy is our visibility and ability to trace the flow of information across the enterprise from start to finish so that we can ensure that the data is protected at every touch point. When I only held the role of Chief Information Security Officer, my primary focus was securing the data. As both Chief Information Security Officer and Chief Privacy Officer, my focus is now on the people as well as data security. This is a significant shift that enables us to look at more than the compliance implications — it enables us to look at the human implications.

The human side of data

Jeimy J. Cano M., PhD, CFE
Chief Information Security and Privacy Officer
Ecopetrol S.A., Bogota, Colombia


7. Consider right-on-time notice of privacy policy

Traditional privacy notifications and choice options have largely lost their meaning. Consumers blindly click on the “Accept” button without reading or understanding their privacy rights, often because the notice is several pages long and written in legalese. These notifications also tend to grant blanket consent for every subsequent interaction a consumer has with an organization.

A better option for organizations and consumers alike is right-on-time notification. This would require consumers to consent to each interaction. At the same time, for each of these interactions, organizations would have to explain what they plan on doing with a customer’s information and what options consumers have with regard to that use.
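A minimal sketch of how right-on-time notification might work in practice: before each use of a customer's data, the application checks whether consent has been granted for that specific purpose and, if not, presents a short, purpose-specific notice. The purposes, notices and in-memory store are hypothetical and for illustration only.

```python
# Sketch of per-interaction ("right-on-time") consent: each purpose is consented
# to separately, rather than relying on one blanket acceptance at sign-up.

consent_store = {}  # (customer_id, purpose) -> True/False

NOTICES = {
    "order_fulfilment": "We will use your address to deliver this order.",
    "marketing_email": "We would like to send you offers based on this purchase.",
}

def has_consent(customer_id: str, purpose: str) -> bool:
    return consent_store.get((customer_id, purpose), False)

def request_consent(customer_id: str, purpose: str, granted: bool) -> None:
    """Show the purpose-specific notice and record the customer's choice."""
    print(f"Notice to {customer_id}: {NOTICES[purpose]}")
    consent_store[(customer_id, purpose)] = granted

def use_data(customer_id: str, purpose: str) -> None:
    if not has_consent(customer_id, purpose):
        # In a real flow this would prompt the customer; here we simulate a decision.
        request_consent(customer_id, purpose, granted=(purpose == "order_fulfilment"))
    if has_consent(customer_id, purpose):
        print(f"Using {customer_id}'s data for {purpose}.")
    else:
        print(f"Skipping {purpose}: no consent from {customer_id}.")

use_data("cust-001", "order_fulfilment")  # consent granted for this interaction
use_data("cust-001", "marketing_email")   # declined, so the data is not used
```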

In the recent IAPP-EY survey, nearly a third of respondents indicate having completed initiatives around privacy choice and consent consolidation with a further third in the short- or long-term planning stages.16 It is unclear, however, whether these initiatives included a review and implementation of right-on-time notification.

In 2016, organizations should consider working towards making consent more detailed and relevant to the interaction, explaining the specific intended use of the data.

8. Get consensus on an approach to de-identification

Loosely defined, de-identification involves the scrubbing of data until any hint of an individual’s identity is gone. The purpose is to make the data safe from a privacy perspective, but useful from a big data/data analytics standpoint.

In EY’s GISS privacy questionnaire, we asked participants about their use of big data, as well as what steps they were taking to de-identify personal information. Over the last year, 37% of organizations have invested in identifying new uses for their data in the context of big data analysis. In the next year, 45% plan to invest more in big data. Yet 54% have no formalized requirements for using big data while addressing their privacy obligations. Further, for 61% there is either no mandate, or a requirement only in unique circumstances, to de-identify personal information.

As big data plays an increasingly important role in almost every decision a company makes, the debate over what data a company collects, stores, manages and protects will continue to escalate. It is in this context that the concept and the definition of de-identification grows increasingly critical. Yet, the definition, particularly when it’s considered in a legal context, remains vague — which may explain why only a little more than a quarter (27%) of survey respondents have a plan for de-identification.

However, that has not stopped regulators from trying. In the US, the Health Insurance Portability and Accountability Act (HIPAA) defines protected health information as identifiable data that “identifies the individual or for which there is a reasonable basis to believe [that the information] can be used to identify an individual.”17 The FTC doesn’t provide a definition per se. Instead, it offers advice to help organizations assess whether data is identifiable. However, there is currently no leading practice.

In Europe, the General Data Protection Regulation promotes techniques such as anonymization (removing personally identifiable information where it is not needed), pseudonymization (replacing personally identifiable material with artificial identifiers) and encryption (encoding messages so that only those authorized can read them) to protect personal data.
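As a simple illustration of how these techniques differ in practice, the sketch below (our own example, not drawn from the GDPR text or from any framework cited in this report) suppresses direct identifiers, coarsens a quasi-identifier and replaces a customer ID with a keyed hash, so that records belonging to the same person can still be linked for analysis without revealing who that person is. The secret key would need to be held and protected separately from the analytics environment; otherwise the pseudonyms can be recreated and reversed.

import hashlib
import hmac

# Assumption: in a real deployment the key lives in a key management service;
# a hard-coded value is used here only to keep the sketch self-contained.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(customer_id: str) -> str:
    # Replace a direct identifier with a stable artificial identifier (pseudonym).
    return hmac.new(SECRET_KEY, customer_id.encode("utf-8"), hashlib.sha256).hexdigest()

def de_identify(record: dict) -> dict:
    # Drop direct identifiers the analysis does not need and keep only a pseudonym for linkage.
    return {
        "customer_ref": pseudonymize(record["customer_id"]),  # pseudonymization
        "purchase_amount": record["purchase_amount"],          # non-identifying attribute kept as-is
        "postcode_prefix": record["postcode"][:3],             # quasi-identifier coarsened
        # name and email are suppressed entirely
    }

raw = {
    "customer_id": "C-1029",
    "name": "A. Person",
    "email": "a.person@example.com",
    "postcode": "SW1A 1AA",
    "purchase_amount": 42.50,
}
print(de_identify(raw))

Even after such steps, combinations of the remaining attributes can sometimes re-identify individuals, which is precisely why a shared definition of what counts as de-identified data matters.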

Two organizations have taken additional steps to help de-mystify de-identification:

1. Future of Privacy Forum (FPF) has developed a framework for de-identification.

2. The National Institute of Standards and Technology (NIST) has published the De-Identification of Personally Identifiable Information, which provides an overview of the different approaches and gaps associated with common de-identification techniques.

Despite these noble efforts, before organizations can move forward in developing and implementing a de-identification plan, the global community needs to agree on a common approach to de-identification.

In a recent draft report, FPF makes the argument that rather than focusing on what constitutes personal data and therefore which data should be de-identified, we should consider reorienting legal rules for data based on “multiple categories of identifiability, while keeping the option open to assess other factors such as the data’s sensitivity, accessibility and permanence to modify additional legal requirements.”18 This approach allows that rather than a black and white approach to data, organizations and policymakers should “view data in various shades of gray.”19

In 2016, we expect to see progress by the global community in finding consensus in terms of what constitutes de-identification, and a framework to help organizations develop a plan to achieve it.

16 IAPP-EY, IAPP-EY Annual Privacy Governance Report 2015.

17 “Guidance Regarding Methods for De-identification of Protected Health Information in Accordance with the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule,” U.S. Department of Health and Human Services, HHS.gov, http://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html.

18 Jules Polonetsky, Omer Tene and Kelsey Finch, “Shades of Gray: Seeing the Full Spectrum of Practical Data De-Identification,” Future of Privacy Forum, 2015.

19 Ibid.




Conclusion

The need for maturity in privacy accountability

If the evolution of privacy protection is still in the adolescent phase, then it needs to grow up — fast. The digital future is upon us and it won’t wait for governments to craft laws that address the myriad privacy risks it creates.

Organizations are increasingly being held to account for the reams upon reams of personally identifiable data they collect. Yet, alarmingly large numbers of organizations still have no idea where this data lives within their systems, let alone how to protect it.

Organizations need to take clear and decisive action to develop and enhance privacy management, moving beyond ad hoc policies toward fully accountable, certified and trusted privacy programs. Knowing how information is collected, used, shared and maintained; developing KPIs; finding the balance between monitoring for insider threats and respecting employee privacy; controlling access by modifying and de-identifying data; preparing for the worst; and providing independent assurance on privacy programs are all signs of the evolving maturity that governments and individuals alike are demanding.

For years, we’ve been talking about the need for accountability in privacy management. We expect 2016 may be the year when privacy programs mature from adolescence to adulthood.




Want to learn more?

Insights on governance, risk and compliance is an ongoing series of thought leadership reports focused on IT and other business risks and the many related challenges and opportunities. These timely and topical publications are designed to help you understand the issues and provide you with valuable insights about our perspective.

Please visit our Insights on governance, risk and compliance series at: www.ey.com/GRCinsights.


Unlocking the value of your program investments: How predictive analytics can help in achieving successful outcomes (www.ey.com/PRM)

Enhancing your security operations with Active Defense (www.ey.com/activedefense)

There’s no reward without risk: EY’s global governance, risk and compliance survey 2015 (www.ey.com/GRCsurvey2015)

Cyber program management: identifying ways to get ahead of cybercrime (www.ey.com/CPM)

Achieving resilience in the cyber ecosystem (www.ey.com/cyberecosystem)

Creating trust in the digital world: EY’s Global Information Security Survey 2015 (www.ey.com/GISS2015)

Reducing risk with Cyber Threat Intelligence (www.ey.com/CTI)

Cybersecurity and the Internet of Things (www.ey.com/IOT)

If you were under cyber attack, would you ever know?

As many organizations have learned, sometimes the hard way, cyber attacks are no longer a matter of if, but when. Hackers are increasingly relentless. When one tactic fails, they will try another until they breach an organization’s defenses. At the same time, technology is increasing an organization’s vulnerability to attack through increased online presence, broader use of social media, mass adoption of mobile devices, increased usage of cloud services, and the collection and analysis of big data. Our ecosystems of digitally connected entities, people and data increase the likelihood of exposure to cybercrime in both the work and home environment. Even traditionally closed operational technology systems are now being given IP addresses, enabling cyber threats to make their way out of back-office systems and into critical infrastructures such as power generation and transportation systems.

Anticipating cyber attacks is the only way to be ahead of cyber criminals. With our focus on you, we ask better questions about your operations, priorities and vulnerabilities. We then collaborate with you to create innovative answers that help you activate, adapt and anticipate cybercrime. Together, we help you design better outcomes and realize long-lasting results, from strategy to execution.

We believe that when organizations manage cybersecurity better, the world works better.

So, if you were under cyber attack, would you ever know? Ask EY.

Using cyber analytics to help you get on top of cybercrime: Third-generation Security Operations Centers (www.ey.com/3SOC)


About EY

EY is a global leader in assurance, tax, transaction and advisory services. The insights and quality services we deliver help build trust and confidence in the capital markets and in economies the world over. We develop outstanding leaders who team to deliver on our promises to all of our stakeholders. In so doing, we play a critical role in building a better working world for our people, for our clients and for our communities.

EY refers to the global organization, and may refer to one or more, of the member firms of Ernst & Young Global Limited, each of which is a separate legal entity. Ernst & Young Global Limited, a UK company limited by guarantee, does not provide services to clients. For more information about our organization, please visit ey.com.

© 2016 EYGM Limited. All Rights Reserved.

EYG no. AU3733 ED None

This material has been prepared for general informational purposes only and is not intended to be relied upon as accounting, tax, or other professional advice. Please refer to your advisors for specific advice. The views of third parties set out in this publication are not necessarily the views of the global EY organization or its member firms. Moreover, they should be seen in the context of the time they were made.

ey.com/GRCinsights

EY | Assurance | Tax | Transactions | Advisory

Our Cybersecurity Leaders are:

Area Cybersecurity Leaders

Americas: Bob Sydow, +1 513 612 1591, [email protected]
EMEIA: Gelber, +44 207 951 6930, [email protected]
Asia-Pacific: Paul O’Rourke, +65 8691 8635, paul.o’[email protected]
Japan: Nagao, +81 3 3503 1100, [email protected]

Global Cybersecurity Leader

Ken Allan, +44 20 7951 5769, [email protected]

Our Risk Advisory Leaders are:

Area Risk Leaders

Americas: Amy Brachio, +1 612 371 8537, [email protected]
EMEIA: Jonathan Blackmore, +971 4 312 9921, [email protected]
Asia-Pacific: Burnet, +61 8 9429 2486, [email protected]
Japan: Azuma, +81 3 3503 1100, [email protected]

Global Risk Leader

Paul van Kessel, +31 88 40 71271, [email protected]

About EY’s Advisory Services

In a world of unprecedented change, EY Advisory believes a better working world means helping clients solve big, complex industry issues and capitalize on opportunities to grow, optimize and protect their businesses.

Through a collaborative, industry-focused approach, EY Advisory combines a wealth of consulting capabilities — strategy, customer, finance, IT, supply chain, people advisory, program management and risk — with a complete understanding of a client’s most complex issues and opportunities, such as digital disruption, innovation, analytics, cybersecurity, risk and transformation. EY Advisory’s high-performance teams also draw on the breadth of EY’s Assurance, Tax and Transaction Advisory service professionals, as well as the organization’s industry centers of excellence, to help clients realize sustainable results.

True to EY’s 150-year heritage in finance and risk, EY Advisory thinks about risk management when working on performance improvement, and performance improvement is top of mind when providing risk management services. EY Advisory also infuses analytics, cybersecurity and digital perspectives into every service offering.

EY Advisory’s global connectivity, diversity and collaborative culture inspires its consultants to ask better questions. EY consultants develop trusted relationships with clients across the C-suite, functions and business unit leadership levels, from Fortune 100 multinationals to leading disruptive innovators. Together, EY works with clients to create innovative answers that help their businesses work better.

The better the question. The better the answer. The better the world works.

Global Privacy Leader

Sagi Leizerov, +1 703 747 0899, [email protected]