Four easy steps for organizations to hand over data control

To stay connected with patients, healthcare providers are turning to telehealth services. In fact, 34.5 million telehealth services were delivered from March through June 2020, according to the Centers for Medicare and Medicaid Services. The shift to remote healthcare has also affected the rollout of new regulations that would give patients secure and free access to their health data.

The shift to online services shines a light on a major cybersecurity issue across all industries, but especially in healthcare, where people have historically had little control over their data: consent.

Hand over data control

Data transparency allows people to know what personal data has been collected, what data an organization wants to collect and how it will be used. Data control provides the end-user with choice and authority over what is collected and even where it is shared. Together the two lead to a competitive edge, as 85% of consumers say they will take their business elsewhere if they do not trust how a company is handling their data.

Regulations such as the GDPR and the CCPA have been enacted to hold companies accountable unlike ever before – providing greater protection, transparency and control to consumers over their personal data.

The U.S. Department of Health and Human Services’ (HHS) regulation, which is set to go into effect in early 2021, would provide interoperability, allowing patients to access, share and manage their healthcare data as they do their financial data. Healthcare organizations must provide people with control over their data and where it goes, which in turn strengthens trust.

How to earn patients’ trust

Organizations must improve their ability to earn patients’ confidence and trust by putting comprehensive identity and access management (IAM) systems in place. Such systems need to offer the ability to manage privacy settings, account for data download and deletion, and enable data sharing with not just third-party apps but also other people, such as additional care providers and family members.

The right digital identity solution should empower the orchestration of user identity journeys, such as registration and authentication, in a way that unifies the configuration of security and user experience choices.

It should also enable the healthcare organization to protect patients’ personal data while offering their end-users a unified means of control of their data consents and permissions. Below are the four key steps companies should take to earn trust when users hand over data control:

  • Identify where digital transformation opportunities and user trust risks intersect. Since users are becoming more skeptical, organizations must analyze “trust gaps” while they are discovering clever new ways to leverage personal data.
  • Consider personal data as a joint asset. It’s easy for a company to say consumers own their own personal data, but business leaders have incentives to leverage that data for the value it brings to their business. This changes the equation. All the stakeholders within an organization need to come together and view data as a joint asset in which all parties, including end-users, have a stake.
  • Lean into consent. Given the realities of regulations, a business often has the choice to ask end-users for consent rather than simply collecting and using their data. Offer that option wherever possible – it helps build trust with skeptical consumers and makes it easier to prove your right to use the data.
  • Take advantage of consumer identity and access management (CIAM) for building trust. Identity management platforms automate and provide visibility into the entire customer journey across many different applications and channels. They also allow end-users to retain the controls to manage their own profiles, passwords, privacy settings and personal data.

Providing data transparency and data control to the end-user enhances the relationship between business and consumer. Organizations can achieve this trust with consumers comprehensively by applying consumer identity and access management that scales across all of their applications. To see these benefits before rules like the HHS regulation go into effect, organizations need to act now.

CPRA: More opportunity than threat for employers

Increasingly demanded by consumers, data privacy laws can create onerous burdens on even the most well-meaning businesses. California presents plenty of evidence to back up this statement, as more than half of organizations that do business in California still aren’t compliant with the California Consumer Privacy Act (CCPA), which went into effect earlier this year.

As companies struggle with their existing compliance requirements, many fear that a new privacy ballot initiative – the California Privacy Rights Act (CPRA) – could complicate matters further. While it’s true that if passed this November, the CPRA would fundamentally change the way businesses in California handle both customer and employee data, companies shouldn’t panic. In fact, this law presents an opportunity for organizations to change their relationship with employee data to their benefit.

CPRA, the Californian GDPR?

Set to appear on the November 2020 ballot, the CPRA, also known as CCPA 2.0 or Prop 24 (its name on the ballot), builds on what is already the most comprehensive data protection law in the US. In essence, the CPRA will bring data protection in California nearer to the current European legal standard, the General Data Protection Regulation (GDPR).

In the process of “getting closer to GDPR,” the CCPA would gain substantial new components. Besides enhancing consumer rights, the CPRA also creates new provisions for employee data as it relates to their employers, as well as data that businesses collect from B2B business partners.

Although controversial, the CPRA is likely to pass. August polling shows that more than 80% of voters support the measure. However, many businesses do not. This is because, at first glance, the CPRA appears to create all kinds of legal complexities in how employers can and cannot collect information from workers.

Fearful of having to meet the same demanding requirements as their European counterparts, many organizations view the prospect of the CPRA becoming law with apprehension. In reality, however, if the CPRA passes, it might not be as scary as some businesses think.

CPRA and employment data

The CPRA is actually a lot more lenient than the GDPR in how it polices the relationship between employers and employees’ data. Unlike its EU equivalent, the proposed Californian law already contains many exceptions acknowledging that worker-employer relations are not like consumer-vendor relations.

Moreover, the CPRA extends the CCPA exemption for employers, set to end on January 1, 2021. This means that if the CPRA passes into law, employers would be released from both their existing and potential new employee data protection obligations for two more years, until January 1, 2023. This exemption would apply to most provisions under the CPRA, including the personal information collected from individuals acting as job applicants, staff members, employees, contractors, officers, directors, and owners.

However, employers would still need to provide notice of data collection and maintain safeguards for personal information. It’s highly likely that during this two-year window, additional reforms would be passed that might further ease employer-employee data privacy requirements.

Nonetheless, employers should act now

While the CPRA won’t change much overnight, impacted organizations shouldn’t wait to take action, but should take this time to consider what employee data they collect, why they do so, and how they store this information.

This is especially pertinent now that businesses are collecting more data than ever on their employees. With workplace monitoring vendors such as Prodoscore reporting that interest from prospective customers has risen by 600% since the pandemic began, we are seeing rapid growth in companies looking to monitor how, where, and when their employees work.

This trend underlines the fact that the information flow between companies and their employees is mostly one-sided (i.e., from the worker to the employer). Currently, businesses have no legal requirement to be transparent about this information exchange. That will change for California-based companies if the CPRA comes into effect: they will have no choice but to disclose the type of data they’re collecting about their staff.

The only sustainable solution for impacted businesses is to be transparent about their data collection with employees and work towards creating a “culture of privacy” within their organization.

Creating a culture of privacy

Rather than viewing employee data privacy as some perfunctory obligation where the bare minimum is done for the sake of appeasing regulators, companies need to start thinking about worker privacy as a benefit. Presented as part of a benefits package, comprehensive privacy protection is a perk that companies can offer prospective and existing employees.

Privacy benefits can include access to privacy protection services that extend beyond the workplace. Packaged alongside privacy awareness training and education, these can form “privacy plus” benefits offered to employees alongside standard perks like health or retirement plans. Doing so will build a culture of privacy, which helps companies stay in regulatory compliance while making it easier to attract qualified talent and retain workers.

It’s also worth bearing in mind that creating a culture of privacy doesn’t necessarily mean that companies have to stop monitoring employee activity. In fact, employees are less worried about being watched than about the possibility of their employers misusing their data. Their fears are well-founded: although over 60% of businesses today use workforce data, only 3 in 10 business leaders are confident that this data is treated responsibly.

For this reason, companies that want to keep employee trust and avoid bad PR need to prioritize transparency. This could mean drawing up a “bill of rights” that lets employees know what data is being collected and how it will be used.

Research into employee satisfaction backs up the value of transparency. Studies show that while only 30% of workers are comfortable with their employer monitoring their email, the number of employees open to the use of workforce data goes up to 50% when the employer explains the reasons for doing so. This number further jumps to 92% if employees believe that data collection will improve their performance or well-being or come with other personal benefits, like fairer pay.

On the other hand, most employees would leave an organization if its leaders did not use workplace data responsibly. Moreover, 55% of candidates would not even apply for a job with such an organization in the first place.

Final thoughts

With many exceptions for workplace data management already built in and more likely to come down the line, most employers should be able to navigate the stipulations the CPRA entails with relative ease.

That being said, if it becomes law this November, employers shouldn’t squander the two-year window they have to prepare for new compliance requirements. Rather than seeing this time as breathing space before a regulatory crackdown, organizations should use it to be proactive in how they manage their employees’ data. Beyond simply ensuring they comply with the law, businesses should look at how they can turn employee privacy into an asset.

As data privacy stays at the forefront of employees’ minds, businesses that can show they have a genuine privacy culture will be able to gain an edge when it comes to attracting and retaining talent and, ultimately, coming out on top.

Phishers are targeting employees with fake GDPR compliance reminders

Phishers are using a bogus GDPR compliance reminder to trick recipients – employees of businesses across several industry verticals – into handing over their email login credentials.

The lure

“The attacker lures targets under the pretense that their email security is not GDPR compliant and requires immediate action. For many who are not versed in GDPR regulations, this phish could be merely taken as more red tape to contend with rather than being identified as a malicious message,” Area 1 Security researchers noted.

In this evolving campaign, the attackers targeted mostly email addresses they could glean from company websites and, to a lesser extent, emails of people who are high in the organization’s hierarchy (execs and upper management).

Almost any pretense can serve a phishing email, but when targeting businesses, the lure is far more effective if it can pass as an email sent from inside the organization. The attackers therefore attempted to make it look like the email was coming from the company’s “security services”, though some initial mistakes on their part would reveal to careful targets that the email was sent from an outside email account (a Gmail address).

“On the second day of the campaign the attacker began inserting SMTP HELO commands to tell receiving email servers that the phishing message originated from the target company’s domain, when in fact it came from an entirely different origin. This is a common tactic used by malicious actors to spoof legitimate domains and easily bypass legacy email security solutions,” the researchers explained.
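To illustrate the kind of header inconsistency such spoofing leaves behind, here is a minimal, hypothetical Python sketch (not part of Area 1 Security’s analysis) that flags Received headers in which the HELO name claims the target’s own domain while the connecting host recorded by the receiving server does not:

```python
import email
import re
from email import policy

# A typical Received header looks like: "from <helo-name> (<reverse-dns> [<ip>]) by ..."
RECEIVED_RE = re.compile(r"from\s+(?P<helo>\S+)\s+\((?P<source>[^)]*)\)", re.IGNORECASE)

def helo_mismatches(raw_bytes, claimed_domain):
    """Yield Received headers where the sender's HELO name claims `claimed_domain`
    but the connecting host recorded by the receiving server does not mention it."""
    msg = email.message_from_bytes(raw_bytes, policy=policy.default)
    for header in msg.get_all("Received", []):
        match = RECEIVED_RE.search(header)
        if not match:
            continue
        helo, source = match.group("helo"), match.group("source")
        if claimed_domain in helo and claimed_domain not in source:
            yield header

if __name__ == "__main__":
    # Hypothetical saved message; in practice this would come from the mail gateway.
    with open("suspicious.eml", "rb") as f:
        for hit in helo_mismatches(f.read(), "victim-corp.example"):
            print("Possible HELO spoofing:", hit)
```

A check like this only catches the crudest spoofing; SPF, DKIM and DMARC validation remain the standard defenses.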

The phishing site

Following the link in the email takes victims to the phishing site, initially hosted on a compromised, outdated WordPress site.

The link is “personalized” with the target’s email address, so the HTML form on the malicious webpage auto-populates the username field with the correct email address (found in the URL’s “email” parameter). Despite the “generic” look of the phishing page, this capability can convince some users to log in.

Once the password is submitted, a script sends the credentials to the phishers and the victim is shown an error page.
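For defenders triaging reported messages, the “personalization” described above is easy to surface. Below is a minimal sketch, assuming the query parameter is literally named “email” as in this campaign; the URL itself is a made-up example:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical phishing link of the kind described above.
suspicious_url = (
    "http://compromised-site.example/wp-content/secure/login.php"
    "?email=jane.doe%40victim-corp.example"
)

params = parse_qs(urlparse(suspicious_url).query)
prefill = params.get("email", [None])[0]

if prefill and "@" in prefill:
    # A login page that receives the victim's address in the URL and pre-fills
    # the username field is a strong phishing indicator worth flagging.
    print(f"URL carries a personalized address: {prefill}")
```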

As always, users/employees are advised not to click on links in unsolicited emails and to avoid entering their credentials into unfamiliar login pages.

The state of GDPR compliance in the mobile app space

Among the rights bestowed upon EU citizens by the General Data Protection Regulation (GDPR) is the right to access their personal data stored by companies (i.e., data controllers) and information about how this personal data is being processed. A group of academics from three German universities has decided to investigate whether and how mobile app vendors respond to subject access requests, and the results of their four-year undercover field study are dispiriting.

The results of the study

“In three iterations between 2015 and 2019, we sent subject access requests to vendors of 225 mobile apps popular in Germany. Throughout the iterations, 19 to 26 % of the vendors were unreachable or did not reply at all. Our subject access requests were fulfilled in 15 to 53 % of the cases, with an unexpected decline between the GDPR enforcement date and the end of our study,” they shared.

“The remaining responses exhibit a long list of shortcomings, including severe violations of information security and data protection principles. Some responses even contained deceptive and misleading statements (7 to 13 %). Further, 9 % of the apps were discontinued and 27 % of the user accounts vanished during our study, mostly without proper notification about the consequences for our personal data.”

The researchers – Jacob Leon Kröger from TU Berlin (Weizenbaum Institute), Jens Lindemann from the University of Hamburg, and Prof. Dr. Dominik Herrmann from the University of Bamberg – made sure to test a representative sample of iOS and Android apps: popular and less popular, from a variety of app categories, and from vendors based in Germany, the EU, and outside of the EU.

They disguised themselves as an ordinary German user, created accounts needed for the apps to work, interacted with each app for about ten minutes, and asked app providers for information about their stored personal data (before and after GDPR enforcement).

They also used a different request text for each round of inquiries. The first one was more informal, while the last two were more elaborate and included references to relevant data protection laws and a warning that the responsible data protection authorities would be notified in case of no response.

“While we cannot precisely determine their individual influence, it can be assumed that both the introduction of the GDPR as well as the more formal and threatening tone of our inquiry in [the latter two inquiries] had an impact on the vendors’ behavior,” they noted.

Solving the problem

Smartphones are ubiquitous and most users use a variety of mobile apps, which usually collect personal user data and share it with third parties.

In theory, the GDPR should force mobile app vendors to provide users with information about this data and how it’s used. In practice, though, many app vendors are obviously hoping that users won’t care enough about it and won’t make a stink when they don’t receive a satisfactory reply, and that GDPR regulators won’t have the resources to enforce the regulation.

“We (…) suspected that some vendors merely pretended to be poorly reachable when they received subject access requests – while others actually had insufficient resources to process incoming emails,” the researchers noted.

“To confirm this hypothesis, we tested how the vendors that failed to respond to our requests reacted to non-privacy related inquiries. Using another (different) fake identity, we emailed the vendors who had not replied [to the first inquiry] and [to the third inquiry], expressing interest in promoting their apps on a personal blog or YouTube channel. Out of the group of initial non-responders, 31 % [first inquiry] and 22 % [third inquiry] replied to these dummy requests, many of them within a few hours, proving that their email inbox was in fact being monitored.”

The researchers believe the situation for users can be improved by authorities doing random compliance checks and offering better support for data controllers through industry-specific guidelines and best practices.

“In particular, there should be mandatory standard interfaces for providing data exports and other privacy-related information to data subjects, obviating the need for the manual processing of GDPR requests,” they concluded.
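As a rough illustration of what such a standard interface could look like, here is a minimal, hypothetical sketch of a machine-readable data-export endpoint using Flask. The route name, data model and the absence of requester authentication are assumptions made for brevity, not a reference design from the study:

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory store standing in for a vendor's real user database.
USER_RECORDS = {
    "42": {
        "profile": {"name": "Jane Doe", "email": "jane@example.com"},
        "consents": {"analytics": False, "marketing": True},
        "processing_purposes": ["account management", "usage analytics"],
    }
}

@app.route("/privacy/export/<user_id>", methods=["GET"])
def export_personal_data(user_id):
    """Return everything stored about a user in a machine-readable format,
    so a subject access request needs no manual processing.
    (Authenticating the requester is omitted here but essential in practice.)"""
    record = USER_RECORDS.get(user_id)
    if record is None:
        return jsonify({"error": "unknown user"}), 404
    return jsonify({"user_id": user_id, "data": record})

if __name__ == "__main__":
    app.run(debug=True)
```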

How AI can alleviate data lifecycle risks and challenges

The volume of business data worldwide is growing at an astounding pace, with some estimates showing the figure doubling every year. Over time, every company generates and accumulates a massive trove of data, files and content – some inconsequential and some highly sensitive and confidential in nature.

Throughout the data lifecycle there are a variety of risks and considerations to manage. The more data you create, the more you must find a way to track, store and protect against theft, leaks, noncompliance and more.

Faced with massive data growth, most organizations can no longer rely on manual processes for managing these risks. Many have instead adopted a vast web of tracking, endpoint detection, encryption, access control and data policy tools to maintain security, privacy and compliance. But deploying and managing so many disparate solutions creates a tremendous amount of complexity and friction for IT and security teams, as well as end users. The problem with this approach is that it lacks the level of integration and intelligence needed to manage enterprise files and content at scale.

Let’s explore several of the most common data lifecycle challenges and risks businesses are facing today and how to overcome them:

Maintaining security – As companies continue to build up an ocean of sensitive files and content, the risk of data breaches grows exponentially. Smart data governance means applying security across the points at which the risk is greatest. In just about every case, this includes both ensuring the integrity of company data and content, as well as any user with access to it. Every layer of enterprise file sharing, collaboration and storage must be protected by controls such as automated user behavior monitoring to deter insider threats and compromised accounts, multi-factor authentication, secure storage in certified data centers, and end-to-end encryption, as well as signature-based and zero-day malware detection.

Classification and compliance – Gone are the days when organizations could require users to label, categorize or tag company files and content, or task IT to manage and manually enforce data policies. Not only is manual data classification and management impractical, it’s far too risky. You might house millions of files that are accessible by thousands of users – there’s simply too much, spread out too broadly. Moreover, regulations like GDPR, CCPA and HIPAA add further complexity to the mix, with intricate (and sometimes conflicting) requirements. The definition of PII (personally identifiable information) under GDPR alone encompasses potentially hundreds of pieces of information, and one mistake could result in hefty financial penalties.

Incorrect categorization can lead to a variety of issues, including data theft and regulatory penalties. Fortunately, machines can do in seconds – and often with better accuracy – what might take a human years. AI and ML technologies are helping companies quickly scan files across data repositories to identify sensitive information such as credit card numbers, addresses, dates of birth, social security numbers, and health-related data, and apply automatic classifications. They can also track files across popular data sources such as OneDrive, Windows File Server, SharePoint, Amazon S3, Google Cloud, GSuite, Box, Microsoft Azure Blob, and generic CIFS/SMB repositories to better visualize and control your data.
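The pattern-matching part of such scanning can be sketched in a few lines. The following simplified Python example is regex-only, with hypothetical paths; real products combine ML models with validation (e.g., Luhn checks) to cut false positives:

```python
import re
from pathlib import Path

# Simplistic patterns for illustration only.
PII_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def classify_file(path):
    """Return the set of PII categories detected in a text file."""
    try:
        text = Path(path).read_text(errors="ignore")
    except OSError:
        return set()
    return {label for label, pattern in PII_PATTERNS.items() if pattern.search(text)}

if __name__ == "__main__":
    # Hypothetical repository root; real deployments would also crawl cloud stores.
    for file_path in Path("/data/shared").rglob("*.txt"):
        labels = classify_file(file_path)
        if labels:
            print(f"{file_path}: tag as sensitive ({', '.join(sorted(labels))})")
```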

Retention – As data storage costs have plummeted over the past 10 years, many organizations have fallen into the trap of simply “keeping everything” because it’s (deceptively) cheap to do so. This approach carries many security and regulatory risks, as well as potential costs. Our research shows that exposure of just a single terabyte of data could cost you $129,324; now think about how many terabytes of data your organization stores today. The longer you retain sensitive files, the greater the opportunity for them to be compromised or stolen.

Certain types of data must be stored for a specific period of time in order to adhere to various customer contracts and regulatory criteria. For example, HIPAA regulations require organizations to retain documentation for six years from the date of its creation. GDPR is less specific, stating that data shall be kept for no longer than is necessary for the purposes for which it is being processed.

Keeping data any longer than absolutely necessary is not only risky, but those “affordable” costs can add up quickly. AI-enabled governance can track these set retention periods and minimize risk by automatically securing or eliminating any old or redundant files that are no longer required (or allowed). With streamlined data retention processes, you can decrease storage costs, reduce security and noncompliance exposure and optimize data processing performance.
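A simplified sketch of automated retention tracking might look like the following; the folder-based policy and the mapping of the six-year HIPAA figure to days are illustrative assumptions, not a product implementation:

```python
import time
from pathlib import Path

# Hypothetical retention policy, in days, per top-level folder.
RETENTION_DAYS = {"hipaa_records": 6 * 365, "marketing": 365, "tmp": 30}

def files_past_retention(root="/data/archive"):
    """Yield files whose last modification is older than the retention
    period assigned to their top-level folder."""
    now = time.time()
    for category, days in RETENTION_DAYS.items():
        cutoff = now - days * 86400
        for path in Path(root, category).rglob("*"):
            if path.is_file() and path.stat().st_mtime < cutoff:
                yield path, category

if __name__ == "__main__":
    for path, category in files_past_retention():
        # In a real workflow these would be queued for review, archival, or secure deletion.
        print(f"Past retention ({category}): {path}")
```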

Ongoing monitoring and management – Strong governance gets easier with good data hygiene practices over the long term, but with so many files to manage across a variety of repositories and storage platforms, it can be challenging to track risks and suspicious activities at all times. Defining dedicated policies for which data types can be stored in which locations, which users can access them, and the parties with which they can be shared will help you focus your attention on further minimizing risk. AI can multiply these efforts by eliminating manual monitoring processes, providing better visibility into how data is being used, and alerting when sensitive content may have been shared externally or with unapproved users. This makes it far easier to identify and respond to threats and risky behavior, enabling you to take immediate action on compromised accounts, move or delete sensitive content that is being shared too broadly or stored in unauthorized locations, and so on.

The key to data lifecycle management

The sheer volume of data, files and content businesses are now generating and managing creates massive amounts of complexity and risk. You have to know what assets exist, where they’re stored, which specific users have access to them, when they’re being shared, which files can be deleted, which need to be stored in accordance with regulatory requirements, and so on. Falling short in any one of these areas can lead to major operational, financial and reputational consequences.

Fortunately, recent advances in AI and ML are enabling companies to streamline data governance to find and secure sensitive data at its source, sense and respond to potentially malicious behaviors, maintain compliance and adapt to changing regulatory criteria, and more. As manual processes and piecemeal point solutions fall short, AI-enabled data governance will continue to dramatically reduce complexity for both users and administrators, and deliver the level of visibility and control that businesses need in today’s data-centric world.

340 GDPR fines for a total of €158,135,806 issued since May 2018

Since the GDPR came into force in May 2018, European data protection authorities have issued 340 fines. Every one of the 27 EU member states, plus the United Kingdom, has issued at least one GDPR fine, Privacy Affairs finds.

Whilst the GDPR sets out the regulatory framework that all EU countries must follow, each member state legislates independently and is permitted to interpret the regulations differently and impose its own penalties on organizations that break the law.

Nations with the highest fines

  • France: €51,100,000
  • Italy: €39,452,000
  • Germany: €26,492,925
  • Austria: €18,070,100
  • Sweden: €7,085,430
  • Netherlands: €3,490,000
  • Spain: €3,306,771
  • Bulgaria: €3,238,850
  • Poland: €1,162,648
  • Norway: €985,400

Nations with the most fines

  • Spain: 99
  • Hungary: 32
  • Romania: 29
  • Germany: 28
  • Bulgaria: 21
  • Czech Republic: 13
  • Belgium: 12
  • Italy: 11
  • Norway: 9
  • Cyprus: 8

The second-highest number of fines comes from Hungary, where the National Authority for Data Protection and Freedom of Information has issued 32 fines to date. The largest was €288,000, issued to an ISP for improper and non-secure storage of customers’ personal data.

UK organizations have been issued just seven fines, totalling over €640,000, by the Information Commissioner. The average penalty within the UK is €160,000. This does not include the potentially massive fines for Marriott International and British Airways that are still under review.

British Airways could face a fine of €204,600,000 for a 2018 data breach that compromised the personal data of 500,000 customers.

Similarly, Marriott International suffered a breach that exposed 339 million people’s data. The hotel group faces a fine of €110,390,200.

The largest GDPR fines

The largest GDPR fine to date was issued by French authorities to Google in January 2019. The €50 million fine was issued on the basis of “lack of transparency, inadequate information and lack of valid consent regarding ads personalization.”

Highest fines issued to private individuals:

  • €20,000 issued to an individual in Spain for unlawful video surveillance of employees.
  • €11,000 issued to a soccer coach in Austria who was found to be secretly filming female players while they were taking showers.
  • €9,000 issued to another individual in Spain for unlawful video surveillance of employees.
  • €2,500 issued to a person in Germany who sent emails to several recipients, where each could see the other recipients’ email addresses. Over 130 email addresses were visible.
  • €2,200 issued to a person in Austria for having unlawfully filmed public areas using a private CCTV system. The system filmed parking lots, sidewalks, a garden area of a nearby property, and it also filmed the neighbors going in and out of their homes.

70% of organizations experienced a public cloud security incident in the last year

70% of organizations experienced a public cloud security incident in the last year – including ransomware and other malware (50%), exposed data (29%), compromised accounts (25%), and cryptojacking (17%), according to Sophos.

Organizations running multi-cloud environments are more than 50% more likely to suffer a cloud security incident than those running a single cloud.

Europeans suffered the lowest percentage of security incidents in the cloud, an indicator that compliance with GDPR guidelines is helping to protect organizations from being compromised. India, on the other hand, fared the worst, with 93% of organizations being hit by an attack in the last year.

“Ransomware, not surprisingly, is one of the most widely reported cybercrimes in the public cloud. The most successful ransomware attacks include data in the public cloud, according to the State of Ransomware 2020 report, and attackers are shifting their methods to target cloud environments that cripple necessary infrastructure and increase the likelihood of payment,” said Chester Wisniewski, principal research scientist, Sophos.

“The recent increase in remote working provides extra motivation to disable cloud infrastructure that is being relied on more than ever, so it’s worrisome that many organizations still don’t understand their responsibility in securing cloud data and workloads. Cloud security is a shared responsibility, and organizations need to carefully manage and monitor cloud environments in order to stay one step ahead of determined attackers.”

The unintentional open door: How attackers break in

Accidental exposure continues to plague organizations, with misconfigurations exploited in 66% of reported attacks. Misconfigurations drive the majority of incidents and are all too common given cloud management complexities.

Additionally, 33% of organizations report that cybercriminals gained access through stolen cloud provider account credentials. Despite this, only a quarter of organizations say managing access to cloud accounts is a top area of concern.

Data further reveals that 91% of accounts have overprivileged identity and access management roles, and 98% have multi-factor authentication disabled on their cloud provider accounts.
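Checks like these are straightforward to automate. As one example, assuming an AWS environment and the boto3 SDK (an assumption, since the report covers multiple cloud providers), a minimal audit script could list IAM console users with no MFA device attached:

```python
import boto3

# Assumes AWS credentials with read-only IAM permissions are configured locally.
iam = boto3.client("iam")

def users_without_mfa():
    """Return the names of IAM users that have no MFA device attached."""
    flagged = []
    paginator = iam.get_paginator("list_users")
    for page in paginator.paginate():
        for user in page["Users"]:
            name = user["UserName"]
            if not iam.list_mfa_devices(UserName=name)["MFADevices"]:
                flagged.append(name)
    return flagged

if __name__ == "__main__":
    for name in users_without_mfa():
        print(f"No MFA configured: {name}")
```

Equivalent checks exist for other providers, and managed tools (e.g., cloud security posture management services) perform them continuously.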

Public cloud security incident: The silver lining

96% of respondents admit to concern about their current level of cloud security, an encouraging sign that it’s top of mind and important.

Appropriately, “data leaks” top the list of security concerns for nearly half of respondents (44%); identifying and responding to security incidents is a close second (41%). Notwithstanding this silver lining, only one in four respondents view lack of staff expertise as a top concern.

Does analyzing employee emails run afoul of the GDPR?

A desire to remain compliant with the European Union’s General Data Protection Regulation (GDPR) and other privacy laws has made HR leaders wary of any new technology that digs too deeply into employee emails. This is understandable, as GDPR non-compliance may lead to stiff penalties.

At the same time, new technologies are applying artificial intelligence (AI) and machine learning (ML) to solve HR problems like analyzing employee data to help with hiring, completing performance reviews or tracking employee engagement. This has great potential for helping businesses coach and empower employees (and thus help them retain top talent), but these tools often analyze employee emails as a data source. Does this create a privacy issue in regard to the GDPR?

In most cases, the answer is “no.” Let’s explore these misconceptions and explain how companies can stay compliant with global privacy laws while still using AI/ML workplace technologies to provide coaching and empowerment solutions to their employees.

Analyzing employee data with AI/ML isn’t unique to HR

First of all, many applications already analyze digital messages with AI/ML. Many of these are likely already used by your organization and do not ask for consent from every sender for every message they analyze. Antivirus software uses AI/ML to scan incoming messages for viruses, chatbots use it to answer support emails, and email clients themselves use AI/ML to suggest responses to common questions as the user types or to create prompts to schedule meetings.

Applications like Gmail, Office 365 Scheduler, ZenDesk and Norton Antivirus do these tasks all the time. Office 365 Scheduler even analyzes emails using natural language processing to streamline the simple task of scheduling a meeting. Imagine if they had to ask for the user’s permission every time they did this! HR technologies that do something similar are not unique.

Employers also process employees’ personal data without their consent on a daily basis. Consider these tasks: automatically storing employee communications, creating paperwork for employee reviews or disciplinary action, or sending payroll information to government agencies. Employees don’t need to give consent for any of this, because a different legal basis allows the company to process and share data in this way.

Companies do not need employee consent in this context

This isn’t an issue because the GDPR offers five alternative legal bases pursuant to which employee personal data can be processed, including the pursuit of the employer’s “legitimate interests.” This concept is intentionally broad and gives organizations flexibility to determine whether their interests are appropriate, regardless of whether those interests are commercial, individual, or broader societal benefits, or even whether the interests are the company’s own or those of a third party.

The GDPR singles out fraud prevention and direct marketing as two specific purposes for which personal data may be processed in pursuit of a legitimate interest, but there are many more.

These “legitimate interest” bases give employers grounds to process personal data using AI/ML applications without requiring consent. In fact, employers should avoid relying on consent to process employees’ personal data whenever possible. Employees are almost never in a position to give consent voluntarily or freely, due to the imbalance of power inherent in employer-employee relationships, and such consents are therefore often invalid. In all the cases listed above, the employer relies on legitimate interest to process employee data. HR tools fall into the same category and don’t require consent.

A right to control your inbox

We’ve established that employers can process email communication data internally with new HR tools that use AI/ML and be compliant with the GDPR. But should they?

Here is where we move from legal issues to ethical issues. Some companies that value privacy might believe that employees should control their own inbox, even though that’s not a GDPR requirement. That means letting employees grant and revoke permission to the applications that can read their workplace emails (and which have already been approved by the company). This lets the individual control their own data. Other organizations may value the benefits of new tools over employee privacy and may put them in place without employees’ consent.

I have seen some organizations create a middle ground by making these tools available to employees but requiring them to opt in to use them (rather than installing them and giving employees the option to opt out, which puts an extra burden on them to maintain privacy). This can both respect employee’s privacy and allow HR departments to use new technologies to empower individuals if they so choose. This is more important than ever in the new era of widespread work from home where we have an abundance of workplace communication and companies are charting new courses to help their employees thrive in the future of work.

Fully understanding the compliance and privacy issues associated with new AI/ML tools is key to rolling them out effectively. These solutions can be powerful and may help your employees become more self-aware and better leaders, but organizations should work through those issues before putting the tools to use.

EU Commission: The GDPR has been an overall success

The European Commission has published an evaluation report on the General Data Protection Regulation (GDPR), two years after the regulation became enforceable.

Two years of the GDPR: What was achieved?

Citizens are more empowered and aware of their rights: The GDPR enhances transparency and gives individuals enforceable rights, such as the right of access, rectification, erasure, the right to object and the right to data portability. Today, 69% of the EU population above the age of 16 have heard about the GDPR and 71% have heard about their national data protection authority, according to results published last week in a survey from the EU Fundamental Rights Agency. However, more can be done to help citizens exercise their rights, notably the right to data portability.

Data protection rules are fit for the digital age: The GDPR has empowered individuals to play a more active role in relation to what is happening with their data in the digital transition. It is also contributing to fostering trustworthy innovation, notably through a risk-based approach and principles such as data protection by design and by default.

Data protection authorities are making use of their stronger corrective powers: From warnings and reprimands to administrative fines, the GDPR provides national data protection authorities with the right tools to enforce the rules. However, they need to be adequately supported with the necessary human, technical and financial resources. Many Member States are doing this, with notable increases in budgetary and staff allocations. Overall, there has been a 42% increase in staff and 49% in budget for all national data protection authorities taken together in the EU between 2016 and 2019. However, there are still stark differences between Member States.

Data protection authorities are working together in the context of the European Data Protection Board (EDPB), but there is room for improvement: The GDPR established a governance system which is designed to ensure a consistent and effective application of the GDPR through the so called ‘one stop shop’, which provides that a company processing data cross-border has only one data protection authority as interlocutor, namely the authority of the Member State where its main establishment is located. Between 25 May 2018 and 31 December 2019, 141 draft decisions were submitted through the ‘one-stop-shop’, 79 of which resulted in final decisions. However, more can be done to develop a truly common data protection culture. In particular, the handling of cross-border cases calls for a more efficient and harmonised approach and an effective use of all tools provided in the GDPR for the data protection authorities to cooperate.

Advice and guidelines by data protection authorities: The EDPB is issuing guidelines covering key aspects of the Regulation and emerging topics. Several data protection authorities have created new tools, including helplines for individuals and businesses, and toolkits for small and micro-enterprises. It is essential to ensure that guidance provided at national level is fully consistent with guidelines adopted by the EDPB.

Harnessing the full potential of international data transfers: Over the past two years, the Commission’s international engagement on free and safe data transfers has yielded important results. This includes Japan, with which the EU now shares the world’s largest area of free and safe data flows. The Commission will continue its work on adequacy, with its partners around the world. In addition and in cooperation with the EDPB, the Commission is looking at modernising other mechanisms for data transfers, including Standard Contractual Clauses, the most widely used data transfer tool. The EDPB is working on specific guidance on the use of certification and codes of conduct for transferring data outside of the EU, which need to be finalised as soon as possible. Given the European Court of Justice may provide clarifications in a judgment to be delivered on 16 July that could be relevant for certain elements of the adequacy standard, the Commission will report separately on the existing adequacy decisions after the Court of Justice has handed down its judgment.

Promoting international cooperation: Over the last two years, the Commission has stepped up bilateral, regional and multilateral dialogue, fostering a global culture of respect for privacy and convergence between different privacy systems to the benefit of citizens and businesses alike. The Commission is committed to continuing this work as part of its broader external action, for example, in the context of the Africa-EU Partnership and in its support for international initiatives, such as ‘Data Free Flow with Trust’. At a time when violations of privacy rules may affect large numbers of individuals simultaneously in several parts of the world, it is time to step up international cooperation between data protection enforcers. This is why the Commission will seek authorisation from the Council to open negotiations for the conclusion of mutual assistance and enforcement cooperation agreements with relevant third countries.

GDPR: What’s next?

According to the report, in two years the GDPR has met most of its objectives, in particular by offering citizens a strong set of enforceable rights and by creating a new European system of governance and enforcement.

The GDPR proved to be flexible to support digital solutions in unforeseen circumstances such as the Covid-19 crisis. The report also concludes that harmonisation across the Member States is increasing, although there is a certain level of fragmentation that must be continually monitored. It also finds that businesses are developing a compliance culture and increasingly use strong data protection as a competitive advantage.

The GDPR has acted as a catalyst for many countries and states around the world – e.g., Chile, South Korea, Brazil, Japan, Kenya, India, Tunisia, Indonesia, Taiwan and the state of California – to consider how to modernise their privacy rules, the EC noted.

They also pointed out that it has provided data protection authorities with many corrective powers to enforce it (administrative fines, orders to comply with data subjects’ requests, bans on processing, the suspension of data flows, etc.).

There is room for improvement, though.

“For example, we need more uniformity in the application of the rules across the Union: this is important for citizens and for businesses, especially SMEs. We need also to ensure that citizens can make full use of their rights,” noted Didier Reynders, Commissioner for Justice.

The EC also noted that stakeholders should closely monitor the application of the GDPR to new technologies such as AI, the Internet of Things, and blockchain.

Companies are rethinking their approach to privacy management

TrustArc announced the results of its survey on how organizations are protecting and leveraging data, their most valuable asset. The survey polled more than 1,500 respondents from around the world at all levels of the organization.

“There are more than 900 global privacy laws to which organizations must adhere, making privacy management an ongoing and dynamic challenge,” said Chris Babel, CEO, TrustArc.

“The TrustArc survey highlights just how difficult it can be to comply with even a single new regulation, such as CCPA, let alone the entire list of existing laws. The results also show how the COVID-19 pandemic and its attendant technologies, such as video conferencing, have exacerbated an already difficult privacy challenge and forced respondents to rethink their approaches.”

CCPA compliance readiness mostly lacking, prior GDPR preparedness a boost

29% of respondents say they have just started planning for CCPA.

  • More than 20% of respondents report they are either somewhat unlikely to be, very unlikely to be, or don’t know if they will be fully compliant with CCPA on July 1.
  • Just 14% of respondents are done with CCPA compliance. Nine percent have not started with CCPA compliance, and 15% have a plan but have not started implementation.
  • Of respondents who reported as being slightly or very knowledgeable about CCPA and GDPR regulations, 82% are leveraging at least some of the work they did for GDPR in implementing CCPA requirements.

Privacy professionals still use inefficient technologies for compliance programs

Though 90% of respondents agree or strongly agree that they are “mindful of privacy as a business,” many privacy professionals are left building privacy programs without automation.

  • 19% of respondents report they are most deficient in automating privacy processes.
  • Just 17% of all respondents have implemented privacy management software, which matches the 17% who are still using spreadsheets and word processors.
  • In addition, 19% are using open source/free software and 9% are doing nothing.
  • Even in the U.S., which boasts the highest rate of privacy management software adoption, just 22% of respondents use privacy management software as their primary compliance software.

Respondents understand the importance of data privacy and continue to invest in ongoing privacy programs. However, many are still attempting to implement these programs using manual processes and technologies that do not offer automation.

Moving forward, the companies that can leverage automation to simplify data privacy can protect their most valuable asset—data—and use it to drive business growth.

New technologies present additional challenges to compliance

With the move to all-remote workforces, companies are increasingly turning to technologies, such as video conferencing and collaboration tools. These tools present new avenues for data creation that privacy professionals must consider in their company-wide plans.

  • Twenty-two percent of respondents said personal device security during the pandemic has added a great deal of risk to their businesses. “Personal device security” received the highest proportion of “a great deal of risk” responses, compared to the other four response options.
  • A majority of respondents said that third-party data, supply chain, personal-device security, unintentional data sharing, and required or voluntary data sharing for public health purposes all added at least a moderate amount of risk to their businesses.
  • Seventy percent of respondents say video conferencing tools have required a moderate or great change to their privacy approach, and 65% of respondents say collaboration tools have required a moderate or great change to privacy approaches.

Despite financial impact of pandemic, privacy compliance remains a high priority

Though many respondents expect a significant decrease in their company’s revenues as a result of the COVID-19 pandemic, they are still prioritizing privacy-related investments.

  • Forty-four percent of companies expect a decrease or steep decrease in overall company revenues for the balance of 2020 as a result of COVID-19.
  • Just 15% of respondents report they plan to spend less or a great deal less on privacy efforts in 2020 as a result of the pandemic.
  • 42% of respondents plan to spend $500,000 or more in 2020 on CCPA efforts alone.

Boards of directors actively involved in privacy management

The mandate for increased privacy investments is coming from the very top of organizations.

  • Eighty-three percent of respondents indicate their board of directors regularly reviews privacy approaches.
  • An impressive 86% of respondents say that everyone from the board of directors to the front-line staff knows their role in protecting privacy.
  • Four out of five respondents view privacy as a key differentiator for their company.

GDPR enforcement over the past two years

Two years after the GDPR went into effect, official data show that Data Protection Authorities (DPAs), crippled by a lack of resources, tight budgets, and administrative hurdles, have not yet been able to deliver adequate GDPR enforcement.

Worse, some public authorities have grossly misused the GDPR to undermine other fundamental rights such as the right to free expression and freedom of the press, Access Now reveals.

The GDPR’s first two years have been marked by crisis, whether internal, external, political, geopolitical, or administrative. Beyond enforcement challenges, the report explores how these crises have impacted the protection of personal data in the EU, taking a close look at both Brexit and the COVID-19 outbreak.

“Through this report, we raise the alarm to the EU institutions and Data Protection Authorities that it’s high time to act to enforce the GDPR and condemn its misuses,” said Estelle Massé, Senior Policy Analyst and Global Data Protection Lead at Access Now.

“The European Union may have the best law in the world for the protection of personal data, but if it is not enforced, it risks being as useful as a chocolate teapot.”

The GDPR remains a strong framework, and if authorities take urgent action, it can go a long way in defending people’s fundamental rights.

GDPR around the world

From May 2018 to March 2020, authorities levied 231 fines and sanctions while as many as 144,376 complaints were filed between May 2018 and May 2019.

Out of 30 DPAs from the 27 EU countries, the United Kingdom, Norway, and Iceland, only nine said they were happy with their level of resourcing. The inadequate budgets provided to DPAs mean that our rights may not be effectively protected. In fact, they may create a negative incentive for DPAs investigating large tech companies to agree to settlements that are more favorable to those companies. This is reinforced by the huge disparity of resources between data protection authorities and the companies they oversee.

In Poland, Romania, Hungary, and Slovakia, courts and authorities have been abusing the GDPR to curtail investigative journalism or target civic tech NGOs by trying to force outlets to reveal their sources.

The GDPR is a robust tool to guide officials and public health authorities in the response to the COVID-19 crisis. Access Now condemns Hungary’s disproportionate decision to limit the application of GDPR rights during the COVID-19 crisis as it gravely endangers people’s right to data protection at a time when our personal information, including our health data, is being collected perhaps more than ever.

Enforcement challenges and the UK’s insistence on lowering current standards through the Brexit talks have implications for any future negotiations of a so-called adequacy decision between the EU and the UK that would authorize the transfer of data between the two jurisdictions.

Key recommendations

Governments across the EU must increase the financial and human resources allocated to Data Protection Authorities, including technical staff, so that they can function properly and be able to address the large number of complaints.

The European Commission should launch infringement procedures against EU states:

  • When they do not provide sufficient resources to Data Protection Authorities, or
  • When they do not guarantee Data Protection Authorities’ independence in status and in practice, or
  • When Data Protection Authorities or courts misuse the GDPR to restrict freedom of the press or stifle civil society’s work.

Data Protection Authorities must not misuse the GDPR, as they hold much of the responsibility for the GDPR’s success or failure. It is absolutely unacceptable that DPAs misuse the GDPR to undermine civil society, restrict freedom of the press, or otherwise violate human rights.

Reality bites: Data privacy edition

May 25th is the second anniversary of the General Data Protection Regulation (GDPR) and data around compliance with the regulation shows a significant disconnect between perception and reality.

Only 28% of firms comply with the GDPR; however, before the GDPR kicked off, 78% of companies felt they would be ready to fulfill its data requirements. While their confidence was high, when push comes to shove, complying with the GDPR and GDPR-like laws – such as the CCPA and PDPA – is not as easy as initially thought.

While crucial, facing this growing set of regulations is a massive, expensive undertaking. If a company is found out of compliance with the GDPR, it’s looking at a fine of up to 4% of annual global turnover. To put that in perspective, the 28 major fines handed down since the GDPR took effect in May 2018 add up to $464 million – a hefty sum for sure.

Additionally, there is also the cost to comply – something nearly every company faces today if it conducts business on a global scale. For the CCPA alone, the initial cost of getting California businesses into compliance is estimated at around $55 billion, according to the State of California DoJ. That’s just to comply with one regulation.

Here’s the reality: compliance is incredibly expensive, but not quite as expensive as getting caught being noncompliant. This double-edged sword is unfortunate, but it is the world we live in. So, how should companies navigate today’s landscape to ensure the privacy rights of their customers and teams are protected without missing the mark on any one of these regulatory requirements?

Baby steps to compliance

A number of companies are approaching these various privacy regulations one-by-one. However, taking a separate approach for each one of these regulations is not only extremely laborious and taxing on a business, it’s unnecessary.

Try taking a step back and identifying the common denominator across all of the regulations. You’ll find that in the simplest form, it boils down to knowing what data you actually have and putting the right controls in place to ensure you can properly safeguard it. Implementing this common denominator approach can free up a lot of time, energy and resources dedicated to data privacy efforts across the board.

Consider walking through these steps when getting started: First, identify the sensitive data housed within systems, databases and file stores (e.g., Box, SharePoint, etc.). Next, identify who has access to what, so you can ensure that only the people who should have access actually do – this is crucial to protecting customer information. Lastly, implement controls to keep employee access updated. Using policies to keep access consistent is important, but it’s crucial that they are updated and stay current with any organizational changes.
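As a simplified illustration of the “who has access to what” step, the following hypothetical sketch flags world-readable files on an on-premises share. Cloud stores such as Box or SharePoint would need their own ACL APIs instead, so treat this purely as a starting point:

```python
import stat
from pathlib import Path

def overly_open_files(root="/srv/fileshare"):
    """Yield files that any local user can read, a crude proxy for
    'more people than necessary have access'."""
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        mode = path.stat().st_mode
        if mode & stat.S_IROTH:          # world-readable bit is set
            yield path, stat.filemode(mode)

if __name__ == "__main__":
    for path, perms in overly_open_files():
        print(f"{perms}  {path}  <- review whether this access is still needed")
```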

Staying ahead of the game

The only way to stay ahead of the numerous privacy regulations is to take a general approach to privacy. We’ve already seen extensions of existing regulations, like the California Privacy Rights and Enforcement Act of 2020. ‘CCPA 2.0’, as some people call it, would amend the CCPA. If this legislation takes effect, it would create a whole new set of privacy rights that align well with the GDPR, putting greater safeguards around sensitive personal information. In my opinion, since the world has begun recognizing that privacy rights are more valuable than ever, we’ll continue to see amendments piggybacking on existing regulations across the globe.

While many of us have essentially thrown in the towel, knowing that our own personal data is already out there on the dark web, that doesn’t mean we can sit back and let this continue to happen. Doing so would be to the detriment of our customers’ privacy, cost-prohibitive and ineffective.

So, what are the key takeaways? Make your data privacy efforts just as central as the rest of your security strategy. Ensure your approach is holistic and takes into account all the facets and overlaps of the various regulations we’re all required to comply with today. Only then do you stand a chance of protecting your customers’ and employees’ data and of dodging becoming another news headline and a tally on the GDPR fine count.

Despite spending more on compliance, businesses still have basic IT weaknesses

There is a misalignment between data privacy regulation spending and business outcomes, according to Tanium research. Specifically, while businesses spend tens of millions on compliance, over 90 percent have fundamental IT weaknesses that leave them vulnerable and potentially non-compliant.


The global study of 750 IT decision makers revealed that organizations have spent on average $70.3 million each to comply with the GDPR, the CCPA, and other data privacy regulations over the past year.

Most businesses have hired new talent (81 percent), invested in workforce training (85 percent) and introduced new software or services (82 percent) to ensure continued compliance.

In addition, 87 percent of organizations have set aside or increased their cyber liability insurance by an average of $185 million each, to deal with the potential consequences of a data breach.

However, despite this increased investment, businesses still feel unprepared to deal with the evolving regulatory landscape, with over a third (37 percent) claiming that a lack of visibility and control of endpoints is the biggest barrier to maintaining compliance with regulations such as GDPR.

Increased spending not solving visibility challenges

This lack of visibility into and control over endpoints such as laptops, servers, virtual machines, containers and cloud infrastructure causes major challenges. In fact, the study revealed major visibility gaps in the IT environment of most organizations even prior to the pandemic.

Ninety-four percent of IT decision makers have discovered unknown endpoints within their IT environment, and 71 percent of CIOs said they find new endpoints on a weekly basis.

Mass home working and employee use of personal devices is likely to exacerbate these problems, expanding the corporate attack surface. When compliance relies on understanding what tools you use, what endpoints you have and what data you hold across the entire organization, these visibility gaps are dangerous.

Chris Hodson, CISO at Tanium said, “While it’s encouraging to see global businesses investing to stay on the right side of data privacy regulations, our research suggests that their good work could be undermined by inattention to basic IT principles.

“Many organizations seem to have fallen into the trap of thinking that spending a considerable amount of money on GDPR and CCPA is enough to ensure compliance. Yet without true visibility and control of their IT assets, they’re leaving a backdoor open to malicious actors.”

What is causing visibility gaps?

The majority (91 percent) of respondents acknowledged fundamental weak points within their organizations that are preventing a comprehensive view of their IT estate.

These visibility gaps are being caused by a lack of unity between IT, operations and security teams (39 percent), a lack of resources to effectively manage their IT estate (31 percent), legacy systems which don’t give them accurate information (31 percent), shadow IT (29 percent) and too many tools used across their business (29 percent).

The research found that firms have implemented an average of 43 separate security and operations tools to manage their IT environments. Tool sprawl like this further limits the effectiveness of siloed and distributed teams, adding unnecessary complexity.

Tech leaders are concerned about the consequences

In the study, IT leaders cited concerns that limited visibility of endpoints could leave their company more vulnerable to cyberattacks (53 percent), damage the brand reputation (39 percent), make risk assessments harder (33 percent), impact customer churn (31 percent) and lead to non-compliance fines (23 percent).

Respondents also revealed a false sense of confidence when it came to compliance readiness. Ninety percent of IT decision makers said they were confident of being able to report all required breach information to regulators within 72 hours. But with nearly half (48 percent) reporting they have challenges in getting visibility into devices on their network, this confidence appears to be misplaced — a single missed endpoint could be a compliance violation waiting to happen.

Chris Hodson, CISO at Tanium concluded: “GDPR and CCPA represent the beginning of a complex new era of rigorous data privacy regulations. Although some regulators have postponed large fines due to the current pandemic, it doesn’t defer the requirement for companies to ensure personal information is stored and processed using the strictest safeguards.

“Technology leaders need to focus on the fundamentals of unified endpoint management and security to drive rapid incident response and improved decision making. The first step must be gaining real-time visibility of these endpoints, which is a crucial prerequisite to improved IT hygiene, effective risk management, and regulatory compliance. With most teams working from home these days and many having to use their own devices, this has never been more important.”

GDPR, CCPA and beyond: How synthetic data can reduce the scope of stringent regulations

As many organizations are still discovering, compliance is complicated. Stringent regulations, like the GDPR and the CCPA, require multiple steps from numerous departments within an enterprise in order to achieve and maintain compliance. From understanding the regulations, implementing technologies that satisfy legal requirements, hiring qualified staff and training, to documentation updating and reporting – ongoing compliance can be costly and time intensive.


In fact, a report found that one-third of all enterprises (defined as businesses with 1000+ employees) spent more than $1 million on GDPR compliance alone.

As more states move to adopt GDPR-like regulations, such as California’s CCPA and Washington’s failed, but not forgotten Washington Privacy Act (WPA) legislation, organizations are having to look very closely at their data sets and make critical decisions to ensure compliance and data security.

But what can be done to minimize the scope of these stringent and wide-reaching regulations?

If an organization can identify all of its personal data and take it out of the data security and compliance equation completely – rendering it useless to hackers, insider threats, and regulation scope – it can eliminate a huge amount of risk and drastically reduce the cost of compliance.

Enter synthetic data

Organizations like financial institutions and hospitals handle large quantities of extremely sensitive credit/debit card and personally identifiable information (PII). As such, they must navigate a very stringent set of compliance protocols – they can fall under the GDPR, CCPA, PCI DSS and additional laws and regulations depending on their location and the location of their customers.

Synthetic data is helping highly regulated companies safely use customer data to increase efficiencies or reduce operational costs, without falling under the scope of stringent regulations.

Synthetic data makes this possible by removing identifiable characteristics of the institution, customer and transaction to create what is called a synthetic data set. Personally identifiable information is rendered unrecognizable by a one-way hash process that cannot be reversed. A cutting-edge data engine makes minor and random field changes to the original data, keeping the consumer identity and transaction associated with that consumer completely protected.

Once the data is synthesized, it’s impossible for a hacker or malicious insider to reverse-engineer it, making the threat of a data breach a non-issue for even the largest enterprises. Importantly, the synthetic data set retains all the statistical value of the original, so analysis and other data strategies – such as feeding AI algorithms or targeted marketing – can still be conducted safely.
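The passage above describes the technique only at a high level, so the following is a toy sketch of the general idea rather than any vendor’s actual engine: identifiers go through a salted one-way hash and numeric fields receive small random perturbations, leaving a record that is no longer attributable to a person while keeping roughly the same statistical shape. All field names, the salt and the noise level are hypothetical assumptions.

```python
# Toy sketch of synthesizing a transaction record: one-way hash the
# identifiers and randomly perturb the numeric fields. Field names, the
# salt and the jitter level are assumptions; real synthetic-data engines
# are considerably more sophisticated.
import hashlib
import random

SALT = b"rotate-me-regularly"  # hypothetical secret salt

def pseudonymize(value: str) -> str:
    """One-way hash an identifier so it can't be mapped back to the customer."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:16]

def synthesize(txn: dict, jitter: float = 0.02) -> dict:
    """Return a synthetic copy of a transaction: identities removed, numeric
    fields slightly perturbed so their distribution is roughly preserved."""
    return {
        "customer_id": pseudonymize(txn["customer_id"]),
        "merchant": pseudonymize(txn["merchant"]),
        "amount": round(txn["amount"] * random.uniform(1 - jitter, 1 + jitter), 2),
        "timestamp": txn["timestamp"],  # non-identifying fields can pass through
    }

original = {"customer_id": "C-10293", "merchant": "ACME Corp",
            "amount": 42.50, "timestamp": "2020-06-01T10:15:00Z"}
print(synthesize(original))
```

Because the hash is salted and one-way, the synthetic record can be analyzed, aggregated or fed to models without exposing the underlying customer.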

What do the major data privacy regulations say about synthetic data?

The CCPA does not expressly reference synthetic data, but it expressly excludes de-identified data from most of the CCPA’s requirements in cases where the requisite safeguards are in place. Synthesized data as defined is considered de-identified data. The CCPA also excludes from its coverage personal information subject to several federal privacy laws and comparable California state laws, including “personal information collected, processed, sold, or disclosed pursuant to Gramm-Leach-Bliley Act (GLBA) and the California Financial Information Privacy Act.”

Likewise, the GDPR does not expressly reference synthetic data, but it expressly says that it does not apply to anonymous information: according to UCL, “information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.” Synthetic data is considered personal data which has been rendered anonymous and therefore falls outside the material scope of the GDPR.

Essentially, these important global regulatory mandates do not apply to collection, storage and use of synthesized data.

A big solution for big struggles

As businesses continue to grow in size and number of customers, the amount and frequency of data that flows in also increases dramatically. With these vast streams of data comes a struggle to collect, store and use customer data in a private and secure manner. This struggle is also becoming more publicly known, as headlines of data breaches or compliance violations flood news feeds seemingly every week.

To effectively and efficiently manage the influx of sensitive data while staying compliant and secure, companies can implement synthetic data in their environments with zero risk. They can use synthetic data to dig into customer action likelihood, analytics, customer segmentation for marketing, fraud detection trends and more, without jeopardizing compliance or data privacy.

And with data being the key to actualizing machine learning and artificial intelligence engines, companies can also utilize synthetic data to gain valuable insights into their algorithm data and design new products, reduce operational costs, and analyze new business endeavors while keeping customer privacy intact.

With the GDPR and the CCPA now in full effect and more industry and region-specific data regulations on the horizon, organizations of all sizes that want to reduce the burden of compliance will look to use synthetic data technology to manage their privacy and data security-related legal obligations.

Synthetic data helps organizations in highly regulated industries put customer data security and privacy first and keep their data operations frictionless and optimized while minimizing the scope of compliance. The more organizations that adopt synthetic data, the safer personal information transactions become, and the more organizations are free to conduct business without having to worry about regulation.

Organizations still struggle to manage foundational security

Regulatory measures such as the GDPR put the focus on data privacy by design, tightening requirements and guiding IT security controls like Public Key Infrastructure (PKI).


Continued adoption of IoT, cloud and mobile technologies is increasing the number of digital certificates and keys that ensure secure connections and identity authentication through PKI, Keyfactor and Ponemon Institute research reveals.

“This research demonstrates that despite heightened compliance focus, businesses struggle to manage foundational security like PKI and the tools and processes that maintain it. This is concerning, especially as the number of digital certificates and keys within enterprise continues to multiply,” said Chris Hickman, CSO at Keyfactor.

Regulatory compliance a strategic priority

Half of respondents cite regulatory compliance as a strategic priority, and two-thirds say their organization is adding additional layers of encryption to comply with regulations and IT policies.

However, undocumented or unenforced key management policies are problematic, with respondents averaging more than four failed audits or compliance experiences in the last 24 months.

“Less than half of respondents say they have sufficient staff dedicated to PKI,” said Hickman.

“A lack of program ownership, combined with the constant care and feeding that digital identities need, has introduced new risk, creating an exposure epidemic. Unless leaders invest in in-house processes and outsourced resources to manage PKI, enterprise will risk failed audits, fines and worse, a security breach.”


Foundational security: Additional findings

  • A rise in security incidents: on average, organizations experienced a Certificate Authority (CA) or rogue man-in-the-middle (MITM) and/or phishing attack four times in the last 24 months, facing a 32% likelihood of a MITM or phishing attack over the next 24 months.
  • Staffing shortages: on average, 15% of IT security budget is spent on PKI deployment annually, yet just 43% of respondents say their organisation has enough IT security staff members dedicated to PKI deployment.
  • Lack of visibility: 70% of respondents say their organisation does not know how many digital certificates and keys it has within the business.
  • Cryptography related security incidents undermine trust: 68% of respondents say failure to secure keys and certificates undermines the trust their organisation relies upon to operate.
  • Cryptography lacks a center of excellence: despite the rising cost of PKI and growth of cryptography-related incidents, just 40% of companies have the ability to drive enterprise-wide best practice.
  • Spending trend: represented organizations are spending an average of £9.37M on IT security annually, with £1.37M dedicated to PKI.

How financial services firms are handling data privacy

One-third of financial services organizations lack a clear plan or the resources to address privacy risks related to customer data in the next 12 months, according to a report by Accenture.


The report is based on a survey of 100 privacy executives in the banking, insurance and capital markets sectors in North America and Europe. It focuses on how companies should rethink how they use, store and protect customer data as recently implemented regulations, including the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA), give consumers explicit privacy rights.

An increased need for a clear privacy strategy

According to the report, seven in 10 respondents (70%) see privacy as a key risk for their firms, increasing the need for a clear privacy strategy. Noting that nearly three-quarters (72%) of respondents’ companies use consent to tailor customer-facing products and services, the report suggests that financial services firms incorporate privacy into the overall customer journey by giving customers more control over their data and deleting personal information upon request.

“Given the renewed regulatory focus and threat of significant financial fines, it’s not surprising that financial services firms are making privacy a top priority,” said Ben Shorten, a managing director in Accenture’s Strategy & Consulting group.

“But these institutions should think beyond the compliance risks and consider the broader opportunity to elevate the customer experience around privacy. Consumers are willing to share information if there’s value in it for them, whether personalized offers, better services or more competitive pricing. Firms that understand how customers perceive and value data privacy have a clear opportunity to differentiate themselves.”

Building consumer trust

When asked which privacy risks will require the most effort to remediate over the next year, respondents most often cited privacy risk monitoring (51%), the accuracy and maintenance of records processing/ information asset registers (44%), and records management and data retention/deletion (41%).

These risks are heightened by the “right to erasure” requests under GDPR and CCPA, which empower consumers to ask companies to delete their personal data upon request, making proper records management critical. One way that firms can achieve this, according to the report, is by using automated tools to aid with data discovery.

The report notes that while three-fourths (76%) of respondents plan to increase their privacy investments over the next year, companies without a clear privacy strategy could fail to reap the expected value from these investments — while those that create clear strategies and infuse a culture of privacy awareness across their organizations will differentiate themselves and build consumer trust.

In addition, as firms increasingly focus on demonstrating ethical and responsible use of data in their artificial intelligence and machine-learning algorithms, a new class of privacy risks related to data ethics could emerge.

This presents another opportunity for firms to build consumer trust by providing greater transparency around automated decisioning models and introducing ethical guide rails for the use of personal data.

More than 40% of privacy compliance technology will rely on AI by 2023

Over 40% of privacy compliance technology will rely on artificial intelligence (AI) by 2023, up from 5% today, according to Gartner.


The research was conducted online among 698 respondents in Brazil, Germany, India, the U.S. and the U.K.

“Privacy laws, such as General Data Protection Regulation (GDPR), presented a compelling business case for privacy compliance and inspired many other jurisdictions worldwide to follow,” said Bart Willemsen, research vice president at Gartner.

“More than 60 jurisdictions around the world have proposed or are drafting postmodern privacy and data protection laws as a result. Canada, for example, is looking to modernize their Personal Information Protection and Electronic Documents Act (PIPEDA), in part to maintain the adequacy standing with the EU post-GDPR.”

Privacy leaders are under pressure to ensure that all personal data processed is brought in scope and under control, which is difficult and expensive to manage without the aid of technology. This is where AI-powered applications that reduce administrative burdens and manual workloads come in.

AI-powered privacy technology lessens compliance headaches

At the forefront of a positive privacy user experience (UX) is the ability of an organization to promptly handle subject rights requests (SRRs). SRRs cover a defined set of rights, where individuals have the power to make requests regarding their data and organizations must respond to them in a defined time frame.

According to the survey, many organizations are not capable of delivering swift and precise answers to the SRRs they receive. Two-thirds of respondents indicated it takes them two or more weeks to respond to a single SRR. These workflows are also often handled manually, at an average cost of roughly $1,400 USD per request – a figure that piles up over time.
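How quickly that figure piles up is easy to show with back-of-the-envelope arithmetic; the roughly $1,400 per-request cost comes from the survey, while the monthly request volume below is a purely hypothetical assumption.

```python
# Illustrative arithmetic: only the ~$1,400 per-request figure is from the
# survey; the request volume is a hypothetical assumption.
cost_per_srr = 1_400          # average manual handling cost per SRR, USD
requests_per_month = 200      # hypothetical volume for a mid-sized firm

annual_cost = cost_per_srr * requests_per_month * 12
print(f"${annual_cost:,}")    # $3,360,000 per year at that volume
```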

“The speed and consistency by which AI-powered tools can help address large volumes of SRRs not only saves an organization excessive spend, but also repairs customer trust,” said Mr. Willemsen. “With the loss of customers serving as privacy leaders’ second highest concern, such tools will ensure that their privacy demands are met.”

Global privacy spending on compliance tooling will rise to $8 billion through 2022

Through 2022, privacy-driven spending on compliance tooling will rise to $8 billion worldwide. Privacy spending is expected to impact connected stakeholders’ purchasing strategies, including those of CIOs, CDOs and CMOs. “Today’s post-GDPR era demands a wide array of technological capabilities, well beyond the standard Excel sheets of the past,” said Mr. Willemsen.

“The privacy-driven technology market is still emerging,” said Mr. Willemsen.

“What is certain is that privacy, as a conscious and deliberate discipline, will play a considerable role in how and why vendors develop their products. As AI turbocharges privacy readiness by assisting organizations in areas like SRR management and data discovery, we’ll start to see more AI capabilities offered by service providers.”

Employees aware of privacy risks, but unsure of how they affect the workplace

62 percent of employees are unsure if their organization has to comply with the recently-enacted CCPA, which gives California residents enhanced consumer data privacy rights, according to a survey of more than 1,000 employees conducted by Osterman Research.


Results reveal a similar lack of awareness regarding the GDPR, in effect since 2018.

Employee cybersecurity and privacy engagement

The findings reveal progress in cybersecurity awareness. However, many respondents continue to hold false impressions about malware, phishing, and cloud file-sharing, putting their personal and employers’ data at risk.

“The benefits and rewards of digital technology are many, but so are the risks. As states race to address cybersecurity and data privacy risks with new compliance measures, businesses are under more pressure than ever to educate their employees, or prepare to face increasingly negative outcomes,” MediaPRO Chief Strategist Lisa Plaggemier said.

“To adequately protect consumer data, companies must quickly transform employees from bystanders into security advocates, and that begins with awareness programs that engage employees and reinforce behaviors that align with security and compliance goals.”

The survey assessed employee engagement with and understanding of good cybersecurity and privacy practices (or lack thereof) across multiple risk areas. Overall results show more than 50 percent of respondents fall within the “vulnerable” side of the spectrum regarding their reported practices and attitudes.

“The survey revealed a number of key issues that decision makers should address right away,” said Michael Osterman, Principal Analyst of Osterman Research. “Among them is the need for more and better security awareness training, and improving employees’ perception of their role as a key line of defense for both security and privacy compliance.”

Confidence and security awareness remain lacking

Awareness of seemingly basic cybersecurity threats and best practices remains insufficient among many employees, putting them and their organizations at risk. More than a quarter admitted struggling to identify a phishing email, while just 17 percent felt “very confident” they could identify a social engineering attack.

Only 27 percent of employees can identify at least two warning signs that malware has infected their computing platform, and two in five employees are unable to describe to senior management the negative impacts posed by cybersecurity risks.

Misinformation and misconceptions abound

Cybersecurity awareness requires the ability to correctly distinguish cybersecurity fact from fiction, yet many employees have distorted ideas. For instance, one in seven employees believe that – much like the flu passes among people – malware can spread among devices in close physical proximity.

A full 39 percent of employees mistakenly believe that simply leaving their computer unlocked can also result in a malware infection.

Privacy regulations remain challenging

Many employees require a better understanding of the privacy regulations and guidelines impacting their organizations, and the requisite steps to protect data.

A majority of employees (more than 60 percent) don’t know if their organization needs to comply with most privacy rules and data protection guidelines such as the CCPA, PCI DSS, and GDPR.

In fact, nearly three in five employees (58 percent) don’t believe storing sensitive data in an unsecured location could pose a potential policy violation, and even more (69 percent) say the same about storing it on their desktop/laptop computers or mobile devices.

Social media and file-sharing security awareness is high

The majority of employees (more than 50 percent) understand that oversharing on social media is a bad idea, as it can give cybercriminals the information and opportunity to craft more targeted attacks.

More than half of employees understand using personal webmail for work purposes poses a risk to their organization, and 90 percent recognize the risk associated with using personally managed file-sharing or similar cloud solutions for work purposes.

Employees possess password savvy

The majority of employees are mindful of password best practices, with 52 percent using a unique password for every device and application. When working from home, 61 percent of employees agree it’s important to change their router’s default password before accessing corporate data or email.

Urgency of updates is understood

Software updates serve an important role in protecting devices from viruses and malware, and ensuring security holes are quickly patched before cyber thieves can exploit them.

The vast majority of employees (84 percent) understand that regularly installing software upgrades helps protect against cybersecurity threats and prevent security breaches.

“Safely navigating the digital world remains confusing for many. Add to that an ever changing roster of seemingly byzantine rules and regulations and the effort can seem almost insurmountable,” said Tom Pendergast, Chief Learning Officer at MediaPRO.

“This survey shows we still have a long way to go toward resolving employee clarity and consistency on cybersecurity and data privacy obligations and best practices; however, we’re encouraged that many of our respondents appear to be on the right track in putting their cybersecurity knowledge into action day-to-day.”

What the government infosec landscape will look like this year

The information security landscape seems to evolve at a faster clip each year. The deluge of ever-changing threats, attack techniques and new breaches making headlines can be challenging to track and assess. That’s why each year the WatchGuard Threat Lab takes a step back to assess the world of cyber security and develop a series of predictions for what emerging trends will have the biggest impact.


Following the worldwide controversy over hacking that influenced the 2016 presidential election and the many widely publicized privacy and security incidents that have taken place since, we believe the government information security sphere is the stage upon which we’ll see two major security developments play out in 2020.

The first is that bad actors will target voter registration systems with the intent to generate voting havoc and trigger voter fraud alerts. The second is that we’ll see multiple states enact privacy regulations inspired by GDPR and the CCPA. Let’s take a look at how these two issues will unfold in 2020 and what you need to know to be prepared.

Impending voter registration systems hacks

Security researchers have proven many times over that voting machines are hackable, but most of them don’t expect threat actors to expend the vast amount of time and resources needed to successfully hack the 2020 presidential election voting results directly. Instead, these online adversaries will use subtler tactics in the coming months to tamper with the voting process at the state and local level.

The culprits behind previous election-related attacks are state-sponsored actors that are happy to execute highly effective, politically motivated misinformation campaigns across social media platforms, but appear to draw the line at actually altering the voting results themselves. In 2020, they’ll seek to build on the success they achieved in 2016. We believe they will target US voter registration systems to make it more difficult for legitimate voters to cast their ballot and attempt to cause widespread mistrust in the validity of vote counts. Indirectly influencing the election by creating confusion, fear, uncertainty and doubt will be their MO.

What can we do about it? For state and local government departments managing voter registration systems, it will be important to perform security audits and to find and fix potential vulnerabilities before the bad guys have a chance to exploit them.

While there’s not a tremendous amount the average voter can do to ward off election hacking attempts by state-sponsored cyber criminals, there are some basic things you should keep in mind to make sure your voice is heard on election day. First, double-check the status of your voter registration at least a week before the election. Monitor the news for any updates about voter registration database hacks leading up to the election and be sure to contact your local state voter authority if you’re concerned. Lastly, bring a printed confirmation of your completed voter registration and multiple forms of ID on election day (just in case).

An upsurge in state-level privacy legislation

The European Union made a global splash when it implemented the GDPR. Designed to provide better privacy for its citizens’ data (regardless of the location of the organizations with access to it), the historic law was initially met with cynicism and uncertainty (and even panic in some cases) due to its stringent criteria and heavy penalties for noncompliance.

That said, since its inception, the level of privacy the law provides for individuals has been well-received. People welcome the comfort of knowing that organizations are finally being incentivized to protect their privacy and held accountable for mishandling their data. It goes a long way to inspire confidence in the public when organizations like Google and Marriott are fined millions of euros for GDPR violations.

Massive organizations like Facebook continue to neglect their obligation to safeguard user data, and America’s appetite for privacy seems to be growing with each passing data breach and scandal involving the sale of user data. That’s why in 2020 you should expect to see 10 or more states enact privacy laws similar to the GDPR.

In fact, California has already passed its own CCPA and will begin rolling out fines for violations by mid-year. Given that most states passed mandatory data breach disclosure laws in the mid-2000s and lawmakers still haven’t been able to pass a federal version to date, it’s unlikely that the movement to enact a federal privacy law will gain enough steam to pass in the near term. That said, the rising public outcry for data privacy makes it highly likely that individual states will take it upon themselves to follow in California’s footsteps and pass privacy acts of their own.

This momentum will grow in 2020, so it will be critical for businesses across the country to carefully study the CCPA requirements and prepare to make adjustments. Other states will use the CCPA as a reference point for developing similar regulations of their own. If you’re concerned with your own personal data privacy, contact your local representatives to push for state-level legislation and federal action as well.

The road ahead

The changing conditions within the government information security landscape impact every American business and individual in one way or another. We simply can’t afford to be ignorant or apathetic when it comes to matters of public privacy and security.

Whether it be state-sponsored attempts to interfere with the next election, emerging security and privacy regulations, or some other development, we should all strive to become more informed about and engaged in these issues.

How CISOs can justify cybersecurity purchases

Sometimes a disaster strikes: ransomware encrypts critical files, adversaries steal sensitive data, a business application is compromised with a backdoor… This is the stuff that CISOs’ nightmares are made of. As devastating as such incidents can be, for the short time after they occur, the enterprise usually empowers the CISO to implement security measures that he or she didn’t get funding for earlier.

Of course, waiting for disastrous events is a reckless and unproductive way to fund cybersecurity purchases. How can you make a proactive business case for justifying expenses that advance your security program? I have a few suggestions based on my prior consulting experience and my recent work as a CISO at a cybersecurity firm.

Security practitioners used to point to the need for defense-in-depth when explaining why the organization should fund yet another cybersecurity measure. Unfortunately, this principle alone doesn’t clarify how many layers are sufficient. Without business-relevant details and the right context, the people reviewing your request won’t understand its necessity and significance to the organization.

The request itself: What details to include?

You might know why the organization needs a given security measure, but how do you relay its significance to others? At the very least, your funding request should cover:

  • Risk: How does the measure mitigate or otherwise address a meaningful risk? Explain the relationship between this risk and the organization’s business objectives. Clarify what might happen if you don’t address the risk and how likely this is to happen.
  • Cost: How much will the security measure cost? Include upfront and ongoing expenses. Account for the fees you’ll pay to third parties (software as well as infrastructure) and internal costs related to people’s time. Discuss the costs of alternative ways of addressing the risk.
  • Context: What role does your request play as part of the organization’s other initiatives and priorities? Also, discuss how other companies similar to yours handle such risks. Describe the way in which the risk fits into the current threat landscape that’s relevant to your organization.

The details above are essential, but they are not sufficient. The decision makers also need to understand that this is not merely a one-off request, but that it’s a part of a reasonable plan to strengthen the company’s security programs. This is where modern frameworks can help.

Your security program: A method to the madness

If you’re just starting a cybersecurity program, a good way to pick minimum security measures is CIS Critical Controls. This list and the accompanying guide provide practical consensus-based recommendations. If any of these controls are missing from your company, you can point to CIS Critical Controls to justify your request to fund the corresponding initiative. If you’re at a young tech company, consider as another reference the Security4Startups Controls Checklist, which was created by a group of experienced security professionals.

When requesting funding for security projects in organizations that require more sophistication than the lists above offer, take a close look at the NIST Cybersecurity Framework (CSF). It provides a comprehensive listing of security measures that enterprises should implement and has gained traction among government and commercial organizations in the US and world-wide.

Another reference to consider when deciding what security measures your enterprise needs is the Cybersecurity Defense Matrix, created by Sounil Yu. It offers a convenient way to understand the role your various security tools play and helps identify portfolio gaps. The matrix uses CSF categories to classify cybersecurity controls and prompts you to assess their capabilities with respect to your devices, applications, networks, data, and users, which makes it handy for identifying areas that have too many or too few security measures.
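As a rough illustration of how such a matrix-style review might be automated, the sketch below tabulates which CSF function/asset-class cells a portfolio of tools covers and lists the cells left empty; the tool names and their coverage are hypothetical assumptions for illustration, not drawn from the matrix itself.

```python
# Minimal sketch of a matrix-style gap check: map each tool to the NIST CSF
# functions and asset classes it covers, then list the uncovered cells.
# Tool names and coverage are hypothetical assumptions.
from itertools import product

FUNCTIONS = ["Identify", "Protect", "Detect", "Respond", "Recover"]
ASSETS = ["Devices", "Applications", "Networks", "Data", "Users"]

# (function, asset) cells each hypothetical tool covers
portfolio = {
    "EDR agent":      [("Detect", "Devices"), ("Respond", "Devices")],
    "CMDB":           [("Identify", "Devices"), ("Identify", "Applications")],
    "DLP":            [("Protect", "Data"), ("Detect", "Data")],
    "IAM platform":   [("Identify", "Users"), ("Protect", "Users")],
    "Backup service": [("Recover", "Data")],
}

covered = {cell for cells in portfolio.values() for cell in cells}
gaps = [cell for cell in product(FUNCTIONS, ASSETS) if cell not in covered]

print(f"{len(gaps)} uncovered cells, e.g.: {gaps[:5]}")
```

A view like this makes it easier to tie a specific funding request to a specific empty (or overcrowded) cell, which is exactly the kind of context decision makers respond to.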

Additional justifications: Legal and privacy considerations

If you need additional ammunition to justify must-have cybersecurity measures, your company’s attorneys might help. Get their guidance regarding picking the baseline controls you must have to exercise due care and avoid negligence. Work with them to understand the relevant laws and regulations. Don’t forget to consider privacy obligations, such as CCPA and GDPR. Ask whether CIS Critical Controls or another framework provides a reasonable starting point.

Speaking of CCPA and GDPR… When explaining how your funding request is a part of a larger plan that benefits the organization, look at the NIST Privacy Framework. This methodology (and others like it) is especially relevant to organizations formalizing their privacy program. Though the scope of a privacy program goes beyond cybersecurity, there is a substantial overlap between the two worlds. You can strengthen the case for your security measure if it addresses cybersecurity as well as privacy risks.

The various frameworks above help you to explain how your security measure – and the associated funding request – fits into your broader plans for securing the organization. Discussing your request as part of the overarching plan explains how this request contributes toward the evolution of your cybersecurity program. It also prepares the organization for the subsequent requests that you will need to submit later.

Privacy ROI: Benefits from data privacy averaging 2.7 times the investment

Customer demands for increased data protection and privacy, the ongoing threat of data breaches and misuse by both unauthorized and authorized users, and preparation for the GDPR and similar laws around the globe spurred many organizations to make considerable privacy investments – which are now delivering strong returns, Cisco reveals.


The study is based on results from a double-blind survey of over 2,800 security professionals in organizations of various sizes across 13 countries.

Privacy ROI: Organizations experiencing positive returns

Organizations, on average, receive benefits 2.7 times their investment, and more than 40 percent are seeing benefits that are at least twice their privacy spend. Privacy ROI is real; it’s time for organizations to realize the benefits.

Operational and competitive advantages

Up from 40 percent last year, over 70 percent of organizations now say they receive significant business benefits from privacy efforts beyond compliance, including better agility, increased competitive advantage and improved attractiveness to investors, and greater customer trust.

Higher accountability translates to increased benefits

Companies with higher accountability scores (as assessed using the Centre for Information Policy Leadership’s Accountability Wheel, a framework for managing and assessing organizational maturity) experience lower breach costs, shorter sales delays, and higher financial returns.

82% of organizations see privacy certifications as a buying factor

Privacy certifications such as the ISO 27701, EU/Swiss-US Privacy Shield, and APEC Cross Border Privacy Rules system are becoming an important buying factor when selecting a third-party vendor. India and Brazil topped the list with 95 percent of respondents agreeing external certifications are now an important factor.


As markets continue to evolve, organizations should consider prioritizing their privacy investments on:

  • Improving transparency about processing activities – be up front and clear about what you are doing with data and why
  • Obtaining external privacy certifications – ISO, Shield, CBPRs and BCRs have all become important factors in the buying process by streamlining vendor due diligence
  • Going beyond the legal bare minimum – privacy is a business imperative and most organizations are seeing very positive returns on their spend
  • Building strong organizational governance and accountability to be able to demonstrate to internal and external stakeholders your privacy program maturity.