Cybersecurity is failing due to ineffective technology

A failing cybersecurity market is contributing to the poor performance of cybersecurity technology, research from Debate Security reveals.


Based on over 100 comprehensive interviews with business and cybersecurity leaders from large enterprises, together with vendors, assessment organizations, government agencies, industry associations and regulators, the research shines a light on why technology vendors are not incentivized to deliver products that are more effective at reducing cyber risk.

The report supports the view that efficacy problems in the cybersecurity market are primarily due to economic issues, not technological ones. The research addresses three key themes and ultimately arrives at a consensus for how to approach a new model.

Cybersecurity technology is not as effective as it should be

90% of participants reported that cybersecurity technology is not as effective as it should be when it comes to protecting organizations from cyber risk. Trust in technology to deliver on its promises is low, and yet when asked how organizations evaluate cybersecurity technology efficacy and performance, there was not a single common definition.

Pressure has been placed on improving people- and process-related issues, but ineffective technology has come to be accepted as normal and, shamefully, inevitable.

The underlying problem is one of economics, not technology

92% of participants reported that there is a breakdown in the market relationship between buyers and vendors, with many seeing deep-seated information asymmetries.

Outside government, few buyers today use detailed, independent cybersecurity efficacy assessment as part of their cybersecurity procurement process, and not even the largest organizations reported having the resources to conduct all the assessments themselves.

As a result, vendors are incentivized to focus on other product features, and on marketing, deprioritizing cybersecurity technology efficacy – one of several classic signs of a “market for lemons”.

Coordinated action between stakeholders only achieved through regulation

Unless buyers demand greater efficacy, regulation may be the only way to address the issue. Overcoming first-mover disadvantages will be critical to fixing the broken cybersecurity technology market.

Many research participants believe that coordinated action between all stakeholders can only be achieved through regulation – though some hold out hope that coordination could be achieved through sectoral associations.

In either case, 70% of respondents feel that independent, transparent assessment of technology would help solve the market breakdown. Setting standards on technology assessment rather than on technology itself could prevent stifling innovation.

Defining cybersecurity technology efficacy

Participants in this research broadly agree that four characteristics are required to comprehensively define cybersecurity technology efficacy.

To be effective, cybersecurity solutions need:

  • the capability to deliver the stated security mission (be fit-for-purpose)
  • the practicality that enterprises need to implement, integrate, operate and maintain them (be fit-for-use)
  • the quality in design and build to avoid vulnerabilities and negative impact
  • provenance in the vendor company, its people and supply chain, such that these do not introduce additional security risk

“In cybersecurity right now, trust doesn’t always sell, and good security doesn’t always sell and isn’t always easy to buy. That’s a real problem,” said Ciaran Martin, advisory board member, Garrison Technology.

“Why we’re in this position is a bit of a mystery. This report helps us understand it. Fixing the problem is harder. But our species has fixed harder problems and we badly need the debate this report calls for, and industry-led action to follow it up.”

“Company boards are well aware that cybersecurity poses potentially existential risk, but are generally not well equipped to provide oversight on matters of technical detail,” said John Cryan, Chairman of Man Group.

“Boards are much better equipped when it comes to the issues of incentives and market dynamics revealed by this research. Even if government regulation proves inevitable, I would encourage business leaders to consider these findings and to determine how, as buyers, corporates can best ensure that cybersecurity solutions offered by the market are fit for purpose.”

“As a technologist and developer of cybersecurity products, I really feel for cybersecurity professionals who are faced with significant challenges when trying to select effective technologies,” said Henry Harrison, CSO of Garrison Technology.

“We see two noticeable differences when selling to our two classes of prospects. For security-sensitive government customers, technology efficacy assessment is central to buying behavior – but we rarely see anything similar when dealing with even the most security-sensitive commercial customers. We take from this study that in many cases this has less to do with differing risk appetites and more to do with structural market issues.”

Data protection predictions for 2021

2020 presented us with many surprises, but the world of data privacy somewhat bucked the trend. Many industry verticals suffered losses, uncertainty and closures, but the protection of individuals and their information continued to truck on.


After many websites simply blocked access unless visitors accepted their cookies (a practice now deemed unlawful), we received clarity on cookies from the European Data Protection Board (EDPB). With the ending of Privacy Shield, we witnessed the cessation of a legal basis for cross-border data transfers.

Severe fines levied for General Data Protection Regulation (GDPR) non-compliance showed organizations that the regulation is far from toothless and that data protection authorities are not easing up just because there is an ongoing global pandemic.

What can we expect in 2021? Undoubtedly, the number of data privacy cases brought before the courts will continue to rise. That’s not necessarily a bad thing: with each case comes additional clarity and precedent on the many areas of the regulation that, to date, remain open to interpretation and conjecture.

The last time I spoke to the UK Information Commissioner’s Office about a technicality surrounding data subject access requests (DSARs) submitted by a representative, I was told I was far from the only person enquiring about it. That alone illustrates some of the ambiguities faced by those responsible for implementing and maintaining compliance.

Of course, this is just the GDPR. There are many other data privacy legislative frameworks to consider. We fully expect 2021 to bring alignment of the ePrivacy Regulation with the GDPR, eradicating the conflict that exists today, particularly around consent, the soft opt-in and the like, where the GDPR is very clear but the current Privacy and Electronic Communications Regulations (PECR) are not quite so much.

These developments are just inside Europe; across the globe, we’re seeing the continued growth of data localization laws that organizations are mandated to adhere to. In the US, the California Consumer Privacy Act (CCPA) has kickstarted a swathe of data privacy reforms in many states, with many calls for something similar at the federal level.

The coming year(s) will see that build and, much as with the GDPR, precedent-setting cases are needed to provide more clarity on the rules. Will Americans look to replace the shattered Privacy Shield framework, or will they adopt Standard Contractual Clauses (SCCs) more widely? SCCs are a very strong legal basis, provided the clauses are updated to align with the GDPR (something else we’d expect to see in 2021), and I suspect the US will take this road as the realization of the importance of trade with the EU grows.

Other noteworthy movements in data protection law are happening in Russia, where amendments to the Federal Law on Personal Data take a closer look at TLS as a protective measure, and in the Philippines, where the Data Privacy Act of 2012 is set to be replaced by a new bill (currently a work in progress, but it’s coming).

One of the biggest events of 2021 will be the UK leaving the EU. The British implementation of the GDPR comes in the form of the Data Protection Act 2018. Aside from a few derogations, it’s the GDPR, and that’s great… as far as it goes. Having strong local data privacy laws is good, but after enjoying 47 years (at the time of writing) of free movement within the Union, how will being outside of the EU impact British business?

It is thought, and hoped, that the UK will be granted an adequacy decision fairly swiftly, given that local UK laws have historically aligned with those inside the Union, but there is no guarantee. The uncertainty around how data transfers will look in future might push British industry toward wider use of SCCs. Currently low-priority plans to make Binding Corporate Rules (BCRs) easier and more affordable will come sharply to the fore as demand for them goes up.

One thing is certain, it’s going to be a fascinating year for data privacy and we are excited to see clearer definitions, increased certification, precedent-setting case law and whatever else unfolds as we continue to navigate a journey of governance, compliance and security.

How tech trends and risks shape organizations’ data protection strategy

Trustwave released a report which depicts how technology trends, compromise risks and regulations are shaping how organizations’ data is stored and protected.


The report is based on a recent survey of 966 full-time IT professionals who are cybersecurity decision makers or security influencers within their organizations.

Over 75% of respondents work in organizations with over 500 employees in key geographic regions including the U.S., U.K., Australia and Singapore.

“Data drives the global economy yet protecting databases, where the most critical data resides, remains one of the least focused-on areas in cybersecurity,” said Arthur Wong, CEO at Trustwave.

“Our findings illustrate that organizations are under enormous pressure to secure data as workloads migrate off-premises, attacks on cloud services increase and ransomware evolves. Gaining complete visibility of data, whether at rest or in motion, and eliminating threats as they occur are top cybersecurity challenges across all industries.”

More sensitive data moving to the cloud

The types of data organizations are moving into the cloud have become increasingly sensitive, so a solid data protection strategy is crucial. Ninety-six percent of respondents plan to move sensitive data to the cloud over the next two years, with 52% planning to include highly sensitive data; Australia, at 57%, leads the regions surveyed.

Not surprisingly, when asked to rate the importance of securing data regarding digital transformation initiatives, an average score of 4.6 out of a possible high of five was tallied.

Hybrid cloud model driving digital transformation and data storage

Of those surveyed, most organizations (55%) use both on-premises and public cloud to store data, with 17% using public cloud only. Singapore organizations use the hybrid cloud model most frequently at 73%, 18 percentage points above the average, while U.S. organizations employ it the least at 45%.

Government respondents are the most likely to store data on-premises only, at 39%, 11 percentage points above the average. Additionally, 48% of respondents stored data using the hybrid cloud model during a recent digital transformation project, with only 29% relying solely on their own databases.

Most organizations use multiple cloud services

Seventy percent of organizations surveyed were found to use between two and four public cloud services and 12% use five or more. At 14%, the U.S. had the most instances of using five or more public cloud services followed by the U.K. at 13%, Australia at 9% and Singapore at 9%. Only 18% of organizations queried use zero or just one public cloud service.

Perceived threats do not match actual incidents

Thirty-eight percent of organizations are most concerned with malware and ransomware, followed by phishing and social engineering at 18%, application threats at 14%, insider threats at 9%, privilege escalation at 7% and misconfiguration attacks at 6%.

Interestingly, when asked about actual threats experienced, phishing and social engineering came in first at 27% followed by malware and ransomware at 25%. The U.K. and Singapore experienced the most phishing and social engineering incidents at 32% and 31% and the U.S. and Australia experienced the most malware and ransomware attacks at 30% and 25%.

Respondents in the government sector had the highest incidence of insider threats at 13%, five percentage points above the average.

Patching practices show room for improvement

A resounding 96% of respondents have patching policies in place; of those, 71% rely on automated patching while 29% patch manually. Overall, 61% of organizations patched within 24 hours and 28% patched within 24 to 48 hours.

The highest percentage patching within a 24-hour window came from Australia at 66% and the U.K. at 61%. Unfortunately, 4% of organizations took a week to over a month to patch.

Reliance on automation driving key security processes

In addition to a high percentage of organizations using automated patching processes, findings show 89% of respondents employ automation to check for overprivileged users or lock down access credentials once an individual has left their job or changed roles.

This finding correlates with the survey’s low level of concern about insider threats and data compromise via privilege escalation. Organizations should not assume, however, that removing a user’s access to applications also removes their access to the underlying databases; often it does not.
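The gap described above can be illustrated with a minimal sketch. All account names and lists here are hypothetical stand-ins for what would come from an identity provider and a database's own account catalog; this is not any specific product's API.

```python
# Hypothetical deprovisioning audit: a user removed from applications may
# still hold a database account. Data below is illustrative only.

def find_orphaned_db_accounts(app_users, db_accounts):
    """Return database accounts that have no matching application user."""
    active = {u.lower() for u in app_users}
    return sorted(a for a in db_accounts if a.lower() not in active)

app_users = ["alice", "bob"]             # still provisioned in apps / IdP
db_accounts = ["alice", "bob", "carol"]  # accounts on the database itself

# "carol" left the company: app access was revoked, but her DB login was not.
print(find_orphaned_db_accounts(app_users, db_accounts))  # ['carol']
```

An automated check of this shape, run on a schedule, is one way to catch database credentials that outlive application-level deprovisioning.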

Data regulations having minor impact on database security strategies

When asked if data regulations such as GDPR and CCPA impacted database security strategies, a surprising 60% of respondents said no.

These findings may suggest a lack of alignment between information technology and other departments, such as legal, responsible for helping ensure stipulations like ‘the right to be forgotten’ are properly enforced to avoid severe penalties.

Small teams with big responsibilities

Of those surveyed, 47% had a security team size of only six to 15 members. Respondents from Singapore had the smallest teams with 47% reporting between one and ten members and the U.S. had the largest teams with 22% reporting team size of 21 or more, 2% higher than the average.

Thirty-two percent of government respondents surprisingly run security operations with teams between just six and ten members.

Global adoption of data and privacy programs still maturing

Privacy and data protection have become critical issues for organizations, transcending legal departments to reach the forefront of strategic priorities.


FairWarning research, based on survey results from more than 550 global privacy and data protection, IT, and compliance professionals, outlines the characteristics and behaviors of advanced privacy and data protection teams.

By examining the trends of privacy adoption and maturity across industries, the research uncovers adjustments that security and privacy leaders need to make to better protect their organization’s data.

The prevalence of data and privacy attacks

Insights from the research reinforce the importance of privacy and data protection as 67% of responding organizations documented at least one privacy incident within the past three years, and over 24% of those experienced 30 or more.

Additionally, 50% of all respondents reported at least one data breach in the last three years, with 10% reporting 30 or more.

Overall immaturity of privacy programs

Despite increased regulations, breaches and privacy incidents, organizations have not rapidly accelerated the advancement of their privacy programs as 44% responded they are in the early stages of adoption and 28% are in middle stages.

Healthcare and software rise to the top

Despite an overall lack of maturity across industries, healthcare and software organizations reflect more maturity in their privacy programs, as compared to insurance, banking, government, consulting services, education institutions and academia.

Harnessing the power of data and privacy programs

Respondents understand the significant benefits of a mature privacy program as organizations experience greater gains across every area measured including: increased employee privacy awareness, mitigating data breaches, greater consumer trust, reduced privacy complaints, quality and innovation, competitive advantage, and operational efficiency.

Of note, more mature companies believe they experience the largest gain in reducing privacy complaints (30.3% higher than early stage respondents).

Attributes and habits of mature privacy and data protection programs

Companies with more mature privacy programs are more likely to have C-Suite privacy and security roles within their organization than those in the mid- to early-stages of privacy program development.

Additionally, 88.2% of advanced stage organizations know where most or all of their personally identifiable information/personal health information is located, compared to 69.5% of early stage respondents.

Importance of automated tools to monitor user activity

Insights reveal a clear distinction between the maturity levels of privacy programs and related benefits of automated tools as 54% of respondents with more mature programs have implemented this type of technology compared with only 28.1% in early stage development.

Automated tools enable organizations to monitor all user activity in applications and efficiently identify anomalous activity that signals a breach or privacy violation.
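One common form such monitoring takes is a statistical baseline check per user. The sketch below is an illustrative assumption about how such a tool might work, not any vendor's actual algorithm; the user names, counts and threshold are invented for the example.

```python
# Illustrative anomaly check: flag users whose latest daily access count
# deviates sharply from their own historical baseline. Threshold and data
# are assumptions, not a specific product's logic.
from statistics import mean, stdev

def flag_anomalies(daily_counts, threshold=3.0):
    """Flag users whose most recent count exceeds mean + threshold*stdev
    of their prior history."""
    flagged = []
    for user, counts in daily_counts.items():
        history, latest = counts[:-1], counts[-1]
        if len(history) < 2:
            continue  # not enough baseline to judge
        mu, sigma = mean(history), stdev(history)
        if latest > mu + threshold * max(sigma, 1.0):  # floor avoids zero-variance noise
            flagged.append(user)
    return flagged

activity = {
    "user_a": [40, 38, 42, 41, 39],    # steady baseline: not flagged
    "user_b": [35, 37, 36, 34, 400],   # sudden spike: possible snooping
}
print(flag_anomalies(activity))  # ['user_b']
```

Real products layer far more context on top (record sensitivity, role, time of day), but the core idea is the same: compare each user's activity against an established baseline and surface the outliers for review.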

“This research revealed a major gap between mature and early stage privacy programs and the benefits they receive,” said Ed Holmes, CEO, FairWarning.

“It is exciting to see healthcare at the top when it comes to privacy maturity. However, as we dig deeper into the data, we find that 37% of respondents with 30 or more breaches are from healthcare, indicating that there is still more work to be done.

“This study highlights useful guidance on steps all organizations can take regardless of industry or size to advance their program and ensure they are at the forefront of privacy and data protection.”

“In today’s fast-paced and increasingly digitized world, organizations, regardless of size or industry, need to prioritize data and privacy protection,” said IAPP President & CEO J. Trevor Hughes.

“As the research has demonstrated, it is imperative that security and privacy professionals recognize the importance of implementing privacy and data protection programs to not only reduce privacy complaints and data breaches, but increase operational efficiency.”

Banks risk losing customers with anti-fraud practices

Many banks across the U.S. and Canada are failing to meet their customers’ online identity fraud and digital banking needs, according to a survey from FICO.


Despite COVID-19 quickly turning online banking into an essential service, the survey found that financial institutions across North America are struggling to establish practices that combat online identity fraud and money laundering, without negatively impacting customer experience.

For example, 51 percent of North American banks are still asking customers to prove their identities by visiting branches or posting documents when opening digital accounts. This also applies to 25 percent of mortgages or home loans and 15 percent of credit cards opened digitally.

“The pandemic has forced industries to fully embrace digital. We now are seeing North American banks that relied on face-to-face interactions to prove customers’ identities rethinking how to adapt to the digital first economy,” said Liz Lasher, vice president of portfolio marketing for Fraud at FICO.

“Today’s consumers expect a seamless and secure online experience, and banks need to be equipped to meet those expectations. Engaging valuable new customers, only to have them abandon their applications when identity proofing proves difficult, is an expensive way to lose business.”

Identity verification process issues

The study found that no more than 16 percent of U.S. and Canadian banks employ the type of fully integrated, real-time digital capture and validation tools required for consumers to securely open a financial account online.

Even when digital methods are used to verify identity, the experience can still create friction, with customers expected to use email or visit an “identity portal” to verify their identities.

Creating a frictionless process is key to meeting consumers’ current expectations. For example, according to a recent Consumer Digital Banking study, while 75 percent of consumers said they would open a financial account online, 23 percent of prospective customers would abandon the process because of an inconsistent identity verification process.

Lack of automation is a problem for banks too

The lack of automation in verifying customers’ identities isn’t just a pain point for customers – 53 percent of banks reported that it is a problem for them too.

Regulation intended to prevent criminal activity such as money laundering typically requires banks to review customer identities in a consistent, robust manner and this is harder to achieve for institutions relying on inconsistent manual resources.

Fortunately, 75 percent of banks in the U.S. and Canada reported plans to invest in an identity management platform within the next three years.

By moving to a more integrated and strategic approach to identity proofing and identity authentication, banks will be able to meet customer expectations and deliver consistently positive digital banking experiences across online channels.

Compliance activities cost organizations $3.5 million annually

Organizations are struggling to keep up with IT security and privacy compliance regulations, according to a Telos survey.


The survey, which polled 300 IT security professionals in July and August 2020, revealed that, on average, organizations must comply with 13 different IT security and/or privacy regulations and spend $3.5 million annually on compliance activities, with compliance audits consuming 58 working days each quarter.

As more regulations come into existence and more organizations migrate their critical systems, applications and infrastructure to the cloud, the risk of non-compliance and associated impact increases.

Key research findings

  • IT security professionals report receiving an average of over 17 audit evidence requests each quarter and spend an average of three working days responding to a single request
  • Over the last 24 months, organizations have been found non-compliant an average of six times by internal and third-party auditors, resulting in an average of eight fines costing an average of $460,000
  • 86 percent of organizations believe compliance would be an issue when moving systems, applications and infrastructure to the cloud
  • 94 percent of organizations report they would face challenges when it comes to IT security compliance and/or privacy regulations in the cloud

Compliance teams are overwhelmed

“Compliance teams spend 232 working days each year responding to audit evidence requests, in addition to the millions of dollars spent on compliance activities and fines,” said Dr. Ed Amoroso, CEO of TAG Cyber. “The bottom line is this level of financial and time commitment is unsustainable in the long run.”

“As hammer, chisel and stone gave way to clipboard, paper and pencil, it’s time for organizations to realize the days of spreadsheets for ‘checkbox compliance’ are woefully outdated,” said Steve Horvath, VP of strategy and cloud at Telos.

“Automation can solve numerous compliance challenges, as the data shows. It’s the only real way to get in front of the curve, rather than continuing to try to keep up.”

Ninety-nine percent of survey respondents indicated their organization would benefit from automating IT security and/or privacy compliance activities, citing expected benefits such as increased accuracy of evidence (54 percent), reduced time spent being audited (51 percent) and the ability to respond to audit evidence requests more quickly (50 percent).

CPRA: More opportunity than threat for employers

Increasingly demanded by consumers, data privacy laws can create onerous burdens on even the most well-meaning businesses. California presents plenty of evidence to back up this statement, as more than half of organizations that do business in California still aren’t compliant with the California Consumer Privacy Act (CCPA), which went into effect earlier this year.


As companies struggle with their existing compliance requirements, many fear that a new privacy ballot initiative – the California Privacy Rights Act (CPRA) – could complicate matters further. While it’s true that if passed this November, the CPRA would fundamentally change the way businesses in California handle both customer and employee data, companies shouldn’t panic. In fact, this law presents an opportunity for organizations to change their relationship with employee data to their benefit.

CPRA, the Californian GDPR?

Set to appear on the November 2020 ballot, the CPRA, also known as CCPA 2.0 or Prop 24 (its name on the ballot), builds on what is already the most comprehensive data protection law in the US. In essence, the CPRA will bring data protection in California nearer to the current European legal standard, the General Data Protection Regulation (GDPR).

In the process of “getting closer to GDPR,” the CCPA would gain substantial new components. Besides enhancing consumer rights, the CPRA also creates new provisions for employee data as it relates to their employers, as well as data that businesses collect from B2B business partners.

Although controversial, the CPRA is likely to pass. August polling shows that more than 80% of voters support the measure. However, many businesses do not. This is because, at first glance, the CPRA appears to create all kinds of legal complexities in how employers can and cannot collect information from workers.

Fearful of having to meet the same demanding requirements as their European counterparts, many organizations’ natural reaction towards the prospect of CPRA becoming law is fear. However, this is unfounded. In reality, if the CPRA passes, it might not be as scary as some businesses think.

CPRA and employment data

The CPRA is actually a lot more lenient than the GDPR in how it polices the relationship between employers and employees’ data. Unlike its EU equivalent, the proposed Californian law already contains many written-in exceptions acknowledging that worker-employer relations are not like consumer-vendor relations.

Moreover, the CPRA extends the CCPA exemption for employers, set to end on January 1, 2021. This means that if the CPRA passes into law, employers would be released from both their existing and potential new employee data protection obligations for two more years, until January 1, 2023. This exemption would apply to most provisions under the CPRA, including the personal information collected from individuals acting as job applicants, staff members, employees, contractors, officers, directors, and owners.

However, employers would still need to provide notice of data collection and maintain safeguards for personal information. It’s highly likely that during this two-year window, additional reforms would be passed that might further ease employer-employee data privacy requirements.

Nonetheless, employers should act now

While the CPRA won’t change much overnight, impacted organizations shouldn’t wait to take action, but should take this time to consider what employee data they collect, why they do so, and how they store this information.

This is especially pertinent now that businesses are collecting more data than ever on their employees. With companies like the workplace monitoring company Prodoscore reporting that interest from prospective customers rose by 600% since the pandemic began, we are seeing rapid growth in companies looking to monitor how, where, and when their employees work.

This trend emphasizes the fact that the information flow between companies and their employees is mostly one-sided (i.e., from the worker to the employer). Currently, businesses have no legal requirement to be transparent about this information exchange. That will change for California-based companies if the CPRA comes into effect and they will have no choice but to disclose the type of data they’re collecting about their staff.

The only sustainable solution for impacted businesses is to be transparent about their data collection with employees and work towards creating a “culture of privacy” within their organization.

Creating a culture of privacy

Rather than viewing employee data privacy as some perfunctory obligation where the bare minimum is done for the sake of appeasing regulators, companies need to start thinking about worker privacy as a benefit. Presented as part of a benefits package, comprehensive privacy protection is a perk that companies can offer prospective and existing employees.

Privacy benefits can include access to privacy protection services that extend beyond the workplace. Packaged alongside privacy awareness training and education, these can form a “privacy-plus” benefits package offered to employees alongside standard perks like health or retirement plans. Doing so builds a culture of privacy that can help companies ensure regulatory compliance, while also making it easier to attract qualified talent and retain workers.

It’s also worth bearing in mind that creating a culture of privacy doesn’t necessarily mean that companies have to stop monitoring employee activity. In fact, employees are less worried about being watched than about the possibility of their employers misusing their data. Their fears are well-founded: although over 60% of businesses today use workforce data, only 3 in 10 business leaders are confident that this data is treated responsibly.

For this reason, companies that want to keep employee trust and avoid bad PR need to prioritize transparency. This could mean drawing up a “bill of rights” that lets employees know what data is being collected and how it will be used.

Research into employee satisfaction backs up the value of transparency. Studies show that while only 30% of workers are comfortable with their employer monitoring their email, the number of employees open to the use of workforce data goes up to 50% when the employer explains the reasons for doing so. This number further jumps to 92% if employees believe that data collection will improve their performance or well-being or come with other personal benefits, like fairer pay.

On the other hand, most employees would leave an organization if its leaders did not use workplace data responsibly. Moreover, 55% of candidates would not even apply for a job with such an organization in the first place.

Final thoughts

With many exceptions for workplace data management already built-in and more likely to come down the line, most employers should be able to easily navigate the stipulations CPRA entails.

That being said, if the CPRA becomes law this November, employers shouldn’t squander the two-year window they have to prepare for new compliance requirements. Rather than seeing this time as breathing space before a regulatory crackdown, organizations should use it to be proactive in how they manage their employees’ data. Beyond simply complying with the law, businesses should look at how they can turn employee privacy into an asset.

As data privacy stays at the forefront of employees’ minds, businesses that can show they have a genuine privacy culture will be able to gain an edge when it comes to attracting and retaining talent and, ultimately, coming out on top.

How do I select a data storage solution for my business?

We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices. Unfortunately, 58% of organizations make decisions based on outdated data.

While enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads.

To select a suitable data storage solution for your business, you need to consider a variety of factors. We’ve talked to several industry leaders to get their insights on the topic.

Phil Bullinger, SVP and General Manager, Data Center Business Unit, Western Digital

Selecting the right data storage solution for your enterprise requires evaluating and balancing many factors. The most important is aligning the performance and capabilities of the storage system with your critical workloads and their specific bandwidth, application latency and data availability requirements. For example, if your business wants to gain greater insight and value from data through AI, your storage system should be designed to support the accelerated performance and scale requirements of analytics workloads.

Storage systems that maximize the performance potential of solid state drives (SSDs) and the efficiency and scalability of hard disk drives (HDDs) provide the flexibility and configurability to meet a wide range of application workloads.

Your applications should also drive the essential architecture of your storage system, whether directly connected or networked, whether required to store and deliver data as blocks, files, objects or all three, and whether the storage system must efficiently support a wide range of workloads while prioritizing the performance of the most demanding applications.

Consideration should be given to your overall IT data management architecture to support the scalability, data protection, and business continuity assurance required for your enterprise, spanning from core data centers to those distributed at or near the edge and endpoints of your enterprise operations, and integration with your cloud-resident applications, compute and data storage services and resources.

Ben Gitenstein, VP of Product Management, Qumulo

When searching for the right data storage solution to support your organizational needs today and in the future, it’s important to select a solution that is trusted, scalable to secure demanding workloads of any size, and ensures optimal performance of applications and workloads both on premises and in complex, multi-cloud environments.

With the recent pandemic, organizations are digitally transforming faster than ever before and leveraging the cloud to conduct business. This makes it more important than ever that your storage solution has built-in tools for data management across this ecosystem.

When evaluating storage options, be sure to do your homework and ask the right questions. Is it a trusted provider? Would it integrate well within my existing technology infrastructure? Your storage solution should be easy to manage and meet the scale, performance and cloud requirements for any data environment and across multi-cloud environments.

Also, be sure the storage solution gives IT control in how they manage storage capacity needs and delivers real-time insight into analytics and usage patterns so they can make smart storage allocation decisions and maximize an organizations’ storage budget.

David Huskisson, Senior Solutions Manager, Pure Storage

Data backup and disaster recovery features are critically important when selecting a storage solution for your business, as no organization is now immune to ransomware attacks. When systems go down, they need to be recovered as quickly and safely as possible.

Look for solutions that offer simplicity in management, can ensure backups are viable even when admin credentials are compromised, and can be restored quickly enough to greatly reduce major organizational or financial impact.

Storage solutions that are purpose-built to handle unstructured data are a strong place to start. By definition, unstructured data is unpredictable: it can take any form, size or shape, and can be accessed in any pattern. Purpose-built platforms can accelerate small, large, random or sequential data, and consolidate a wide range of workloads on a unified fast file and object storage platform. Such a platform should maintain its performance even as the amount of data grows.

If you have an existing backup product, you don’t need to rip and replace it. There are storage platforms with robust integrations that work seamlessly with existing solutions and offer a wide range of data-protection architectures so you can ensure business continuity amid changes.

Tunio Zafer, CEO, pCloud

Bear in mind: your security team needs to assist. Answer these questions to find the right solution: Do you need ‘cold’ storage or cloud storage? If you’re looking only to store files for backup, you need a cloud backup service. If you’re looking to store, edit and share, go for cloud storage. Where are the provider’s storage servers located? If your business is located in Europe, the safest choice is a storage service based in Europe.

Best case scenario – your company is going to grow. Look for a storage service that offers scalability. What is their data privacy policy? Research whether someone can legally access your data without your knowledge or consent. Switzerland has one of the strictest data privacy laws globally, so choosing a Swiss-based service is a safe bet. How is your data secured? Look for a service that offers robust encryption in-transit and at-rest.

Client-side encryption means that your data is secured on your device and is transferred already encrypted. What is their support package? At some point, you’re going to need help. Prefer a data storage service whose support package is included for free and answers within 24 hours.
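The client-side pattern Zafer describes can be sketched in a few lines: the file is encrypted locally with a key that never leaves the device, so the storage provider only ever sees ciphertext. The toy XOR keystream below stands in for a real cipher purely for illustration; a production system would use a vetted AEAD cipher from an established cryptography library.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from key + nonce (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt_before_upload(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt on the client; only the nonce + ciphertext are uploaded."""
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_after_download(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))

key = secrets.token_bytes(32)                    # stays on the device
blob = encrypt_before_upload(key, b"board minutes, confidential")
assert decrypt_after_download(key, blob) == b"board minutes, confidential"
```

The point of the pattern is visible in the last two lines: decryption requires the locally held key, so a provider (or an attacker who breaches it) holding only `blob` learns nothing about the file contents.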

Cybersecurity practices are becoming more formal, security teams are expanding

Organizations are building confidence that their cybersecurity practices are headed in the right direction, aided by advanced technologies, more detailed processes, comprehensive education and specialized skills, research from CompTIA finds.


Eight in 10 organizations surveyed said their cybersecurity practices are improving.

At the same time, many companies acknowledge that there is still more to do to make their security posture even more robust. Growing concerns about the number, scale and variety of cyberattacks, privacy considerations, a greater reliance on data and regulatory compliance are among the issues that have the attention of business and IT leaders.

Elevating cybersecurity

Two factors – one anticipated, the other unexpected – have contributed to the heightened awareness about the need for strong cybersecurity measures.

“The COVID-19 pandemic has been the primary trigger for revisiting security,” said Seth Robinson, senior director for technology analysis at CompTIA. “The massive shift to remote work exposed vulnerabilities in workforce knowledge and connectivity, while phishing emails preyed on new health concerns.”

Robinson noted that the pandemic accelerated changes that were underway in many organizations that were undergoing the digital transformation of their business operations.

“This transformation elevated cybersecurity from an element within IT operations to an overarching business concern that demands executive-level attention,” he said. “It has become a critical business function, on par with a company’s financial procedures.”

As a result, companies have a better understanding of what to do about cybersecurity. Nine in 10 organizations said their cybersecurity processes have become more formal and more critical.

Two examples are risk management, where companies assess their data and their systems to determine the level of security that each requires; and monitoring and measurement, where security efforts are continually tracked and new metrics are established to tie security activity to business objectives.
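The risk management step described above, assessing each system to determine the level of security it requires, can be illustrated with a simple tiering rule. The 1-3 rating scales and thresholds below are assumptions invented for illustration, not part of the CompTIA report:

```python
def security_tier(data_sensitivity: int, business_impact: int) -> str:
    """Map a system to a security tier from two assumed 1-3 ratings."""
    score = data_sensitivity * business_impact
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# A customer database rates high on both axes; a throwaway test server on neither.
assert security_tier(3, 3) == "high"
assert security_tier(3, 1) == "medium"
assert security_tier(1, 1) == "low"
```

Even a rule this crude forces the conversation the report describes: every system gets an explicit security level, rather than all systems implicitly receiving the same treatment.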

IT teams’ foundational skills

The report also highlights how the “cybersecurity chain” has expanded to include upper management, boards of directors, business units and outside firms in addition to IT personnel in conversations and decisions.

Within IT teams, foundational skills such as network and endpoint security have been paired with new skills, including identity management and application security, that have become more important as cloud and mobility have taken hold.

On the horizon, expect to see skills related to security monitoring and other proactive tactics gain a bigger foothold. Examples include data analysis, threat knowledge and understanding the regulatory landscape.

Cybersecurity insurance is another emerging area. The report reveals that 45% of large companies, 41% of mid-sized firms and 37% of small businesses currently have a cyber insurance policy.

Common coverage areas include the cost of restoring data (56% of policy holders), the cost of finding the root cause of a breach (47%), coverage for third-party incidents (43%) and response to ransomware (42%).

Companies that facilitate ransomware payments risk violating US sanctions

Companies that ransomware-hit US organizations hire to facilitate ransom payments risk breaking US sanctions, falling afoul of the US Department of the Treasury’s Office of Foreign Assets Control (OFAC) regulations, and may end up paying millions in fines.


These include financial institutions, cyber insurance firms, and companies involved in digital forensics and incident response.

What is the OFAC?

The Office of Foreign Assets Control of the US Department of the Treasury administers and enforces economic and trade sanctions based on US foreign policy and national security goals.

Sanctions can be enforced against foreign countries/regimes, organized groups and individuals that “threaten the national security, foreign policy or economy of the United States”. Ransomware-wielding gangs fall into that category.

In a security advisory published on Thursday, the OFAC mentioned the developer of Cryptolocker, Iranian supporters of SamSam ransomware-wielding gangs, the Lazarus Group (a cybercriminal organization sponsored by North Korea that used the WannaCry ransomware) and Evil Corp, a Russia-based cybercriminal organization that wields the Dridex malware, as malicious cyber actors under its cyber-related sanctions program.

The advisory’s salient points

“Ransomware payments made to sanctioned persons or to comprehensively sanctioned jurisdictions could be used to fund activities adverse to the national security and foreign policy objectives of the United States. Ransomware payments may also embolden cyber actors to engage in future attacks. In addition, paying a ransom to cyber actors does not guarantee that the victim will regain access to its stolen data,” the OFAC explained.

“OFAC encourages victims and those involved with addressing ransomware attacks to contact OFAC immediately if they believe a request for a ransomware payment may involve a sanctions nexus. Victims should also contact the US Department of the Treasury’s Office of Cybersecurity and Critical Infrastructure Protection if an attack involves a US financial institution or may cause significant disruption to a firm’s ability to perform critical financial services.”

OFAC might issue a special license allowing them to perform the transaction (the paying of the ransom), but each application “will be reviewed by OFAC on a case-by-case basis with a presumption of denial.”

It also won’t matter whether the ransomware gangs involved are from countries under US sanctions or are under sanctions themselves: liability applies either way.

“OFAC may impose civil penalties for sanctions violations based on strict liability, meaning that a person subject to US jurisdiction may be held civilly liable even if it did not know or have reason to know it was engaging in a transaction with a person that is prohibited under sanctions laws and regulations administered by OFAC,” the advisory pointed out.

To pay or not to pay?

It would be best, of course, if a ransomware-hit organization didn’t have to pay the ransom in order to quickly recover its IT capabilities and return to normal functioning, but sometimes paying up is the only option if it wants to stay afloat and/or keep providing vital services.

In and of itself, paying a ransom is not against the law, but if the payment is made to an entity or individual under US sanctions, the action is technically illegal.

But, according to Dissent Doe, FBI and Secret Service officials who attended a panel at the Privacy + Security Forum in Washington, D.C., a year ago confirmed that the US government has never prosecuted any victim for paying a ransom.

The same panel, which also gathered private sector lawyers and a representative of a consulting firm, also unanimously confirmed that in an overwhelming majority of cases, victims end up getting the decryption key and their data back after paying up.

“So although the public isn’t told this clearly because the government wants to discourage it, I will repeat what I have been saying for quite a while: for some entities, paying ransom will just be a business decision based on how much money they will lose if they cannot function due to the ransomware attack,” Doe noted.

A (potential) fine levied by the US government then becomes just a factor in that equation.

Financial risk and regulatory compliance pros struggling with collaboration

After several months of working from home, with no clear end in sight, financial risk and regulatory compliance professionals are struggling when it comes to collaborating with their teams – particularly as they manage increasingly complex global risk and regulatory reporting requirements.


According to a survey of major financial institutions conducted by AxiomSL, 41% of respondents said collaborating with teams remains a challenge while working remotely.

“During the pandemic, financial firms quickly adapted to major changes, although not without some operational and technology weaknesses emerging,” said Alex Tsigutkin, CEO of AxiomSL.

“Indeed, businesses might never return to the ‘old normal’, and that has made building data- and technology-driven resilience much more pressing than before the crisis. Our clients have been experiencing heightened regulatory pressures,” he continued.

“Throughout the crisis, we enabled them to respond rapidly to changes in reporting criteria, the onset of daily liquidity reporting, and the Federal Reserve’s emerging risk data collection (ERDC) initiative – that required FR Y-14 data on a weekly/monthly basis instead of quarterly.”

These data-intensive, high-frequency regulatory reporting requirements will continue in the ‘new normal.’ “To future-proof, organizations should continue to establish sustainable data architectures and analytics that enable connection and transparency between critical datasets,” Tsigutkin commented.

“And, as a priority, they should transition to our secure RegCloud to handle regulatory intensity efficiently, bolster business continuity, and strengthen their ability to collaborate remotely,” he concluded.

Key research findings

Remote collaboration is a top operational challenge for financial risk and regulatory pros: For all the talk of work-from-anywhere policies becoming the future of financial services, 41% of the risk and compliance professionals surveyed said collaborating with colleagues while working remotely has been their biggest challenge during the COVID-19 crisis.

This was the most frequently cited challenge, followed by accessing data from dispersed systems (18%), reliance on offshore resources (15%), and reliance on locally installed technology (15%).

Liquidity reporting expected to get harder: New capital and liquidity stress testing requirements are expected to present a much heavier burden on financial firms, with 18% of respondents citing increased capital and liquidity risk reporting as a major challenge they will face over the next two years.

Cloud adoption gets its catalyst: After years of resisting cloud adoption, many North American financial institutions are finally gearing up to make the move. When it comes to regulatory technology spending over the next two years, enhanced data analytics is the top area of focus among 29% of survey respondents. But cloud deployment rose to second place (23%) followed by data lakes (22%) and artificial intelligence and machine learning (20%).

Reduction of manual processes is an operational focus for the next two years: The top risk and regulatory compliance challenge firms see on the road ahead is continuing to eliminate manual processes (29%), followed by improving the transparency of data and processes (21%), and fully transitioning to a secure cloud (13%).

RegTech budgets largely intact heading into 2021: A total of 83% indicated that their near-term projects are virtually unimpacted or mostly going forward. Similarly, 81% said their budgets for 2021 remain intact (70%) or will increase (11%).

GRC teams have a number of challenges meeting regulatory demands

Senior risk and compliance professionals within financial services companies lack confidence in the security data they are providing to regulators, according to Panaseer.


Results from a global external survey of over 200 GRC leaders reveal concerns about data accuracy, request overload, resource-heavy processes and a lack of end-to-end automation.

The results indicate a wider issue with cyber risk management. If GRC leaders don’t have confidence in the accuracy and timeliness of security data provided to regulators, then the same holds true for the confidence in their own ability to understand and combat cyber risks.

Only 41% of risk leaders feel ‘very confident’ that they can fulfill the security-related requests of a regulator in a timely manner, and just 27.5% are ‘very satisfied’ that their organization’s security reports align with regulatory compliance needs.

GRC leaders cited their top challenges in fulfilling regulator requests as:

  • Getting access to accurate data (35%)
  • The number of report requests (29%)
  • The length of time it takes to get information from the security team (26%)

The limitations of traditional GRC tools

The issue has been perpetuated by the limitations of traditional GRC tools, which rely on qualitative questionnaires to provide evidence of compliance. This approach does not reflect the current challenges posed by cyber risk.

92% of senior risk and compliance professionals believe it would be valuable to have quantitative security controls assurance reporting (vs qualitative) and 93.5% believe it’s important to automate security risk and compliance reporting. However, only 11% state that their risk and compliance reporting is currently automated end to end.

96% said it is important to prioritize security risk remediation based on its impact on the business, but most can’t isolate risk to critical business processes composed of people, applications and devices. Only 33.5% of respondents are ‘very confident’ in their ability to understand all their asset inventories.
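The quantitative, automated controls reporting that respondents say they want can be sketched as a small computation over an asset inventory. The asset fields and control names here are hypothetical, invented purely for illustration:

```python
# Hypothetical asset inventory; field names are illustrative assumptions.
assets = [
    {"id": "srv-01", "critical": True,  "endpoint_agent": True,  "patched": True},
    {"id": "srv-02", "critical": True,  "endpoint_agent": False, "patched": True},
    {"id": "lap-17", "critical": False, "endpoint_agent": True,  "patched": False},
    {"id": "lap-23", "critical": False, "endpoint_agent": True,  "patched": True},
]

def coverage(inventory: list, control: str) -> float:
    """Percentage of assets on which a given control is in place."""
    return round(100 * sum(a[control] for a in inventory) / len(inventory), 1)

# Report overall coverage, plus coverage restricted to critical assets,
# so remediation can be prioritized by business impact.
critical = [a for a in assets if a["critical"]]
report = {
    "endpoint_agent": coverage(assets, "endpoint_agent"),
    "endpoint_agent_critical": coverage(critical, "endpoint_agent"),
    "patched": coverage(assets, "patched"),
}
assert report == {"endpoint_agent": 75.0,
                  "endpoint_agent_critical": 50.0,
                  "patched": 75.0}
```

Because the numbers are computed from the full inventory rather than sampled questionnaire answers, the same run produces both the regulator-facing figures and the internal, impact-weighted view.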


Charaka Goonatilake, CTO, Panaseer: “Faced with increasing requests from regulators, GRC leaders have resorted to throwing a lot of people at time-sensitive requests. These manual processes, combined with a lack of GRC tool scalability, necessitate data sampling, which means they cannot have complete visibility or full confidence in the data they are providing.

“The challenge is being exacerbated by new risks introduced by IoT sensors and endpoints, which rarely consider security a core requirement and therefore introduce greater risk and increase the importance of controls and mitigations to address them.”

Andreas Wuchner, Panaseer Advisory Board member: “Facing the new reality of cyberthreats and regulatory pressures requires many organizations to fundamentally rethink traditional tools and defences.

“GRC leaders can enhance their confidence to accurately and quickly meet stakeholder needs by implementing Continuous Controls Monitoring, an emerging category of security and risk, which has just been recognised in the 2020 Gartner Risk Management Hype Cycle.”

Internet Impact Assessment Toolkit: Protect the core that underpins the Internet

The Internet Society has launched the first-ever regulatory assessment toolkit that defines the critical properties needed to protect and enhance the future of the Internet.


The Internet Impact Assessment Toolkit is a guide to help ensure regulation, technology trends and decisions don’t harm the infrastructure of the Internet. It describes the Internet at its optimal state – a network of networks that is universally accessible, decentralized and open; facilitating the free and efficient flow of knowledge, ideas and information.

Critical properties of the Internet Impact Assessment Toolkit

The five critical properties identified by the Internet Way of Networking (IWN) framework are:

  • An accessible infrastructure with a common protocol – A ‘common language’ enabling global connectivity and unrestricted access to the Internet.
  • An open architecture of interoperable and reusable building blocks – Open infrastructure with a set of standards enabling permission-free innovation.
  • Decentralized management and a single distributed routing system – Distributed routing enabling local networks to grow, while maintaining worldwide connectivity.
  • Common global identifiers – A single common identifier allowing computers and devices around the world to communicate with each other.
  • A technology neutral, general-purpose network – A simple and adaptable dynamic environment cultivating infinite opportunities for innovation.

When combined, these properties form the unique foundation that underpins the Internet’s success and are essential for its healthy evolution. The closer the Internet aligns with the IWN, the more open and agile it is for future innovation and the broader benefits of collaboration, resiliency, global reach and economic growth.

“The Internet’s ability to support the world through a global pandemic is an example of the Internet Way of Networking at its finest,” explains Joseph Lorenzo Hall, Senior VP for a Strong Internet, Internet Society. “Governments didn’t need to do anything to facilitate this massive global pivot in how humanity works, learns and socializes. The Internet just works – and it works thanks to the principles that underpin its success.”

A resource for policymakers and technologists

The Internet Impact Assessment Toolkit will serve as an important resource to help policymakers and technologists ensure trends in regulatory and technical proposals don’t harm the unique architecture of the Internet. The toolkit explains why each property of the IWN is crucial to the Internet and the social and economic consequences that can arise when any of these properties are damaged.

For instance, the Toolkit shows how China’s restrictive networking model severely impacts its global reach and hinders collaboration with networks beyond its borders. It also highlights how the US administration’s Clean Network proposal challenges the Internet’s architecture by dictating how networks interconnect according to political considerations rather than technical considerations.

“We’re seeing a trend of governments encroaching on parts of the Internet’s infrastructure to try and solve social and political problems through technical means. Ill-informed regulation can drastically alter the Internet’s fundamental architecture and harm the ecosystem that supports it,” continues Hall. “We’re giving both policymakers and Internet users the information and tools to make sure they don’t break this resource that brings connectivity, innovation, and empowerment to everyone.”

Most compliance requirements are completely absurd

Compliance is probably one of the dullest topics in cybersecurity. Let’s be honest, there’s nothing to get excited about because most people view it as a tick-box exercise. It doesn’t matter which compliance regulation you talk about – they all get a collective groan from companies whenever you start talking about it.


The thing is, compliance requirements are often poorly written, vague and confusing. In my opinion, the confusion around compliance stems from the writing, so it’s no surprise companies are struggling, especially when they have to comply with multiple requirements simultaneously.

Poor writing is smothering compliance regulations

Take ISO 27001 as an example. Its goal is to improve a business’ information security management, and its process has six parts, which include commands like “conduct a risk assessment”, “define a security policy” and “manage identified risks”. The requirements for each of these commands are extremely vague and needlessly subjective.

The Sarbanes-Oxley Act (SOX), which covers publicly traded companies in the United States, is no better. Section 404 vaguely says that all publicly traded organizations have to demonstrate “due diligence” in the disclosure of financial information, but it does not explain what “due diligence” means.

The Gramm-Leach-Bliley Act (GLBA) requires US financial institutions to explain information-sharing practices to their customers. It says financial organizations have to “develop a written information security plan”, but then doesn’t offer any advice on how to achieve that.

Even Lexcel (an accreditation indicating quality in relation to legal practice management standards) in the United Kingdom, which is written by lawyers for lawyers, is not clear: “Practices must have an information management policy with procedures for the protection and security of the information assets.”

For a profession that prides itself on being able to maintain absolute clarity, I’m surprised Lexcel allows this type of subjectivity in its compliance requirements.

It’s not easy to write for such a wide audience

Look, I understand. It’s a pretty tricky job to write compliance requirements: they need to be applicable to all organizations within a particular field, each of which differs in how it conducts business and how its technological infrastructure is set up.

Furthermore, writers are working against the clock with compliance requirements. IT regulations are changing at such a quick pace that the requirements they write today might be out of date tomorrow.

However, I think those who write requirements should take the Payment Card Industry Data Security Standard (PCI DSS) as an example. The PCI DSS applies to all organizations that store cardholder data and the requirements are clear, regularly updated, and you can find everything you need in one place.

The way PCI DSS compliance is structured (in terms of requirement, testing procedures and guidance) is a lot clearer than anything else I’ve seen. It contains very little room for subjectivity, and you know exactly where you stand with it.

The GDPR is also pretty well written and detailed. The many articles referring to data protection are specific, understandable and implementable.

For example, when it comes to data access, this sentence is perfectly clear: “Unauthorized access also includes accidental or unlawful destruction, loss, alteration, or unauthorized disclosure of personal data transmitted, stored or otherwise processed” (Articles 4, 5, 23 and 32).

It’s also very clear when it comes to auditing processes: “You need to maintain a record of data processing activities, including information on ‘recipients to whom the personal data have been or will be disclosed’, i.e., who has access to the data” (Articles 5, 28, 30, 39, 47).
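That record of processing activities lends itself naturally to a structured representation. The sketch below models a minimal record in Python; the fields are an illustrative subset of what Article 30 asks for, not a complete legal template.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ProcessingRecord:
    """Minimal sketch of one entry in a record of processing activities."""
    activity: str
    purpose: str
    data_categories: List[str]
    recipients: List[str]   # "to whom the personal data have been or will be disclosed"
    retention: str

payroll = ProcessingRecord(
    activity="Payroll",
    purpose="Salary payment and tax reporting",
    data_categories=["name", "bank account", "salary"],
    recipients=["external payroll processor", "tax authority"],
    retention="7 years after end of employment",
)

# Answering "who has access to the data" is then a simple lookup.
assert "tax authority" in payroll.recipients
```

Keeping the record in a machine-readable form like this also makes the auditing requirement easy to satisfy: the list of recipients for any activity can be queried on demand instead of reconstructed by hand.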

So, while you’re faced with many compliance requirements, you need a good strategy in place. It can get complex when you’re trying to comply with multiple mandates, so if I can give you one tip, it is to find the commonalities between all of them before coming up with a solution.

You need to do the basics right

In my opinion, the confusing nature of compliance only spawns the relentless bombardment of marketing material from vendors on “how you can be compliant with X” or the “top five things you need to know about Y”.

You have to understand that at the core of any compliance mandate is the desire to keep protected data secure, only allowing access to those who need it for business reasons. This is why all you need to do with compliance is to start with the basics: data storage, file auditing and access management. Get those right, and you’re on your way to demonstrating your willingness to comply.

The state of GDPR compliance in the mobile app space

Among the rights bestowed upon EU citizens by the General Data Protection Regulation (GDPR) is the right to access their personal data stored by companies (i.e., data controllers) and information about how this personal data is being processed. A group of academics from three German universities has decided to investigate whether and how mobile app vendors respond to subject access requests, and the results of their four-year undercover field study are dispiriting.

The results of the study

“In three iterations between 2015 and 2019, we sent subject access requests to vendors of 225 mobile apps popular in Germany. Throughout the iterations, 19% to 26% of the vendors were unreachable or did not reply at all. Our subject access requests were fulfilled in 15% to 53% of the cases, with an unexpected decline between the GDPR enforcement date and the end of our study,” they shared.

“The remaining responses exhibit a long list of shortcomings, including severe violations of information security and data protection principles. Some responses even contained deceptive and misleading statements (7% to 13%). Further, 9% of the apps were discontinued and 27% of the user accounts vanished during our study, mostly without proper notification about the consequences for our personal data.”


The researchers – Jacob Leon Kröger from TU Berlin (Weizenbaum Institute), Jens Lindemann from the University of Hamburg, and Prof. Dr. Dominik Herrmann from the University of Bamberg – made sure to test a representative sample of iOS and Android apps: popular and less popular, from a variety of app categories, and from vendors based in Germany, the EU, and outside of the EU.

They disguised themselves as an ordinary German user, created accounts needed for the apps to work, interacted with each app for about ten minutes, and asked app providers for information about their stored personal data (before and after GDPR enforcement).

They also used a different request text for each round of inquiries. The first one was more informal, while the last two were more elaborate and included references to relevant data protection laws and a warning that the responsible data protection authorities would be notified in the case of no response.

“While we cannot precisely determine their individual influence, it can be assumed that both the introduction of the GDPR as well as the more formal and threatening tone of our inquiry in [the latter two inquiries] had an impact on the vendors’ behavior,” they noted.

Solving the problem

Smartphones are ubiquitous and most users use a variety of mobile apps, which usually collect personal user data and share it with third parties.

In theory, the GDPR should force mobile app vendors to provide users with information about this data and how it’s used. In practice, though, many app vendors are evidently betting that users won’t care enough to push back when they don’t receive a satisfactory reply, and that GDPR regulators won’t have the resources to enforce the regulation.

“We (…) suspected that some vendors merely pretended to be poorly reachable when they received subject access requests – while others actually had insufficient resources to process incoming emails,” the researchers noted.

“To confirm this hypothesis, we tested how the vendors that failed to respond to our requests reacted to non-privacy related inquiries. Using another (different) fake identity, we emailed the vendors who had not replied [to the first inquiry] and [to the third inquiry], expressing interest in promoting their apps on a personal blog or YouTube channel. Out of the group of initial non-responders, 31 % [first inquiry] and 22 % [third inquiry] replied to these dummy requests, many of them within a few hours, proving that their email inbox was in fact being monitored.”

The researchers believe the situation for users can be improved by authorities doing random compliance checks and offering better support for data controllers through industry-specific guidelines and best practices.

“In particular, there should be mandatory standard interfaces for providing data exports and other privacy-related information to data subjects, obviating the need for the manual processing of GDPR requests,” they concluded.
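As an illustration of what such a standard interface might look like, here is a minimal sketch of a machine-readable subject access export. The field names, and the record itself, are hypothetical examples for illustration, not part of any actual standard.

```python
# Minimal sketch of a standardized subject-access export of the kind the
# researchers propose. All field names are hypothetical, not a real standard.
import json
from datetime import date

def build_subject_access_export(user_record: dict) -> str:
    """Assemble a machine-readable GDPR Art. 15-style response for one data subject."""
    export = {
        "generated_on": date.today().isoformat(),
        "controller": user_record.get("controller", "unknown"),
        "personal_data": user_record.get("personal_data", {}),
        "processing_purposes": user_record.get("purposes", []),
        "recipients": user_record.get("shared_with", []),
        "retention_period": user_record.get("retention", "unspecified"),
    }
    return json.dumps(export, indent=2)

# Hypothetical vendor record for demonstration
record = {
    "controller": "ExampleApp GmbH",
    "personal_data": {"email": "user@example.org", "device_id": "abc123"},
    "purposes": ["account management", "analytics"],
    "shared_with": ["analytics-provider.example"],
    "retention": "24 months after account deletion",
}
print(build_subject_access_export(record))
```

A mandatory interface along these lines would let data subjects (and auditors) fetch a complete answer programmatically instead of exchanging emails with support staff.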

How AI can alleviate data lifecycle risks and challenges

The volume of business data worldwide is growing at an astounding pace, with some estimates showing the figure doubling every year. Over time, every company generates and accumulates a massive trove of data, files and content – some inconsequential and some highly sensitive and confidential in nature.

Throughout the data lifecycle there are a variety of risks and considerations to manage. The more data you create, the more you must find a way to track, store and protect against theft, leaks, noncompliance and more.

Faced with massive data growth, most organizations can no longer rely on manual processes for managing these risks. Many have instead adopted a vast web of tracking, endpoint detection, encryption, access control and data policy tools to maintain security, privacy and compliance. But deploying and managing so many disparate solutions creates a tremendous amount of complexity and friction for IT and security teams as well as end users, and this piecemeal approach falls short of the level of integration and intelligence needed to manage enterprise files and content at scale.

Let’s explore several of the most common data lifecycle challenges and risks businesses are facing today and how to overcome them:

Maintaining security – As companies continue to build up an ocean of sensitive files and content, the risk of data breaches grows exponentially. Smart data governance means applying security across the points at which the risk is greatest. In just about every case, this includes both ensuring the integrity of company data and content, as well as any user with access to it. Every layer of enterprise file sharing, collaboration and storage must be protected by controls such as automated user behavior monitoring to deter insider threats and compromised accounts, multi-factor authentication, secure storage in certified data centers, and end-to-end encryption, as well as signature-based and zero-day malware detection.
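One of the controls above, automated user behavior monitoring, can be sketched in its simplest form as a baseline-and-threshold check. The event format, user names, and threshold below are illustrative assumptions; production systems use far richer behavioral models.

```python
# Toy sketch of automated user-behavior monitoring: flag accounts whose
# activity today jumps far above their own historical baseline.
from statistics import mean, pstdev

def flag_anomalies(history: dict, today: dict, sigma: float = 3.0) -> list:
    """Return users whose activity today exceeds mean + sigma * stddev of their history."""
    flagged = []
    for user, counts in history.items():
        mu, sd = mean(counts), pstdev(counts)
        threshold = mu + sigma * max(sd, 1.0)  # floor the stddev to avoid zero-variance noise
        if today.get(user, 0) > threshold:
            flagged.append(user)
    return flagged

# Hypothetical daily file-download counts per user
history = {"alice": [10, 12, 11, 9], "bob": [5, 6, 5, 7]}
today = {"alice": 11, "bob": 250}  # bob suddenly downloads 250 files
print(flag_anomalies(history, today))  # → ['bob']
```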

Classification and compliance – Gone are the days when organizations could require users to label, categorize or tag company files and content, or task IT to manage and manually enforce data policies. Not only is manual data classification and management impractical, it’s far too risky. You might house millions of files that are accessible by thousands of users – there’s simply too much, spread out too broadly. Moreover, regulations like GDPR, CCPA and HIPAA add further complexity to the mix, with intricate (and sometimes conflicting) requirements. The definition of PII (personally identifiable information) under GDPR alone encompasses potentially hundreds of pieces of information, and one mistake could result in hefty financial penalties.

Incorrect categorization can lead to a variety of issues, including data theft and regulatory penalties. Fortunately, machines can do in seconds, often with better accuracy, what might take a human years. AI and ML technologies are helping companies quickly scan files across data repositories to identify sensitive information such as credit card numbers, addresses, dates of birth, social security numbers, and health-related data, and apply automatic classifications. They can also track files across popular data sources such as OneDrive, Windows File Server, SharePoint, Amazon S3, Google Cloud, G Suite, Box, Microsoft Azure Blob, and generic CIFS/SMB repositories to better visualize and control your data.
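As a toy illustration of automatic classification, the sketch below flags text containing common PII patterns using regular expressions. Real AI/ML classifiers go well beyond pattern matching; the patterns and category names here are simplified assumptions.

```python
# Illustrative regex pass for common PII patterns (payment card numbers,
# US social security numbers). A stand-in for the ML classifiers described
# above, which handle far more categories and contexts.
import re

PII_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # 13-16 digits, optional separators
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify(text: str) -> set:
    """Return the set of PII categories detected in a piece of text."""
    return {label for label, pattern in PII_PATTERNS.items() if pattern.search(text)}

doc = "Card: 4111 1111 1111 1111, SSN: 123-45-6789"
print(classify(doc))  # both categories match
```

A governance platform would run a pass like this over every repository it tracks and attach the resulting labels as file metadata, which downstream retention and access policies can then key on.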

Retention – As data storage costs have plummeted over the past 10 years, many organizations have fallen into the trap of simply “keeping everything” because it’s (deceptively) cheap to do so. This approach carries many security and regulatory risks, as well as potential costs. Our research shows that exposure of just a single terabyte of data could cost you $129,324; now think about how many terabytes of data your organization stores today. The longer you retain sensitive files, the greater the opportunity for them to be compromised or stolen.

Certain types of data must be stored for a specific period of time in order to adhere to various customer contracts and regulatory criteria. For example, HIPAA regulations require organizations to retain documentation for six years from the date of its creation. GDPR is less specific, stating that data shall be kept for no longer than is necessary for the purposes for which it is being processed.

Keeping data any longer than absolutely necessary is not only risky; those “affordable” costs can also add up quickly. AI-enabled governance can track these set retention periods and minimize risk by automatically securing or eliminating old or redundant files that are no longer required (or allowed). With streamlined data retention processes, you can decrease storage costs, reduce security and noncompliance exposure, and optimize data processing performance.
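A minimal sketch of such retention-driven cleanup might look like the following. The policy table and file metadata are illustrative assumptions (the HIPAA entry approximates the six-year documentation rule in days).

```python
# Sketch of retention-driven cleanup: identify files whose retention window
# has lapsed so they can be archived or deleted. Policy table and inventory
# are hypothetical examples.
from datetime import date, timedelta

RETENTION_DAYS = {"hipaa": 6 * 365, "marketing": 2 * 365}  # simplified policy table

def files_to_purge(files: list, today: date) -> list:
    """Return names of files older than their category's retention period."""
    purge = []
    for name, category, created in files:
        limit = RETENTION_DAYS.get(category)
        if limit is not None and (today - created) > timedelta(days=limit):
            purge.append(name)
    return purge

inventory = [
    ("claim_2012.pdf", "hipaa", date(2012, 3, 1)),      # ~8 years old: past 6-year window
    ("claim_2019.pdf", "hipaa", date(2019, 5, 1)),      # within the window: keep
    ("promo_2015.png", "marketing", date(2015, 7, 1)),  # past 2-year window
]
print(files_to_purge(inventory, today=date(2020, 10, 1)))  # → ['claim_2012.pdf', 'promo_2015.png']
```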

Ongoing monitoring and management – Strong governance gets easier with good data hygiene practices over the long term, but with so many files to manage across a variety of repositories and storage platforms, it can be challenging to track risks and suspicious activities at all times. Defining dedicated policies for which data types can be stored in which locations, which users can access them, and all parties with which they can be shared will help you focus your attention on further minimizing risk. AI can multiply these efforts by eliminating manual monitoring processes, providing better visibility into how data is being used, and alerting you when sensitive content may have been shared externally or with unapproved users. This makes it far easier to identify and respond to threats and risky behavior, enabling you to take immediate action on compromised accounts, or to move or delete sensitive content that is being shared too broadly or stored in unauthorized locations.
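The location policies described above can be sketched as a simple allow-list check; the repository names and data categories below are hypothetical.

```python
# Sketch of a location-policy check: sensitive categories may only live in
# approved repositories; anything else produces an alert. All names are
# illustrative assumptions.
ALLOWED_LOCATIONS = {
    "pii": {"secure-s3", "sharepoint-restricted"},
    "public": {"secure-s3", "sharepoint-restricted", "box-shared"},
}

def policy_alerts(placements: list) -> list:
    """Return (file, location) pairs that violate the location policy."""
    return [
        (name, location)
        for name, category, location in placements
        if location not in ALLOWED_LOCATIONS.get(category, set())
    ]

placements = [
    ("customers.csv", "pii", "box-shared"),   # violation: PII outside approved stores
    ("brochure.pdf", "public", "box-shared"), # allowed
]
print(policy_alerts(placements))  # → [('customers.csv', 'box-shared')]
```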

The key to data lifecycle management

The sheer volume of data, files and content businesses are now generating and managing creates massive amounts of complexity and risk. You have to know what assets exist, where they’re stored, which specific users have access to them, when they’re being shared, which files can be deleted, which need to be stored in accordance with regulatory requirements, and so on. Falling short in any one of these areas can lead to major operational, financial and reputational consequences.

Fortunately, recent advances in AI and ML are enabling companies to streamline data governance to find and secure sensitive data at its source, sense and respond to potentially malicious behaviors, maintain compliance and adapt to changing regulatory criteria, and more. As manual processes and piecemeal point solutions fall short, AI-enabled data governance will continue to dramatically reduce complexity for both users and administrators, and deliver the level of visibility and control that businesses need in today’s data-centric world.

Data crisis: Companies pivot from disruption to transformation

Only 10% of organizations are using data effectively for transformational purposes, according to NTT DATA Services.


While 79% of organizations recognize the strategic value of data, the study concludes their efforts to use it are hindered by significant challenges including siloed islands of data across the organization and lack of data skills and talent.

The study analyzes the critical role of data and analytics in helping businesses and organizations pivot from disruption to transformation, an imperative as they respond to today’s global economic climate.

Organizations starting to prioritize a data-driven culture

The study shows only 37% are very effective at using data to adopt or invent a new business model, and only 31% are using data to enter new markets. These different use cases show that organizations have started prioritizing a data-driven culture, but many are still lagging in the most basic aspects of data management and governance.

“Our study reinforces that organizations who act quickly and decisively on their data strategies – or Data Leaders – will recover from the global crisis better and even accelerate their success,” said Greg Betz, Senior Vice President, Data Intelligence and Automation, NTT DATA Services.

“C-suite executives must be champions for the vital role strong data governance plays in resolving systemic process failures and transitioning to new business models in response to the crisis.

“To rebound effectively, corporations, organizations and government agencies must shift to next-generation technologies and create contactless experiences, increased security, and scalable hybrid infrastructures – all reinforced by quality, integrated data.”

Data crisis: Organizations struggle to use data for transformation

The financial services (FS) sector accounts for 25% of the data leaders, making it the sector with the most data leaders. The survey shows that 59% of FS organizations report being aware of and fully prepared for new data regulations.

34% report that data is shared seamlessly across the enterprise; however, FS organizations are also the least likely to report having clear data security processes in place.

The manufacturing sector boasts the second-highest number of data leaders in the study. More than eight out of 10 respondents say they can act swiftly if there is a data privacy breach; however, as with other sectors, when they attempt to derive value from their data, manufacturers struggle with data silos (24%), and they lack the necessary skills and talent to analyze their data (19%).

Among healthcare respondents, 60% say they’re aware and fully prepared for new and upcoming regulations, and approximately eight out of 10 say they’re confident they can comply with data privacy regulations.

However, this sector ranks first in its lack of data literacy skills — about a fifth of respondents report they don’t understand how to read, create and communicate data as information.


Lack of data talent and skills in the public sector

The public sector has the highest number of data laggards at 37%. Like other sectors, lack of data talent and skills is one of the public sector’s biggest barriers when attempting to understand and derive value from data.

Insurance companies are among the most likely to report they’re aware and fully prepared for new data regulations (58%) and have clear processes in place for securely using their data (50%).

However, when it comes to deriving value from data, insurance companies, like manufacturers, struggle with data silos and a lack of the right technologies to analyze their data.

“This study validates that many of the top data challenges organizations face today are decades old,” said Theresa Kushner, Consultant, AI and Analytics, NTT DATA Services. “The 2020 pandemic is a wakeup call for businesses at any scale, and a reminder that in today’s global economic climate the time to address data challenges and chart a new path is now.”

340 GDPR fines for a total of €158,135,806 issued since May 2018

Since the GDPR rolled out in May 2018, European data protection authorities have issued 340 fines. Every EU member state, plus the United Kingdom, has issued at least one GDPR fine, Privacy Affairs finds.


Whilst the GDPR sets out the regulatory framework that all EU countries must follow, each member state legislates independently, is permitted to interpret the regulations differently, and may impose its own penalties on organizations that break the law.

Nations with the highest fines

  • France: €51,100,000
  • Italy: €39,452,000
  • Germany: €26,492,925
  • Austria: €18,070,100
  • Sweden: €7,085,430
  • Netherlands: €3,490,000
  • Spain: €3,306,771
  • Bulgaria: €3,238,850
  • Poland: €1,162,648
  • Norway: €985,400

Nations with the most fines

  • Spain: 99
  • Hungary: 32
  • Romania: 29
  • Germany: 28
  • Bulgaria: 21
  • Czech Republic: 13
  • Belgium: 12
  • Italy: 11
  • Norway: 9
  • Cyprus: 8

The second-highest number of fines comes from Hungary, whose National Authority for Data Protection and Freedom of Information has issued 32 fines to date. The largest, €288,000, was issued to an ISP for improper and insecure storage of customers’ personal data.

UK organizations have been issued just four fines, totalling over €640,000, by the Information Commissioner’s Office. The average penalty within the UK is €160,000. This does not include the potentially massive fines for Marriott International and British Airways that are still under review.

British Airways could face a fine of €204,600,000 for a 2018 data breach that resulted in the loss of personal data of 500,000 customers.

Similarly, Marriott International suffered a breach that exposed 339 million people’s data. The hotel group faces a fine of €110,390,200.

The largest GDPR fines

The largest GDPR fine to date was issued by French authorities to Google in January 2019. The €50 million fine was imposed for a “lack of transparency, inadequate information and lack of valid consent regarding ads personalization.”

Highest fines issued to private individuals:

  • €20,000 issued to an individual in Spain for unlawful video surveillance of employees.
  • €11,000 issued to a soccer coach in Austria who was found to be secretly filming female players while they were taking showers.
  • €9,000 issued to another individual in Spain for unlawful video surveillance of employees.
  • €2,500 issued to a person in Germany who sent emails to several recipients, where each could see the other recipients’ email addresses. Over 130 email addresses were visible.
  • €2,200 issued to a person in Austria for having unlawfully filmed public areas using a private CCTV system. The system filmed parking lots, sidewalks, a garden area of a nearby property, and it also filmed the neighbors going in and out of their homes.

CCPA enforcement to put pressure on financial organizations’ IT resources

Enforcement of the California Consumer Privacy Act (CCPA), which begins on July 1, 2020, is going to put additional pressure on already overstretched IT resources and budgets, Netwrix reveals.


Increase in DSARs

According to the survey, 32% of financial organizations have already seen an increase in data subject access requests (DSARs) since the CCPA came into force on January 1, 2020.

73% of respondents stated that manual processing of these requests puts significant or moderate pressure on their IT teams. More than one in four organizations (27%) noted that rising interest in exercising privacy rights has increased their expenses.

Gartner warns that fulfilling a single request takes most organizations two or more weeks and costs an average of $1,400 if done manually. This means that many financial organizations, which are already facing tough times, will need to allocate additional workforce and budget to ensure compliance with the CCPA.
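Using Gartner’s roughly $1,400-per-request figure cited above, a back-of-the-envelope projection shows how quickly manual DSAR processing adds up; the request volume is an assumed input.

```python
# Back-of-the-envelope DSAR cost projection using Gartner's ~$1,400 figure
# for a manually processed request. Monthly request volume is hypothetical.
MANUAL_COST_PER_REQUEST = 1400  # USD, per the Gartner estimate cited above

def annual_dsar_cost(requests_per_month: int) -> int:
    """Projected yearly spend on manual DSAR processing."""
    return requests_per_month * 12 * MANUAL_COST_PER_REQUEST

print(annual_dsar_cost(25))  # 25 requests/month → $420,000 a year
```

Numbers like these are why the article argues automation quickly pays for itself for organizations facing any meaningful DSAR volume.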

Other findings

  • 33% of financial organizations discovered sensitive or regulated customer data outside of designated secure locations.
  • 40% of respondents admitted their IT teams granted direct access to sensitive data based solely on a user’s request in the past 12 months.
  • 75% of financial organizations that classify data can detect data misuse in minutes, while those who don’t usually need days (43%) or months (29%).
  • 70% of incidents of unauthorized data sharing within this vertical led to data compromise.
  • 44% of CISOs and CIOs don’t have or don’t know whether they have KPIs for IT security and risk.

“While organizations are unlikely to be flooded with data subject access requests on July 2, they do need to be prepared to process requests accurately and promptly. One missed deadline or incompletely fulfilled request could result in a thorough audit from the authorities and sizable fines.

“To ensure compliance while controlling costs and relieving the burden on IT, financial organizations need to automate the DSAR process,” said Steve Dickson, CEO of Netwrix.

Does analyzing employee emails run afoul of the GDPR?

A desire to remain compliant with the European Union’s General Data Protection Regulation (GDPR) and other privacy laws has made HR leaders wary of any new technology that digs too deeply into employee emails. This is understandable, as GDPR non-compliance may lead to stiff penalties.


At the same time, new technologies are applying artificial intelligence (AI) and machine learning (ML) to solve HR problems like analyzing employee data to help with hiring, completing performance reviews or tracking employee engagement. This has great potential for helping businesses coach and empower employees (and thus help them retain top talent), but these tools often analyze employee emails as a data source. Does this create a privacy issue in regard to the GDPR?

In most cases, the answer is “no.” Let’s explore these misconceptions and explain how companies can stay compliant with global privacy laws while still using AI/ML workplace technologies to provide coaching and empowerment solutions to their employees.

Analyzing employee data with AI/ML isn’t unique to HR

First of all, many applications already analyze digital messages with AI/ML. Many of these are likely already in use by your organization and do not ask for consent from every sender for every message they analyze. Antivirus software uses AI/ML to scan incoming messages for viruses, chatbots use it to answer support emails, and email clients themselves use AI/ML to suggest responses to common questions as the user types them, or to create prompts to schedule meetings.

Applications like Gmail, Office 365 Scheduler, ZenDesk and Norton Antivirus do these tasks all the time. Office 365 Scheduler even analyzes emails using natural language processing to streamline the simple task of scheduling a meeting. Imagine if they had to ask for the user’s permission every time they did this! HR technologies that do something similar are not unique.
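As a toy illustration of the kind of automated message analysis these applications perform, the sketch below scores a message against weighted keywords; the phrases and weights are invented for illustration and resemble no vendor’s actual model.

```python
# Toy sketch of automated message analysis: a weighted-keyword scorer that
# decides whether a message deserves attention. Terms and weights are
# illustrative assumptions, not any product's real model.
SUSPICIOUS_TERMS = {"invoice attached": 2, "urgent wire transfer": 3, "password reset": 1}

def risk_score(message: str) -> int:
    """Sum the weights of suspicious phrases found in a message body."""
    body = message.lower()
    return sum(w for term, w in SUSPICIOUS_TERMS.items() if term in body)

msg = "URGENT wire transfer needed today, invoice attached."
print(risk_score(msg))  # → 5 (3 + 2)
```

Every message passes through logic of roughly this shape without any sender being asked for consent, which is the point the article is making about HR tools that do something similar.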

Employers also process employees’ personal data without their consent on a daily basis. Consider these tasks: automatically storing employee communications, creating paperwork for employee reviews or disciplinary action, or sending payroll information to government agencies. Employees don’t need to give consent for any of this, because a different legal basis allows the company to process data in this way.

Companies do not need employee consent in this context

This isn’t an issue because the GDPR offers five alternative legal bases pursuant to which employee personal data can be processed, including the pursuit of the employer’s “legitimate interests.” This concept is intentionally broad and gives organizations flexibility to determine whether their interests are appropriate, regardless of whether those interests are commercial, individual, or broader societal benefits, or even whether the interests are the company’s own or those of a third party.

The GDPR singles out fraud prevention and direct marketing as two specific purposes for which personal data may be processed in pursuit of a legitimate interest, but there are many more.

These “legitimate interest” bases give employers grounds to process personal data using AI/ML applications without requiring consent. In fact, employers should avoid relying on consent to process employees’ personal data whenever possible. Employees are almost never in a position to voluntarily or freely give consent due to the imbalance of power inherent in employer-employee relationships, and such consents are therefore often invalid. In all the cases listed above, the employer relies on legitimate interest to process employee data. HR tools fall into the same category and don’t require consent.

A right to control your inbox

We’ve established that employers can process email communication data internally with new HR tools that use AI/ML and be compliant with the GDPR. But should they?

Here is where we move from legal issues to ethical issues. Some companies that value privacy might believe that employees should control their own inbox, even though that’s not a GDPR requirement. That means letting employees grant and revoke permission to the applications that can read their workplace emails (and which have already been approved by the company). This lets the individual control their own data. Other organizations may value the benefits of new tools over employee privacy and may put them in place without employees’ consent.

I have seen some organizations create a middle ground by making these tools available to employees but requiring them to opt in to use them (rather than installing them and giving employees the option to opt out, which puts an extra burden on them to maintain privacy). This approach can both respect employees’ privacy and allow HR departments to use new technologies to empower individuals who so choose. This is more important than ever in the new era of widespread work from home, where workplace communication is abundant and companies are charting new courses to help their employees thrive in the future of work.

Fully understanding the compliance and privacy issues associated with new AI/ML tools is key to rolling them out effectively. These solutions can be powerful and may help your employees become more self-aware and better leaders, provided organizations take the time to understand those issues before deployment.

EU Commission: The GDPR has been an overall success

The European Commission has published an evaluation report on the General Data Protection Regulation (GDPR), two years after the regulation became enforceable.


Two years of the GDPR: What was achieved?

Citizens are more empowered and aware of their rights: The GDPR enhances transparency and gives individuals enforceable rights, such as the right of access, rectification, erasure, the right to object and the right to data portability. Today, 69% of the EU population above the age of 16 have heard about the GDPR, and 71% have heard about their national data protection authority, according to results published last week in a survey from the EU Fundamental Rights Agency. However, more can be done to help citizens exercise their rights, notably the right to data portability.

Data protection rules are fit for the digital age: The GDPR has empowered individuals to play a more active role in relation to what is happening with their data in the digital transition. It is also contributing to fostering trustworthy innovation, notably through a risk-based approach and principles such as data protection by design and by default.

Data protection authorities are making use of their stronger corrective powers: From warnings and reprimands to administrative fines, the GDPR provides national data protection authorities with the right tools to enforce the rules. However, they need to be adequately supported with the necessary human, technical and financial resources. Many Member States are doing this, with notable increases in budgetary and staff allocations. Overall, there has been a 42% increase in staff and 49% in budget for all national data protection authorities taken together in the EU between 2016 and 2019. However, there are still stark differences between Member States.

Data protection authorities are working together in the context of the European Data Protection Board (EDPB), but there is room for improvement: The GDPR established a governance system designed to ensure a consistent and effective application of the GDPR through the so-called ‘one-stop-shop’, which provides that a company processing data cross-border has only one data protection authority as its interlocutor, namely the authority of the Member State where its main establishment is located. Between 25 May 2018 and 31 December 2019, 141 draft decisions were submitted through the ‘one-stop-shop’, 79 of which resulted in final decisions. However, more can be done to develop a truly common data protection culture. In particular, the handling of cross-border cases calls for a more efficient and harmonised approach and an effective use of all tools provided in the GDPR for the data protection authorities to cooperate.

Advice and guidelines by data protection authorities: The EDPB is issuing guidelines covering key aspects of the Regulation and emerging topics. Several data protection authorities have created new tools, including helplines for individuals and businesses, and toolkits for small and micro-enterprises. It is essential to ensure that guidance provided at national level is fully consistent with guidelines adopted by the EDPB.

Harnessing the full potential of international data transfers: Over the past two years, the Commission’s international engagement on free and safe data transfers has yielded important results. This includes Japan, with which the EU now shares the world’s largest area of free and safe data flows. The Commission will continue its work on adequacy, with its partners around the world. In addition and in cooperation with the EDPB, the Commission is looking at modernising other mechanisms for data transfers, including Standard Contractual Clauses, the most widely used data transfer tool. The EDPB is working on specific guidance on the use of certification and codes of conduct for transferring data outside of the EU, which need to be finalised as soon as possible. Given the European Court of Justice may provide clarifications in a judgment to be delivered on 16 July that could be relevant for certain elements of the adequacy standard, the Commission will report separately on the existing adequacy decisions after the Court of Justice has handed down its judgment.

Promoting international cooperation: Over the last two years, the Commission has stepped up bilateral, regional and multilateral dialogue, fostering a global culture of respect for privacy and convergence between different privacy systems to the benefit of citizens and businesses alike. The Commission is committed to continuing this work as part of its broader external action, for example, in the context of the Africa-EU Partnership and in its support for international initiatives, such as ‘Data Free Flow with Trust’. At a time when violations of privacy rules may affect large numbers of individuals simultaneously in several parts of the world, it is time to step up international cooperation between data protection enforcers. This is why the Commission will seek authorisation from the Council to open negotiations for the conclusion of mutual assistance and enforcement cooperation agreements with relevant third countries.

GDPR: What’s next?

According to the report, in two years the GDPR has met most of its objectives, in particular by offering citizens a strong set of enforceable rights and by creating a new European system of governance and enforcement.

The GDPR proved to be flexible to support digital solutions in unforeseen circumstances such as the Covid-19 crisis. The report also concludes that harmonisation across the Member States is increasing, although there is a certain level of fragmentation that must be continually monitored. It also finds that businesses are developing a compliance culture and increasingly use strong data protection as a competitive advantage.

The GDPR has acted as a catalyst for many countries and states around the world – e.g., Chile, South Korea, Brazil, Japan, Kenya, India, Tunisia, Indonesia, Taiwan and the state of California – to consider how to modernise their privacy rules, the EC noted.

They also pointed out that the GDPR provided data protection authorities with many corrective powers to enforce it (administrative fines, orders to comply with data subjects’ requests, bans on processing, the suspension of data flows, etc.).

There is room for improvement, though.

“For example, we need more uniformity in the application of the rules across the Union: this is important for citizens and for businesses, especially SMEs. We need also to ensure that citizens can make full use of their rights,” noted Didier Reynders, Commissioner for Justice.

The EC also noted that stakeholders should closely monitor the application of the GDPR to new technologies such as AI, the Internet of Things, and blockchain.