COVID-19 has accelerated the push toward digital business transformation for most organizations, and legal and compliance leaders are under pressure to anticipate both the potential improvements and the possible risks that come with new legal technology innovations, according to Gartner.
Legal technology innovations
To address this challenge, Gartner lists 31 must-watch legal technologies to help legal and compliance leaders identify innovations that will enable them to act faster. They can use this information for internal planning and for prioritizing emerging innovations.
“Legal and compliance leaders must collaborate with other stakeholders to garner support for organization-wide and function-wide investments in technology,” said Zack Hutto, director in the Gartner Legal and Compliance practice.
“They must address complex business demand by investing in technologies and practices to better anticipate, identify and manage risks, while seeking out opportunities to contribute to growth.”
Analysts said enterprise legal management (ELM), subject rights requests, predictive analytics, and robotic process automation (RPA) are likely to be most beneficial for the majority of legal and compliance organizations within a few years. They are also likely to help with the increased need for cost optimization and unplanned legal work arising from the pandemic.
Enterprise legal management
This is a multifaceted market where several vendors are trying to consolidate many of the technologies on this year’s Hype Cycle into unified platforms and suites to streamline the many aspects of corporate governance.
“Just as enterprise resource planning (ERP) overhauled finance, there is promise for a foundational system of record to improve in-house legal operations and workflows,” said Mr. Hutto. “Legal leaders should take a lesson from ERP’s evolution: ‘monolithic’ IT systems tend to lack flexibility and can quickly become an anchor, not a sail.”
Legal application leaders and general counsel must begin with their desired business outcomes, and only then find a technology that can help deliver those outcomes.
Subject rights requests
The demand for subject rights requests (SRRs) is growing along with the number of regulations that enshrine a data subject’s right to access their data and request amendment or deletion. Current regulations include the CCPA in the U.S., the EU’s GDPR and Brazil’s Lei Geral de Proteção de Dados (LGPD).
Many organizations are funneling their subject access requests (SARs) through internal legal counsel to limit the potential exposure to liability. This is costing, on average, $1,406 per SAR.
“In the face of rising request volumes and significant costs, there is great potential for legal and compliance leaders to make substantial savings and free up time by using technology to automate part, if not most, of the SRR workflow,” said Mr. Hutto.
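As an illustration of what partially automating the SRR workflow can look like, the sketch below classifies an incoming request so it can be routed without counsel reviewing every ticket. The keyword rules and function names are invented for illustration, not taken from the Gartner research; a production system would rely on structured form fields or a trained classifier.

```python
# Illustrative sketch of automating the intake step of a subject
# rights request (SRR) workflow: classify the request type from free
# text so routine requests can be routed without counsel review.
import re

# Keyword rules are hypothetical; real systems would use the request
# form's structured fields or a trained classifier.
RULES = [
    ("deletion", re.compile(r"\b(delete|erase|remove)\b", re.I)),
    ("do_not_sell", re.compile(r"\b(do not sell|opt[- ]?out)\b", re.I)),
    ("access", re.compile(r"\b(access|copy|what data)\b", re.I)),
]

def classify_srr(text: str) -> str:
    for label, pattern in RULES:
        if pattern.search(text):
            return label
    return "needs_human_review"  # escalate anything ambiguous

print(classify_srr("Please delete all data you hold about me"))  # deletion
```

Routing ambiguous requests to a human reviewer keeps counsel in the loop for edge cases while automating the bulk of the volume.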
Predictive analytics
This is a well-established technology and the market is mature, so it can be relatively simple to use “out-of-the-box” or via a cloud service. Typically, the technology examines data or content to answer the question, “What is likely to happen if…?”
“Adoption of this technology in legal and compliance is typically less mature than other business functions,” said Mr. Hutto. “This likely means untapped use cases where existing solutions could be used in the legal and compliance context to offer some real benefits.
“While analytics platforms may make data analysis more ‘turnkey,’ extracting real insights may be more elusive. Legal and compliance leaders should still consider and improve the usefulness of their data, the capabilities of their teams, and the accessibility of data in various existing systems.”
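To make the “What is likely to happen if…?” framing concrete, here is a deliberately minimal sketch with fabricated sample data; real predictive analytics platforms use far richer features and models than a historical frequency:

```python
# A minimal, invented example of predictive analytics in a legal
# context: estimate the chance a contract type ends in dispute,
# based purely on historical outcome frequency.
from collections import Counter

HISTORY = [  # (contract_type, disputed) - fabricated sample data
    ("vendor", True), ("vendor", False), ("vendor", True),
    ("nda", False), ("nda", False),
]

def dispute_probability(contract_type: str) -> float:
    """Share of past contracts of this type that ended in dispute."""
    outcomes = [disputed for ctype, disputed in HISTORY if ctype == contract_type]
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

print(dispute_probability("vendor"))  # 2 of 3 -> ~0.67
```

Even this trivial model illustrates Mr. Hutto’s caveat: the output is only as useful as the historical data behind it.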
Robotic process automation (RPA)
RPA’s potential to streamline workflows for repetitive, rule-based tasks is already well-established in other business functions. Typically, RPA is best suited to systems with standardized (often legacy) user interfaces for which scripts can be written.
“Where legal departments already use these types of systems it is likely that RPA can drive higher efficiency,” said Mr. Hutto. “However, not all legal departments use such systems. If not, it could make sense to take a longer view and consider investing in systems that have automation functionality built in.”
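The RPA pattern described above, scripted steps replayed against a standardized interface, can be sketched as follows. The stub interface, field names, and script contents are invented for illustration; real deployments drive an actual application UI through an RPA tool.

```python
# Minimal illustration of the RPA pattern: a script of rule-based
# steps replayed against a standardized interface. The "interface"
# here is a stub that records actions; real RPA drives a live UI.
def legacy_ui_stub(action: str, field: str, value: str = "") -> str:
    """Stand-in for a legacy UI; returns a record of the action taken."""
    return f"{action}:{field}={value}"

# A hypothetical script closing out a legal matter.
SCRIPT = [
    ("type", "matter_id", "M-1042"),
    ("type", "status", "closed"),
    ("click", "submit", ""),
]

def run_script(ui, script):
    """Replay each scripted step against the target interface."""
    return [ui(*step) for step in script]

for line in run_script(legacy_ui_stub, SCRIPT):
    print(line)
```

Because the script only works while the interface stays standardized, this also illustrates why Mr. Hutto suggests that departments without such systems consider tools with automation built in instead.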
Gartner’s advice to consider these four technologies is not based solely on their position on the Hype Cycle. Legal and compliance leaders should focus on the technologies with the greatest potential to drive transformation within their own organizations in the near to medium term; the position on the Hype Cycle is part of that story, but not the whole of it.
For example, Mr. Hutto said blockchain is a technology that has the potential to make a successful journey to the Plateau of Productivity within five years. But for now, its application will likely be limited to quite a narrow set of use cases, and it is unlikely to be transformational for corporate legal and compliance leaders.
COVID-19 has put a spotlight on ethical issues emerging from the increased use of AI applications and the potential for bias and discrimination.
A report from the Capgemini Research Institute found that in 2020, 45% of organizations had defined an ethical charter to provide guidelines on AI development, up from 5% in 2019, as businesses recognize the importance of having defined standards across industries.
However, a lack of leadership in terms of how these systems are developed and used is coming at a high cost for organizations.
The report notes that while organizations are more ethically aware, progress in implementing ethical AI has been inconsistent. For example, the progress on “fairness” (65%) and “auditability” (45%) dimensions of ethical AI has been non-existent, while transparency has dropped from 73% to 59%, despite the fact that 58% of businesses say they have been building awareness amongst employees about issues that can result from the use of AI.
The research also reveals that 70% of customers want a clear explanation of results and expect organizations to provide AI interactions that are transparent and fair.
Ethical governance has become a prerequisite
The need for organizations to implement an ethical charter is also driven by increased regulatory frameworks. For example, the European Commission has issued guidelines on the key ethical principles that should be used for designing AI applications.
Meanwhile, guidelines issued by the FTC in early 2020 call for transparent AI, stating that when an AI-enabled system makes an adverse decision (such as declining credit for a customer), then the organization should show the affected consumer the key data points used in arriving at the decision and give them the right to change any incorrect information.
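A hedged sketch of the kind of disclosure that guidance calls for: surface the inputs that contributed most to an adverse decision so the affected consumer can contest incorrect values. The fields, weights, and scoring scheme below are invented for illustration, not taken from the FTC guidelines.

```python
# Invented example: a linear scoring model declines an application,
# and the organization reports the key data points behind the decision
# so the consumer can correct any inaccurate inputs.
WEIGHTS = {"late_payments": -0.6, "credit_utilization": -0.3, "income": 0.2}

def explain_decision(applicant: dict, top_n: int = 2):
    """Return the inputs that contributed most to an adverse outcome."""
    contributions = {
        field: applicant[field] * weight for field, weight in WEIGHTS.items()
    }
    # The most negative contributions drove the decline.
    return sorted(contributions, key=contributions.get)[:top_n]

# The two fields most responsible for this (hypothetical) decline:
print(explain_decision({"late_payments": 4, "credit_utilization": 0.9, "income": 3}))
```

Linear models make this disclosure straightforward; for complex models, the same requirement motivates explainability tooling.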
However, while 73% of organizations globally informed users in 2019 about the ways in which AI decisions might affect them, today that figure has dropped to 59%.
According to the report, this is indicative of current circumstances brought about by COVID-19, growing complexity of AI models, and a change in consumer behavior, which has disrupted the functionalities of the AI algorithms.
New factors, including a preference for safety, bulk buying, and a lack of training data for comparable past situations, have meant that organizations are redesigning their systems to suit a new normal; however, this has led to less transparency.
Discriminatory bias in AI systems comes at a high cost for organizations
Many public and private institutions deployed a range of AI technologies during COVID-19 in an attempt to curtail the impacts wrought by the pandemic. As these deployments continue, it is critical for organizations to uphold customer trust by furthering positive relationships between AI and consumers. However, reports show that datasets collected for healthcare and the public sector are subject to social and cultural bias.
This is not limited to just the public sector. The research found that 65% of executives said they were aware of the issue of discriminatory bias with AI systems. Further, close to 60% of organizations have attracted legal scrutiny and 22% have faced a customer backlash in the last two to three years because of decisions reached by AI systems.
In fact, 45% of customers noted they will share their negative experiences with family and friends and urge them not to engage with an organization, 39% will raise their concerns with the organization and demand an explanation, and 39% will switch from the AI channel to a higher-cost human interaction. 27% of consumers say they would cease dealing with the organization altogether.
Establish ownership of ethical issues – leaders must be accountable
Only 53% of organizations have a leader, such as a Chief Ethics Officer, who is responsible for the ethics of their AI systems. It is crucial to establish leadership at the top to ensure these issues receive due priority from top management and to create ethically robust AI systems.
In addition, leaders in business and technology functions must be fully accountable for the ethical outcomes of AI applications. The research shows that only half of organizations have a confidential hotline or ombudsman that enables customers and employees to raise ethical issues with AI systems.
The report highlights seven key actions for organizations to build an ethically robust AI system, which need to be underpinned by a strong foundation of leadership, governance, and internal practices:
- Clearly outline the intended purpose of AI systems and assess their overall potential impact
- Proactively deploy AI for the benefit of society and the environment
- Embed diversity and inclusion principles throughout the lifecycle of AI systems
- Enhance transparency with the help of technology tools
- Humanize the AI experience and ensure human oversight of AI systems
- Ensure technological robustness of AI systems
- Protect people’s individual privacy by empowering them and putting them in charge of AI interactions
Anne-Laure Thieullent, Artificial Intelligence and Analytics Group Offer Leader at Capgemini, explains, “Given its potential, it would be a disservice if the ethical use of AI were limited to ensuring no harm to users and customers. It should be a proactive pursuit of environmental good and social welfare.
“AI is a transformational technology with the power to bring about far-reaching developments across the business, as well as society and the environment. This means governmental and non-governmental organizations that possess the AI capabilities, wealth of data, and a purpose to work for the welfare of society and environment must take greater responsibility in tackling these issues to benefit societies now and in the future.”
Californians regularly opt-out of companies selling their personal information, with “Do-not-sell” being the most common CCPA right exercised, happening nearly 50% of the time over access and deletion requests, DataGrail’s Mid-Year CCPA Trends Report shows.
Consumer rights under CCPA
The California Consumer Privacy Act gives California residents the right to:
- Know what personal data businesses have about them
- Know what businesses do with that information (to whom they sell it or disclose it)
- Access their personal data
- Refuse the sale of their personal data
- Request that a business deletes their personal data
Do-not-sell requests are almost 50% of all DSRs
When CCPA went into effect in January 2020, DataGrail saw people exercise their rights immediately, with a surge of data subject requests (DSRs) going across its platform in January 2020.
Since the initial surge, DSRs have stabilized around 13 DSRs per million records every month, which is a substantial rate and confirms that organizations need an established privacy program.
Consumers are accessing their data (21%), deleting their data (31%), and requesting that businesses do not sell their personal information (48%).
Gartner data shows that manually processing a single DSR costs on average $1,406. At this rate, organizations can expect to spend almost $240,000 per million records to fulfill DSRs – if they are done manually.
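The “almost $240,000” figure follows directly from those two numbers; a quick sketch of the arithmetic:

```python
# Estimated annual cost of manually fulfilling DSRs, using the figures
# cited above: $1,406 per request (Gartner) and roughly 170 requests
# per million consumer records per year (DataGrail).
COST_PER_DSR = 1406        # USD, manual processing
DSRS_PER_MILLION = 170     # annual requests per 1M records

def annual_dsr_cost(records: int) -> float:
    """Projected yearly manual-processing cost for a given record base."""
    requests = records / 1_000_000 * DSRS_PER_MILLION
    return requests * COST_PER_DSR

print(annual_dsr_cost(1_000_000))  # 239020.0 -> the "almost $240,000" above
```

The cost scales linearly with record volume, which is why the report singles out manual processing as untenable for large consumer bases.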
Additionally, organizations could be on the hook for more DSRs once enforcement fines begin appearing, likely in October, if CCPA follows the same timeline as GDPR.
According to the research, B2C companies should prepare to process approximately 170 total DSRs per one million consumer records each year.
DataGrail has also found that three of every ten DSRs will go unverified, confirming the need for a robust and scalable verification method to prevent fraud (i.e., detect fraudulent requests being made to steal personal data).
Access requests (DSARs) make up 70% of the unverified requests, validating the concern that nefarious characters could be submitting access requests to gain access to another person’s personal information.
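One common way to verify a requester (a generic sketch, not DataGrail’s method) is to require proof of control of the email address on file before fulfilling an access request:

```python
# Illustrative verification step for a DSAR: the requester must echo
# back a token sent to the email on record, so an attacker submitting
# an access request for someone else's data cannot complete it.
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)  # per-deployment signing key

def issue_token(email: str) -> str:
    """Token sent to the email on record; only its owner can return it."""
    return hmac.new(SECRET, email.encode(), hashlib.sha256).hexdigest()

def verify(email: str, token: str) -> bool:
    """Constant-time check that the token matches the claimed email."""
    return hmac.compare_digest(issue_token(email), token)

t = issue_token("person@example.com")
print(verify("person@example.com", t))   # True
print(verify("attacker@example.com", t)) # False
```

Stronger verification (identity documents, account login) may be warranted for sensitive data, but even an email round-trip blocks the bulk of opportunistic fraudulent requests.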
Although consumers remain concerned about sharing personal data with companies, the results of a Privitar survey highlight an opportunity for businesses to take a leadership role and build brand loyalty by protecting their customers.
The report found that more than three-quarters of respondents are concerned or very concerned about protecting their personal data, with 42 percent of consumers saying they wouldn’t share sensitive data (e.g. name, address, email address, phone number, location information, health information, banking information, social security number) with a business for any reason.
As consumers grow increasingly apprehensive about their data, business success will depend on an organization’s ability to prioritize and successfully execute on privacy initiatives.
Disconnect between consumer sentiment and actions surrounding data protection
When it comes to the management of their data, many consumers aren’t fully aware of how brands are securing their personal information. According to the survey, 43 percent of consumers don’t know if they’ve worked with a business that has been impacted by a data breach.
When it comes to privacy notices, 28 percent admit to not reading privacy notices at all and 42 percent admitted to only skimming the text. These findings point to a growing sentiment that data privacy should be the responsibility of the business – not the customer. With this, businesses have a tremendous opportunity to make data privacy a differentiator and way to build long-term loyalty.
Pandemic creating more data sharing opportunities, but consumers remain wary
Despite the growing advancements on the data protection front, 51 percent of consumers surveyed said they are still not comfortable sharing their personal information. One-third of respondents said they are most concerned about it being stolen in a breach, with another 26 percent worried about it being shared with a third party.
In the midst of the growing pandemic, COVID-19 tracking, tracing, containment and research depends on citizens opting in to share their personal data. However, the research shows that consumers are not interested in sharing their information.
When specifically asked about sharing healthcare data, only 27 percent would share health data for healthcare advancements and research. Another 21 percent of consumers surveyed would share health data for contact tracing purposes.
As data becomes more valuable to combat the pandemic, companies must provide consumers with more background and reasoning as to why they’re collecting data – and how they plan to protect it.
Upcoming U.S. elections driving consumer awareness of data privacy
As the debate grows louder across the nation, 73 percent of consumers think that there should be more government oversight at the federal and/or state/local levels. While legislation can take years to pass, it’s important for businesses to overhaul their technology and processes now to quickly address consumers’ concerns and keep business running.
Businesses must drive data privacy action
Companies rely on brand loyalty to keep their operations up and running. While many point to affordable pricing and personalization as the keys to keeping business moving, they often overlook the importance of instilling a more personal sense of trust within their customer base.
When working with a business, 40 percent of consumers think the brand’s trustworthiness is most important when it comes to brand loyalty and 31 percent say it’s the brand’s commitment to protecting their data.
Nearly matching the 30 percent of consumers who believe customer service matters most, these results show that data protection is just as critical to keeping customers coming back.
However, broken trust and failure to protect that data have severe consequences, with 24 percent saying they have either stopped doing business or done less business with a company after it was breached.
As markets grow increasingly competitive in a fluctuating economy, it’s critical for businesses to keep customer loyalty high – and as such, be more open and transparent with how they’re using personal data.
“The global COVID-19 pandemic has underscored the importance of the trust relationship companies and governments need to build with consumers in an increasingly digital world,” said Jason du Preez, CEO, Privitar.
“The results of the survey affirm the growing need for brands to focus on building and maintaining this trust, starting first and foremost with protecting customer data. As more businesses utilize the cloud to enable data driven insights, a firm commitment to data privacy will help to ensure long-term loyalty, consumer satisfaction and shareholder value.”
Since the GDPR came into force in May 2018, European data protection authorities have issued 340 fines. Every EU nation, plus the United Kingdom, has issued at least one GDPR fine, Privacy Affairs finds.
While GDPR sets out the regulatory framework that all EU countries must follow, each member state legislates independently and may interpret the regulations differently and impose its own penalties on organizations that break the law.
Nations with the highest fines
- France: €51,100,000
- Italy: €39,452,000
- Germany: €26,492,925
- Austria: €18,070,100
- Sweden: €7,085,430
- Netherlands: €3,490,000
- Spain: €3,306,771
- Bulgaria: €3,238,850
- Poland: €1,162,648
- Norway: €985,400
Nations with the most fines
- Spain: 99
- Hungary: 32
- Romania: 29
- Germany: 28
- Bulgaria: 21
- Czech Republic: 13
- Belgium: 12
- Italy: 11
- Norway: 9
- Cyprus: 8
The second-highest number of fines comes from Hungary, where the National Authority for Data Protection and Freedom of Information has issued 32 fines to date. The largest, €288,000, was issued to an ISP for improper and non-secure storage of customers’ personal data.
UK organizations have been issued just seven fines, totaling over €640,000, by the Information Commissioner. The average penalty within the UK is €160,000. This does not include the potentially massive fines for Marriott International and British Airways that are still under review.
British Airways could face a fine of €204,600,000 for a data breach in 2019 that resulted in the loss of personal data of 500,000 customers.
Similarly, Marriott International suffered a breach that exposed 339 million people’s data. The hotel group faces a fine of €110,390,200.
The largest GDPR fines
The largest GDPR fine to date was issued by French authorities to Google in January 2019. The €50 million fine was issued on the basis of “lack of transparency, inadequate information and lack of valid consent regarding ads personalization.”
Highest fines issued to private individuals:
- €20,000 issued to an individual in Spain for unlawful video surveillance of employees.
- €11,000 issued to a soccer coach in Austria who was found to be secretly filming female players while they were taking showers.
- €9,000 issued to another individual in Spain for unlawful video surveillance of employees.
- €2,500 issued to a person in Germany who sent emails to several recipients, where each could see the other recipients’ email addresses. Over 130 email addresses were visible.
- €2,200 issued to a person in Austria for having unlawfully filmed public areas using a private CCTV system. The system filmed parking lots, sidewalks, a garden area of a nearby property, and it also filmed the neighbors going in and out of their homes.
TrustArc announced the results of its survey on how organizations are protecting and leveraging data, their most valuable asset. The survey polled more than 1,500 respondents from around the world at all levels of the organization.
“The TrustArc survey highlights just how difficult it can be to comply with even a single new regulation, such as CCPA, let alone the entire list of existing laws. The results also show how the COVID-19 pandemic and its attendant technologies, such as video conferencing, have exacerbated an already difficult privacy challenge and forced respondents to rethink their approaches.”
CCPA compliance readiness mostly lacking, prior GDPR preparedness a boost
- 29% of respondents say they have just started planning for CCPA.
- More than 20% of respondents report they are either somewhat unlikely to be, very unlikely to be, or don’t know if they will be fully compliant with CCPA on July 1.
- Just 14% of respondents are done with CCPA compliance, 9% have not started, and 15% have a plan but have not begun implementation.
- Of respondents who reported as being slightly or very knowledgeable about CCPA and GDPR regulations, 82% are leveraging at least some of the work they did for GDPR in implementing CCPA requirements.
Privacy professionals still use inefficient technologies for compliance programs
Though 90% of respondents agree or strongly agree that they are “mindful of privacy as a business,” many privacy professionals are left building privacy programs without automation.
- 19% of respondents report they are most deficient in automating privacy processes.
- Just 17% of all respondents have implemented privacy management software, which matches the 17% who are still using spreadsheets and word processors.
- In addition, 19% are using open source/free software and 9% are doing nothing.
- Even in the U.S., which boasts the highest rate of privacy management software adoption, just 22% of respondents use privacy management software as their primary compliance software.
Respondents understand the importance of data privacy and continue to invest in ongoing privacy programs. However, many are still attempting to implement these programs using manual processes and technologies that do not offer automation.
Moving forward, the companies that can leverage automation to simplify data privacy can protect their most valuable asset—data—and use it to drive business growth.
New technologies present additional challenges to compliance
With the move to all-remote workforces, companies are increasingly turning to technologies, such as video conferencing and collaboration tools. These tools present new avenues for data creation that privacy professionals must consider in their company-wide plans.
- Twenty-two percent of respondents said personal device security during the pandemic has added a great deal of risk to their businesses. “Personal device security” received the highest proportion of “a great deal of risk” responses, compared to the other four response options.
- A majority of respondents said that third-party data, supply chain, personal-device security, unintentional data sharing, and required or voluntary data sharing for public health purposes all added at least a moderate amount of risk to their businesses.
- Seventy percent of respondents say video conferencing tools have required a moderate or great change to their privacy approach, and 65% of respondents say collaboration tools have required a moderate or great change to privacy approaches.
Despite financial impact of pandemic, privacy compliance remains a high priority
Though many respondents expect a significant decrease in their company’s revenues as a result of the COVID-19 pandemic, they are still prioritizing privacy-related investments.
- Forty-four percent of companies expect a decrease or steep decrease in overall company revenues for the balance of 2020 as a result of COVID-19.
- Just 15% of respondents report they plan to spend less or a great deal less on privacy efforts in 2020 as a result of the pandemic.
- 42% of respondents plan to spend $500,000 or more in 2020 on CCPA efforts alone.
Boards of directors actively involved in privacy management
The mandate for increased privacy investments is coming from the very top of organizations.
- Eighty-three percent of respondents indicate their board of directors regularly reviews privacy approaches.
- An impressive 86% of respondents say that everyone from the board of directors to the front-line staff knows their role in protecting privacy.
- Four out of five respondents view privacy as a key differentiator for their company.
Two years after the GDPR went into effect, official data show that Data Protection Authorities (DPAs), crippled by a lack of resources, tight budgets, and administrative hurdles, have not yet been able to create adequate GDPR enforcement.
Worse, some public authorities have grossly misused the GDPR to undermine other fundamental rights such as the right to free expression and freedom of the press, Access Now reveals.
The GDPR’s first two years have been marked by crisis, whether internal, external, political, geopolitical, or administrative. Beyond enforcement challenges, the report explores how these crises have impacted the protection of personal data in the EU, taking a close look at both Brexit and the COVID-19 outbreak.
“Through this report, we raise the alarm to the EU institutions and Data Protection Authorities that it’s high time to act to enforce the GDPR and condemn its misuses,” said Estelle Massé, Senior Policy Analyst and Global Data Protection Lead at Access Now.
“The European Union may have the best law in the world for the protection of personal data, but if it is not enforced, it risks being as useful as a chocolate teapot.”
The GDPR remains a strong framework, and if authorities take urgent action, it can go a long way in defending people’s fundamental rights.
GDPR around the world
From May 2018 to March 2020, authorities levied 231 fines and sanctions while as many as 144,376 complaints were filed between May 2018 and May 2019.
Out of 30 DPAs from all 27 EU countries, the United Kingdom, Norway, and Iceland, only nine said they were happy with their level of resourcing. The inadequate budgets provided to DPAs mean that people’s rights may not be effectively protected. In fact, they may create a negative incentive for DPAs investigating large tech companies to agree to settlements that are more favorable to the companies. This is reinforced by the huge disparity of resources between data protection authorities and the companies they oversee.
In Poland, Romania, Hungary, and Slovakia, courts and authorities have been abusing the GDPR to curtail investigative journalism or target civic tech NGOs by trying to force outlets to reveal their sources.
The GDPR is a robust tool to guide officials and public health authorities in the response to the COVID-19 crisis. Access Now condemns Hungary’s disproportionate decision to limit the application of GDPR rights during the COVID-19 crisis as it gravely endangers people’s right to data protection at a time when our personal information, including our health data, is being collected perhaps more than ever.
Enforcement challenges and the UK’s insistence on lowering current standards through the Brexit talks have implications for any future negotiations of a so-called adequacy decision between the EU and the UK that would authorize the transfer of data between the two jurisdictions.
Governments across the EU must increase the financial and human resources allocated to Data Protection Authorities, including technical staff, so that they can function properly and be able to address the large number of complaints.
The European Commission should launch infringement procedures against EU states:
- When they do not provide sufficient resources to Data Protection Authorities, or
- When they do not guarantee the Data Protection Authority independence in status and in practices, or
- When Data Protection Authorities or courts misuse the GDPR to restrict freedom of the press or stifle civil society’s work.
Data Protection Authorities must not misuse the GDPR, as they hold much of the responsibility for the GDPR’s success or failure. It is absolutely unacceptable that DPAs misuse the GDPR to undermine civil society, restrict freedom of the press, or otherwise violate human rights.
Organizations that plan on manually processing CCPA data subject requests (DSRs) or data subject access requests will spend between $140,000 and $275,000 per million consumer records in their systems, according to DataGrail.
The CCPA went into effect on January 1, 2020, giving consumers the right to know what data is collected about them, to delete that data, and to ensure their data is not sold to third parties. The report analyzed the number of requests in Q1 2020 to understand how CCPA will impact organizations in the long run.
The early learnings from the first few months of CCPA should help businesses plan and predict the future of privacy regulation.
- Privacy headlines (and COVID-related emails) in March and April likely drove an increase in CCPA privacy requests.
- B2C companies should prepare to process approximately 100 to 194 requests per million consumer records each year.
- Processing CCPA privacy requests will likely cost B2C companies $140,000 to $275,000 per one million consumer records, if done manually.
- Deletion requests were the most popular (40%) in Q1 2020, followed by do-not-sell requests (33%) and access requests (27%).
- Do Not Sell (DNS) requests will likely become the dominant privacy request, based on early trend data.
CCPA privacy requests expected to stabilize
Looking forward to the remainder of 2020, the number of CCPA privacy requests is expected to stabilize around the February and March numbers (8 requests per million consumer records).
In July and August we may see a surge once again as CCPA enforcement begins on July 1, 2020.
DNS requests expected to dominate
DNS requests will likely dominate, with deletion requests not far behind, which means companies should prepare for the complex task of reaching out to their network of processors and sub-processors to successfully perform a hard delete. New regulations cause a lot of uncertainty and anxiety, especially when they involve a lot of complexity and carry associated fines.
92% of companies are concerned about new consumer rights under the California Consumer Privacy Act (CCPA), with 51% believing this is the hardest part of CCPA compliance and 64% planning to spend more than $100K on compliance in 2020, according to Truyo.
Despite changing IT priorities and tightening of spend due to COVID-19 measures, 56% of data privacy professionals are expecting there will be an increase in rights requests as a result of COVID-19.
The research found that consumers are actively exercising their rights under CCPA, with 51% of companies receiving more than 10 requests a week and 20% receiving more than 100 requests a week. The research surveyed 221 data privacy decision makers at companies with more than 1,000 employees between March 31 and April 13, 2020.
“With changed behavior due to the COVID-19 control measures, Americans are increasingly online and on Zoom, sharing more data than ever before. What was already a compliance headache for privacy professionals is now only likely to grow, given the additional requirements for employee data and a spotlight on companies to protect consumer privacy ahead of enforcement starting in July,” said Dan Clarke, President of Truyo.
What have companies done to address CCPA?
Companies are taking the new legal requirements seriously, with 59% investing in new tools to address CCPA privacy rights. When choosing a third-party provider, executives prioritized product features and automation capabilities, with a focus on long-term scalability through automation, while managers were more focused on costs.
The research also revealed a chasm in understanding between IT and legal departments on what’s involved in managing data: 55% of legal professionals said their solution was fully automated, compared with only 13% of IT professionals.
Privacy rights requests: What next?
With the exemption for employee rights under the CCPA due to end on December 31, 2020, 92% of privacy professionals said they planned to extend privacy rights to employees, with 62% planning to offer these rights to all employees, not just those in California. Only 15% say they intend to wait until this becomes a legal requirement under the CCPA.
74% are tracking the introduction of new state privacy legislation outside of California. For 64%, additional state legislation is the biggest driver for introducing a third-party tool to support compliance.
Over 40% of privacy compliance technology will rely on artificial intelligence (AI) by 2023, up from 5% today, according to Gartner.
The research was conducted online among 698 respondents in Brazil, Germany, India, the U.S. and the U.K.
“Privacy laws, such as General Data Protection Regulation (GDPR), presented a compelling business case for privacy compliance and inspired many other jurisdictions worldwide to follow,” said Bart Willemsen, research vice president at Gartner.
“More than 60 jurisdictions around the world have proposed or are drafting postmodern privacy and data protection laws as a result. Canada, for example, is looking to modernize their Personal Information Protection and Electronic Documents Act (PIPEDA), in part to maintain the adequacy standing with the EU post-GDPR.”
Privacy leaders are under pressure to ensure that all personal data processed is brought in scope and under control, which is difficult and expensive to manage without technological aid. This is where AI-powered applications that reduce administrative burdens and manual workloads come in.
AI-powered privacy technology lessens compliance headaches
At the forefront of a positive privacy user experience (UX) is the ability of an organization to promptly handle subject rights requests (SRRs). SRRs cover a defined set of rights, where individuals have the power to make requests regarding their data and organizations must respond to them in a defined time frame.
According to the survey, many organizations are not capable of delivering swift and precise answers to the SRRs they receive. Two-thirds of respondents indicated it takes them two or more weeks to respond to a single SRR. These workflows are often handled manually, at an average cost of roughly $1,400 per request, and those costs pile up over time.
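Combined with the request volumes reported earlier (51% of companies receive more than 10 requests a week), the per-request cost translates into substantial annual spend. The arithmetic below is a back-of-the-envelope sketch using the survey figures, not a forecast:

```python
# Back-of-the-envelope annual cost of manually handled SRRs,
# at the figures cited above.
COST_PER_SRR_USD = 1400   # average cost of a manually handled request
REQUESTS_PER_WEEK = 10    # low end for the 51% of companies above
WEEKS_PER_YEAR = 52

annual_cost = COST_PER_SRR_USD * REQUESTS_PER_WEEK * WEEKS_PER_YEAR
print(annual_cost)        # 728000 -> roughly $728,000 per year
```

At the 100-requests-a-week volume seen by 20% of companies, the same arithmetic exceeds $7 million a year, which is the spend AI-assisted automation aims to cut.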
“The speed and consistency by which AI-powered tools can help address large volumes of SRRs not only saves an organization excessive spend, but also repairs customer trust,” said Mr. Willemsen. “With the loss of customers serving as privacy leaders’ second highest concern, such tools will ensure that their privacy demands are met.”
Global privacy spending on compliance tooling will rise to $8 billion through 2022
Through 2022, privacy-driven spending on compliance tooling will rise to $8 billion worldwide. Privacy spending is expected to impact connected stakeholders’ purchasing strategies, including those of CIOs, CDOs and CMOs. “Today’s post-GDPR era demands a wide array of technological capabilities, well beyond the standard Excel sheets of the past,” said Mr. Willemsen.
“The privacy-driven technology market is still emerging,” said Mr. Willemsen.
“What is certain is that privacy, as a conscious and deliberate discipline, will play a considerable role in how and why vendors develop their products. As AI turbocharges privacy readiness by assisting organizations in areas like SRR management and data discovery, we’ll start to see more AI capabilities offered by service providers.”
As state houses and Congress rush to consider new consumer privacy legislation in 2020, Americans expect more control over their personal information online, and are concerned with how businesses use the data collected about them, a DataGrail research reveals.
In a OnePoll online survey of 2,000 people aged 18 and above, 4 out of 5 Americans agreed there should be a law to protect their personal data, and 83 percent of people expect to have control over how their data is used at a business.
The demand for more control over their personal data comes after many Americans experienced, first-hand, existing protections not working: 62 percent of people continue to receive emails from a company after unsubscribing.
In addition, more than 82 percent of people have concerns about businesses monitoring or collecting data from their phone’s microphone, laptop webcams, home devices (such as Google Home, Alexa, etc.), or mobile devices (phone, laptop, etc.) with location tracking.
Consumers do not feel safe from privacy infringements
Further, the research shows consumers do not feel safe from privacy infringements wherever they may be: 85% of those polled said they were concerned that businesses could be monetizing their laptops’ location.
In response to Americans’ demands, state regulators are listening. Several states have developed their own regulations, including California, Nevada and Maine, with Washington, New York and several other states following suit.
The California Consumer Privacy Act (CCPA), which went into effect January 1, 2020, is one of the most consumer-forward, comprehensive and prominent data privacy laws. However, only 24 percent of Americans are familiar with it or have even heard of it.
“As people put more of themselves online, they expect to have more control and transparency over their personal information,” said Daniel Barber, CEO of DataGrail.
“The good news is that businesses are responding. Brands are already making big moves to show their dedication to privacy, and it’s paying off. Those that proactively update preferences and consent will end up with a more loyal customer-base.
“However, we still have a lot of education to do. It’s clear people want the regulations. Our research shows that 50% of people would exercise at least one right under the CCPA.”
Control personal data: Data security over affordability
If all Americans were given the rights included in the CCPA:
- 65% of people would like to know and have access to what information businesses are collecting about them.
- 62% of people would like the right to opt-out and tell a business not to share or sell personal information.
- 58% of people would like the right to protections against businesses that do not uphold the value of their privacy.
- 49% of people would like the right to delete their personal data held by the business.
People are also more than willing to take their wallets elsewhere, even if it means breaking their shopping habits, if they discover their private data is not protected or is being sold. The survey found that 77% would not shop at their favorite retailer if they learned it did not keep their personal data safe.
Additionally, consumers said they would be willing to pay more for better privacy protections: 73% of people polled said they would pay more to online services companies (retailers, ecommerce, and social media) to ensure they didn’t sell their data, show them ads, or use their data for marketing or sales purposes.
The 10 top trends that will drive the most significant technological upheavals this year have been identified by Access Partnership.
“Shifts in tech policy will disrupt life for everyone. While some governments try to leverage the benefits of 5G, artificial intelligence, and IoT, others find reasons simply to confront Big Tech ranging from protectionism to climate urgency.
“Techlash trends highlighted in our report lay bare the risks of regulatory overreach: stymied innovation and economic growth for some and an unfair advantage for others,” said Greg Francis, Managing Director at Access Partnership.
Report highlights: Top policy trends for 2020
- AI regulation taking shape in the EU and the U.S.
- EU-based Digital Services Act (DSA) as the newest power grab since the GDPR
- New wave of tech protectionism in Europe
- China as a supply chain liability; other Asian nations filling in
- Spectrum sharing likely to become more mainstream with 5G
- 5G security to take an important position with shift to control functions
- U.S. privacy laws taking bipartisan note from California’s CCPA
- Data sharing regs to heat up, as balance with innovation becomes more critical
- IoTs, SIMs and eSIMs: who’s responsible for setting regulation?
- Rise of ‘green’ technology policy: another balancing act with industry emissions vs. the industry’s potential ability to solve climate change
Francis continued: “In just one year, we’ve seen dramatic changes in the regulatory and policy landscape for technology companies, originating in Europe but deeply affecting U.S. and other major global players.
“The report notes that while divisive impeachment proceedings in America create a blockage in new legislation pipelines, there is surprising bipartisan agreement on tech policy — Republicans are moving to protect companies from growth-killing regulation, and Democrats are seeking to pre-empt state-level measures.
“We expect to see new regulatory models emerging in the U.S. and other nations in reaction to the EU’s push for digital sovereignty.”
Insight into NIS Directive sectoral incident response capabilities
An analysis of the current operational incident response (IR) set-up within the NIS Directive sectors has been released by ENISA.
The NIS Directive and incident response
The EU’s NIS Directive (Directive on security of network and information systems) was the first piece of EU-wide cybersecurity legislation. It aims to achieve a high common level of network and information system security across the EU’s critical infrastructure by bolstering capacities, cooperation and risk management practices across the Member …
2019 experienced massive spate of crypto crimes, $4.4 billion to date
With only seven months left for nations to pass laws and virtual asset service providers (VASPs) to comply with the guidelines, the majority of cryptocurrency exchanges are not equipped to handle basic KYC, let alone comply with the stringent new funds Travel Rule included in the updated Financial Action Task Force (FATF) guidance, according to CipherTrace.
Inadequate KYC
The research results revealed that the lion’s share (more than two-thirds) of exchanges do not …
Microsoft to honor California’s digital privacy law all through the U.S.
In the absence of a federal digital privacy law, Microsoft has decided to comply with the requirements of California’s Consumer Privacy Act (CCPA) throughout the U.S.
The CCPA in short
The CCPA goes into effect on January 1, 2020, and says that California residents (consumers) have the right to know what personal data is being collected about them and access it, to know whether their data is sold or disclosed (and to whom), to demand …