Consumer data privacy is no longer a necessary evil but a competitive differentiator for any company participating in the global economy. The EU’s GDPR represents the world’s most comprehensive regulation for privacy best practices, holding companies to stringent standards for data collection, storage, and use. Many countries have followed suit in recent years by adopting similarly aggressive privacy laws that reflect a greater dedication to data protection. In stark contrast, the … More
Online users are more likely to reveal private information based on how website forms are structured to elicit data, Ben-Gurion University of the Negev (BGU) researchers have determined. “The objective was to demonstrate that we are able to cause smartphone and PC users of online services to disclose more information by measuring the likelihood that they sign up for a service simply by manipulating the way information items (name, address, email) were presented,” says Prof. Lior … More
The post Users can be manipulated to share private information online appeared first on Help Net Security.
Data transparency allows people to know what personal data has been collected, what data an organization wants to collect and how it will be used. Data control provides the end-user with choice and authority over what is collected and even where it is shared. Together the two lead to a competitive edge, as 85% of consumers say they will take their business elsewhere if they do not trust how a company is handling their data. … More
The post How do I select a data control solution for my business? appeared first on Help Net Security.
68% of SMB and mid-market business executives believe working with a managed service provider (MSP) helps them stay ahead of their competition, according to Infrascale. MSP adoption The research also suggests that the top reason that businesses opt to work with MSPs, chosen by 51% of respondents, is to save costs. The second most common reason survey respondents said they use an MSP is for increased security (46%). 96% of the respondents said that it … More
The post Cost savings and security are key drivers of MSP adoption appeared first on Help Net Security.
You can’t swing a virtual bat without hitting someone touting the value of artificial intelligence (AI) and machine learning (ML) technologies to transform big data and human expertise.
A new generation of businesses is promising to accelerate and automate decision making. Most countries, including the United States, view AI technology as critical to retaining or establishing global business leadership. The promise and value of AI and ML rank equal or higher to other intellectual property or corporate secrets within an organization.
Despite this tremendous value, AI/ML assets can’t be protected – especially when in use. This creates intellectual property risks that can give pause to both entrepreneurs and investors. The result is a growing sense of urgency to create better controls to protect the raw data, training algorithms, run-time inference engines and results generated – both from competitors and from malicious actors.
The good news is that recent hardware advances built into the latest advanced microprocessors and incorporated into high-end servers can be utilized to protect AI/ML assets, data and other sensitive applications – even during runtime. Harnessing secure enclaves to close the loop on AI/ML vulnerabilities resolves these security concerns and enables AI/ML to be deployed even more widely, effectively, and safely.
What is a secure enclave?
A secure enclave is a private region of memory whose contents are protected by hardware-grade encryption and hardware isolation techniques. Data in an enclave cannot be read or modified by any entity outside the enclave itself, even if the host is physically compromised. From a business perspective, enclaves enable owners to tightly control how, when, and where data (including software in use) is created, used, and retired.
Secure enclaves leverage new hardware-level security capabilities present in modern CPUs and cloud computing platforms from Intel, AMD, AWS, Microsoft Azure, and others. Additional software can leverage these raw features to create an enclave in which applications, which often require enclaved storage and communications, can operate unmodified.
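Real enclaves (such as Intel SGX or AMD SEV) enforce this boundary in silicon, but the core idea can be sketched in plain Python: plaintext and the sealing key exist only inside a trusted object, and everything the host handles is ciphertext. This is a purely illustrative toy (the SHA-256-based keystream and all names are hypothetical, not a production cipher):

```python
import hashlib

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random keystream by hashing key || nonce || counter blocks.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

class ToyEnclave:
    """Trusted boundary: the sealing key and plaintext live only in here."""

    def __init__(self, key: bytes):
        self._key = key  # never exposed outside the enclave

    def seal(self, nonce: bytes, plaintext: bytes) -> bytes:
        ks = _keystream(self._key, nonce, len(plaintext))
        return bytes(p ^ k for p, k in zip(plaintext, ks))

    def unseal(self, nonce: bytes, ciphertext: bytes) -> bytes:
        # XOR stream cipher: sealing and unsealing are the same operation.
        return self.seal(nonce, ciphertext)

enclave = ToyEnclave(key=b"hardware-rooted-secret")
sealed = enclave.seal(b"nonce-1", b"patient record #42")
# The host only ever handles `sealed`; plaintext exists inside the enclave.
assert sealed != b"patient record #42"
assert enclave.unseal(b"nonce-1", sealed) == b"patient record #42"
```

In a real deployment the key would be rooted in hardware and never be readable by the operating system at all; the sketch only conveys the data-flow guarantee.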
What do secure enclaves protect?
More things than you might think. AI and ML both leverage and create a number of data sets, each of which has different security requirements.
First is the raw data that ML algorithms consume in order to learn. This often includes such highly sensitive data as personal medical or financial records with immense potential for industry. Ideally, this kind of data would be leveraged without the potential of any kind of exposure. In today’s computing environment, that’s practically impossible, because using, moving, or storing data (even when encrypted) implicitly exposes it.
Secure enclaves eliminate this exposure while data is in use, as well as while it’s transported and stored. This makes it possible to use multiple data sets from multiple parties to train the AI engine with zero risk of exposure. Imagine the benefits this could bring to health care or insurance providers and even to government. It enables greater access to data for analysis while virtually guaranteeing data privacy. That means smarter AI.
Proprietary training engines used to process this raw data also need protection. In many cases, the mountain of data used to build experience can’t be moved; the learning engine has to be moved to the distant data mountain. Wherever that software is stored or run on untrusted hardware, it is exposed to theft, potentially indefinitely.
But running and storing machine learning algorithms within the confines of a secure enclave assures that proprietary learning techniques are kept in the hands of their owners, even when those algorithms run in insecure environments. Simple policy and controls can dictate where, how, and when the software can be used down to specific, uniquely identified CPUs.
Similarly, the resulting proprietary inference/expert engine, which makes decisions based on new (often real-time) data, must also be protected. The expertise and experience infused is core to the value of the business that created it. Enclaves can play a key role in not just protecting against software exposure and theft, but in controlling licensing and distribution as well. The same policy controls can potentially limit operations, such as to specific CPUs, clouds, and time periods, which protects the seller’s investment.
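The licensing controls described above amount to a gate the enclave evaluates before releasing the engine to run. A minimal sketch, assuming a hypothetical policy with allowed CPU identifiers and a validity window (all IDs and field names invented for illustration):

```python
from datetime import datetime, timezone

# Hypothetical policy: the model owner licenses the inference engine to run
# only on specific, uniquely identified CPUs and within an agreed time window.
POLICY = {
    "allowed_cpu_ids": {"CPU-7f3a", "CPU-91bc"},
    "valid_from": datetime(2020, 1, 1, tzinfo=timezone.utc),
    "valid_until": datetime(2021, 1, 1, tzinfo=timezone.utc),
}

def may_run(cpu_id: str, now: datetime) -> bool:
    """Gate inference-engine execution on the owner's policy."""
    return (
        cpu_id in POLICY["allowed_cpu_ids"]
        and POLICY["valid_from"] <= now < POLICY["valid_until"]
    )

assert may_run("CPU-7f3a", datetime(2020, 6, 1, tzinfo=timezone.utc))
assert not may_run("CPU-dead", datetime(2020, 6, 1, tzinfo=timezone.utc))  # wrong CPU
assert not may_run("CPU-7f3a", datetime(2021, 6, 1, tzinfo=timezone.utc))  # license expired
```

In an actual enclave product the CPU identity would come from a hardware attestation report rather than a string parameter, but the decision logic is the same shape.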
Interestingly, these same enclave protections secure customer data as well, because they assure that data processed by an enclaved application isn’t accessible by anyone anywhere.
Finally, there are the conclusions that the software generates. Data generated within an enclave is secured and tightly controlled by default. Policy controls must explicitly be implemented to allow exposure, if exposure is ever required.
Greater security means greater opportunity
Secure enclave protection doesn’t just obviate the data and IP risks associated with developing and protecting commercial AI/ML capabilities. It also creates opportunities to build new and more powerful capabilities from broader data sets. Secure enclaves offer a solid path for businesses to significantly reduce the risk associated with these potentially huge new opportunities.
77% of IT decision makers (ITDMs) don’t completely trust the data within their organization for timely and accurate decision making, according to SnapLogic.
With 98% of those surveyed reporting that data is reviewed and analyzed on a weekly basis by teams across the enterprise, this data distrust should be cause for concern, potentially leaving organizations open to poorly considered decisions and misguided actions.
Ineffective or flawed data analytics processes
The study found that the majority of this distrust in data comes down to ineffective or flawed data analytics processes. Despite the fact that data analytics is seen as very important for 82% of organizations, it’s almost become commonplace for data snags to impact results.
In 84% of organizations, data analytics projects are delayed due to the data not being in the right format, while for 82% the data used is of such poor quality that analytics projects need to be reworked.
The distrust caused by these data issues has a significant impact on organizational success, with 76% reporting missed revenue opportunities, 72% stating customer engagement and satisfaction is negatively impacted, and 68% believing they are slower than competitors to react to market changes.
Worryingly, those who have little or no trust in their organization’s data report that 54% of strategic decisions continue to use that same data, risking flawed decisions and perhaps hindering, rather than helping, the business in achieving its goals.
Indeed, 64% of ITDMs believe a lack of trust in data is causing their organization to move forward cautiously, in turn missing opportunities that may otherwise put them ahead.
How to improve data quality for analysis
Rebuilding trust in data and data analytics overwhelmingly comes down to improving the ease and speed of access to quality, decision-ready data within the organization.
When asked what was needed to improve data quality for analysis, respondents noted some key areas: better data cleaning and standardization, modernization of infrastructure, and the integration of data silos. The latter was particularly important, as 53% called out growing data silos and inaccessible data as the biggest drivers behind their lack of trust.
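The cleaning-and-standardization problem the respondents describe is typically a matter of mapping each silo's idiosyncratic format onto one canonical record shape. A minimal illustrative sketch, with two invented silos (the record layouts and field names are hypothetical):

```python
# Two hypothetical silos storing the same customer in different formats:
# a CRM using "DD/MM/YYYY" dates, and a billing system using ISO dates.
crm_records = [{"Name": "Ada Lovelace ", "Signup": "12/10/2020"}]
billing_records = [{"full_name": "ada lovelace", "signup_date": "2020-10-12"}]

def normalize_crm(rec: dict) -> dict:
    day, month, year = rec["Signup"].split("/")
    return {"name": rec["Name"].strip().lower(),
            "signup_date": f"{year}-{month}-{day}"}  # canonical ISO format

def normalize_billing(rec: dict) -> dict:
    return {"name": rec["full_name"].strip().lower(),
            "signup_date": rec["signup_date"]}

unified = ([normalize_crm(r) for r in crm_records]
           + [normalize_billing(r) for r in billing_records])

# Both silos now yield identical, analysis-ready records.
assert unified[0] == unified[1] == {"name": "ada lovelace",
                                    "signup_date": "2020-10-12"}
```

Integration platforms automate exactly this kind of per-source mapping at scale; the value is that downstream analytics sees one trusted format instead of one per silo.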
“It’s well known that effective use of data analytics can provide significant business advantages. But to know that so many organizations are making business decisions using data they do not trust is alarming,” said Craig Stewart, CTO at SnapLogic.
“To get data analytics projects right, it’s critical that organizations review what data they have, the applications and sources it comes from, and how they are bringing it all together. Modern integration tools can help with this, providing an automated way to democratize data throughout the organization so it’s accessible at the right time in the right format to all those who need it.”
Despite the data trust gap, analytics is an area that is seeing increased focus and investment during the COVID-19 pandemic, as 66% of organizations surveyed have either continued or even accelerated their warehousing and analytics projects during this period. This seems to indicate that organizations continue to see tremendous value in data-driven insights and are committed to getting analytics right, even or especially in tough times, in order to emerge stronger on the other side.
Most professionals say their organizations are concerned about cybersecurity risks related to 5G adoption (76.4% of professionals at organizations currently use 5G and 80.7% of professionals at organizations plan to adopt 5G in the year ahead), according to a Deloitte poll.
“U.S. 5G bandwidth availability has expanded and accelerated considerably in recent months, offering competitive advantages technologically, financially and otherwise to early adopters,” said Wendy Frank, Deloitte Risk & Financial Advisory Cyber 5G leader and principal, Deloitte.
“Of course, with all the technological advancement 5G enables, the cyber threat landscape and attack surface areas expand considerably. Working proactively to mitigate cybersecurity risks posed by 5G adoption is the hallmark of a well-designed program.”
Biggest cybersecurity concerns for 5G adoption
The biggest cybersecurity concerns for 5G adoption differed by group. For professionals at organizations currently using 5G, talent posed the biggest cyber challenge to 5G adoption (30.1%), as appropriately skilled security professionals will be needed for implementation, maintenance and operations.
For respondents from organizations planning to adopt 5G in the year ahead, top cyber challenges were data (26.8%) – due to an increase in the volume and diversity of data created from 5G-enabled segments (e.g., IoT, ERP and sensitive data) as well as data mismanagement risks – and third parties (24.3%).
“For organizations leveraging 5G, cyber risk will mount quickly if challenges – like a lack of sophisticated encryption, decentralized operations or security monitoring functioning to the detriment of performance speeds – are not resolved,” Frank said.
“Securing the vastly expanded threat landscape resulting from 5G adoption will demand two equally important efforts: getting the right talent in place or upskilled; and, leveraging artificial intelligence and machine learning to automate areas like security policy configuration, compliance monitoring and threat and vulnerability detection.”
Pandemic impacts 5G adoption speeds
COVID-19 disruption had mixed impacts on respondents’ organizational plans to adopt 5G. For those at organizations currently using 5G, 32.2% increased adoption speed. Conversely, adoption speed decreased as a result of pandemic-driven disruption for 21.8% of those at organizations planning to adopt 5G in the year ahead.
Frank concluded, “The faster movement of data, the creation of new types of data and the ability to develop countless new IoT devices through 5G networks will disrupt most industries. But, just as with pandemic disruption, leading programs are working to keep security at the fore of 5G adoption.”
In The Social Dilemma, the Netflix documentary that has been in the news recently for its radical revelations, former executives at major technology companies like Facebook, Twitter, and Instagram, among others, share how their ex-employers have developed sophisticated algorithms that not only predict users’ actions but also know which content will keep them hooked on their platforms.
That technology companies prey on their users’ digital activities without their consent or awareness is well known. But Associate Professor Jon Truby and Clinical Assistant Professor Rafael Brown at the Centre for Law and Development at Qatar University have pulled the curtain on another element that technology companies are pursuing to the detriment of people’s lives, and investigated what we can do about it.
“We had been working on the digital thought clone paper a year before the Netflix documentary aired. So, we were not surprised to see the story revealed by the documentary, which affirmed what our research had found,” says Prof Brown, one of the co-authors.
Their paper identifies “digital thought clones,” which act as digital twins that constantly collect personal data in real-time, and then predict and analyze the data to manipulate people’s decisions.
Activity from apps, social media accounts, gadgets, GPS tracking, online and offline behavior and activities, and public records are all used to formulate what they call a “digital thought clone”.
Processing personalized data to test strategies in real-time
The paper defines digital thought clone as “a personalized digital twin consisting of a replica of all known data and behavior on a specific living person, recording in real-time their choices, preferences, behavioral trends, and decision making processes.”
“Currently existing or future artificial intelligence (AI) algorithms can then process this personalized data to test strategies in real-time to predict, influence, and manipulate a person’s consumer or online decisions using extremely precise behavioral patterns, and determine which factors are necessary for a different decision to emerge and run all kinds of simulations before testing it in the real world,” says Prof Truby, a co-author of the study.
An example is predicting whether a person will make the effort to compare online prices for a purchase, and if they do not, charging a premium for their chosen purchase. This digital manipulation reduces a person’s ability to make choices freely.
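That price-comparison example can be made concrete with a toy sketch. This is not any company's actual system; the profile fields and the 25% premium are invented for illustration:

```python
# Toy illustration of the manipulation described above: a behavioral profile
# predicts whether a shopper compares prices, and predicted non-comparers
# are quietly quoted a higher price for the same item.
def quoted_price(base_price: float, profile: dict) -> float:
    compares_prices = profile.get("price_comparisons_last_30d", 0) > 0
    if compares_prices:
        return base_price          # comparison shoppers see the real price
    return base_price * 1.25       # everyone else pays a hidden 25% premium

assert quoted_price(80.0, {"price_comparisons_last_30d": 4}) == 80.0
assert quoted_price(80.0, {}) == 100.0  # no comparison history -> premium
```

The shopper never sees that two prices exist, which is exactly why the authors argue this reduces a person's ability to make choices freely.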
Outside of consumer marketing, imagine if financial institutions use digital thought clones to make financial decisions, such as whether a person would repay a loan.
What if insurance companies judged medical insurance applications by predicting the likelihood of future illnesses based on diet, gym membership, the distance applicants walk in a day (based on their phone’s location history), their social circle, as generated by their phone contacts and social media groups, and other variables?
The authors suggest that the current views on privacy, where information is treated either as a public or private matter or viewed in contextual relationships of who the information concerns and impacts, are outmoded.
A human-centered framework is needed
A human-centered framework is needed, where a person can decide from the very beginning of their relationship with digital services whether their data should be protected forever or until they freely waive it. This rests on two principles: the ownership principle, that data belongs to the person and that certain data is inherently protected; and the control principle, which requires that individuals be allowed to make changes to the type of data collected and whether it should be stored. In this framework, people are asked beforehand if data can be shared with an unauthorized entity.
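The two principles can be sketched as a simple data model: the record belongs to its owner, collection categories are changeable at any time, and nothing is shared without a prior, explicit opt-in. A hypothetical illustration of the framework (all class and field names invented):

```python
from dataclasses import dataclass, field

@dataclass
class PersonalDataRecord:
    """Sketch of the ownership and control principles: the data belongs
    to the person, who decides what is collected and who may receive it."""
    owner: str
    categories_collected: set = field(default_factory=set)
    authorized_entities: set = field(default_factory=set)

    # Control principle: the owner can change what is collected, at any time.
    def grant_collection(self, category: str) -> None:
        self.categories_collected.add(category)

    def revoke_collection(self, category: str) -> None:
        self.categories_collected.discard(category)

    # Sharing requires asking the owner beforehand, per entity.
    def authorize_sharing(self, entity: str) -> None:
        self.authorized_entities.add(entity)

    def may_share_with(self, entity: str) -> bool:
        return entity in self.authorized_entities  # protected by default

record = PersonalDataRecord(owner="alice")
record.grant_collection("email")
assert record.may_share_with("ad-network") is False  # no prior consent
record.authorize_sharing("ad-network")
assert record.may_share_with("ad-network") is True
```

The key design choice mirrors the authors' argument: denial is the default state, and exposure requires an explicit, revocable action by the data's owner.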
But the authors also raise critical moral and legal questions over the status of these digital thought clones. “Does privacy for humans mean their digital clones are protected as well? Are users giving informed consent to companies if their terms and conditions are couched in misleading language?” asks Prof Truby.
A legal distinction must be made between the digital clone and the biological source. Whether the digital clone can be said to have attained consciousness will be relevant to the inquiry but far more important would be to determine whether the digital clone’s consciousness is the same as that of the biological source.
The world is at a crossroads: should it continue to do nothing and allow for total manipulation by the technology industry or take control through much-needed legislation to ensure that people are in charge of their digital data? It’s not quite a social dilemma.
Only 37% of organizations definitely have the skills and technology to keep pace with digital projects during the COVID-19 pandemic, a MuleSoft survey reveals.
Access to data is paramount
82% of line of business (LoB) employees believe they need quick and easy access to data, IT systems, and applications to do their jobs effectively and remain productive.
Access to data is critical as 59% of LoB employees are involved in identifying, suggesting, or creating new ways to improve the delivery of digital services externally, such as building an online self-service portal or a customer-facing mobile application. Yet only 29% think their organization is very effective in connecting and using data from multiple sources to drive business value.
“This research shows data is one of the most critical assets that businesses need to move fast and thrive into the future. Organizations need to empower every employee to unlock and integrate data — no matter where it resides — to deliver critical, time-sensitive projects and innovation at scale, while making products and services more connected than ever.”
Data silos increasingly slow down digital initiatives
According to McKinsey, businesses that once mapped digital strategy in one- to three-year phases must now scale their initiatives in a matter of days or weeks.
This report also sheds light on the pandemic leading to an increase in digital initiatives by an average of 11-23%, highlighting what is hampering the pace of business and the ability to meet customer expectations:
- Data silos: 33% of LoB employees say the COVID-19 pandemic has revealed a lack of connectivity between existing IT systems, applications, and data as an inefficiency when it comes to digital delivery.
- A lack of digital skills: 29% of LoB employees say a lack of digital skills across the business is also an inefficiency when delivering digital projects.
- Already stretched IT teams can’t deliver projects quickly enough: 51% of LoB employees are currently frustrated by the speed at which their IT team can deliver digital projects.
Integration challenges directly impact revenue and customer experiences
In light of increasing operational inefficiencies, it is not surprising that 54% of LoB respondents say they are frustrated by the challenge of connecting different IT systems, applications, and data at their organization. Many view this weakness as a threat to their business and the ability to provide connected customer experiences.
- Siloed systems and data slow down business growth: LoB employees are well aware of the repercussions of failing to connect systems, applications, and data. 59% agree that failure in this area will hinder business growth and revenue.
- Behind disconnected experiences are disconnected systems, applications, and data: 59% of LoB employees agree that an inability to connect systems, applications, and data will negatively impact customer experience — a fundamental prerequisite for business success today.
- Automation initiatives require integration: 60% of respondents admit that failure to connect systems, applications, and data will also hinder automation initiatives. This comes at a time when a growing number of organizations are looking to automate business processes via capabilities such as robotic process automation (RPA).
Organizations need to move faster
As demands for digital initiatives grow, organizations across industries need to move faster than ever before. Business users are frustrated by data silos, slowing their ability to meet customer demand and innovate in today’s all-digital, work-from-anywhere world.
The report highlights the need to democratize these capabilities by giving business users the tools they need to easily and quickly unlock data, connect applications, and automate processes.
- Organizations need to scale innovation beyond the four walls of IT: 58% of LoB employees think IT leaders are spending more of their time “keeping the lights on” rather than supporting innovation. Furthermore, 44% go as far as to say they think their organization’s IT department is a blocker on innovation. By using a self-serve model that empowers everyone to unlock data, IT can enable innovation everywhere — in a way that’s governed but not gated by IT. IT can then be freed up from tactical integrations and maintenance to focus more on innovating and delivering high impact projects.
- Partnership with IT will be key to driving innovation: 68% of respondents think that IT and LoB employees should come together to jointly drive innovation in their organization.
- LoB employees need easy access to data to go faster: 80% of respondents think it would be beneficial to their organization if data and IT capabilities were discoverable and pre-packaged building blocks, which allow LoB employees to start creating digital solutions and deliver digital projects for themselves.
Scality announced its data storage predictions for 2021, focusing on the rapid growth rate of cloud-native apps and containerization. According to IDC, by 2023, over 500 million digital apps and services will be developed and deployed using cloud-native approaches. That is the same number of apps developed in total over the last 40 years. 2021 apps and containerization trends “The accelerated growth of next-generation cloud-native digital apps and services will define new competitive requirements in … More
The post Growth of cloud-native apps and containerization to define 2021 appeared first on Help Net Security.
The global network slicing market size is projected to grow from $161 million in 2020 to $1,284 million by 2025, at a Compound Annual Growth Rate (CAGR) of 51.5% during the forecast period, according to MarketsandMarkets.
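The reported growth rate is easy to verify from the two endpoints: a compound annual growth rate over 5 years is the fifth root of the end-to-start ratio, minus one.

```python
# Verify the reported CAGR: $161M (2020) growing to $1,284M (2025).
start, end, years = 161.0, 1284.0, 5
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # -> 51.5%, matching the MarketsandMarkets figure
```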
The network slicing market is gaining traction due to the evolution of cellular network technology, which has offered higher data speeds and lower latency. The rapid rise in the volume of data being carried by cellular networks has been driven largely by consumer demand for video and the shift of business toward the use of cloud services.
Services segment to grow at the highest CAGR during the forecast period
Services play a vital role in the deployment and integration of next-generation networking solutions in an enterprise’s business environment. Services are considered an important component of the network slicing market, as they focus chiefly on improving business processes and optimizing the enterprise’s network.
Services are considered the backbone of network slicing, as they are instrumental in fulfilling clients’ requirements, such as network testing, planning and optimization, support and maintenance, and consulting.
Automotive segment to grow at the highest CAGR during the forecast period
The automotive industry also makes use of 5G technology to boost productivity, enhance efficiency, drive brand loyalty, and offer autonomous and cooperative vehicles with significantly improved security standards and multimodal transportation solutions.
The introduction of next-generation technologies such as 5G has given rise to numerous applications, such as augmented reality (AR), virtual reality (VR), and the tactile internet.
North America region to record the highest market share in 2020
North America is one of the most technologically advanced regions in the world. Consumers based in this region have readily adopted 4G-enabled smartphones, making it one of the most established and advanced mobile regions in the world.
According to the Ericsson Mobility Report published in 2017, North America records the largest use of smartphones, and traffic per smartphone is expected to increase from 7.1GB per month at the end of 2017 to 48GB by the end of 2023.
The increasing number of internet subscribers, expanding mobile data traffic, and growing government emphasis on enhancing telecommunications infrastructure to meet the users’ demand for seamless connectivity would drive the market to a great extent in the region.
Further, the region is expected to be the early adopter of 5G services in areas such as AR/VR, autonomous driving, and AI owing to the high customer digital engagement.
Generali Global Assistance released the findings of its survey which examines consumer sentiment on retail data breaches and the identity theft risks holiday shopping poses.
Grown comfort with online shopping
Among those who avoided it entirely, comfort with online shopping has grown substantially this year.
- 30% of Americans surveyed avoided online shopping due to the potential security risks prior to the COVID-19 pandemic
- 74% of those who avoided online shopping due to security risks say they are using their credit card online more often as a direct result of the pandemic-induced retail lockdowns
- 73% of those who avoided online shopping in the past agree they have become more comfortable shopping online since the start of the pandemic
Many plan to shop in brick-and-mortar stores
Online shopping dominates this year, but nearly half plan to shop in brick-and-mortar stores.
- 86% of consumers plan to do their holiday shopping online, up 21 percent from last year, likely due to the pandemic
- 48% indicated they will shop for the holidays in a brick-and-mortar store, down 15 percent from last year
- 70% of holiday shoppers plan to shop at two to five brick-and-mortar and/or online stores
- 18% indicated they will go to more than six stores this holiday season
Growing concern about data breaches
2 in 3 are concerned about data breaches during holiday shopping season; nearly 4 in 5 will think twice before doing business with a breached retailer.
- 66% of Americans surveyed expressed concern about their financial or personal information being compromised due to a data breach while shopping this holiday season
- 78% of customers indicated that they would be concerned about doing business with a retailer if they experienced a breach
- The number of customers who expressed concern over retailers that have been breached is down a point from last year, continuing a potential trend of consumer apathy toward data breaches that GGA identified last year
Identity protection services are preferred
Most shoppers would feel more secure if a retailer offered them identity protection services.
- 64% of Americans indicated they would feel more secure doing business with a retailer if that retailer offered them identity protection services
- This compares to 61% of Americans in 2018 and 55% the year prior who indicated they would feel more secure if a retailer offered them ID theft protection
- This suggests that more consumers understand the need for identity theft protection today
Identity theft is viewed as a threat this year by over 2 in 5 Americans
- 61% of shoppers indicate that data breaches of online merchants or credit card providers are still the biggest threat to their identity, up 14 percent from last year
- 43% of Americans indicated that identity theft is their greatest threat this year
- 28% perceive having their identity stolen due to a COVID-19 related scam as the greatest threat to their personal info, whether it be a COVID-related employment scam (20%) or a health scam (17%)
- Break-in or pickpocket (15%), a tax scam (13%), and to a lesser extent, a puppy scam (7%) were the other types of identity theft considered a danger by respondents
Big box stores are trusted most with personal data
- 40% of Americans trust big box stores the most with their personal data this holiday shopping season
- 36% consider e-retailers the most trustworthy
- Only 22% of the survey respondents trust their local small businesses with their personal data.
Paige Schaffer, CEO, Global Identity and Cyber Protection Services at Generali Global Assistance, commented on the findings, “Consumers’ shopping behavior has evolved rapidly as a result of the pandemic, forcing even the 30 percent of Americans who used to avoid online shopping entirely to take their business online.
“While consumers’ growing apathy around breaches continued, our survey also showed that more of them understand the need for identity protection. Making sure the average consumer’s personal information is safe and offering them support in the wake of an incident will improve customer loyalty among all retailers, from the big box superstore to the local mom-and-pop shop.”
COVID-19 has reorganized the risk landscape for chief audit executives (CAEs), as CAEs have listed IT governance as the top risk for 2021, according to Gartner. Analysts said the pandemic is giving rise to new sets of risks while exacerbating long-standing vulnerabilities.
Gartner conducted interviews and surveys from across its global network of client organizations to identify the top 12 risks, or “Audit Plan Hot Spots,” facing boards, audit committees and executives entering 2021.
Existing risk trends
The report revealed that IT governance is displacing data governance, which was the top entry for 2020 and is in second position for 2021.
“While the pandemic has created new challenges for audit executives to grapple with, what’s most notable is how the current environment has accelerated existing risk trends,” said Leslee McKnight, research director for the Gartner Audit practice.
“The volatility and interconnectedness of the two most important risks, IT and data governance, also shines a light on the importance for firms to rethink their risk governance. Audit leaders should apply dynamic risk governance in order to rethink their approach to designing risk management roles and responsibilities.”
While the top three hot spots audit executives must focus on for 2021 all made appearances in last year’s list, they have all been altered by the nature of working in the pandemic.
Abrupt work-from-home mandates have accelerated digital roadmaps, causing many organizations to vault years forward in the space of a few weeks. This move has spurred the rapid adoption of new technologies on both the employee and customer side, presenting new challenges in maintaining productivity, meeting shifting consumer preferences and guarding against security vulnerabilities.
CAEs need to assess how new technology adoption may be hobbling their IT departments’ plans, with IT support incident requests doubling in early 2020 to support a huge increase in work-from-home employees.
Additionally, managing access rights for many more remote workers presents new risks such as “privileged user abuse,” which is expected to climb over the next 12 to 24 months.
The pandemic means that organizations are expected to collect more sensitive personal information from employees and customers than ever before. Yet, data governance practices are regressing, with fewer resources dedicated to data privacy than in previous years.
Organizations face increasingly complex environments in which their data is housed. Growth in software-as-a-service (SaaS) and delays to upgrading legacy systems have created work environments where data is distributed across disparate platforms, software and servers.
Such complexities continue to test audit executives, with only 45% expressing high confidence in their ability to manage data governance risk.
Cyber vulnerabilities are especially acute this year, due to the rapid organizational changes needed to protect employees and serve customers in the midst of a pandemic.
Despite increased cybersecurity spending, only 24% of organizations routinely follow cybersecurity best practices, and cyberattacks are expected to cost organizations $6 trillion annually by 2021. Drivers of this risk include lapses in security controls and increased employee vulnerability to social engineering.
More than half of employees are currently using personal devices to do work remotely, while 61% have indicated their employer has not provided tools to secure these devices. Additional security lapses include a lack of attention to employees’ home network security and the status of antivirus software.
“The pandemic is forcing many audit and risk executives to address their organization’s deficiencies in the most critical areas,” said Ms. McKnight.
“Inadequate data governance and IT security practices will have even steeper consequences in the current environment than pre-pandemic, particularly when considering the types of data many organizations feel compelled to collect as a result of new health and safety measures.”
Cohesity announced the results of a survey of 500 IT decision makers in the United States that highlights critical IT and data management challenges midsize and enterprise organizations are facing as companies prepare for 2021.
The survey included 250 respondents from midsize companies ($100M-$1B in revenue) and 250 from enterprise organizations ($1B+ in revenue).
Some of these challenges came to light as companies answered questions about their appetite for Data Management as a Service (DMaaS). With a DMaaS solution, organizations do not have to manage data infrastructure – it is managed for them.
DMaaS provides organizations with easy access to backup and recovery, disaster recovery, archiving, file and object services, dev/test provisioning, data governance, and security – all through one vendor in a Software as a Service (SaaS) model.
IT budgets are being slashed: Seventy percent of respondents state their organization is being forced to cut the IT budget in the next 12 months. Around a third of respondents have to cut the IT budget by 10-25 percent, a tenth have to cut it by a whopping 25-50 percent.
Verticals facing the largest cuts on average: technology (20 percent), education (18 percent), government/public sector (16 percent).
Many midsize companies are struggling to compete against larger enterprises because of inefficient data management: 27 percent of respondents from midsize companies say they have lost 25-50 percent of deals to larger enterprises because larger enterprises have more resources to manage and derive value from their data.
Even worse, 18 percent of respondents from midsize companies claim to have lost 50-75 percent of deals to larger enterprises for the same reason.
Organizations are spending inordinate amounts of time managing data infrastructure: Respondents say IT teams, on average, spend 40 percent of their time each week installing, maintaining, and managing data infrastructure. Twenty-two percent claim their IT team spends 50-75 percent of time each week on these tasks.
Technology is needed that makes it easier to derive value from data while also reducing stress levels and employee turnover: When respondents were asked about the benefits of deploying a DMaaS solution versus spending so much time managing data infrastructure, 61 percent cited an ability to focus more on deriving value from data which could help their organization’s bottom line, 52 percent cited reduced stress levels for IT teams, and 47 percent are hopeful this type of solution could also reduce employee turnover within the IT team.
“Research shows IT leaders are anxious for comprehensive solutions that will enable them to do more with data in ways that will help boost revenues and provide a competitive advantage at a time when they are also facing budget cuts, burnout, and turnover.”
The growing appetite for technology that simplifies IT and data management
As businesses look to simplify IT operations, be more cost efficient, and do more with data, respondents are very optimistic about the benefits of DMaaS, which include:
- Cost predictability: Eighty-nine percent of respondents say their organization is likely to consider deploying a DMaaS solution, at least in part, due to budget cuts.
- Helping midsize companies win more business: Ninety-one percent of respondents from midsize companies believe deploying a DMaaS solution will enable their organizations to compete more effectively against larger enterprises that have more resources to manage data.
- Saving IT teams valuable time: Respondents who noted that their IT teams spend time each week managing IT infrastructure believe those teams will save, on average, 39 percent of their time each week if their company had a full DMaaS solution in place.
- Doing more with data: Ninety-seven percent of respondents believe DMaaS unlocks opportunities to derive more value from data using cloud-based services and applications. Sixty-four percent want to take advantage of cloud-based capabilities that enable them to access and improve their security posture, including improving anti-ransomware capabilities.
- Alleviating stress and reducing turnover: Ninety-three percent of respondents believe that deploying a DMaaS solution would enable them to focus less on infrastructure provisioning and data management tasks. Fifty-two percent of these respondents say deploying a DMaaS solution could reduce their team’s stress levels by not having to spend so much time on infrastructure provisioning and management. Forty-seven percent believe deploying a DMaaS solution could reduce employee turnover within the IT team.
Choice is the name of the game for IT in 2021
“The data also pinpoints another important IT trend in 2021: choice is critical,” said Waxman. “IT leaders want to manage data as they see fit.” With respect to choice, respondents stated:
- It’s not one or the other, it’s both: Sixty-nine percent of respondents stated their organization prefers to partner with vendors that offer choice in how their company’s data is managed and will not consider vendors that offer only a DMaaS model — they also want the option to manage some data directly.
- Avoiding one-trick ponies is key: Ninety-four percent of survey respondents stated that it’s important to work with a DMaaS vendor that does more than Backup as a Service (BaaS). If the vendor only offers BaaS, 70 percent are concerned they will have to work with more vendors to manage their data, and that doing so is likely to increase their workload (77 percent), fail to help reduce costs (65 percent), and lead to mass data fragmentation, where data is siloed and hard to manage and gain insights from (74 percent).
To stay connected with patients, healthcare providers are turning to telehealth services. In fact, 34.5 million telehealth services were delivered from March through June, according to the Centers for Medicare and Medicaid Services. The shift to remote healthcare has also impacted the roll out of new regulations that would give patients secure and free access to their health data.
The shift to online services shines a light on a major cybersecurity issue within all industries (but especially healthcare where people have zero control over their data): consent.
Hand over data control
Data transparency allows people to know what personal data has been collected, what data an organization wants to collect and how it will be used. Data control provides the end-user with choice and authority over what is collected and even where it is shared. Together the two lead to a competitive edge, as 85% of consumers say they will take their business elsewhere if they do not trust how a company is handling their data.
Regulations such as the GDPR and the CCPA have been enacted to hold companies accountable unlike ever before – providing greater protection, transparency and control to consumers over their personal data.
The U.S. Department of Health and Human Services’ (HHS) regulation, which is set to go into effect in early 2021, would provide interoperability, allowing patients to access, share and manage their healthcare data as they do their financial data. Healthcare organizations must provide people with control over their data and where it goes, which in turn strengthens trust.
How to earn patients’ trust
Organizations must improve their ability to earn patients’ confidence and trust by putting comprehensive identity and access management (IAM) systems in place. Such systems need to offer the ability to manage privacy settings, account for data download and deletion, and enable data sharing with not just third-party apps but also other people, such as additional care providers and family members.
The right digital identity solution should orchestrate user identity journeys, such as registration and authentication, in a convenient way that unifies security configuration with user experience choices.
It should also enable the healthcare organization to protect patients’ personal data while offering their end-users a unified means of control of their data consents and permissions. Below are the four key steps companies should take to earn trust when users hand over data control:
- Identify where digital transformation opportunities and user trust risks intersect. Since users are becoming more skeptical, organizations must analyze “trust gaps” while they are discovering clever new ways to leverage personal data.
- Consider personal data as a joint asset. It’s easy for a company to say consumers own their own personal data, but business leaders have incentives to leverage that data for the value it brings to their business. This changes the equation. All the stakeholders within an organization need to come together and view data as a joint asset in which all parties, including end-users, have a stake.
- Lean into consent. Given the realities of regulations, a business often has a choice to offer consent to end-users rather than just collecting and using data. Seek to offer the option – it provides benefits when building trust with skeptical consumers, as well as when proving your right to use that data.
- Take advantage of consumer identity and access management (CIAM) for building trust. Identity management platforms automate and provide visibility into the entire customer journey across many different applications and channels. They also allow end-users to retain the controls to manage their own profiles, passwords, privacy settings and personal data.
Providing data transparency and data control to the end-user enhances the relationship between business and consumer. Organizations can achieve this trust with consumers in a comprehensive fashion by applying consumer identity and access management that scales across all of their applications. To see these benefits before regulations like the HHS regulations go into effect, organizations need to act now.
Businesses around the globe are facing challenges as they try to protect data stored in complex hybrid multi-cloud environments, from the growing threat of ransomware, according to a Veritas Technologies survey.
Only 36% of respondents said their security has kept pace with their IT complexity, underscoring the need for greater use of data protection solutions that can protect against ransomware across the entirety of increasingly heterogenous environments.
Need to pay ransoms
Typically, if businesses fall foul of ransomware and are not able to restore their data from a backup copy of their files, they may look to pay the hackers responsible for the attack to return their information.
The research showed companies with greater complexity in their multi-cloud infrastructure were more likely to make these payments. The mean number of clouds deployed by those organizations who paid a ransom in full was 14.06. This dropped to 12.61 for those who paid only part of the ransom and went as low as 7.22 for businesses who didn’t pay at all.
In fact, only 20% of businesses with fewer than five clouds paid a ransom in full, compared with 44% of those with more than 20. Meanwhile, 57% of the under-fives paid nothing to their hackers, versus just 17% of the over-20s.
Slow recovery times
Complexity in cloud architectures was also shown to have a significant impact on a business’s ability to recover following a ransomware attack. While 43% of those businesses with fewer than five cloud providers in their infrastructure saw their business operations disrupted by less than one day, only 18% of those with more than 20 were as fast to return to normal.
Moreover, 39% of the over-20s took 5-10 days to get back on track, with just 16% of the under-fives having to wait so long.
Inability to restore data
Furthermore, the research found that greater complexity in an organization’s cloud infrastructure also made it slightly less likely that they would ever be able to restore their data in the event of a ransomware attack.
While 44% of businesses with fewer than five cloud providers were able to restore 90% or more of their data, just 40% of enterprises building their infrastructure on more than 20 cloud services were able to say the same.
John Abel, SVP and CIO at Veritas said: “The benefits of hybrid multi-cloud are increasingly being recognised in businesses around the world. In order to drive the best experience, at the best price, organizations are choosing best-of-breed cloud solutions in their production environments, and the average company today is now using nearly 12 different cloud providers to drive their digital transformation.
“However, our research shows many businesses’ data protection strategies aren’t keeping pace with the levels of complexity they’re introducing and, as a result, they’re feeling the impact of ransomware more acutely.
“In order to insulate themselves from the financial and reputational damage of ransomware, organizations need to look to data protection solutions that can span their increasingly heterogenous infrastructures, no matter how complex they may be.”
Businesses recognize the challenge
The research revealed that many businesses are aware of the challenge they face, with just 36% of respondents believing their security had kept pace with the complexity in their infrastructure.
The top concern as a result of this complexity, as stated by businesses, was the increased risk of external attack, cited by 37% of all participants in the research.
Abel continued: “We’ve heard from our customers that, as part of their response to COVID, they rapidly accelerated their journey to the cloud. Many organizations needed to empower homeworking across a wider portfolio of applications than ever before and, with limited access to their on-premise IT infrastructure, turned to cloud deployments to meet their needs.
“We’re seeing a lag between the high-velocity expansion of the threat surface that comes with increased multi-cloud adoption, and the deployment of data protection solutions needed to secure them. Our research shows some businesses are investing to close that resiliency gap – but unless this is done at greater speed, companies will remain vulnerable.”
Need for investment
46% of businesses shared they had increased their budgets for security since the advent of the COVID-19 pandemic. There was a correlation between this elevated level of investment and the ability to restore data in the wake of an attack: 47% of those spending more since the Coronavirus outbreak were able to restore 90% or more of their data, compared with just 36% of those spending less.
The results suggest there is more to be done though, with the average business being able to restore only 80% of its data.
Back to basics
While the research indicates organizations need to more comprehensively protect data in their complex cloud infrastructures, the survey also highlighted the need to get the basics of data protection right too.
Only 55% of respondents could claim they have offline backups in place, even though those who do are more likely to be able to restore more than 90% of their data. Those with multiple copies of data were also better able to restore the lion’s share of their data.
Forty-nine percent of those with three or more copies of their files were able to restore 90% or more of their information, compared with just 37% of those with only two.
The three most common data protection tools to have been deployed amongst respondents who had avoided paying ransoms were: anti-virus, backup and security monitoring, in that order.
The safest countries to be in to avoid ransomware attacks, the research revealed, were Poland and Hungary. Just 24% of businesses in Poland had been on the receiving end of a ransomware attack, and the average company in Hungary had only experienced 0.52 attacks ever.
The highest incidence of attack was in India, where 77% of businesses had succumbed to ransomware, and the average organization had been hit by 5.27 attacks.
It was an accomplishment for the ages: within just a couple of days, IT departments hurriedly provided millions of newly homebound employees online access to the data and apps they needed to remain productive.
Some employees were handed laptops as they left the building, while others made do with their own machines. Most connected to their corporate services via VPNs. Other companies harnessed the cloud and software and infrastructure services (SaaS, IaaS).
Bravo, IT! Not only did it all work, but businesses and employees alike saw the very real benefits of remote life, and that egg is not going back into the shell. Many employees won’t return to those offices and will continue working from home.
But while immediate access challenges were answered, this was not a long-term solution.
Let’s face it, because of the pandemic a lot of companies were caught off guard with insufficient plans for data protection and disaster recovery (DR). That isn’t easy in the best of times, never mind during a pandemic. Even those with effective strategies now must revisit and update them. Employees have insufficient home security. VPNs are difficult to manage and provision, perform poorly and are hard to scale. And, IT’s domain is now stretched across the corporate data center, cloud (often more than one), user endpoints and multiple SaaS providers.
There’s a lot to do. A plan that fully covers DR, data protection and availability is a must.
There are several strategies for protecting endpoints. First off, if employees are using company-issued machines, there are many good mobile device management products on the market. Sure, setting up clients for a fleet of these machines will be a laborious task, but you’ll have peace of mind knowing data won’t go unprotected.
Another strategy is to create group policies that map the Desktop and My Documents folders directly to the cloud file storage of your choice, no matter if it’s Google Drive, OneDrive, Dropbox or some other solution. That can simplify file data protection, but its success hinges on the employee storing documents in the mapped locations. If they keep files elsewhere on the machine, for example, those files are not going to be protected.
And right there is the rub with protecting employee machines – employees are going to store data on these devices. Often, insecure home Internet connections make these devices and data vulnerable. Further, if you add backup clients and/or software to employee-owned machines, you could encounter some privacy resistance.
Remote desktops can provide an elegant solution. We’ve heard “this is the year of virtual desktop infrastructure (VDI)” for over a decade. It’s something of a running joke in IT circles, but you know what? The current scenario could very well make this the year of remote desktops after all.
VDI performance in more sophisticated remote desktop solutions has greatly improved. With a robust platform configured properly, end-users can’t store data on their local machines – it’ll be safely kept behind a firewall with on-premises backup systems to protect and secure it.
Further, IT can set up virtual desktops to prevent cut and paste to the device. And because many solutions don’t require a client, it doesn’t matter what machine an employee uses – just make sure proper credentials are needed for access and include multi-factor authentication.
Pain in the SaaS
As if IT doesn’t have enough to worry about, there’s a potential SaaS issue that can cause a lot of pain. Most providers operate under the shared responsibility model. They secure infrastructure, ensure apps are available and data is safe in case of a large-scale disaster. But long-term, responsibility for granular protection of data rests on the shoulders of the customer.
Unfortunately, many organizations are unprepared. A January 2020 survey from OwnBackup of 2,000 Salesforce users found that 52% are not backing up their Salesforce data.
What happens if someone mistakenly deletes a Microsoft Office 365 document vital for a quarterly sales report and it’s not noticed for a while? Microsoft automatically empties recycle bin data after 30 days, so unless there’s a backup in place, it’s gone for good.
Backup vendors provide products to protect data in most of the more common SaaS services. But if there’s no data protection solution for one your organization is using, make data protection part of the service provider’s contract and insist they regularly send along copies of your data.
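Verifying that those provider-supplied copies actually keep arriving is easy to automate. The sketch below is a minimal, hypothetical example: it assumes backup copies land in a local directory as files (the directory layout and the seven-day threshold are illustrative, not from any specific product) and flags the backups as stale when the newest copy is too old.

```python
import time
from pathlib import Path

# Illustrative policy: alert if the newest backup copy is older than this.
MAX_BACKUP_AGE_DAYS = 7

def newest_backup_age_days(backup_dir: str) -> float:
    """Return the age in days of the most recent file in backup_dir."""
    files = [p for p in Path(backup_dir).iterdir() if p.is_file()]
    if not files:
        raise FileNotFoundError(f"no backup copies found in {backup_dir}")
    newest_mtime = max(p.stat().st_mtime for p in files)
    return (time.time() - newest_mtime) / 86400  # seconds per day

def backups_are_current(backup_dir: str) -> bool:
    """True if the latest backup copy is within the policy threshold."""
    return newest_backup_age_days(backup_dir) <= MAX_BACKUP_AGE_DAYS
```

A check like this could run on a schedule and alert the team when a provider stops delivering copies, which is exactly the failure that otherwise goes unnoticed until a restore is needed.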
When it comes to a significant disaster, highly distributed environments can make recovery difficult. The cloud seems like a clear choice for storing DR and backup data, but while the commodity cloud providers make it easy and cheap to upload data, costs for retrieval are much higher. Also, remember that cloud recovery is different from on-prem, requiring expertise in areas like virtual machines and user access. And, if IT is handling cloud directly and has issues, keep in mind that it could be very difficult getting support.
During a disaster, you want to recover fast; you don’t want to be creating a backup and DR strategy as the leadership grits their teeth due to downtime. So, set your data protection strategy now, be sure each app is included, follow all dependencies and test over and over again. Employees and data may be in varied locations, so be sure you’re completely covered so your company can get back in the game faster.
While IT pulled off an amazing feat handling a rapid remote migration, to ensure your company’s future, you need to be certain it can protect data, even outside of the corporate firewall. With a backup and DR strategy for dispersed data in place, you’ll continue to be in a position to make history, instead of fading away.
ESET researchers have discovered ModPipe, a modular backdoor that gives its operators access to sensitive information stored in devices running ORACLE MICROS Restaurant Enterprise Series (RES) 3700 POS (point-of-sale) – a management software suite used by hundreds of thousands of bars, restaurants, hotels and other hospitality establishments worldwide.
The majority of the identified targets were from the United States.
Containing a custom algorithm
What makes the backdoor distinctive are its downloadable modules and their capabilities, as it contains a custom algorithm designed to gather RES 3700 POS database passwords by decrypting them from Windows registry values.
This shows that the backdoor’s authors have deep knowledge of the targeted software and opted for this sophisticated method instead of collecting the data via a simpler yet “louder” approach, such as keylogging.
Exfiltrated credentials allow ModPipe’s operators access to database contents, including various definitions and configuration, status tables and information about POS transactions.
“However, based on the documentation of RES 3700 POS, the attackers should not be able to access some of the most sensitive information – such as credit card numbers and expiration dates – which is protected by encryption. The only customer data stored in the clear and thus available to the attackers should be cardholder names,” cautions ESET researcher Martin Smolár, who discovered ModPipe.
“Probably the most intriguing parts of ModPipe are its downloadable modules. We’ve been aware of their existence since the end of 2019, when we first found and analyzed its basic components,” explains Smolár.
- GetMicInfo targets data related to the MICROS POS, including passwords tied to two database usernames predefined by the manufacturer. This module can intercept and decrypt these database passwords, using a specifically designed algorithm.
- ModScan 2.20 collects additional information about the installed MICROS POS environment on the machines by scanning selected IP addresses.
- ProcList collects information about currently running processes on the machine.
“ModPipe’s architecture, modules and their capabilities also indicate that its writers have extensive knowledge of the targeted RES 3700 POS software. The proficiency of the operators could stem from multiple scenarios, including stealing and reverse engineering the proprietary software product, misusing its leaked parts or buying code from an underground market,” adds Smolár.
What can you do?
To keep the operators behind ModPipe at bay, potential victims in the hospitality sector as well as any other businesses using the RES 3700 POS are advised to:
- Use the latest version of the software.
- Use it on devices that run updated operating systems and software.
- Use reliable multilayered security software that can detect ModPipe and similar threats.
It is a mathematical certainty that data is more protected by communication products that provide end-to-end encryption (E2EE).
Yet, many CISOs are required to prioritize regulatory requirements before data protection when considering the corporate use of E2EE communications. Most Fortune 1000 compliance and security teams have the ability to access employee accounts on their enterprise communications platform to monitor activity and investigate bad actors. This access is often required in highly regulated industries and E2EE is perceived as blocking that critical corporate access.
Unfortunately for enterprise security and compliance teams in most companies, unsanctioned communications platforms like WhatsApp are being used to conduct sensitive business in contravention of corporate policies. Just recently Morgan Stanley executives were removed from the firm for using WhatsApp.
Employees have come to understand that their IT, compliance and security teams are not the only ones who have special access to their communications. They know that Slack, Microsoft, Google, etc., can also access their data and communications. As such, many have turned to consumer E2EE products because they are not comfortable conducting sensitive business on systems where the service provider is both listening and responsible for security.
Why consumer apps running rampant is bad for business
Taking sensitive business to consumer products is risky. These consumer-grade platforms are not purpose-built for secure and compliant communications. They prioritize engagement and entertainment resulting in an ongoing pattern of security flaws, like person-in-the-middle attacks and remote code execution vulnerabilities. WhatsApp users have borne the brunt of these security vulnerabilities for years.
CISOs have been left to choose between turning a blind eye to employees using consumer E2EE products like WhatsApp or, worse yet, relenting and creating policy exceptions that they hope will placate regulators. Yet this approach is an endorsement of long-term use of non-compliant and insecure consumer products.
End-to-end encryption is more flexible than you think
Corporate security teams have operated under the misconception that E2EE is rigid: that not having a backdoor implies there is only a one-size-fits-all implementation of the world’s most reliable cryptography. In reality, E2EE is flexible and can be deployed in concert with corporate policies and industry regulations.
CISOs don’t need to choose between compliance and strong encryption. Organizations, regardless of industry, can use E2EE that adheres to regulations and internal policies and integrates with IT workflows. This means that the corporate decision to use E2EE can be focused on protecting data from adversaries, competitors and service providers, instead of a fear of breaking the rules.
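The "certainty of math" behind E2EE rests on key agreement the server cannot observe. The toy Diffie-Hellman sketch below illustrates only that core idea: the relaying server sees the two public values but cannot derive the shared secret. The parameters are deliberately small and insecure; real E2EE systems use vetted protocols (such as the Signal protocol) and vetted cryptographic libraries, never hand-rolled code like this.

```python
import secrets

# Toy Diffie-Hellman parameters: a Mersenne prime modulus and a small
# generator. NOT secure -- chosen only so the arithmetic is readable.
P = 2**127 - 1
G = 3

def keypair():
    """Generate a (private, public) pair: public = G^private mod P."""
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# The server relays only alice_pub and bob_pub. Each side combines the
# other's public value with its own private value; the results match,
# and the server cannot compute them from the public values alone.
alice_shared = pow(bob_pub, alice_priv, P)
bob_shared = pow(alice_pub, bob_priv, P)
assert alice_shared == bob_shared
```

The shared secret would then seed a symmetric cipher for the message traffic, so even a fully compromised relay server yields nothing readable.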
Choosing an E2EE-enabled communications platform
When it comes to choosing an E2EE-enabled communications platform, security professionals need to assess vendors’ claims, capabilities and motivations. While some mainstream platforms advertise E2EE, they only encrypt the traffic from endpoint to server. This is called Client-to-Server encryption (C2S). This happened most notably with Zoom earlier this year when they sold their product as E2EE.
Most reasonable security professionals agree this was not a malicious attempt to trick end users, rather a genuine lack of cryptographic understanding and sophistication. The company decided that a green lock symbol would make end users feel good – despite a C2S architecture that was prone to person-in-the-middle attacks.
Providers who are not in the business of securing critical user information will almost certainly make claims they do not understand and ship solutions that “don’t suck” rather than serious security technology.
CISOs who embrace E2EE will benefit from the certainty of math. It’s important to ensure that the service provider is capable of, and committed to, providing true E2EE.
There are three important pillars to a strong E2EE solution:
- Both the cryptographic protocols and results from third-party security reviews are public
- Their servers do not store data; and
- The service provider’s business model isn’t reliant upon access to customer data
This is to say that the CISO’s zero trust security policy should be extended to the service provider. If your Unified Communications service provider can access, mine and analyze your data, then they are an attack surface, and we know that such access can be abused. Strong E2EE eliminates the service provider risk with mathematical certainty.
Compliance-ready E2EE is a relatively new phenomenon. But it is more important than ever for CISOs to weigh the risk of giving service providers access to all of their company’s data against the unparalleled benefits of taking control of that data while adhering to corporate compliance requirements.
When it comes to providing no-compromise security for enterprise communications, E2EE is a must-have for organizations, and implementing it no longer means breaking the rules. Further, when organizations deploy enterprise E2EE with forethought, they can pull end users off risky consumer products like WhatsApp, WeChat and Telegram by giving employees the security and privacy they need and deserve.
The race is on to build the world’s first reliable and truly useful quantum computer, and the finish line is closer than you might think – we might even reach it this decade. It’s an exciting prospect, particularly as these super-powerful machines offer huge potential to almost every industry, from drug development to electric-vehicle battery design.
But quantum computers also pose a big security problem. With exponentially higher processing power, they will be able to smash through the public-key encryption standards widely relied on today, threatening the security of all digital information and communication.
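To see why, recall that RSA's security rests on the difficulty of factoring the public modulus: whoever recovers the two primes can derive the private key. A large-scale quantum computer running Shor's algorithm factors efficiently; the toy sketch below stands in trial division for that step on a deliberately tiny textbook modulus to show the consequence.

```python
# Toy demonstration: factoring the RSA modulus yields the private key.
n, e = 3233, 17          # textbook-sized RSA public key (n = 53 * 61)

# "Shor step" stand-in: trial division, feasible only because n is tiny.
p = next(f for f in range(2, n) if n % f == 0)
q = n // p

# With the factors known, the attacker derives the private exponent d.
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)       # modular inverse (Python 3.8+)

ciphertext = pow(42, e, n)          # someone encrypts message m = 42
recovered = pow(ciphertext, d, n)   # attacker decrypts with derived d
assert recovered == 42
```

For a real 2048-bit modulus the trial-division step is hopeless on classical hardware, which is the entire security assumption; Shor's algorithm removes it.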
While it’s tempting to brush it under the carpet as “tomorrow’s problem”, the reality of the situation is much more urgent. That’s because quantum computers don’t just pose a threat to tomorrow’s sensitive information: they’ll be able to decrypt data that has been encrypted in the past, that’s being encrypted in the present, and that will be encrypted in the future (if quantum-resistant algorithms are not used).
It’s why the NSA warned, as early as 2015, that we “must act now” to defuse the threat, and why the US National Institute of Standards and Technology (NIST) is racing to standardize new post-quantum cryptographic solutions, so businesses can get a trusted safety net in place before the threat materializes.
From aviation to pharma: The industries at risk
The harsh reality is that no one is immune to the quantum threat. Whether it’s a security service, pharmaceutical company or nuclear power station, any organization holding sensitive information or intellectual property that needs to be protected in the long term has to take the issue seriously.
The stakes are high. For governments, a quantum attack could mean a hostile state gains access to sensitive information, compromising state security or revealing secrets that undermine political stability. For pharmaceuticals, on the other hand, a quantum computer could allow competitors to gain access to valuable intellectual property, hijacking a drug that has been in costly development for years. (As we’re seeing in the race for a COVID-19 vaccine, this IP can sometimes have significant geopolitical importance.)
Hardware and software are also vulnerable to attack. Within an industry like aviation, a quantum-empowered hacker would have the ability to forge the signature of a software update, push that update to a specific engine part, and then use that to alter the operations of the aircraft. Medical devices like pacemakers would be vulnerable to the same kind of attack, as would connected cars whose software is regularly updated from the cloud.
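One mitigation for the update-forgery scenario is worth noting: symmetric authentication. HMAC's security rests on hash functions, which Shor's algorithm does not break (Grover's algorithm only halves the effective security level). The sketch below shows update verification with HMAC-SHA-256; the device key and firmware bytes are hypothetical placeholders, and real deployments would more likely use hash-based signatures such as LMS or XMSS.

```python
# Sketch: verifying a software update with a symmetric HMAC-SHA-256 tag
# instead of a quantum-vulnerable public-key signature.
import hmac, hashlib

DEVICE_KEY = b"per-device-secret"   # assumption: provisioned at manufacture

def sign_update(key: bytes, firmware: bytes) -> bytes:
    return hmac.new(key, firmware, hashlib.sha256).digest()

def verify_update(key: bytes, firmware: bytes, tag: bytes) -> bool:
    # compare_digest avoids timing side channels during verification
    expected = hmac.new(key, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

firmware = b"engine-controller v2.1 image bytes"
tag = sign_update(DEVICE_KEY, firmware)

assert verify_update(DEVICE_KEY, firmware, tag)             # genuine update accepted
assert not verify_update(DEVICE_KEY, firmware + b"!", tag)  # tampered image rejected
```

The trade-off is key distribution: every device needs a pre-shared secret, which is why hash-based signature schemes are the more practical post-quantum answer for broadcast update channels.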
Though the list of scenarios goes on, the good news is that companies can ready themselves for the quantum threat using technologies available today. Here’s how:
1. Start the conversation early
Begin by promoting quantum literacy within your business to ensure that executive teams understand the severity and immediacy of the security threat. Faced with competing priorities, they may otherwise struggle to understand why this issue deserves immediate attention and investment.
It’s your job to make sure they understand what they’re up against. Identify specific risks that could materialize for your business and industry – what would a quantum attack look like, and what consequences would you be facing if sensitive information were to be decrypted?
Paint a vivid picture of the possible scenarios and calculate the cost that each one would have for your business, so everyone knows what’s at stake. By doing so, you’ll start to build a compelling business case for upgrading your organization’s information security, rather than assuming that this will be immediately obvious.
2. Work out what you’ve got and what you still need
Do a full audit of every place within your business where you are using cryptography, and make sure you understand why that is. Surprisingly, many companies have no idea of all the encryption they currently have in place or why, because the layers of protection have been built up in a siloed fashion over many years.
What cryptographic standards are you relying on today? What data are you protecting, and where? Try to pinpoint where you might be vulnerable. If you’re storing sensitive information in cloud-based collaboration software, for example, that software may rely on public-key cryptography, so it won’t be quantum-secure.
As part of this audit, don’t forget to identify the places where data is in transit. However well your data is protected, it’s vulnerable when moving from one place to another. Make sure you understand how data is moving within your business – where from and to – so you can create a plan that addresses these weak points.
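A first pass at such an inventory can be automated. The sketch below walks a source tree and flags files that reference common crypto libraries or algorithm names; the pattern list, file extensions and paths are illustrative starting points, not a complete audit.

```python
# Minimal sketch of a cryptography-inventory scan for the audit step above.
import re
from pathlib import Path

CRYPTO_PATTERNS = re.compile(
    r"\b(RSA|ECDSA|ECDH|AES|SHA-?1|MD5|Diffie-?Hellman|OpenSSL|"
    r"cryptography\.hazmat|javax\.crypto)\b", re.IGNORECASE)

def scan_tree(root: str) -> dict:
    """Map each source file to the sorted crypto identifiers found in it."""
    findings = {}
    for path in Path(root).rglob("*"):
        if path.suffix not in {".py", ".java", ".go", ".c", ".cpp", ".ts"}:
            continue
        text = path.read_text(errors="ignore")
        hits = sorted({m.group(0) for m in CRYPTO_PATTERNS.finditer(text)})
        if hits:
            findings[str(path)] = hits
    return findings
```

The output is a file-by-file list of algorithm and library mentions, which maps directly onto the "what are we using, and why?" questions the audit needs to answer.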
It’s also vital that you think about what industry regulations or standards you need to comply with, and where these come into play across the areas of your business. For industries like healthcare or finance, for example, there’s an added layer of regulation when it comes to information security, while privacy laws like the GDPR and CCPA will apply if you hold personal information relating to European or Californian citizens.
3. Build a long-term strategy for enhanced security
Once you’ve got a full view of what sensitive data you hold, you can start planning your migration to a quantum-ready architecture. How flexible is your current security infrastructure? How crypto-agile are your cryptography solutions? In order to migrate to new technology, do you need to rewrite everything, or could you make some straightforward switches?
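Crypto-agility in practice means hiding the algorithm behind a stable interface and selecting it by configuration, so a later swap to a post-quantum scheme is a config change rather than a rewrite. The sketch below illustrates the pattern; the scheme names are real algorithm labels but the bodies are placeholders, not real bindings.

```python
# Sketch of a crypto-agile design: key-establishment schemes behind one
# interface, chosen by configuration. Bodies are placeholders for real
# library calls sharing this signature.
from typing import Callable, Dict

KEM_REGISTRY: Dict[str, Callable[[], bytes]] = {}

def register(name: str):
    def wrap(fn: Callable[[], bytes]):
        KEM_REGISTRY[name] = fn
        return fn
    return wrap

@register("x25519")          # today's classical default (placeholder body)
def _classical() -> bytes:
    return b"classical-shared-secret"

@register("kyber768")        # future post-quantum drop-in (placeholder body)
def _post_quantum() -> bytes:
    return b"pq-shared-secret"

def establish_key(config: dict) -> bytes:
    # One config value decides the algorithm -- call sites never change.
    return KEM_REGISTRY[config["kem"]]()
```

With this shape, migrating to a NIST-standardized algorithm later touches one registry entry and one config line instead of every call site.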
Post-quantum encryption standards will be finalized by NIST in the next year and a half, but the process is already underway, and the direction of travel is becoming clearer. Now that finalist algorithms have been announced, businesses don’t need to wait to become quantum-secure; they simply need to design their security infrastructure to work with any of the shortlisted approaches that NIST is considering for standardization.
Deploying a hybrid solution – pairing existing solutions with one of the post-quantum schemes named as a NIST finalist – can be a good way to build resilience and flexibility into your security architecture. By doing this, you’ll be able to comply with whichever new industry standards are announced and remain fully protected against present and future threats in the meantime.
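The hybrid idea can be sketched concretely: derive the session key from both a classical shared secret (e.g. from ECDH) and a post-quantum one (e.g. from a finalist KEM) via a KDF, so an attacker must break both schemes to recover the key. The two input secrets below are placeholders for real library outputs, and the HKDF here is a single-block simplification of RFC 5869.

```python
# Sketch of hybrid key derivation: classical + post-quantum secrets combined
# through HKDF-SHA-256 so the session key survives a break of either scheme.
import hmac, hashlib

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int = 32) -> bytes:
    # RFC 5869 extract-then-expand, single output block for brevity
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

classical_secret = b"ecdh-shared-secret-placeholder"   # from e.g. X25519
pq_secret = b"pq-kem-shared-secret-placeholder"        # from e.g. a NIST finalist KEM

# Concatenate both secrets: recovering the key requires breaking BOTH schemes.
session_key = hkdf_sha256(classical_secret + pq_secret,
                          salt=b"hybrid-handshake-v1",
                          info=b"session-key")
assert len(session_key) == 32
```

This is essentially the construction used by early hybrid TLS experiments: the classical half preserves today's interoperability and assurance, the post-quantum half hedges against tomorrow.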
Whatever you decide, remember that migration can take time – especially if your business is already built on a complex infrastructure that will be hard to unpick and rebuild. Put a solid plan in place before you begin and consider partnering with an expert in the field to speed up the process.
A risk we can’t see
Just because a risk hasn’t yet materialized doesn’t mean it isn’t worth preparing for (a mindset that could have come in handy for the coronavirus pandemic, all things considered…).
The quantum threat is serious, and it’s urgent. The good thing is that we already have all the ingredients to get a safety net in place, and thanks to strong mathematical foundations, we can be confident in the knowledge that the algorithms being standardized by NIST will protect businesses from even the most powerful computers.
The next step? Making sure this cutting-edge technology gets out of the lab and into the hands of the organizations who need it most.
The number of records exposed has increased to a staggering 36 billion. There were 2,935 publicly reported breaches in the first three quarters of 2020, with the three months of Q3 adding an additional 8.3 billion records to what was already the “worst year on record,” Risk Based Security reveals.
“Breach disclosures continue to be well below the high water mark established just last year, despite other research indicating the number of attacks is on the rise. How do we square these two competing views into the digital threat landscape?” the report asks.
Factors contributing to the decline in publicly reported breaches
The report explores several possible explanations, such as reduced media coverage of breaches; the rise in ransomware attacks may also play a part.
“We believe that the pivot by malicious actors to more lucrative ransomware attacks is another factor,” Goddijn commented.
“While many of these attacks are now clearly breach events, the nature of the data compromised can give some victim organizations a reprieve from reporting the incident to regulators and the public.
“After all, while the compromised data may be sensitive to the target organization, unless it contains a sufficient amount of personal data to trigger a notification obligation the event can go unreported.”
The Risk Based Security report covers the data breaches reported between January 1, 2020 and September 30, 2020. In addition to the latest breach data research, the report also dissects alarming trends involving the coming November election, where several US voter databases have been shared and discussed on both Russian- and English-speaking hacking forums.