Growth of cloud-native apps and containerization to define 2021

Scality announced its data storage predictions for 2021, focusing on the rapid growth rate of cloud-native apps and containerization. According to IDC, by 2023, over 500 million digital apps and services will be developed and deployed using cloud-native approaches. That is the same number of apps developed in total over the last 40 years. “The accelerated growth of next-generation cloud-native digital apps and services will define new competitive requirements in …

IT and data management challenges for 2021

Cohesity announced the results of a survey of 500 IT decision makers in the United States that highlights critical IT and data management challenges midsize and enterprise organizations are facing as companies prepare for 2021.

2021 data management challenges

The survey included 250 respondents from midsize companies ($100M-$1B in revenue) and 250 from enterprise organizations ($1B+ in revenue).

Some of these challenges came to light as companies answered questions about their appetite for Data Management as a Service (DMaaS). With a DMaaS solution, organizations do not have to manage data infrastructure – it is managed for them.

DMaaS provides organizations with easy access to backup and recovery, disaster recovery, archiving, file and object services, dev/test provisioning, data governance, and security – all through one vendor in a Software as a Service (SaaS) model.

Key findings

IT budgets are being slashed: Seventy percent of respondents state their organization is being forced to cut the IT budget in the next 12 months. Around a third of respondents have to cut the IT budget by 10-25 percent, and a tenth have to cut it by a whopping 25-50 percent.

Verticals facing the largest cuts on average: technology (20 percent), education (18 percent), government/public sector (16 percent).

Many midsize companies are struggling to compete against larger enterprises because of inefficient data management: 27 percent of respondents from midsize companies say they have lost 25-50 percent of deals to larger enterprises because larger enterprises have more resources to manage and derive value from their data.

Even worse, 18 percent of respondents from midsize companies claim to have lost 50-75 percent of deals to larger enterprises for the same reason.

Organizations are spending inordinate amounts of time managing data infrastructure: Respondents say IT teams, on average, spend 40 percent of their time each week installing, maintaining, and managing data infrastructure. Twenty-two percent claim their IT team spends 50-75 percent of its time each week on these tasks.

Technology is needed that makes it easier to derive value from data while also reducing stress levels and employee turnover: When respondents were asked about the benefits of deploying a DMaaS solution versus spending so much time managing data infrastructure, 61 percent cited an ability to focus more on deriving value from data, which could help their organization’s bottom line, 52 percent cited reduced stress levels for IT teams, and 47 percent are hopeful this type of solution could also reduce employee turnover within the IT team.

“As 2020 comes to a close, it’s clear that IT teams are expected to face significant challenges in 2021,” said Matt Waxman, VP, product management, Cohesity.

“Research shows IT leaders are anxious for comprehensive solutions that will enable them to do more with data in ways that will help boost revenues and provide a competitive advantage at a time when they are also facing budget cuts, burnout, and turnover.”

The growing appetite for technology that simplifies IT and data management

As businesses look to simplify IT operations, be more cost efficient, and do more with data, respondents are very optimistic about the benefits of DMaaS, which include:

  • Cost predictability: Eighty-nine percent of respondents say their organization is likely to consider deploying a DMaaS solution, at least in part, due to budget cuts.
  • Helping midsize companies win more business: Ninety-one percent of respondents from midsize companies believe deploying a DMaaS solution will enable their organizations to compete more effectively against larger enterprises that have more resources to manage data.
  • Saving IT teams valuable time: Respondents who noted that their IT teams spend time each week managing IT infrastructure believe those teams will save, on average, 39 percent of their time each week if their company had a full DMaaS solution in place.
  • Doing more with data: Ninety-seven percent of respondents believe DMaaS unlocks opportunities to derive more value from data using cloud-based services and applications. Sixty-four percent want to take advantage of cloud-based capabilities that enable them to assess and improve their security posture, including improving anti-ransomware capabilities.
  • Alleviating stress and reducing turnover: Ninety-three percent of respondents believe that deploying a DMaaS solution would enable them to focus less on infrastructure provisioning and data management tasks. Fifty-two percent of these respondents say deploying a DMaaS solution could reduce their team’s stress levels by not having to spend so much time on infrastructure provisioning and management. Forty-seven percent believe deploying a DMaaS solution could reduce employee turnover within the IT team.

Choice is the name of the game for IT in 2021

“The data also pinpoints another important IT trend in 2021: choice is critical,” said Waxman. “IT leaders want to manage data as they see fit.” With respect to choice, respondents stated:

  • It’s not one or the other, it’s both: Sixty-nine percent of respondents stated their organization prefers to partner with vendors that offer choice in how their company’s data is managed and will not consider vendors that offer only a DMaaS model — they also want the option to manage some data directly.
  • Avoiding one-trick ponies is key: Ninety-four percent of survey respondents stated that it’s important to work with a DMaaS vendor that does more than Backup as a Service (BaaS). If the vendor only offers BaaS, 70 percent are concerned they will have to work with more vendors to manage their data, and respondents worry that doing so is likely to increase their workload (77 percent), fail to reduce costs (65 percent), and lead to mass data fragmentation, where data is siloed and hard to manage and gain insights from (74 percent).

How do I select a data storage solution for my business?

We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices. Unfortunately, 58% of organizations make decisions based on outdated data.

While enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads.

To select a suitable data storage solution for your business, you need to think about a variety of factors. We’ve talked to several industry leaders to get their insight on the topic.

Phil Bullinger, SVP and General Manager, Data Center Business Unit, Western Digital

Selecting the right data storage solution for your enterprise requires evaluating and balancing many factors. The most important is aligning the performance and capabilities of the storage system with your critical workloads and their specific bandwidth, application latency and data availability requirements. For example, if your business wants to gain greater insight and value from data through AI, your storage system should be designed to support the accelerated performance and scale requirements of analytics workloads.

Storage systems that maximize the performance potential of solid state drives (SSDs) and the efficiency and scalability of hard disk drives (HDDs) provide the flexibility and configurability to meet a wide range of application workloads.

Your applications should also drive the essential architecture of your storage system, whether directly connected or networked, whether required to store and deliver data as blocks, files, objects or all three, and whether the storage system must efficiently support a wide range of workloads while prioritizing the performance of the most demanding applications.

Consideration should also be given to your overall IT data management architecture. It must support the scalability, data protection, and business continuity assurance your enterprise requires, spanning from core data centers to those distributed at or near the edge and endpoints of your operations, as well as integration with your cloud-resident applications, compute and data storage services and resources.

Ben Gitenstein, VP of Product Management, Qumulo

When searching for the right data storage solution to support your organizational needs today and in the future, it’s important to select a solution that is trusted, scalable to secure demanding workloads of any size, and ensures optimal performance of applications and workloads both on premises and in complex, multi-cloud environments.

With the recent pandemic, organizations are digitally transforming faster than ever before, and leveraging the cloud to conduct business. This makes it more important than ever that your storage solution has built-in tools for data management across this ecosystem.

When evaluating storage options, be sure to do your homework and ask the right questions. Is it a trusted provider? Would it integrate well within my existing technology infrastructure? Your storage solution should be easy to manage and meet the scale, performance and cloud requirements for any data environment and across multi-cloud environments.

Also, be sure the storage solution gives IT control over how they manage storage capacity needs and delivers real-time insight into analytics and usage patterns, so they can make smart storage allocation decisions and maximize an organization’s storage budget.

David Huskisson, Senior Solutions Manager, Pure Storage

Data backup and disaster recovery features are critically important when selecting a storage solution for your business, as no organization is immune to ransomware attacks. When systems go down, they need to be recovered as quickly and safely as possible.

Look for solutions that offer simplicity in management, can ensure backups are viable even when admin credentials are compromised, and can be restored quickly enough to greatly reduce major organizational or financial impact.

Storage solutions that are purpose-built to handle unstructured data are a strong place to start. By definition, unstructured data is unpredictable data that can take any form, size or shape, and can be accessed in any pattern. A purpose-built platform can accelerate small, large, random or sequential data, and consolidate a wide range of workloads on a unified fast file and object storage platform. It should maintain its performance even as the amount of data grows.

If you have an existing backup product, you don’t need to rip and replace it. There are storage platforms with robust integrations that work seamlessly with existing solutions and offer a wide range of data-protection architectures so you can ensure business continuity amid changes.

Tunio Zafer, CEO, pCloud

Bear in mind: your security team needs to assist. Answer these questions to find the right solution: Do you need ‘cold’ storage or cloud storage? If you’re looking to only store files for backup, you need a cloud backup service. If you’re looking to store, edit and share, go for cloud storage. Where are their storage servers located? If your business is located in Europe, the safest choice is a storage service based in Europe.

Best case scenario – your company is going to grow. Look for a storage service that offers scalability. What is their data privacy policy? Research whether someone can legally access your data without your knowledge or consent. Switzerland has one of the strictest data privacy laws globally, so choosing a Swiss-based service is a safe bet. How is your data secured? Look for a service that offers robust encryption in-transit and at-rest.

Client-side encryption means that your data is secured on your device and is transferred already encrypted. What is their support package? At some point, you’re going to need help, so prefer a data storage service whose support package is included for free and answers within 24 hours.
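
To make the client-side encryption point concrete, here is a minimal Python sketch using the cryptography package’s Fernet recipe. The upload step is a hypothetical placeholder, not any particular provider’s API; the point is that only ciphertext ever leaves your device.

    from cryptography.fernet import Fernet

    # Generate and keep the key locally; it never leaves your device.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Encrypt before upload, so the provider only ever sees ciphertext.
    plaintext = b"quarterly-report.xlsx contents"
    ciphertext = cipher.encrypt(plaintext)

    # upload_to_cloud(ciphertext)  # hypothetical storage call

    # On download, decrypt locally with the same key.
    assert cipher.decrypt(ciphertext) == plaintext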

EU Commission: The GDPR has been an overall success

The European Commission has published an evaluation report on the General Data Protection Regulation (GDPR), two years after the regulation became enforceable.

Two years of the GDPR: What was achieved?

Citizens are more empowered and aware of their rights: The GDPR enhances transparency and gives individuals enforceable rights, such as the right of access, rectification, erasure, the right to object and the right to data portability. Today, 69% of the population above the age of 16 in the EU have heard about the GDPR and 71% have heard about their national data protection authority, according to results published last week in a survey by the EU Fundamental Rights Agency. However, more can be done to help citizens exercise their rights, notably the right to data portability.

Data protection rules are fit for the digital age: The GDPR has empowered individuals to play a more active role in relation to what is happening with their data in the digital transition. It is also contributing to fostering trustworthy innovation, notably through a risk-based approach and principles such as data protection by design and by default.

Data protection authorities are making use of their stronger corrective powers: From warnings and reprimands to administrative fines, the GDPR provides national data protection authorities with the right tools to enforce the rules. However, they need to be adequately supported with the necessary human, technical and financial resources. Many Member States are doing this, with notable increases in budgetary and staff allocations. Overall, there has been a 42% increase in staff and 49% in budget for all national data protection authorities taken together in the EU between 2016 and 2019. However, there are still stark differences between Member States.

Data protection authorities are working together in the context of the European Data Protection Board (EDPB), but there is room for improvement: The GDPR established a governance system designed to ensure a consistent and effective application of the rules through the so-called ‘one-stop shop’, which provides that a company processing data cross-border has only one data protection authority as interlocutor, namely the authority of the Member State where its main establishment is located. Between 25 May 2018 and 31 December 2019, 141 draft decisions were submitted through the one-stop shop, 79 of which resulted in final decisions. However, more can be done to develop a truly common data protection culture. In particular, the handling of cross-border cases calls for a more efficient and harmonised approach and an effective use of all the tools the GDPR provides for data protection authorities to cooperate.

Advice and guidelines by data protection authorities: The EDPB is issuing guidelines covering key aspects of the Regulation and emerging topics. Several data protection authorities have created new tools, including helplines for individuals and businesses, and toolkits for small and micro-enterprises. It is essential to ensure that guidance provided at national level is fully consistent with guidelines adopted by the EDPB.

Harnessing the full potential of international data transfers: Over the past two years, the Commission’s international engagement on free and safe data transfers has yielded important results. This includes Japan, with which the EU now shares the world’s largest area of free and safe data flows. The Commission will continue its work on adequacy with its partners around the world. In addition, and in cooperation with the EDPB, the Commission is looking at modernising other mechanisms for data transfers, including Standard Contractual Clauses, the most widely used data transfer tool. The EDPB is working on specific guidance on the use of certification and codes of conduct for transferring data outside of the EU, which needs to be finalised as soon as possible. Given that the European Court of Justice may provide clarifications in a judgment to be delivered on 16 July that could be relevant for certain elements of the adequacy standard, the Commission will report separately on the existing adequacy decisions after the Court of Justice has handed down its judgment.

Promoting international cooperation: Over the last two years, the Commission has stepped up bilateral, regional and multilateral dialogue, fostering a global culture of respect for privacy and convergence between different privacy systems to the benefit of citizens and businesses alike. The Commission is committed to continuing this work as part of its broader external action, for example, in the context of the Africa-EU Partnership and in its support for international initiatives, such as ‘Data Free Flow with Trust’. At a time when violations of privacy rules may affect large numbers of individuals simultaneously in several parts of the world, it is time to step up international cooperation between data protection enforcers. This is why the Commission will seek authorisation from the Council to open negotiations for the conclusion of mutual assistance and enforcement cooperation agreements with relevant third countries.

GDPR: What’s next?

According to the report, in two years the GDPR has met most of its objectives, in particular by offering citizens a strong set of enforceable rights and by creating a new European system of governance and enforcement.

The GDPR proved to be flexible to support digital solutions in unforeseen circumstances such as the Covid-19 crisis. The report also concludes that harmonisation across the Member States is increasing, although there is a certain level of fragmentation that must be continually monitored. It also finds that businesses are developing a compliance culture and increasingly use strong data protection as a competitive advantage.

The GDPR has acted as a catalyst for many countries and states around the world – e.g., Chile, South Korea, Brazil, Japan, Kenya, India, Tunisia, Indonesia, Taiwan and the state of California – to consider how to modernise their privacy rules, the EC noted.

The EC also pointed out that the GDPR provides data protection authorities with many corrective powers to enforce it (administrative fines, orders to comply with data subjects’ requests, bans on processing, the suspension of data flows, etc.).

There is room for improvement, though.

“For example, we need more uniformity in the application of the rules across the Union: this is important for citizens and for businesses, especially SMEs. We need also to ensure that citizens can make full use of their rights,” noted Didier Reynders, Commissioner for Justice.

The EC also noted that stakeholders should closely monitor the application of the GDPR to new technologies such as AI, the Internet of Things, and blockchain.

How data intelligent organizations mitigate risk

Organizations that put data at the center of their vision and strategy realize a differentiated competitive advantage by mitigating cost and risk, growing revenue and improving the customer experience, a Collibra survey of more than 900 global business analysts reveals.

Orgs rarely use data to guide business decisions

Despite a majority of companies saying they valued using data to drive decisions, many organizations are not consistently executing. While 84% of respondents said that it is very important or critical to put data at the center of their crucial business decisions and strategy, 43% admitted that their organizations fail to always or even routinely use data to guide business decisions.

Without a data management strategy, analysts often spend time on tasks that take away from their ability to perform analysis and provide value. This is a resounding issue for less data-mature organizations, which are 55% less likely to say their data management strategies positively contribute to optimal business decisions.

Data management strategy improves customer trust

Those insights-driven decisions are also yielding more successful outcomes, giving data intelligent organizations a competitive edge in achieving their key business objectives.

These organizations, which have the ability to connect the right data, insights and algorithms so people can drive business value, realized an 8% advantage in improving customer trust, an 81% advantage in growing revenue, and a 173% advantage in better complying with regulations and requirements.

Those organizations adopting data intelligence were also 58% more likely to exceed their revenue goals than non-data intelligent organizations.

“To lead with data, companies need to advance how they discover, organize, collaborate with, and execute on the data they have,” said Felix Van de Maele, co-founder and CEO of Collibra.

“Companies also must optimize how data analysts spend their time and automate rote tasks with data management technology. By freeing analysts up to spend more time on value-added tasks, organizations can decrease time to insight and accelerate trusted business outcomes.”

Organizations look to build resiliency with hybrid and multi-cloud architectures

Hybrid and multi-cloud architectures have become the de-facto standard among organizations, with 53 percent embracing them as the most popular form of deployment.

Advantages of hybrid and multi-cloud architectures

Surveying over 250 worldwide business executives and IT professionals from a diverse group of technical backgrounds, Denodo’s cloud usage survey revealed that hybrid cloud configurations are the centre of all cloud deployments at 42 percent, followed by public (18 percent) and private clouds (17 percent).

The advantages of hybrid cloud and multi-cloud configurations according to respondents include the ability to diversify spend and skills, build resiliency, and cherry-pick features and capabilities depending on each cloud service provider’s particular strengths, all while avoiding the dreaded vendor lock-in.

The use of container technologies increased by 50 percent year-over-year, indicating a growing trend of using them for scalability and portability to the cloud. DevOps professionals continue to look to containerization for production because it enables reproducibility and the ability to automate deployments.

About 80 percent of the respondents are leveraging some type of container deployment, with Docker being the most popular (46 percent), followed by Kubernetes (40 percent), which is gaining steam, as is evident from the consistent support of all the key cloud providers.

Most popular cloud service providers

In a foundational measure of cloud adoption maturity, 78 percent of all respondents are running some kind of workload in the cloud. Over the past year, there has been a positive reinforcement of cloud adoption, with at least a 10 percent increase across beginner, intermediate, and advanced adopters.

About 90 percent of those embracing cloud are selecting AWS and Microsoft Azure as their service providers, demonstrating the continued dominance of these front-runners.

But users are not just lifting their on-premises applications and shifting them to one or both of these clouds; 35 percent said they would re-architect their applications for the best-fit cloud architecture.

Among cloud initiatives, analytics and BI came out on top, with two out of three (66 percent) participants claiming to use the cloud for big data analytics projects. AWS, Azure, and Google Cloud each has its own specific strengths, but analytics surfaced as the top use case across all three. This use case was followed closely by both logical data warehouse (43 percent) and data science (41 percent) in the cloud.

Data formats

When it comes to data formats, two-thirds of the data being used is still in structured format (68 percent), while there is a vast pool of unstructured data that is growing in importance. Cloud object storage (47 percent) and SaaS data (44 percent) are frequently used to maximize ease of computation and performance optimization.

Further, cloud marketplaces are growing at a phenomenal speed and becoming more popular. Half (50 percent) of those surveyed are leveraging cloud marketplaces, with utility/pay-as-you-go pricing the most popular incentive (19 percent), followed by self-service capability and the ability to minimize IT dependency (13 percent). Avoiding a long-term commitment also played a role (6 percent).

“As data’s center of gravity shifts to the cloud, hybrid cloud and multi-cloud architectures are becoming the basis of data management, but the challenge of integrating data in the cloud has almost doubled (43 percent),” said Ravi Shankar, SVP and CMO of Denodo.

“Today, users are looking to simplify cloud data integration in a hybrid/multi-cloud environment without having to depend on heavy duty data migration or replication which may be why almost 50 percent of respondents said they are considering data virtualization as a key part of their cloud integration and migration strategy.”

The state of data quality: Too much, too wild and too skewed

We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices and better understand the markets and customers we work with. However, this is all easier said than done, and one of the biggest concerns businesses have around their data is its quality, a fact confirmed by 1,900 people surveyed at the end of last year on the state of data quality. Despite being aware of data quality issues, many are uncertain about how best to address those concerns.

Data quality issues

At the top of the concern chart is the sheer number of data sources available to businesses today. Over 60% of respondents indicated that too many data sources and inconsistent data were their top data quality worry. This was followed closely by disorganised data stores and lack of metadata (50%), and poor data quality controls at data entry (47%).

Despite this being a top concern, it will be hard for any organization to reduce the number of data sources it has; if anything, this number is only likely to increase over time. The problem was first tackled when we were still maintaining data in spreadsheets, with data management practitioners coining the term “spreadmart hell” as they tried to maintain data governance over multiple spreadsheets maintained by individuals or groups spread across an organization.

This problem was, unfortunately, not solved by the adoption of self-service data analysis tools, as they failed to include much-needed features like metadata creation and management and data synchronization.

So, instead of looking at the number of data sources as a problem, we should look at it as a feature and be thankful that technology has greatly progressed to match organizations’ needs. Front-end tools generate metadata and capture provenance and lineage, and data cataloguing software then manages this, so technology has our back. We do, however, have to continue to push a cultural change around data, encouraging people throughout the organization to ensure data quality, governance and general data literacy.
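
To illustrate what capturing provenance and lineage can look like, here is a minimal Python sketch. It assumes a simple home-grown record format rather than any particular front-end tool or catalogue product: each derived dataset carries a note of its sources and the transformation that produced it.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class LineageRecord:
        dataset: str         # name of the derived dataset
        sources: list        # upstream datasets it was built from
        transformation: str  # human-readable description of the step
        created_at: str = field(
            default_factory=lambda: datetime.now(timezone.utc).isoformat()
        )

    # A front-end tool would emit records like this as a side effect of each
    # transformation; cataloguing software would then index and manage them.
    record = LineageRecord(
        dataset="sales_by_region_2020",
        sources=["raw_sales", "region_lookup"],
        transformation="join on region_id, aggregate revenue by region",
    )
    print(record)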

Data governance 101

Some of the common data quality issues that the survey revealed point to larger, institutional problems. Disorganised data stores and lack of metadata are fundamentally a governance issue, and with only 20% of respondents saying their organizations publish information on data provenance and lineage, we can conclude that very few organizations are up to snuff on governance.

Like the sheer amount of data being ingested by organizations, data governance isn’t necessarily an easy problem to solve, and it is likely to only grow. Poor data quality controls at data entry are fundamentally where this problem originates; as any good data scientist knows, entry issues are persistent and widespread, if not stubborn. Adding to this, practitioners may have little or no control over providers of third-party data, so missing data will always be with us.

However, there is reason for hope: machine learning and artificial intelligence tools could provide a reprieve from these worries. Almost half (48%) of respondents are already using machine learning or AI tools to address data quality issues, automating some of the tasks involved in discovering, profiling and indexing data.
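
Much of that automation starts with simple, repeatable checks. The sketch below is a deliberately basic rule-based profiler, not the ML-driven tooling the survey describes; it just reports how often each field is missing across a batch of records.

    # Profile a batch of records: report the missing-value rate per field.
    def profile_missing(records):
        fields = {f for r in records for f in r}
        total = len(records)
        return {
            f: sum(1 for r in records if r.get(f) in (None, "")) / total
            for f in fields
        }

    rows = [
        {"id": 1, "email": "a@example.com", "age": 34},
        {"id": 2, "email": "", "age": None},
        {"id": 3, "email": "c@example.com", "age": 29},
    ]
    # e.g. {'id': 0.0, 'email': 0.333..., 'age': 0.333...} (key order may vary)
    print(profile_missing(rows))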

Data governance, like data quality, is fundamentally a socio-technical problem and as much as ML and AI can help, the right people and processes need to be in place to truly make it happen. Ultimately, people and processes are almost always implicated in both the creation and the perpetuation of data quality issues, so we need to start there.

People and data: Bridging the biases

People have long been noted as transferring their biases onto data when analyzing it, but only 20% of respondents cited this as a primary data quality issue. Despite this being a much-discussed problem, respondents see it as less of an issue than many other quality issues.

These responses shouldn’t lead us to rule out data bias as an issue; rather, they underscore the importance of acknowledging that data contains bias. We should assume, not rule out, the existence of unknown biases. This should lead us to develop formal diversity standards within data and create processes to detect, acknowledge and address those biases.

Missing data also plays a part in this: it isn’t just that we lack the data we believe we need; sometimes we don’t know, or can’t imagine, what data we need. This is why it is important to have a diverse group of people working with data, to bring different ideas and insights to the process.

Creating high quality data practices

Most respondents indicated many data quality issues – they don’t seem to travel alone. At the same time, over 70% of respondents don’t have dedicated data quality teams. What is to be done?

Organizations should take formal steps to condition and improve their data. This will be an ongoing process and C-suite buy-in – although difficult to obtain – will be key to creating this long-term strategy. The C-suite, like many others in the organization, will need education on and understanding of the importance of this project and the business benefits it will enable.

C-suite buy-in is also vital because data conditioning is not easy or cheap. Committing to formal processes, implementing technology and creating a dedicated team takes time and money. An ROI-based approach should help determine what data conditioning is a priority and what is not worth addressing.

AI can unearth quality issues hiding in plain sight and investing in AI can be a catalyst for data quality remediation. But while AI-enriched tools can help, they are not the complete answer. You need a team to foster the use of the tools and fully master them to garner the benefits.

Data quality can feel like an overwhelming problem. Start with the basics and encourage good data hygiene practices throughout your organization. Remember: technology and tools are great, but it all starts with people and culture.

While nearly 90% of companies are backing up data, only 41% do it daily

42% of companies experienced a data loss event that resulted in downtime last year, according to Acronis. That high number is likely caused by the fact that while nearly 90% are backing up the IT components they’re responsible for protecting, only 41% back up daily – leaving many businesses with gaps in the valuable data available for recovery.

The figures illustrate a new reality: traditional data protection strategies and solutions can no longer keep up with the modern IT needs of individuals and organizations.

The importance of implementing a cyber protection strategy

The annual survey, completed this year by nearly 3,000 people, gauges the protection habits of users around the globe. The findings revealed that while 91% of individuals back up data and devices, 68% still lose data as a result of accidental deletion, hardware or software failure, or an out-of-date backup.

Meanwhile, 85% of organizations aren’t backing up multiple times per day; only 15% report they are. 26% back up daily, 28% back up weekly, 20% back up monthly, and 10% aren’t backing up at all, which can mean days, weeks, or months of data lost with no possibility of complete recovery.

Of those professional users who don’t back up, nearly 50% believe backups aren’t necessary. The survey contradicts that belief: 42% of organizations reported data loss resulting in downtime this year, and 41% report losing productivity or money due to data inaccessibility.

Furthermore, only 17% of personal users and 20% of IT professionals follow best practices, employing hybrid backups on local media and in the cloud.

These findings stress the importance of implementing a cyber protection strategy that includes backing up your data multiple times a day and practicing the 3-2-1 backup rule: create three copies of your data (one primary copy and two backups), store your copies in at least two types of storage media, and store one of these copies remotely or in the cloud.
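
Because the 3-2-1 rule is fully mechanical, it can even be checked in code. A minimal Python sketch, assuming each backup copy is described by its media type and whether it is stored offsite:

    # Check a set of copies against the 3-2-1 rule:
    # >= 3 copies, >= 2 media types, >= 1 copy offsite or in the cloud.
    def satisfies_3_2_1(copies):
        media = {c["media"] for c in copies}
        offsite = any(c["offsite"] for c in copies)
        return len(copies) >= 3 and len(media) >= 2 and offsite

    copies = [
        {"media": "disk",  "offsite": False},  # primary copy
        {"media": "nas",   "offsite": False},  # local backup
        {"media": "cloud", "offsite": True},   # remote backup
    ]
    print(satisfies_3_2_1(copies))  # True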

“Individuals and organizations keep suffering from data loss and cyberattacks. Everything around us is rapidly becoming dependent on digital, and it is time for everyone to take cyber protection seriously,” said Acronis Chief Cyber Officer, Gaidar Magdanurov.

“Cyber protection in the digital world becomes the fifth basic human need, especially during this unprecedented time when many people must work remotely and use less secure home networks.

“It is critical to proactively implement a cyber protection strategy that ensures the safety, accessibility, privacy, authenticity, and security of all data, applications, and systems – whether you’re a home user, an IT professional, or an IT service provider.”

Cyber protection changes the game

With increasing cyberattacks, traditional backup is no longer sufficient to protect data, applications, and systems; relying on backup alone for true business continuity is too dangerous. Cybercriminals target backup software with ransomware and try to modify backup files, which magnifies the need for authenticity verification when restoring workloads.

It makes sense, then, that the survey indicated a universally high level of concern about cyberthreats like ransomware. 88% of IT professionals reported concern over ransomware, 86% are concerned about cryptojacking, 87% are concerned about social engineering attacks like phishing, and 91% are concerned about data breaches.

Among personal users, awareness and concern regarding all four of these threat types were nearly as high. In fact, compared to the 2019 survey their concern about cyberthreats rose by 33%.

The survey also revealed a lack of insight into data management, exposing a great need for cyber protection solutions with greater visibility and analytics. The surprising findings indicate that 30% of personal users and 12% of IT professionals wouldn’t know if their data was modified unexpectedly.

30% of personal users and 13% of IT professionals aren’t sure if their anti-malware solution stops zero-day threats. Additionally, 9% of organizations reported that they didn’t know if they experienced downtime as a result of data loss this year.

To ensure complete protection, secure backups must be part of an organization’s comprehensive cyber protection approach, which includes ransomware protection, disaster recovery, cybersecurity, and management tools.

Cyber protection recommendations

Whether you are concerned about personal files or your company’s business continuity, there are five simple recommendations to ensure fast, efficient, and secure protection of your workloads:

  • Always create backups of important data. Keep multiple copies of the backup both locally (so it’s available for fast, frequent recoveries) and in the cloud (to guarantee you have everything if a fire, flood, or disaster hits your facilities).
  • Ensure your operating systems and applications are current. Relying on outdated OSes or apps means they lack the bug fixes and security patches that help block cybercriminals from gaining access to your systems.
  • Beware suspicious email, links, and attachments. Most virus and ransomware infections are the result of social engineering techniques that trick unsuspecting individuals into opening infected email attachments or clicking on links to websites that host malware.
  • Install anti-virus, anti-malware, and anti-ransomware software while enabling automatic updates so your system is protected against malware, with the best software also able to protect against zero-day threats.
  • Consider deploying an integrated cyber protection solution that combines backup, anti-ransomware, anti-virus, vulnerability assessment and patch management in a single solution. An integrated solution increases ease of use, efficiency and reliability of protection.

Researchers develop data exchange approach with blockchain-based security features

An IT startup has developed a novel blockchain-based approach for secure linking of databases, called ChainifyDB.

“Our software resembles keyhole surgery. With a barely noticeable procedure we enhance existing database infrastructures with blockchain-based security features. Our software is seamlessly compatible with the most common database management systems, which drastically reduces the barrier to entry for secure digital transactions,” explains Jens Dittrich, Professor of Computer Science at Saarland University at Saarbrücken, Germany.

How does ChainifyDB work?

The system offers various mechanisms for a trustworthy data exchange between several parties. The following example shows one of its use cases.

Assume some doctors are treating the same patient and want to maintain his or her patient file together. To do this, the doctors would have to install the Saarbrücken researchers’ software on their existing database management systems. Then, they could jointly create a data network.

In this network, the doctors set up a shared table in which they enter the patient file for the shared patient. “If a doctor changes something in his table, it affects all other tables in the network. Subsequent changes to older table states are only possible if all doctors in the network agree,” explains Jens Dittrich.

Another special feature: If something about the table is changed, the focus is not on the change itself, but on its result. If the result is identical in all tables in the network, the changes can be accepted. If not, the consensus process starts again.

“This makes the system tamper-proof and guarantees that all network participants’ tables always have the same status. Furthermore, only the shared data in the connected tables is visible to other network participants; all other contents of the home database remain private”, emphasizes Dr. Felix Martin Schuhknecht, Principal Investigator of the project.
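
The result-oriented consensus described above can be pictured with a toy example. This is only a sketch of the idea, not ChainifyDB’s implementation: each participant applies a change to its own copy of the table, and the change commits only if every participant arrives at the same resulting state.

    import hashlib, json

    def state_digest(table):
        # Hash the resulting table state, not the change that produced it.
        canonical = json.dumps(table, sort_keys=True).encode()
        return hashlib.sha256(canonical).hexdigest()

    def try_commit(replicas, apply_change):
        results = [apply_change(dict(r)) for r in replicas]
        digests = {state_digest(t) for t in results}
        if len(digests) == 1:   # identical result everywhere: accept
            return results
        return replicas         # disagreement: keep old state, retry consensus

    patient_file = {"patient": "P-17", "allergy": "none"}
    replicas = [dict(patient_file) for _ in range(3)]  # three doctors
    update = lambda t: {**t, "allergy": "penicillin"}
    replicas = try_commit(replicas, update)
    print(replicas[0])  # all three tables now agree on the new state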

Advantages for security-critical situations

The new software offers advantages especially for security-critical situations, such as hacker attacks or when business partners cannot completely trust each other. Malicious participants can be excluded from a network without impairing its functionality.

If a former participant is to be reinstated, the remaining network participants only have to agree on a “correct” table state. The previously suspended partner can then be set to this state. “As far as we know, this function is not yet offered by any comparable software,” adds Dittrich.

In order to bring ChainifyDB to market, the German Federal Ministry of Education and Research is supporting the Saarbrücken researchers’ start-up, which is currently being founded, with 840,000 euros.

NIST Privacy Framework 1.0: Manage privacy risk, demonstrate compliance

Our data-driven society has a tricky balancing act to perform: building innovative products and services that use personal data while still protecting people’s privacy. To help organizations keep this balance, the National Institute of Standards and Technology (NIST) is offering a new tool for managing privacy risk.

Version 1.0 of the NIST Privacy Framework

The agency has just released Version 1.0 of the NIST Privacy Framework: A Tool for Improving Privacy through Enterprise Risk Management. Developed from a draft version in collaboration with a range of stakeholders, the framework provides a useful set of privacy protection strategies for organizations that wish to improve their approach to using and protecting personal data.

The publication also provides clarification about privacy risk management concepts and the relationship between the Privacy Framework and NIST’s Cybersecurity Framework.

“Privacy is more important than ever in today’s digital age,” said Under Secretary of Commerce for Standards and Technology and NIST Director Walter G. Copan.

“The strong support the Privacy Framework’s development has already received demonstrates the critical need for tools to help organizations build products and services providing real value, while protecting people’s privacy.”

Personal data includes information about specific individuals, such as their addresses or Social Security numbers, that a company might gather and use in the normal course of business. Because this data can be used to identify the people who provide it, an organization must frequently take action to ensure it is not misused in a way that could embarrass, endanger or compromise the customers.

Helping organizations manage privacy risk

The NIST Privacy Framework is not a law or regulation, but rather a voluntary tool that can help organizations manage privacy risk arising from their products and services, as well as demonstrate compliance with laws that may affect them, such as the California Consumer Privacy Act and the European Union’s General Data Protection Regulation. It helps organizations identify the privacy outcomes they want to achieve and then prioritize the actions needed to do so.

“What you’ll find in the framework are building blocks that can help you achieve your privacy goals, which may include laws your organization needs to follow,” said Naomi Lefkovitz, a senior privacy policy adviser at NIST and leader of the framework effort.

“If you want to consider how to increase customer trust through more privacy-protective products or services, the framework can help you do that. But we designed it to be agnostic to any law, so it can assist you no matter what your goals are.”

Privacy application still evolving

Privacy as a basic right in the USA has roots in the U.S. Constitution, but its application in the digital age is still evolving, in part because technology itself is changing at a rapidly accelerating pace.

New uses for data pop up regularly, especially in the context of the internet of things and artificial intelligence, which together promise to gather and analyze patterns in the real world that previously have gone unrecognized. With these opportunities come new risks.

“A class of personal data that we consider to be of low value today may have a whole new use in a couple of years,” Lefkovitz said, “or you might have two classes of data that are not sensitive on their own, but if you put them together they suddenly may become sensitive as a unit. That’s why you need a framework for privacy risk management, not just a checklist of tasks: You need an approach that allows you to continually reevaluate and adjust to new risks.”

The Privacy Framework 1.0 has an overarching structure modeled on that of the widely used NIST Cybersecurity Framework, and the two frameworks are designed to be complementary and also updated over time.

Merely adopting a good security posture is not enough

Privacy and security are related but distinct concepts, Lefkovitz said, and merely adopting a good security posture does not necessarily mean that an organization is addressing all its privacy needs.

As with its draft version, the Privacy Framework centers on three sections: the Core, which offers a set of privacy protection activities; the Profiles, which help determine which of the activities in the Core an organization should pursue to reach its goals most effectively; and the Implementation Tiers, which help optimize the resources dedicated to managing privacy risk.
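
One way to picture how the sections relate: the Core is a catalog of activities, and a Profile is the subset an organization selects. The sketch below is purely illustrative; the five function names come from the published framework, but the activities and the selection logic are invented for the example.

    # The Core groups privacy activities by function; a Profile is the
    # subset an organization selects to match its goals.
    CORE = {
        "Identify-P":    ["inventory data processing", "assess privacy risk"],
        "Govern-P":      ["set privacy policies", "train workforce"],
        "Control-P":     ["manage data lifecycle", "honor access requests"],
        "Communicate-P": ["publish privacy notices"],
        "Protect-P":     ["protect data at rest and in transit"],
    }

    def build_profile(goals):
        # Purely illustrative: keep the functions that match stated goals.
        return {fn: acts for fn, acts in CORE.items() if fn in goals}

    target_profile = build_profile({"Identify-P", "Control-P", "Protect-P"})
    for fn, acts in target_profile.items():
        print(fn, "->", acts)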

The NIST authors plan to continue building on their work to benefit the framework’s users. Digital privacy risk management is a comparatively new concept, and Lefkovitz said they received many requests for clarification about the nature of privacy risk, as well as for additional supporting resources.

“People continue to yearn for more guidance on how to do privacy risk management,” she said. “We have released a companion roadmap for the framework to point the way toward more research to address current privacy challenges, and we are building a repository of guidance resources to support implementation of the framework. We hope the community of users will contribute to it to advance privacy for the good of all.”

Six trends attracting the attention of enterprise technology leaders

Organizations around the world will accelerate enterprise technology investment in 2020, leveraging digital improvements to make them more competitive, improve connections with consumers, and keep up with the increasing demands of privacy regulation and security needs.

Hyland has identified six technology trends that will drive these improvements and demand the attention of CIOs and CTOs in the coming year and beyond.

Prioritize cloud control

Organizations will opt for managed cloud services to increase security and efficiency. Because hosting solutions in a public cloud requires extensive internal oversight, CIOs and CTOs see the value in outsourcing the management and hosting of their cloud infrastructure to experts who handle:

  • Backing up data and implementing the latest security measures
  • Maintaining and updating solutions to ensure compliance with national and international regulations
  • Disaster recovery
  • Scaling solutions up or down as data needs fluctuate

Enterprise technology investment should focus on data security in 2020

The proliferation of national, international and even statewide data and privacy regulations — such as GDPR and the CCPA — is forcing organizations to rethink the way they manage and protect information.

As the stakes for companies to comply continue to rise, so will the challenge to keep up with the ever-changing regulations. Keeping data in perpetuity is no longer an option. As a result, organizations are investing in enterprise technology like content services solutions to automate document retention and records management policies.
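
As a sketch of what automating a retention policy involves, the snippet below flags documents whose retention period has lapsed. The document types and periods are invented for illustration, not drawn from any regulation or product.

    from datetime import date, timedelta

    # Hypothetical retention periods, in days, per document type.
    RETENTION = {"invoice": 7 * 365, "resume": 2 * 365, "marketing": 365}

    def due_for_disposal(doc_type, created, today):
        period = RETENTION.get(doc_type)
        if period is None:
            return False  # unknown types are kept pending review
        return today - created > timedelta(days=period)

    checked = date(2020, 1, 15)
    print(due_for_disposal("resume", date(2017, 3, 1), checked))   # True
    print(due_for_disposal("invoice", date(2019, 6, 1), checked))  # False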

Push blockchain beyond Bitcoin

As business processes generate more and more data, and digital transactions increase, the need for transparency and authentication will grow. Blockchain is an increasingly viable way to provide those assurances across industries, from higher education to mortgage lending.

Using tech to answer economic questions

The strong yet unpredictable economy will drive organizations to seek efficiencies today so they can be more nimble and competitive in the future.

In the same way manufacturers have used technology to improve efficiency on the factory floor, organizations in every industry will focus on reducing the cost and complexity of business processes by improving the efficiency of knowledge workers in the back office.

Areas such as accounts payable and other transactional departments are now looking to second- or third-generation solutions that intelligently automate processes.

Enterprise technology investment 2020: Accelerate automation

Fifty-two percent of the Fortune 500 companies that existed in the year 2000 no longer exist due to bankruptcy, mergers and acquisitions, or other reasons. This pace of change will accelerate, as the rise of intelligent automation technology will create new sources of revenue, leading to the rise of new companies and the demise of others.

Robotic process automation will allow “digital workers” to toil around the clock at blinding speeds — complementing human workers and eliminating the most tedious, repetitive manual tasks.

Machine learning and AI will augment the productivity of knowledge workers by driving more processes and making more contextual decisions, freeing up employees to focus on the highest value tasks.

Embrace tech as a customer loyalty program

With the rise of consumerization and the expectation for rapid response in every interaction, organizations are looking to speed up processes in order to improve employee and customer experiences — and thereby gain a competitive edge. Content services technology will play a critical role in organizations’ quest to deliver better experiences to the people they serve.

Budgetary, policy, workforce issues influencing DOD and intelligence community IT priorities

Information Technology spending by Department of Defense (DOD) and Intelligence Community (IC) agencies will continue to grow as they work to keep pace with the evolution of both the threat landscape and technology development, according to Deltek.

Intelligence community

The increasing sophistication of adversaries, expanding threat landscape, rapid pace of technology advancement and data proliferation continue to fuel the IC’s demand for tools and resources to meet mission objectives.

IT solutions such as cloud computing, modern data management, big data, cybersecurity and artificial intelligence are in high demand by intelligence agencies with increasingly complex national security missions.

Deltek forecasts growth in IC IT investments from $9.9 billion in FY 2019 to $11.0 billion in FY 2024 at a Compound Annual Growth Rate (CAGR) of 2.2%.

Technology as a strategic enabler

Technology innovation is a major tenet of the 2019 National Intelligence Strategy (NIS) and is critical for the IC’s ability to provide strategic intelligence, anticipatory intelligence, cyber threat intelligence, counterterrorism, counter-proliferation, and counterintelligence and security.

Budgets and IT demands are increasing

IC overall budgets and IT budgets continue to rise to provide resources to combat the increasing threat landscape. IT solutions such as cloud computing, big data analytics, artificial intelligence, robotic process automation, machine learning, cybersecurity and new cutting-edge technologies will remain in high demand by IC organizations to fulfill their missions.

Push for increased collaboration

In order to keep pace and deliver actionable intelligence information in a timely manner, IC agencies will continue efforts for broader cross-agency collaboration, coordination, information sharing, IT integration and elimination of silos.

“Artificial intelligence, cloud computing and analytics solutions are empowering the Intelligence Community to share and analyze growing amounts of information more efficiently, effectively and at a faster rate,” said Angie Petty, Senior Principal Research Analyst at Deltek.

“The volume and diversity of data and intelligence – such as imagery, geospatial and open-source – drive the Intelligence Community’s steady demand for advanced, and increasingly automated, solutions,” said Christine Fritsch, Principal Research Analyst at Deltek.

Department of Defense

The Department of Defense’s IT ecosystem is undergoing a profound transformation. The DOD continues its shift to a new cloud-based infrastructure that will eventually enable the enterprise deployment of artificial intelligence.

Military departments are working to provide capabilities to the warfighting edge using cloud-based platforms that also employ enterprise-wide big data analytics delivered via mobile devices.

Leveraging new acquisition authority and tools such as Other Transaction Agreements (OTAs), process transformation and enhanced cybersecurity capabilities, the DOD is introducing commercial technologies at an unprecedented rate.

Deltek forecasts the Department of Defense contractor-addressable IT market to show modest growth from $53.0 billion in FY 2019 to $55.7 billion in FY 2024, at a Compound Annual Growth Rate (CAGR) of 1.0%.

CIO authority driving advancement

The DOD CIO now has greater authority to determine appropriate IT modernization budget levels for the military departments, a development that should help drive common standards and interoperability.

Enterprise-IT-as-a-Service (EITaaS) is important to modernization

EITaaS initiatives are key modernization efforts enabling the DOD to make greater use of cloud computing, advanced analytics and enterprise-level artificial intelligence/machine learning.

Cybersecurity Maturity Model Certification (CMMC) required for entry

The new Pentagon CMMC, which requires contractors to obtain third-party cybersecurity certifications in order to be considered for new DOD contracts, will be a “gatekeeper” for defense contracts going forward.

“The Department of Defense’s transformation to a multi-domain operations construct will naturally drive unprecedented joint force integration across the DOD, which requires common standards, interoperability, and command and control,” said Alex Rossino, Principal Research Analyst at Deltek.

“Cybersecurity and supply chain security are top of mind for the Department of Defense leadership and Congress. Contractors should be prepared for increased scrutiny of their IT environments, as well as the potential financial impact of complying with new security requirements,” said John Slye, Advisory Research Analyst at Deltek.

What students think about university data security

Only 32% of students agree they are aware of how their institution handles their personal data, compared to 45% who disagree and 22% who neither agree nor disagree, according to a Higher Education Policy Institute (HEPI) survey of over 1,000 full-time undergraduate students.

Perceptions about university data security

Just 31% of students feel their institution has clearly explained how their personal data are used and stored, compared to 46% who disagree and 24% who neither agree nor disagree.

When students were asked whether they are concerned about rumors of university data security issues, 69% of students stated they are concerned. Around one-fifth of students (19%) are unconcerned and 12% are unsure.

65% of students say a higher education institution having a poor security reputation would have made them less likely to apply, compared to around a third (31%) who say it would have made no difference and 4% who said it would have made them more likely to apply.

Only 45% of students feel confident that their institution will keep their personal data secure and private, while 22% are not confident. A third (33%) are unsure.

93% of students agree they should have the right to view any personal information their higher education institution stores about them, 5% neither agree nor disagree and only 2% disagree.

Keeping private data private

When it comes to sharing health or wellbeing information with a student’s parents or guardians, almost half (48%) of respondents say it would be fine for institutions to do so. A further 19% said they neither agree nor disagree and a third (33%) disagree.

Comparatively, only a third (35%) of students were supportive of parents or guardians being contacted about academic performance issues at university, compared to almost half of students (48%) who are opposed and 17% who do not take a stance on this issue.

Rachel Hewitt, HEPI’s Director of Policy and Advocacy, said: “Students are required to provide large amounts of data to their universities, including personal and sensitive information. It is critical that universities are open with students about how this information will be used.

“Under a third of students feel their university has clearly explained how their data will be used and shared and under half feel confident that their data will be kept secure and private. Universities should take action to ensure students can have confidence in the security of their data.”

Michael Natzler, HEPI’s Policy Officer, said: “Students are generally willing for their data to be used anonymously to improve the experience of other students, for example on learning and mental wellbeing. Around half are even happy for information about their health or mental wellbeing to be shared with parents or guardians.

“However, when it comes to identifiable information about them as individuals, students are clear they want this data to be kept confidential between them and their institutions. It is important that universities keep students’ data private where possible and are clear with students when information must be shared more widely.”

Growing complexity is driving operational changes to privacy programs

A majority of companies are adopting a single global data protection strategy to manage evolving privacy programs, and managing the expanding ecosystem of third parties handling data has become a top priority, a TrustArc report reveals. Vendor and third-party risk assessments ranked first among privacy assessments globally, with 78 percent of U.S. respondents reporting that they now conduct them. That figure indicates the …

Inadequate data sanitization puts enterprises at risk of breaches and compliance failures

Global enterprises’ overconfidence and inadequate data sanitization are exposing organizations to the risk of data breach at a time when proper data management should be at the forefront of everything they do, according to Blancco. Nearly three quarters (73 percent) agreed that the large volume of different devices at end-of-life leaves their company vulnerable to a data security breach, while 68 percent said they were very concerned about the risk of data breach related to end-of-life …
