Operator‑billed 5G connections revenue to reach $357 billion by 2025

Operator‑billed revenue from 5G connections will reach $357 billion by 2025, rising from $5 billion in 2020, its first full year of commercial service, according to Juniper Research.

By 2025, 5G revenue is anticipated to represent 44% of global operator‑billed revenue owing to rapid migration of 4G mobile subscribers to 5G networks and new business use cases enabled by 5G technology.

However, the study identified 5G network roll-outs as highly resilient to the COVID-19 pandemic. It found that supply chain disruptions during the initial phase of the pandemic were mitigated through modified physical roll-out procedures that maintained the momentum of hardware deployments.

5G connections to generate 250% more revenue than average cellular connection

The study found that 5G uptake had surpassed initial expectations, predicting total 5G connections will surpass 1.5 billion by 2025. It also forecast that the average 5G connection will generate 250% more revenue than an average cellular connection by 2025.

Operators will apply this premium pricing to 5G connections to secure a return on their investment in new 5G-enabled services such as uRLLC (Ultra-Reliable Low-Latency Communication) and network slicing.

However, these services, alongside the high-bandwidth capabilities of 5G, will create data-intensive use cases that drive a 270% growth in data traffic generated by all cellular connections over the next five years.

Networks must increase virtualisation to handle 5G data traffic

Operators must use future launches of standalone 5G networks as an opportunity to further increase virtualisation in core networks. Failure to develop 5G network architectures that can handle increasing traffic will reduce network functionality, inevitably diminishing the value proposition of an operator’s 5G network amongst end users.

Research author Sam Barker remarked: “Operators will compete on 5G capabilities, in terms of bandwidth and latency. A lesser 5G offering will lead to user churn to competing networks and missed opportunities in operators’ fastest-growing revenue stream.”

How tech trends and risks shape organizations’ data protection strategy

Trustwave released a report which depicts how technology trends, compromise risks and regulations are shaping how organizations’ data is stored and protected.

Data protection strategy

The report is based on a recent survey of 966 full-time IT professionals who are cybersecurity decision makers or security influencers within their organizations.

Over 75% of respondents work in organizations with over 500 employees in key geographic regions including the U.S., U.K., Australia and Singapore.

“Data drives the global economy yet protecting databases, where the most critical data resides, remains one of the least focused-on areas in cybersecurity,” said Arthur Wong, CEO at Trustwave.

“Our findings illustrate organizations are under enormous pressure to secure data as workloads migrate off-premises, attacks on cloud services increase and ransomware evolves. Gaining complete visibility of data either at rest or in motion and eliminating threats as they occur are top cybersecurity challenges all industries are facing.”

More sensitive data moving to the cloud

The types of data organizations are moving into the cloud have become increasingly sensitive, so a solid data protection strategy is crucial. Ninety-six percent of all respondents stated they plan to move sensitive data to the cloud over the next two years, with 52% planning to include highly sensitive data; Australia, at 57%, leads the regions surveyed.

Not surprisingly, when asked to rate the importance of securing data in digital transformation initiatives, respondents gave an average score of 4.6 out of a possible five.

Hybrid cloud model driving digital transformation and data storage

Of those surveyed, most (55%) use both on-premises and public cloud to store data, while 17% use public cloud only. Singapore organizations use the hybrid cloud model most frequently at 73%, 18 percentage points above the average, while U.S. organizations employ it the least at 45%.

Government respondents are the most likely to store data on-premises only, at 39%, 11 percentage points above the average. Additionally, 48% of respondents stored data using the hybrid cloud model during a recent digital transformation project, with only 29% relying solely on their own databases.

Most organizations use multiple cloud services

Seventy percent of organizations surveyed were found to use between two and four public cloud services and 12% use five or more. At 14%, the U.S. had the most instances of using five or more public cloud services followed by the U.K. at 13%, Australia at 9% and Singapore at 9%. Only 18% of organizations queried use zero or just one public cloud service.

Perceived threats do not match actual incidents

Thirty-eight percent of organizations are most concerned about malware and ransomware, followed by phishing and social engineering at 18%, application threats at 14%, insider threats at 9%, privilege escalation at 7% and misconfiguration attacks at 6%.

Interestingly, when asked about actual threats experienced, phishing and social engineering came in first at 27%, followed by malware and ransomware at 25%. The U.K. and Singapore experienced the most phishing and social engineering incidents at 32% and 31% respectively, while the U.S. and Australia experienced the most malware and ransomware attacks at 30% and 25%.

Respondents in the government sector had the highest incidence of insider threats at 13%, five percentage points above the average.

Patching practices show room for improvement

A resounding 96% of respondents have patching policies in place; of those, 71% rely on automated patching and 29% employ manual patching. Overall, 61% of organizations patched within 24 hours and 28% patched within 24 to 48 hours.

The highest percentage patching within a 24-hour window came from Australia at 66% and the U.K. at 61%. Unfortunately, 4% of organizations took a week to over a month to patch.

Reliance on automation driving key security processes

In addition to a high percentage of organizations using automated patching processes, findings show 89% of respondents employ automation to check for overprivileged users or lock down access credentials once an individual has left their job or changed roles.

This finding correlates with the survey’s low reported concern about insider threats and data compromise via privilege escalation. Organizations should be cautious, however, about assuming that removing a user’s access to applications also removes their access to databases, which is often not the case.
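
A toy audit makes that gap concrete. This is a minimal sketch with illustrative data: the two rosters below are hypothetical stand-ins for an HR/IdP feed and a database user catalog.

```python
# Hedged sketch: deprovisioning often covers apps but misses database accounts.
# The rosters are illustrative stand-ins for an HR/IdP feed and a database catalog.
active_employees = {"asmith", "bjones", "cwu"}           # e.g. pulled from the IdP
database_accounts = {"asmith", "bjones", "cwu", "dlee"}  # e.g. from pg_catalog.pg_user

# Any database account with no active employee behind it should be reviewed/revoked
for account in sorted(database_accounts - active_employees):
    print(f"REVOKE candidate: database account '{account}' maps to no active employee")
```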

Data regulations having minor impact on database security strategies

When asked if data regulations such as GDPR and CCPA impacted database security strategies, a surprising 60% of respondents said no.

These findings may suggest a lack of alignment between information technology and other departments, such as legal, responsible for helping ensure stipulations like ‘the right to be forgotten’ are properly enforced to avoid severe penalties.

Small teams with big responsibilities

Of those surveyed, 47% had a security team of only six to 15 members. Respondents from Singapore had the smallest teams, with 47% reporting between one and ten members, while the U.S. had the largest, with 22% reporting teams of 21 or more, two percentage points above the average.

Surprisingly, 32% of government respondents run security operations with teams of just six to ten members.

Most US states show signs of a vulnerable election-related infrastructure

Leading up to the presidential election, 75% of the 56 U.S. states and territories showed signs of a vulnerable IT infrastructure, a SecurityScorecard report reveals.

Since most state websites offer access to voter and election information, these findings may indicate unforeseen issues leading up to, and following, the US election.

Election infrastructure: High-level findings

Seventy-five percent of U.S. states and territories have an overall cyberhealth rating of ‘C’ or below; 35% are rated ‘D’ or below. Based on a three-year SecurityScorecard study of historical data, states with a grade of ‘C’ are 3x more likely to experience a breach (or an incident such as ransomware) than those with an ‘A’. Those with a ‘D’ are nearly 5x more likely to experience a breach.

  • States with the highest scores: Kentucky (95), Kansas (92), Michigan (92)
  • States with the lowest scores: North Dakota (59), Illinois (60), Oklahoma (60)
  • Among states and territories, there are as many ‘F’ scores as there are ‘A’s
  • The Pandemic Effect: Many states’ scores have dropped significantly since January. For example, North Dakota scored a 72 in January and now has a 59. Why? Remote work mandates gave state networks a larger attack surface (e.g., thousands of state workers on home Wi-Fi), making it more difficult to ensure employees are using up-to-date software.

Significant security concerns were observed with two critically important “battleground” states, Iowa and Ohio, both of which scored a 68, or a ‘D’ rating.

The battleground states

According to political experts, the following states are considered “battleground” states and will help determine the result of the election. But over half show weaknesses in their overall IT infrastructure:

  • Michigan: 92 (A)
  • North Carolina: 81 (B)
  • Wisconsin: 88 (B)
  • Arizona: 81 (B)
  • Texas: 85 (B)
  • New Hampshire: 77 (C)
  • Pennsylvania: 85 (B)
  • Georgia: 77 (C)
  • Nevada: 74 (C)
  • Iowa: 68 (D)
  • Florida: 73 (C)
  • Ohio: 68 (D)

“The IT infrastructure of state governments should be of critical importance to securing election integrity,” said Alex Heid, Chief Research & Development Officer at SecurityScorecard.

“This is especially true in ‘battleground states’ where the Department of Homeland Security, political parties, campaigns, and state government officials should enforce vigilance through continuously monitoring state voter registration networks and web applications for the purpose of mitigating incoming attacks from malicious actors.

“The digital storage and transmission of voter registration and voter tally data needs to remain flawlessly intact. Some states have been doing well regarding their overall cybersecurity posture, but the vast majority have major improvements to make.”

Potential consequences of lower scores

  • Targeted phishing/malware delivery via e-mail and other mediums, potentially as a means to both infect networks and spread misinformation. Malicious actors often sell access to organizations they have successfully infected.
  • Attacks via third-party vendors – many states use the same vendors, so access into one could mean access to all. This is the top cybersecurity concern for political campaigns.
  • Voter registration databases could be impacted. In the worst-case scenario, attackers could remove voter registrations or change voter precinct information or make crucial systems entirely unavailable on Election Day through ransomware.

“These poor scores have consequences that go beyond elections; the findings show chronic underinvestment in IT by state governments,” said Rob Knake, the former director for cybersecurity policy at the White House in the Obama Administration.

“For instance, combatting COVID-19 requires the federal government to rely on the apparatus of the states. It suggests the need for a massive influx of funds as part of any future stimulus to refresh state IT systems to not only ensure safe and secure elections, but save more lives.”

A set of best practices for states

  • Create dedicated voter and election-specific websites under the official state domain, rather than using alternative domain names, which can be subjected to typosquatting (a naive check is sketched after this list)
  • Have an IT team specifically tasked and accountable for bolstering voter and election website cybersecurity: defined as confidentiality, integrity, and availability of all processed information
  • States should establish clear lines of authority for updating the information on these sites that includes the ‘two-person’ rule — no single individual should be able to update information without a second person authorizing it
  • States and counties should continuously monitor the cybersecurity exposure of all assets associated with election systems, and ensure that vendors supplying equipment and services to the election process undergo stringent processes
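
On the typosquatting point above, a short sketch illustrates the idea. The domain is hypothetical, the variant rules cover only single-character deletions and transpositions, and a DNS answer is only a rough signal that a variant is registered.

```python
# Naive typosquat check: generate one-character deletion/transposition variants of a
# domain label and see whether they resolve in DNS.
import socket

def typo_variants(domain):
    label, _, rest = domain.partition(".")
    variants = set()
    for i in range(len(label)):
        variants.add(label[:i] + label[i + 1:] + "." + rest)  # deletion
        if i < len(label) - 1:
            swapped = label[:i] + label[i + 1] + label[i] + label[i + 2:]
            variants.add(swapped + "." + rest)                # transposition
    variants.discard(domain)
    return variants

def resolves(domain):
    try:
        socket.gethostbyname(domain)
        return True
    except socket.gaierror:
        return False

for candidate in sorted(typo_variants("vote.example.gov")):  # hypothetical official site
    if resolves(candidate):
        print("possible typosquat:", candidate)
```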

SaaS adoption prompting concerns over operational complexity and risk

A rise in SaaS adoption is prompting concerns over operational complexity and risk, a BetterCloud report reveals.

Since 2015, the number of IT-sanctioned SaaS apps has increased tenfold, and it’s expected that by 2025, 85 percent of business apps will be SaaS-based. With SaaS on the rise, only 49 percent of respondents are confident in their ability to identify and monitor unsanctioned SaaS usage on company networks, yet 76 percent see unsanctioned apps as a security risk.

And when asked which SaaS applications are likely to hold the most sensitive data across an organization, respondents believe it is all of them, including cloud storage, email, devices, chat apps and password managers.

Concerns when managing SaaS environments

Respondents also highlighted slow, manual management tasks as a prime concern when managing SaaS environments. IT organizations spend over 7 hours offboarding a single employee from a company’s SaaS apps, which takes time and energy from more strategic projects.

“In the earlier part of the year, organizations around the world were faced with powering their entire workforces from home and turned to SaaS to make the shift with as little disruption to productivity as possible,” said David Politis, CEO, BetterCloud.

“Up until this point, most companies were adopting a cloud-first approach for their IT infrastructure — that strategy has now shifted to cloud only. But SaaS growth at this scale has also brought about challenges as our 2020 State of SaaSOps report clearly outlines.

“The findings also show increased confidence in and reliance on SaaSOps as the path forward to reining in SaaS management and security.”

SaaS adoption risk: Key findings

  • On average, organizations use 80 SaaS apps today. This is a 5x increase in just three years and a 10x increase since 2015.
  • The top two motivators for using more SaaS apps are increasing productivity and reducing costs.
  • Only 49 percent of IT professionals express confidence in their ability to identify and monitor unsanctioned SaaS usage on company networks, yet more than three-quarters (76 percent) see unsanctioned apps as a security risk.
  • The top five places where sensitive data lives are: 1. files stored in cloud storage, 2. email, 3. devices, 4. chat apps, and 5. password managers. But because SaaS apps have become the system of record, sensitive data inevitably lives everywhere in your SaaS environment.
  • The top two security concerns are sensitive files shared publicly and former employees retaining data access.
  • IT teams spend an average of 7.12 hours offboarding a single employee from a company’s SaaS apps.
  • Thirty percent of respondents already use the term SaaSOps in their job title or plan to include it soon.

For the report, BetterCloud surveyed nearly 700 IT leaders and security professionals from the world’s leading enterprise organizations. These individuals ranged in seniority from C-level executives to front-line practitioners and included both IT and security department roles.

How do I select a data storage solution for my business?

We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices. Unfortunately, 58% of organizations make decisions based on outdated data.

While enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads.

To select a suitable data storage solution for your business, you need to think about a variety of factors. We’ve talked to several industry leaders to get their insight on the topic.

Phil Bullinger, SVP and General Manager, Data Center Business Unit, Western Digital

Selecting the right data storage solution for your enterprise requires evaluating and balancing many factors. The most important is aligning the performance and capabilities of the storage system with your critical workloads and their specific bandwidth, application latency and data availability requirements. For example, if your business wants to gain greater insight and value from data through AI, your storage system should be designed to support the accelerated performance and scale requirements of analytics workloads.

Storage systems that maximize the performance potential of solid state drives (SSDs) and the efficiency and scalability of hard disk drives (HDDs) provide the flexibility and configurability to meet a wide range of application workloads.

Your applications should also drive the essential architecture of your storage system, whether directly connected or networked, whether required to store and deliver data as blocks, files, objects or all three, and whether the storage system must efficiently support a wide range of workloads while prioritizing the performance of the most demanding applications.

Consideration should be given to your overall IT data management architecture to support the scalability, data protection, and business continuity assurance required for your enterprise, spanning from core data centers to those distributed at or near the edge and endpoints of your enterprise operations, and integration with your cloud-resident applications, compute and data storage services and resources.

Ben Gitenstein, VP of Product Management, Qumulo

When searching for the right data storage solution to support your organizational needs today and in the future, it’s important to select a solution that is trusted, scalable to secure demanding workloads of any size, and ensures optimal performance of applications and workloads both on premises and in complex, multi-cloud environments.

With the recent pandemic, organizations are digitally transforming faster than ever before, and leveraging the cloud to conduct business. This makes it more important than ever that your storage solution has built in tools for data management across this ecosystem.

When evaluating storage options, be sure to do your homework and ask the right questions. Is it a trusted provider? Would it integrate well within my existing technology infrastructure? Your storage solution should be easy to manage and meet the scale, performance and cloud requirements for any data environment and across multi-cloud environments.

Also, be sure the storage solution gives IT control in how they manage storage capacity needs and delivers real-time insight into analytics and usage patterns so they can make smart storage allocation decisions and maximize an organizations’ storage budget.

David Huskisson, Senior Solutions Manager, Pure Storage

Data backup and disaster recovery features are critically important when selecting a storage solution for your business, as no organization is immune to ransomware attacks. When systems go down, they need to be recovered as quickly and safely as possible.

Look for solutions that offer simplicity in management, can ensure backups are viable even when admin credentials are compromised, and can be restored quickly enough to greatly reduce major organizational or financial impact.

Storage solutions that are purpose-built to handle unstructured data are a strong place to start. By definition, unstructured data means unpredictable data that can take any form, size or shape, and can be accessed in any pattern. These capabilities can accelerate small, large, random or sequential data, and consolidate a wide range of workloads on a unified fast file and object storage platform. It should maintain its performance even as the amount of data grows.

If you have an existing backup product, you don’t need to rip and replace it. There are storage platforms with robust integrations that work seamlessly with existing solutions and offer a wide range of data-protection architectures so you can ensure business continuity amid changes.

Tunio Zafer, CEO, pCloud

Bear in mind: your security team needs to assist. Answer these questions to find the right solution: Do you need ‘cold’ storage or cloud storage? If you’re looking to only store files for backup, you need a cloud backup service. If you’re looking to store, edit and share, go for cloud storage. Where are the provider’s storage servers located? If your business is located in Europe, the safest choice is a storage service based in Europe.

Best case scenario – your company is going to grow. Look for a storage service that offers scalability. What is their data privacy policy? Research whether someone can legally access your data without your knowledge or consent. Switzerland has one of the strictest data privacy laws globally, so choosing a Swiss-based service is a safe bet. How is your data secured? Look for a service that offers robust encryption in-transit and at-rest.

Client-side encryption means that your data is secured on your device and is transferred already encrypted. What is their support package? At some point, you’re going to need help. A data storage service whose support package is included for free and answers within 24 hours is preferred.
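
As a minimal sketch of that client-side encryption point, here is the idea using the Python `cryptography` package: the data is encrypted on the device, and only ciphertext ever reaches the storage provider. The file contents are illustrative.

```python
# Minimal client-side encryption sketch using the `cryptography` package
# (pip install cryptography). The key stays on your device; the provider
# only ever stores ciphertext.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # keep this secret and local
cipher = Fernet(key)

ciphertext = cipher.encrypt(b"contents of quarterly-financials.xlsx")
# ...upload `ciphertext` to the storage service...

# After downloading it back, only the key holder can recover the plaintext:
assert cipher.decrypt(ciphertext) == b"contents of quarterly-financials.xlsx"
```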

Priorities and technologies defining the mainframe in the digital enterprise

There is overwhelming support for mainstreaming the mainframe, new strategic priorities, and a resurgence of next-generation mainframe talent, according to a BMC survey.

The study queried over 1,000 executives and practitioners on their priorities, challenges, and growth opportunities for the platform. High-level insights include:

  • 90% of respondents see the mainframe as a platform for growth and long-term applications.
  • 68% expect MIPS, the mainframe’s measure of computing performance, to grow.
  • 63% of respondents say security and compliance are their top mainframe priorities.
  • More than half of survey respondents increased mainframe platform data and transaction volume by 25% or more, signaling its ongoing importance in the digital business environment.

“The Mainframe Survey validates that businesses see the mainframe as a critical component of the modern digital enterprise and an emerging hub for innovation,” says Stephen Elliot, Program VP, Management Software and DevOps, IDC.

“They’re putting it to work more and more to support digital business demands as they strive to achieve greater agility and success across the enterprise.”

Top mainframe priorities

With mainframe enterprises competing to bring new, digital experiences to market to delight customers, the survey’s themes are resoundingly strong: adapt, automate, and secure.

Adapt – responses indicated that enterprises’ need to adapt spanned several areas:

  • New processes to keep up with digital demand.
  • Technology demands such as application development/DevOps across the mainframe; 78% of respondents want to be able to update mainframe applications more frequently than currently possible.
  • Changing workforce, as the number of next generation mainframe talent increases along with the number of women working on the platform.

Automate – mainframe modernization continues to play a key role in priorities among respondents with the need to implement AI and machine learning strategies jumping by 8% year over year.

Secure – while the mainframe has a reputation of being a naturally secure platform, respondents are seeing the growing need to fortify its “walls.” Security trumped cost optimization as the leading mainframe priority among respondents for the first time in the 15-year history of the survey.

“Early results were shared with leading industry analysts and key customers from our Mainframe Executive Council in order to validate findings with market sentiment,” states John McKenny, SVP of Mainframe Innovation and Strategy at BMC.

“These conversations further solidified the study’s findings that the platform’s positive outlook and growth is largely due to the need to create intuitive, customer-centric digital experiences. The mainframe continues to shine as innovative, agile, and secure and is a vital component to digital success.”

Workforce demographic shifts

The survey revealed demographic shifts in mainframe operations, as younger, less experienced staff replace departing senior staff, and recorded a higher proportion of women respondents than last year.

AWS adds new S3 security and access control features

Amazon Web Services (AWS) has made available three new S3 (Simple Storage Service) security and access control features:

  • Object Ownership
  • Bucket Owner Condition
  • Copy API via Access Points

Object Ownership

Object Ownership is a bucket-level setting that enforces the transfer of ownership of newly created objects in a bucket onto the bucket owner.

“With the proper permissions in place, S3 already allows multiple AWS accounts to upload objects to the same bucket, with each account retaining ownership and control over the objects. This many-to-one upload model can be handy when using a bucket as a data lake or another type of data repository. Internal teams or external partners can all contribute to the creation of large-scale centralized resources,” explained Jeff Barr, Chief Evangelist for AWS.

But with this setup, the bucket owner doesn’t have full control over the objects in the bucket and therefore cannot use bucket policies to share and manage them. If the object uploader needs to retain access after ownership is transferred, the bucket owner will need to grant additional permissions to the uploading account.

“Keep in mind that this feature does not change the ownership of existing objects. Also, note that you will now own more S3 objects than before, which may cause changes to the numbers you see in your reports and other metrics,” Barr added.
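
As a hedged boto3 sketch (bucket and key names are illustrative), enabling the setting and uploading with the bucket-owner-full-control ACL might look like this:

```python
# Hedged sketch (boto3): turn on Object Ownership so new uploads made with the
# bucket-owner-full-control ACL end up owned by the bucket owner.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_ownership_controls(
    Bucket="shared-data-lake",  # hypothetical bucket
    OwnershipControls={"Rules": [{"ObjectOwnership": "BucketOwnerPreferred"}]},
)

# Cross-account uploaders then grant the ACL so ownership transfers on write:
s3.put_object(
    Bucket="shared-data-lake",
    Key="partner/report.csv",
    Body=b"...",
    ACL="bucket-owner-full-control",
)
```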

Bucket Owner Condition

Bucket Owner Condition allows users to verify that a bucket belongs to the expected AWS account when they create a new object or perform other S3 operations.

AWS recommends using Bucket Owner Condition whenever users perform a supported S3 operation and know the account ID of the expected bucket owner.

The feature eliminates the risk of users accidentally interacting with buckets in the wrong AWS account. For example, it prevents situations like applications writing production data into a bucket in a test account.
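
In boto3 this surfaces as the ExpectedBucketOwner parameter; a sketch with placeholder names:

```python
# Hedged sketch (boto3): fail fast if the bucket is not owned by the expected account.
# Account ID, bucket and key are placeholders.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
try:
    s3.put_object(
        Bucket="prod-telemetry",             # hypothetical bucket
        Key="events/2020-10-08.json",
        Body=b"{}",
        ExpectedBucketOwner="111122223333",  # the account that *should* own the bucket
    )
except ClientError as err:
    # A mismatch is rejected instead of silently writing to the wrong account's bucket
    print("bucket owner mismatch or access denied:", err)
```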

Copy API via Access Points

S3 Access Points are “unique hostnames that customers create to enforce distinct permissions and network controls for any request made through the access point. Customers with shared data sets […] can easily scale access for hundreds of applications by creating individualized access points with names and permissions customized for each application.”

The feature can now be used together with the S3 CopyObject API, allowing customers to copy data to and from access points within an AWS Region.
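
A hedged sketch of such a copy, with placeholder access point ARNs; note the /object/ prefix in CopySource when addressing an access point:

```python
# Hedged sketch (boto3): copy an object between two S3 Access Points in one Region.
# The ARNs stand in for access points two applications would actually use.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
src_ap = "arn:aws:s3:us-east-1:111122223333:accesspoint/ingest-ap"
dst_ap = "arn:aws:s3:us-east-1:111122223333:accesspoint/analytics-ap"

s3.copy_object(
    Bucket=dst_ap,                                      # destination access point ARN
    Key="curated/dataset.parquet",
    CopySource=f"{src_ap}/object/raw/dataset.parquet",  # /object/ prefix for APs
)
```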

How important is monitoring in DevOps?

The importance of monitoring is often left out of discussions about DevOps, but a Gartner report shows how it can lead to superior customer experiences.

The report provides the following key recommendations:

  • Work with DevOps teams during the design phase to add the instrumentation necessary to track business key performance indicators and monitor business metrics in production.
  • Automate the transmission of embedded monitoring results between monitoring and deployment tools to improve application deployments.
  • Use identified business requirements to develop a pipeline for delivering new functionality, and elevate monitoring into a practice of continuous learning and feedback across stakeholders and product managers.

While the report focuses on application monitoring, the benefits of early DevOps integration apply equally to database monitoring, according to Grant Fritchey, Redgate DevOps Advocate and Microsoft Data Platform MVP: “In any DevOps pipeline, the database is often the pain point because you need to update it alongside the application while keeping data safe. Monitoring helps database developers identify and fix issues earlier, and minimizes errors when changes are deployed.”

Optimizing performance before releases hit production

Giving development teams access to live monitoring data during database development and testing, for example, can help them optimize performance before releases hit production. They can see immediately if their changes introduce operational or performance issues, and drill down to the cause.

Similarly, database monitoring tools can be configured to read and report on deployments made to any server and automatically deliver an alert back to the development team if a problem arises, telling them what happened and how to fix the issue.

This continuous feedback loop not only reduces time spent manually checking for problems, but speeds up communication between database development and operational teams. Most importantly, this activity all takes place on non-production environments, meaning fewer bad customer experiences when accessing production data.
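
What such an alert hook could look like is sketched below; `fetch_error_rate` and the webhook URL are hypothetical stand-ins for whatever monitoring and chat tools a team actually runs.

```python
# Hedged sketch of a deployment feedback hook: post an alert to the dev team's
# channel when the error rate spikes after a tracked deployment.
import requests

WEBHOOK = "https://chat.example.com/hooks/db-dev-team"  # hypothetical team channel

def fetch_error_rate(server, deploy_id):
    """Placeholder for the monitoring tool's API; returns errors/min since deploy."""
    return 7.2  # simulated reading so the sketch runs end to end

def alert_on_regression(server, deploy_id, threshold=5.0):
    rate = fetch_error_rate(server, deploy_id)
    if rate > threshold:
        requests.post(WEBHOOK, json={
            "text": f"{deploy_id} on {server}: {rate:.1f} errors/min exceeds {threshold}"
        })

alert_on_regression("sql-prod-01", "deploy-2048")
```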

This increased focus on monitoring is prompting many high performing DevOps teams to introduce third-party tools which offer more advanced features like the ability to integrate with the most popular deployment, alerting and ticketing tools.

The advantages

A good example is the financial services sector. Redgate’s report revealed that 66% of businesses in the sector now use a third-party monitoring tool, outpacing all other sectors. And while 61% of businesses deploy database changes once a week or more, compared to 43% across other sectors, issues with deployments are detected faster and recovered from sooner.

The Gartner report states: “By enabling faster recognition and response to issues, monitoring improves system reliability and overall agility, which is a primary objective for new DevOps initiatives.”

Many organizations are discovering there are big advantages in including the database in the monitoring conversation as well.

NIST crowdsourcing challenge aims to de-identify public data sets to protect individual privacy

NIST has launched a crowdsourcing challenge to spur new methods to ensure that important public safety data sets can be de-identified to protect individual privacy.

The Differential Privacy Temporal Map Challenge includes a series of contests that will award a total of up to $276,000 for differential privacy solutions for complex data sets that include information on both time and location.

Critical applications vulnerability

For critical applications such as emergency planning and epidemiology, public safety responders may need access to sensitive data, but sharing that data with external analysts can compromise individual privacy.

Even if data is anonymized, malicious parties may be able to link the anonymized records with third-party data and re-identify individuals. And, when data has both geographical and time information, the risk of re-identification increases significantly.

“Temporal map data, with its ability to track a person’s location over a period of time, is particularly helpful to public safety agencies when preparing for disaster response, firefighting and law enforcement tactics,” said Gary Howarth, NIST prize challenge manager.

“The goal of this challenge is to develop solutions that can protect the privacy of individual citizens and first responders when agencies need to share data.”

Protecting PII

Differential privacy provides much stronger data protection than anonymity; it’s a provable mathematical guarantee that protects personally identifiable information (PII).

By fully de-identifying data sets containing PII, researchers can ensure data remains useful while limiting what can be learned about any individual in the data regardless of what third-party information is available.
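
The core mechanism can be shown in a few lines. A minimal sketch, assuming a simple count query (sensitivity 1) answered with the Laplace mechanism; the data is a toy example:

```python
# Laplace mechanism for an epsilon-differentially-private count.
# A count changes by at most 1 when one person joins or leaves (sensitivity = 1),
# so Laplace noise with scale 1/epsilon yields epsilon-DP.
import numpy as np

def dp_count(flags, epsilon):
    return sum(flags) + np.random.laplace(scale=1.0 / epsilon)

records = [True] * 40 + [False] * 60   # toy data: who matches a sensitive predicate
print(dp_count(records, epsilon=0.5))  # useful in aggregate, hides any individual
```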

The individual contests that make up the challenge will include a series of three “sprints” in which participants develop privacy algorithms and compete for prizes, as well as a scoring metrics development contest (A Better Meter Stick for Differential Privacy Contest) and a contest designed to improve the usability of the solvers’ source code (The Open Source and Development Contest).

The Better Meter Stick for Differential Privacy Contest will award a total prize purse of $29,000 for winning submissions that propose novel scoring metrics by which to assess the quality of differentially private algorithms on temporal map data.

The three Temporal Map Algorithms sprints will award a total prize purse of $147,000 over a series of three sprints to develop algorithms that preserve data utility of temporal and spatial map data sets while guaranteeing privacy.

The Open Source and Development Contest will award a total prize purse of $100,000 to teams leading in the sprints to increase their algorithm’s utility and usability for open source audiences.

Average data queries take too long, yet organizations need daily data insights to make decisions

58% of organizations make decisions based on outdated data, according to an Exasol research.

The report reveals that 84% of organizations are under increasing pressure to make faster decisions as a result of the COVID-19 pandemic, yet 58% of organizations lack access to real-time insights.

The report further reveals that 63% of respondents confirm that daily insights are needed to make informed business decisions, but these are hampered by long query run times.

A query taking too long to come back

75% of respondents have to wait between 2 hours and a full day for a query to come back, and only 15% of respondents’ query run times are between 15 and 60 minutes. 56% believe they can’t make informed decisions based on their organization’s data.

“As a healthcare, retail, or financial services business you cannot afford to make decisions based on yesterday’s data,” said Rishi Diwan, CPO of Exasol.

“If the pandemic has made one thing clear it’s that business conditions can turn on a dime, yet 6 in 10 businesses find themselves saddled with decision-making infrastructure that is just not responsive enough.”

The report is based on a global survey of 2,500 data decision makers and reveals ample pessimism among data and IT professionals regarding the extent to which current infrastructure set-ups can power a crisis recovery. According to the research:

  • 51% believe their organization’s data infrastructure will need improvements in order to help them recover from macro or micro economic challenges.
  • The top areas highlighted for performance improvement include data literacy (84%), data infrastructure (55%) and data quality (33%). However, 85% report action being taken to improve literacy across the business, which is an encouraging sign.
  • Of the 36% of organizations that have increased the size of their decision-making teams during the COVID-19 pandemic to compensate for the long time-to-insight, 86% have experienced an increase in decision-making speed.
  • 69% of respondents reported receiving a higher number of data analytics requests from both multiple business departments and their end-users in recent months.

Demand for data analytics will continue to rise

Going forward, 45% of respondents agreed that demand for data analytics will continue to rise. While the bulk of these requests is expected to come from marketing, operations and sales, demand from all areas is expected to increase, adding to the urgency for organizations to review their data-driven decision-making capabilities.

“One way that organizations compensate for the long time-to-insights during the COVID-19 pandemic is by expanding the number of people with decision-making authority,” said Mathias Golombek, CTO at Exasol.

“Our research clearly shows that organizations want to increase their speed and agility regarding data-driven decisions. Data-democratization and self-service analytics across the organization are the ultimate goal, but existing legacy systems are struggling with these workloads. That’s where a reduction of query response times from hours to seconds is a game changer.”

“If you want to evolve towards a data-driven agile enterprise, you need to start with your existing data infrastructure. Not only must it be set up to support your future growth, but it should also enable data democratization,” said Philip Howard, Bloor Research.

“You should also look at whether your infrastructure can deliver the time to insight – the performance – that you need. Can it scale across all your knowledge workers? Because if it doesn’t do all of these things, then it’s not supporting your business goals and you need to think about changing it.”

70% of consumers would cut ties with doctors over unprotected health data

There are growing privacy concerns among Americans due to COVID-19, with nearly 70 percent saying they would likely sever ties with their healthcare provider if they found that their personal health data was unprotected, a CynergisTek survey reveals.

Privacy concerns

And as many employers seek to welcome staff back into physical workplaces, nearly half (45 percent) of Americans expressed concerns about keeping personal health information private from their employer.

“With the enactment of key regulations including CCPA and GDPR, we are seeing the convergence of security and privacy come to the forefront at national, state and corporate levels.

“As healthcare systems and corporations continue to grapple with data challenges associated with COVID-19 – whether that’s more sophisticated, targeted cyber-attacks or the new requirements around interoperability and data sharing, concerns around personal data and consumer awareness of privacy rights will only continue to grow,” said Caleb Barlow, president and CEO of CynergisTek.

Patients contemplate cutting ties over unprotected health data

While many still assume personal data is under lock and key, 18 percent of Americans are beginning to question whether personal health data is being adequately protected by healthcare providers. In fact, 47.5 percent stated they were unlikely to use telehealth services again should a breach occur, sounding the alarm for a burgeoning telehealth industry predicted to be worth over $260B by 2026.

While 3 out of 4 Americans still largely trust their data is properly protected by their healthcare provider, tolerance is beginning to wane, with 67 percent stating they would change providers if they found that their data was not properly protected. When drilling deeper into certain age groups and health conditions, the survey also found that:

  • Gen X (73 percent) and Millennials (70 percent) proved even less tolerant than other demographics, being more willing to part ways with their providers over unprotected health data.
  • 66 percent of Americans living with chronic health conditions stated they would be willing to change up care providers should their data be compromised.

Data shows that health systems that have not invested the time, money and resources to keep pace with the ever-changing threat landscape are falling behind. Of the nearly 300 healthcare facilities assessed, fewer than half met NIST Cybersecurity Framework guidelines.

Concern about sharing COVID-19 health data upon returning to work

As pressures mount for returning employees to disclose COVID-19 health status and personal interactions, an increasing conflict between ensuring public health safety and upholding employee privacy is emerging.

This is increasingly evident, with 45 percent stating a preference to keep personal health information private from their employer and over 1 in 3 expressing concerns about sharing COVID-19-specific health data such as temperature checks. This suggests that office reopenings may prove more complicated than anticipated.

“The challenges faced by both healthcare providers and employers during this pandemic have seemed insurmountable at times, but the battle surrounding personal health data and privacy is a challenge we must rise to,” said Russell P. Branzell, president and CEO of the College of Healthcare Information Management Executives.

“With safety and security top of mind for all, it is imperative that these organizations continue to take the necessary steps to fully protect this sensitive data from end to end, mitigating any looming cyberthreats while creating peace of mind for the individual.”

Beyond unwanted employer access to personal data, the survey found that nearly 60 percent of respondents expressed anxieties around their employer sharing personal health data externally to third parties such as insurance companies and employee benefit providers without consent.

This stands in stark contrast to Accenture’s recent survey, which found that 62 percent of C-suite executives were exploring new tools to collect employee data. It is a reminder to employers to tread lightly when mandating employee health protocols and questionnaires.

“COVID-19 has thrown many curveballs at both healthcare providers and employers, and the privacy and protection of critical patient and employee data must not be ignored,” said David Finn, executive VP of strategic innovation of CynergisTek.

“By getting ahead of the curve and implementing system-wide risk posture assessments and ensuring employee opt-in/opt-out functions when it comes to sharing personal data, these organizations can help limit these privacy and security risks.”

GRC teams have a number of challenges meeting regulatory demands

Senior risk and compliance professionals within financial services companies lack confidence in the security data they are providing to regulators, according to Panaseer.

Results from a global external survey of over 200 GRC leaders reveal concerns about data accuracy, request overload, resource-heavy processes and a lack of end-to-end automation.

The results indicate a wider issue with cyber risk management. If GRC leaders don’t have confidence in the accuracy and timeliness of security data provided to regulators, then the same holds true for the confidence in their own ability to understand and combat cyber risks.

Only 41% of risk leaders feel ‘very confident’ that they can fulfill the security-related requests of a regulator in a timely manner, and just 27.5% are ‘very satisfied’ that their organization’s security reports align with regulatory compliance needs.

GRC leaders cited their top challenges in fulfilling regulator requests, as:

  • Getting access to accurate data (35%)
  • The number of report requests (29%)
  • The length of time it takes to get information from the security team (26%)

The limitations of traditional GRC tools

The issue has been perpetuated by the limitations of traditional GRC tools, which rely on qualitative questionnaires to provide evidence of compliance. These do not reflect the current challenges posed by cyber threats.

92% of senior risk and compliance professionals believe it would be valuable to have quantitative security controls assurance reporting (vs qualitative) and 93.5% believe it’s important to automate security risk and compliance reporting. However, only 11% state that their risk and compliance reporting is currently automated end to end.

96% said it is important to prioritize security risk remediation based on its impact on the business, but most can’t isolate risk to critical business processes composed of people, applications and devices. Only 33.5% of respondents are ‘very confident’ in their ability to understand all their asset inventories.

Charaka Goonatilake, CTO, Panaseer: “Faced with increasing requests from regulators, GRC leaders have resorted to throwing a lot of people at time-sensitive requests. These manual processes combined with lack of GRC tool scalability necessitates data sampling, which means they cannot have complete visibility or full confidence in the data they are providing.

“The challenge is being exacerbated by new risks introduced by IoT sensors and endpoints, which rarely consider security a core requirement and therefore introduce greater risk and increase the importance of controls and mitigations to address them.”

Andreas Wuchner, Panaseer Advisory Board member: “Facing the new reality of cyberthreats and regulatory pressures requires many organizations to fundamentally rethink traditional tools and defences.

“GRC leaders can enhance their confidence to accurately and quickly meet stakeholder needs by implementing Continuous Controls Monitoring, an emerging category of security and risk, which has just been recognised in the 2020 Gartner Risk Management Hype Cycle.”

Cyberwarfare predicted to damage the economy in the coming year

71% of CISOs believe cyberwarfare is a threat to their organization, and yet 22% admit to not having a strategy in place to mitigate this risk. This is especially alarming during a period of unprecedented global disruption, as 50% of infosec professionals agree that the increase of cyberwarfare will be detrimental to the economy in the next 12 months.

However, CISOs and infosec professionals are shoring up their defenses, with 51% and 48% respectively stating that they believe they will need a strategy against cyberwarfare in the next 12-18 months.

These findings, and more, are revealed in Bitdefender’s global 10 in 10 Study, which highlights how cybersecurity success over the next 10 years will lie in the adaptability of security decision makers, while also looking back at the last decade to see whether valuable lessons have been learnt about the need to make tangible changes in areas such as diversity.

It explores, in detail, the gap between how security decision makers and infosec professionals view the current security landscape and reveals the changes they know they will need to make in the upcoming months and years of the 2020s.

The study takes into account the views and opinions of 6,724 infosec professionals representing a broad cross-section of organizations, from small 101+ employee businesses to publicly listed 10,000+ person enterprises, in a wide variety of industries, including technology, finance, healthcare and government.

The rise and fall (and rise again) of ransomware

Outside of the rise of cyberwarfare threats, an old threat is rearing its head: ransomware. During the disruption of 2020, ransomware has surged, with 43% of infosec professionals reporting that they are seeing a rise in ransomware attacks.

What’s more concerning is that 70% of CISOs/CIOs and 63% of infosec professionals expect to see an increase in ransomware attacks in the next 12-18 months. This is of particular interest as 49% of CISOs/CIOs and 42% of infosec professionals are worried that a ransomware attack could wipe out the business in the next 12-18 months if they don’t increase investment in security.

But what is driving the rise in ransomware attacks? Some suggest it’s because more people are working from home, which makes them an easier target outside of the corporate firewall. However, the truth might be tied to money.

59% of CISOs/CIOs and 50% of infosec professionals believe that the business they work for would pay the ransom in order to prevent its data/information from being published, making ransomware a potential cash cow.

A step change in communication is in high demand

Cyberwarfare and ransomware, among many other infosec topics, are complex to unpack. That inherent complexity makes it hard to gain internal investment and support for projects, which is why infosec professionals believe a change is needed.

In fact, 51% of infosec professionals agree that in order to increase investment in cybersecurity, the way that they communicate about security has to change dramatically. This number jumps up to 55% amongst CISOs and CIOs — many of whom have a seat at the most senior decision-making table in their organizations.

The question is, what changes need to be made? 41% of infosec professionals believe that in the future more communication with the wider public and customers is needed so that everyone, both inside and outside the organization, better understands the risks.

In addition, 38% point out that there is a need for the facilitation of better communication with the C-suite, especially when it comes to understanding the wider business risks.

And last, but not least, as much as 31% of infosec professionals believe using less technical language would help the industry communicate better, so that the whole organization could understand the risks and how to stay protected.

“The reason that 63% of infosec professionals believe that cyberwarfare is a threat to their organization is easy,” said Neeraj Suri, Distinguished Professorship and Chair in Cybersecurity at Lancaster University.

“Dependency on technology is at an all-time high and if someone was to take out the WiFi in a home or office, no one would be able to do anything. This dependency wasn’t there a few years back–it wasn’t even as high a few months back.

“This high dependency on technology doesn’t just open the door for ransomware or IoT threats on an individual level, but also to cyberwarfare which can be so catastrophic it can ruin economies.

“The reason that nearly a quarter of infosec pros don’t currently have a strategy to protect against cyberwarfare is likely because of complacency. Since they haven’t suffered an attack, or haven’t seen on a wide scale the damage that can be done, they haven’t invested the time in protecting against it.”

Diversity, and specifically neurodiversity, is key to future success

Outside of the drastic changes that are needed in the way cybersecurity professionals communicate, there’s also a need to make a change within the very makeup of the workforce. The infosec industry as a whole has long suffered from a skills shortage, and this looks to remain an ongoing and increasingly obvious issue.

15% of infosec professionals believe that the biggest development in cybersecurity over the next 12-18 months will be the skills gap increasing. If the skills deficit continues for another five years, 28% of CISOs and CIOs say they believe that it will destroy businesses.

And another 50% of infosec professionals believe that the skills gap will be seriously disruptive if it continues for the next 5 years.

Today, however, it will take more than just recruiting skilled workers to make a positive change and protect organizations. In 2015, 52% of infosec workers would have agreed that there is a lack of diversity in cybersecurity and that it’s a concern.

Five years later, in 2020, this remains exactly the same, and that is a significant problem, as 40% of CISOs/CIOs and infosec professionals say that the cybersecurity industry should reflect the society around it to be effective.

What’s more, 76% of CISOs/CIOs, and 72% of infosec professionals, believe that there is a need for a more diverse skill set among those tackling cybersecurity tasks. This is because 38% of infosec professionals say that neurodiversity will make cybersecurity defenses stronger, and 33% revealed a more neurodiverse workforce will level the playing field against bad actors.

While it’s clear that the cybersecurity skills gap is here to stay, it’s also clear why changes need to be made to the makeup of the industry.

Liviu Arsene, Global Cybersecurity Researcher at Bitdefender concludes, “2020 has been a year of change, not only for the world at large, but for the security industry. The security landscape is rapidly evolving as it tries to adapt to the new normal, from distributed workforces to new threats. Amongst the new threats is cyberwarfare.

“It’s of great concern to businesses and the economy — and yet not everyone is prepared for it. At the same time, infosec professionals have had to keep up with new threats from an old source, ransomware, that can affect companies’ bottom lines if not handled carefully.

“The one thing we know is that the security landscape will continue to evolve. Changes will happen, but we can now make sure they happen for better and not for worse. To succeed in the new security landscape, the way we as an industry talk about security has to become more accessible to a wider audience to gain support and investment from within the business.

“In addition, we have to start thinking about plugging the skills gap in a different way — we have to focus on diversity, and specifically neurodiversity, if we are to stand our ground and ultimately defeat bad actors.”

Improving privacy of a global genomic data sharing network

A Case Western Reserve University computer and data sciences researcher is working to shore up privacy protections for people whose genomic information is stored in a vast global collection of vital, personal data.

Erman Ayday pursued novel methods for identifying and analyzing privacy vulnerabilities in the genomic data sharing network known commonly as “the Beacons.”

Personal genomic data refers to each person’s unique genome, his or her genetic makeup: information that can be gleaned from DNA analysis of a blood or saliva sample.

Ayday plans to identify weaknesses in the beacons’ infrastructure, developing more complex algorithms to protect against people or organizations who share his ability to figure out one person’s identity or sensitive genomic information using publicly available information.

Doing that will also protect the public, people who voluntarily shared their genomic information with hospitals where they were treated, with the understanding that their identity or sensitive information would not be revealed.

“While the shared use of genomics data is valuable to research, it is also potentially dangerous to the individual if their identity is revealed,” Ayday said. “Someone else knowing your genome is power – power over you. And, generally, people aren’t really aware of this, but we’re starting to see how genomic data can be shared, abused.”

Other research has shown that “if someone had access to your genome sequence–either directly from your saliva or other tissues, or from a popular genomic information service – they could check to see if you appear in a database of people with certain medical conditions, such as heart disease, lung cancer, or autism.”

Human genomic research

Genomics may sometimes be confused or conflated with genetics, but the terms refer to related, but different fields of study:

  • Genetics refers to the study of genes and the way that certain traits or conditions are passed down from one generation to another. It involves scientific studies of genes and their effects. Genes (units of heredity) carry the instructions for making proteins, which direct the activities of cells and functions of the body.
  • Genomics is a more recent term that describes the study of all of a person’s genes (the genome), including interactions of those genes with each other and with the person’s environment. Genomics includes the scientific study of complex diseases such as heart disease, asthma, diabetes, and cancer because these diseases are typically caused more by a combination of genetic and environmental factors than by individual genes.

There has been an ever-growing cache of genomic information since the conclusion of the Human Genome Project in 2003, the 13-year-long endeavor to “discover all the estimated 20,000-25,000 human genes and make them accessible for further biological study” as well as complete the DNA sequencing of 3 billion DNA subunits for research.

Popular genealogy sites such as Ancestry.com and 23andMe rely on this information–compared against their own accumulation of genetic information and analyzed by proprietary algorithms–to discern a person’s ancestry, for example.

Ayday said companies, government organizations and others are also tapping into genomic data. “The military can check the genome of recruits, insurance companies can check whether someone has a predisposition to a certain disease,” he said. “There are plenty of real life examples already.”

Scientists researching genomics are accessing shared DNA data considered critical to advance biomedical research. To access the shared data, researchers send digital requests (“queries”) to certain beacons, each specializing in different genetic mutations.

What are ‘the Beacons?’

The Beacon Network is an array of about 100 data repositories of human genome coding, coordinated by the Global Alliance for Genomics & Health (in collaboration with a Europe-based system called ELIXIR).

And while “queries do not return information about single individuals,” according to the site, a scientific study in 2015 revealed that someone could infer whether a specific individual’s genome is present in a particular beacon by sending that site an excessive number of queries.

“And then we used a more sophisticated algorithm and showed that you don’t need thousands of queries,” Ayday said. “We did it by sending less than 10.”

Then, in a follow-up study, Ayday and team showed that someone could also reconstruct the entire genome for an individual with the information from only a handful of queries.

“That’s a big problem,” Ayday said. “Information about one, single individual should not be that easily found out.”
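
To make the mechanism concrete, here is a minimal sketch of the kind of membership-inference probe such studies describe. The endpoint URL, parameter names and response format are assumptions loosely modeled on the public GA4GH Beacon v1 interface, and the variants are invented; this illustrates the general technique, not Ayday’s exact method.

```python
# Minimal sketch of a Beacon-style membership-inference probe.
# ASSUMPTIONS: the endpoint URL, query parameters and response shape are
# loosely modeled on the GA4GH Beacon v1 interface; the variants are invented.
import requests

BEACON_URL = "https://beacon.example.org/query"  # hypothetical beacon

# Rare variants taken from a target's genome:
# (chromosome, position, reference base, alternate base)
TARGET_VARIANTS = [
    ("1", 13272, "G", "C"),
    ("2", 45892, "T", "A"),
    ("7", 55249071, "C", "T"),
]

def allele_present(chrom: str, pos: int, ref: str, alt: str) -> bool:
    """Ask the beacon its only question: does ANY genome in the
    dataset carry this allele? The answer is just yes/no."""
    resp = requests.get(BEACON_URL, params={
        "referenceName": chrom,
        "start": pos,
        "referenceBases": ref,
        "alternateBases": alt,
        "assemblyId": "GRCh37",
    }, timeout=10)
    return bool(resp.json().get("exists"))

# Rare alleles almost never co-occur in two different people, so a run of
# "yes" answers for one person's rare variants implies that person's genome
# is in the beacon's dataset -- even though no single reply names anyone.
hits = sum(allele_present(*v) for v in TARGET_VARIANTS)
print(f"{hits}/{len(TARGET_VARIANTS)} rare alleles present in this beacon")
if hits == len(TARGET_VARIANTS):
    print("Target is very likely a member of this beacon's dataset.")
```

Defenses of the kind Ayday is developing aim to make exactly this sort of repeated yes/no probing less informative, without destroying the beacons’ value to legitimate researchers.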

Ongoing and initial costs top list of barriers to 5G implementation

5G is set to deliver higher data transfer rates for mission-critical communications and will allow massive broadband capacities, enabling high-speed communication across various applications such as the Internet of Things (IoT), robotics, advanced analytics and artificial intelligence.

barriers 5G implementation

According to a study from CommScope, only 46% of respondents feel their current network infrastructure is capable of supporting 5G, but 68% think 5G will have a significant impact on their agency operations within one to four years.

Of the respondents who do not feel their current infrastructure is capable of supporting 5G, none have deployed 5G, 19% are piloting, 43% are planning to pilot, and 52% are not planning or evaluating whether to pilot 5G.

Costs reported as top barriers to 5G implementation

According to the report, ongoing and initial costs are reported as top barriers for federal agencies wishing to implement 5G – 44% believe initial/up-front costs will be the biggest barrier and 49% are concerned about ongoing costs.

“There is no single approach to 5G and no one-size-fits-all 5G solution,” said Chris Collura, vice president, Federal business for CommScope.

“This study indicates that federal agencies are at the beginning stages of 5G evaluation and deployment. As they are looking to finalize their strategy for connectivity, agencies should also consider private networks, whether those are private LTE networks, private 5G networks, or a migration from one to the other to ensure flexibility and scalability.”

Desired outcomes for federal agencies

Remote employee productivity (40%) is one of the top desired outcomes for federal agencies looking to implement 5G, along with introducing high bandwidth (39%), higher throughput (39%) and better connectivity (38%).

Additional findings from the study include:

  • 32% hope that 5G will make it easier to share information securely and 32% would like to see easier access to data
  • 82% plan to or have already adopted 5G with 6% having already deployed 5G, 14% piloting 5G and 62% evaluating/planning to pilot 5G
  • 71% are looking at hardware, software or endpoint upgrades to support 5G
  • 83% believe it is very/somewhat important for mission-critical traffic on the agency network to remain onsite

Mobile messengers expose billions of users to privacy attacks

Popular mobile messengers expose personal data via discovery services that allow users to find contacts based on phone numbers from their address book, according to researchers.

mobile messengers privacy

When installing a mobile messenger like WhatsApp, new users can instantly start texting existing contacts based on the phone numbers stored on their device. For this to happen, users must grant the app permission to access and regularly upload their address book to company servers in a process called mobile contact discovery.

A recent study by a team of researchers from the Secure Software Systems Group at the University of Würzburg and the Cryptography and Privacy Engineering Group at TU Darmstadt shows that currently deployed contact discovery services severely threaten the privacy of billions of users.

Utilizing very few resources, the researchers were able to perform practical crawling attacks on the popular messengers WhatsApp, Signal, and Telegram. The results of the experiments demonstrate that malicious users or hackers can collect sensitive data at a large scale and without noteworthy restrictions by querying contact discovery services for random phone numbers.
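
As a rough sketch of why such crawling requires so few resources, consider enumerating random numbers against a discovery service, as below. Everything here is invented for illustration: the endpoint URL, request shape and response fields are hypothetical, since real messengers use proprietary, authenticated APIs whose terms of service forbid this.

```python
# Minimal sketch of a contact-discovery crawl over random phone numbers.
# ASSUMPTIONS: the endpoint, request and response formats are hypothetical;
# this illustrates the attack pattern, not any real messenger's API.
import random
import requests

DISCOVERY_URL = "https://messenger.example.com/v1/discover"  # hypothetical

def random_us_numbers(n: int) -> list[str]:
    """Generate n random numbers in the US number space (E.164 format)."""
    return [f"+1{random.randint(2_000_000_000, 9_999_999_999)}"
            for _ in range(n)]

profiles = {}
for _ in range(10):  # 10 batches of 1,000 numbers each
    # One "sync my address book" request checks a whole batch at once,
    # exactly as a legitimate client would after installation.
    resp = requests.post(DISCOVERY_URL,
                         json={"contacts": random_us_numbers(1_000)},
                         timeout=10)
    for entry in resp.json().get("registered", []):
        # Collect whatever public metadata the service hands back.
        profiles[entry["phone"]] = {
            "picture": entry.get("profile_picture_url"),
            "about": entry.get("about"),
            "last_seen": entry.get("last_online"),
        }

print(f"Collected metadata for {len(profiles)} registered numbers")
```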

Attackers can build accurate behavior models

For the extensive study, the researchers queried 10% of all US mobile phone numbers for WhatsApp and 100% for Signal. In this way, they were able to gather personal (meta) data commonly stored in the messengers’ user profiles, including profile pictures, nicknames, status texts and the “last online” time.

The analyzed data also reveals interesting statistics about user behavior. For example, very few users change the default privacy settings, which for most messengers are not privacy-friendly at all.

The researchers found that about 50% of WhatsApp users in the US have a public profile picture and 90% a public “About” text. Interestingly, 40% of Signal users, who can be assumed to be more privacy-conscious in general, also use WhatsApp, and half of those Signal users have a public profile picture on WhatsApp.

Tracking such data over time enables attackers to build accurate behavior models. When the data is matched across social networks and public data sources, third parties can also build detailed profiles, for example to scam users.

For Telegram, the researchers found that its contact discovery service exposes sensitive information even about owners of phone numbers who are not registered with the service.

Which information is revealed during contact discovery and can be collected via crawling attacks depends on the service provider and the privacy settings of the user. WhatsApp and Telegram, for example, transmit the user’s entire address book to their servers.

More privacy-conscious messengers like Signal transfer only short cryptographic hash values of phone numbers or rely on trusted hardware. However, the research team showed that, with new and optimized attack strategies, the low entropy of phone numbers enables attackers to deduce the corresponding phone numbers from cryptographic hashes within milliseconds.
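
A toy example makes the entropy argument concrete: valid phone numbers occupy such a small, structured space that an attacker can hash every candidate once and then reverse any observed hash with a single dictionary lookup. The fixed prefix and the use of unsalted SHA-256 below are illustrative assumptions, not any messenger’s actual scheme.

```python
# Minimal sketch of why hashing phone numbers adds little protection:
# the number space is small enough to enumerate exhaustively.
# ASSUMPTIONS: toy prefix and unsalted SHA-256, chosen for illustration.
import hashlib

def sha256_hex(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

# Precompute the hash of every number in a (toy-sized) range once...
PREFIX = "+1415555"  # hypothetical prefix; real attacks sweep full ranges
lookup = {sha256_hex(f"{PREFIX}{suffix:04d}"): f"{PREFIX}{suffix:04d}"
          for suffix in range(10_000)}

# ...then reversing any hash observed by (or leaked from) a server
# is a constant-time dictionary lookup.
observed_hash = sha256_hex("+14155550123")
print(lookup[observed_hash])  # prints: +14155550123
```

Extending the table to every plausible number worldwide is well within reach of commodity hardware, which is why plain hashing alone cannot protect phone numbers.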

Moreover, since there are no noteworthy restrictions for signing up with messaging services, any third party can create a large number of accounts to crawl the user database of a messenger for information by requesting data for random phone numbers.

“We strongly advise all users of messenger apps to revisit their privacy settings. This is currently the most effective protection against our investigated crawling attacks,” agree Prof. Alexandra Dmitrienko (University of Würzburg) and Prof. Thomas Schneider (TU Darmstadt).

Impact of research results: Service providers improve their security measures

The research team reported their findings to the respective service providers. As a result, WhatsApp has improved their protection mechanisms such that large-scale attacks can be detected, and Signal has reduced the number of possible queries to complicate crawling.

The researchers also proposed many other mitigation techniques, including a new contact discovery method that could be adopted to further reduce the efficiency of attacks without negatively impacting usability.

GAIA-X to strengthen European digital infrastructure sovereignty

The GAIA-X Initiative announced that it is one step closer to its goal of a trustworthy, sovereign digital infrastructure for Europe, with the official signing of incorporation papers for GAIA-X AISBL, a non-profit association that will take the project to the next level.

GAIA-X Europe

GAIA-X: A vision for Europe

The initiative’s twenty-two founding members signed the documents in Brussels to create an association for securing funding and commitment from members to fulfill the initiative’s vision for Europe.

“We are deeply motivated to meet the challenges of the European digital economy,” said Servane Augier, COO at 3DS OUTSCALE.

“Through GAIA-X, we are building, all together, a sovereign and reliable digital infrastructure and an ecosystem for innovation in Europe. In this way, we will strengthen the digital sovereignty of businesses, research and education, governments and society as a whole.”

Seeking active participation and membership

While final incorporation is pending, the founding members of GAIA-X AISBL are seeking active participation and membership from national and multi-national, European and non-European companies, as well as partners in the worlds of science and politics, who share European standards and values.

The association views its members as the primary drivers of progress and innovation, working closely together to define standards and prototype implementations from both provider and user perspectives.

“The BMW Group sees the future of automotive software in the cloud, whether it is about pioneering IT solutions for the development and production of premium vehicles, new digital services for our customers or innovative features in the car,” said Marco Görgmaier, Head of DevOps Platform and Cloud Technologies, The BMW Group.

“Participation in the GAIA-X project is a logical step in our intention to further expand our innovative strength. The goals of the GAIA-X project – striving for data sovereignty, reducing dependencies, establishing cloud services on a broad scale and creating an open ecosystem for innovation – are fully in line with our own efforts.”

Setting up head office in Brussels

As the incorporation process moves forward, the association will continue to set up its head office in Brussels and establish key organizational structures.

Overall, the GAIA-X founders aim to establish a culture of trust, knowledge exchange and transparency. They anticipate that as the membership of GAIA-X grows, it will be able to have an increasing impact on innovation and collaboration in the development of technical solutions and standards for business, science and society across Europe.

What are the most vulnerable departments and sectors to phishing attacks?

While cyber attackers chase down system vulnerabilities and valuable data with each passing day, the business world is taking measures against them. The latest trends and cybersecurity statistics reveal that data from various sources, especially mobile and IoT devices, is being targeted and attacked. Organizations face the risk of data loss due to unprotected data and weak cybersecurity practices.

vulnerable sectors phishing

In the first half of last year, 4.1 billion data records were exposed, while the average time needed to detect a leak was 206 days. The average loss caused by a data breach is estimated at $3.86 million for businesses, and cyberattacks are projected to cause over $5 trillion in losses globally in the next year.

Keepnet Labs has revealed the departments and sectors most vulnerable to phishing attacks, based on a data set of 410,000 phishing emails covering a period of one year.

Accordingly, 90% of successful cyber attacks occur through email-based attacks. These attacks use deceptive and fraudulent social engineering techniques, especially to bypass various security mechanisms and controls.

1 out of 8 people share the information requested by attackers

According to the report, which identifies the sectors and departments that are most vulnerable to phishing attacks:

  • 1 out of 2 employees opens and reads phishing emails.
  • 1 out of 3 employees clicks links or opens file attachments in phishing emails (which may cause silent installation of malware / ransomware).
  • 1 out of every 8 employees shares the information requested in phishing emails.

Sectors most vulnerable to cyber attacks

Top 5 sectors with the highest click rates on malicious links in phishing emails:

  • Consulting (63%)
  • Clothing and accessories (48%)
  • Education (47%)
  • Technology (40%)
  • Holdings / conglomerates (32.37%)

Sectors with the highest rates of data sharing:

  • Clothing and accessories (43%)
  • Consulting (30%)
  • Securities and stock exchange (23%)
  • Education (22%)

Corporate departments most affected by cyber attacks

The top three departments with the highest rates of clicking fake links in phishing emails:

  • Law / audit / internal control (59%)
  • Procurement / administrative affairs (58%)
  • Quality management / Health (56%)

While the findings reveal that these departments have not changed from last year’s statistics, the report concludes that most of the sensitive information sought by cybercriminals is accessible via users working in these vulnerable units.

This in turn poses a serious threat to their respective organizations: employees with privileged access to such prized information are precisely the people hackers target when infiltrating organizations and executing their malicious campaigns.

The top three departments with the highest rates of sharing data:

  • Quality management / health (27%)
  • Procurement / administrative affairs (26%)
  • Legal / audit / internal control (25%)

These statistics reveal that certain departments are more inclined to share sensitive information compared to others, and considering their position, they should be much more careful against cyber attacks.

Are today’s organizations ready for the data age?

67% of business and IT managers expect the sheer quantity of data to grow nearly five times by 2025, a Splunk survey reveals.

data age

The research shows that leaders see significant opportunity in this explosion of data and believe data is extremely or very valuable to their organization in terms of overall success (81%), innovation (75%) and cybersecurity (78%).

81% of survey respondents believe data to be very or highly valuable, yet 57% fear that the volume of data is growing faster than their organizations’ ability to keep up.

“The data age is here. We can now quantify how data is taking center stage in industries around the world. As this new research demonstrates, organizations understand the value of data, but are overwhelmed by the task of adjusting to the many opportunities and threats this new reality presents,” said Doug Merritt, President and CEO, Splunk.

“There are boundless opportunities for organizations willing to quickly learn and adapt, embrace new technologies and harness the power of data.”

The data age has been accelerated by emerging technologies powered by, and contributing to, exponential data growth. Chief among these emerging technologies are Edge Computing, 5G networking, IoT, AI/ML, AR/VR and Blockchain.

These are the very technologies that 49% of those surveyed expect to use to harness the power of data, yet on average only 42% feel they have a high level of understanding of all six.

Data is valuable, and data anxiety is real

To thrive in this new age, every organization needs a complete view of its data — real-time insight, with the ability to take real-time action. But many organizations feel overwhelmed and unprepared. The study quantifies the emergence of a data age as well as the recognition that organizations have some work to do in order to use data effectively and be successful.

  • Data is extremely or very valuable to organizations in terms of: overall success (81%), innovation (75%) and cybersecurity (78%).
  • And yet, 66% of IT and business managers report that half or more of their organizations’ data is dark (untapped, unknown, unused) — a 10% increase over the previous year.
  • 57% say the volume of data is growing faster than their organizations’ ability to keep up.
  • 47% acknowledge their organizations will fall behind when faced with rapid data volume growth.

Some industries are more prepared than others

The study quantifies the emergence of a data age and the adoption of emerging technologies across industries, including:

  • Across industries, IoT has the most current users (but only 28%). 5G has the fewest and has the shortest implementation timeline at 2.6 years.
  • Confidence in understanding of 5G’s potential varies: 59% in France, 62% in China and only 24% in Japan.
  • For five of the six technologies, financial services leads in terms of current development of use cases. Retail comes second in most cases, though retailers lag notably in adoption of AI.
  • 62% of healthcare organizations say that half or more of their data is dark and that they struggle to manage and leverage data.
  • The public sector lags commercial organizations in adoption of emerging technologies.
  • Manufacturing leaders are more likely (78%) than those in any other industry to predict growth in data volume; 76% expect the value of data to continue to rise.

Some countries are more prepared than others

The study also found that countries seen as technology leaders, like the U.S. and China, are more likely to be optimistic about their ability to harness the opportunities of the data age.

  • 90% of business leaders from China expect the value of data to grow. They are by far the most optimistic about the impact of emerging technologies, and they are getting ready. 83% of Chinese organizations are prepared, or are preparing, for rapid data growth compared to just 47% across all regions.
  • U.S. leaders are the second most confident in their ability to prepare for rapid data growth, with 59% indicating that they are at least somewhat confident.
  • In France, 59% of respondents say that no one in their organization is having conversations about the impact of the data age. Meanwhile, in Japan 67% say their organization is struggling to stay up to date, compared to the global average of 58%.
  • U.K. managers report relatively low current usage of emerging technologies but are optimistic about plans to use them in the future. For example, just 19% of U.K. respondents say they are currently using AI/ML technologies, but 58% say they will use them in the near future.