According to IBM’s Cost of a Data Breach Report 2020, the average time it took a company to identify and contain a breach in 2019 was 279 days. It was 266 days in 2018, and the five-year average stands at 280 days. In other words, things haven’t gotten much better. It’s clear that time is not on CISOs’ side, and they need to act fast.
What’s holding organizations back when it comes to detecting and remediating data breaches?
Let’s consider the top challenges facing security operations centers (SOCs). First, there are too many alerts, which makes it difficult to prioritize those that deserve immediate attention and investigation.
Also, there’s no unified view of the security information generated by the layers of tools deployed by most large enterprises. Finally, these problems are compounded by the fact that organizations are using hybrid on-premises and cloud architectures, as well as purely cloud environments.
Another major obstacle facing SOCs is that threat hunting and investigations are still manually intensive activities. They are complicated by the fact that the data sources SOCs use are decentralized and must be accessed from different consoles.
SOCs also lack visibility into a very significant component of threat hunting: identity. It has taken an even more prominent role now that so many people are working remotely due to COVID-19.
The analysis, control and response planes in current security architectures are not integrated. In other words, analytics are separated from the administration and investigation stack, which is also separated from the tools used to intercept adversaries and shut down an attack.
A new architecture has emerged called XDR, which stands for “extended detection and response.” Research firm Gartner listed XDR as one of its top 9 security and risk trends for 2020. XDR flips the current security model on its head by replacing the traditional top-down approach with a bottom-up approach to deliver more precise and higher fidelity results.
The primary driver behind XDR is its fusing of analytics with detection and response. The premise is that these functions are not and should not be separate. By bringing them together, XDR promises to deliver many benefits.
The first is a precise response to threats. Instead of keeping logs in a separate silo, with XDR they can be used to immediately drive response actions with higher fidelity and deeper knowledge of the details surrounding an incident. For example, the traditional SIEM approach is based on monitoring network log data for threats and responding on the network.
Unless a threat is simple, like commodity malware that can be easily cleaned up, remediation is typically delayed until a manual investigation is performed. XDR, on the other hand, provides SOCs both the visibility and ability to not just respond but also remediate. SOC operators can take precise rather than broad actions, and not just across the network, but also the endpoint and other areas.
Because XDR seeks to fuse the analysis, control and response planes, it provides a unified view of threats. Instead of forcing SOCs to use multiple interfaces to threat hunt and investigate, event data and analytics are brought together in XDR to provide the full context needed to precisely respond to an incident.
Unlike the SIEM model, which centralizes logs for SOCs to figure out what’s important, XDR begins with a view of what’s important and then uses logs to inform response and remediation actions. This is fundamental to how XDR inverts traditional SIEM and SOC workflows.
Another important benefit of XDR is that it provides SOCs the ability to investigate and respond to incidents from the same security technology platform. For example, an alert or analytics indicator might be generated from the endpoint, initiating an investigative workflow that is then augmented with network logs or other system logs that are part of the XDR platform for greater context.
Instead of moving between different consoles, all the data sources are in one place. XDR enables SOC operators to resolve and close out a workflow on the same technology platform where it was initiated.
Currently, most organizations have tools that can initiate a workflow and others that can augment a workflow, but very few that can actually resolve a workflow. The goal of XDR is to provide a single environment where incidents can be initiated, investigated and remediated.
Finally, by fusing analytics, the network and the endpoint, SOCs can respond to incidents across a variety of control planes, and customize actions based on the event, the system criticality, the adversary activity, etc.
What XDR makes possible
With XDR, SOCs can force a re-logon or a logoff through integration with IAM tools. They can contain a host because they are directly connected to the endpoint. Using network analysis and visibility, XDR can provide deeper insight and context into threats, including whether they are moving laterally, have exfiltrated data, and more.
Ultimately, XDR makes it possible for SOCs to respond to incidents in ways that were not possible in the past, such as taking more surgical network-based remediation actions.
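The fused analysis-and-response loop described above can be sketched in miniature. This is a hypothetical illustration, not any vendor’s XDR API: the `Event`, `correlate`, and `respond` names are invented for the example, which simply shows how correlating signals from multiple planes can drive a targeted action instead of a broad one.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    source: str    # "endpoint", "network", or "identity"
    host: str
    indicator: str

@dataclass
class Incident:
    host: str
    events: list = field(default_factory=list)

def correlate(events):
    """Group events from every plane by affected host into one incident view."""
    incidents = {}
    for ev in events:
        incidents.setdefault(ev.host, Incident(ev.host)).events.append(ev)
    return incidents

def respond(incident):
    """Pick a targeted action based on which planes reported signals."""
    sources = {ev.source for ev in incident.events}
    if {"endpoint", "network"} <= sources:
        return f"contain host {incident.host}"       # surgical, host-level action
    if "identity" in sources:
        return f"force re-logon on {incident.host}"  # IAM-integrated action
    return f"open investigation for {incident.host}"

events = [
    Event("endpoint", "srv-01", "suspicious process"),
    Event("network", "srv-01", "lateral movement"),
    Event("identity", "wks-07", "impossible travel"),
]
actions = [respond(i) for i in correlate(events).values()]
# srv-01 gets contained; wks-07 gets a forced re-logon
```

The point of the sketch is the unified view: the same incident object that carries the analytics context also drives the response, rather than logs living in one silo and controls in another.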
Making XDR a reality requires implementing a horizontal plane that connects all existing security silos to unify analysis, control, and response – which won’t happen overnight. The benefits of XDR, however, are well worth the effort.
58% of organizations make decisions based on outdated data, according to Exasol research.
The report reveals that 84% of organizations are under increasing pressure to make faster decisions as a result of the COVID-19 pandemic, yet 58% of organizations lack access to real-time insights.
The report further reveals that 63% of respondents confirm that daily insights are needed to make informed business decisions, but these are hampered by long query run times.
Queries taking too long to come back
75% of respondents have to wait between 2 hours and a full day for a query to come back, and only 15% of respondents’ query run times are between 15 and 60 minutes. 56% believe they can’t make informed decisions based on their organization’s data.
“As a healthcare, retail, or financial services business you cannot afford to make decisions based on yesterday’s data,” said Rishi Diwan, CPO of Exasol.
“If the pandemic has made one thing clear it’s that business conditions can turn on a dime, yet 6 in 10 businesses find themselves saddled with decision-making infrastructure that is just not responsive enough.”
The report is based on a global survey of 2,500 data decision makers and reveals ample pessimism among data and IT professionals regarding the extent to which current infrastructure set-ups can power a crisis recovery. According to the research:
- 51% believe their organization’s data infrastructure will need improvements in order to help them recover from macro or micro economic challenges.
- The top areas highlighted for performance improvement include data literacy (84%), data infrastructure (55%) and data quality (33%). However, 85% report action being taken to improve literacy across the business, which is an encouraging sign.
- Of the 36% of organizations that have increased the size of their decision-making teams during the COVID-19 pandemic to compensate for the long time-to-insight, 86% have experienced an increase in decision-making speed.
- 69% of respondents reported receiving a higher number of data analytics requests from both multiple business departments and their end-users in recent months.
Demand for data analytics will continue to rise
Going forward, 45% of respondents agreed that demand for data analytics will continue to rise. While the bulk of these requests is expected to come from marketing, operations and sales, demand from all areas is expected to increase, adding to the urgency for organizations to review their data-driven decision-making capabilities.
“One way that organizations compensate for the long time-to-insights during the COVID-19 pandemic is by expanding the number of people with decision-making authority,” said Mathias Golombek, CTO at Exasol.
“Our research clearly shows that organizations want to increase their speed and agility regarding data-driven decisions. Data-democratization and self-service analytics across the organization are the ultimate goal, but existing legacy systems are struggling with these workloads. That’s where a reduction of query response times from hours to seconds is a game changer.”
“If you want to evolve towards a data-driven agile enterprise, you need to start with your existing data infrastructure. Not only must it be set up to support your future growth, but it should also enable data democratization,” said Philip Howard, Bloor Research.
“You should also look at whether your infrastructure can deliver the time to insight – the performance – that you need. Can it scale across all your knowledge workers? Because if it doesn’t do all of these things, then it’s not supporting your business goals and you need to think about changing it.”
Cybersecurity: Main focus for planned projects
IT leaders also revealed that adapting culture quickly to new ways of working is the number one challenge they need to overcome in the next 12 months. The findings are unveiled following a survey of 600+ attendees for the upcoming DTX: NOW event.
26 percent of respondents cited cybersecurity as the main focus for planned projects, followed by cloud (21 percent), data analytics (15 percent) and network infrastructure (14 percent). According to separate research, there were more hands-on-keyboard intrusions in the first half of 2020 than in the entirety of 2019.
IT leaders revealed that adapting digital culture for a new world of work was the main challenge they need to overcome in the next year (18 percent), followed by automation of business tasks and processes (14 percent), and choosing the right cloud strategy (12 percent).
Most significant barriers to digital transformation projects
The biggest barriers to delivering digital transformation projects on time and on budget reflect changing organizational dynamics that are being intensified by COVID-19. The most significant barrier to projects was revealed to be changing scope (29 percent of respondents), reduced budgets (24 percent) and changing team structure (17 percent).
The data also indicates that digital transformation has become a priority for businesses of every size. 58 percent of projects are anticipated to come in at less than £250,000, while just 22 percent have a budget of over £500,000 and only 10 percent over £1 million.
“COVID-19 is a catalyst for digital transformation, but it’s a leveller too. We’re hearing from IT leaders that there is a shift in which technologies businesses are investing in.
“Ensuring the vast majority of employees could work from home practically overnight has exposed issues with IT strategy, and modernising the core tech stack has become an immediate priority for just about every organization”, said James McGough, managing director of Imago Techmedia.
“Many businesses have found that areas like cybersecurity measures, network infrastructure and cloud strategy need urgent adaptation for a distributed workforce.
“Some companies might be in a position to consider the likes of AI, blockchain and quantum computing, but the reality for most is that the future-looking, big ticket tech projects are on the back burner for now. Companies of every size are finding themselves restarting their digital transformation journeys,” McGough concluded.
The Network Computing, Communications and Storage research group at Aarhus University has developed a completely new way to compress data. The new technique makes it possible to analyze data directly on compressed files, and it may have a major impact on the so-called “data tsunami” from massive numbers of IoT devices.
The method will now be further developed, and it will form the framework for an end-to-end solution to help scale-down the exponentially increasing volumes of data from IoT devices.
“Today, if you need just 1 Byte of data from a 100 MB compressed file, you usually have to decompress a significant part of the whole file to access the data. Our technology enables random access to the compressed data. It means that you can access 1 Byte of data at the cost of decompressing less than 100 Bytes, which is several orders of magnitude lower compared to the state-of-the-art technologies. This could have a huge impact on data accessibility, data processing speed and the cloud storage infrastructure,” says Associate Professor Qi Zhang from Aarhus University.
Compressed IoT data
The compression technique makes it feasible to compress IoT data (typically data in time series) in real time before the data is sent to the cloud. After this, the typical data analytics could be carried out directly on the compressed data. There is no need to decompress all the data or large amounts of it in order to carry out an analysis.
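The group’s actual algorithm is not detailed here, so the following is only a generic sketch of how random access into compressed data can work: compress fixed-size chunks independently and keep an offset index, so reading one byte costs decompressing a single small chunk rather than the whole file. The `compress_chunked` and `read_byte` names are invented for this illustration.

```python
import zlib

CHUNK = 4096  # uncompressed bytes per independently compressed chunk

def compress_chunked(data: bytes):
    """Compress each fixed-size chunk on its own and record its offset."""
    chunks, index, offset = [], [], 0
    for i in range(0, len(data), CHUNK):
        c = zlib.compress(data[i:i + CHUNK])
        index.append(offset)      # where this chunk starts in the blob
        offset += len(c)
        chunks.append(c)
    return b"".join(chunks), index

def read_byte(blob: bytes, index, pos: int) -> int:
    """Fetch one byte by decompressing only the chunk that contains it."""
    n = pos // CHUNK
    start = index[n]
    end = index[n + 1] if n + 1 < len(index) else len(blob)
    return zlib.decompress(blob[start:end])[pos % CHUNK]

data = bytes(range(256)) * 1000          # ~256 KB of sample "sensor" data
blob, index = compress_chunked(data)
assert read_byte(blob, index, 123456) == data[123456]
```

The trade-off in this naive scheme is compression ratio (small independent chunks compress worse than one large stream), which is exactly the kind of gap purpose-built techniques like the one described above aim to close.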
This could potentially alleviate the ever-increasing pressure on the communication and data storage infrastructure. The research group believes that the project’s results will serve as a foundation for the development of sustainable IoT solutions, and that it could have a profound impact on digitalization:
“Today, IoT data is constantly being streamed to the cloud, and as a consequence of the massive number of IoT devices deployed globally, exponential data growth is expected. Conventionally, to allow fast, frequent data retrieval and analysis, it is preferable to store the data in an uncompressed form.
“The drawback here is the use of more storage space. If you keep the data in compressed form, however, it takes time to decompress it before you can access and analyze it. Our project outcome has the potential not only to reduce data storage space but also to accelerate data analysis,” says Qi Zhang.
Integrated cloud-native security platforms can overcome limitations of traditional security products
To close security gaps caused by rapidly changing digital ecosystems, organizations must adopt an integrated cloud-native security platform that incorporates artificial intelligence, automation, intelligence, threat detection and data analytics capabilities, according to 451 Research.
Cloud-native security platforms are essential
The report clearly defines how to create a scalable, adaptable, and agile security posture built for today’s diverse and disparate IT ecosystems. And it warns that legacy approaches and managed security service providers (MSSPs) cannot keep up with the speed of digital transformation.
- Massive change is occurring. Over 97 percent of organizations reported they are underway with, or expecting, digital transformation progress in the next 24 months, and over 41 percent are allocating more than 50 percent of their IT budgets to projects that grow and transform the business.
- Security platforms enable automation and orchestration capabilities across the entire IT stack, streamlining and optimizing security operations, improving productivity, enabling higher utilization of assets, increasing the ROI of security investments and helping address interoperability challenges created by isolated, multi-vendor point products.
- Threat-driven and outcome-based security platforms address the full attack continuum, compared with legacy approaches that generally focus on defensive blocking of a single vector.
- Modern security platforms leverage AI and ML to solve some of the most prevalent challenges for security teams, including expertise shortages, alert fatigue, fraud detection, behavioral analysis, risk scoring, correlating threat intelligence, detecting advanced persistent threats, and finding patterns in increasing volumes of data.
- Modern security platforms are positioned to deliver real-time, high-definition visibility with an unobstructed view of the entire IT ecosystem, providing insights into the company’s assets, attack surface, risks and potential threats and enabling rapid response and threat containment.
451 Senior Analyst Aaron Sherrill noted, “The impact of an ever-evolving IT ecosystem combined with an ever-evolving threat landscape can be overwhelming to even the largest, most well-funded security teams, including those at traditional MSSPs.
“Unfortunately, a web of disparate and siloed security tools, a growing expertise gap and an overwhelming volume of security events and alerts continue to plague internal and service provider security teams of every size.
“The consequences of these challenges are vast, preventing security teams from gaining visibility, scaling effectively, responding rapidly and adapting quickly. Today’s threat and business landscape demands new approaches and new technologies.”
How to deliver effective cybersecurity today
“Delivering effective cybersecurity today requires being able to consume a growing stream of telemetry and events from a wide range of signal sources,” said Dustin Hillard, CTO, eSentire.
“It requires being able to process that data to identify attacks while avoiding false positives and negatives. It requires equipping a team of expert analysts and threat hunters with the tools they need to investigate incidents and research advanced, evasive attacks.
“Most importantly, it requires the ability to continuously upgrade detection and defenses. These requirements demand changing the technology foundations upon which cybersecurity solutions are built—moving from traditional security products and legacy MSSP services to modern cloud-native platforms.”
Sherrill further noted, “Cloud-native security platforms optimize the efficiency and effectiveness of security operations by hiding complexity and bringing together disparate data, tools, processes, workflows and policies into a unified experience.
“Infused with automation and orchestration, artificial intelligence and machine learning, big data analytics, multi-vector threat detection, threat intelligence, and machine and human collaboration, cloud-native security platforms can provide the vehicle for scalable, adaptable and agile threat detection, hunting, and response. And when combined with managed detection and response services, organizations are able to quickly bridge expertise and resource gaps and attain a more comprehensive and impactful approach to cybersecurity.”
LexisNexis Risk Solutions announced the results of its annual focus group, comprising over 20 healthcare IT executives who are members of the College of Healthcare Information Management Executives (CHIME).
The focus group participants accepted more accountability than in previous years to provide the safe and reliable technology tools necessary to deliver high-quality, connected, and cost-effective care.
The survey results also highlighted the importance of a team approach with support across the organization in helping CIOs achieve the vision of connected healthcare.
While the focus group came together before the COVID-19 pandemic struck, the technology priorities for 2020 – from data sharing and security to using data analytics to help vulnerable populations – have become more urgent in light of the pandemic challenges. For example, recent months have illustrated the need for data access to inform decisions about population health, wellness and care capacity.
The surveyed executives identified three main priority areas for 2020.
Members acknowledged challenges amid the surge of digital touchpoints, such as mobile phones, smart devices and remote services.
Goals include a common patient identifier to combine and verify disparate patient records for a true health information exchange.
Members are confronting new cybersecurity risks, confusion over who bears the ultimate responsibility for patient data, and the competing goals of seamless user experience and data safety.
To address that final challenge and strike an appropriate balance, executives are moving to multifactor authentication strategies for optimal user workflow and security.
Integrating Social Determinants of Health (SDOH)
As the pandemic has highlighted, incorporating SDOH data is a vital, immediate requirement for improving the delivery of patient support and value-based care, and ultimately, outcomes.
Executives shared SDOH implementation challenges, including data aggregation and operationalization within IT and EHR systems, especially when not utilizing third-party data to support their efforts. While CIOs previously had not perceived specific accountability for SDOH data, that changed as its value was demonstrated.
“CHIME’s executive health IT members are approaching evolving patient and industry needs with careful consideration, ingenuity and focus,” said Josh Schoeller, CEO of LexisNexis Risk Solutions Health Care.
“Our annual focus group presents valuable insights about how healthcare decision-makers are strategically using technology solutions to overcome hurdles regarding cybersecurity, data governance, and interoperability, all of which have become more urgent during the COVID-19 pandemic.
“It’s a big challenge but with the right data integration and analytics they continue to make great progress even in the face of the COVID-19 pandemic.”
Also encouraging, focus group participants reported solid results when rallying support from stakeholders across the enterprise to participate in tough conversations about information security, privacy, operations, compliance, and clinical and accountable care.
Despite the huge drops in employment and the immense worries around job security over the past three months, 81% of data professionals felt as, or more, secure in their role than they did this time a year ago, a Harnham survey reveals. This suggests businesses are more reliant on data professionals than ever before.
The report includes input from more than 3,000 global professionals, an analysis of the 1,000+ job placements made by Harnham over the last year, and a review of worldwide job views to convey a clear understanding of the market.
Other key findings
- Active job market – The data and analytics job market continues to move quickly, with respondents only remaining in their roles for an average of 2¼ years. Additionally, 80 percent are either actively looking for a new role, or open to the right opportunity.
- Gender diversity – The number of female data professionals continues to increase, now making up 27 percent of the industry.
- Flexible working – Businesses offering flexible working conditions grew from 64 percent to 80 percent even prior to COVID-19, suggesting “the new normal” of remote and flexible working was already a way of life.
- Career priorities – Data professionals have seen a shift in priorities in the wake of COVID-19. When seeking a new role, respondents prioritize career progression over salary increases, and place an emphasis on job security and working for a stable and growing business.
The impact of COVID-19 on the data and analytics space
In the wake of COVID-19, the report includes a special dedicated section on the impact the pandemic has had on the data and analytics space, as the industry has had to adjust to “the new normal.”
“As COVID-19 has impacted all aspects of our daily lives, its effects on the data and analytics industry have been widespread,” said Dave Farmer, founding partner at Harnham.
“That being said, however, we have found that professionals in the data and analytics space have adjusted to ‘the new normal’ reasonably well and have become even more vital to businesses looking to streamline existing processes and establish new ones in the wake of the pandemic. This has resulted in increased job security and shifting career priorities that suggest that data professionals will continue to thrive in the coming year.”
While companies continue to invest in teams of data experts, a Fivetran survey suggests that adding more data analytics wizards might not be the solution. In fact, the survey found that during the course of a workday, data analysts spend less than half their time actually analyzing data.
Conducted by Dimensional Research, the online survey of approximately 500 data professionals across five continents also shows 68 percent of the respondents have ideas that would drive more profit for their organizations but lack time to implement them.
The struggles of data professionals
More than 60 percent of respondents reported wasting time waiting for engineering resources several times each month and often spending one-third of every workday just trying to access data. 90 percent said their work was slowed by numerous unreliable data sources over the last 12 months.
“The struggles data professionals face in simply doing their work and the time they waste is astounding,” said George Fraser, CEO of Fivetran.
“To keep critical analytics projects moving, these unsung heroes contend with numerous workarounds to compensate for unavailable engineering resources and unreliable data sources. Fivetran ready-to-use connectors help remove some of these bottlenecks and allow analysts to instead focus on uncovering insights.”
As enterprises strive to optimize decision-making in a rapidly evolving global economic landscape, the study indicates that enabling analysts to spend more time analyzing and less time finding, fixing and stabilizing data will drive better decisions and increased profits.
Much of the problem lies in data integrity, quality and access — the top three challenges almost unanimously pointed to by the survey respondents.
- 71 percent of companies plan to hire more data analysts within the next year
- 74 percent of companies will grow business intelligence users in the same time period
- At the same time, 86 percent struggle with working with out-of-date data
- 41 percent report they had used data that was two months old or older
- 60 percent deal with frequently changing data schemas
- 92 percent state they often need to perform tasks outside their role
At a high level, and contrary to conventional wisdom, not all IT budgets are being cut. Even with the economic challenges that COVID-19 has posed for businesses, almost 38 percent of enterprises are keeping their IT budgets flat or actually increasing them.
Yellowbrick Data received responses from more than 1,000 enterprise IT managers and executives, uncovering their infrastructure priorities during this era of economic uncertainty and disruption.
“The survey brought to light some trends that we have been noticing recently related to the speed at which companies are moving to the cloud and investing in analytics. In fact, more than half of enterprises are accelerating their move to the cloud in light of COVID-19 challenges to their businesses,” said Jeff Spicer, CMO for Yellowbrick Data.
“But what really stands out is that nearly 55 percent of enterprises are looking at a hybrid cloud strategy with a combination of cloud and on-premises solutions. That clearly shows that a cloud-alone strategy is not what most enterprises are looking for—and validates what our customers are telling us about their own best practices combining cloud and on-prem approaches to their biggest data infrastructure challenges.”
For a large majority of enterprise IT leaders, investments in data infrastructure and analytics are a top priority:
- Data warehouse modernization is important for almost 90 percent of enterprises this year. For 55 percent it is very important, and for an additional 35 percent it is somewhat important.
- Getting more business value from their data lake is important for more than 95 percent of enterprises, with 61 percent saying it is very important and an additional 35 percent saying it is somewhat important.
- Almost two-thirds of respondents are increasing investments in analytical infrastructure, with 27 percent investing a lot more and an additional 37 percent investing somewhat more.
Answering the “why” behind IT investments
These are the top four reasons IT decision-makers cite for investing in a new data warehouse or data analytics tool:
- 73 percent of respondents want better performance
- 54 percent want a solution that is easier to use
- 52 percent want a solution that is less expensive
- 48 percent say new enterprise applications require new solutions
Many firms are modernizing by adding cloud services, with 55 percent of enterprises looking at a hybrid cloud strategy as their best approach.
Enterprises see a variety of benefits with hybrid cloud. Answers that gained a more than 50 percent response included:
- 56 percent want more control over what is where—for example, the ability to customize the private end of their hybrid cloud model to their specific needs and adjust them accordingly as they see fit
- 54 percent say their IT staff can better optimize the network
- 52 percent say their companies can get the security of a private cloud with the power and services of the public cloud
- 51 percent say they can scale faster without compromising sensitive data
Top IT spending priorities diverse among businesses
When asked to identify the #1 business priority (single choice only) for their cloud investment, decision-makers gave numerous responses, with a few consensus points emerging:
- Cost savings took two of the top three spots and accounted for 39 percent of the total: 23 percent cited cost savings in infrastructure (hardware or software) and 16 percent cited cost savings in IT staff
- Business flexibility was the second biggest priority overall, coming in at 18 percent
- With the exception of “greater compute speed” (10 percent), no other choice received higher than a single-digit percentage
Public clouds: Mostly trusted, definitely diversify
Despite enterprises embracing the cloud, some skepticism remains. 27 percent of enterprise leaders say they do not trust public cloud providers to prioritize their business needs.
With the above statistic in mind, it is not surprising that risk mitigation remains a critical consideration: 82 percent of respondents say they want hybrid or multi-cloud options to spread any risk from their cloud investments, and 67 percent say there are some parts of their business they will not trust to any single cloud vendor.
Almost 65% of the nearly 300 international cybersecurity professionals canvassed by Gurucul at RSA Conference 2020 said they access documents that have nothing to do with their jobs.
Meanwhile, nearly 40% of respondents who experienced bad performance reviews also admitted to abusing their privileged access, which is double the overall rate (19%).
“We knew insider privilege abuse was rampant in most enterprises, but these survey results demonstrate that the infosecurity department is not immune to this practice,” said Saryu Nayyar, CEO of Gurucul. “Detecting impermissible access to resources by authorized users, whether it is malicious or not, is virtually impossible with traditional monitoring tools. That’s why many organizations are turning to security and risk analytics that look at both employee and entity behaviors to identify anomalies indicative of insider threats.”
- In finance, 58% said they have emailed company documents to their personal accounts.
- In healthcare, 33% have abused their privileged access.
- In manufacturing, 78% accessed documents unrelated to their jobs.
- In retail, 86% have clicked on a link in an email from someone they didn’t know.
- In midsize companies, 62% did not alert IT when their job role had changed.
This showcases the problems organizations have with employees behaving outside the bounds of practical and published security policies. The human element is often the deciding factor in how data breaches occur. Monitoring and deterring risky employee behavior with machine learning-based security analytics is the most effective measure in keeping mayhem to a minimum.
People may not realize their behavior is opening the door to cybercriminals, which is why security analytics technology is so critical to maintaining a secure corporate environment.
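As a minimal illustration of behavior-based anomaly detection (not Gurucul’s product, whose methods are not described here), one can baseline each user’s own activity and flag large deviations from it. Real user and entity behavior analytics use far richer models; the `is_anomalous` helper below is an invented, simple z-score check.

```python
import statistics

def is_anomalous(history, today, threshold=3.0):
    """Return True if today's access count is far outside the user's baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0   # avoid division by zero
    return abs(today - mean) / stdev > threshold

# 30 days of one user's daily document-access counts, then a suspicious spike
baseline = [12, 9, 11, 14, 10, 13, 12, 8, 11, 10] * 3
print(is_anomalous(baseline, 11))   # a typical day is not flagged
print(is_anomalous(baseline, 95))   # a mass-access spike is flagged
```

Because the baseline is per-user rather than global, the same absolute count can be normal for one employee and anomalous for another, which is the core idea behind the behavioral approach described above.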
By making better use of data, leading organizations had materially increased revenue and reduced operational costs, boosting profitability by an average of 12.5% of their total gross profit, according to a Splunk survey.
The research study surveyed 1,350 senior business and IT decision-makers across eight industries in Australia, China, France, Germany, Japan, the UK and the US.
Results found that a more advanced data strategy was tied to improved outcomes including revenue growth, operational cost reduction, increased innovation, faster time to market, higher customer satisfaction and retention, and better, faster decision making across all industries and in all countries.
The data-to-everything journey
In addition to analyzing how organizations are saving money and growing revenue with data, the study assessed respondents’ different stages of data maturity based on criteria such as the prevalence of modern analytics tools and skill sets and the effectiveness of the organization at operationalizing its data.
Respondents were grouped into three data maturity categories:
- Stage 1: Data Deliberator – Organizations that are in the early phase of their data strategy implementation.
- Stage 2: Data Adopter – Organizations that are making good use of their data, but still have room for improvement.
- Stage 3: Data Innovator – Organizations that place the strongest strategic emphasis on data and have an advanced strategy in place to extract business value.
The study found that an organization’s stage of data use impacts its ability to not only glean insights from its data, but to convert these insights into concrete, data-driven decision-making and real-time action. All organizations reported benefits from better data use, but Data Innovators achieved considerably higher key business and economic benefits.
Relative to the Data Deliberators surveyed, Data Innovators added 83% more revenue to their top line and 66% more profit to their bottom line in the past 12 months. The study also found Data Innovators more likely to have a data-obsessed company culture, to employ AI technologies for data analysis, and to act on their data more frequently.
In addition, the study found that 97% of Data Innovators meet or exceed their customer retention targets, with the majority (60%) having actually outstripped their goals.
Meanwhile, 93% feel they tend to make better, faster decisions than competitors; while 91% believe that their organization is in a strong position to compete and succeed in its markets over the next few years.
However, across industries and countries, less than 11% of organizations have reached the stage of Data Innovator, demonstrating that nearly 90% still have room for improvement.
Organizations embracing data are primed to lead the next generation of business
The study quantifies the economic impact resulting from an organization’s better data use, with significant findings across eight major industries, including:
- 89% of financial firms agree that the intelligent use of data and analytics is increasingly becoming the only source of differentiation in the financial industry.
- 65% of technology firms have increased revenue through better use of their data assets.
- 60% of retail firms have increased revenue through better use of their data assets.
- 88% of healthcare and life sciences firms agree that advancements in data analysis and correlation of different data sets will have as big an impact on health outcomes as other medical advancements.
- 55% of manufacturing and resources firms have increased revenue through better use of their data assets.
- 93% of traditional communications and media companies agree that they must use data to reinvent their services or be disrupted by alternative entertainment offerings.
- 52% of public sector agencies have reduced their cost of operations through better use of their data assets.
- 51% of higher education institutions used data to provide better and more proactive protection from cyber threats.
Organizations who utilize data outperform their global peers
The study also surfaced trends related to data utilization maturity by country:
- 53% of UK organizations have progressed beyond Stage 1 data utilization maturity status; tied for 3rd out of the seven countries included in this survey.
- UK organizations were most likely to say they have improved product/service quality as a result of better data utilization. 73% reported improved product and/or service quality compared with a global average of 67%.
- Relative to organizations in other countries, they are also most likely to report improved employee efficiency and/or productivity as an additional benefit of better data discovery and use. 65% reported increased employee efficiency and/or productivity, compared with a global average of 60%.
- Among respondents attributing an increase in top-line revenues to improved data utilization, UK organizations report an average 12-month revenue gain of 3.97%; the second-highest increase among all countries surveyed. The highest increase was in China, with 4.05%.
In 2020, there will be greater adoption of Continuous Intelligence (CI) technologies, which will elevate IoT data analytics way beyond traditional operational levels and have a greater impact on strategic planning and organizational change, states ABI Research.
Looking at the 2020 technology market
Analysts have identified 35 trends that will shape the technology market and 19 others that, although attracting huge amounts of speculation and commentary, look less likely to move the needle over the next twelve months.
“After a tumultuous 2019 that was beset by many challenges, both integral to technology markets and derived from global market dynamics, 2020 looks set to be equally challenging,” says Stuart Carlaw, Chief Research Officer at ABI Research.
“CI will be consolidating in the IoT analytics market, enabling more advanced analytics in near-real time,” says Kateryna Dubrova, M2M, IoT & IoE Analyst at ABI Research.
Since the emergence and expansion of streaming analytics and streaming technologies, the ability to continuously analyze and extract value from IoT data has grown. CI applications will be possible because cloud and platform vendors are offering end-to-end (E2E) platforms, expanding their capabilities through digital twinning, big data technologies, and ML algorithms.
“Hence, in 2020, ABI Research predicts greater adoption of CI technologies, which will elevate IoT data analytics beyond traditional operational level (maintenance and control), but we will also observe a greater impact on strategic planning and organizational change.”
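The idea behind CI, continuously evaluating data in motion rather than batch-querying it at rest, can be sketched with a rolling-window monitor over a sensor feed. The readings, window size, and threshold below are hypothetical:

```python
from collections import deque

def rolling_mean_monitor(stream, window=5, limit=75.0):
    """Consume readings one at a time, maintain a rolling mean, and
    yield an alert the moment the windowed average crosses `limit`."""
    buf = deque(maxlen=window)
    for i, reading in enumerate(stream):
        buf.append(reading)
        avg = sum(buf) / len(buf)
        if len(buf) == window and avg > limit:
            yield (i, round(avg, 1))

# Hypothetical temperature feed from an IoT sensor.
feed = [70, 71, 69, 72, 70, 74, 78, 80, 82, 85]
print(list(rolling_mean_monitor(feed)))  # [(8, 76.8), (9, 79.8)]
```

The generator-based design mirrors how CI platforms work: alerts fire as soon as the condition holds, without waiting for a batch job, which is what moves IoT analytics from after-the-fact maintenance reports toward near-real-time operational control.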
mMTC will sustain only a handful of chipset manufacturers
Massive machine-type communications (mMTC) began under 4G, with LTE-M and Narrowband IoT (NB-IoT) being “forward-compatible” with the forthcoming 5G New Radio (NR) standard. Chipset vendors saw a greenfield opportunity to go from zero to hero with massive IoT, with some being established from scratch for the sake of developing a single NB-IoT baseband chip.
“This resulting race saw 17 baseband vendors emerge, but only four different ones currently supply most of the hundreds of LTE-M and NB-IoT products now available. HiSilicon, MediaTek, Qualcomm, and RDA (UNISOC) dominate. And this situation will only compound as we move toward Release 16 and the full coexistence of LTE-M and NB-IoT with 5G NR, i.e., the “official” start of the mMTC market,” says Jamie Moss, M2M, IoT & IoE Research Director at ABI Research.
“Nothing succeeds like success and only those with strong early adoption, regardless of slow initial sales, will be there to enjoy the boom years to come.”
China will drive the sharing economy 2.0
Uber and Airbnb could be considered the Sharing Economy 1.0. “But China is showing the world what the next phase of the sharing economy will look like: shared powerbanks,” says Dan Shey, Vice President of Enabling Platforms at ABI Research. Shared powerbanks are already a major driver of cellular connections in China.
“The newer applications in the more “connected” version of the sharing economy will continue to grow across the world, albeit at a more measured pace than seen in China.”
2020 technology market: The IoT platform market will not consolidate
“For many years, there have been predictions that the IoT platform supplier market will begin to consolidate, and it just won’t happen,” says Dan Shey, Vice President of Enabling Platforms at ABI Research.
“The simple reason is that there are more than 100 companies that offer device-to-cloud IoT platform services and for every one that is acquired, there are always new ones that come to market.”
Unlicensed proprietary LPWA will not merge with licensed open standards
“The two cannot be reconciled at the standards level, for the premium that cellular commands stems from the cost of its license, and the control that its owners have over their blocks of spectrum, providing a secure, managed, quality of service-based guarantee to IoT customers,” says Adarsh Krishnan, M2M, IoT & IoE Principal Analyst at ABI Research.
Edge will not overtake cloud
“The accelerated growth of the edge technology and intelligent device paradigm created one of the largest industry misconceptions: edge technology will cannibalize cloud technology,” says M2M, IoT & IoE Analyst Kateryna Dubrova.
“In fact, in the future we will see the rapid development of an edge-cloud-fog continuum, where these technologies will complement each other rather than cross-cannibalize.”
The top 10 enterprise analytics trends to watch in 2020 have been announced by MicroStrategy in collaboration with analysts and influencers from Forrester, IDC, Constellation Research, Ventana Research and others.
Deep learning delivers a competitive advantage
“In 2020, the spotlight on deep learning will be the nexus between knowing and doing. No longer just a buzzword, the pragmatic advent of deep learning to predict and understand human behavior is a tempest disruptor in how companies will perform with intelligence against their competitors.” – Frank J. Bernhard, Chief Data Officer and Author, “SHAPE—Digital Strategy by Data and Analytics”.
AutoML improves the ROI of data science initiatives
“Machine learning is one of the fastest-evolving technologies in recent years, and the demand for development in machine learning has increased exponentially. This rapid growth of machine learning solutions has created a demand for ready-to-use machine learning models that can be used easily and without expert knowledge.” – Marcus Borba, Founder and Principal Consultant, Borba Consulting.
The semantic graph becomes paramount to delivering business value
“The semantic graph will become the backbone supporting data and analytics over a constantly changing data landscape. Organizations not using a semantic graph are at risk of seeing the ROI for analytics plummet due to growing complexity and resulting organizational costs.” – Roxane Edjlali, Senior Director, Product Management, MicroStrategy and former Gartner analyst.
Human insight becomes even more important as data volumes increase
“As more and more knowledge workers become comfortable working with data, they should also become conversant with data ethnography, or the study of what the data relates to, the context in which it was collected, and the understanding that data alone might not give them a complete picture.” – Chandana Gopal, Research Director, IDC.
Next-gen embedded analytics speeds time to insights
“Concise analytics delivered in the context of specific applications and interfaces speed decision making. This style of embedding and the curation of concise, in-context analytics can take more time, but with advances including no-code and low-code development methods, we’re seeing rising adoption of next-generation embedding.” – Doug Henschen, VP and Principal Analyst, Constellation Research.
The need to combine data sources continues to grow
“We expect to see a continued focus on data diversity. Organizations rarely have a single standard platform for their data and analytics and multiple tools are used to access the data. The need to combine these data sources will only continue to grow.” – David Menninger, SVP and Research Director, Ventana Research.
Data-driven upskilling becomes an enterprise requirement
“Enterprise organizations will need to focus their attention not just on recruiting efforts for top analytics talent, but also on education, reskilling, and upskilling for current employees as the need for data-driven decision making increases—and the shortage of talent grows.” – Hugh Owen, Executive Vice President, Worldwide Education, MicroStrategy.
AI is real and ready
“Next year, more of these confident CDAOs and CIOs will see to it that data science teams have what they need in terms of data so that they can spend 70%, 80%, or 90% of their time actually modeling for AI use cases.” – Srividya Sridharan, Mike Gualtieri, J.P. Gownder, Craig Le Clair, Ian Jacobs, Andrew Hogan, Predictions 2020: Artificial Intelligence—It’s Time to Turn the Artificial Into Reality (Checks), Forrester, October 30, 2019.
Mobile intelligence evolves for 2020 and beyond
“Half of organizations will re-examine their use of mobile devices and conclude that their technology does not adequately address the needs of their workers, leading them to examine a new generation of mobile applications that enable a better work experience and far more effective connectivity to the rest of the organization and to customers.” – Mark Smith, CEO and Chief Research Officer, Ventana Research.
The future of experience management is powered by AI
“As apps get decomposed by business process to headless microservices, automation and intelligence will play a big role in creating mass personalization and mass efficiencies at scale. The Intelligent Enterprise will take context and data to power next best actions.” – R “Ray” Wang, Founder and Principal Analyst, Constellation Research.