Growth of cloud-native apps and containerization to define 2021

Scality announced its data storage predictions for 2021, focusing on the rapid growth of cloud-native apps and containerization. According to IDC, by 2023 over 500 million digital apps and services will be developed and deployed using cloud-native approaches; that is the same number of apps developed in total over the last 40 years. “The accelerated growth of next-generation cloud-native digital apps and services will define new competitive requirements in …”

Most US states show signs of a vulnerable election-related infrastructure

In the run-up to the presidential election, 75% of the 56 U.S. states and territories showed signs of a vulnerable IT infrastructure, a SecurityScorecard report reveals.

Since most state websites offer access to voter and election information, these findings may indicate unforeseen issues leading up to, and following, the US election.

Election infrastructure: High-level findings

Seventy-five percent of U.S. states and territories have an overall cyberhealth rating of ‘C’ or below; 35% are rated ‘D’ or below. Based on a three-year SecurityScorecard study of historical data, states with a grade of ‘C’ are three times more likely to experience a breach (or an incident such as ransomware) than states with an ‘A’, and those with a ‘D’ are nearly five times more likely.

  • States with the highest scores: Kentucky (95), Kansas (92), Michigan (92)
  • States with the lowest scores: North Dakota (59), Illinois (60), Oklahoma (60)
  • Among states and territories, there are as many ‘F’ scores as there are ‘A’s
  • The Pandemic Effect: Many states’ scores have dropped significantly since January. For example, North Dakota scored a 72 in January and now has a 59. Why? Remote work mandates gave state networks a larger attack surface (e.g., thousands of state workers on home Wi-Fi), making it more difficult to ensure employees are using up-to-date software.

Significant security concerns were observed with two critically important “battleground” states, Iowa and Ohio, both of which scored a 68, or a ‘D’ rating.

The battleground states

According to political experts, the following states are considered “battleground” states and will help determine the result of the election. But more than half show weaknesses in their overall IT infrastructure:

  • Michigan: 92 (A)
  • North Carolina: 81 (B)
  • Wisconsin: 88 (B)
  • Arizona: 81 (B)
  • Texas: 85 (B)
  • New Hampshire: 77 (C)
  • Pennsylvania: 85 (B)
  • Georgia: 77 (C)
  • Nevada: 74 (C)
  • Iowa: 68 (D)
  • Florida: 73 (C)
  • Ohio: 68 (D)

“The IT infrastructure of state governments should be of critical importance to securing election integrity,” said Alex Heid, Chief Research & Development Officer at SecurityScorecard.

“This is especially true in ‘battleground states’ where the Department of Homeland Security, political parties, campaigns, and state government officials should enforce vigilance through continuously monitoring state voter registration networks and web applications for the purpose of mitigating incoming attacks from malicious actors.

“The digital storage and transmission of voter registration and voter tally data needs to remain flawlessly intact. Some states have been doing well regarding their overall cybersecurity posture, but the vast majority have major improvements to make.”

Potential consequences of lower scores

  • Targeted phishing/malware delivery via e-mail and other mediums, potentially as a means to both infect networks and spread misinformation. Malicious actors often sell access to organizations they have successfully infected.
  • Attacks via third-party vendors – many states use the same vendors, so access into one could mean access to all. This is the top cybersecurity concern for political campaigns.
  • Voter registration databases could be impacted. In the worst-case scenario, attackers could remove voter registrations, change voter precinct information, or make crucial systems entirely unavailable on Election Day through ransomware.

“These poor scores have consequences that go beyond elections; the findings show chronic underinvestment in IT by state governments,” said Rob Knake, the former director for cybersecurity policy at the White House in the Obama Administration.

“For instance, combatting COVID-19 requires the federal government to rely on the apparatus of the states. It suggests the need for a massive influx of funds as part of any future stimulus to refresh state IT systems to not only ensure safe and secure elections, but save more lives.”

A set of best practices for states

  • Create dedicated voter and election-specific websites under the official state domain, rather than using alternative domain names, which can be subject to typosquatting
  • Have an IT team specifically tasked and accountable for bolstering voter and election website cybersecurity: defined as confidentiality, integrity, and availability of all processed information
  • States should establish clear lines of authority for updating the information on these sites, including the ‘two-person’ rule: no single individual should be able to update information without a second person authorizing it (see the sketch after this list)
  • States and counties should continuously monitor the cybersecurity exposure of all assets associated with election systems, and ensure that vendors supplying equipment and services to the election process undergo stringent processes
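
The ‘two-person’ rule above is straightforward to enforce in software. Below is a minimal, hypothetical sketch of such a publishing gate; the names and workflow are illustrative only, not any state’s actual tooling.

```python
# Minimal sketch of a "two-person" publishing rule: a change to election
# information goes live only after a second, distinct authorized person signs off.
# All names and the workflow itself are hypothetical.

AUTHORIZED_EDITORS = {"alice", "bob", "carol"}

class PendingChange:
    def __init__(self, author: str, content: str):
        if author not in AUTHORIZED_EDITORS:
            raise PermissionError(f"{author} is not an authorized editor")
        self.author = author
        self.content = content
        self.approvers = set()

    def approve(self, reviewer: str) -> None:
        if reviewer not in AUTHORIZED_EDITORS:
            raise PermissionError(f"{reviewer} is not an authorized editor")
        if reviewer == self.author:
            raise PermissionError("authors cannot approve their own change")
        self.approvers.add(reviewer)

    def can_publish(self) -> bool:
        # Author plus at least one independent approver = two people.
        return len(self.approvers) >= 1

change = PendingChange(author="alice", content="Polling hours: 7:00-20:00")
change.approve("bob")          # the second person signs off
assert change.can_publish()
```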

How do I select a data storage solution for my business?

We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices. Unfortunately, 58% of organizations make decisions based on outdated data.

While enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads.

To select a suitable data storage solution for your business, you need to consider a variety of factors. We’ve talked to several industry leaders to get their insights on the topic.

Phil Bullinger, SVP and General Manager, Data Center Business Unit, Western Digital

Selecting the right data storage solution for your enterprise requires evaluating and balancing many factors. The most important is aligning the performance and capabilities of the storage system with your critical workloads and their specific bandwidth, application latency and data availability requirements. For example, if your business wants to gain greater insight and value from data through AI, your storage system should be designed to support the accelerated performance and scale requirements of analytics workloads.

Storage systems that maximize the performance potential of solid state drives (SSDs) and the efficiency and scalability of hard disk drives (HDDs) provide the flexibility and configurability to meet a wide range of application workloads.

Your applications should also drive the essential architecture of your storage system, whether directly connected or networked, whether required to store and deliver data as blocks, files, objects or all three, and whether the storage system must efficiently support a wide range of workloads while prioritizing the performance of the most demanding applications.

Consideration should also be given to your overall IT data management architecture, so that it supports the scalability, data protection, and business continuity assurance your enterprise requires, spanning from core data centers to those distributed at or near the edge and endpoints of your operations, as well as integration with your cloud-resident applications, compute and data storage services and resources.

Ben Gitenstein, VP of Product Management, Qumulo

When searching for the right data storage solution to support your organizational needs today and in the future, it’s important to select a solution that is trusted, scales to secure demanding workloads of any size, and ensures optimal performance of applications and workloads both on premises and in complex, multi-cloud environments.

With the recent pandemic, organizations are digitally transforming faster than ever before, and leveraging the cloud to conduct business. This makes it more important than ever that your storage solution has built in tools for data management across this ecosystem.

When evaluating storage options, be sure to do your homework and ask the right questions. Is it a trusted provider? Would it integrate well within my existing technology infrastructure? Your storage solution should be easy to manage and meet the scale, performance and cloud requirements for any data environment and across multi-cloud environments.

Also, be sure the storage solution gives IT control over how it manages storage capacity needs and delivers real-time insight into analytics and usage patterns, so it can make smart storage allocation decisions and maximize the organization’s storage budget.

David Huskisson, Senior Solutions Manager, Pure Storage

Data backup and disaster recovery features are critically important when selecting a storage solution for your business, as no organization is immune to ransomware attacks. When systems go down, they need to be recovered as quickly and safely as possible.

Look for solutions that offer simplicity in management, can ensure backups are viable even when admin credentials are compromised, and can be restored quickly enough to greatly reduce major organizational or financial impact.

Storage solutions that are purpose-built to handle unstructured data are a strong place to start. By definition, unstructured data is unpredictable: it can take any form, size or shape, and can be accessed in any pattern. Such platforms can accelerate small, large, random or sequential data, and consolidate a wide range of workloads on a unified fast file and object storage platform. The platform should maintain its performance even as the amount of data grows.

If you have an existing backup product, you don’t need to rip and replace it. There are storage platforms with robust integrations that work seamlessly with existing solutions and offer a wide range of data-protection architectures so you can ensure business continuity amid changes.

Tunio Zafer, CEO, pCloud

Bear in mind: your security team needs to assist. Answer these questions to find the right solution. Do you need ‘cold’ storage or cloud storage? If you’re looking only to store files for backup, you need a cloud backup service. If you’re looking to store, edit and share, go for cloud storage. Where are the provider’s storage servers located? If your business is located in Europe, the safest choice is a storage service based in Europe.

Best case scenario – your company is going to grow. Look for a storage service that offers scalability. What is their data privacy policy? Research whether someone can legally access your data without your knowledge or consent. Switzerland has one of the strictest data privacy laws globally, so choosing a Swiss-based service is a safe bet. How is your data secured? Look for a service that offers robust encryption in-transit and at-rest.

Client-side encryption means that your data is secured on your device and is transferred already encrypted. What is their support package? At some point, you’re going to need help. A data storage service whose support package is included for free and answers within 24 hours is preferable.
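
To illustrate what client-side encryption looks like in practice, here is a minimal sketch using the third-party Python cryptography package: data is encrypted on the device before anything is handed to the provider. It is a generic example, not pCloud’s actual client, and the file contents and upload call are placeholders.

```python
# Minimal client-side encryption sketch: data is encrypted on the device and
# only ciphertext is handed to the storage provider.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # stays on the client; never uploaded
cipher = Fernet(key)

plaintext = b"quarterly-results.xlsx contents ..."   # stand-in for a real file
ciphertext = cipher.encrypt(plaintext)

def upload_to_provider(blob: bytes) -> None:
    pass   # placeholder for the provider's upload call

upload_to_provider(ciphertext)   # the provider only ever sees ciphertext

# Later, on the client, the same locally held key decrypts the data.
assert Fernet(key).decrypt(ciphertext) == plaintext
```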

PinK: A new way of implementing a key-value store in SSDs

As web services, cloud storage, and big-data services continue expanding and finding their way into our lives, the gigantic hardware infrastructures they rely on – known as data centers – need to be improved to keep up with the current demand.

One promising solution for improving the performance and reducing the energy load associated with reading and writing large amounts of data is to confer storage devices with some computational capabilities and offload part of the data read/write process from CPUs.

A new way of implementing a key-value store

In a recent study, researchers from Daegu Gyeongbuk Institute of Science and Technology (DGIST), Korea, describe a new way of implementing a key-value store in solid state drives (SSDs), which offers many advantages over a more widely used method.

A key-value store (also known as key-value database) is a way of storing, managing, and retrieving data in the form of key-value pairs. The most common way to implement one is through the use of a hash function, an algorithm that can quickly match a given key with its associated stored data to achieve fast read/write access.

One of the main problems of implementing a hash-based key-value store is that the random nature of the hash function occasionally leads to long delays (latency) in read/write operations. To solve this problem, the researchers from DGIST implemented a different paradigm, called “log-structured merge-tree (LSM).” This approach relies on ordering the data hierarchically, therefore putting an upper bound on the maximum latency.
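
To make the contrast concrete, here is a highly simplified sketch of the LSM idea (illustrative only, not the researchers’ PinK implementation): writes land in a small in-memory buffer that is periodically flushed as a sorted run, and reads check the buffer first and then the sorted runs from newest to oldest, which bounds the work a lookup can require.

```python
# Highly simplified log-structured merge (LSM) sketch: an in-memory "memtable"
# absorbs writes; when full, it is flushed as an immutable sorted run.
# Reads check the memtable, then the runs from newest to oldest.
from bisect import bisect_left

class TinyLSM:
    def __init__(self, memtable_limit=4):
        self.memtable = {}          # recent writes
        self.runs = []              # list of sorted [(key, value), ...] runs
        self.memtable_limit = memtable_limit

    def put(self, key, value):
        self.memtable[key] = value
        if len(self.memtable) >= self.memtable_limit:
            self._flush()

    def _flush(self):
        self.runs.insert(0, sorted(self.memtable.items()))   # newest run first
        self.memtable = {}

    def get(self, key):
        if key in self.memtable:
            return self.memtable[key]
        for run in self.runs:                                 # newest to oldest
            i = bisect_left(run, (key,))
            if i < len(run) and run[i][0] == key:
                return run[i][1]
        return None

db = TinyLSM()
for k in "abcdefg":
    db.put(k, k.upper())
assert db.get("e") == "E"
```

A real LSM-based store also compacts runs in the background and keeps them on persistent storage rather than in memory; PinK’s optimized memory use and hardware-accelerated sorting address those costs inside the SSD.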

Letting storage devices compute some operations by themselves

In their implementation, nicknamed “PinK,” they addressed the most serious limitations of LSM-based key-value stores for SSDs. With its optimized memory use, guaranteed maximum delays, and hardware accelerators for offloading certain sorting tasks from the CPU, PinK represents a novel and effective take on data storage for SSDs in data centers.

Professor Sungjin Lee, who led the study, remarks: “Key-value store is a widely used fundamental infrastructure for various applications, including Web services, artificial intelligence applications, and cloud systems. We believe that PinK could greatly improve the user-perceived performance of such services.”

So far, experimental results confirm the performance gains offered by this new implementation and highlight the potential of letting storage devices compute some operations by themselves.

“We believe that our study gives a good direction of how computational storage devices should be designed and built and what technical issues we should address for efficient in-storage computing,” Prof Lee concludes.

Things to consider when selecting enterprise SSDs for critical workloads

The process of evaluating solid state drives (SSDs) for enterprise applications can present a number of challenges. You want maximum performance for the most demanding servers running mission-critical workloads.

We sat down with Scott Hamilton, Senior Director, Product Management, Data Center Systems at Western Digital, to learn more about SSDs and how they fit into current business environments and data centers.

What features do SSDs need to have in order to offer uncompromised performance for the most demanding servers running mission-critical workloads in enterprise environments? What are some of the misconceptions IT leaders are facing when choosing SSDs?

First, IT leaders must understand environmental considerations, including the application, use case and its intended workload, before committing to specific SSDs. It’s well understood that uncompromised performance is paramount to support mission critical workloads in the enterprise environment. However, performance has different meanings to different customers for their respective use cases and available infrastructure.

Uncompromised performance may focus more on latency (and associated consistency), IOPS (and queue depth) or throughput (and block size), depending on the use case and application.

Additionally, the scale of the application and solution dictates the level of emphasis, whether it be interface-, device-, or system-level performance. Similarly, mission-critical workloads may have different expectations or requirements, e.g., high availability support, disaster recovery, or performance and performance consistency. This is where IT leaders need to rationalize and test the best fit for their use case.

Today there are many different SSD segments that fit certain types of infrastructure choices and use cases. For example, PCIe SSD options are available from boot drives to performance NVMe SSDs, and they come in different form factors such as M.2 (ultra-light and thin) and U.2 (standard 2.5-inch), to name a few. It’s also important to consider power/performance. Some applications do not require interface saturation, and can leverage low-power, single-port mainstream SSDs instead of dual-port, high-power, higher-endurance and higher-performance drives.

IT managers have choices today, which they should consider carefully: rationalize and optimize for infrastructure elasticity and scaling, test, and ultimately align their future system architecture strategies with the best-fit SSD. My final word of advice: sometimes it is not wise to pick the highest-performing SSD available on the market, as you do not want to pay for a rocket engine for a bike. Understanding the use case and success metrics – e.g., price-capacity, latency, price-performance (either $/IOPS or $/GB/sec) – will help eliminate some of the misconceptions IT leaders face when choosing SSDs.
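
The metrics above are related by simple arithmetic, and putting rough numbers on them helps when comparing drives. A back-of-the-envelope sketch with entirely hypothetical figures:

```python
# Rough, illustrative relationships between queue depth, latency, IOPS,
# throughput and price-performance. All figures are hypothetical.

queue_depth = 32
avg_latency_s = 100e-6                    # 100 microseconds per I/O

# Little's law applied to a storage queue: concurrency = rate * latency
iops = queue_depth / avg_latency_s        # ~320,000 IOPS

block_size_bytes = 4096                   # 4 KiB random I/O
throughput_gb_s = iops * block_size_bytes / 1e9   # ~1.3 GB/s

drive_price_usd = 400.0
usd_per_kiops = drive_price_usd / (iops / 1000)   # price-performance in $/kIOPS
usd_per_gb_s = drive_price_usd / throughput_gb_s  # price-performance in $/GB/sec

print(f"{iops:,.0f} IOPS, {throughput_gb_s:.2f} GB/s")
print(f"${usd_per_kiops:.2f}/kIOPS, ${usd_per_gb_s:.0f}/GB/s")
```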

How has the pandemic accelerated cloud adoption and how has that translated to digital transformation efforts and the creation of agile data infrastructures?

The rapid increase in our global online footprint is stressing IT infrastructure, from virtual offices, live video calls, online classes, healthcare services and content streaming to social media, instant messaging services, gaming and e-commerce. This is the new normal of our personal and professional lives. There is no doubt that the pandemic has increased dependence on cloud data centers and services. Private, public and hybrid cloud use cases will continue to co-exist due to costs, data governance and strategies, security and legacy application support.

Digital transformation continues all around us, and the pandemic has accelerated these initiatives. Before the pandemic, digital transformation projects generally spanned several years, with lengthy and exhaustive cycles to go online and scale up a web footprint. However, 2020 has really surprised all of us. Tectonic shifts have happened (and are still happening), with projects now taking only weeks or months, even for businesses that are learning to scale up for the first time.

This infrastructure stress will further accelerate technological shifts as well, whether from SAS to NVMe at the endpoints or from DAS- or SAN-based solutions to NVMe over Fabrics (NVMe-oF) based solutions, delivering greater agility to meet both the dynamic and unforeseen demands of the future.

Organizations are scrambling to update their infrastructure, and many are battling inefficient data silos and large operational expenses. How can data centers take full advantage of modern NVMe SSDs?

NVMe SSDs are playing a pivotal role in making the new reality possible for people and businesses around the world. As users transition from SAS and SATA, NVMe is not only increasing overall system performance and utilization, it’s also creating next-generation flexible and agile IT infrastructure. Capitalizing on the power of NVMe, SSDs now enable data centers to run more services on the same hardware, i.e., improved utilization. This is an important consideration for IT leaders and organizations looking to improve efficiencies.

NVMe SSDs are helping both public and private cloud infrastructures in various areas such as the highest performance storage, the lowest latency interface and the flexibility to support needs from boot to high-performance compute as well as infrastructure productivity. NVMe supports enterprise specifications for server and storage systems such as namespaces, virtualization support, scatter gather list, reservations, fused operations, and emerging technologies such as Zoned Namespaces (ZNS).

Additionally, NVMe-oF extends the benefits of NVMe technology and enables sharing data between hosts and NVMe-based platforms over a fabric. The ratification of the NVMe 1.4 and NVMe-oF 1.1 specifications, with the addition of ZNS, has further strengthened NVMe’s position in enterprise data centers. Therefore, by introducing NVMe SSDs into their infrastructure, organizations will have the tools to get more from their data assets.
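
For teams taking a first inventory of NVMe devices, the controllers and namespaces mentioned above are visible directly through Linux sysfs. The sketch below is illustrative; the paths are standard on recent kernels, but attribute availability can vary, so treat it as a starting point rather than a hardened tool.

```python
# Minimal sketch: inventory NVMe controllers and their namespaces on Linux via sysfs.
from pathlib import Path

def read_attr(path: Path) -> str:
    return path.read_text().strip() if path.exists() else "n/a"

for ctrl in sorted(Path("/sys/class/nvme").glob("nvme*")):
    model = read_attr(ctrl / "model")
    serial = read_attr(ctrl / "serial")
    # Namespaces appear as block-device directories such as nvme0n1 under the controller.
    namespaces = sorted(ns.name for ns in ctrl.glob(f"{ctrl.name}n[0-9]*"))
    print(f"{ctrl.name}: model={model} serial={serial} namespaces={namespaces}")
```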

What kind of demand for faster hardware do you expect in the next five years?

Now and into the future, data centers of all shapes and sizes are constantly striving to achieve greater scalability, efficiencies and increased productivity and responsiveness with the best TCO. Business leaders and IT decision-makers must understand and navigate through the complexities of cloud, edge and hybrid on-prem data center technologies and architectures, which are increasingly being relied upon to support a growing and complex ecosystem of workloads, applications and AI/IoT datasets.

More than a decade ago, IT systems relied on software running on dedicated general-purpose systems for each application. This created many inefficiencies and scaling challenges, especially in large-scale system designs. Today, data dependence has been growing consistently and exponentially, which has forced data center architects to decouple applications from the systems. This was the birth of the HCI market and, now, the composable disaggregated infrastructure market.

Next-generation infrastructures are moving to disaggregated, pooled resources (e.g., compute, accelerators and storage) that can be dynamically composed to meet the ever increasing and somewhat unpredictable demands of the future. All of this allows us to make efficient use of hardware to increase infrastructure agility, scalability and software control, remove various system bottlenecks and improve overall TCO.

USB storage devices: Convenient security nightmares

There’s no denying the convenience of USB media. From hard drives and flash drives to a wide range of other devices, they offer a fast, simple way to transport, share and store data. However, from a business security perspective, their highly accessible and portable nature makes them a complete nightmare, with data leakage, theft, and loss all common occurrences.

Widespread remote working appears to have compounded these issues. According to new research, there has been a 123% increase in the volume of data downloaded to USB media by employees since the onset of COVID-19, suggesting many have used such devices to take large volumes of data home with them. As a result, there are hundreds of terabytes of potentially sensitive, unencrypted corporate data floating around at any given time, greatly increasing the risk of serious data loss.

Fortunately, effective implementation of USB control and encryption can significantly minimize that risk.

What is USB control and encryption?

USB control and encryption refers to the set of techniques and practices used to secure the access of devices to USB ports. These techniques and practices form a key part of endpoint security, helping protect both computer systems and sensitive data assets from loss, as well as from security threats (e.g., malware) that can be deployed via physically plugged-in USB devices.

There are numerous ways that USB control and encryption can be implemented. The most authoritarian approach is to block the use of USB devices altogether, either by physically covering endpoint USB ports or by disabling USB adapters through the operating system. While this is certainly effective, for the vast majority of businesses it simply isn’t a workable approach given the huge number of peripheral devices that rely on USB ports to function, such as keyboards, chargers, printers and so on.

Instead, a more practical approach is to combine less draconian physical measures with the use of encryption that protects sensitive data itself, meaning even if a flash drive containing such data is lost or stolen, its contents remain safe. The easiest (and usually most expensive) way to do this is by purchasing devices that already have robust encryption algorithms built into them.

A cheaper (but harder to manage) alternative is to implement and enforce specific IT policies governing the use of USB devices. This could either be one that only permits employees to use certain “authenticated” USB devices – whose file systems have been manually encrypted – or stipulating that individual files must be encrypted before they can be transferred to a USB storage device.

Greater control means better security

The default USB port controls offered as part of most operating systems tend to be quite limited in terms of functionality. Security teams can choose to leave them completely open, designate them as read-only, or fully disable them. However, for those wanting a more nuanced approach, a much greater level of granular control can be achieved with the help of third-party security applications and/or solutions. For instance, each plugged-in USB device is required to tell the OS exactly what kind of device it is as part of the connection protocol.

With the help of USB control applications, admins can use this information to limit or block certain types of USB devices on specific endpoint ports. A good example would be permitting the use of USB-connected mice via the port, but banning storage devices, such as USB sticks, that pose a much greater threat to security.

Some control applications go further still, allowing security teams to put rules in place that govern USB ports down to an individual level. This includes specifying exactly what kinds of files can be copied or transferred via a particular USB port or stipulating that a particular port can only be used by devices from a pre-approved whitelist (based on their serial number). Such controls can be extremely effective at preventing unauthorized data egress, as well as malicious actions like trying to upload malware via an unauthorized USB stick.
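
The device-class and whitelist logic described above can be approximated on Linux by reading the interface class code every USB device reports (0x03 for HID devices such as mice and keyboards, 0x08 for mass storage). The sketch below only classifies and reports; a real endpoint agent would also enforce the decision, and the serial number shown is hypothetical.

```python
# Minimal sketch of USB device-class filtering on Linux: flag mass-storage
# interfaces (class 0x08) unless the parent device's serial number is on an
# approved whitelist. Illustrative only, not a production endpoint agent.
from pathlib import Path

APPROVED_SERIALS = {"0401396d8B331E22"}    # hypothetical company-issued drives
MASS_STORAGE = "08"                        # USB interface class code for storage

for iface in Path("/sys/bus/usb/devices").glob("*:*"):      # e.g. "1-3:1.0"
    cls_file = iface / "bInterfaceClass"
    if not cls_file.exists() or cls_file.read_text().strip() != MASS_STORAGE:
        continue                                             # not a storage interface
    device_dir = iface.parent / iface.name.split(":")[0]     # parent device, e.g. "1-3"
    serial_file = device_dir / "serial"
    serial = serial_file.read_text().strip() if serial_file.exists() else "unknown"
    verdict = "allow" if serial in APPROVED_SERIALS else "BLOCK"
    print(f"{iface.name}: mass storage, serial={serial} -> {verdict}")
```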

A centrally controlled solution saves significant logistical headaches

It’s worth noting that a normal business network can contain hundreds, or even thousands of endpoints, each with one or more USB ports. As such, control and encryption solutions that can be managed centrally, rather than on an individual basis, are significantly easier to implement and manage. This is particularly true at this current point in time, where remote working protocols make it almost impossible to effectively manage devices any other way.

While portable USB drives and devices are seen as a quick, convenient way to transport or store data by employees, they often present a major headache for security professionals.

Fortunately, implementing USB control and encryption solutions can greatly improve the tools at a security team’s disposal to deal with such challenges and ensure both the network and sensitive company data remain protected at all times.

Cloud IT infrastructure spending grows, non-cloud investments plunge

Vendor revenue from sales of IT infrastructure products (server, enterprise storage, and Ethernet switch) for cloud environments, including public and private cloud, increased 2.2% in the first quarter of 2020 (1Q20) while investments in traditional, non-cloud, infrastructure plunged 16.3% year over year, according to IDC.

Pandemic as the major factor driving infrastructure spending

The broadening impact of the COVID-19 pandemic was the major factor driving infrastructure spending in the first quarter. Widespread lockdowns across the world and the staged reopening of economies triggered increased demand for cloud-based consumer and business services, driving additional demand for the server, storage, and networking infrastructure utilized by cloud service provider datacenters.

As a result, public cloud was the only deployment segment to escape year-over-year declines in 1Q20, reaching $10.1 billion in IT infrastructure spend, up 6.4% year over year. Spending on private cloud infrastructure declined 6.3% year over year in 1Q20 to $4.4 billion.

The pace set in the first quarter is expected to continue through the rest of the year, as cloud adoption gets an additional boost from demand for more efficient and resilient infrastructure deployment.

For the full year, investments in cloud IT infrastructure will surpass spending on non-cloud infrastructure and reach $69.5 billion or 54.2% of the overall IT infrastructure spend.

Spending on private cloud infrastructure expected to recover

Spending on private cloud infrastructure is expected to recover during the year and will compensate for the first quarter declines leading to 1.1% growth for the full year. Spending on public cloud infrastructure will grow 5.7% and will reach $47.7 billion representing 68.6% of the total cloud infrastructure spend.

Disparity in 2020 infrastructure spending dynamics for cloud and non-cloud environments will ripple through all three IT infrastructure domains – Ethernet switches, compute, and storage platforms.

Within cloud deployment environments, compute platforms will remain the largest category of cloud IT infrastructure spending at $36.2 billion, while storage platforms will be the fastest growing segment, with spending increasing 8.1% to $24.9 billion. The Ethernet switch segment will grow 3.7% year over year.

Vendor revenues by region

At the regional level, year-over-year changes in vendor revenues in the cloud IT Infrastructure segment varied significantly during 1Q20, ranging from 21% growth in China to a decline of 12.1% in Western Europe.

Long term, spending on cloud IT infrastructure is expected to grow at a five-year CAGR of 9.6%, reaching $105.6 billion in 2024 and accounting for 62.8% of total IT infrastructure spend.

Public cloud datacenters will account for 67.4% of this amount, growing at a 9.5% CAGR. Spending on private cloud infrastructure will grow at a CAGR of 9.8%. Spending on non-cloud IT infrastructure will rebound somewhat in 2020 but will continue declining with a five-year CAGR of -1.6%.
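
These forecasts follow from the standard compound-growth relation, ending value = base × (1 + CAGR)^years. As a quick sanity check (treating the figures above as given), the implied 2019 base works out to roughly $66.8 billion, consistent with the 2019 cloud infrastructure total IDC reports elsewhere in its tracker.

```python
# Sanity check of the headline figures: ending_value = base * (1 + rate) ** years.
forecast_2024_billion = 105.6
cagr = 0.096
years = 5          # 2019 -> 2024

implied_2019_base = forecast_2024_billion / (1 + cagr) ** years
print(f"Implied 2019 base: ${implied_2019_base:.1f}B")        # ~ $66.8B

# Forward check: compounding that base at 9.6% for five years recovers ~$105.6B.
print(f"Check: ${implied_2019_base * (1 + cagr) ** years:.1f}B")
```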

OPTIMUSCLOUD: Cost and performance efficiency for cloud-hosted databases

A Purdue University data science and machine learning innovator wants to help organizations and users get the most for their money when it comes to cloud-based databases. The same technology may also help self-driving vehicles, for which latency is the primary concern, operate more safely on the road.

Somali Chaterji, a Purdue assistant professor of agricultural and biological engineering who directs the Innovatory for Cells and Neural Machines [ICAN], and her team created a technology called OPTIMUSCLOUD.

A benefit for both cloud vendors and customers

The system is designed to help achieve cost and performance efficiency for cloud-hosted databases, rightsizing resources to benefit both cloud vendors, who do not have to aggressively over-provision their cloud-hosted servers for fail-safe operations, and clients, because the data center savings can be passed on to them.

“It also may help researchers who are crunching their research data on remote data centers, compounded by the remote working conditions during the pandemic, where throughput is the priority,” Chaterji said. “This technology originated from a desire to increase the throughput of data pipelines to crunch microbiome or metagenomics data.”

This technology works with the three major cloud database providers: Amazon’s AWS, Google Cloud, and Microsoft Azure. Chaterji said it would work with other more specialized cloud providers such as Digital Ocean and FloydHub, with some engineering effort.

It is benchmarked on Amazon’s AWS cloud computing services with the NoSQL technologies Apache Cassandra and Redis.

“Let’s help you get the most bang for your buck by optimizing how you use databases, whether on-premise or cloud-hosted,” Chaterji said. “It is no longer just about computational heavy lifting, but about efficient computation where you use what you need and pay for what you use.”

Handling long-running, dynamic workloads

Chaterji said current cloud technologies using automated decision making often only work for short, repetitive tasks and workloads. She said her team created an optimal configuration approach to handle long-running, dynamic workloads, whether from the ubiquitous sensor networks on connected farms, high-performance computing workloads from scientific applications, or the current COVID-19 simulations running in different parts of the world in the rush to find a cure for the virus.

“Our right-sizing approach is increasingly important with the myriad applications running on the cloud with the diversity of the data and the algorithms required to draw insights from the data and the consequent need to have heterogeneous servers that drastically vary in costs to analyze the data flows,” Chaterji said.

“The prices for on-demand instances on Amazon EC2 vary by more than a factor of five thousand, depending on the virtual machine instance type you use.”

Chaterji said OPTIMUSCLOUD has numerous applications for databases used in self-driving vehicles (where latency is a priority), healthcare repositories (where throughput is a priority), and IoT infrastructures in farms or factories.

OPTIMUSCLOUD: Using machine learning and data science principles

OPTIMUSCLOUD is software that runs with the database server. It uses machine learning and data science principles to develop algorithms that jointly optimize virtual machine selection and database management system options.

“Also, in these strange times when both traditionally compute-intensive laboratories such as ours and wet labs are relying on compute storage, such as to run simulations on the spread of COVID-19, throughput of these cloud-hosted VMs is critical and even a slight improvement in utilization can result in huge gains,” Chaterji said.

“Consider that currently, even the best data centers run at lower than 50% utilization and so the costs that are passed down to end-users are hugely inflated.”

“Our system takes a look at the hundreds of options available and determines the best one normalized by the dollar cost,” Chaterji said. “When it comes to cloud databases and computations, you don’t want to buy the whole car when you only need a tire, especially now when every lab needs a tire to cruise.”
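
The “best one normalized by the dollar cost” idea can be sketched very simply: given measured or predicted throughput and latency for each candidate VM type and database configuration, pick the combination that maximizes performance per dollar while meeting a latency ceiling. The following is an illustrative simplification, not the OPTIMUSCLOUD algorithm itself, and every name and number in it is hypothetical.

```python
# Illustrative simplification of cost-normalized configuration selection:
# choose the (VM type, DB config) pair with the best throughput per dollar
# that still meets a latency ceiling. Names and numbers are hypothetical.

candidates = [
    # (vm_type, db_config, ops_per_sec, p99_latency_ms, usd_per_hour)
    ("m5.large",   {"heap_gb": 4,  "compaction": "stcs"},  9_000, 12.0, 0.096),
    ("m5.2xlarge", {"heap_gb": 16, "compaction": "stcs"}, 31_000,  7.5, 0.384),
    ("m5.2xlarge", {"heap_gb": 16, "compaction": "lcs"},  27_000,  6.1, 0.384),
    ("r5.2xlarge", {"heap_gb": 24, "compaction": "stcs"}, 34_000,  6.8, 0.504),
]

LATENCY_CEILING_MS = 10.0

def best_config(candidates, latency_ceiling_ms):
    feasible = [c for c in candidates if c[3] <= latency_ceiling_ms]
    # Normalize performance by cost: operations per second per dollar per hour.
    return max(feasible, key=lambda c: c[2] / c[4])

vm, cfg, ops, p99, price = best_config(candidates, LATENCY_CEILING_MS)
print(f"Pick {vm} + {cfg}: {ops / price:,.0f} ops/s per $/hr at p99={p99} ms")
```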

Global revenue from OCP infrastructure market to reach $33.8 billion in 2024

Worldwide revenue from the Open Compute Project (OCP) infrastructure market will reach $33.8 billion in 2024, according to IDC.

While year-over-year growth will slow slightly in 2020 due to capital preservation strategies during the COVID-19 situation, the market for OCP compute and storage infrastructure is forecast to see a compound annual growth rate (CAGR) of 16.6% over the 2020-2024 forecast period.

The forecast assumes a rapid recovery for this market in 2021-22, fueled by a robust economic recovery worldwide. However, a prolonged crisis and economic uncertainty could delay the market’s recovery well past 2021, although investments in and by cloud service providers may dominate whatever infrastructure investments do occur during this period.

“By opening and sharing the innovations and designs within the community, IDC believes that OCP will be one of the most important indicators of datacenter infrastructure innovation and development, especially among hyperscalers and cloud service providers,” said Sebastian Lagana, research manager, Infrastructure Systems, Platforms and Technologies.

“IDC projects massive growth in the amount of data generated, transmitted, and stored worldwide. Much of this data will flow in and out of the cloud and get stored in hyperscale cloud data centers, thereby driving demand for infrastructure,” said Kuba Stolarski, research director, Infrastructure Systems, Platforms and Technologies at IDC.

OCP technology by segment

The compute segment will remain the primary driver of overall OCP infrastructure revenue for the coming five years, accounting for roughly 83% of the total market. Despite being a much larger portion of the market, compute will achieve a CAGR comparable to storage through 2024. The compute and storage segments are defined below:

  • Compute: Spend on computing platforms (i.e., servers including accelerators and interconnects) is estimated to grow at a five-year CAGR of 16.2% and reach $28.07 billion. This segment includes externally attached accelerator trays also known as JBOGs (GPUs) and JBOFs (FPGAs).
  • Storage: Spend on storage (i.e., server-based platforms and externally attached platforms and systems) is estimated to grow at a five-year CAGR of 18.5% and reach $5.73 billion. Externally attached platforms are also known as JBOFs (Flash) and JBODs (HDDs) and do not contain a controller. Externally attached systems are built using storage controllers.

Buyer type highlights

OCP Board Member purchases make up the bulk of the OCP infrastructure market and are poised to grow at a 14.8% CAGR through 2024, when they will account for just under 75% of the total market.

Conversely, non-member spending is projected to increase at a five-year CAGR of 23.2% and will expand its share of the OCP infrastructure market by just over 600 basis points during that period.

In terms of end user type, hyperscalers account for the largest portion of the market at just over 78% in 2019 and are projected to expand spending at a 14.2% CAGR through 2024, although this will result in erosion of total share.

Conversely, non-hyperscaler purchases will expand at a 23.8% CAGR over the same period, increasing this group’s market share by approximately 650 basis points from 2019 to 2024.

Total end-user spending on IT infrastructure products recovers

Total end-user spending on IT infrastructure products (server, enterprise storage, and Ethernet switch) for cloud environments, including public and private cloud, recovered in the fourth quarter of 2019 (4Q19) after two consecutive quarters of decline, according to IDC.

The 12.4% year-over-year growth in 4Q19 yielded $19.4 billion in spending. The fourth quarter results also brought the full year into positive territory with annual growth of 2.1% and total spending of $66.8 billion for 2019.

Meanwhile, the overall IT infrastructure market continued to struggle after its strong performance in 2018, up 3.3% to $38.1 billion in 4Q19 but declining 1.1% to $134.4 billion for the full year. Non-cloud IT infrastructure fell 4.6% to $18.7 billion for the quarter and declined 4.1% to $67.7 billion for the year.

4Q19 cloud IT infrastructure market results

In 4Q19, growth in spending on cloud IT infrastructure was driven by the public cloud segment, which grew 14.5% year over year to $13.3 billion; private cloud grew 8.2% to $6.1 billion.

Although the overall segment is generally trending up, it tends to be volatile at the quarterly level, as a significant part of the public cloud IT segment is represented by a few hyperscale service providers. After a weaker middle part of the year, public cloud ended 2019 barely up, growing 0.1% to $45.2 billion. Private cloud grew 6.6% in 2019 to $21.6 billion.

As investments in cloud IT infrastructure continue to increase, with some swings up and down in the quarterly intervals, the IT infrastructure industry is approaching the point where spending on cloud IT infrastructure consistently surpasses spending on non-cloud IT infrastructure.

The fourth quarter of 2019 marked the third consecutive quarter of cloud IT leadership with the annual share just slightly below the midpoint (49.7%). From here on out, cloud IT infrastructure is expected to stay above 50% of the IT Infrastructure market at both the quarterly and annual levels, reaching 60.5% annually in 2024.

Across the three IT infrastructure technology domains, storage platforms saw the fastest year-over-year growth in 4Q19 at 15.1% with spending reaching $6.6 billion. Compute platforms grew 14.5% year over year with $10.8 billion in spending while Ethernet switches declined 3.9% to $2.0 billion.

For the full year 2019, Ethernet switches led with year-over-year growth of 5.0% and $8.2 billion in spending, followed by storage platforms with 1.9% growth and spending of $23.1 billion, and compute platforms with growth of 1.5% and spending of $35.5 billion.

4Q19 cloud IT infrastructure market forecast

The forecast for 2020, after taking into consideration the repercussions of the COVID-19 pandemic and its ensuing economic crisis, is for $69.2 billion in cloud IT infrastructure spending, a 3.6% predicted annual increase over 2019.

Non-cloud IT infrastructure spending is expected to decline 9.2% to $61.4 billion in 2020. Together, overall IT infrastructure spending is expected to decline 2.9% to $130.6 billion.

The COVID-19 pandemic represents a severe threat to global growth. Prior to the outbreak, global real GDP growth for 2020 was expected to be a lackluster 2.3% (at market exchange rates).

The emergence of the epidemic in China was a game changer, and the expected growth for 2020 is now -0.2%, the slowest rate since the global financial crisis. The negative effect on growth will come via both demand and supply channels.

On one hand, quarantine measures, illness, and negative consumer and business sentiment will suppress demand in specific areas, while certain pockets of demand will surface, such as cloud platforms for communication and collaboration workloads.

At the same time, closure of some factories and disruption to supply chains will create supply bottlenecks. These effects are expected to be distributed unevenly across the market landscape.

“While the beginning of 2020 was marked by supply chain issues that should be resolved before the end of the second quarter, the negative economic impact will hit enterprise customers’ CAPEX spending,” said Kuba Stolarski, research director, Infrastructure Systems, Platforms and Technologies at IDC.

“As enterprise IT budgets tighten through the year, public cloud will see an increase in demand for services. This increase will come in part from the surge of work-from-home employees using online collaboration tools, but also from workload migration to public cloud as enterprises seek ways to save money for the current year. Once the coast is clear of coronavirus, IDC expects some of this new cloud service demand to remain sticky going forward.”

A new five-year forecast predicts cloud IT infrastructure spending will reach $100.1 billion in 2024 with a compound annual growth rate (CAGR) of 8.4%. Non-cloud IT infrastructure spending will decline slightly to $65.3 billion with a -0.7% CAGR. Total IT infrastructure is forecast to grow at a 4.2% CAGR and produce $165.4 billion in spending in 2024.

21% of SMBs do not have a data backup or disaster recovery solution in place

58% of C-level executives at small and medium businesses (SMBs) said their biggest data storage challenge is security vulnerability, according to Infrascale.

The research, conducted in March 2020, is based on a survey of more than 500 C-level executives. CEOs represented 87% of the group. Almost all of the remainder was split between CIOs and CTOs.

“Our research indicates that 21% of SMBs do not have data protection solutions in place,” said Russell P. Reeder, CEO of Infrascale. “That’s a problem, because every modern company depends on data and operational uptime for its very survival. And this has never been more important than during the unprecedented times we are currently facing.”

Data protection means different things to different people

Certain aspects of data protection are more important than others depending upon an individual’s unique experiences and position. But data protection clearly delivers significant value from many vantage points.

When asked what data protection means to them, 61% of the survey group named data security and encryption. The same share said data backup. Nearly as many (59%) defined data protection as data recovery, while 54% cited anti-malware services.

Forty-six percent said data protection addresses email protection. Data archiving and the ability to become operational quickly after a disaster each captured 45% of the survey group’s vote.

Meanwhile, 44% of the group said data protection means ransomware protection/mitigation. The same share named physical device protection for endpoints such as laptops and mobile phones. And 32% said that for them data protection involves processes that prevent user error.

“Data protection can come into play in a wide array of important ways – including data security and encryption, data recovery, email protection and data archiving. It also provides the ability to recover quickly from a disaster, protection from and mitigation of ransomware, and physical device protection. Plus, it can prevent user error,” said Reeder.

“All of the above are valuable for businesses. These benefits contribute to the success of many businesses today, and implementing data protection to these ends will better position organizations for the future.”

Opinions about data protection vary by industry as well

The research suggests there is significant variation in what top executives from different sectors consider the most important aspects of data protection.

In the legal space, 89% of executives said data protection provides data security and encryption. Seventy-one percent of top leaders in the healthcare sector agreed. Data security and encryption was also the top answer among retail/ecommerce and telecommunications leaders, although with lower shares – 67% and 52%, respectively.

Top executives in education see data backup and data recovery as the most important aspects of data protection. Sixty-one percent of this group said they hold this belief. For 57% of the top leaders in accounting, banking or finance, data backup is the key concern in data protection.

Cyberattacks are SMB leaders’ top overall data protection concern

The overall survey group said cyberattacks are the biggest data protection issue their companies are facing. Nearly half (49%) of the group voiced their concern about hacking.

Micro disasters such as corrupted hard drives and malware infections were the second most commonly indicated concern, garnering a 46% share from the group. System crashes (41%), data leaks (39%), ransomware attacks (38%), and human errors (38%) were next on the list.

There was some variation in sector response here as well. Top leaders in education (64%), telecommunications (63%) and healthcare (54%) said that micro disasters are their biggest data protection issues.

But more than half of the survey respondents in both the retail (54%) and financial sectors (53%) said cyberattacks such as ransomware are their leading data protection challenges.

“Cyberattacks like ransomware are a major challenge for businesses today,” said Reeder. “But organizations can put defensive measures in place to lower their susceptibility to attack.”

Most SMBs have data protection in place, but those that don’t remain unprotected

Views about data protection definitions – and what is most important to the protection of SMB data – may vary. But most SMBs clearly believe it is important to have a data protection and/or backup and disaster recovery solution in place, as 79% of the survey group said they already do.

However, while the majority has taken steps to protect data, the remainder – which represents a significant share at 21% – clearly has not. And 13% of SMB C-level executives said they do not have any data protection strategy in place. That leaves these businesses vulnerable.

“Each organization is different,” said Reeder. “But one thing all businesses have in common is a desire to eradicate downtime and data loss. Organizations can and should protect their data, and their businesses as a whole, by enabling comprehensive data protection with modern backup and disaster recovery solutions and strategies.”

Interconnectivity and networking predictions for 2020 and beyond

Traditional networking and interconnectivity approaches are not keeping up with the pressures being placed on computer networks, according to Stateless.

The challenges of on-demand compute and storage, the migration of enterprise workloads across multiple cloud services, the imminence of 5G and more, all require changes in the way networks are built, managed and how they grow.

Encryption everywhere

Security experts are discovering that there’s no such thing as a trusted network. Forty-one percent of organizations have seen an increase in internally-sourced threats. Data encryption can help reduce these threats significantly.

While encryption has typically been the job of firewalls, new lightweight microservice-based solutions are emerging that will fortify security by allowing the encryption of every network connection.

Multi-cloud automation

Driven partly by demand and partly by ongoing technical innovation, hyperscale clouds will reach a state of interoperability that will unlock the ability of end-users to achieve automation across multiple clouds.

IoT becomes the norm

Instead of being a mildly interesting statistic to track, the Internet of Things will become the standard operating model for gathering information. IoT was the final frontier of the Information Age, and all that remains is its colonization with near-edge data centers and 5G networks.

Colocation providers become next-gen network operators

The world has known two primary types of network operators: legacy wireline and wireless. This model worked well when data needed to be sent from point A to point Z. Now data lives at point B, point C and every point in-between. Those points in-between are data centers.

It makes no sense for businesses to connect to a carrier so they can take the data to the carrier’s gateway and then take it to the data center. Data center operators are recognizing this and are building their networks to bypass the legacy carriers – and the Internet, too.

DevOps teams in every enterprise

Over the last few years, industry leaders have been telling IT that they need to step up and help their companies take advantage of the latest technology. Now a new type of skill set is emerging: DevOps.

DevOps engineers focus specifically on how to leverage not only new IT technologies but also new advances in cloud, software and automation. DevOps teams will become the new creators of innovation in the corporate world.

Named data networking

Today, networking technologies are focused on the transfer of ones and zeros from one point to another. The network has limited awareness of the content that it carries.

Humans care about the content, not the bits and bytes. Technologies like NDN, blockchain and the separation of state from processing will enable networks to become queryable information stores.

5G displaces last mile access

Businesses have been held hostage by the requirement to select locations based on the reach of wireline connectivity. If a building is not connected, wireline operators charge high fees for network build-outs.

Also, public utility commissions have stifled network builds by charging exorbitant fees for access to rights of way to lay cables. This situation is all about to change.

5G will provide users with much more than the ability to stream full 4K-quality movies to their cell phones. Offering speeds that will rival fiber connectivity, 5G networks will allow businesses and wireless operators to connect to each other in ways previously unfathomable.

The network is storage

The network stores data today, but the data is only stored while it is in transit, and the network is not aware of the contents of the information it is storing. NDN will allow networks to be content-aware. Network operators will be able to cache information.

When multiple users request data, the network will be able to fulfill those requests without having to pull the data from the source. Data will live on the network until it is no longer needed.
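
A toy model of that idea, written without reference to any particular NDN implementation: consumers request content by name rather than by host address, and any node holding a cached copy answers without going back to the origin.

```python
# Toy model of name-based, in-network caching: requests are keyed by content
# name, and a node serves cached copies so repeat requests never reach the
# origin. Purely illustrative; not tied to any real NDN implementation.

class CachingNode:
    def __init__(self, origin_fetch):
        self.content_store = {}           # name -> data cached at this node
        self.origin_fetch = origin_fetch  # fallback to the data source
        self.origin_hits = 0

    def request(self, name):
        if name in self.content_store:
            return self.content_store[name]     # served from the network
        data = self.origin_fetch(name)          # pull from the source once
        self.origin_hits += 1
        self.content_store[name] = data
        return data

node = CachingNode(origin_fetch=lambda name: f"<data for {name}>")
for _ in range(3):
    node.request("/videos/launch/keynote.mp4")
assert node.origin_hits == 1    # later requests were fulfilled from the cache
```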

Seagate Lyve Drive Mobile System: Manage enterprise, cloud, and edge data growth

Today at CES 2020, Seagate introduced a modular storage solution to manage the surge of enterprise, cloud, and edge data. The Seagate Lyve Drive Mobile System is a portfolio of simple, secure, and efficient data management solutions built to activate the datasphere.

Powered by IT 4.0 (the fourth wave of industrial revolution within IT), in which connected homes, connected cities, AI-powered factories, autonomous vehicles, and media and entertainment content drive explosive data growth, the global datasphere is forecast to grow from 41 zettabytes in 2019 to 175 zettabytes by 2025, according to IDC. Nearly 30% of that data will need real-time processing. Seagate’s Lyve Drive system enables efficient and cost-effective movement of this data between the enterprise, the cloud, and the edge.

“Data empowers those who can harness and activate it,” said Jeff Fochtman, vice president of marketing for Seagate. “However, today’s data management tools are too costly and inefficient for businesses to tap into data’s full value. Lyve Drive is Seagate’s first step toward a unified data experience, which will turn data’s possibilities into tangible growth for the world’s most critical industries.”

Seagate Lyve Drive Mobile System

Developed to address the growing need to move massive amounts of data from endpoints to edge to core, Seagate’s Lyve Drive Mobile System is a collection of modular storage solutions built to help businesses be more efficient and grow. During CES, Seagate will showcase the system and several key product concepts in the line including:

Lyve Drive Cards and Lyve Drive Card Reader – High capacity, high-performance 1TB CFexpress cards and a portable card reader for ingesting endpoint data sources.

Lyve Drive Shuttle – An autonomous data storage and transport solution for easy ingestion from direct-attached, network-attached, and other external storage devices. It offers up to 16TB of capacity depending on HDD or SSD configuration, and an e-ink touchscreen display to copy files directly without a PC.

Lyve Drive Mobile Array – A sealed, high-performance, 6-bay array that is ruggedized and easy to transport. The mobile array displayed at CES will feature 6 of Seagate’s 18TB Exos HAMR (heat-assisted magnetic recording) hard drives for a total capacity of 108TB.

Lyve Drive Modular Array – A high-performance 4-bay array with flexible configuration so businesses can build what they need for a particular workflow. The high-capacity modular array displayed at CES will include Seagate’s Exos 2X14 enterprise hard drive, which is the first to integrate Seagate’s groundbreaking MACH.2 multi-actuator technology.

Lyve Drive Rackmount Receiver – A high-performance 4U rackmount ingestion hub for the datacenter that accepts two Lyve Drive arrays for high-speed data transfer directly into a data center fabric without the need for cables.