Cloud adoption was already strong heading into 2020. According to a study by O’Reilly, 88% of businesses were using the cloud in some form in January 2020. The global pandemic only accelerated the move to SaaS tools. This seismic shift in where businesses operate day-to-day means a massive amount of business data is making its way into the cloud.
All this data is critical for core business functions. However, it is all too often mistakenly considered “safe” thanks to blind trust in the SaaS platform. In reality, human error, cyberattacks, platform updates and software integrations can all easily compromise or erase that data – and even destroy a business.
According to Microsoft, 94% of businesses report security benefits since moving to the cloud. Although there are definitely benefits, data is by no means fully protected – and the threat to cloud data continues to rise, especially as it ends up spread across multiple applications.
Organizations continue to overlook the simple steps they can take to better protect cloud data and their business. In fact, our 2020 Ecommerce Data Protection Survey found that one in four businesses has already experienced data loss that immediately impacted sales and operations.
Cloud data security illusions
Many companies confuse cloud storage with cloud backup. Cloud storage is just that – you’ve stored your data in the cloud. But what if, three years later, you need a record of that data and how it was moved or changed for an audit? What if you are the target of a cyberattack and suddenly your most important data is no longer accessible? What if you or an employee accidentally deletes all the files tied to your new product line?
Simply storing data in the cloud does not mean it is fully protected. The ubiquity of cloud services like Box, Dropbox, Microsoft 365, Google G Suite/Drive, etc., has created the illusion that cloud data is protected and easily accessible in the event of data loss. Yet even the most trusted providers manage data according to the Shared Responsibility Model.
The same goes for increasingly popular business apps like BigCommerce, GitHub, Shopify, Slack, Trello, QuickBooks Online, Xero, Zendesk and thousands of other SaaS applications. Cloud service providers only fully protect system-level infrastructure and data. So while they ensure reliability and recovery for system-wide failures, the cloud app data of individual businesses is still at risk.
In the current business climate, human errors are even more likely. With the pandemic increasing the amount of remote work, employees are navigating constant distractions tied to health concerns, increasing family needs and an inordinate amount of stress.
Complicating things further, many online tools do not play nicely with each other. APIs and integrations can be a challenge when trying to move or share data between apps. Without a secure backup, one cyberattack, failed integration, faulty update or click of the mouse could wipe out the data a business needs to survive.
While top SaaS platforms continue to expand their security measures, data backup and recovery is missing from the roadmap. Businesses need to take matters into their own hands.
Current cloud backup best practices
In its most rudimentary form, a traditional cloud backup essentially makes a copy of cloud data to support business continuity and disaster recovery initiatives. Proactively protecting cloud data ensures that if business-critical data is compromised, corrupted, deleted or rendered inaccessible, the business still has immediate access to a comprehensive, usable copy of the data it needs to avoid disruption.
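In its simplest form, that “copy of cloud data” can be as basic as pulling a SaaS export into timestamped, checksummed storage. The sketch below illustrates the idea; the directory layout and manifest format are assumptions for illustration, not any vendor’s backup product:

```python
import hashlib
import shutil
from datetime import datetime, timezone
from pathlib import Path

def backup_export(export_dir: str, backup_root: str) -> Path:
    """Copy a SaaS data export into a timestamped backup folder and
    record a SHA-256 manifest so the copy can be verified on restore."""
    src = Path(export_dir)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    dest = Path(backup_root) / stamp
    shutil.copytree(src, dest)

    # Write a simple checksum manifest for later integrity checks.
    with open(dest / "MANIFEST.sha256", "w") as manifest:
        for f in sorted(dest.rglob("*")):
            if f.is_file() and f.name != "MANIFEST.sha256":
                digest = hashlib.sha256(f.read_bytes()).hexdigest()
                manifest.write(f"{digest}  {f.relative_to(dest)}\n")
    return dest
```

Each run lands in its own timestamped folder, which gives a crude version history – the audit-trail and point-in-time recovery scenarios described above are exactly what homegrown scripts like this struggle to cover at scale.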
From multi-level user access restrictions and password managers to regularly scheduled manual downloads, there are many basic (if tedious) ways for businesses to better protect their cloud data. Some companies have invested in building more robust backup solutions to keep their cloud business data safe. However, homegrown backup solutions are costly and time-intensive, as they require constant updates to keep pace with ever-changing APIs.
In contrast, third-party backup solutions can provide an easier-to-manage, cost- and time-efficient way to protect cloud data. There is a wide range of offerings, though – some more reputable and secure than others. Any time business data is entrusted to a third party, the reputability and security of that vendor must take center stage. If they have your data, they need to protect it.
Cloud backup providers need to meet stringent security and regulatory requirements, so look for explicit details about how they secure your data. As business data continues to move to the cloud, storage limits, increasingly complex integrations and new security concerns will heighten the need for comprehensive cloud data protection.
The trend of business operations moving to the cloud started long before the quarantine. Nevertheless, the cloud storage and security protocols most businesses currently rely on to protect cloud data are woefully insufficient.
Critical business data used to be stored (and secured) in a central location. Companies invested significant resources to manage walls of servers. With SaaS, everything is in the cloud and distributed – apps running your store, your account team, your mailing list, your website, etc. Business data in the backend of each SaaS tool looks very different and isn’t easily transferable.
All the data has become decentralized, and most backups can’t keep pace. It isn’t a matter of “if” a business will one day have a data loss event, it’s “when”. We need to evolve cloud backups into a comprehensive, distributed cloud data protection platform that secures as much business-critical data as possible across various SaaS platforms.
As businesses begin to rethink their approach to data protection in the cloud era, business backups will need to alleviate the worry tied to losing data – even in the cloud. True business data protection means not worrying about whether an online store will be taken out, whether a third-party app will cause problems, whether an export is fully up to date, where your data is stored, whether it is compliant, or whether you have all of the information needed to fully (and easily) get apps back up and running after an issue.
Delivering cohesive cloud data protection, regardless of which application it lives in, will help businesses break free from backup worry. The next era of cloud data protection needs to let business owners and data security teams sleep easier.
One Identity released a global survey that reveals attitudes of IT and security teams regarding their responses to COVID-19-driven work environment changes. The results shed insight into IT best practices that have emerged in recent months, and how organizations rushed to adopt them to maintain a secure and efficient virtual workplace.
Cloud computing has been a lifesaver
99% of IT security professionals said their organizations transitioned to remote work because of COVID-19, and only a third described that transition as “smooth.” 62% of respondents indicated that cloud infrastructure is more important now than 12 months ago.
Thirty-one percent attributed this shift directly to COVID-19. The cloud has become front and center to the new working reality, creating flexibility for employees.
These results demonstrate that the previous level of attention to cloud deployments, while notable, does not appear to have been nearly enough to accommodate the dramatic computing shift across organizations.
“This research makes it clearly evident that cloud computing has been a lifesaver for many enterprises as IT teams pivoted and supported the massive shift to working away from offices,” said Darrell Long, president and general manager at One Identity.
“While we knew the pandemic-driven changes were sudden, what was particularly notable was how strongly the results proved that organizations had to turn their focus on the immediate challenges presented by the aggressive move to cloud computing, chiefly finding solutions that streamlined administering and securing who has access to what and how.”
Higher priority on access request technologies
Shifts in priorities indicate organizations are turning their focus on tackling the security basics. When compared to 12 months ago, 50% of respondents are placing a higher priority on access request technologies, and 31% said this change in prioritization is because of COVID.
Identity/access lifecycle management, identity process and workflow, and role management all saw increased priority among at least half of those surveyed.
Perhaps shell-shocked, only 45% of IT security professionals indicated they are prepared for the IT changes necessary when their employees move back into their organizations’ offices. Yet 66% expressed increased confidence in the effectiveness of their identity management programs following the COVID-driven changes.
“We now know the truth: the COVID pandemic did not change the need to be productive, nor did it change the regulatory compliance requirements companies face, but clearly IT and security teams scrambled to shift their systems to accommodate work from home in a secure and controlled way,” said Long.
“Companies and organizations were helped to an extent by pre-COVID cloud investments. However, most of them are still dealing with new challenges as employees adapt: IT and security teams must provide effective processes for gaining access to the resources the workforce needs to do their jobs, while also addressing the security challenges of this new working environment.”
Many companies tend to jump into the cloud before thinking about security. They may think they’ve thought about security, but when moving to the cloud, the whole concept of security changes. The security model must transform as well.
Moving to the cloud and staying secure
Most companies maintain a “castle, moat, and drawbridge” attitude to security. They put everything inside the “castle” (datacenter); establish a moat around it, with sharks and alligators, guns on turrets; and control access by raising the drawbridge. The access protocol involves a request for access that is vetted against firewall rules, after which access is granted or denied. That’s perimeter security.
When moving to the cloud, perimeter security is still important, but identity-based security is available to strengthen the security posture. That’s where a cloud partner skilled at explaining and operating a different security model is needed.
Anybody can grab a virtual machine, build the machine in the cloud, and be done, but establishing a VM and transforming the machine to a service with identity-based security is a different prospect. When identity is added to security, the model looks very different, resulting in cost savings and an increased security posture.
Advanced technology, cost of security, and lack of cybersecurity professionals place a strain on organizations. Cloud providers invest heavily in infrastructure, best-in-class tools, and a workforce uniquely focused on security. As a result, organizations win operationally, financially, and from a security perspective, when moving to the cloud. To be clear, moving applications and servers, as is, to the cloud does not make them secure.
Movement to the cloud should be a standardized process, ideally run through a Cloud Center of Excellence (CCoE) or Cloud Business Office (CBO); when that process puts security first, organizations can reap the security benefits.
Although security is marketed as a shared responsibility in the cloud, the owner of the data (the customer) is ultimately responsible, and that responsibility is non-transferable. In short, the customer must understand the responsibility matrix (RACI) involved in accomplishing their end goals. Every cloud provider publishes a shared responsibility matrix, but organizations often misunderstand the responsibilities, or the lines fall into a grey area. Regardless of the responsibility model, the data owner must protect the information and systems. As a result, the enterprise must maintain an understanding of all stakeholders, their responsibilities, and their status.
When choosing a partner, it’s vital for companies to identify their exact needs, their weaknesses, and even their culture. No cloud vendor will cover it all from the beginning, so it’s essential that organizations take control and ask the right questions (see Cloud Security Alliance’s CAIQ), in order to place trust in any cloud provider. If it’s to be a managed service, for example, it’s crucial to ask detailed questions about how the cloud provider intends to execute the offering.
It’s important to develop a standard security questionnaire and probe multiple layers deep into the service model until the provider is unable to meet the need. Looking through a multilayer deep lens allows the customer and service provider to understand the exact lines of responsibility and the details around task accomplishment.
It might sound obvious, but it’s worth stressing: trust is a shared responsibility between the customer and cloud provider. Trust is also earned over time and is critical to the success of the customer-cloud provider relationship. That said, zero trust is a technical term meaning that, from a technology viewpoint, you assume danger and breach. Organizations must trust their cloud provider but should avoid blind trust and validate. Trust as a Service (TaaS) is a newer acronym that refers to third-party endorsement of a provider’s security practices.
Key influencers of a customer’s trust in their cloud provider include:
- Data location
- Investigation status and location of data
- Data segregation (keeping cloud customers’ data separated from others)
- Privileged access
- Backup and recovery
- Regulatory compliance
- Long-term viability
A TaaS example: Google Cloud
Google has taken great strides to earn customer trust, designing the Google Cloud Platform with a keen eye on zero trust, implemented through its BeyondCorp model. For example, Google has implemented two core concepts:
- Delivery of services and data: ensuring that people with the correct identity and the right purpose can access the required data every time
- Prioritization and focus: access and innovation are placed ahead of threats and risks, meaning that as products are innovated, security is built into the environment
Transparency is very important to the trust relationship. Google has enabled transparency through strong visibility and control of data. When evaluating cloud providers, understanding their transparency related to access and service status is crucial. Google ensures transparency by using specific controls including:
- Limited data center access from a physical standpoint, adhering to strict access controls
- Disclosing how and why customer data is accessed
- Incorporating a process of access approvals
Multi-layered security for a trusted infrastructure
Finally, cloud services must give customers an understanding of how each layer of infrastructure works and build rules into each layer. This includes operational and device security, encryption of data at rest, multiple layers of identity, and finally storage services, all multi-layered and secure by default.
Cloud native companies have a security-first approach and naturally have a higher security understanding and posture. That said, when choosing a cloud provider, enterprises should always understand, identify, and ensure that their cloud solution addresses each one of their security needs, and who’s responsible for what.
Essentially, every business must find a cloud partner that can answer all the key questions, provide transparency, and establish a trusted relationship in the zero trust world where we operate.
IT leaders are increasingly concerned that accelerated digital transformation, combined with the complexity of modern multicloud environments, is putting already stretched digital teams under too much pressure, a Dynatrace survey of 700 CIOs reveals.
This leaves little time for innovation, and limits teams’ ability to prioritize tasks that drive greater value and better outcomes for the business and its customers.
- 89% of CIOs say digital transformation has accelerated in the last 12 months, and 58% predict it will continue to speed up.
- 86% of organizations are using cloud-native technologies, including microservices, containers, and Kubernetes, to accelerate innovation and achieve more successful business outcomes.
- 63% of CIOs say the complexity of their cloud environment has surpassed human ability to manage.
- 44% of IT and cloud operations teams’ time is spent on manual, routine work just ‘keeping the lights on’, costing organizations an average of $4.8 million per year.
- 56% of CIOs say they are almost never able to complete everything the business needs from IT.
- 70% of CIOs say their team is forced to spend too much time doing manual tasks that could be automated if only they had the means.
“The benefits of IT and business automation extend far beyond cost savings. Organizations need this capability – to drive revenue, stay connected with customers, and keep employees productive – or they face extinction,” said Bernd Greifeneder, CTO at Dynatrace.
“Increased automation enables digital teams to take full advantage of the ever-growing volume and variety of observability data from their increasingly complex, multicloud, containerized environments. With the right observability platform, teams can turn this data into actionable answers, driving a cultural change across the organization and freeing up their scarce engineering resources to focus on what matters most – customers and the business.”
Cloud environment complexity
- Organizations are using cloud-native technologies including microservices (70%), containers (70%) and Kubernetes (54%) to advance innovation and achieve more successful business outcomes.
- However, 74% of CIOs say the growing use of cloud-native technologies will lead to more manual effort and time spent ‘keeping the lights on’.
Traditional tools and manual effort cannot keep up
- On average, organizations are using 10 monitoring solutions across their technology stacks. However, digital teams only have full observability into 11% of their application and infrastructure environments.
- 90% of CIOs say there are barriers preventing them from monitoring a greater proportion of their applications.
- The dynamic nature of today’s hybrid, multicloud ecosystems amplifies complexity. 61% of CIOs say their IT environment changes every minute or less, while 32% say their environment changes at least once every second.
CIOs call for radical change
- 74% of CIOs say their organization will lose its competitive edge if IT is unable to spend less time ‘keeping the lights on’.
- 84% said the only effective way forward is to reduce the number of tools and the amount of manual effort IT teams invest in monitoring and managing the cloud and user experience.
- 72% said they cannot keep plugging monitoring tools together to maintain observability. Instead, they need a single platform covering all use cases and offering a consistent source of truth.
Observability, automation, and AI are key
- 93% of CIOs said AI-assistance will be critical to IT’s ability to cope with increasing workloads and deliver maximum value to the business.
- CIOs expect automation in cloud and IT operations will reduce the amount of time spent ‘keeping the lights on’ by 38%, saving organizations $2 million per year, on average.
- Despite this advantage, just 19% of all repeatable operations processes for digital experience management and observability have been automated.
“History has shown successful organizations use disruptive moments to their advantage,” added Greifeneder. “Now is the time to break silos, establish a true BizDevOps approach, and deliver agile processes across a consistent, continuous delivery stack.
“This is essential for effective and intelligent automation and, more importantly, to enable engineers to take more end-to-end responsibility for the outcomes and value they create for the business.”
Today’s organizations desire the accessibility and flexibility of the cloud, yet these benefits ultimately mean little if you’re not operating securely. One misconfigured server and your company may be looking at financial or reputational damage that takes years to overcome.
Fortunately, there’s no reason why cloud computing can’t be done securely. You need to recognize the most critical cloud security challenges and develop a strategy for minimizing these risks. By doing so, you can get ahead of problems before they start, and help ensure that your security posture is strong enough to keep your core assets safe in any environment.
With that in mind, let’s dive into the five most pressing cloud security challenges faced by modern organizations.
1. The perils of cloud migration
According to Gartner, the shift to cloud computing will generate roughly $1.3 trillion in IT spending by 2022. The vast majority of enterprise workloads are now run on public, private or hybrid cloud environments.
Yet if organizations heedlessly race to migrate without making security a primary consideration, critical assets can be left unprotected and exposed to potential compromise. To ensure that migration does not create unnecessary risks, it’s important to:
- Migrate in stages, beginning with non-critical or redundant data. Mistakes are often more likely to occur earlier in the process, so begin with data whose corruption or loss would not seriously damage the enterprise.
- Fully understand your cloud provider’s security practices. Go beyond “trust by reputation” and really dig into how your data is stored and protected.
- Maintain operational continuity and data integrity. Once migration occurs, it’s important to ensure that controls are still functioning and there is no disruption to business operations.
- Manage risk associated with the lack of visibility and control during migration. One effective way to manage risk during transition is to use breach and attack simulation software. These automated solutions launch continuous, simulated attacks to view your environment through the eyes of an adversary by identifying hidden vulnerabilities, misconfigurations and user activity that can be leveraged for malicious gain. This continuous monitoring provides a significant advantage during migration – a time when IT staff are often stretched thin, learning new concepts and operating with less visibility into key assets.
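The staged-migration guidance above can be sketched as a simple planning helper that orders datasets so the least critical move first. The criticality scores and wave structure here are illustrative assumptions, not a prescribed methodology:

```python
def migration_waves(datasets):
    """Group datasets into migration waves, least critical first.

    datasets: list of (name, criticality) tuples, where 1 = least
    critical. Returns a list of waves (lists of dataset names), so
    early mistakes only touch low-impact data.
    """
    ordered = sorted(datasets, key=lambda d: d[1])
    waves = {}
    for name, criticality in ordered:
        waves.setdefault(criticality, []).append(name)
    return [waves[level] for level in sorted(waves)]
```

For example, `migration_waves([("orders", 3), ("logs", 1), ("marketing", 1)])` schedules the logs and marketing data ahead of the critical orders data.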
2. The need to master identity and access management (IAM)
Effectively managing and defining the roles, privileges and responsibilities of various network users is a critical objective for maintaining robust security. This means giving the right users the right access to the right assets in the appropriate context.
As workers come and go and roles change, this mandate can be quite a challenge, especially in the context of the cloud, where data can be accessed from anywhere. Fortunately, technology has improved our ability to track activities, adjust roles and enforce policies in a way that minimizes risk.
Today’s organizations have no shortage of end-to-end solutions for identity governance and management. Yet it’s important to understand that these tools alone are not the answer. No governance or management product can provide perfect protection as organizations are eternally at the mercy of human error. To help support smart identity and access management, it’s critical to have a layered and active approach to managing and mitigating security vulnerabilities that will inevitably arise.
Taking steps like practicing the principle of least privilege by permitting only the minimal amount of access necessary to perform tasks will greatly enhance your security posture.
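As a concrete illustration of least privilege, the snippet below builds a scoped AWS IAM-style policy document that grants read-only access to a single bucket rather than blanket `s3:*` on all resources. The bucket name is a hypothetical placeholder; the document structure follows the standard IAM policy grammar:

```python
import json

# Least-privilege sketch: read-only access to one (hypothetical) bucket,
# instead of wildcard actions over every resource in the account.
READ_ONLY_REPORTS_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-reports-bucket",
                "arn:aws:s3:::example-reports-bucket/*",
            ],
        }
    ],
}

print(json.dumps(READ_ONLY_REPORTS_POLICY, indent=2))
```

Reviewing policies for wildcard actions and resources like this is one cheap, repeatable check an organization can automate as part of its IAM hygiene.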
3. The risks posed by vendor relationships
The explosive growth of cloud computing has highlighted new and deeper relationships between businesses and vendors, as organizations seek to maximize efficiencies through outsourcing and vendors assume more important roles in business operations. Effectively managing vendor relations within the context of the cloud is a core challenge for businesses moving forward.
Why? Because integrating third-party vendors often substantially raises cybersecurity risk. A 2018 Ponemon Institute study noted that nearly 60% of companies surveyed had encountered a breach due to a third party. APT groups have adopted a strategy of targeting large enterprises via such smaller partners, where security is often weaker. Adversaries know you’re only as strong as your weakest link and take the path of least resistance to compromise assets. It is therefore incumbent upon today’s organizations to vigorously and securely manage third-party vendor relations in the cloud. This means developing appropriate guidance for SaaS operations (including sourcing and procurement solutions) and undertaking periodic vendor security evaluations.
4. The problem of insecure APIs
APIs are the key to successful cloud integration and interoperability. Yet insecure APIs are also one of the most significant threats to cloud security. Adversaries can exploit an open line of communication and steal valuable private data by compromising APIs. How often does this really occur? Consider this: By 2022, Gartner predicts insecure APIs will be the vector most commonly used to target enterprise application data.
With APIs growing ever more critical, attackers will continue to use tactics such as exploiting inadequate authentications or planting vulnerabilities within open source code, creating the possibility of devastating supply chain attacks. To minimize the odds of this occurring, developers should design APIs with proper authentication and access control in mind and seek to maintain as much visibility as possible into the enterprise security environment. This will allow for the quick identification and remediation of such API risks.
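One building block of “proper authentication” is verifying that a request really came from a holder of a shared secret. The sketch below shows HMAC request signing with a constant-time comparison; the secret and signing scheme are illustrative assumptions, not any specific vendor’s API:

```python
import hashlib
import hmac

# Illustrative shared secret; in practice this would come from a
# secrets manager and be rotated regularly.
SECRET = b"rotate-me-regularly"

def sign(body: bytes, secret: bytes = SECRET) -> str:
    """Client side: compute an HMAC-SHA256 signature over the request body."""
    return hmac.new(secret, body, hashlib.sha256).hexdigest()

def verify(body: bytes, signature: str, secret: bytes = SECRET) -> bool:
    """Server side: recompute the signature and compare in constant time.

    hmac.compare_digest avoids timing side channels that a plain
    '==' comparison could leak to an attacker probing the endpoint.
    """
    expected = sign(body, secret)
    return hmac.compare_digest(expected, signature)
```

A request whose body is tampered with in transit, or signed with the wrong secret, fails verification and can be rejected before it ever reaches application logic.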
5. Dealing with limited user visibility
We’ve mentioned visibility on multiple occasions in this article – and for good reason. It is one of the keys to operating securely in the cloud. The ability to tell friend from foe (or authorized user from unauthorized user) is a prerequisite for protecting the cloud. Unfortunately, that’s a challenging task as cloud environments grow larger, busier and more complex.
Controlling shadow IT and maintaining better user visibility via behavior analytics and other tools should be a top priority for organizations. Given the lack of visibility across many contexts within cloud environments, it’s a smart play to develop a security posture that is dedicated to continuous improvement and supported by continuous testing and monitoring.
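A toy example of the behavior-analytics idea: flag a login as anomalous when it comes from a country the user has never logged in from before. Real user-visibility tooling weighs far more signals (device, time of day, velocity); the data model here is a deliberate simplification:

```python
from collections import defaultdict

class LoginMonitor:
    """Track per-user login countries and flag first-seen locations."""

    def __init__(self):
        self.seen = defaultdict(set)

    def record(self, user: str, country: str) -> bool:
        """Record a login; return True if it looks anomalous.

        A login is flagged when the user has prior history and the
        country is new. The very first login establishes a baseline
        and is never flagged.
        """
        anomalous = bool(self.seen[user]) and country not in self.seen[user]
        self.seen[user].add(country)
        return anomalous
```

For example, a user who always logs in from the US triggers an alert the first time a login arrives from an unfamiliar country, which an analyst can then triage.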
Critical cloud security challenges: The takeaway
Cloud security is achievable as long as you understand, anticipate and address the most significant challenges posed by migration and operation. By following the ideas outlined above, your organization will be in a much stronger position to prevent and defeat even the most determined adversaries.
As companies shift to remote work and move business operations online because of the spread of COVID-19, they are increasingly relying on cloud services.
Unexpected expenses and cloud migration
In fact, cloud spending hit a record $34.6 billion in the second quarter, representing a 30% bump year-over-year and 11% increase from the previous quarter. Further, nearly a third of IT budgets will be dedicated to cloud services by next year.
Tangoe has advised companies about the risk of unexpected cloud migration expenses at a time when employees are also buying more self-service infrastructure to support working from home.
“Given the cost pressures many companies find themselves under because of the current economic environment, it is critical they employ a strategy for cloud investment that provides the best service to their organizations, while optimizing both their cloud infrastructure and corresponding spend,” said Brandon Henning, Chief Product Officer at Tangoe.
To maximize cloud investment and improve overall efficiency, companies are advised to take a few important steps now.
Achieve clear visibility into usage
An understanding of how the workforce is leveraging cloud technology plays a critical role in assessing the true ROI of these initiatives. This includes analyzing how usage has changed over time to better predict where increased or, in some cases, decreased investment is needed.
Visibility goes beyond the IT department and extends into other parts of the business, such as finance, to ensure teams are aligned on how cloud spending benefits the business overall.
Reevaluate cloud infrastructure to optimize spend
Understanding the infrastructure purchased and how it aligns to what is required to support the business is critical for optimizing spend and cloud contracts.
Organizations may be able to shift from one vendor to another, or turn-up/turn-down reserve instances to better optimize infrastructure, spend and contracts. This requires having the right tools in place to provide the necessary visibility for making these assessments.
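A first pass at this kind of assessment can be a simple rightsizing report: flag instances whose average utilization stays low as candidates for a smaller size or a reserved commitment. The usage records, CPU threshold and savings estimate below are all illustrative assumptions:

```python
def rightsizing_candidates(usage, cpu_threshold=20.0):
    """Flag underutilized instances as rightsizing candidates.

    usage: list of dicts with 'instance', 'avg_cpu_percent' and
    'monthly_cost' keys (e.g. pulled from a billing/monitoring export).
    Instances averaging below cpu_threshold are flagged, with a rough
    savings estimate assuming one size down costs ~50% less.
    """
    candidates = []
    for record in usage:
        if record["avg_cpu_percent"] < cpu_threshold:
            candidates.append({
                "instance": record["instance"],
                "estimated_monthly_saving": round(record["monthly_cost"] * 0.5, 2),
            })
    return candidates
```

Even a crude report like this gives finance and IT a shared, concrete starting point for the vendor and contract conversations described above.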
Establish proper tools for cloud environment maintenance for future investments
The modern enterprise will continue to shift to the cloud, so infrastructure requirements will only grow – and so will the associated costs of both infrastructure IT and unauthorized shadow IT purchases.
It is critical to ensure proper monitoring tools and processes are in place for keeping cloud costs under control. By proactively identifying areas in which spending can be better controlled, organizations are able to improve efficiency and adjust budget allocations to support future investments.
“There’s no arguing that cloud is driving the way businesses operate today. The ability to expand and manage these environments will be the key differentiator in successfully future-proofing business models and avoiding potential disruptions,” Henning said.
A malicious cryptocurrency miner and DDoS worm that has been targeting Docker systems for months now also steals Amazon Web Services (AWS) credentials.
The original threat
When the worm first runs on a target installation, it displays TeamTNT’s “calling card” and then proceeds to:
- Scan for open Docker daemon ports (i.e., misconfigured Docker containers)
- Create an Alpine Linux container to host the coinminer and DDoS bot
- Search for and delete other coin miners and malware
- Configure the firewall to allow ports that will be used by the other components, sinkhole other domain names, exfiltrate sensitive information from the host machine
- Download additional utilities, a log cleaner, and a tool that attackers may use to pivot to other devices in the network (via SSH)
- Download and install the coinminer
- Collect system information and send it to the C&C server
The latest iteration has been equipped with new capabilities, Cado Security researchers found.
The worm still scans for open Docker APIs, then spins up a Docker image and installs itself in a new container, but it now also searches for exploitable Kubernetes systems and for files containing AWS credentials and configuration details – just in case the compromised systems run on AWS infrastructure.
The code to steal these files is relatively straightforward, the researchers note, and they expect other worms to copy this new ability soon.
But are the attackers using the stolen credentials or are they selling them? The researchers tried to find out by sending “canary” AWS keys to TeamTNT’s servers, but they haven’t been used yet.
“This indicates that TeamTNT either manually assess and use the credentials, or any automation they may have created isn’t currently functioning,” they concluded.
Nevertheless, they urge businesses to:
- Identify systems that are storing AWS credential files and delete them if they aren’t needed
- Use firewall rules to limit any access to Docker APIs
- Review network traffic for connections to mining pools or using the Stratum mining protocol
- Review any connections sending the AWS Credentials file over HTTP
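The first recommendation can be partially automated. As a rough sketch (the `.aws` directory layout and file names are the standard AWS CLI defaults; adjust the scan root and patterns for your environment), a script can inventory credential files across a filesystem so they can be reviewed and removed if unneeded:

```python
import os

# File names the AWS CLI uses by default inside a ".aws" directory.
CANDIDATES = {"credentials", "config"}

def find_aws_credential_files(root):
    """Walk a directory tree and return paths that look like AWS credential
    files (files named 'credentials' or 'config' inside a '.aws' directory)."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if os.path.basename(dirpath) == ".aws":
            for name in filenames:
                if name in CANDIDATES:
                    hits.append(os.path.join(dirpath, name))
    return hits

# Example (requires read permission on the targets):
#   for path in find_aws_credential_files("/home"):
#       print("Review and remove if unneeded:", path)
```

A fleet-wide inventory would run something like this via your endpoint management tooling and feed the results into a review queue.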
While most enterprises are committed to modernizing their application software portfolios, there are still myriad challenges to overcome and improvements to be made, according to a survey conducted by Hanover Research.
According to the report, application development functions have a full agenda for the next 12 months, with the majority seeking improvements in speed, quality and security to ensure continued competitiveness.
- Speed: Only 37% of respondents are very satisfied with how fast they are currently delivering new software or new features, and 82% named meeting the needs of the business faster as a top priority. To speed up application releases, 54% of respondents are investing in automation.
- Quality: Delivering better-quality software faster is a top priority for 79% of respondents, as only 40% are fully satisfied with current software quality.
- Security: Less than half (42%) are fully satisfied with the security of their application software, and as such, 48% identified DevSecOps – the integration of security early and often in software development – as an active initiative.
Application software challenges
The road to modern applications can be arduous as complex legacy infrastructure, inefficient manual processes and organizational silos persist as barriers to progress.
- Complex infrastructure: Inflexible and complex legacy environments (35%) were cited as the top barrier to modern apps. A quarter of respondents (25%) identified too much custom software, and 23% listed infrastructure that can’t keep up with the demand for new features, as additional challenges.
- Inefficient processes: When asked which software challenges their company faces, 31% of survey respondents said current IT processes take too much time, followed by test and release processes taking too long (29%). Additionally, the survey found that there is a higher focus on migrating applications to the cloud (59%) than on efforts such as rolling out continuous integration and deployment (43%) and containers and container orchestration (33%).
- Silos: 43% of companies report that their infrastructure can’t keep up with app development needs, in part because of silos between applications and infrastructure teams. Relatedly, 72% of responding organizations listed running IT with more controls and visibility as a top priority.
Drivers of success
There are four pillars to a successful modern applications strategy: cloud native applications (built with microservices), cloud native platforms, continuous integration / continuous delivery pipelines and adoption of DevOps culture and practices.
To help enterprises achieve the benefits of modern applications – such as security, quality and speed – the report found six common themes correlated with success, including:
- Overarching strategy: 25% of enterprises don’t have an overarching strategy for modernizing applications. This number increases to 31% when looking at companies with $5 billion or more in revenue. Investing in a strategy before embarking on any IT project will ensure cross-functional alignment and reduce risks.
- Executive sponsorship: When asked which benefits of application modernization their company experienced over the past 12 months, organizations with executive sponsorship reported a 15% to 16% increase in improved software quality and faster delivery of new features.
- Strategic vendor partnerships: 39% of enterprises feel that their systems integrators/service providers are not meeting expectations in the area of modern applications. The desire to ditch bad habits, update architecture and minimize technical debt must be coupled with a willingness to explore and adopt new service providers who understand both brownfield (modernization of legacy apps) and greenfield (cloud native development) models.
- Training: When it comes to modernizing application software and cloud-native development, finding talent with the application software skills needed is a priority for 45% of respondents, yet 23% said they can’t find enough talent. As enterprises compete for top talent, enabling and training their existing workforce is critical as well.
- DevOps practices and tooling: According to the report, 77% of enterprises have some level of DevOps activity underway. However, the promise of DevOps can be elusive if the right strategy and structure aren’t put in place to facilitate application delivery. Additionally, the report found that executive sponsorship drives up DevOps success and satisfaction, with 76% of the most successful DevOps programs having executive sponsorship.
- Cloud center of excellence: Of organizations that say infrastructure satisfies or exceeds the demands of the applications function, 78% have Cloud Centers of Excellence (CCoE) and 84% have an overarching strategy guiding public cloud adoption, often facilitated by the CCoE.
“Today’s enterprises are bogged down with complex legacy software, with software developers spending nights and weekends wrestling outdated IT infrastructure,” said Tom Pohlmann, executive vice president of customer success at AHEAD. “A move to cloud native apps and platforms, as well as adoption of DevOps practices, present the best opportunity to keep pace with agile, cloud-born competitors.”
Integrated cloud-native security platforms can overcome limitations of traditional security products
To close security gaps caused by rapidly changing digital ecosystems, organizations must adopt an integrated cloud-native security platform that incorporates artificial intelligence, automation, intelligence, threat detection and data analytics capabilities, according to 451 Research.
Cloud-native security platforms are essential
The report clearly defines how to create a scalable, adaptable, and agile security posture built for today’s diverse and disparate IT ecosystems. And it warns that legacy approaches and MSSPs cannot keep up with the speed of digital transformation.
- Massive change is occurring. Over 97 percent of organizations reported they are underway with, or expecting, digital transformation progress in the next 24 months, and over 41 percent are allocating more than 50 percent of their IT budgets to projects that grow and transform the business.
- Security platforms enable automation and orchestration capabilities across the entire IT stack, streamlining and optimizing security operations, improving productivity, enabling higher utilization of assets, increasing the ROI of security investments and helping address interoperability challenges created by isolated, multi-vendor point products.
- Threat-driven and outcome-based security platforms address the full attack continuum, compared with legacy approaches that generally focus on defensive blocking of a single vector.
- Modern security platforms leverage AI and ML to solve some of the most prevalent challenges for security teams, including expertise shortages, alert fatigue, fraud detection, behavioral analysis, risk scoring, correlating threat intelligence, detecting advanced persistent threats, and finding patterns in increasing volumes of data.
- Modern security platforms are positioned to deliver real-time, high-definition visibility with an unobstructed view of the entire IT ecosystem, providing insights into the company’s assets, attack surface, risks and potential threats and enabling rapid response and threat containment.
451 Senior Analyst Aaron Sherrill noted, “The impact of an ever-evolving IT ecosystem combined with an ever-evolving threat landscape can be overwhelming to even the largest, most well-funded security teams, including those at traditional MSSPs.
“Unfortunately, a web of disparate and siloed security tools, a growing expertise gap and an overwhelming volume of security events and alerts continue to plague internal and service provider security teams of every size.
“The consequences of these challenges are vast, preventing security teams from gaining visibility, scaling effectively, responding rapidly and adapting quickly. Today’s threat and business landscape demands new approaches and new technologies.”
How to deliver effective cybersecurity today
“Delivering effective cybersecurity today requires being able to consume a growing stream of telemetry and events from a wide range of signal sources,” said Dustin Hillard, CTO, eSentire.
“It requires being able to process that data to identify attacks while avoiding false positives and negatives. It requires equipping a team of expert analysts and threat hunters with the tools they need to investigate incidents and research advanced, evasive attacks.
“Most importantly, it requires the ability to continuously upgrade detection and defenses. These requirements demand changing the technology foundations upon which cybersecurity solutions are built—moving from traditional security products and legacy MSSP services to modern cloud-native platforms.”
Sherrill further noted, “Cloud-native security platforms optimize the efficiency and effectiveness of security operations by hiding complexity and bringing together disparate data, tools, processes, workflows and policies into a unified experience.
“Infused with automation and orchestration, artificial intelligence and machine learning, big data analytics, multi-vector threat detection, threat intelligence, and machine and human collaboration, cloud-native security platforms can provide the vehicle for scalable, adaptable and agile threat detection, hunting, and response. And when combined with managed detection and response services, organizations are able to quickly bridge expertise and resource gaps and attain a more comprehensive and impactful approach to cybersecurity.”
Public cloud adoption continues to surge, with roughly 83% of all enterprise workloads expected to be in the cloud by the end of the year. The added flexibility and lower costs of cloud computing make it a no-brainer for most organizations.
Yet while cloud adoption has transformed the way applications are built and managed, it has also precipitated a radical rethink of how to approach security. What has historically worked on-premises is no longer relevant when dealing with public cloud or hybrid environments.
So, how does one modernize and develop an effective cloud security posture management (CSPM) strategy? Let’s take a closer look at some best practices you can adopt to efficiently manage this transition.
Don’t use static tools and practices in dynamic environments
On-premises security and compliance auditing procedures simply won’t work effectively in a dynamic cloud environment. Instead, you need procedures designed to accommodate the dynamic nature of cloud objects and the rules put in place by the cloud provider. Things simply change much too quickly in the public cloud for routine scanning or other point-in-time snapshot solutions to be a useful standalone security and compliance measure.
Instead, implement CSPM tools that offer the power of continuous, automated monitoring and test your security posture against cloud-specific benchmarks. One example of this approach is a breach and attack simulation (BAS) platform. These advanced tools launch non-stop simulated attacks against security environments and provide prioritized remediation guidance.
Unlike point-in-time scanning or manual pen testing, a BAS platform works continuously to uncover security gaps along with a variety of other key CSPM uses. By harnessing the power of automated continuous protection, these tools are ideally suited for the task of maintaining security in highly dynamic environments.
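To make the continuous-monitoring idea concrete (this illustrates the general approach, not any particular BAS product), here is a minimal Python sketch: it repeatedly compares a snapshot of resource settings against an approved baseline and flags drift. The resource names, settings, and `fetch_snapshot` hook are hypothetical stand-ins for real cloud API calls:

```python
import time

# Approved baseline: how each resource *should* be configured.
# Resource names and settings are invented for illustration.
BASELINE = {
    "s3:customer-data": {"public": False, "encrypted": True},
    "sg:web-tier": {"open_ports": [443]},
}

def detect_drift(baseline, snapshot):
    """Return (resource, expected, actual) tuples for every resource whose
    current settings differ from the approved baseline."""
    drift = []
    for resource, expected in baseline.items():
        actual = snapshot.get(resource)
        if actual != expected:
            drift.append((resource, expected, actual))
    return drift

def monitor(fetch_snapshot, interval_seconds=300, iterations=1):
    """Poll for drift on a fixed interval. A real tool would run indefinitely
    and push findings into an alerting pipeline instead of returning them."""
    findings = []
    for i in range(iterations):
        findings.extend(detect_drift(BASELINE, fetch_snapshot()))
        if i < iterations - 1:
            time.sleep(interval_seconds)
    return findings
```

The point of the loop is the contrast with point-in-time scanning: the check re-runs constantly, so a setting flipped minutes after a scan still gets caught.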
Rank and remediate
Alert fatigue is a dangerous phenomenon in many fields, and cybersecurity is no exception. Studies have shown that – particularly in information security or healthcare settings – alert fatigue can overload staff, increasing the odds that they miss truly significant events because they are overwhelmed by the sheer amount of information coming at them.
Ideally, organizations need to minimize false positives and quickly identify critical risks and violations, i.e., those that jeopardize “crown jewel” assets by exposing data or allowing unauthorized access.
This raises an important question: How do IT staff slice through the fog and effectively prioritize the most urgent risks?
One option is to work with an outside expert to design a plan (as part of a cloud security posture assessment) for creating and enabling mission-critical security checks and policies. A second option is the incorporation of new technology (such as the aforementioned BAS platforms) to make the process of identifying, ranking and remediating threats simpler through continuous automation. By implementing both, it becomes possible to minimize the risk of critical threats being missed or mis-ranked.
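As a toy illustration of the ranking idea, findings can be scored by combining severity with asset criticality so that "crown jewel" exposure rises to the top of the queue. The 1–5 scales and the finding records below are invented for the example:

```python
def rank_findings(findings):
    """Sort findings by severity x asset criticality, highest risk first.
    Both fields are assumed to be on an illustrative 1-5 scale."""
    return sorted(findings,
                  key=lambda f: f["severity"] * f["asset_criticality"],
                  reverse=True)

# Hypothetical findings from a posture scan.
findings = [
    {"id": "open-ssh-dev", "severity": 3, "asset_criticality": 1},
    {"id": "public-customer-db", "severity": 5, "asset_criticality": 5},
    {"id": "weak-tls-internal", "severity": 2, "asset_criticality": 3},
]
prioritized = rank_findings(findings)
# The exposed customer database outranks the dev-box SSH issue.
```

Real platforms weigh many more signals (exploitability, exposure, compensating controls), but the principle is the same: multiply technical severity by business impact before anyone triages.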
More emphasis on security checks in development pipelines
We mentioned above how the dynamic nature of public clouds can render a security scan almost instantly irrelevant. Trying to stay current with outdated tools and approaches is more than a guaranteed losing battle – it’s also a massive waste of time and resources.
So how does one enforce security in such an ephemeral environment? It’s no small challenge, but it can be done without extreme commitments of time and money and never-ending games of “catch up.”
One simple fix is to define misconfiguration checks as stages in the deployment pipeline itself, so violations are rooted out before anything ships. Misconfigurations can then be quickly and easily rectified by embedding remediation into the pipeline, and feedback can be collected and analyzed to spot violation trends and adapt policies as needed.
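A minimal sketch of such a pipeline gate might look like the following. The check names and config keys are hypothetical; a real stage would load the actual deployment config and fail the build on any violation:

```python
# Hypothetical pre-deployment checks, run as a CI/CD pipeline stage.
# Each entry is (description, predicate returning True when the config is safe).
CHECKS = [
    ("storage must not be public",
     lambda cfg: not cfg.get("storage_public", False)),
    ("SSH must not be world-open",
     lambda cfg: 22 not in cfg.get("world_open_ports", [])),
]

def gate(config):
    """Return the list of violated check names; an empty list means the
    deployment may proceed."""
    return [name for name, is_safe in CHECKS if not is_safe(config)]

# In CI, the stage would exit nonzero on violations, e.g.:
#   sys.exit(1 if gate(load_config()) else 0)
```

Because the gate runs on every deployment, a misconfiguration never outlives the pipeline run that introduced it.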
Effective cloud security posture: The takeaway
The adoption of public cloud computing has been inexorable, and in a post-COVID-19 world, it will accelerate exponentially. Organizations are eager for a competitive edge by reaping the benefits of cloud computing at scale.
The mandate to migrate quickly needs to be balanced with an equal effort to maintain a strong security posture. In many cases, the ability to operate safely in the cloud has not kept pace with the speed by which adoption has occurred. One need only look at the countless examples of simple (and highly preventable) server misconfigurations causing massive amounts of financial and reputational harm. The fact that this often happens to the most deeply resourced enterprises with access to top drawer security talent should give organizations even greater pause.
To maintain a more robust cloud security posture, it’s necessary to update existing on-premises-centric policies and frameworks and align them with the new and fast-evolving circumstances of cloud and hybrid environments.
In that same vein, it also makes sense to deploy newer cloud security posture management tools, such as BAS, that are especially well-suited to this particular task. The dynamism of cloud environments is one of the core challenges defenders must face; tools that offer automated and continuous protection are part of the answer to surmounting this challenge. Without continuous monitoring, it is simply impossible to manage risk in an ephemeral landscape.
By combining a new approach with a better selection of tools to help implement that approach, today’s enterprises can manage risk more effectively – and develop the kind of resilient cloud security posture management that helps prevent the nightmare of critical asset exposure.
Even before lockdowns, there was a steady migration toward more flexible workforce arrangements. Given the new normal of so many more people working from home—on top of a pile of evidence showing that productivity and quality of life typically go up with remote work—it is inevitable that many more companies will continue to offer those arrangements even as stay-at-home orders are lifted.
Unfortunately, a boom in remote access goes hand-in-hand with an increased risk to sensitive information. Verizon reports that 30 percent of recent data breaches were a direct result of the move to web applications and services.
Data is much harder to track, govern, and protect when it lives inside a cloud. In large part, these threats are associated with internet-exposed storage.
Emerging threat matrix
Traditionally, system administrators rely on perimeter security to stop outside intruders, yet even the most conscientious are exposed after a single missed or delayed update. Beyond that, insiders are widely considered the biggest threat to data security.
Misconfiguration accounts for the vast majority of insider errors. It is usually the result of failure to properly secure cloud storage or firewall settings, and largely relates to unsecured databases or file storage that are directly exposed on a cloud service.
In many cases, employees mislabel private documents by setting storage privileges to public. According to the Verizon report, among financial services and insurance firms, this is now the second most common type of misconfiguration error.
Addressing this usually means getting open sharing under control, figuring out where sensitive data resides and who owns it, and running a certificate program to align data access with organizational needs.
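A first pass at getting open sharing under control can be sketched as a simple audit over sharing metadata. The item records and the approved-public prefix below are invented for illustration; a real audit would pull this metadata from the storage provider’s admin API:

```python
def audit_sharing(items, allowed_public_prefixes=("marketing/",)):
    """Flag items marked public that do not live under an explicitly
    approved public location. Each item is a dict with 'path' and
    'visibility' keys (a stand-in for real storage metadata)."""
    flagged = []
    for item in items:
        if (item["visibility"] == "public"
                and not item["path"].startswith(tuple(allowed_public_prefixes))):
            flagged.append(item["path"])
    return flagged
```

The output is a worklist for the certification step: each flagged path gets an owner who must either justify the public setting or revert it to private.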
Optimistically, companies hope that a combination of technological safeguards and diligence on the part of users—whether employees, partners, or customers—will eliminate, or at least minimize, costly mistakes.
Other internal threats come as a part of a cloud migration or backup process, where a system admin or DBA will often stand up an instance of data on a cloud platform but fail to put inconvenient but necessary access controls in place.
Consider the example of cloud data warehouses. Providers such as Amazon, Google, and Snowflake now make it simple to store vast quantities of data cheaply, to migrate data easily, and to scale up or down at will. Little wonder that these services are growing so quickly.
Yet even the best services need some help when it comes to tracking data access. Some tools make it easy to authenticate remote users before letting them inside the gate of the cloud data warehouse. After that, though, things often get murky. Who is accessing which data, how much of it, when, and from where?
These are issues that every company must confront. That data is ripe for exploitation by dishonest insiders, or by careless employees, with serious consequences. In more fortunate circumstances, it is discovered by security teams, or by management who make an irate call to the CISO.
Born in the cloud
More data security approaches born in the cloud are now appearing, and the new normal means the enterprise is motivated to adapt. As most organizations turn to the cloud for what used to be on-premises IT deployments, the responsibility for securing the infrastructure and applications that hold data – and the techniques for doing so – is moving to the cloud as well.
For instance, infrastructure-as-a-service (IaaS) provides virtualized computing resources like virtual firewalls and network security hardware, and virtual intrusion detection and prevention, but these are an intermediate step at best.
The idea is that IaaS can offer a set of defenses at scale for all of a cloud provider’s customers, built into the platform itself, which will relieve an individual cloud customer from having to do many of the things that used to be on-premises data-protection requirements.
But what has really changed? A top certification may be enough for a provider to claim “above average” data security, but in reality that security still remains totally contingent on perimeter defenses, hardware appliances, and proper configurations by system administrators and DBAs. And it’s still only as good as the data hygiene of end users. There are a lot of “ifs” and “buts,” which is nothing new.
Data Security-as-a-Service (DSaaS) complements IaaS as it integrates data protection at the application layer. This places data access services in the path between users who want data and the data itself. It is also portable because it goes where the application goes.
Developers can embed data access governance and protection into applications through a thin layer of technology wrapped around database drivers or APIs, which all applications use to connect to their databases. An obvious advantage is that this is more easily maintained over time.
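As a rough sketch of that idea, a thin layer can wrap a database connection and apply a per-role policy before and after each query. The roles, policy rules, and tables here are invented, and a production layer would live inside the driver itself, parameterize queries, and log every access for compliance:

```python
import sqlite3

# Illustrative per-role policy: which tables a role may read, and how much.
POLICY = {
    "analyst": {"allowed_tables": {"orders"}, "max_rows": 100},
    "admin": {"allowed_tables": {"orders", "customers"}, "max_rows": 10_000},
}

class GovernedConnection:
    """Wraps a DB-API connection and enforces a role's policy on each query."""

    def __init__(self, conn, role):
        self._conn = conn
        self._policy = POLICY[role]

    def query(self, table, where="1=1"):
        # Pre-query check: is this role allowed to touch this table at all?
        if table not in self._policy["allowed_tables"]:
            raise PermissionError(f"role may not read table {table!r}")
        # NOTE: a real layer would validate/parameterize instead of f-strings.
        cur = self._conn.execute(f"SELECT * FROM {table} WHERE {where}")
        # Post-query check: block excessive data volumes (exfiltration guard).
        rows = cur.fetchmany(self._policy["max_rows"] + 1)
        if len(rows) > self._policy["max_rows"]:
            raise PermissionError("query exceeds allowed data volume")
        return rows
```

Because the wrapper sits on the driver every application already uses, the governance travels with the application rather than with any one network perimeter.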
Data security is a shared responsibility among security pros, end users, and cloud providers. As the new normal becomes reality, shared responsibility means that a cloud provider handles the underlying network security such that the cloud infrastructure ensures basic, customer-level network isolation and secure physical routers and switches.
From here, under the DSaaS model the cloud service provider offers DSaaS—or else the customer provisions it through a third party—as a set of automated data security components that complete a secure cloud environment.
This makes it possible to govern each user at a granular level so that they access only the types of data they should, and perform only those actions with the data for which they are authorized. CISOs can implement and adapt rulesets to govern the flow of data by type and role. In terms of data protection, application-layer data security makes it possible to isolate and block bad traffic, including excessive data volumes, down to an individual user.
From this perspective, DSaaS can act as both an intrusion detection system (IDS) and intrusion prevention system (IPS). It can inspect data access and analyze it for intrusion attempts or vulnerabilities in workload components that could potentially exploit a cloud environment, and then automatically stop data access in progress until system admins can look into the situation.
At this level it is also feasible to log data activity such as what each user does with the data they access, satisfying both security and compliance—a notable accomplishment, considering that the two functions are often at odds with one another.
Incorporating security at the application layer also offers data protection capabilities that are similar to network intrusion appliances, or security agents that reside at the OS level on a virtual machine or at the hypervisor level.
Moreover, DSaaS governance and protection is so fine-grained that it does not inhibit traffic flow, data availability, and uptime even in the face of multiple sustained attacks.
Everyone is talking about how the “new normal” is impacting data security, but the enterprise was well on this path before the pandemic. It is tempting for vigilance to give rise to pessimism since data security has too often been a laggard, and an inventory of the cloud data-security bona fides of most companies is not encouraging.
However, data protection and governance can be assured should we adopt shared models for responsibility and finely tuned, application-level controls. It’s a new world and we can be ready for it.
In the wake of COVID-19, enabling remote work has required IT teams to rapidly lean into cloud technologies to keep their businesses operating smoothly. A survey suggests that cloud usage continues to rise, and what was a sudden shift will become a permanent strategy for most organizations.
Despite many countries planning for a return to physical offices and workspaces, 60% of IT leaders are continuing to increase their overall cloud usage and 91% are changing their cloud strategy as a result of the current economic climate.
The study, conducted by Snow Software, surveyed 250 IT leaders around the world to find out how cloud usage and investment decisions have evolved during the crisis.
Cloud usage continues to increase
Overall, 82% of those surveyed said they have increased their cloud usage over the past several weeks in response to the pandemic. 60% said their cloud usage continues to increase, indicating that cloud consumption patterns are still in flux even after the initial surge in remote work.
Additionally, 66% reported that they will continue to use the cloud services and applications they implemented during the crisis once employees return to the workplace. Surprisingly, only 22% reported they saw an initial increase in cloud usage but that it had leveled off.
While Zoom and Teams dominated the headlines, cloud infrastructure was actually the biggest driver of this increase. When asked about how their company’s use of cloud services and applications changed in response to the current crisis, 76% said they have increased their use of cloud platforms such as AWS, Microsoft Azure and even private cloud.
55% noted an increase in collaboration tools like Slack, Teams or Google Chat, while 52% of those surveyed indicated an increase in cloud-based video conferencing software like Zoom, Cisco WebEx or GoToMeeting.
While many companies may have already relied on these productivity services ahead of the crisis, the surge in cloud infrastructure represents a more fundamental shift in how organizations operate.
A change in enterprise cloud strategy
Overall, these trends hint at a larger change in enterprise cloud strategy. As IT leaders face the concurrent challenges of continuing to support remote work, enabling a return to the workplace and tightening budgets, 91% said they are altering their cloud strategy.
Twice as many say they are accelerating cloud migration (45%) and digital transformation (41%) versus putting those initiatives on hold (22% and 21% respectively).
However, while usage and investment in cloud technologies continue to increase, a third of respondents indicated that they are getting creative with their budget – 32% of respondents are asking their cloud vendors for extended payment terms and 31% are renegotiating their cloud contracts. Around 10% of respondents indicated that they would not be able to pay their cloud bills this month.
“The COVID-19 pandemic has turned cloud into an essential service for many organizations while also highlighting the complexities of managing cloud cost and usage,” said Jay Litkey, EVP of Cloud Management at Snow Software.
“This survey confirms what we are hearing from our customers – that while many CIOs are being asked to trim costs, there will be continued investment in technology that presents the opportunity for long-term growth and stability.
“To weather the storm, IT leaders must take a comprehensive approach to managing cloud, uncovering opportunities to streamline costs while continuing to provide the infrastructure needed to support their workforce and drive innovation.”
Additional key findings
- 82% of IT leaders surveyed said they have noticed positive changes in employees’ attitudes towards IT since the pandemic started.
- 47% of respondents said they will feel comfortable returning to a physical office once their company outlines a clear plan that ensures the safety of employees. However, 43% would like their company to offer work from home options even after reopening.
- When asked which applications – beyond core IT software – have been lifesavers, respondents pointed to video conferencing apps like Zoom, Cisco WebEx and GoToMeeting (73%) and communication apps like Slack, Teams and Google Chat (65%).
With the dramatic shift toward remote workforces over the last three months, many organizations are relying more heavily on cloud tools and application suites. One of the most popular is Microsoft’s OneDrive.
While OneDrive may seem like a secure cloud storage solution for companies looking to use Microsoft’s suite of business tools, many glaring security issues can expose sensitive data and personally identifiable information (PII) if proper protection protocols are ignored. Data theft, data loss, ransomware, and compliance violations are just a few things that organizations need to watch for as their employees increasingly rely on this application to save more and more documents to the cloud.
While OneDrive does provide cloud storage, it doesn’t have cloud backup functionality, a critical distinction that must be made when choosing which information to upload and share. The data is accessible, but not protected. How can businesses ensure they’re mitigating security risks, while also enabling employee access? Below we’ll discuss some of the most significant security gaps associated with OneDrive and highlight the steps organizations can take to better protect their data.
One area that often breeds confusion for OneDrive users is who can access company files once they’re uploaded to the cloud. For employees saving documents in their personal accounts, all files created or added outside of a “Shared with Me” folder are private until the user decides otherwise. Until then, files are encrypted and inaccessible to anyone but the creator and Microsoft personnel with administrative rights. For someone else to see your data, you have to share the folder or a separate file.
The same rule holds for files shared on a OneDrive for Business account, with one exception: a policy set by an administrator determines the visibility of the data you create in the “Shared” folder.
Are sensitive documents safe in OneDrive?
For purposes of this article, sensitive documents refer to materials that contain personally identifiable information (PII), personal health information (PHI), financial information, or data covered under FISMA and GLBA compliance requirements. As we established above, these types of documents can be saved in one of two ways – by an individual under a personal OneDrive account or uploaded under a Business account. Even if your business does not subscribe to a OneDrive for Business account, be aware that employees may be emailing themselves documents or sharing them to their personal OneDrive folders for easy access, especially over the past several months with most employees working from home.
For personal users, OneDrive has a feature called Personal Vault (PV). How secure is the OneDrive Personal Vault? It is a safe located in your Files folder explicitly designed for sensitive information.
When using PV, your files are encrypted until your identity is verified. It has several different verification methods that users can set up, whether it’s a fingerprint, a face ID, or a one-time code sent via email or SMS. The PV folder also has an idle-time screensaver that locks if you are inactive for 3 minutes on the mobile app, and 20 minutes on the web. To regain access, you need to verify yourself again.
Interestingly, the PV function isn’t available in the OneDrive for Business package. Therefore, if your organization has no other way to store sensitive data than on OneDrive, additional security measures must be taken.
OneDrive is not a backup solution
OneDrive is not a backup tool. OneDrive provides cloud storage, and there is a massive difference between cloud backup and cloud storage. They have a few things in common, like storing your files on remote hardware. But it’s not enough to make them interchangeable.
In short, cloud storage is a place in the cloud where you upload (manually or automatically) and keep all your files. Cloud storage allows you to reach files from any device at any time, making it an attractive option for workers on the go and those that work from different locations. It also allows you to manually restore files from storage in case of unwanted deletion and scale storage for your needs. While “restoring files” sounds eerily similar to backup protection, it has some fundamental faults. For example, if you accidentally delete a file in storage, or it was hit by ransomware and encrypted, you can consider the file lost. This makes OneDrive storage alone a weak solution for businesses. If disaster strikes and information is compromised, the organization will have no way to restore high volumes of data.
Cloud backup, on the other hand, is a service that uses cloud storage to save files, but its functionality doesn’t end there. Cloud backup services automatically copy your data to the storage area and can restore it relatively quickly after a disaster. You can also restore multiple versions of a backed-up file and search for specific files, and your data is protected from the most widespread threats, including accidental deletion, brute-force attacks, and ransomware.
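The storage-versus-backup distinction can be made concrete with a small sketch: plain storage keeps only the latest copy of a file, while a backup keeps timestamped versions you can roll back to after deletion or ransomware. This is a toy illustration, not a production backup tool:

```python
import hashlib
import os
import shutil
import time

def backup(src, backup_dir):
    """Keep a timestamped, content-hashed copy instead of overwriting."""
    os.makedirs(backup_dir, exist_ok=True)
    with open(src, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()[:12]
    # The version name encodes time and content hash, so older versions
    # are never clobbered by newer ones.
    name = f"{os.path.basename(src)}.{int(time.time() * 1000)}.{digest}"
    dest = os.path.join(backup_dir, name)
    shutil.copy2(src, dest)
    return dest

def restore(backup_dir, name, target):
    """Restore the newest backed-up version of `name` to `target`."""
    versions = sorted(v for v in os.listdir(backup_dir)
                      if v.startswith(name + "."))
    if not versions:
        raise FileNotFoundError(f"no backup versions of {name}")
    shutil.copy2(os.path.join(backup_dir, versions[-1]), target)
    return target
```

Because each backup is a separate timestamped copy, overwriting or encrypting the live file does not destroy earlier versions – which is exactly the property plain cloud storage lacks.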
In summary: cloud storage provides access, cloud backup provides protection.
What are the most common OneDrive risks?
The security issues tied to using OneDrive are common to most cloud storage services. Both personal OneDrive and OneDrive for Business carry multiple risks, including data theft, data loss, corrupted data, and the inadvertent sharing of critical information. Given the ease of access to documents in OneDrive, compliance violations are also a top concern for organizations that handle sensitive data.
How can you maximize OneDrive security?
To minimize the above security issues, organizations need to follow a set of strict protocols, including:
1. Device security protocols – Several general security protocols should be implemented on devices using OneDrive. The most basic include mandatory antivirus software, kept current on all employee devices. Other steps include using a firewall to block questionable inbound traffic and activating idle-time screensaver passwords. As employees return from remote work and bring their devices back on-premises, it’s crucial to ensure all devices have updated security and meet the latest compliance requirements.
2. Network security protocols – In addition to using protected devices, employees should be especially cautious when connecting to unsecured networks. Before connecting to a hotspot, instruct employees to make sure the connection is encrypted, and never to open OneDrive over an unfamiliar network. Turning off the functionality that automatically connects your computer to in-range networks is one easy way to add a layer of protection.
3. Protocols for secure sharing – Make sure to terminate OneDrive for Business access for any users who are no longer with the company. An employee offboarding process that includes this step lessens the risk of a former employee stealing documents or information. Also restrict access in OneDrive to invited viewers only: sharing a file or folder with “Everyone,” or enabling access via link, opens up new risks, as anyone on the internet can find and access your document. It’s also helpful to have clear rules for downloading and sharing documents inside and outside the company.
4. Secure sensitive data – Avoid storing any payment data in any Office 365 products. For other confidential documents, individual users can use PV. Organizations should store sensitive data only on secure on-premises systems or with an encrypted third-party cloud backup service that complies with the data regulations that apply to the organization.
5. Use a cloud backup solution – To best protect your company from all sides, it’s essential to use a cloud backup solution when saving valuable information to OneDrive. Make sure any backup solution you choose has cloud-to-cloud capabilities with automatic daily backup. In addition, a ransomware protection service that scans OneDrive and other Office 365 services for ransomware and automatically blocks attacks is your best defense against costly takeovers.
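Parts of steps 3 and 5 above can be automated. As a hedged sketch – assuming a share inventory you export yourself, in a hypothetical format, rather than any real OneDrive API – a small audit script can flag link-based “Everyone” shares and access still held by offboarded users:

```python
# Illustrative audit over an exported share inventory. The record format
# here is hypothetical -- it is NOT a real OneDrive/Microsoft Graph schema.
def audit_shares(shares, active_users):
    """Flag risky entries: open link access and ex-employee access."""
    findings = []
    for s in shares:
        if s["audience"] in ("Everyone", "AnyoneWithLink"):
            findings.append((s["path"], "open link access"))
        for user in s["users"]:
            if user not in active_users:
                findings.append((s["path"], f"offboarded user: {user}"))
    return findings

# Sample inventory: 'bob' has left the company, and one file is shared by link.
shares = [
    {"path": "/Finance/payroll.xlsx", "audience": "Invited",
     "users": ["ann", "bob"]},
    {"path": "/Marketing/deck.pptx", "audience": "AnyoneWithLink",
     "users": ["ann"]},
]
active = {"ann"}

for path, issue in audit_shares(shares, active):
    print(path, "->", issue)
```

Running a check like this as part of offboarding makes the “terminate access” step verifiable rather than a one-time manual task.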
Whether it’s preparing for upcoming mandatory regulations or dealing with the sudden management of employees working offsite, the security landscape is ever-changing. Keeping up with the latest methods to keep your company both protected and compliant is a challenge that needs constant attention. With a few critical steps and the utilization of new technology, business users can protect themselves and lessen the risk to their data.
The ease and speed at which new cloud tools can be deployed is also making it harder for security teams to control their usage, IBM Security reveals.
According to the data, basic security oversight issues, including governance, vulnerabilities, and misconfigurations, remain the top risk factors organizations must address to secure increasingly cloud-based operations.
Additionally, an analysis of security incidents over the past year sheds light on how cybercriminals are targeting cloud environments with customized malware, ransomware and more.
With businesses rapidly moving to cloud to accommodate remote workforce demands, understanding the unique security challenges posed by this transition is essential for managing risk.
While the cloud enables many critical business and technology capabilities, ad-hoc adoption and management of cloud resources is also creating complexity for IT and cybersecurity teams.
According to IDC, more than a third of companies purchased 30+ types of cloud services from 16 different vendors in 2019 alone. This distributed landscape can lead to unclear ownership of security in the cloud, creating policy “blind spots” and potential for shadow IT to introduce vulnerabilities and misconfiguration.
Cloud environment threats and challenges
- Complex ownership: 66% of respondents surveyed say they rely on cloud providers for baseline security; yet perception of security ownership varied greatly across specific cloud platforms and applications.
- Cloud applications opening the door: The most common path for cybercriminals to compromise cloud environments was via cloud-based applications, representing 45% of incidents in IBM X-Force IRIS cloud-related case studies. Cybercriminals took advantage of configuration errors as well as vulnerabilities within the applications, which often remained undetected due to employees standing up new cloud apps on their own, outside of approved channels.
- Amplifying attacks: While data theft was the top impact of attacks in the cloud, hackers also targeted the cloud for cryptomining and ransomware – using cloud resources to amplify the effect of these attacks.
“The cloud holds enormous potential for business efficiency and innovation, but also can create a ‘wild west’ of broader and more distributed environments for organizations to manage and secure,” said Abhijit Chakravorty, Cloud Security Competency Leader, IBM Security Services.
“When done right, cloud can make security scalable and more adaptable – but first, organizations need to let go of legacy assumptions and pivot to new security approaches designed specifically for this new frontier of technology, leveraging automation wherever possible. This starts with a clear picture of regulatory obligations and compliance mandates, as well as the unique technical and policy-driven security challenges and external threats targeting the cloud.”
Who owns security in the cloud?
Organizations rely heavily on cloud providers to own security in the cloud, despite the fact that configuration issues – which are typically the user’s responsibility – are most often to blame for data breaches (accounting for more than 85% of all breached records in 2019).
Additionally, perceptions of security ownership in the cloud varied widely across various platforms and applications. For example, 73% of respondents believed public cloud providers were the main party responsible for securing software-as-a-service (SaaS), while only 42% believed providers were primarily responsible for securing cloud infrastructure-as-a-service (IaaS).
While this type of shared responsibility model is necessary for the hybrid, multi-cloud era, it can also lead to variable security policies and a lack of visibility across cloud environments. Organizations that are able to streamline their cloud and security operations can reduce this risk through clearly defined policies that apply across their entire IT environment.
Top threats in the cloud: Data theft, cryptomining and ransomware
In order to get a better picture of how attackers are targeting cloud environments, incident response experts conducted an in-depth analysis of cloud-related cases the team responded to over the past year. The analysis found:
- Cybercriminals leading the charge: Financially motivated cybercriminals were the most commonly observed threat group category targeting cloud environments, though nation state actors are also a persistent risk.
- Exploiting cloud apps: The most common entry point for attackers was via cloud applications, including tactics such as brute-forcing, exploitation of vulnerabilities and misconfigurations. Vulnerabilities often remained undetected due to “shadow IT,” when an employee goes outside approved channels and stands up a vulnerable cloud app. Managing vulnerabilities in the cloud can be challenging, since vulnerabilities in cloud products remained outside the scope of traditional CVEs until 2020.
- Ransomware in the cloud: Ransomware was deployed 3x more than any other type of malware in cloud environments, followed by cryptominers and botnet malware.
- Data theft: Outside of malware deployment, data theft was the most common threat activity observed in breached cloud environments over the last year, ranging from personally identifying information to client-related emails.
- Exponential returns: Threat actors used cloud resources to amplify the effect of attacks like cryptomining and DDoS. Additionally, threat groups used the cloud to host their malicious infrastructure and operations, adding scale and an additional layer of obfuscation to remain undetected.
“Based on the trends in our incident response cases, it’s likely that malware cases targeting cloud will continue to expand and evolve as cloud adoption increases,” said Charles DeBeck, IBM X-Force IRIS.
“Malware developers have already begun making malware that disables common cloud security products, and designing malware that takes advantage of the scale and agility offered by the cloud.”
Maturing cloud security leads to faster security response
While the cloud revolution is posing new challenges for security teams, organizations who are able to pivot to a more mature and streamlined governance model for cloud security can reap significant benefits in their security agility and response capabilities.
The survey found that organizations that ranked high in maturity in both cloud and security evolution were able to identify and contain data breaches faster than peers still in the early phases of their cloud adoption journey.
In terms of data breach response time, the most mature organizations were able to identify and contain data breaches twice as fast as the least mature organizations (average threat lifecycle of 125 days vs. 250 days).
As the cloud becomes essential for business operations and an increasingly remote workforce, organizations should focus on the following elements to improve cybersecurity for hybrid, multi-cloud environments:
- Establish collaborative governance and culture: Adopt a unified strategy that combines cloud and security operations – across application developers, IT Operations and Security. Designate clear policies and responsibilities for existing cloud resources as well as for the acquisition of new cloud resources.
- Take a risk-based view: Assess the kinds of workloads and data you plan to move to the cloud and define appropriate security policies. Start with a risk-based assessment for visibility across your environment and create a roadmap for phased cloud adoption.
- Apply strong access management: Leverage access management policies and tools for access to cloud resources, including multifactor authentication, to prevent infiltration using stolen credentials. Restrict privileged accounts and set all user groups to least-required privileges to minimize damage from account compromise (zero trust model).
- Have the right tools: Ensure tools for security monitoring, visibility and response are effective across all cloud and on-premise resources. Consider shifting to open technologies and standards which allow for greater interoperability between tools.
- Automate security processes: Implementing effective security automation in your system can improve your detection and response capabilities, rather than relying on manual reaction to events.
- Use proactive simulations to rehearse for various attack scenarios: This can help identify where blind spots may exist, and also address any potential forensic issues that may arise during attack investigation.
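The access-management recommendation above boils down to deny-by-default with explicit grants. A minimal sketch of that least-privilege check follows; the role and permission names are hypothetical, and real deployments would use an IAM service rather than an in-memory table:

```python
# Minimal least-privilege sketch: every request is checked against an
# explicit allow-list, and anything not granted is denied (deny by default).
ROLE_PERMISSIONS = {
    "analyst":  {"read:logs"},
    "operator": {"read:logs", "restart:service"},
    "admin":    {"read:logs", "restart:service", "delete:bucket"},
}

def is_allowed(role, action, mfa_passed):
    """Zero-trust style check: valid MFA and an explicit grant are both required."""
    if not mfa_passed:
        return False                     # never trust the session alone
    return action in ROLE_PERMISSIONS.get(role, set())
```

Note that an unknown role or a failed MFA challenge falls through to denial; there is no implicit “default allow” path, which is the core of the zero-trust model the bullet describes.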
More than 88% of organizations use cloud infrastructure in one form or another, and 45% expect to migrate three quarters or more of their apps to the cloud over the next twelve months, according to the O’Reilly survey.
The report surveyed 1,283 software engineers, technical leads, and decision-makers from around the globe. Of note, the report uncovered that 21% of organizations are hosting all applications in a cloud context.
The report also found that while 49% of organizations are running applications in traditional, on-premises contexts, 39% use a combination of public and private cloud deployments in a hybrid-cloud alternative, and 54% use multiple cloud services.
Public cloud is the most popular deployment option
Public cloud dominates as the most popular deployment option with a usage share greater than 61%, with AWS (67%), Azure (48%), and Google Cloud Platform (GCP) (32%) as the most used platforms. However, while Azure and GCP customers also report using AWS, the reverse is not necessarily true.
“We see a widespread embrace of cloud infrastructure across the enterprise, which suggests that most organizations now equate cloud with ‘what’s next’ for their infrastructure decisions, and AWS as the front-runner when it comes to public cloud adoption,” said Mary Treseler, VP of content strategy at O’Reilly.
“For those still on the journey to cloud-based infrastructure migration, ensuring that staff is well-versed in critical skills, such as cloud security and monitoring, will be incredibly important for successful implementations.
“Enterprises with solid footing have the potential to leverage this infrastructure for better software development and AI-based services, which will put them at an advantage over competitors.”
Other notable findings
- Cloud-based security was cited as the number one critical skill area needed to migrate to or implement cloud-based infrastructure (65%), followed by monitoring (58%) and Kubernetes (56%).
- 37% of organizations have AI-based services in production, and 47% expect to deploy AI-based services at some point over the next three years.
- 52% of respondents say they use microservices concepts, tools, or methods for software development.
- 35% have adopted Site Reliability Engineering (SRE), and 47% of organizations expect to implement an SRE function at some point in the future.
Cloud adoption has grown at an astonishing rate, providing organizations with the freedom to store data in numerous cloud applications that meet their specific business demands. Additionally, migrating to the cloud gives employees the ability to access work material from anywhere and anytime.
This increases productivity by allowing employees to collaborate remotely with applications like G Suite, Office 365, Salesforce, and Slack (to name a few). While utilizing these cloud apps provides flexibility and cost savings, it also can allow sensitive data to be exposed.
While there are plenty of cloud applications available, let’s explore G Suite, Office 365, Salesforce, and Slack, and how organizations can leverage these apps to reap benefits while keeping data safe.
Proceed with caution
No matter what your company does, you likely share documents with employees, clients, or partners on a daily basis. These documents can include proposals, contracts, financial records, HR paperwork, and other confidential files. While these apps have made it easier to share, the documents and files are highly sensitive and could be very damaging if malicious actors got their hands on them.
Over 6 million businesses are paying to use G Suite, which provides access to corporate data from any device, anywhere, improving IT flexibility and employee productivity.
Similarly, Microsoft’s Office 365 provides teams with collaborative services to share and store data on SharePoint or Microsoft Teams. Another popular application over 150,000 enterprises use is Salesforce, a customer relationship management service that supports marketing, sales, commerce, and service functions. Lastly, Slack has become one of the most used team collaboration solutions with over 12 million daily active users sharing messages or other files on the platform.
Unfortunately, companies are not able to monitor all of the documents or data being shared across these apps. For example, Slack has private channels and direct messaging capabilities where admins cannot view what information is being shared unless they are a part of the conversation.
As we have witnessed with previous data breaches, there is a risk that sensitive data will not always be shielded from anyone outside your organization. Slack experienced a data breach back in 2015 when unauthorized users gained access to the infrastructure where usernames and passwords were stored. Salesforce has also had security issues in the past, exposing users’ stored data to third parties due to an API error. These are just a few instances that should serve as a stark warning to enterprises: they can’t rely solely on app providers to ensure the security of their data – they must implement their own security solutions and processes in tandem.
While these cloud-based services have native security capabilities in place to protect the infrastructure against intrusions, the onus is on the enterprises using these tools to ensure files that are being stored and accessed in the cloud are secure. As businesses continue to use these apps, they must understand that they have a shared responsibility to protect corporate and customer information.
To achieve this shared goal, organizations need tools that are designed for enforcing real-time access control, detecting and remediating misconfigurations, encrypting sensitive data at rest, managing the sharing of data with external parties, and preventing data leakage while using these apps.
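Of those capabilities, leakage detection is the simplest to sketch. A toy DLP pass might flag text resembling US Social Security or credit card numbers before a file is shared; real DLP engines add validation (such as Luhn checks), context analysis, and machine learning, so the patterns below are purely illustrative:

```python
import re

# Toy DLP scan: flag text that resembles common sensitive-data patterns.
# Real DLP products add checksum validation, context, and ML -- these
# two regexes are illustrative only.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def scan(text):
    """Return (label, matched_text) pairs for every suspected hit."""
    hits = []
    for label, pattern in PATTERNS.items():
        hits.extend((label, m) for m in pattern.findall(text))
    return hits
```

A hook like this, run at upload or share time, is the basic shape of the “preventing data leakage” control the paragraph describes.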
You can have your cake and eat it, too
Single sign-on (SSO) should be included as part of an organization’s cloud security strategy in order to authenticate their users and ensure that sensitive data is not being accessed maliciously. Along with SSO, having a cybersecurity solution that can protect data at upload, download, and at rest is essential to preventing a security breach.
Enterprises should also equip their cloud-based collaborative apps with full-strength data encryption and data loss prevention (DLP). Additionally, companies should regularly train employees on best practices for using these apps, including educating them on any company-specific rules around data sharing.
As enterprise security teams come to the realization that legacy security tools are not enough to secure their ever-changing ecosystem, cloud adoption will continue to rise. However, just as with any other application, it’s important to have additional preventive security in place to ensure that the data stored within the app stays secure.
Cloud spend exceeds budgets as organizations expect increased cloud use due to COVID-19, according to a Flexera report.
“With employees working from home and more business interactions going digital,” said Jim Ryan, President and CEO of Flexera, “more than half of enterprise respondents said their cloud usage will be higher than originally planned at the beginning of the year due to the pandemic.
“Companies plan to migrate more services to cloud, yet they’re already exceeding cloud budgets. They will need to focus on optimizing workloads as they migrate in addition to cost management and governance to ensure operational efficiency.”
Organizations embrace multi-cloud
- 93% of enterprises have a multi-cloud strategy; 87% have a hybrid cloud strategy
- Respondents use an average of 2.2 public clouds and 2.2 private clouds.
Public cloud adoption continues to accelerate
- 20% of enterprises spend more than $12 million per year on public cloud
- More than 50% of enterprise workloads and data are expected to be in a public cloud within 12 months
- 59% of enterprises expect cloud usage to exceed prior plans due to COVID-19
- The top challenge in cloud migration is understanding application dependencies.
Cloud cost optimization
- Organizations are over budget for cloud spend by an average of 23%, and expect cloud spend to increase by 47% next year
- Respondents estimate that 30% of cloud spend is wasted.
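Those figures compound quickly. A worked example – assuming a hypothetical $10M cloud budget, which is not a number from the report – shows how a 23% overrun, a 47% expected increase, and 30% waste play out:

```python
# Worked example with an assumed $10M cloud budget (illustrative only;
# the 23%, 47%, and 30% figures come from the survey above).
budget = 10_000_000
actual = budget * 1.23       # 23% over budget
next_year = actual * 1.47    # 47% expected increase on this year's spend
wasted = actual * 0.30       # ~30% of spend estimated as waste

print(f"actual: ${actual:,.0f}")
print(f"projected next year: ${next_year:,.0f}")
print(f"estimated waste: ${wasted:,.0f}")
```

On these assumptions, a $10M budget becomes $12.3M of actual spend, roughly $18.1M next year, with about $3.7M of this year's spend wasted – which is why optimization tops the initiatives list below.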
Cloud initiatives and metrics
- 73% of organizations plan to optimize existing use of cloud (cost savings), making it the top initiative for the fourth year in a row
- 61% of organizations plan to focus on cloud migration
- 77% of organizations use cost efficiency and savings to measure cloud progress.
Organizational approach to cloud
- 73% of enterprises have a central cloud team or cloud center of excellence
- 57% of cloud teams are responsible for governing infrastructure-as-a-service (IaaS)/platform-as-a-service (PaaS) usage costs.
- 83% of enterprises indicate that security is a challenge, followed by 82% for managing cloud spend and 79% for governance
- For cloud beginners, lack of resources/expertise is the top challenge; for advanced cloud users, managing cloud spend is the top challenge
- 56% of organizations report that understanding cost implications of software licenses is a challenge for software in the cloud.
During this extended period of social distancing filled with increased online activity, I can’t help but reflect on all the user data that has been created, stored, hacked, exposed, bought, shared and sold over the last 10 years. What’s known as the black market is built on this immeasurable and personally identifiable data – information both believed to be secured and known to be exposed – and frankly, it is entirely of our own creation.
The transition from traditional onsite data colocation to the use of third-party cloud shared tenant services should be on everyone’s minds. With this growing shift, everyone from individuals to enterprises will continue to fuel threat actors by improperly storing information in the cloud.
Adversaries today do not have to spend nearly as much time or effort exploiting an organization – it’s a no-brainer for them to suck down improperly secured data from the cloud. In fact, I would argue that the amount of data exposed by misconfigured S3 buckets and/or third-party vendors (for example, misconfigured Mongo databases, Elasticsearch engines, or other applications) far exceeds exposure by any other threat actor activity.
Major factors contributing to improperly secured data include: a misconception that the cloud is inherently more secure than storing data on-premises; the struggle to define the scope of an enterprise environment, coupled with a lack of visibility into threat actor environments; the perpetual selling of security solutions as if they were silver bullets; and a shortage of security professionals.
The cloud is only as secure as we make it
I regularly hear people say the cloud is so much more secure, but when asked, “Why is it more secure?” the responses are not reassuring. Larger organizations are likely to have highly skilled teams securing their own infrastructure, but the cloud model is designed for ease of use and reduced friction and complexity – a ripe combination for folks with less technical skill to launch data into the cloud. In fact, placing the data you govern into a shared tenant service is as easy as entering a valid credit card.
However, many companies move to virtual servers in cloud services and simply duplicate traditional on-site services. They do not consider that, in order to remain secure, these servers require the exact same attention an on-site server requires: continuous backporting and patching, network firewalls, and identity and access management. Because the cloud is often used by organizations that do not have robust security teams, this maintenance and security hygiene often goes unchecked.
You can’t protect unknown data from unknown attackers
It’s well understood by now that organizations are challenged by defining the boundaries and scope of their environments, and knowing where web applications ingress and egress, whether an environment has adequate segmentation or if it’s a flat network. But it bears repeating that in order to protect data, you have to know everywhere it is and what it means.
Conducting tabletop exercises that leverage threat modeling frameworks such as STRIDE or Trike is one way to track the most likely ways a threat actor could gain access or circumvent intended security controls, but many organizations are unprepared to complete these exercises or to discuss internally the technical issues surrounding the results. In other words, they lack the language and ability to quantify threat risks, which prevents the organization from defining its appetite for risk.
The industry is still selling silver bullets and alert fatigue
Enterprise security solutions are being sold as silver bullets. Many of these solutions are essentially syslog tools dressed up with buzzwords like “AI” and “next generation” – lipstick on a pig. They are often the cause of alert fatigue, and companies quickly lose sight of the forest for the trees.
It doesn’t matter how easy a tool is to use or how positive the intended outcome is – an organization must be able to remediate its identified risk and have a plan to determine whether the risk is greater than the technical debt. Often this looks like delaying a product rollout (and ultimately revenue), or working in haste and dumping data into a new, easy-to-use cloud product in a way that creates unaccounted-for risk.
A lack of “highly seasoned” IT professionals
At the crux of the issues surrounding improperly secured information in the cloud is the lack of IT professionals available in the market today. Companies that lack robust IT teams, understandably, seek out flexible options to keep their business operations streamlined and continue supporting growth.
While organizations are hyper‐focused on alert fatigue, underfunded security teams or those who simply cannot find the needed talent will be at greater risk of having their data stolen. Hiring managers should consider expanding their search radius for filling these roles, as there are many talented job seekers that could get up to speed quickly if time is allotted for training.
The transition to third party cloud environments as an enabler… eventually
I do believe third-party cloud environments will eventually be the enabler we prop them up to be. For larger organizations, the cloud may enable greater control through heavily secured, tightly controlled CI/CD pipelines – for example, sanitized development, quality control, and testing environments. After all, it’s easy to quickly duplicate and/or burn down environments in the cloud. However, many traditional security controls are often bypassed by decisions to quickly adopt modern third-party platforms.
Data has overtaken the materials of old as the currency that drives the world. As we move further into this decade, it behooves organizations large and small to consider what data they actually need to collect or store; how and where they are securing it; and the role they may play in fueling the underground economy. Assessing how data loss will affect a company (and a company’s tolerance for such loss) is certainly complex but is imperative. I implore organizations to leverage threat vectoring frameworks and avoid the pitfalls of believing the cloud is inherently more secure.
The survey highlights data privacy, unauthorized access, server outages and integration as key concerns.
Not everyone has migrated to the cloud yet
The survey shows a mixed picture when it comes to firms migrating security tools to the cloud. While just over half of respondents (52 percent) began migrating to cloud-based security products during or before 2018, around a fifth (18 percent) waited until 2019, three percent started in 2020, 13 percent have not yet started and the remainder don’t know when they’ll migrate.
Of those that have started their migration, over half (58 percent) have migrated at least one quarter of their security tools to the cloud, while one third (33 percent) said more than 50 percent of their security tools are now cloud-based.
Typically, organizations migrate security tools to the cloud to minimize the resources and overhead associated with owning and maintaining on-premises equipment and software. This means security teams can avoid system sizing, maintenance, uptime management, and product upgrades.
Reducing engineering effort to deploy and maintain new solutions allows security analysts to complete tasks faster and frees engineers up to focus on other projects.
The survey results support this, with improvements in monitoring and tracking of attacks (29 percent) and reduced maintenance (22 percent) considered the most important gains from using cloud-based security tools.
CAPEX reductions (18 percent), faster time to value (17 percent) and access to the latest features (13 percent) are drivers for cloud adoption, but considered less important.
However, when asked what concerns they have about moving security tools to the cloud, data privacy (30 percent) remains high on the list, with unauthorized access (16 percent), server outages (14 percent), integration with other security tools (14 percent), and data sovereignty (13 percent) also being raised.
Lack of understanding about migration
While 22 percent stated migration to the cloud was not a priority for their organization, the results suggest a lack of understanding about the migration issue as a whole. Around a third (32 percent) said they did not know what concerns their organization has about moving security tools to the cloud.
Furthermore, despite about a third (32 percent) of respondents saying they consider it too difficult or too risky to migrate security tools to the cloud, nearly half (46 percent) said their preference is to migrate legacy products to the cloud, while the remaining 54 percent would rather replace legacy on-premises products with new cloud-native security tools.
Organizations are protecting a variety of data types with cloud-based security tools, with email the most widely protected (22 percent), followed by customer information (21 percent), file-sharing (20 percent) and personnel files (18 percent). However, few organizations (12 percent) have extended cloud-based security to protecting corporate financial information.
“I think regardless of what security teams want, their monitoring and response tools will follow where organizations are moving their infrastructure for business services. Ultimately, security teams might have opinions, but they really don’t have a choice. They need to operate in a way that enables the business to function, grow, and profit. That said, if history has proven anything, it is the continuous, multi-decade ebb and flow between centralized and distributed computing and cloud is the next phase of that iteration. Ultimately, security teams need to be flexible in order to be able to integrate and interoperate both their cloud and non-cloud security tools and be in a position to enable the business to deliver capabilities and services where it is best for the business – not exclusively what is good for security,” Swimlane CEO, Cody Cornell, told Help Net Security.
“If recent events are any proof of the impact of centralized versus distributed workforces on security visibility, a lot of organizations that felt they were well-positioned to secure their users and devices have been caught flat-footed as their ability to gather security information from endpoints and network perimeters has evaporated, depending on some of the infrastructure decisions and assumptions they’ve previously made. If distributed workforces are the new normal, technologies that can be both cloud-deployed and cloud-managed have an obvious advantage: they don’t lose visibility when endpoint and perimeter telemetry (e.g. traditional versus newer DNS, proxy, browser isolation and CASB solutions) is no longer available for detection and response.”
Since the advent of the public cloud as a viable alternative to on-premises systems, CIOs and CISOs have been citing security as one of the top concerns when it comes to making the switch.
While most of their worries have abated over the years, some remain, fuelled by the number of data leak incidents, mainly arising from misconfiguration.
Johnnie Konstantas, Senior Director, Security Go to Market at Oracle, says that the main reason we are seeing so many headlines around sensitive data leaks and loss is that there are almost too many security tools offered by public cloud providers.
Making cloud security administration less person-intensive and error-prone
“Public clouds are, by and large, homogeneous infrastructures with embedded monitoring capabilities that are ubiquitous and have centralized security administration and threat remediation tools built on top,” Konstantas told Help Net Security.
But cloud customers must train anew on these tools and be properly staffed to leverage them and to coordinate across the various security disciplines – hard to do when cybersecurity expertise is in historically short supply.
“Customers don’t want more tools, they want the benefit of cloud service provider innovation and expertise to address the challenge. At this point, we need reliable, accurate security configuration management and threat response that is automated,” Konstantas opined.
This is the direction in which she expects cloud-native security to go in the next five years. She believes we are likely to see a shift away from discussions about the shared responsibility model and more toward making customers cloud security heroes through automation.
“Automation really is central to effective cloud security. Just take the example of data and consider the volume of data flowing into cloud-hosted databases and data warehouses. Classifying the data, identifying PII, PHI, credit cards, etc., flagging overly permissioned access, and requiring additional authorization for data removal – all these things have to be automated. Even the remediation, or prevention of access, needs to be automated,” she noted.
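The kind of automated classification Konstantas describes can be sketched in miniature. The following Python example is purely illustrative – all names, patterns and the ACL structure are hypothetical, and real cloud-native tools use far richer detection (ML models, Luhn checks for card numbers, contextual analysis) – but it shows the two steps she names: tagging fields that contain PII, then flagging principals with write access to those fields.

```python
import re

# Illustrative patterns only; production classifiers are far more robust.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_record(record: dict) -> dict:
    """Return, per field, the list of PII categories detected in it."""
    findings = {}
    for field, value in record.items():
        hits = [label for label, rx in PII_PATTERNS.items()
                if rx.search(str(value))]
        if hits:
            findings[field] = hits
    return findings

def flag_over_permissioned(acl: dict, findings: dict) -> list:
    """Flag principals granted write access to any field that holds PII."""
    return [principal for principal, grants in acl.items()
            if any(field in findings for field in grants.get("write", []))]
```

In practice such checks would run continuously against cloud data stores and feed an automated remediation step (revoking the grant or requiring extra authorization), which is exactly the human-free loop the quote argues for.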
Cloud providers will have to break through customers’ fear that automated security means breaking the business by overreacting to false positives. Those that find a way to excel at using machine learning, model tuning and artificial intelligence for effective, accurate automated threat prevention will deservedly earn customer confidence – and not a moment too soon.
Is it safe to put critical enterprise workloads in the public cloud?
Without a doubt, the public cloud has proven a worthy alternative to private data centers by offering high resilience to threats and rapid security incident recovery. But not all public cloud providers are the same when it comes to expertise or built-in security.
Organizations with sensitive data and workloads must find providers that offer adequate security, and can do so by asking many questions and evaluating the answers.
Konstantas proposes the following (though the list isn’t exhaustive):
- What are your data protection, detection, response and recovery capabilities for both structured (database) and unstructured (object storage) data?
- How do you protect against hypervisor-based attacks, cross-tenant infection and hardware-based attacks?
- Which customer-controlled security functions are built into your cloud and are they charged for?
- Which parts of security configuration, detection and threat remediation are automated on your platform and to which services do they apply (i.e. IaaS, PaaS, SaaS)?
For the CISO who has to work with the CIO to lead a massive migration of the organization’s data to the cloud, she advises getting as much visibility into the project as possible.
“CISOs need to prepare answers for how the organization will meet its regulatory and compliance obligations for the data during the migration and once fully operational in the cloud,” she explained.
Again, there are many questions that must be answered. Among them are:
- How will security coverage look after the migration as compared to what is being done on premises?
- How will security posture visibility and effectiveness increase?
- What cost savings will be realized on security spend by adopting built-in cloud security?
- How will holistic cloud security posture be communicated to the CIO and board of directors?
“If the CISO is working with a cloud security provider that understands critical enterprise workloads, they will have ample support and guidance in preparing and documenting these answers because enterprise-focused CSPs have deep experience with the specific requirements of global companies, complex enterprise applications and data residency and sovereignty requirements. Enterprise-focused CSPs staff teams ready to share those insights and furnish the proof points customers require,” she concluded.
Improved AI capabilities, accelerated business intelligence, and increased productivity and efficiency were the top expectations of organizations currently investing in cloud-based quantum computing technologies, according to IDC.
Users are very optimistic
Initial survey findings indicate that while cloud-based quantum computing is a young market, and allocated funds for quantum computing initiatives are limited (0-2 percent of IT budgets), end users are optimistic that early investment will result in a competitive advantage.
The manufacturing, financial services, and security industries are currently leading the way by experimenting with more potential use cases, developing advanced prototypes, and being further along in their implementation status.
Easy access to quantum computing
Complex technology, skillset limitations, lack of available resources, and cost deter some organizations from investing in quantum computing technology. These factors, combined with broad interdisciplinary interest, have pushed quantum computing vendors to develop technology that addresses multiple end-user needs and skill levels.
The result is increased availability of cloud-based quantum computing technology that is more accessible and user-friendly for new end users. Currently, the preferred types of quantum computing technologies employed across industries include quantum algorithms, cloud-based quantum computing, quantum networks, and hybrid quantum computing.
“Quantum computing is the future industry and infrastructure disruptor for organizations looking to use large amounts of data, artificial intelligence, and machine learning to accelerate real-time business intelligence and innovate product development. Many organizations — from many industries — are already experimenting with its potential,” said Heather West, senior research analyst, Infrastructure Systems, Platforms, and Technology at IDC.