Businesses are increasingly moving multiple applications to the cloud using containers and relying on Kubernetes for orchestration, according to Zettaset.
However, the findings also confirm that organizations are inadequately securing the data stored in these new cloud-native environments, continuing to rely on legacy security technology as a solution.
Businesses face significant IT-related challenges as they strive to keep up with the demands of digital transformation. Now more than ever, companies are rapidly developing and deploying new applications to maintain a competitive edge.
Companies must invest in high performance data protection
The adoption of containers, microservices and Kubernetes for orchestration plays a significant role in these digital acceleration efforts. And yet, while many companies are eager to adopt these new cloud-native technologies, research shows that they are not accurately weighing the benefits of enterprise IT innovation against the inherent security risks.
“Our goal with this research was to determine whether enterprise organizations who are actively transitioning from DevOps to DevSecOps are investing in proper security and data protection technology. And while findings confirm that companies are in fact making the strategic decision to shift towards cloud-native environments, they are currently ill-equipped to secure their company’s most critical asset: data.
“Companies must invest in high performance data protection in order to secure critical information in real time across any architecture.”
- Organizations are embracing the cloud and cloud-native technologies: 39% of respondents have multiple production applications deployed on Kubernetes. But, companies are still struggling with the complexities associated with these environments and how to secure deployments.
- Cloud providers wield considerable influence over Kubernetes distribution: A little over half of those surveyed are using open source Kubernetes available through the Cloud Native Computing Foundation (CNCF). And 34.7% of respondents are using a Kubernetes offering managed by an existing cloud provider such as AWS, Google, Microsoft, or IBM.
- Kubernetes security best practices have yet to be identified: 60.1% of respondents believe there is a lack of proper education and awareness of the proper ways to mitigate risk associated with storing data in cloud-native environments. And 43.2% are confident that multiple vulnerable attack surfaces are created with the introduction of Kubernetes.
- Companies have yet to evolve their existing security strategies: Almost half of respondents (46.5%) are using traditional data encryption tools to protect their data stored in Kubernetes clusters. Over 20% are finding that these traditional tools are not performing as desired.
“The results of our research substantiate the notion that enterprise organizations are moving forward with cloud-native technologies such as containers and Kubernetes. What we were most interested in discovering was how these companies are approaching security,” said Charles Kolodgy, security strategist and author of the report.
“Companies overall are concerned about the wide range of potential attack surfaces. They are applying legacy solutions but those are not designed to handle today’s ever-evolving threat landscape, especially as data is being moved off-premise to cloud-based environments.
“To stay ahead of what’s to come, companies must look to solutions purposely built to operate in a Kubernetes environment.”
Driven by a strong curiosity to know how computers and computer programs are made, how they work, and how safe they are, Sheila A. Berta, Head of Security Research at Dreamlab Technologies, has been interested in cybersecurity since her early teens.
For the last several years, she has been conducting investigations in a variety of information security areas like hardware hacking, car hacking, wireless security, malware and – more recently – Docker, Kubernetes and cloud security.
“At the moment everything tends to migrate to containerized, serverless and/or cloud environments with a microservices focus, so DevOps and other IT professionals have been forced to learn how to implement and work with these infrastructures,” she said, explaining her recent research interests.
“The attack and defense techniques that can be applied in these environments are completely different from the techniques applied in ‘traditional’ architectures, so it’s very important that security professionals now acquire the necessary skills to competently protect these modern infrastructures.”
One of the ways they can achieve this is to attend a training course on the subject.
Virtual trainings through HITBSecTrain
During HITBCyberWeek, which is scheduled to start on November 15, Berta’s colleague Sol Ozzan will hold an online workshop focused on Docker and Kubernetes defense that will serve as a preview for a 2-day virtual training course that the two will conduct through HITBSecTrain in February next year.
“Our Attack and Defense on Docker, Swarm and Kubernetes training at HITBSecTrain will provide attendees with the practical knowledge they need to analyze and secure containerized & Kubernetes-orchestrated environments,” Berta told Help Net Security.
“Our trainings have a lot of hands-on laboratories. We start with the Docker fundamentals and then jump into the labs with Docker Black Box and White Box analysis, as well as defense on containers and Docker images. At the end of the first day, we focus on Swarm (official Docker orchestrator) with a variety of practices in attack and defense.”
The second day is fully dedicated to Kubernetes. They start with the fundamentals of this technology and then dive into the hands-on with Black Box, Gray Box, and White Box analysis. Sophisticated attack techniques will be explained, as well as advanced security features that can be implemented in this famous orchestrator.
This is not the first time she has held a training related to container environments – she also did so at Black Hat USA 2020. But, as can be expected, they are continuously updating the materials: they have lately added more attack techniques targeting different Docker and Kubernetes components, such as the Docker Registry and the Kubernetes kubelet, as well as more open source tools that can be used to analyze and secure these infrastructures.
She also couldn’t help but speak highly of another 2-day training course that two other Dreamlab Technologies colleagues are set to hold in February.
“I had the pleasure of seeing how the trainers built the materials for the Attacking and Securing Industrial Control Systems (ICS) course and I have to say that it is the most practical training on ICS hacking I have ever seen. It even has practices for air-gap bypass techniques,” she noted.
“I believe practical experience is very important when it comes to these topics. We have prepared a realistic ICS environment that students will access throughout the course to perform all the exploitation techniques explained by the trainers.”
After five months in beta, the GitHub Code Scanning security feature has been made generally available to all users: for free for public repositories, as a paid option for private ones.
“So much of the world’s development happens on GitHub that security is not just an opportunity for us, but our responsibility. To secure software at scale, we need to make a base-level impact that can drive the most change; and that starts with the code,” Grey Baker, GitHub’s Senior Director of Product Management, told Help Net Security.
“Everything we’ve built previously was about responding to security incidents (dependency scanning, secret scanning, Dependabot) — reacting in real time, quickly. Our future state is about fundamentally preventing vulnerabilities from ever happening, by moving security into the core developer workflow.”
GitHub Code Scanning
The Code Scanning feature is powered by CodeQL, a powerful static analysis engine built by Semmle, which was acquired by GitHub in September 2019.
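In practice, enabling code scanning on a repository comes down to committing a workflow file that runs the CodeQL actions on pushes and pull requests. A minimal example might look like the following (a sketch; action versions and the `languages` value will vary by project):

```yaml
name: "CodeQL"

on:
  push:
    branches: [main]
  pull_request:

jobs:
  analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      # Initialize CodeQL for the languages used in the repository
      - uses: github/codeql-action/init@v1
        with:
          languages: javascript
      # Run the analysis and upload results as code scanning alerts
      - uses: github/codeql-action/analyze@v1
```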
“We want developers to be able to use their tools of choice, for any of their projects on GitHub, all within the native GitHub experience they love. We’ve partnered with more than a dozen open source and commercial security vendors to date and we’ll continue to integrate code scanning with other third-party vendors through GitHub Actions and Apps,” Baker noted.
“The major value add here is that developers can work, and stay within, the code development ecosystem in which they’re most accustomed to while using their preferred scanning tools,” explained James Brotsos, Senior Solutions Engineer at Checkmarx.
“GitHub is an immensely popular resource for developers, so having something that ensures the security of code without hindering agility is critical. Our ability to automate SAST and SCA scans directly within GitHub repos simplifies workflows and removes tedious steps for the development cycle that can traditionally stand in the way of achieving DevSecOps.”
Checkmarx’s SCA (software composition analysis) helps developers discover and remediate vulnerabilities within open source components included in their applications, prioritizing them based on severity. Checkmarx SAST (static application security testing) scans proprietary code bases – even uncompiled – to detect new and existing vulnerabilities.
“This is all done in an automated fashion, so as soon as a pull request takes place, a scan is triggered, and results are embedded directly into GitHub. Together, these integrations paint a holistic picture of the entire application’s security posture to ensure all potential gaps are accounted for,” Brotsos added.
Leon Juranic, CTO at DefenseCode, said that they are very excited by this initiative, as it brings security analysis to more than 50 million GitHub users.
“Having the security analysis results displayed as code scanning alerts in GitHub provides a convenient way to triage and prioritize fixes – a process that is usually cumbersome, requiring scrolling through many pages of exported reports, going back and forth between your code and the reported results, or reviewing them in dashboards provided by the security tool. The ease of use now means you can initiate scans, view, fix, and close alerts for potential vulnerabilities in your project’s code in an environment that is already familiar and where most of your other workflows are done,” he noted.
A week ago, GitHub also announced additional support for container scanning and standards and configuration scanning for infrastructure as code, with integration by 42Crunch, Accurics, Bridgecrew, Snyk, Aqua Security, and Anchore.
The benefits and future plans
“We expect code scanning to prevent thousands of vulnerabilities from ever existing, by catching them at code review time. We envisage a world with fewer software vulnerabilities because security review is an automated part of the developer workflow,” Baker explained.
“During the code scanning beta, developers fixed 72% of the security errors found by CodeQL and reported in the code scanning pull request experience. Achieving such a high fix rate is the result of years of research, as well as an integration that makes it easy to understand each result.”
Over 12,000 repositories tried code scanning during the beta, and another 7,000 have enabled it since it became generally available, he says, and the reception has been really positive, with many highlighting valuable security finds.
“We’ll continue to iterate and focus on feedback from the community, including around access control and permissions, which are of high priority to our users,” he concluded.
There’s a growing, organized and increasingly sophisticated pattern of attacks on cloud native infrastructure, according to Aqua Security.
While most attacks were aimed at abusing public cloud compute resources for cryptocurrency mining, the methods used open the door for higher-value targets that leverage security gaps in container software supply chains and runtime environments.
The report provides trends and observed categories of attacks, but also explains in great detail the specific progression of several attack vectors, from the originating malicious images to the specific evasion techniques, malicious payloads, and propagation attempts.
Attacks on cloud native infrastructure
- Container images in public registries being poisoned with Potentially Unwanted Applications (PUAs) that cannot be detected using static scanning. They spring into action only when the container is running.
- Sophisticated evasion techniques are being used to hide attacks and make them more persistent. These include the use of “vanilla” images that seem innocuous, disabling other malware, delaying before downloading payloads into the running container, using base64 encoding to obfuscate malware, and more.
- Since the beginning of 2020, the volume of attacks has dramatically increased, suggesting that there is organized infrastructure and systematic targeting behind these attacks. More than 16,000 individual attacks were tracked back to multiple locations across the globe.
- The main motivation of the malicious actors has been to hijack cloud compute resources to mine for cryptocurrency, but Team Nautilus has seen evidence that other objectives, such as establishing DDoS infrastructure, were also attempted.
“The attacks we observed are a significant step up in attacks targeting cloud native infrastructure. We expect a further increase in sophistication, the use of evasion techniques and diversity of the attack vectors and objectives, since the widespread use of cloud native technologies makes them a more lucrative target for bad actors,” notes Idan Revivo, Head of Team Nautilus at Aqua.
“Security teams are advised to take the appropriate measures both in their pipelines as well as runtime environments, to detect and intercept such attempts.”
A malicious cryptocurrency miner and DDoS worm that has been targeting Docker systems for months now also steals Amazon Web Services (AWS) credentials.
The original threat
When the worm first runs on a target installation it displays TeamTNT’s “calling card”, then proceeds to:
- Scan for open Docker daemon ports (i.e., misconfigured Docker daemons)
- Create an Alpine Linux container to host the coinminer and DDoS bot
- Search for and delete other coin miners and malware
- Configure the firewall to allow ports that will be used by the other components, sinkhole other domain names, exfiltrate sensitive information from the host machine
- Download additional utilities, a log cleaner, and a tool that attackers may use to pivot to other devices in the network (via SSH)
- Download and install the coinminer
- Collect system information and send it to the C&C server
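The first step above works because a Docker daemon exposed on its default unencrypted TCP port accepts unauthenticated API calls. Defenders can run the same check against their own hosts; here is a minimal sketch in Python (the host list is a placeholder to replace with your own inventory):

```python
import socket

DOCKER_API_PORT = 2375  # default unencrypted Docker daemon port


def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    # Hosts to audit -- replace with your own inventory
    for host in ["127.0.0.1"]:
        if is_port_open(host, DOCKER_API_PORT):
            print(f"WARNING: {host} exposes the Docker API on port {DOCKER_API_PORT}")
```

Any host that answers on this port should have the API bound to localhost or put behind TLS and a firewall.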
The latest iteration has been equipped with new capabilities, Cado Security researchers found.
The worm still scans for open Docker APIs, then spins up Docker images and installs itself in a new container, but it now also searches for exploitable Kubernetes systems and for files containing AWS credentials and configuration details – just in case the compromised systems run on AWS infrastructure.
The code to steal these files is relatively straightforward, the researchers note, and they expect other worms to copy this new ability soon.
But are the attackers using the stolen credentials or are they selling them? The researchers tried to find out by sending “canary” AWS keys to TeamTNT’s servers, but they haven’t been used yet.
“This indicates that TeamTNT either manually assesses and uses the credentials, or any automation they may have created isn’t currently functioning,” they concluded.
Nevertheless, they urge businesses to:
- Identify systems that are storing AWS credential files and delete them if they aren’t needed
- Use firewall rules to limit any access to Docker APIs
- Review network traffic for connections to mining pools or using the Stratum mining protocol
- Review any connections sending the AWS Credentials file over HTTP
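The first recommendation can be partly automated. A rough sketch in Python that walks a directory tree looking for the files this worm targets (the filename set is an assumption based on the AWS CLI’s defaults under `~/.aws/`):

```python
import os

# Filenames the AWS CLI writes by default under ~/.aws/
CANDIDATE_NAMES = {"credentials", "config"}


def find_aws_credential_files(root: str) -> list:
    """Return paths of likely AWS credential/config files under root."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if os.path.basename(dirpath) == ".aws":
            for name in filenames:
                if name in CANDIDATE_NAMES:
                    hits.append(os.path.join(dirpath, name))
    return hits


if __name__ == "__main__":
    for path in find_aws_credential_files(os.path.expanduser("~")):
        print("Review and remove if unused:", path)
```

Flagged files that are genuinely needed should at least be scoped to minimal IAM permissions, so a stolen copy is of limited value.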
77% of organizations have adopted microservices, with 92% experiencing success with microservices, according to an O’Reilly survey.
The report surveyed 1,502 software engineers, systems and technical architects, engineers, and decision-makers from around the globe. Of note, the report found that adopters are betting big on microservices, with 29% of organizations reporting that they are migrating or implementing a majority of their systems using microservices.
Additionally, the survey found that teams who own the software lifecycle (building, testing, deployment and maintenance) succeed with microservices at a rate 18% higher than those who don’t.
“The majority of organizations have already started to migrate their monolithic systems, applications, and architectures to microservices, and many more are looking to begin that transition,” said Mary Treseler, vice president of content strategy at O’Reilly.
“Breaking a monolith into microservices has clear engineering benefits including improved flexibility, simplified scaling, and easier management – all of which result in better customer experiences.”
Containers bring success
Respondents who used containers to deploy and manage microservices were significantly more likely to report success than those who didn’t. Almost half (49%) of respondents who describe their deployments as “a complete success” also instantiate at least 75% of their microservices in containers. In total, 62% of respondents use containers to deploy at least some of their microservices.
“While container adoption in microservices contributes to microservices success, we saw a lower percent of container adoption than we did in our 2018 report,” said Treseler.
“For some adopters, technical debt from proprietary or monolithic systems might constrain them from using containers and it might be faster and less costly, at least in the short term, to deploy microservices in a database or application server.”
Other notable findings
- 61% of respondents say their organizations have been using microservices for a year or more and 28% have used microservices for at least three years.
- 74% of respondents say their teams own the build-test-deploy-maintain phases of the software lifecycle. 49% of these teams report being at least “mostly successful” with microservices and 10% report that microservices development efforts were a “complete success.”
- 40% of adopters cite corporate culture or mindset as the biggest barrier to microservices adoption. Complexity in one form or another (56%) and decomposing monolithic applications into microservices (37%) were also major challenges.
New vulnerabilities in open source packages were down 20% compared to last year, suggesting that the security of open source packages and containers is heading in a positive direction, according to Snyk.
Well-known vulnerabilities, such as cross-site scripting, continue to be reported but aren’t impacting as many projects as they have in previous years. This trend is further helped by organizations starting to drive a culture shift that treats open source and container security as a core responsibility, shared and integrated across development, security and operations teams.
This year the report took an even deeper look at vulnerability and ecosystem-level trends that impact the overall security posture of organizations relying on open source libraries.
Across the six popular ecosystems the report examined, there were fewer new vulnerabilities reported in 2019 than in 2018 – a promising finding – but there are still significant improvements to strive for with slightly less than two thirds of vulnerabilities still taking more than 20 days to remediate.
Common threats getting caught and remediated early
While well-known vulnerabilities in open source packages, such as cross-site scripting, are reported in high numbers, the number of projects they impact is fairly low. These common threats appear to be getting caught and remediated early, unlike some lesser-known vulnerabilities.
For example, the report found certain vulnerabilities were reported in highly popular packages, affecting thousands of projects and thereby increasing the probability of them being exploited by attackers. Based on the report, the top vulnerability currently impacting scanned projects is prototype pollution, found in nearly 27% of all projects.
For the first time in the last four years, there has been a big shift in security mindset as organizations start embracing the core elements of DevSecOps and begin implementing more scalable programs and best practices to ensure shared responsibility.
Who should be responsible for designing and implementing security controls?
When respondents were asked the multi-answer question about who they felt should be responsible for designing and implementing security controls in their software development, development teams were commonly identified in addition to operations and security teams. This is a much more even spread across the three different teams compared to last year in which less than 25% felt security and operations played a role.
However, the fact the responses were all less than 65% still indicates that respondents did not typically identify all three groups as jointly being responsible. While progress has been made, it’s clear there is still a need for a more significant shift towards a shared-responsibility culture.
“This year’s report is very encouraging as we are seeing the volume of open source vulnerabilities trending down for the first time in four years. In addition, there are positive trends emerging around the collaboration of development, security and operations teams to address the growing demand for secure application development,” said Alyssa Miller, Application Security Advocate, Snyk.
“Despite the year over year progress, we must continue to prioritize security and empower organizations to implement programs to help drive DevSecOps and developers to be involved in securing their code from the very beginning. We need to focus on continuing these efforts to ensure these emerging trends continue on this positive trajectory in 2021 and beyond.”
Open source statistics
Open source ecosystems continue to expand, led by npm, which grew over 117% in 2019 and now spans over 1,300,000 packages.
- New vulnerabilities were down almost 20% across the most popular ecosystems in 2019.
- Cross-site scripting vulnerabilities were the most commonly reported.
- Two prevalent prototype pollution vulnerabilities resulted in an impact on over 25% of scanned projects.
- New vulnerabilities reported in common Linux distributions demonstrate the need for comprehensive monitoring for new vulnerabilities in container images.
- SQL Injection vulnerabilities, while decreasing in prevalence in most ecosystems, have increased over the last three years in PHP packages.
Container and orchestration challenges
- Official base images tagged as latest include known vulnerabilities; in particular, the official Node.js image has almost 700 known vulnerabilities.
- Over 30% of survey participants do not review Kubernetes manifests for insecure configurations.
- Requirements for security-related resource controls in Kubernetes are not widely implemented.
- Increasingly, survey respondents feel that security for software and infrastructure should be shared among development, security, and operations roles.
- However, few organizations have programs in place to develop shared responsibility across the dev, sec, and ops personnel.
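Reviewing manifests for insecure configurations and missing resource controls, which the findings above suggest many teams skip, can be scripted. A rough sketch in Python that inspects a workload manifest already parsed into a dict (e.g., from `kubectl get deployment -o json`) and flags containers without CPU and memory limits; the nested structure follows the standard Kubernetes pod template spec:

```python
def containers_without_limits(manifest: dict) -> list:
    """Return names of containers in a workload manifest lacking resource limits."""
    containers = (
        manifest.get("spec", {})
        .get("template", {})
        .get("spec", {})
        .get("containers", [])
    )
    flagged = []
    for c in containers:
        limits = c.get("resources", {}).get("limits", {})
        if "cpu" not in limits or "memory" not in limits:
            flagged.append(c.get("name", "<unnamed>"))
    return flagged


# Example: a Deployment with one properly constrained container and one without
deployment = {
    "kind": "Deployment",
    "spec": {"template": {"spec": {"containers": [
        {"name": "web", "resources": {"limits": {"cpu": "500m", "memory": "256Mi"}}},
        {"name": "sidecar"},  # no resources block at all
    ]}}},
}
print(containers_without_limits(deployment))  # -> ['sidecar']
```

A check like this is cheap to run in CI, which is where the report suggests such reviews belong.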
Hybrid and multi-cloud architectures have become the de facto standard among organizations, embraced by 53 percent and making them the most popular form of deployment.
Advantages of hybrid and multi-cloud architectures
Surveying over 250 worldwide business executives and IT professionals from a diverse group of technical backgrounds, Denodo’s cloud usage survey revealed that hybrid cloud configurations are the centre of all cloud deployments at 42 percent, followed by public (18 percent) and private clouds (17 percent).
The advantages of hybrid cloud and multi-cloud configurations according to respondents include the ability to diversify spend and skills, build resiliency, and cherry-pick features and capabilities depending on each cloud service provider’s particular strengths, all while avoiding the dreaded vendor lock-in.
The use of container technologies increased by 50 percent year-over-year, indicating a growing trend to use them for scalability and portability to the cloud. DevOps professionals continue to look to containerization for production, because it enables reproducibility and the ability to automate deployments.
About 80 percent of the respondents are leveraging some type of container deployment, with Docker being the most popular (46 percent) followed by Kubernetes (40 percent) which is gaining steam, as is evident from the consistent support of all the key cloud providers.
Most popular cloud service providers
In a foundational metric of cloud adoption maturity, 78 percent of all respondents are running some kind of workload in the cloud. Over the past year, there has been a positive reinforcement of cloud adoption, with at least a 10 percent increase across beginners, intermediate, and advanced adopters.
About 90 percent of those embracing cloud are selecting AWS and Microsoft Azure as their service providers, demonstrating the continued dominance of these front-runners.
But users are not just lifting their on-premises applications and shifting them to either or both of these clouds; 35 percent said they would re-architect their applications for the best-fit cloud architecture.
For the most popular cloud initiative, analytics and BI came out at the top with two out of three (66 percent) participants claiming to use it for big data analytics projects. AWS, Azure, and Google Cloud each has its own specific strengths, but analytics surfaced as the top use case across all three of them. This use case was followed closely by both logical data warehouse (43 percent) and data science (41 percent) in the cloud.
When it comes to data formats, two thirds of the data being used is still in structured format (68 percent), while there is a vast pool of unstructured data that is growing in importance. Cloud object storage (47 percent) along with SaaS data (44 percent) are frequently used to maximize ease of computation and performance optimization.
Further, cloud marketplaces are growing at a phenomenal speed and are becoming more popular. Half (50 percent) of those surveyed are leveraging cloud marketplaces with utility/pay-as-you-go pricing being the most popular incentive (19 percent) followed by its self-service capability/ability to minimize IT dependency (13 percent). Avoiding a long-term commitment also played a role (6 percent).
“As data’s center of gravity shifts to the cloud, hybrid cloud and multi-cloud architectures are becoming the basis of data management, but the challenge of integrating data in the cloud has almost doubled (43 percent),” said Ravi Shankar, SVP and CMO of Denodo.
“Today, users are looking to simplify cloud data integration in a hybrid/multi-cloud environment without having to depend on heavy duty data migration or replication which may be why almost 50 percent of respondents said they are considering data virtualization as a key part of their cloud integration and migration strategy.”
Most enterprises (85%) believe embracing the public cloud is critical to fuel innovation, but the majority are not equipped to operate in the cloud securely, according to a DivvyCloud survey of nearly 2,000 IT professionals.
In fact, of those surveyed whose organization has already adopted public cloud, only 40% have an approach in place for managing cloud and container security.
Avoiding security issues in the cloud
Only a little over half (58%) said their organization has clear guidelines and policies in place for developers building applications and operating in the public cloud. And of those, 25% said these policies are not enforced, while 17% confirmed their organization lacks clear guidelines entirely.
“Enterprises believe they must choose between innovation and security—a false choice we see manifested in the results of this report, as well as in conversations with our customers and prospects,” said Brian Johnson, CEO at DivvyCloud.
“Only 35% of respondents do not believe security impedes developers’ self-service access to best-in-class cloud services to drive innovation—meaning 65% believe they must choose between giving developers self-service access to tools that fuel innovation and remaining secure.
“The truth is, security issues in the cloud can be avoided. By employing the necessary people, processes, and systems at the same time as cloud adoption (not weeks, months, or years later), enterprises can reap the benefits of the cloud while ensuring continuous security and compliance.”
Additional key findings
Automation is coveted but not leveraged in cloud security: Nearly 70% of all respondents believe that automation can provide benefits to their organization’s cloud security strategy, but only 48% say their cloud security strategy currently incorporates products that leverage automation.
The vast majority of respondents (85%) trust automated security solutions more than or the same as human security professionals.
Developers and security are misaligned: Almost half (49%) of all respondents whose organizations use public cloud said their developers and engineers at times ignore or circumvent cloud security and compliance policies.
Enterprises lack understanding of applicable regulations and standards: Out of all respondents, 42% do not know which frameworks their company uses to maintain compliance with relevant standards and regulations (such as GDPR, HIPAA, PCI DSS, SOC 2, etc.)
Infrastructure-as-a-Service (IaaS) reigns supreme: When asked about the architectures their organizations currently use or plan to use within the next year to build apps, 42% said IaaS; among larger organizations with 10,000 or more employees, that number goes up to 53%.
The cloud is ubiquitous: Only 7% of respondents work for organizations that do not use any public cloud services, and only 5% reported no plans to adopt public cloud—a significant drop from the 11% who reported no adoption plans last year.
Enterprise multicloud strategies are declining: 64% of this year’s survey respondents confirmed their organization is using two or more cloud services, a 13% decline from last year.
Only half of the vulnerabilities in cloud containers ever posed a threat, according to a Rezilion study.
An analysis of the 20 most popular container images on Docker Hub found that 50% of vulnerabilities were never loaded into memory and therefore did not pose a threat, regardless of Common Vulnerability Scoring System (CVSS) scores and despite the vast budget and manpower spent on patching or mitigation.
By triaging vulnerabilities using a continuous adaptive risk and trust assessment (CARTA) approach and then prioritizing treatment of those that are commonly targeted, companies can significantly reduce their security budgets or free up manpower to focus on other critical issues.
Firms with good security posture are equally breached
According to IDC, enterprises are spending 7-10% of their security budget on vulnerability management as daily operations become increasingly dependent on cloud services. Vulnerability scanners overload and confuse security teams with mountains of results that would be impossible to patch all at once.
The existing prioritization practices such as CVSS provide no notable reduction of breaches in organizations with mature vulnerability management programs. Firms with good security posture are equally breached by known vulnerabilities as those with poor security posture.
A risk-based approach to vulnerability management
Gartner recommends that “security and risk management leaders should rate vulnerabilities on the basis of risk in order to improve vulnerability management program effectiveness”.
Gartner also predicts that “by 2022, approximately 30% of enterprises will adopt a risk-based approach to vulnerability management” and “by 2022, organizations that use the risk-based vulnerability management method will suffer 80% fewer breaches.”
“A vulnerability is only as dangerous as the threat exploiting it, and in some instances during our research, we found the figure dropped to as low as 2%. By focusing on actual vs. perceived risk, we found the security industry has been unnecessarily exaggerating the number of vulnerabilities security teams must address, which has dangerous ramifications for the cloud security landscape,” said Shlomi Boutnaru, CTO at Rezilion.
“A continuous adaptive risk and trust assessment-based approach reduces friction and overhead by identifying vulnerabilities running in memory and then prioritizing treatment of those vulnerabilities commonly targeted by hackers, as well as any that don’t have mitigations.”
Five security best practices for DevOps and development professionals managing Kubernetes deployments have been introduced by Portshift.
Integrating these security measures into the early stages of the CI/CD pipeline will help organizations detect security issues earlier, allowing security teams to remediate them quickly.
Kubernetes as the market leader
The use of containers continues to rise in popularity in test and production environments, increasing demand for a means to manage and orchestrate them. Of all the orchestration tools, Kubernetes (K8s) has emerged as the market leader in cloud-native environments.
Unfortunately, Kubernetes is not as adept at security as it is at orchestration. It is therefore essential to use the right deployment architecture and security best practices for all deployments.
Kubernetes security challenges
However, as Kubernetes has risen in popularity, it has also brought its own set of security issues, increasing the risk of attacks on applications.
Because Kubernetes deployments consist of many different components (the Kubernetes master and nodes, the server that hosts Kubernetes, the container runtime used by Kubernetes, the networking layers within the cluster, and the applications that run inside containers hosted on Kubernetes), securing Kubernetes requires DevOps teams and developers to address the security challenges associated with each of these components.
Five security best practices
- Authorization: Kubernetes offers several authorization modes, which are not mutually exclusive. It is recommended to use RBAC and ABAC in combination, where RBAC policies are enforced first and ABAC policies complement them with finer-grained filtering.
- Pod security: Since each pod contains a set of one or more containers, it is essential to control their behavior. This is done using Pod Security Policies, cluster-level resources that control security-sensitive aspects of the pod specification.
- Container security: Kubernetes includes basic workload security primitives related to container security. However, if apps, or the environment, are not configured correctly, the containers become vulnerable to attacks.
- Migration to production: As companies move more deployments into production, that migration increases the volume of vulnerable workloads at runtime. This issue can be overcome by applying the solutions described above, as well as making sure that your organization maintains a healthy DevOps/DevSecOps culture.
- Securing CI/CD pipelines on Kubernetes: Running CI/CD on Kubernetes allows for the build-out, testing, and deployment of K8s environments that can quickly be scaled as needed. Security must be baked into the CI/CD process, because otherwise attackers can gain access at a later point and infect your code or environment. Leverage a security solution that acts as a protection layer for K8s and provides visibility at both the app and cluster levels.
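To make the authorization and pod-security recommendations above concrete, a minimal sketch of a Kubernetes manifest might look like the following. The namespace, subject, and image names are hypothetical, and real policies would be tailored to the workload:

```yaml
# Illustrative only: a least-privilege RBAC Role and binding, plus a pod
# with a restrictive securityContext. All names here are hypothetical.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  namespace: demo
  name: pod-reader          # grants read-only access to pods in "demo"
rules:
- apiGroups: [""]
  resources: ["pods"]
  verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  namespace: demo
  name: read-pods
subjects:
- kind: User
  name: ci-bot              # hypothetical CI service identity
  apiGroup: rbac.authorization.k8s.io
roleRef:
  kind: Role
  name: pod-reader
  apiGroup: rbac.authorization.k8s.io
---
apiVersion: v1
kind: Pod
metadata:
  name: hardened-app
  namespace: demo
spec:
  containers:
  - name: app
    image: example/app:1.0  # hypothetical image
    securityContext:
      runAsNonRoot: true
      allowPrivilegeEscalation: false
      readOnlyRootFilesystem: true
```

The RBAC role limits what the CI identity can do at the cluster level, while the container-level securityContext reduces the blast radius if the application itself is compromised.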
“As the leading orchestration platform, Kubernetes is in active use at AWS, Google Cloud Platform, and Azure,” said Zohar Kaufman, VP, R&D, Portshift. “With the right security infrastructure in place, it is set to change the way applications are deployed in the cloud with unprecedented efficiency and agility.”