As we near 2021, it seems that the changes to our working life that came about in 2020 are set to remain. Businesses are transforming as companies continue to embrace remote working practices to adhere to government guidelines. What does the next year hold for organizations as they continue to adapt in the age of the Everywhere Enterprise?
We will see the rush to the cloud continue
The pandemic saw more companies than ever move to the cloud as they sought collaboration and productivity tools for employee bases working from home. We expect that surge to continue as more companies realize the importance of the cloud in 2021. Businesses are prepared to preserve these new working models in the long term, some perhaps permanently: Google urged employees to continue working from home until at least next July and Twitter stated employees can work from home forever if they prefer.
Workforces around the world need to continue using alternatives to physical face-to-face meetings, and remote collaboration tools will help. Cloud-based tools are perfect for that kind of functionality, which is partly why many customers that are not yet in the cloud want to be. Customers who have already started the cloud migration journey are moving even more resources to public cloud infrastructure.
People will be the new perimeter
While people will eventually return to the office, they won’t do so full-time, and they won’t return in droves. This shift will close the circle on a long trend that has been building since the mid-2000s: the dissolution of the network perimeter. The network and the devices that defined its perimeter will become even less special from a cybersecurity standpoint.
Instead, people will become the new perimeter. Their identity will define what they’re allowed to access, both inside and outside the corporate network. Even when they are logged into the network, they will have minimal access to resources until they and the device they are using have been authenticated and authorized. This approach, known as zero trust networking, will pervade everything, covering not just employees, but customers, contractors, and other business partners.
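In code terms, a zero-trust gate evaluates the user and the device on every request, regardless of network location. A minimal Python sketch of the idea (all names here are illustrative assumptions, not any vendor's API):

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_authenticated: bool   # e.g., the user completed MFA
    device_compliant: bool     # e.g., patched, encrypted, managed device
    resource: str              # the resource being requested

def authorize(req: AccessRequest) -> bool:
    """Zero-trust check: network location is deliberately never consulted.
    Access requires both a verified identity and a healthy device."""
    return req.user_authenticated and req.device_compliant
```

On-network and off-network requests flow through the same check, which is what makes the person, not the perimeter, the unit of trust.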
User experience will be increasingly important in remote working
Happy, productive workers are even more important during a pandemic, especially as employees are, on average, working three hours longer since the pandemic started, disrupting their work-life balance. It’s up to employers to focus on the user experience and make workers’ lives as easy as possible.
When the COVID-19 lockdown began, companies coped by expanding their remote VPN usage. That got them through the immediate crisis, but it was far from ideal. On-premises VPN appliances suffered a capacity crunch as they struggled to scale, creating performance issues, and users found themselves dealing with cumbersome VPN clients and log-ins. It worked for a few months, but as employees settle in to continue working from home in 2021, IT departments must concentrate on building a better remote user experience.
Old-school remote access mechanisms will fade away
This focus on the user experience will change the way that people access computing resources. In the old model, companies used a full VPN to tunnel all traffic via the enterprise network. This introduced latency issues, especially when accessing applications in the cloud because it meant routing all traffic back through the enterprise data center.
It’s time to stop routing cloud sessions through the enterprise network. Instead, companies should let remote workers access cloud applications directly, which means sanitizing traffic either on the device itself or in the cloud.
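One common realization of this is split tunneling: known SaaS destinations go straight to the internet while everything else still traverses the corporate tunnel. A toy routing decision in Python (the domain list is hypothetical):

```python
from urllib.parse import urlparse

# Hypothetical SaaS domains that remote workers may reach directly;
# all other destinations are sent through the corporate VPN.
DIRECT_ACCESS_DOMAINS = {"office.com", "salesforce.com", "zoom.us"}

def route_for(url: str) -> str:
    """Return "direct" for allowlisted SaaS hosts, else "vpn"."""
    host = urlparse(url).hostname or ""
    if any(host == d or host.endswith("." + d) for d in DIRECT_ACCESS_DOMAINS):
        return "direct"
    return "vpn"
```

Real products make this decision per-flow with far richer policy, but the shape is the same: classify the destination, then bypass the data-center hairpin for trusted cloud services.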
User authentication improvements
Part of that new approach to authentication involves better user verification, which will come in two parts. First, it’s time to ditch the password. The cybersecurity community has advocated this for a long time, and the work-from-home trend will accelerate it. Employees accessing corporate resources from mobile devices are increasingly using biometric authentication, which is both more secure and more convenient.
The second improvement to user verification will see people logging into applications less often. Sessions will persist for longer, based on deep agent-based device knowledge that will form a big part of the remote access experience.
Changing customer interactions will require better mobile security
It isn’t just employees who will need better mobile security. Businesses will change the way that they interact with customers too. We can expect fewer person-to-person interactions in retail as social distancing rules continue. Instead, contact-free transactions will become more important and businesses will move to self-checkout options. Retailers must focus more on mobile devices for everything from browsing products, to ordering and payment.
The rise of QR codes presents a growing threat
Retailers and other companies are already using QR codes more and more, and will continue to do so, as a contact-free bridge to things like menus and payment systems and a way to comply with social distancing rules. Users can scan the codes from two meters away, making them perfect for payments and product information.
The problem is that QR codes were never designed for these applications or for digital authentication, and they can easily be replaced with malicious codes that manipulate smartphones in unexpected and damaging ways. We can expect QR code fraud to increase as the usage of these codes expands in 2021.
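Whatever the scanning app does next, the decoded payload should be treated as untrusted input. A minimal defensive check in Python (stdlib only; the trusted host names are hypothetical):

```python
from urllib.parse import urlparse

# Hypothetical hosts this app is willing to open from a scanned code
TRUSTED_HOSTS = {"menu.example-restaurant.com", "pay.example-shop.com"}

def is_safe_qr_payload(payload: str) -> bool:
    """Accept only HTTPS URLs whose host is explicitly allowlisted.
    Rejects http://, javascript:, tel:, and lookalike domains."""
    try:
        parsed = urlparse(payload)
    except ValueError:
        return False
    return parsed.scheme == "https" and parsed.hostname in TRUSTED_HOSTS
```

A generic scanner cannot know which hosts to trust, which is exactly why codes pasted over legitimate ones work so well against unsuspecting users.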
The age of the Everywhere Enterprise
One overarching message came through clearly in our conversations with customers: the enterprise changed for the longer term in 2020, and this will have profound effects in 2021. What began as a rushed reaction during a crisis this year will evolve during the next as the IT department joins HR in rethinking employee relationships in the age of the everywhere enterprise.
If 2020 was the year that businesses fell back on the ropes, 2021 will be the one where they bounce forward, moving from a rushed reaction into a thoughtful, measured response.
We live in the age of data. We are constantly producing it, analyzing it, figuring out how to store and protect it, and, hopefully, using it to refine business practices. Unfortunately, 58% of organizations make decisions based on outdated data.
While enterprises are rapidly deploying technologies for real-time analytics, machine learning and IoT, they are still utilizing legacy storage solutions that are not designed for such data-intensive workloads.
To select a suitable data storage solution for your business, you need to weigh a variety of factors. We’ve talked to several industry leaders to get their insights on the topic.
Phil Bullinger, SVP and General Manager, Data Center Business Unit, Western Digital
Selecting the right data storage solution for your enterprise requires evaluating and balancing many factors. The most important is aligning the performance and capabilities of the storage system with your critical workloads and their specific bandwidth, application latency and data availability requirements. For example, if your business wants to gain greater insight and value from data through AI, your storage system should be designed to support the accelerated performance and scale requirements of analytics workloads.
Storage systems that maximize the performance potential of solid state drives (SSDs) and the efficiency and scalability of hard disk drives (HDDs) provide the flexibility and configurability to meet a wide range of application workloads.
Your applications should also drive the essential architecture of your storage system, whether directly connected or networked, whether required to store and deliver data as blocks, files, objects or all three, and whether the storage system must efficiently support a wide range of workloads while prioritizing the performance of the most demanding applications.
Consideration should be given to your overall IT data management architecture to support the scalability, data protection, and business continuity assurance required for your enterprise, spanning from core data centers to those distributed at or near the edge and endpoints of your enterprise operations, and integration with your cloud-resident applications, compute and data storage services and resources.
Ben Gitenstein, VP of Product Management, Qumulo
When searching for the right data storage solution to support your organizational needs today and in the future, it’s important to select one that is trusted, scales to support demanding workloads of any size, and ensures optimal performance of applications and workloads both on premises and in complex, multi-cloud environments.
With the recent pandemic, organizations are digitally transforming faster than ever before, and leveraging the cloud to conduct business. This makes it more important than ever that your storage solution has built in tools for data management across this ecosystem.
When evaluating storage options, be sure to do your homework and ask the right questions. Is it a trusted provider? Would it integrate well within my existing technology infrastructure? Your storage solution should be easy to manage and meet the scale, performance and cloud requirements for any data environment and across multi-cloud environments.
Also, be sure the storage solution gives IT control over how storage capacity is managed and delivers real-time insight into analytics and usage patterns, so they can make smart storage allocation decisions and maximize the organization’s storage budget.
David Huskisson, Senior Solutions Manager, Pure Storage
Data backup and disaster recovery features are critically important when selecting a storage solution for your business, as no organization is immune to ransomware attacks. When systems go down, they need to be recovered as quickly and safely as possible.
Look for solutions that offer simplicity in management, can ensure backups are viable even when admin credentials are compromised, and can be restored quickly enough to greatly reduce major organizational or financial impact.
Storage solutions that are purpose-built to handle unstructured data are a strong place to start. By definition, unstructured data is unpredictable: it can take any form, size or shape, and can be accessed in any pattern. A platform built for it can accelerate small, large, random or sequential workloads and consolidate a wide range of them on a unified fast file and object storage platform, and it should maintain its performance even as the amount of data grows.
If you have an existing backup product, you don’t need to rip and replace it. There are storage platforms with robust integrations that work seamlessly with existing solutions and offer a wide range of data-protection architectures so you can ensure business continuity amid changes.
Tunio Zafer, CEO, pCloud
Bear in mind: your security team needs to assist. Answer these questions to find the right solution. Do you need ‘cold’ storage or cloud storage? If you only need to store files for backup, a cloud backup service will do; if you need to store, edit and share files, go for cloud storage. Where are the provider’s storage servers located? If your business is located in Europe, the safest choice is a storage service based in Europe.
Do they offer client-side encryption? Client-side encryption means that your data is secured on your device and is transferred already encrypted. What is their support package? At some point you’re going to need help, so a data storage service whose support is included for free and answers within 24 hours is preferable.
Despite COVID-related supply and demand disruptions, customers deployed more data center ethernet switches in the first half of 2020 than they did in the same year-ago period, according to Crehan Research. Port shipments increased by 12% year-over-year, resulting in a new record high.
Hyperscale cloud service providers and China leading the way
Hyperscale cloud service providers and China were significant contributors to the market’s growth, according to the report. The hyperscale cloud service providers’ contribution was reflected in the especially strong growth of 100 gigabit ethernet (GbE) and 25GbE – the preferred data center networking speeds within this customer segment. In fact, 100GbE and 25GbE combined posted a 40% year-over-year increase, comprising a majority of total data center switch port shipments.
“This robust shipment growth, even in the face of COVID disruptions, is a reflection of the critical nature of data center networks in delivering needed services to businesses, homes and governments,” said Seamus Crehan, president of Crehan Research.
“The data center networking vendors have really risen to the challenge of helping their customers keep networks up and running, despite the myriad of obstacles presented during this pandemic.”
Other top contributors to data center switch shipments
- Cisco accounted for the majority of data center switch shipments and saw stable year-over-year market share.
- As a result of its strong presence in the hyperscale cloud service provider sector, Arista was a key driver of the 100GbE switch growth, holding the top share position in this segment.
- In correlation with the strong growth in China, H3C and Huawei gained additional market share.
- Nvidia, through its Mellanox acquisition, saw a doubling of its data center switch shipments, on the strength of its Spectrum-based 100GbE switches.
“Back in January 2017, we forecast that combined shipments of 100GbE and 25GbE would comprise over half of all data center ethernet switch shipments by 2021,” Crehan said. “These recent results show that the transition to higher networking speeds that underpin modern data center architectures is happening even faster than expected.”
A technical support intervention has revealed two zero-day vulnerabilities in the OS running on Cisco enterprise-grade routers, which attackers are actively trying to exploit.
Cisco plans to release software updates to plug these security holes, but in the meantime administrators are advised to implement one or all of the provided mitigations.
About the vulnerabilities
The two zero-day flaws – CVE-2020-3566 and CVE-2020-3569 – affect the Distance Vector Multicast Routing Protocol (DVMRP) feature of Cisco IOS XR Software, running on Cisco enterprise-grade routers for service providers, data centers, enterprises, and critical infrastructure.
They can be exploited by an unauthenticated, remote attacker by sending crafted IGMP (Internet Group Management Protocol) traffic to an affected device.
“A successful exploit could allow the attacker to cause memory exhaustion, resulting in instability of other processes. These processes may include, but are not limited to, interior and exterior routing protocols,” Cisco explained.
Proposed mitigations include:
- Implementing a rate limiter for IGMP traffic
- Adding an access control entry (ACE) to an existing interface access control list (ACL). “Alternatively, the customer can create a new ACL for a specific interface that denies DVMRP traffic inbound on that interface,” the company noted.
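The first mitigation is conceptually just a token bucket: packets consume tokens, tokens refill at a fixed rate, and traffic above that rate is dropped. The exact IOS XR commands are in Cisco's advisory; the Python below only sketches the underlying mechanism:

```python
import time

class TokenBucket:
    """Classic token-bucket rate limiter: allows short bursts up to
    `capacity`, with sustained throughput limited to `rate` tokens/second."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, cost: float = 1.0) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True   # packet passes
        return False      # packet dropped
```

Rate-limiting IGMP does not remove the vulnerability, but it slows memory exhaustion enough to buy time until the patch lands.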
The company has also provided indicators of compromise, i.e., messages that can be seen in the system logs if a device is experiencing memory exhaustion based on exploitation of these vulnerabilities.
“These vulnerabilities affect any Cisco device that is running any release of Cisco IOS XR Software if an active interface is configured under multicast routing,” they added.
Enterprise resource planning (ERP) systems are an indispensable tool for most businesses. They allow them to track business resources and commitments in real time and to manage day-to-day business processes (e.g., procurement, project management, manufacturing, supply chain, human resources, sales, accounting, etc.).
The various applications integrated in ERP systems collect, store, manage, and interpret sensitive data from the many business activities, which allows organizations to improve their efficiency in the long run.
Needless to say, the security of such a crucial system and all the data it stores should be paramount for every organization.
Common misconceptions about ERP security
“Since ERP systems have a lot of moving parts, one of the biggest misconceptions is that the built-in security is enough. In reality, while you may not have given access to your company’s HR data to a technologist on your team, they may still be able to access the underlying database that stores this data,” Mike Rulf, CTO of Americas Region, Syntax, told Help Net Security.
“Another misconception is that your ERP system’s access security is robust enough that you can allow people to access their ERP from the internet.”
In fact, the technical complexity of ERP systems means that security researchers are constantly finding vulnerabilities in them, and businesses that make these systems internet-facing without thinking through or prioritizing their protection create risks they may not be aware of.
When securing your ERP systems you must think through all the different ways someone could potentially access sensitive data and deploy business policies and controls that address these potential vulnerabilities, Rulf says. Patching security flaws is extremely important, as it ensures a safe environment for company data.
Advice for CISOs
While patching is necessary, it’s true that business leaders can’t disrupt day-to-day business activity for every new patch.
“Businesses need some way to mitigate any threats between when patches are released and when they can be fully tested and deployed. An application firewall can act as a buffer to allow a secure way to access your proprietary technology and information during this gap. Additionally, an application firewall allows you to separate security and compliance management from ERP system management enabling the checks and balances required by most audit standards,” he advises.
He also urges CISOs to integrate the login process with their corporate directory service such as Active Directory, so they don’t have to remember to turn off an employee’s credentials in multiple systems when they leave the company.
To make mobile access to ERP systems safer for a remote workforce, CISOs should definitely leverage multi-factor authentication (MFA), which forces employees to prove their identity before accessing sensitive company information.
“For example, Duo sends a text to an employee’s phone when logging in outside the office. This form of security ensures that only the people granted access can utilize those credentials,” he explained.
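Duo's push flow is proprietary, but the generic second factor it competes with, a time-based one-time password (TOTP, RFC 6238), can be sketched with the Python standard library alone:

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a 64-bit counter, dynamically truncated."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP keyed on the current 30-second time window."""
    if for_time is None:
        for_time = time.time()
    return hotp(secret, int(for_time) // step, digits)
```

With the RFC 6238 test secret b"12345678901234567890", totp(..., for_time=59, digits=8) yields "94287082", matching the published test vector.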
VPN technology should also be used to protect ERP data when employees access it from new devices and unfamiliar Wi-Fi networks.
“VPNs today can enable organizations to validate these new/unfamiliar devices adhere to a minimum security posture: for example, allowing only devices with a firewall configured and appropriate malware detection tools installed can access the network. In general, businesses can’t really ever know where their employees are working and what network they’re on. So, using VPNs to encrypt that data being sent back and forth is crucial.”
On-premises vs. cloud ERP security
The various SaaS applications in your ERP, such as Salesforce and Oracle Cloud Apps, leave you beholden to those service providers to manage your applications’ security.
“You need to ask your service providers about their audit compliance and documentation. Because they are providing services critical to your business, you will be asked about these third parties by auditors during a SOC audit. You’ll thus need to expand your audit and compliance process (and the time it takes) to include an audit of your external partners,” Rulf pointed out.
“Also, when you move to AWS or Azure, you’re essentially building a new virtual data center, which requires you to build and invest in new security and management tools. So, while the cloud has a lot of great savings, you need to think about the added and unexpected costs of things like expanded audit and compliance.”
The COVID-19 pandemic has negatively impacted organizations’ ability to manage their storage infrastructures, both to ensure continued access for an increasingly remote workforce and to satisfy health protocols put in place to protect workers, according to StorONE.
The impact on data center operations
More than two-thirds of those surveyed maintain some level of on-premises storage. Because of the pandemic, almost 40 percent of those organizations have had no or critically restricted access to their data centers to address storage hardware failures or increase data protection levels, such as improved drive redundancies or snapshot intervals.
Reduced budgets mean that organizations will be unable to offer more performance and capacity to their users or will need to rely on better vendor pricing to supplement their needs.
Among the survey’s findings are:
- 20 percent of organizations have not had any access to their data centers, meaning that any physical hardware failures have had to wait. Another 20 percent have had restricted access to only allow work done in critical instances. The remaining 60 percent have been able to maintain moderate access with established maintenance windows and limited workforces.
- A third have been forced to go to the data center to replace drives despite health risks. 12.5 percent of respondents indicated that they have had to live with the risk of data loss due to access issues, while another 12.5 percent have leveraged hot spares for their failed drives.
- 20 percent of organizations had only restricted remote access to their storage systems during the pandemic, with 12 percent experiencing constrained remote administration capabilities due to hardware limitations. Another 12 percent had no remote administration at all during the pandemic, with connections that either failed or were impractical.
- 33.3 percent of respondents said they had to count on their backup system for improved data protection levels, with 20.8 percent not able to enable any improvements to protection levels and 16.7 percent unable to afford the performance impact required of increased data protection.
- 16.7 percent of companies cut their IT budgets by more than 50 percent, with 45.8 percent cutting budgets between 10 and 25 percent. 4.2 percent cut budgets between 25 and 50 percent, 16.7 percent cut by as much as 10 percent and 16.7 percent reported no cuts to their IT budget due to coronavirus.
- To deal with reduced budgets, 40 percent of organizations are hoping for better pricing from their existing vendors, 30 percent will seek out other vendors that provide lower prices, and 30 percent will stand pat without increasing services to their users.
“While some organizations have been able to weather the storm of this unprecedented event, the negative impacts of COVID-19 on storage infrastructures are already being felt by a large majority of companies throughout the world,” said Gal Naor, CEO, StorONE.
“IT has long been expected to do more with less, but these survey results show that data is being left unprotected and unavailable in many instances due to lack of access to physical hardware or severe budget cuts. Companies cannot afford to risk their data regardless of the current issues at hand. Organizations need to implement a solution that will allow them to take existing servers and storage to create a near-zero additional cost system complete with data-protection services. A storage system with these capabilities ensures mission-critical information is always available, immediately recoverable and remains durable during times of crisis.”
Cisco has released another batch of critical security updates for flaws in Cisco Data Center Network Manager (DCNM) and the Cisco SD-WAN Solution software.
Cisco Data Center Network Manager flaws
Cisco Data Center Network Manager is the network management platform for all NX-OS-enabled deployments, spanning new fabric architectures, IP Fabric for Media, and storage networking deployments for the Cisco Nexus-powered data center.
These latest updates fix:
- One critical authentication bypass vulnerability (CVE-2020-3382) in the solution’s REST API that could allow an unauthenticated, remote attacker to bypass authentication and execute arbitrary actions with administrative privileges on an affected device
- Five high-risk flaws that could allow an authenticated, remote attacker to inject arbitrary commands on the affected device, write arbitrary files in the system with the privileges of the logged-in user, perform arbitrary actions through the REST API with administrative privileges, and interact with and use certain functions within the Cisco DCNM
- Three medium-risk bugs (XSS, SQL injection, information disclosure)
The vulnerabilities affect various versions of the Cisco Data Center Network Manager software, and in some cases exploitability depends on how the Cisco DCNM appliances were installed. The fixes are all included in the latest Cisco DCNM software releases: 11.4(1) and later.
The flaws were either reported by security researchers or found by Cisco during internal security testing, and there is no indication that any of them are actively exploited.
The Cisco SD-WAN Solution software flaws
Cisco SD-WAN gives users the ability to manage connectivity across their WAN from a single dashboard: the Cisco vManage console.
The company has found:
- A critical buffer overflow vulnerability (CVE-2020-3375) affecting Cisco SD-WAN Solution software that could be exploited by sending crafted traffic to an affected device and could allow the attacker to gain access to information that they are not authorized to access, make changes to the system that they are not authorized to make, and execute commands on an affected system with privileges of the root user
- A critical vulnerability (CVE-2020-3374) in the web-based management interface of Cisco SD-WAN vManage Software that could be exploited by sending crafted HTTP requests to it and could allow the attacker to access sensitive information, modify the system configuration, or impact the availability of the affected system.
Again, there is no indication that these flaws are being exploited, but Cisco urges admins to implement the security updates as soon as possible, as there are no workarounds for addressing these flaws.
Security advisories for all of the fixed flaws can be found here.
The global data center networking market is projected to reach $40.9 billion by 2025, registering an 11.0% CAGR over the forecast period from 2019 to 2025, according to Million Insights.
The huge and growing amount of unstructured data across several industries is expected to drive market growth. In addition, the rising adoption of cloud computing and the introduction of advanced data center operating models are anticipated to boost growth over the forecast period.
Data center networking helps an organization consolidate and organize information on a single platform before exposing it to cross-channel processes and systems. It also allows the organization to connect with customers operating in different industries.
Factors such as reduced operational costs, improved server integration and optimal performance are augmenting the market’s growth. Most organizations are investing in state-of-the-art infrastructure to address these concerns and fulfill customer expectations efficiently.
Channelizing information to enhance daily operations
Collected information is stored, analyzed, and managed on shared platforms using diverse networking solutions, which enables service providers to update their business models and boost their revenue. The resulting need to channel information into daily operations is anticipated to fuel demand for data center networking over the next few years.
The data center networking market is projected to witness considerable growth due to the rising incidence of cyber-attacks, the increasing adoption of cloud-based platforms, and the increasing demand for real-time information. These solutions let organizations access information on demand and help them increase sales of their products and services.
Data center networking is also considered an efficient mode of disaster recovery, as it enables operational recovery and restores functionality along with access to a clone database.
Further key findings
- Rising adoption of cloud computing and digitalization in several regions, especially in the food & beverage, automotive and pharmaceutical industries, is expected to drive market growth over the forecast period.
- In 2018, the storage area network (SAN) segment held the largest market share and is expected to grow significantly in the next few years due to the increasing adoption of various combinations of computation mechanisms.
- The BFSI sector is anticipated to register the fastest CAGR from 2019 to 2025, as BFSI and IT & telecom companies concentrate on adopting advanced technologies to maintain their complex infrastructure assets.
- Asia Pacific is projected to grow at the fastest CAGR, of more than 14.0%, during the forecast period due to the increasing adoption of data center networking solutions in South Asian countries.
- Cisco Systems, Alcatel-Lucent, Dell, Equinix, Hitachi Data Systems Corporation, HP Development Company, VMware, and IBM are the key players operating in this market.
- Market players are implementing strategies such as product expansion and mergers & acquisitions to stay competitive in the market.
Two vulnerabilities in SaltStack Salt, an open-source remote task and configuration management framework, are being actively exploited by attackers, CISA warns.
About SaltStack Salt
Salt is used for configuring, managing and monitoring servers in datacenters and cloud environments.
The Salt installation is the “master” and each server it monitors runs an API agent called a “minion”. The minions send state reports to the master and the master publishes update messages containing instructions/commands to the minions. The communication between the master and its minions is secured (encrypted).
About the vulnerabilities
Discovered by F-Secure researchers, CVE-2020-11651 (an authentication bypass flaw) and CVE-2020-11652 (a directory traversal flaw) can be exploited by remote, unauthenticated attackers.
According to the researchers, the vulnerabilities allow attackers to “connect to the ‘request server’ port to bypass all authentication and authorization controls and publish arbitrary control messages, read and write files anywhere on the ‘master’ server filesystem and steal the secret key used to authenticate to the master as root.”
The attackers can thus achieve remote command execution as root on both the master and all minions that connect to it.
The vulnerabilities affect all Salt versions prior to 2019.2.4 and 3000.2, which were released last week.
“Adding network security controls that restrict access to the salt master (ports 4505 and 4506 being the defaults) to known minions, or at least block the wider Internet, would also be prudent as the authentication and authorization controls provided by Salt are not currently robust enough to be exposed to hostile networks,” the researchers added.
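A quick way to verify that restriction from outside is to probe the two default ports. A minimal sketch (standard library only; run it only against masters you operate):

```python
import socket

SALT_PORTS = (4505, 4506)  # default Salt publish and request ports

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Attempt a TCP connect; open means the port answered in time."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def exposed_salt_ports(host: str) -> list:
    """Return the default Salt ports reachable on `host`."""
    return [p for p in SALT_PORTS if is_port_open(host, p)]
```

If this reports open ports from an untrusted network, the master should be firewalled off before anything else.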
F-Secure warned that there are over 6,000 Salt masters exposed to the public Internet, so they chose not to publish a PoC.
But, they said, “any competent hacker will be able to create 100% reliable exploits for these issues in under 24 hours,” and they were right: a few days later a researcher reported their honeypots already being targeted.
Even though SaltStack sent advance notice to users about the critical nature of the flaws and the need for a quick update and additional mitigation actions, not everybody reacted promptly.
During the weekend, attackers successfully leveraged the flaws to gain access to the infrastructure of the LineageOS project, the Ghost blogging platform, and one of the Certificate Transparency logs (CT2) operated by DigiCert. In all three cases, the attackers’ goal was to install cryptominers.
UPDATE (May 4, 2020, 5:10 a.m. PT):
“Upon notification of the CVE, SaltStack took immediate action to remediate the vulnerability, develop and issue patches, and communicate to our customers about the affected versions so they can prepare their systems for update. Although there was no initial evidence that the CVE had been exploited, we have confirmed that some vulnerable, unpatched systems have been accessed by unauthorized users since the release of the patches,” Alex Peay, SVP, Product and Marketing, SaltStack, told Help Net Security.
“We must reinforce how critical it is that all Salt users patch their systems and follow the guidance we have provided outlining steps for remediation and best practices for Salt environment security. It is equally important to upgrade to latest versions of the platform and register with support for future awareness of any possible issues and remediations. As the primary maintainers of the Salt Open Project, trusted by the world’s largest businesses to automate digital infrastructure operations and security, we take this vulnerability and the security of our platform very seriously. More information about our response and handling of CVEs is available in our Knowledge Base.”
UPDATE (May 4, 2020, 9:45 a.m. PT):
“Yesterday, May 3, DigiCert announced that it is deactivating its Certificate Transparency (CT) 2 log server after determining that the key used to sign SCTs may have been exposed via critical SALT vulnerabilities. We do not believe the key was used to sign SCTs outside of the CT log’s normal operation, though as a precaution, CAs that received SCTs from the CT2 log after May 2 at 5 p.m. U.S. Mountain Daylight Time (MDT) should receive an SCT from another trusted log,” a DigiCert spokesperson told Help Net Security.
“Three other DigiCert CT logs: CT1, Yeti and Nessie, are not affected as they are run on completely different infrastructure. The impacts are limited to only the CT2 log and no other part of DigiCert’s CA or CT Log systems.”
The spokesperson added that DigiCert has been planning for some time to shut down CT2, in order to move the industry toward their newer and more robust CT logs, Yeti and Nessie.
“We notified the industry of our intention to terminate signing operations of CT2 on May 1 but pushed back the date based on industry feedback. This timeline has now been moved up, with the CT2 log in read-only mode effective May 3,” they explained.
“Because of Google’s implementation of CT that requires SCTs be posted in multiple logs in order for a certificate to be valid, active TLS certificates posted to the CT2 log should continue to work as expected if issued before May 2 at 5 p.m. MDT.”
BGN Technologies has introduced the first all-optical stealth encryption technology, designed to provide significantly more secure and private transmission for highly sensitive cloud computing and data center networks.
Time is running out on security and privacy
“Today, information is still encrypted using digital techniques, although most data is transmitted over distance using light spectrum on fiber optic networks,” says Prof. Dan Sadot, Director of the Optical Communications Research Laboratory, who heads the team that developed the technology.
“Time is running out on security and privacy of digital encryption technology, which can be read offline if recorded and code-broken using intensive computing power. We’ve developed an end-to-end solution providing encryption, transmission, decryption, and detection optically instead of digitally.”
How does the optical stealth encryption technology work?
Using standard optical equipment, the research team essentially renders the fiber-optic light transmission invisible or stealthy. Instead of using one color of the light spectrum to send one large data stream, this method spreads the transmission across many colors in the optical spectrum bandwidth (1,000x wider than digital) and intentionally creates multiple weaker data streams that are hidden under noise and elude detection.
Every transmission – electronic, digital or fiber – has a certain amount of “noise.” The researchers demonstrated that they can transmit weaker encrypted data under a stronger inherent noise level that cannot be detected.
The solution also employs a commercially available phase mask, which changes the phase of each wavelength (color). That process also appears as noise, which destroys the “coherence” or ability to recompile the data without the correct encryption key. The optical phase mask cannot be recorded offline, so the data is destroyed if a hacker tries to decode it.
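The underlying principle (spreading a weak signal across many channels so that no single channel rises above the noise floor, with recovery possible only via a secret spreading code) can be illustrated with a toy digital analogy. The sketch below is purely conceptual; it does not reproduce the actual optical implementation, and all parameters are illustrative.

```python
import random
import statistics

random.seed(42)

N_CHANNELS = 1000        # stand-in for the many spectral "colors"
SIGNAL_AMPLITUDE = 0.2   # per-channel signal level, well below the noise
NOISE_STD = 1.0

# Secret spreading code: which channels carry +signal vs -signal.
# Only a receiver holding this code can coherently recombine the data.
code = [random.choice((-1, 1)) for _ in range(N_CHANNELS)]

def transmit(bit: int) -> list:
    """Spread one bit across all channels, buried under channel noise."""
    symbol = 1 if bit else -1
    return [symbol * c * SIGNAL_AMPLITUDE + random.gauss(0, NOISE_STD)
            for c in code]

def receive(channels: list) -> int:
    """Despread: correlate against the secret code and threshold."""
    correlation = sum(x * c for x, c in zip(channels, code)) / len(channels)
    return 1 if correlation > 0 else 0

channels = transmit(1)
# To an eavesdropper without the code, each channel looks statistically
# like pure noise (standard deviation stays close to 1.0):
print("per-channel std:", round(statistics.stdev(channels), 2))
# The legitimate receiver, averaging over all channels, recovers the bit:
print("decoded bit:", receive(channels))
```

Averaging over 1,000 channels suppresses the noise by a factor of about 30, so the hidden 0.2-amplitude signal dominates the correlation even though it is invisible on any single channel, which mirrors the "can't detect it, can't steal it" property described above.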
“Basically, the innovative breakthrough is that if you can’t detect it, you can’t steal it,” Prof. Sadot says. “Because an eavesdropper can neither read the data nor even detect the existence of the transmitted signal, our optical stealth transmission provides the highest level of privacy and security for sensitive data applications.”
Zafrir Levy, senior vice president for exact sciences and engineering at BGN, says, “The novel, patented method invented by Prof. Sadot and his team is highly useful for multiple applications, such as high-speed communication, sensitive transmission of financial, medical or social media-related information without the risk of eavesdropping or jamming data flow. In fact, with this method, an eavesdropper will require years to break the encryption key.”
“Every data center has 100G and 400G lines, and part of those lines are encrypted end-to-end,” Prof. Sadot adds. “There is the need for non-digital encryption for customers who require the most advanced security possible.”
Despite the inevitability of security-related incidents, few organizations currently protect against the spread of breaches with segmentation: only 19 percent of the 300 IT professionals surveyed by Illumio implement segmentation solutions today.
While approximately 25 percent are actively planning a project, more than half are not protecting with segmentation at all or planning to in the next six months.
While unprepared, organizations are hoping for the best
Security segmentation limits the ability of attacks to move laterally inside an organization by breaking data center and campus networks or clouds into smaller segments. It is widely recognized as a cybersecurity best practice, although it is drastically underutilized in organizations today.
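The idea can be sketched as a default-deny policy keyed on workload roles rather than network location. All labels and rules below are hypothetical, for illustration only; a real deployment would derive them from the segmentation platform's policy engine.

```python
# Hypothetical workload roles and allow-listed flows (src, dst, port).
ALLOWED_FLOWS = {
    ("web", "app", 8080),   # web tier may call the app tier's API
    ("app", "db", 5432),    # only the app tier may reach the database
}

def is_allowed(src_role: str, dst_role: str, port: int) -> bool:
    """Default-deny: a flow passes only if it is explicitly allow-listed."""
    return (src_role, dst_role, port) in ALLOWED_FLOWS

# Legitimate traffic flows normally:
print(is_allowed("web", "app", 8080))   # True
# A compromised web server trying to move laterally straight into the
# database is denied, containing the breach to one segment:
print(is_allowed("web", "db", 5432))    # False
```

Because every flow not on the allow-list is dropped, a foothold on one workload no longer grants reach into every other segment.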
“The results from this survey confirm what we have long known. Despite the fact that organizations realize the likelihood of a security incident is high, they do not leverage segmentation because it is too hard and costly to implement, especially with firewalls, preventing wider adoption.
“This is why we have spent years developing a purpose-built segmentation solution used for security. It is simpler, more effective and drives the cost out of segmentation projects so organizations can consider a future free of high-profile breaches,” said Matt Glenn, VP of Product Management at Illumio.
A somewhat positive finding showed that 45 percent of respondents currently have a segmentation project in flight or are planning to begin one in the next six months.
Of those who are planning a project, the survey found that 81 percent of respondents will leverage firewalls for segmentation, despite the fact that they are slow to implement, don’t adapt, are complex to work with, and were not built to serve this function.
Firewalls are falling short
Companies still wisely rely on firewalls for perimeter security; however, most respondents cited difficulties with how costly they are to implement and manage for segmentation. 68 percent of respondents struggle with securing initial capital expenditure budgets for firewalls and 66 percent find it challenging to secure ongoing operating expenditure budgets.
The size and complexity of firewalls also cause problems for organizations. The average time for respondents to deploy and tune firewalls for segmentation was one to three months.
In addition, more than two thirds of respondents acknowledge that firewalls make it hard to test rules prior to deploying, making it easier to accidentally misconfigure rules and break applications. Despite these drawbacks, 57 percent cite the potential risk induced by change as the leading reason why they do not stop using firewalls.
Segmentation as a practice is foundational to security frameworks like Zero Trust. According to Forrester Research’s Zero Trust website, “defending the perimeter is no longer an effective strategy. Zero Trust implements methods to localize and isolate threats through microcore, microsegmentation, and deep visibility to give you an organized approach to identify threats and limit the impact of any breach.”
Host-based security segmentation is more cost-effective and reliable
Host-based security segmentation offers a more cost-effective and reliable approach to segmentation and is more effective at protecting data centers and cloud ecosystems against lateral data breaches. Since host-based security segmentation is software-based and isn't tied to the network, it offers several strong benefits:
- At least 200% more cost-effective than firewalls.
- Deploys four to six times faster than firewalls.
- Has up to 90% fewer rules than firewalls.
- Easy to test before deployment and can be updated in hours.
- Low risk of breaking an application.