Deep learning is everywhere. This branch of artificial intelligence curates your social media and serves your Google search results. Soon, deep learning could also check your vitals or set your thermostat.
MIT researchers have developed a system that could bring deep learning neural networks to new – and much smaller – places, like the tiny computer chips in wearable medical devices, household appliances, and the 250 billion other objects that constitute the IoT.
The system, called MCUNet, designs compact neural networks that deliver unprecedented speed and accuracy for deep learning on IoT devices, despite limited memory and processing power. The technology could facilitate the expansion of the IoT universe while saving energy and improving data security.
The Internet of Things
The IoT was born in the early 1980s. Grad students at Carnegie Mellon University, including Mike Kazar ’78, connected a Coca-Cola machine to the internet. The group’s motivation was simple: laziness.
They wanted to use their computers to confirm the machine was stocked before trekking from their office to make a purchase. It was the world’s first internet-connected appliance. “This was pretty much treated as the punchline of a joke,” says Kazar, now a Microsoft engineer. “No one expected billions of devices on the internet.”
Since that Coke machine, everyday objects have become increasingly networked into the growing IoT. That includes everything from wearable heart monitors to smart fridges that tell you when you’re low on milk.
IoT devices often run on microcontrollers – simple computer chips with no operating system, minimal processing power, and less than one thousandth of the memory of a typical smartphone. So pattern-recognition tasks like deep learning are difficult to run locally on IoT devices. For complex analysis, IoT-collected data is often sent to the cloud, making it vulnerable to hacking.
“How do we deploy neural nets directly on these tiny devices? It’s a new research area that’s getting very hot,” says Song Han, an assistant professor in MIT’s Department of Electrical Engineering and Computer Science. “Companies like Google and ARM are all working in this direction.” Han is too.
With MCUNet, Han’s group codesigned two components needed for “tiny deep learning” – the operation of neural networks on microcontrollers. One component is TinyEngine, an inference engine that directs resource management, akin to an operating system. TinyEngine is optimized to run a particular neural network structure, which is selected by MCUNet’s other component: TinyNAS, a neural architecture search algorithm.
Designing a deep network for microcontrollers isn’t easy. Existing neural architecture search techniques start with a big pool of possible network structures based on a predefined template, then they gradually find the one with high accuracy and low cost. While the method works, it’s not the most efficient.
“It can work pretty well for GPUs or smartphones,” says Ji Lin, a PhD student in Han’s lab. “But it’s been difficult to directly apply these techniques to tiny microcontrollers, because they are too small.”
So Lin developed TinyNAS, a neural architecture search method that creates custom-sized networks. “We have a lot of microcontrollers that come with different power capacities and different memory sizes,” says Lin. “So we developed the algorithm [TinyNAS] to optimize the search space for different microcontrollers.”
The customized nature of TinyNAS means it can generate compact neural networks with the best possible performance for a given microcontroller – with no unnecessary parameters. “Then we deliver the final, efficient model to the microcontroller,” says Lin.
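The idea of tailoring the search space to a device's memory budget can be illustrated with a toy sketch. Everything below is an assumption made for illustration – the memory proxies, the candidate widths, resolutions, depths, and the limits are invented; the published TinyNAS uses much richer spaces and measured constraints:

```python
import itertools

def peak_sram_kb(width_mult, resolution):
    # Hypothetical proxy: activation footprint scales with channel width
    # and quadratically with input resolution (base of 64 int8 channels).
    return width_mult * 64 * resolution ** 2 / 1024

def flash_kb(width_mult, depth):
    # Hypothetical proxy: parameter storage grows with width^2 and depth.
    return (width_mult ** 2) * depth * 40

def feasible_space(sram_limit_kb, flash_limit_kb):
    """Keep only network configurations that fit the target MCU's memory."""
    space = itertools.product(
        [0.25, 0.5, 0.75, 1.0],  # candidate width multipliers
        [96, 128, 160, 224],     # candidate input resolutions
        [8, 12, 16],             # candidate network depths
    )
    return [
        (w, r, d) for w, r, d in space
        if peak_sram_kb(w, r) <= sram_limit_kb
        and flash_kb(w, d) <= flash_limit_kb
    ]

# A 256 kB-SRAM part admits a smaller search space than a 512 kB part;
# the accuracy search then runs only inside the feasible set.
small = feasible_space(sram_limit_kb=256, flash_limit_kb=1024)
large = feasible_space(sram_limit_kb=512, flash_limit_kb=1024)
```

The point of pruning first is that no search time is wasted evaluating architectures that could never run on the target chip.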
To run that tiny neural network, a microcontroller also needs a lean inference engine. A typical inference engine carries some dead weight – instructions for tasks it may rarely run. The extra code poses no problem for a laptop or smartphone, but it could easily overwhelm a microcontroller.
“It doesn’t have off-chip memory, and it doesn’t have a disk,” says Han. “Everything put together is just one megabyte of flash, so we have to really carefully manage such a small resource.” Cue TinyEngine.
The researchers developed their inference engine in conjunction with TinyNAS. TinyEngine generates the essential code necessary to run TinyNAS’ customized neural network. Any deadweight code is discarded, which cuts down on compile-time.
“We keep only what we need,” says Han. “And since we designed the neural network, we know exactly what we need. That’s the advantage of system-algorithm codesign.”
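The code-specialization idea behind this can be sketched as follows. The operator names and sources here are hypothetical stand-ins, not TinyEngine's real kernels; the point is only that a network fixed at compile time lets the build emit just the kernels it actually calls:

```python
# Toy sketch of model-specific code generation: instead of linking a
# general interpreter with every possible operator, emit source for
# only the operators this one network uses. All names are illustrative.
KERNEL_SOURCES = {
    "conv2d": "void conv2d(...) { /* ... */ }",
    "depthwise": "void depthwise(...) { /* ... */ }",
    "avg_pool": "void avg_pool(...) { /* ... */ }",
    "fully_connected": "void fully_connected(...) { /* ... */ }",
    "softmax": "void softmax(...) { /* ... */ }",
}

def generate_engine(model_ops):
    """Emit source containing only the operators the model needs."""
    needed = sorted(set(model_ops))  # deduplicate repeated layers
    return "\n".join(KERNEL_SOURCES[op] for op in needed)

# A model that never calls avg_pool or softmax compiles to a binary
# without their code, shrinking the flash footprint.
model = ["conv2d", "depthwise", "conv2d", "fully_connected"]
engine_src = generate_engine(model)
```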
In the group’s tests of TinyEngine, the size of the compiled binary code was between 1.9 and five times smaller than comparable microcontroller inference engines from Google and ARM.
TinyEngine also contains innovations that reduce runtime, including in-place depth-wise convolution, which cuts peak memory usage nearly in half. After codesigning TinyNAS and TinyEngine, Han’s team put MCUNet to the test.
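The in-place trick works because, in a depth-wise convolution, each output channel depends only on the matching input channel, so the result can be written back over the input one channel at a time instead of allocating a second full tensor. A minimal pure-Python sketch (illustrative only, not MCUNet's optimized C implementation):

```python
def depthwise_conv_inplace(x, kernels):
    """3x3 depth-wise convolution, stride 1, zero 'same' padding.

    x: list of C channels, each an H x W list of lists (overwritten).
    kernels: list of C kernels, each a 3x3 list of lists.
    The only extra buffer is one H x W channel, reused for every
    channel, instead of a whole output tensor: that is the source of
    the near-halving of peak activation memory.
    """
    for chan, k in zip(x, kernels):
        H, W = len(chan), len(chan[0])

        def px(i, j):  # zero padding outside the image
            return chan[i][j] if 0 <= i < H and 0 <= j < W else 0.0

        # Compute the channel's full output before touching the input.
        tmp = [[sum(px(i + di - 1, j + dj - 1) * k[di][dj]
                    for di in range(3) for dj in range(3))
                for j in range(W)]
               for i in range(H)]
        for i in range(H):  # write the result back over the input channel
            chan[i][:] = tmp[i]
    return x
```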
MCUNet’s first challenge was image classification. The researchers used the ImageNet database to train the system with labeled images, then to test its ability to classify novel ones. On a commercial microcontroller they tested, MCUNet successfully classified 70.7 percent of the novel images — the previous state-of-the-art neural network and inference engine combo was just 54 percent accurate. “Even a 1 percent improvement is considered significant,” says Lin. “So this is a giant leap for microcontroller settings.”
The team found similar results in ImageNet tests of three other microcontrollers. And on both speed and accuracy, MCUNet beat the competition for audio and visual “wake-word” tasks, where a user initiates an interaction with a computer using vocal cues (think: “Hey, Siri”) or simply by entering a room. The experiments highlight MCUNet’s adaptability to numerous applications.
The promising test results give Han hope that MCUNet will become the new industry standard for microcontrollers. “It has huge potential,” he says.
The advance “extends the frontier of deep neural network design even farther into the computational domain of small energy-efficient microcontrollers,” says Kurt Keutzer, a computer scientist at the University of California at Berkeley, who was not involved in the work. He adds that MCUNet could “bring intelligent computer-vision capabilities to even the simplest kitchen appliances, or enable more intelligent motion sensors.”
MCUNet could also make IoT devices more secure. “A key advantage is preserving privacy,” says Han. “You don’t need to transmit the data to the cloud.”
Analyzing data locally reduces the risk of personal information being stolen — including personal health data. Han envisions smart watches with MCUNet that don’t just sense users’ heartbeat, blood pressure, and oxygen levels, but also analyze and help them understand that information.
MCUNet could also bring deep learning to IoT devices in vehicles and rural areas with limited internet access.
Plus, MCUNet’s slim computing footprint translates into a slim carbon footprint. “Our big dream is for green AI,” says Han, adding that training a large neural network can burn carbon equivalent to the lifetime emissions of five cars. MCUNet on a microcontroller would require a small fraction of that energy.
“Our end goal is to enable efficient, tiny AI with less computational resources, less human resources, and less data,” says Han.
As the Internet of Things becomes more and more a part of our lives, the security of these devices is imperative, especially because attackers have wasted no time and are continuously targeting them.
Chen Ku-Chieh, an IoT cyber security analyst with the Panasonic Cyber Security Lab, is set to talk at HITB CyberWeek on Wednesday (October 18) about the company’s physical honeypot and the types of malware they managed to discover through it.
In the meantime, we had some questions for him:
Global organizations are increasingly experiencing IoT-focused cyberattacks. What is the realistic worst-case scenario when it comes to such attacks?
The use of IoT is increasingly widespread – from home and office IoT to factory IoT – and the use of automation equipment is growing with it. The most realistic worst-case scenario, therefore, is an attack that affects critical infrastructure equipment, such as industrial control systems (ICS), by compromising IIoT devices.
Hackers can affect the operation of ICSes by attacking IIoT, resulting in large-scale damage. Furthermore, protecting medical IoT devices is also important. Hacked pacemakers, insulin pumps, etc. can affect human lives directly.
What are the main challenges when it comes to vulnerability research of IoT devices?
The main challenge is that we are expanding from IoT devices to IoT systems, and IoT systems consist of various components. Most components have different software/firmware, hardware, etc. Discovering vulnerabilities in IoT devices therefore requires expertise in many fields – researchers need to know a lot about chips, applications, communication protocols, network protocols, operating systems, cloud services, and so on.
What advice would you give to an enterprise CISO that wants to make sure the connected devices in use in the organization are as secure as possible?
To start, CISOs should check whether the vendors of the products they plan to use care about product security. How do they deal with vulnerabilities? Do they have a PSIRT? Do they have a point of contact for vulnerability reports? And so on.
Once they settle on a product to use, they should make sure that best practices – e.g., safely configuring the device, applying security updates in a timely manner – are part of the internal processes. They should also check the security of the services the devices use, e.g., network services used by an IP camera. Finally, network defenses should be structured to effectively control the access rights of the various networked devices in the environment.
How do you expect the security of IoT devices to evolve in the near future?
As we move forward, governments will attempt to create security baselines with regulations and certifications (labelling schemes). New security standards for various sectors (automotive, aviation – to name a few) will also be created.
As IoT products use similar network security protocols or hardware components, IoT security will no longer be a unilateral effort by the manufacturers. In the future, manufacturers, suppliers of parts, security organizations and governments will cooperate more closely, and even achieve mutual defense alliances to ensure effective and immediate protection.
The European Union Agency for Cybersecurity (ENISA) released its Guidelines for Securing the IoT, which covers the entire IoT supply chain – hardware, software and services.
Supply chains are currently facing a broad range of threats, from physical threats to cybersecurity threats. Organisations are becoming more dependent than ever before on third parties.
As organisations cannot always control the security measures of their supply chain partners, IoT supply chains have become a weak link for cybersecurity. Today, organisations have less visibility and understanding of how the technology they acquire is developed, integrated and deployed than ever before.
“Securing the supply chain of ICT products and services should be a prerequisite for their further adoption particularly for critical infrastructure and services. Only then can we reap the benefits associated with their widespread deployment, as it happens with IoT,” said Juhan Lepassaar, Executive Director, ENISA.
In the context of the development of the guidelines, ENISA has conducted a survey that identifies the existence of untrusted third-party components and vendors, and the vulnerability management of third-party components as the two main threats to the IoT supply chain. The publication analyses the different stages of the development process, explores the most important security considerations, identifies good practices to be taken into account at each stage, and offers readers additional resources from other initiatives, standards and guidelines.
Since IoT solutions are in most cases built from pre-prepared products, introducing the concepts of security by design and security by default is a fundamental building block for protecting this emerging technology. The agency has worked with IoT experts to create specific security guidelines for the whole lifespan of IoT devices.
These guidelines, meant to help tackle the complexity of IoT, focus on bringing together the key actors in the supply chain to adopt a comprehensive approach to security, leverage existing standards and implement security-by-design principles.
McAfee released a report examining cybercriminal activity related to malware and the evolution of cyber threats in Q2 2020. During this period, there was an average of 419 new threats per minute as overall new malware samples grew by 11.5%.
A significant proliferation in malicious Donoff Microsoft Office documents attacks propelled new PowerShell malware up 117%, and the global impact of COVID-19 prompted cybercriminals to adjust their cybercrime campaigns to lure victims with pandemic themes and exploit the realities of a workforce working from home.
“The second quarter of 2020 saw continued developments in innovative threat categories such as PowerShell malware and the quick adaptation by cybercriminals to target organizations through employees working from remote environments,” said Raj Samani, McAfee fellow and chief scientist.
“What began as a trickle of phishing campaigns and the occasional malicious app quickly turned into a deluge of malicious URLs, attacks on cloud users and capable threat actors leveraging the world’s thirst for more information on COVID-19 as an entry mechanism into systems across the globe.”
COVID-19-themed threat campaigns
After a first quarter that saw the world plunge into pandemic, the second quarter saw enterprises continue to adapt to unprecedented numbers of employees working from home and the cybersecurity challenges this new normal demands.
Over the course of Q2, a 605% increase in COVID-19-related attack detections was observed, compared to Q1.
Donoff and PowerShell malware
Donoff Microsoft Office documents act as TrojanDownloaders by leveraging the Windows Command shell to launch PowerShell and proceed to download and execute malicious files. Donoff played a critical role in driving the 689% surge in PowerShell malware in Q1 2020.
In Q2, the acceleration of Donoff-related malware growth slowed but remained robust, driving up PowerShell malware by 117% and helping to drive a 103% increase in overall new Microsoft Office malware. This activity should be viewed within the context of the overall continued growth trend in PowerShell threats. In 2019, total samples of PowerShell malware grew 1,902%.
Attacks on cloud users
Nearly 7.5 million external attacks on cloud user accounts were observed.
This data set represents companies in all major industries across the globe, including financial services, healthcare, public sector, education, retail, technology, manufacturing, energy, utilities, legal, real estate, transportation, and business services.
Q2 2020 threat activity
- Malware overall. 419 new threats per minute were observed in Q2 2020, an increase of almost 12% over the previous quarter. Ransomware growth remained steady compared to the first quarter of 2020.
- Coinminer malware. After growing 26% in Q1, new coinmining malware increased 25% over the previous quarter, sustained by the popularity of new coinmining applications.
- Mobile malware. After a 71% increase in new mobile malware samples in Q1, the category slowed in Q2, declining 15% despite a surge in Android Mobby Adware.
- Internet of Things. New IoT malware increased only 7% in Q2, but the space saw significant activity by Gafgyt and Mirai threats, both of which drove growth in new Linux malware by 22% during the period.
- Regional cyber activity. McAfee counted 561 publicly disclosed security incidents in the second quarter of 2020, an increase of 22% from Q1. Disclosed incidents targeting North America decreased 30% over the previous quarter. These incidents decreased 47% in the United States, but increased 25% in Canada and 29% in the United Kingdom.
- Attack vector. Overall, malware led among reported attack vectors accounting for 35% of publicly reported incidents in Q2. Account hijacking and targeted attacks accounted for 17% and 9% respectively.
- Sector activity. Disclosed incidents detected in the second quarter of 2020 targeting science and technology increased 91% over the previous quarter. Incidents in manufacturing increased 10%, but public sector events decreased by 14%.
The global number of industrial IoT connections will increase from 17.7 billion in 2020 to 36.8 billion in 2025, representing an overall growth rate of 107%, Juniper Research found.
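As a quick arithmetic check, the quoted overall growth rate follows directly from the two headline connection counts (the small gap from the published 107% presumably comes from rounding in the headline figures):

```python
# Connection counts are the report's rounded headline numbers, in billions.
connections_2020 = 17.7
connections_2025 = 36.8
growth = (connections_2025 - connections_2020) / connections_2020
# growth is about 1.08, i.e. roughly the ~107% overall growth Juniper reports
print(f"{growth:.1%}")
```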
The research identified smart manufacturing as a key growth sector of the industrial IoT market over the next five years, accounting for 22 billion connections by 2025.
The research predicted that 5G and LPWA (Low Power Wide Area) networks will play pivotal roles in creating attractive service offerings to the manufacturing industry, and enabling the realisation of the ‘smart factory’ concept, in which real-time data transmission and high connection densities allow highly-autonomous operations for manufacturers.
5G to maximise benefits of smart factories
The report identified private 5G services as crucial to maximising the value of a smart factory to service users, by leveraging the technology to enable superior levels of autonomy amongst operations.
It found that private 5G networks will prove most valuable when used for the transmission of large amounts of data in environments with a high density of connections, and where significant levels of data are generated. In turn, this will enable large-scale manufacturers to reduce operational spend through efficiency gains.
Software revenue to dominate industrial IoT market value
The research forecasts that over 80% of global industrial IoT market value will be attributable to software spend by 2025, reaching $216 billion. Software tools leveraging machine learning for enhanced data analysis and the identification of network vulnerabilities are now essential to connected manufacturing operations.
Research author Scarlett Woodford noted: “Manufacturers must exercise caution when implementing IoT technology, resisting the temptation to introduce connectivity to all aspects of operations. Instead, manufacturers must focus on the collection of data on the most valuable areas to drive efficiency gains.”
These days, you’d be hard-pressed to find connected devices that do not come with companion smartphone applications. In fact, it’s very common for contemporary devices to offload most (if not all) of their display to the user’s handset.
Smartphones and the rise of IoT
Relying on the ubiquity of smartphones and the rise of remote controls, users and vendors alike have embraced the move away from physical device interfaces. This evolution in the IoT ecosystem, however, brings major benefits AND serious drawbacks.
While users enjoy the remote capabilities of companion apps and vendors bypass the need for hardware interfaces, studies show that these apps present serious cybersecurity risks. For example, the communication between an IoT device and its app is often neither properly encrypted nor authenticated – and these issues enable the construction of exploits that achieve remote control of victims’ devices.
How the industry got here
It is important to explain that connected devices have not always been this way. I’m sure others like myself do not need to cast their minds far back to remember a time when smartphones did not even exist. User input during these halcyon days relied on physical interfaces on the device itself, interfaces that typically consisted of basic touch screens or two-line LCD displays.
Though functional, these physical interfaces were certainly limited (and limiting) when compared to the applications that superseded them. Devices without physical interfaces are smaller, consume less power, and look better. Developers, meanwhile, enjoy the relative ease of creating an app – with the additional support of software development kits – instead of manually programming physical interfaces. Perhaps most importantly, it’s many times cheaper for vendors to create devices with companion apps than to create devices with physical interfaces.
All that is without even starting on the benefits of remote connectivity! Smartphone apps enable users anywhere in the world to set the temperature of their air conditioning and record from their home security webcam with the click of a screen. These apps are simply much more expressive and intuitive than physical interfaces, enabling users to customize what they like from wherever they are. On the other hand, however, it is this element of remote connectivity which presents the compromise between usability and security.
The dangers of device companion apps
Unfortunately, the majority of companion apps have the potential to open devices to bad actors. Researchers last year found that about half are potentially exploitable through protocol analysis: because they rely on local or local-broadcast communication, they provide an attack path that exploits missing cryptography or hardcoded encryption keys. Further, this study of companion apps for some of Amazon’s most popular devices found a lack of encryption in one-third of cases and the use of hardcoded keys in one-fifth of cases.
These findings were confirmed in another study, in which researchers tested more than 2,000 device companion apps for security faults. They found that more than 30 devices from 10 vendors relied on the same cloud service to manage their devices – a cloud service with a reported security weakness that previously allowed attackers to take full control through device ID and password enumeration.
To make matters worse, there is little incentive for vendors to release fixes when vulnerabilities are uncovered. Most vendors in this space are small and medium-sized businesses that lack the budget for software quality control and security best practices. This issue is only exacerbated by the relative inexpensiveness of the devices they sell, meaning that vendors simply do not have the resources necessary to implement security best practices like monitoring agents or authentication hardware.
What users must do
The good news is that secure communication between a device and an app is possible. For example, EZVIZ smart home security applications support local communication between the companion app and the device over the local network. The shared encryption key is enclosed in the device box in the form of a QR code and must be scanned by the companion app. This strategy is better than hardcoded keys, provided that the key in the QR code is of sufficient length and randomness.
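A minimal sketch of why a per-device random key beats a hardcoded one, assuming an HMAC-style scheme purely for illustration (EZVIZ's actual protocol is not public at this level of detail; the key below stands in for the secret printed as a QR code in the device box):

```python
import hashlib
import hmac
import secrets

# Unique per device, 256 bits of randomness: the value the app would
# obtain by scanning the QR code shipped in the box.
device_key = secrets.token_bytes(32)

def sign_command(key: bytes, command: bytes) -> bytes:
    """App side: authenticate a local-network command with the shared key."""
    return hmac.new(key, command, hashlib.sha256).digest()

def verify_command(key: bytes, command: bytes, tag: bytes) -> bool:
    """Device side: accept only commands signed with this device's key."""
    expected = hmac.new(key, command, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

cmd = b"SET_TEMP 21"
tag = sign_command(device_key, cmd)
assert verify_command(device_key, cmd, tag)

# A hardcoded key shipped in every app binary can be extracted once and
# replayed against every device; a per-device key limits the blast radius.
other_device_key = secrets.token_bytes(32)
assert not verify_command(other_device_key, cmd, tag)
```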
Another security workaround can ensure that commands between the client and the device are not intercepted by a third party. Peer-to-peer is a private connection type used by German smart heating and cooling provider SOREL to ensure its smartphone app communicates without interference. The connection also minimizes risk for the company, since end users manage their data only on their own devices.
The bad news is that users today remain at the mercy of the vendors. There is currently no legislation that requires device makers to ensure that their devices or companion apps implement certain cybersecurity protocols. As we have seen time and again, vendor indifference to cybersecurity consistently results in subpar security protocols.
Therefore, the onus is on users to take extra cybersecurity steps in this context of vendor ambivalence. Until legislators catch up or manufacturers begin to implement stricter security protocols for their devices and apps, users will need to take matters into their own hands to make certain that the devices they bring into the workplace or the home are safe from outside forces. While the benefits of companion apps are clear, it is only the user who can prevent the worst dangers of these digital interfaces from becoming reality.
Cybercriminals are making growing use of ransomware, encrypted threats and attacks leveraging non-standard ports, while overall malware volume declined for the third consecutive quarter, SonicWall reveals.
“However, the overnight emergence of remote workforces and virtual offices has given cybercriminals new and attractive vectors to exploit. These findings show their relentless pursuit to obtain what is not rightfully theirs for monetary gain, economic dominance and global recognition.”
Key findings include:
- 39% decline in malware (4.4 billion YTD); volume down for third consecutive quarter
- 40% surge in global ransomware (199.7 million)
- 19% increase in intrusion attempts (3.5 trillion)
- 30% rise in IoT malware (32.4 million)
- 3% growth of encrypted threats (3.2 million)
- 2% increase in cryptojacking (57.9 million)
Malware volume dipping as attacks more targeted, diversified
While malware authors and cybercriminals are still busy working to launch sophisticated cyberattacks, the research concludes that overall global malware volume continued its steady decline in 2020. In a year-over-year comparison through the third quarter, researchers recorded 4.4 billion malware attacks — a 39% drop worldwide.
Regional comparisons show India (-68%) and Germany (-64%) have once again seen a considerable drop-rate percentage, as well as the United States (-33%) and the United Kingdom (-44%). Lower numbers of malware do not mean it is going away entirely. Rather, this is part of a cyclical downturn that can very easily right itself in a short amount of time.
Ransomware erupts, Ryuk responsible for third of all attacks
Ransomware attacks are making daily headlines as they wreak havoc on enterprises, municipalities, healthcare organizations and educational institutions. Researchers tracked aggressive growth during each month of Q3, including a massive spike in September.
While sensors in India (-29%), the U.K. (-32%) and Germany (-86%) recorded decreases, the U.S. saw a staggering 145.2 million ransomware hits — a 139% YoY increase.
Notably, researchers observed a significant increase in Ryuk ransomware detections in 2020. Through Q3 2019, just 5,123 Ryuk attacks were detected. Through Q3 2020, 67.3 million Ryuk attacks were detected — 33.7% of all ransomware attacks this year.
“What’s interesting is that Ryuk is a relatively young ransomware family that was discovered in August 2018 and has made significant gains in popularity in 2020,” said SonicWall VP, Platform Architecture, Dmitriy Ayrapetov.
“The increase of remote and mobile workforces appears to have increased its prevalence, resulting not only in financial losses, but also impacting healthcare services with attacks on hospitals.
“Ryuk is especially dangerous because it is targeted, manual and often leveraged via a multi-stage attack preceded by Emotet and TrickBot malware. Therefore, if an organization has Ryuk, it’s a pretty good indication that it’s infested with several types of malware.”
IoT dependency grows along with threats
COVID-19 led to an unexpected flood of devices on networks, resulting in an increase of potential threats to companies fighting to remain operational during the pandemic. A 30% increase in IoT malware attacks was found, a total of 32.4 million worldwide.
Most IoT devices — including voice-activated smart devices, door chimes, TV cameras and appliances — were not designed with security as a top priority, making them susceptible to attack and supplying perpetrators with numerous entry points.
“Employees used to rely upon the safety office networks provided, but the growth of remote and mobile workforces has extended distributed networks that serve both the house and home office,” said Conner.
“Consumers need to stop and think if devices such as AC controls, home alarm systems or baby monitors are safely deployed. For optimum protection, professionals using virtual home offices, especially those operating in the C-suite, should consider segmenting home networks.”
Threat intelligence data also concluded that while cryptojacking (57.9 million), intrusion attempts (3.5 trillion) and IoT malware threats (32.4 million) are trending with first-half volume reports, they continue to pose a threat and remain a source of opportunity for cybercriminals.
Connected devices are becoming more ingrained in our daily lives and the burgeoning IoT market is expected to grow to 41.6 billion devices by 2025. As a result of this rapid growth and adoption at the consumer and commercial level, hackers are infiltrating these devices and mounting destructive hacks that put sensitive information and even lives at risk.
These attacks and potential dangers have kept security at top of mind for manufacturers, technology companies and government organizations, which ultimately led to the U.S. House of Representatives passing the IoT Cybersecurity Improvement Act of 2020.
The bill focuses on increasing the security of federal devices with standards provided by the National Institute of Standards and Technology (NIST), covering devices from development to the final product. The bill also requires Homeland Security to review the legislation every five years and revise it as necessary, which will keep it up to date with the latest innovative tech and any new standards that come along with it.
Although it is a step in the right direction to tighten security for federal devices, it only scratches the surface of what the IoT industry needs as a whole. However, as this bill is the first of its kind to be passed by the House, we need to consider how it will help shape the future of IoT security:
Better transparency throughout the device lifecycle
With a constant focus on innovation in the IoT industry, security is often overlooked in the rush to get a product onto shelves. By the time devices are ready to be purchased, important details like vulnerabilities may not have been disclosed throughout the supply chain, leaving sensitive data exposed to exploitation. To date, many companies have been hesitant to publish these weak spots in their device security, in order to keep them under wraps and their competition and hackers at bay.
However, the bill now requires contractors and subcontractors involved in developing and selling IoT products to the government to have a program in place for reporting vulnerabilities and their subsequent resolutions. This is key to increasing end-user transparency on devices and will better inform the government about risks found in the supply chain, so it can update the bill’s guidelines as needed.
For the future of securing connected devices, multiple stakeholders throughout the supply chain need to be held accountable for better visibility and security to guarantee adequate protection for end-users.
Public-private partnerships on the rise
Per this bill, for the development of the security guidelines, the government will need to consult with cybersecurity experts to align on industry standards and best practices for better IoT device protection.
Working with industry-led organizations can provide accurate insight and allow the government to see current loopholes to create standards for real-world application. Encouraging these public-private partnerships is essential to advancing security in a more holistic way and will ensure guidelines and standards aren’t created in a silo.
Shaping consumer security from a federal focused bill
The current bill only focuses on securing devices on a federal level, but with manufacturers and technology companies working in both the commercial/government and consumer spaces, the bill will naturally spill over into the consumer device market too. It’s not practical for a manufacturer to follow two separate guidelines for the two categories of products, so the standards in place for government-contracted devices will likely be applied to all devices on the assembly line.
As the focus shifts to consumer safety after this bill, industry voices have raised the challenge of manufacturers eventually having to test products against two bills, one with federal and one with consumer standards. The only remedy is global, adoptable and scalable standards across all industries that streamline security and provide appropriate protection for every device category.
Universal standards – Are we there yet?
While this bill is a great start for the IoT industry and may serve as the catalyst for future IoT bills, there is still some room for improvement for the future of connected device security. In its current form, the bill does not explicitly define the guidelines for security, which can be frustrating and confusing for IoT device stakeholders who need to comply with them. With multiple government organizations and industry-led programs creating their own set of standards, the only way to truly propel this initiative forward is to harmonize and clearly define standards for universal adoption.
While the IoT bill signals momentum from the US government to prioritize IoT security, an international effort to establish global standards and protect connected devices must follow, as the IoT knows no boundaries. Syncing these standards and enforcing them through trusted certification programs will hold manufacturers and tech companies accountable for security and provide transparency for end-users on a global scale.
The IoT Cybersecurity Improvement Act of 2020 is a landmark accomplishment for the IoT industry but is only just the beginning. As the world grows more integrated through connected devices, security standards will need to evolve to keep up with the digital transformation occurring in nearly every industry.
Because security remains a key concern for device manufacturers, tech companies, consumers and government organizations, the need for global standards stays in focus, and it will likely take an act of Congress to make them a priority.
Healthcare delivery organizations (HDOs) have been busy increasing their network and systems security in the last year, though there is still much room for improvement, according to Forescout researchers.
First, the good news: the percentage of devices running unsupported Windows operating systems fell from 71% in 2019 to 32% in 2020, and there have been improvements in timely patching and network segmentation.
The bad news? Some network segmentation issues still crop up and HDOs still use insecure protocols for both medical and non-medical network communications, as well as for external communications.
Based on two data sources – an analysis of network traffic from five large hospitals and clinics and the Forescout Device Cloud (containing data for some 3.3 million devices in hundreds of healthcare networks) – the researchers found that, between April 2019 and April 2020:
- The percentage of devices running versions of Windows OS that will be supported for more than a year jumped from 29% to 68%, and the percentage of devices running Windows OS versions supported only via Extended Security Updates (ESU) fell from 71% to 32%. Unfortunately, the percentage of devices running fully unsupported Windows OSes like Windows XP and Windows Server 2003 remained constant (though small)
- There was a decided increase in network segmentation
Unfortunately, most network segments (VLANs) still mix healthcare devices with IT devices, mix healthcare equipment with personal and OT devices, or combine sensitive and vulnerable devices in the same segment.
As far as communication protocols are concerned, they found that:
- 4 out of the 5 HDOs were communicating between public and private IP addresses using HL7, a medical protocol that transports medical information in clear text
- 2 out of the 5 HDOs allowed medical devices to communicate over IT protocols with external servers reachable from outside the HDO’s perimeter
- All HDOs used obsolete versions of communication protocols, internally and externally (e.g., SSLv3, TLSv1.0, and TLSv1.1, SNMP v1 and 2, NTP v1 and 2, Telnet)
- Many of the medical and proprietary protocols used by medical equipment lack encryption and authentication, or don’t enforce their use (e.g., HL7, DICOM, POCT01, LIS02). OT and IoT devices in use have similar problems
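HL7 v2’s pipe-delimited wire format makes the exposure concrete: anyone who can read the raw bytes can recover the patient record directly, with no decryption needed. A minimal sketch with a fabricated sample message (the message content and field layout here are illustrative, not taken from the report):

```python
# Fabricated HL7 v2-style message; segments are carriage-return separated,
# fields are pipe-delimited plain text. No real PHI.
raw = ("MSH|^~\\&|LAB|HOSP|EMR|HOSP|202004011200||ADT^A01|MSG001|P|2.3\r"
       "PID|1||12345||DOE^JANE||19800101|F")

def patient_name(message: str) -> str:
    """Pull the patient name out of the PID segment of a cleartext message."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] == "PID":
            family, given = fields[5].split("^")
            return f"{given} {family}"
    return ""

print(patient_name(raw))  # anyone on the network path can do the same
```

The point of the sketch is simply that a passive observer needs nothing more than string splitting to read patient data sent over cleartext HL7.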
That’s a big deal: attacks exploiting these weaknesses could do a lot of damage, from stealing or altering patients’ information to disrupting the normal behavior of medical devices or the functioning of the entire organization (e.g., via a ransomware attack).
Defense strategies for better healthcare network security
The researchers advised HDOs’ cyber defenders to:
- Find a way to “see” all the devices on the network, whether they comply with company policies, and detect malicious network behavior they may exhibit
- Identify and remediate weak and default passwords
- Map the network flow of existing communications to help identify unintended external communications, prevent medical data from being exposed publicly, and to detect the use of insecure protocols
- Improve segmentation of devices (e.g., isolate fragile legacy applications and operating systems, segment groups of devices according to their purpose, etc.)
“Whenever possible, switch to using encrypted versions of protocols and eliminate the usage of insecure, clear-text protocols such as Telnet. When this is not possible, use segmentation for zoning and risk mitigation,” they noted.
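The flow-mapping and insecure-protocol advice above can be sketched as a simple port-based triage pass over flow records. The port-to-protocol table and sample flows below are illustrative assumptions, not the researchers’ tooling (real detection would use deep packet inspection rather than port numbers alone):

```python
# Cleartext/legacy protocols called out in the report, keyed by common ports.
# Port assignments are illustrative; production triage should not rely on
# ports alone.
INSECURE_PORTS = {
    23: "Telnet (cleartext)",
    161: "SNMP v1/v2 (weak auth)",
    2575: "HL7 MLLP (cleartext)",
}

def flag_insecure(flows):
    """Yield (src, dst, reason) for flows hitting known-insecure ports."""
    for src, dst, dport in flows:
        if dport in INSECURE_PORTS:
            yield (src, dst, INSECURE_PORTS[dport])

# Illustrative flow records: (source IP, destination IP, destination port)
sample = [
    ("10.0.1.5", "10.0.2.9", 2575),    # medical device sending HL7 in clear
    ("10.0.1.7", "10.0.9.1", 443),     # TLS, not flagged
    ("10.0.3.2", "198.51.100.4", 23),  # Telnet to an external host
]
for src, dst, why in flag_insecure(sample):
    print(f"{src} -> {dst}: {why}")
```

Even this crude pass surfaces the two patterns the researchers warn about: cleartext medical protocols inside the network, and legacy protocols crossing the perimeter.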
They also warned about the danger of over-segmentation.
“Segmentation requires well-defined trust zones based on device identity, risk profiles and compliance requirements for it to be effective in reducing the attack surface and minimizing blast radius. Over-segmentation with poorly defined zones simply increases complexity without tangible security benefits,” they concluded.
Attacks on IoT devices continue to rise at an alarming rate due to poor security protections and cybercriminals’ use of automated tools to exploit these vulnerabilities, according to Nokia.
IoT devices most infected
The report found that internet-connected, or IoT, devices now make up roughly 33% of infected devices, up from about 16% in 2019. The report’s findings are based on data aggregated from monitoring network traffic on more than 150 million devices globally.
Adoption of IoT devices, from smart home security monitoring systems to drones and medical devices, is expected to continue growing as consumers and enterprises move to take advantage of the high bandwidth, ultra-low latency, and fundamentally new networking capabilities that 5G mobile networks enable, according to the report.
The rate of success in infecting IoT devices depends on the visibility of the devices to the internet, according to the report. In networks where devices are routinely assigned public-facing internet IP addresses, a high infection rate is seen.
In networks where carrier-grade Network Address Translation is used, the infection rate is considerably reduced, because the vulnerable devices are not visible to network scanning.
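The public-versus-NATed distinction is straightforward to check programmatically. A minimal sketch using Python’s standard ipaddress module (the sample addresses are illustrative):

```python
import ipaddress

def is_internet_visible(addr: str) -> bool:
    """Return True if the address is globally routable, i.e. not in
    RFC 1918 private space, loopback, link-local, or reserved ranges."""
    return ipaddress.ip_address(addr).is_global

# One device behind carrier-grade NAT / private addressing, one exposed.
for addr in ("192.168.1.20", "8.8.8.8"):
    status = "exposed to scanning" if is_internet_visible(addr) else "behind NAT/private"
    print(addr, "->", status)
```

A device on a private address is invisible to internet-wide port scanners, which is exactly why the report sees lower infection rates behind carrier-grade NAT.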
Cybercriminals taking advantage of the pandemic
The report also reveals there is no let up in cybercriminals using the COVID-19 pandemic to try to steal personal data through a variety of types of malware. One in particular is disguised as a Coronavirus Map application – mimicking the legitimate and authoritative Coronavirus Map issued by Johns Hopkins University – to take advantage of the public’s demand for accurate information about COVID-19 infections, deaths and transmissions.
But the bogus application is used to plant malware on victims’ computers to exploit personal data. “Cybercriminals are playing on people’s fears and are seeing this situation as an opportunity to promote their agendas,” the report says. The report urges the public to install applications only from trusted app stores, like Google and Apple.
Bhaskar Gorti, President and Chief Digital Officer, Nokia, said: “The sweeping changes that are taking place in the 5G ecosystem, with even more 5G networks being deployed around the world as we move to 2021, open ample opportunities for malicious actors to take advantage of vulnerabilities in IoT devices.
“This report reinforces not only the critical need for consumers and enterprises to step up their own cyber protection practices, but for IoT device producers to do the same.”
Organizations are rapidly increasing the size, scope and scale of their data protection infrastructure, reflected in dramatic rises in adoption of public key infrastructure (PKI) across enterprises worldwide, according to Entrust research.
PKI is at the core of nearly every IT infrastructure, enabling security for critical digital initiatives such as cloud, mobile device deployment, identities and the IoT.
The annual study is based on feedback from more than 1,900 IT security professionals in 17 countries.
IoT, authentication and cloud, top drivers in PKI usage growth
As organizations become more dependent on digital information and face increasingly sophisticated cyberattacks, they rely on PKI to control access to data and ascertain the identities of people, systems and devices on a mass scale.
IoT is the fastest growing trend driving PKI application deployment, up 26 percent over the past five years to 47 percent in 2020, with cloud-based services the second highest driver cited by 44 percent of respondents.
PKI usage surging for cloud and authentication use cases
TLS/SSL certificates for public-facing websites and services are the most often cited use case for PKI credentials (84 percent of respondents).
Public cloud-based applications saw the fastest year-over-year growth, cited by 82 percent, up 27 percent from 2019, followed by enterprise user authentication by 70 percent of respondents, an increase of 19 percent over 2019. All underscore the critical need of PKI in supporting core enterprise applications.
The average number of certificates an organization needs to manage grew 43 percent in the 2020 study over the previous year, from 39,197 to 56,192 certificates, highlighting a pivotal requirement for enterprise certificate management.
The rise is likely driven by the industry transition to shorter certificate validity periods, and the sharp growth in cloud and enterprise user authentication use cases.
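The operational impact of shorter validity periods is easy to estimate. A back-of-the-envelope sketch using the study’s 2020 average of 56,192 certificates (the 90-day lifetime is an assumption reflecting the trend toward shorter public-TLS validity, not a figure from the study):

```python
def renewals_per_day(cert_count: int, validity_days: int) -> float:
    """Average daily renewals if certificates are spread evenly
    across their validity window."""
    return cert_count / validity_days

# Study's 2020 average inventory: 56,192 certificates
for validity in (365, 90):
    print(f"{validity}-day certs: ~{renewals_per_day(56192, validity):.0f} renewals/day")
```

Moving from one-year to 90-day certificates roughly quadruples the daily renewal load, which is why the report points to automated certificate management.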
Challenges, change and uncertainty
The study found that IT security professionals are confronting new challenges in enabling applications to use PKI. 52 percent cited lack of visibility into an existing PKI’s security capabilities as their top challenge, an increase of 16 percent over the 2019 study.
This issue underscores the lack of cybersecurity expertise available within even the most well-resourced organizations, and the need for PKI specialists who can create custom enterprise roadmaps based on security and operational best practices.
Respondents also cited inability to change legacy applications and the inability of their existing PKIs to support new applications as critical challenges – both at 51 percent.
When it comes to deploying and managing a PKI, IT security professionals are most challenged by organizational issues such as no clear ownership, insufficient skills and insufficient resources.
PKI deployment figures from the study clearly indicate a trend toward more diversified approaches, with as-a-service offerings even becoming more prevalent than on-premise offerings in some countries.
The two greatest areas of PKI change and uncertainty come from new applications such as IoT (52 percent of respondents) and external mandates and standards (49 percent). The regulatory environment is also increasingly driving deployment of applications that use PKI, cited by 24 percent of respondents.
Security practices have not kept pace with growth
In the next two years, an estimated average of 41 percent of IoT devices will rely primarily on digital certificates for identification and authentication. Encryption for IoT devices, platforms and data repositories, while growing, is at just 33 percent – a potential exposure point for sensitive data.
Respondents cited several threats to IoT security, including altering the function of IoT devices through malware or other attacks (68 percent) and remote control of a device by an unauthorized user (54 percent).
However, respondents rated controls relevant to malware protection – like securely delivering patches and updates to IoT devices – last on a list of the five most important IoT security capabilities.
The US National Institute of Standards and Technology (NIST) recommends that cryptographic modules for certificate authorities (CAs), key recovery servers and OCSP responders should be validated to FIPS 140-2 level 3 or higher.
Thirty-nine percent of respondents in this study use hardware security modules (HSMs) to secure their PKIs, most often to manage the private keys for their root, issuing, or policy CAs. Yet only 12 percent of respondents indicate the use of HSMs in their OCSP installations, demonstrating a significant gap between best practices and observed practices.
“PKI underpins the security of both the business and the consumer world, from digitally signing transactions and applications to prove the source as well as integrity, to supporting the authentication of smart phones, games consoles, citizen passports, mass transit ticketing and mobile banking,” says Larry Ponemon, founder of the Ponemon Institute.
“The 2020 Global PKI and IoT Trends Study shows a surge in the use of PKI credentials for cloud-based applications and enterprise user authentication, underscoring the criticality of PKI in supporting core enterprise applications.”
“We are seeing increasing reliance on PKI juxtaposed with struggles by internal teams to adapt it to new market needs — driving changes to traditional PKI deployment models and methods,” says John Grimm, vice president strategy for digital solutions at Entrust.
“In newer areas like IoT, enterprises are clearly failing to prioritize security mechanisms like firmware signing that would counter the most urgent threats, such as malware.
“And with the massive increase in certificates issued and acquired found in this year’s study, the importance of automated certificate management, a flexible PKI deployment approach, and strong best practice-based security including HSMs has never been greater.”
The ongoing global pandemic that has led to massive levels of remote work and an increased use of hybrid IT systems is leading to greater insecurity and risk exposure for enterprises.
According to new data released by Cybersecurity Insiders, 72% of organizations experienced an increase in endpoint and IoT security incidents in the last year, while 56% anticipate their organization will likely be compromised by an endpoint- or IoT-originated attack within the next 12 months.
The comprehensive survey of 325 IT and cybersecurity decision makers in the US, conducted in September 2020, represented a balanced cross-section of organizations from financial services, healthcare and technology to government and energy.
IoT and endpoint security challenges
Alongside headline data that the majority experienced an endpoint and IoT security incident over the last 12 months, the top 3 issues were related to malware (78%), insecure network and remote access (61%), and compromised credentials (58%).
Perhaps more concerning was that 43% of respondents expressed “moderate to unlikely means to discover, identify, and respond to unknown, unmanaged, or insecure devices accessing network and cloud resources.”
“It is clear from this new research that the challenge of securing IoT and endpoints has escalated considerably as employees have been forced to work remotely while organizations try to rapidly adapt to the situation,” said Scott Gordon, CMO at Pulse Secure.
“The threat is real and growing. Yet, on a positive note, the survey shows that organizations are investing in key initiatives and adopting zero trust elements such as remote access device posture checking and Network Access Control (NAC) to address some of these issues.“
The research found that 41% will implement or advance on-premise device security enforcement, 35% will advance their remote access device posture checking, and 22% will advance their IoT device identification and monitoring capabilities.
The negative impact of an endpoint or IoT security issue
For those that have been victims of an endpoint or IoT security issue, the most significant negative impact was a reported loss of user (55%) and IT (45%) productivity, followed by system downtime (42%).
Holger Schulze, CEO at Cybersecurity Insiders added, “The diversity of users, devices, networks, and threats continue to grow as enterprises take advantage of greater workforce mobility, workplace flexibility, and cloud computing opportunities.
“Not only do organizations need to ensure endpoints are secure and adhering to usage policy, but they must also manage appropriate IoT device access. New zero trust security controls can fortify dynamic device discovery, verification, tracking, remediation, and access enforcement.”
Additional key findings
- Respondents rated the biggest endpoint and IoT security challenges as #1 insufficient protection against the latest threats (49%), #2 high complexity of deployment and operations (47%), and #3 inability to enforce endpoint and IoT device access/usage policy (40%).
- Respondents rated the most critical capabilities required to mitigate endpoint and IoT security as #1 monitoring endpoint or IoT devices for malicious or anomalous activity (54%), #2 blocking or isolating unknown or at-risk endpoint and IoT devices’ network access (51%), and #3 blocking at-risk devices’ access to network or cloud resources (46%).
- When asked about anticipated investments to secure remote worker access and endpoint security technology, most organizations (61%) anticipate an increase, or significant increase, while few expect a decrease (6%).
Vodafone Business launched a report focused on the impact IoT is having on businesses at a time when their digital capabilities are put to the test by the COVID-19 pandemic.
The report features responses from 1,639 businesses globally, exploring how they are using IoT and how IoT is helping them be ready for the future.
IoT has made the difference for business success
The pandemic has forced almost all businesses to change their working practices and priorities in a matter of weeks, with the findings showing 77% of adopters increased the pace of IoT projects during this time.
Adopters clearly believe IoT was vital to keep them going: 84% said the technology was key to maintaining business continuity during the pandemic. As a result, 84% of adopters now view the integration of IoT devices with workers as a higher priority and 73% of businesses considering IoT agree the pandemic will accelerate their adoption plans.
IoT is key to improving business performance
The research findings are clear: IoT continues to generate value and ROI for adopters and 87% agree their core business strategy has changed for the better as a result of adopting IoT.
95% say they have achieved a return on investment and 55% of adopters have seen operating costs decrease by an average of 21%.
From improving operational efficiency to creating new connected products and services, key benefits of IoT deployments include boosted employee productivity (49%) and improved customer experience (59%).
Data is the key to future readiness
You can’t manage what you can’t measure. IoT data is becoming essential to support businesses’ decision-making (59%) and 84% of adopters think they can do things they couldn’t do before thanks to IoT. And IoT data is also helping 84% of businesses meet their sustainability goals.
IoT benefits clearly outweigh the risks
Businesses see IoT as an essential element of being future ready. So much so that 73% say that organisations who have failed to embrace IoT will have fallen behind within five years.
While cybersecurity was one of the main barriers to businesses’ willingness to adopt IoT in previous years, the IoT Spotlight 2020 sees those concerns significantly reduced, with only 18% of businesses ranking it among the top three barriers to IoT adoption.
This, coupled with the improvements in brand differentiation and competitiveness (43%) shown by mature adopters of IoT, proves that businesses embracing this technology believe the opportunities IoT offers greatly outweigh the challenges of implementation.
Erik Brenneis, Internet of Things Director at Vodafone Business said: “IoT has grown up. It’s no longer just about increasing return on investment or providing cost savings to businesses: it’s changing the way they think and operate. And it’s giving them an opportunity to re-design their operations and future-proof their business model. This research proves IoT is an essential technology for businesses that want to be resilient, more flexible and quicker to adapt and react to change.”
Hacking a Coffee Maker
As expected, IoT devices are filled with vulnerabilities:
As a thought experiment, Martin Hron, a researcher at security company Avast, reverse engineered one of the older coffee makers to see what kinds of hacks he could do with it. After just a week of effort, the unqualified answer was: quite a lot. Specifically, he could trigger the coffee maker to turn on the burner, dispense water, spin the bean grinder, and display a ransom message, all while beeping repeatedly. Oh, and by the way, the only way to stop the chaos was to unplug the power cord.
In any event, Hron said the ransom attack is just the beginning of what an attacker could do. With more work, he believes, an attacker could program a coffee maker — and possibly other appliances made by Smarter — to attack the router, computers, or other devices connected to the same network. And the attacker could probably do it with no overt sign anything was amiss.
IoT gateways are becoming an increasingly important link in the IoT security and device authentication value chain and emerging as a crucial conduit for intelligent operations across the entire IoT.
The new wave of next-generation smart IoT gateways has arrived at an opportune time, enabling a breadth of novel security, intelligence, and authentication operations at the edge, causing IoT vendors to revisit their deployment and management strategies.
According to ABI Research, there will be 21.4 million next-gen smart IoT gateways shipped in 2025.
“Smart IoT gateways are currently caught amid a greater transformative evolution, further enhancing capabilities for gateways, shifting focus toward the edge, and reversing the cloud-centric investment priorities of the past decade,” states Dimitrios Pavlakis, Digital Security analyst at ABI Research.
The characteristics of next-gen smart IoT gateways
The primary characteristics of next-gen IoT gateways include enhanced cybersecurity options, extended connectivity support, edge processing and filtering, authentication and management, cloud services, analytics, and intelligence operations.
These highly demanding technological characteristics have been steadily reaching the core of the implementation lists of IoT implementers, shifting the dynamics of IoT security and pulling focus ever closer to the edge.
“This is not to say that edge-focused IoT gateways will completely replace data servers and cloud computing – far from it. Rather they are set to create a more symbiotic relationship between them while increasing the amount of responsibility towards edge computing and intelligence-gathering operations,” Pavlakis explains.
Turning challenges into well-honed value propositions
The current market demands brought forth by the intense increase of IoT technologies allow gateway vendors to turn challenges into well-honed value propositions. This can include tackling the secure transition of legacy equipment into larger IoT fleets, enabling increased visibility, monitoring, and management of IoT devices, aiding in the clash between IT and OT in industrial and healthcare systems, and streamlining digital security and device management.
The surge of IoT gateways shipments is expected to create a variable penetration rate across different IoT end markets led by innovative gateway vendors like Advantech, Cisco, Kerlink, MultiTech, and Sierra Wireless.
“The data suggest that video surveillance, heavy transport vehicles and equipment, intelligent transportation, and fleet management depict the highest penetration rate for the next-level security and intelligence components for smart IoT gateways, with a clear focus revolving around automotive verticals and data-heavy applications,” Pavlakis concludes.
5G is set to deliver higher data transfer rates for mission-critical communications and will allow massive broadband capacities, enabling high-speed communication across various applications such as the Internet of Things (IoT), robotics, advanced analytics and artificial intelligence.
According to a study from CommScope, only 46% of respondents feel their current network infrastructure is capable of supporting 5G, but 68% think 5G will have a significant impact on their agency operations within one to four years.
Of the respondents who do not feel their current infrastructure is capable of supporting 5G, none have deployed 5G, 19% are piloting, 43% are planning to pilot, and 52% are not planning or evaluating whether to pilot 5G.
Costs reported as top barriers to 5G implementation
According to the report, ongoing and initial costs are reported as top barriers for federal agencies wishing to implement 5G – 44% believe initial/up-front costs will be the biggest barrier and 49% are concerned about ongoing costs.
“This study indicates that federal agencies are at the beginning stages of 5G evaluation and deployment. As they are looking to finalize their strategy for connectivity, agencies should also consider private networks, whether those are private LTE networks, private 5G networks, or a migration from one to the other to ensure flexibility and scalability.”
Desired outcomes for federal agencies
Remote employee productivity (40%) is one of the top desired outcomes for federal agencies looking to implement 5G, along with introducing high bandwidth (39%), higher throughput (39%) and better connectivity (38%).
Additional findings from the study include:
- 32% hope that 5G will make it easier to share information securely and 32% would like to see easier access to data
- 82% plan to or have already adopted 5G with 6% having already deployed 5G, 14% piloting 5G and 62% evaluating/planning to pilot 5G
- 71% are looking at hardware, software or endpoint upgrades to support 5G
- 83% believe it is very/somewhat important for mission-critical traffic on the agency network to remain onsite while 64% feel it is very/somewhat important
67% of business and IT managers expect the sheer quantity of data to grow nearly five times by 2025, a Splunk survey reveals.
The research shows that leaders see the significant opportunity in this explosion of data and believe data is extremely or very valuable to their organization in terms of: overall success (81%), innovation (75%) and cybersecurity (78%).
81% of survey respondents believe data to be very or highly valuable yet 57% fear that the volume of data is growing faster than their organizations’ ability to keep up.
“The data age is here. We can now quantify how data is taking center stage in industries around the world. As this new research demonstrates, organizations understand the value of data, but are overwhelmed by the task of adjusting to the many opportunities and threats this new reality presents,” said Doug Merritt, President and CEO, Splunk.
“There are boundless opportunities for organizations willing to quickly learn and adapt, embrace new technologies and harness the power of data.”
The data age has been accelerated by emerging technologies powered by, and contributing to, exponential data growth. Chief among these emerging technologies are Edge Computing, 5G networking, IoT, AI/ML, AR/VR and Blockchain.
It’s these very same technologies 49% of those surveyed expect to use to harness the power of data, but across technologies, on average, just 42% feel they have high levels of understanding of all six.
Data is valuable, and data anxiety is real
To thrive in this new age, every organization needs a complete view of its data — real-time insight, with the ability to take real-time action. But many organizations feel overwhelmed and unprepared. The study quantifies the emergence of a data age as well as the recognition that organizations have some work to do in order to use data effectively and be successful.
- Data is extremely or very valuable to organizations in terms of: overall success (81%), innovation (75%) and cybersecurity (78%).
- And yet, 66% of IT and business managers report that half or more of their organizations’ data is dark (untapped, unknown, unused) — a 10% increase over the previous year.
- 57% say the volume of data is growing faster than their organizations’ ability to keep up.
- 47% acknowledge their organizations will fall behind when faced with rapid data volume growth.
Some industries are more prepared than others
The study quantifies the emergence of a data age and the adoption of emerging technologies across industries, including:
- Across industries, IoT has the most current users (but only 28%). 5G has the fewest and has the shortest implementation timeline at 2.6 years.
- Confidence in understanding of 5G’s potential varies: 59% in France, 62% in China and only 24% in Japan.
- For five of the six technologies, financial services leads in terms of current development of use cases. Retail comes second in most cases, though retailers lag notably in adoption of AI.
- 62% of healthcare organizations say that half or more of their data is dark and that they struggle to manage and leverage data.
- The public sector lags commercial organizations in adoption of emerging technologies.
- More manufacturing leaders predict growth in data volume (78%) than in any other industry; 76% expect the value of data to continue to rise.
Some countries are more prepared than others
The study also found that countries seen as technology leaders, like the U.S. and China, are more likely to be optimistic about their ability to harness the opportunities of the data age.
- 90% of business leaders from China expect the value of data to grow. They are by far the most optimistic about the impact of emerging technologies, and they are getting ready. 83% of Chinese organizations are prepared, or are preparing, for rapid data growth compared to just 47% across all regions.
- U.S. leaders are the second most confident in their ability to prepare for rapid data growth, with 59% indicating that they are at least somewhat confident.
- In France, 59% of respondents say that no one in their organization is having conversations about the impact of the data age. Meanwhile, in Japan 67% say their organization is struggling to stay up to date, compared to the global average of 58%.
- U.K. managers report relatively low current usage of emerging technologies but are optimistic about plans to use them in the future. For example, just 19% of U.K. respondents say they are currently using AI/ML technologies, but 58% say they will use them in the near future.
Liability for cyber-physical security incidents will pierce the corporate veil to personal liability for 75% of CEOs by 2024, according to Gartner.
Due to the nature of cyber-physical systems (CPSs), incidents can quickly lead to physical harm to people, destruction of property or environmental disasters. Gartner analysts predict that incidents will rapidly increase in the coming years because security focus and spending are not currently aligned with these assets.
The function of CPSs
CPSs are defined as systems that are engineered to orchestrate sensing, computation, control, networking and analytics to interact with the physical world (including humans). They underpin all connected IT, operational technology (OT) and Internet of Things (IoT) efforts where security considerations span both the cyber and physical worlds, such as asset-intensive, critical infrastructure and clinical healthcare environments.
“Regulators and governments will react promptly to an increase in serious incidents resulting from failure to secure CPSs, drastically increasing rules and regulations governing them,” said Katell Thielemann, research vice president at Gartner.
“In the U.S., the FBI, NSA and CISA have already increased the frequency and details provided around threats to critical infrastructure-related systems, most of which are owned by private industry. Soon, CEOs won’t be able to plead ignorance or retreat behind insurance policies.”
The financial impact of CPS attacks resulting in fatal casualties is predicted to reach over $50 billion by 2023. Even without taking the actual value of a human life into the equation, the costs for organizations in terms of compensation, litigation, insurance, regulatory fines and reputation loss will be significant.
“Technology leaders need to help CEOs understand the risks that CPSs represent and the need to dedicate focus and budget to securing them,” said Ms. Thielemann. “The more connected CPSs are, the higher the likelihood of an incident occurring.”
Many enterprises not aware of CPSs already deployed in their organization
With OT, smart buildings, smart cities, connected cars and autonomous vehicles evolving, incidents in the digital world will have a much greater effect in the physical world as risks, threats and vulnerabilities now exist in a bidirectional, cyber-physical spectrum.
However, many enterprises are not aware of CPSs already deployed in their organization, either due to legacy systems connected to enterprise networks by teams outside of IT, or because of new business-driven automation and modernization efforts.
“A focus on ORM – or operational resilience management – beyond information-centric cybersecurity is sorely needed,” Ms. Thielemann said.
Beginning September 1st, all publicly trusted TLS certificates must have a lifespan of 398 days or less. According to security experts from Venafi, this latest change is another indication that machine identity lifetimes will continue to shrink.
Since many organizations lack the automation capabilities necessary to replace certificates with short lifespans at machine scale and speed, they are likely to see sharp increases in outages caused by unexpected certificate expirations.
“Apple’s unilateral move to reduce machine identity lifespans will profoundly impact businesses and governments globally,” said Kevin Bocek, vice president of security strategy and threat intelligence at Venafi.
“The interval between certificate lifecycle changes is shrinking, while at the same time, certificate lifecycles themselves are being reduced. In addition, the number of machines—including IoT and smart devices, virtual machines, AI algorithms and containers—that require machine identities is skyrocketing.
“It seems inevitable that certificate-related outages, similar to those that have haunted Equifax, LinkedIn, and the State of California, will spiral out-of-control over the next few years.”
The interval between changes in the length of certificate lifespans has been shrinking over the last decade:
- Pre-2011: Certificate lifespans ran 8–10 years (96 months or more)
- 2012: Certificate lifespans were shortened to 60 months (five years), a reduction of 37%. This change was preplanned in CA/Browser Forum Baseline Requirements.
- 2015: Certificate lifespans were shortened to 39 months (3 years), a reduction of 35%. This change happened three years after the five-year limitation was adopted.
- 2018: Certificate lifespans were shortened to 27 months (two years), a reduction of 30%. This change happened two years after the three-year limitation was adopted.
- 2020: Certificate lifespans were shortened to 13 months, a reduction of 51%. This change happened one year after the two-year limitation was adopted.
Bocek continued: “If the interval between lifecycle changes continues on its current cadence, it’s likely that we could see certificate lifespans for all publicly trusted TLS certificates reduced to 6 months by early 2021 and perhaps become as short as three months by the end of next year.
“Actions by Apple, Google or Mozilla could accomplish this. Ultimately, the only way for organizations to eliminate this external risk is total visibility, comprehensive intelligence and complete automation for TLS machine identities.”
Digital keys and certificates act as machine identities
They control the flow of sensitive data to trusted machines in a wide range of security and operational systems.
Enterprises rely on machine identities to connect and encrypt over 330 million internet domains, over 1.8 billion websites and countless applications. When these certificates expire unexpectedly, the machines or applications they identify will cease to communicate with other machines, shutting down critical business processes.
Unfortunately, eliminating certificate-related outages within complex, multitiered architectures can be challenging. Ownership and control of these certificates often reside in different parts of the organization, with certificates sometimes shared across multiple layers of infrastructure.
These problems are exacerbated by the fact that most organizations have certificate renewal processes that are prone to human error. When combined, these factors make outage prevention a complex process that is made much more difficult by shorter certificate lifetimes.
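The expiry-monitoring piece of that automation is small; the hard part the article describes is running it at machine scale and speed. As an illustrative sketch (not any vendor's tooling), the following uses Python's standard `ssl` module to flag certificates whose `notAfter` dates fall inside a renewal window; the hostnames and 30-day threshold are hypothetical:

```python
# Sketch: flag certificates nearing expiry so renewals can be
# automated rather than handled ad hoc. Assumes notAfter strings
# in the format used by Python's ssl module, e.g. "Jun  1 12:00:00 2030 GMT".
import ssl
import time

def days_until_expiry(not_after, now=None):
    """Days remaining until a certificate's notAfter timestamp."""
    expiry = ssl.cert_time_to_seconds(not_after)
    now = time.time() if now is None else now
    return (expiry - now) / 86400

def expiring_soon(certs, threshold_days=30):
    """Return hostnames whose certificates expire within the threshold."""
    return [host for host, not_after in certs.items()
            if days_until_expiry(not_after) < threshold_days]
```

A real deployment would pull `notAfter` values from a certificate inventory or live endpoints and trigger renewal automatically; shorter lifespans simply mean this loop must run far more often.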
The Network Computing, Communications and Storage research group at Aarhus University has developed a completely new way to compress data. The new technique makes it possible to analyze data directly in compressed files, and it may have a major impact on the so-called “data tsunami” from massive amounts of IoT devices.
The method will now be further developed, and it will form the framework for an end-to-end solution to help scale-down the exponentially increasing volumes of data from IoT devices.
“Today, if you need just 1 Byte of data from a 100 MB compressed file, you usually have to decompress a significant part of the whole file to access the data. Our technology enables random access to the compressed data. It means that you can access 1 Byte of data at the cost of decompressing less than 100 Bytes, which is several orders of magnitude less than state-of-the-art technologies. This could have a huge impact on data accessibility, data processing speed and the cloud storage infrastructure,” says Associate Professor Qi Zhang from Aarhus University.
Compressed IoT data
The compression technique makes it feasible to compress IoT data (typically data in time series) in real time before the data is sent to the cloud. After this, the typical data analytics could be carried out directly on the compressed data. There is no need to decompress all the data or large amounts of it in order to carry out an analysis.
This could potentially alleviate the ever-increasing pressure on the communication and data storage infrastructure. The research group believes that the project’s results will serve as a foundation for the development of sustainable IoT solutions, and that it could have a profound impact on digitalization:
“Today, IoT data is constantly being streamed to the cloud, and as a consequence of the massive amounts of IoT devices deployed globally, an exponential data growth is expected. Conventionally, to allow fast, frequent data retrieval and analysis, it is preferable to store the data in an uncompressed form.
“The drawback here is the use of more storage space. If you keep the data in compressed form, however, it takes time to decompress the data before you can access and analyze it. Our project outcome has the potential not only to reduce data storage space but also to accelerate data analysis,” says Qi Zhang.
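The general idea of random access into compressed data can be illustrated with a textbook technique: compress fixed-size blocks independently and index them, so reading one byte only requires decompressing the block that contains it. This sketch is not the Aarhus group's algorithm; the 100-byte block size merely mirrors the article's example:

```python
# Generic illustration of random access into compressed data:
# compress independent fixed-size blocks so one byte can be read
# by decompressing only ~BLOCK_SIZE raw bytes, not the whole file.
# This is not the Aarhus group's method.
import zlib

BLOCK_SIZE = 100  # raw bytes per independently compressed block

def compress_blocks(data):
    """Split data into fixed-size blocks and compress each independently."""
    return [zlib.compress(data[i:i + BLOCK_SIZE])
            for i in range(0, len(data), BLOCK_SIZE)]

def read_byte(blocks, offset):
    """Read one byte by decompressing only the block that holds it."""
    block = zlib.decompress(blocks[offset // BLOCK_SIZE])
    return block[offset % BLOCK_SIZE]
```

Per-block compression trades some compression ratio for locality; schemes like the one described in the article aim to get the random access without giving up ratio or the ability to compute analytics on the compressed form.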
Asset tracking is one of the highest growth application segments for the Internet of Things (IoT). According to a report by ABI Research, asset tracking device shipments will see a 51% year-on-year device shipment growth rate through 2024.
Expanding LPWAN coverage, technological maturity, and the associated miniaturization of sophisticated devices are key to moving asset tracking from traditionally high-value markets to low-value high-volume markets, which will account for most of the tracker connection and shipment numbers.
“Hardware devices for the asset tracking market are primarily dominated by the need to balance power consumption, form factor, and device cost. Balance and compromise between these three must be achieved based on the use-case and are dictated by the business case and possible return on investment for the customer,” said Tancred Taylor, Research Analyst at ABI Research.
“As these constraints are marginalized by greater volumes of adoption, by emerging technologies like eSIM or System-on-Chip, and by increasingly low-power components and connectivity, so too will the limitations on the business case.”
OEMs diversifying their hardware offerings
Expanding technological and network foundations drive the number of use-cases, and OEMs are responding by diversifying their hardware offerings. Some OEMs such as CoreKinect, Particle, Mobilogix, or Starcom Systems are innovating in this space by taking a reference-architecture or modular approach to device design for personalized solutions. Others are going to market with off-the-shelf or vertically-focused devices for quickly scalable deployments – such as BeWhere, Roambee, Sony, or FFLY4U.
Early adoption of asset tracking was in the fleet, container, and logistics industries to provide basic data on the location and condition of assets in transit. The total addressable market for these industries remains extensive, particularly as the solutions trickle down from the largest enterprises to small- and medium-sized companies.
Increased device functionality combined with component miniaturization is key to driving the next generation of low-cost tracking devices. This will enable granular tracking at the pallet, package, or item level, and open new markets and device categories, such as disposable trackers. Emerson, Sensitech, CoreKinect, and Bayer are among companies driving innovation in this field.
Product innovation accompanied by variations in business models
Innovation among product offerings is accompanied by variations in business models and go-to-market approaches. Mobile Network Operators (MNOs) are playing a significant role in driving adoption through increased verticalization, with Verizon, AT&T, and Orange among those offering subscription models for end-to-end solutions – comprising device, connectivity, software, and managed service offerings.
This model is additionally gaining traction among OEMs, with Roambee an early adopter of a subscription-only model, and others such as Mobilogix following suit. This service-based model will gain additional traction as OEMs move down the value chain by developing in-house capabilities or partner networks to simplify the ecosystem and the customer’s solution.
“While there is extensive work to be done on the hardware side to make low-cost trackers that can be simply attached to any ‘thing’, many OEMs are shifting from a hardware-only model to a more consultative approach, assessing a customer’s requirements and delivering personalized end-to-end solutions. Flexibility, simplicity, and cost are crucial to gain enterprise traction,” Taylor concluded.