Researchers at the University of Rochester and Cornell University have taken an important step toward developing a communications network that exchanges information across long distances by using photons, massless particles of light that are key elements of quantum computing and quantum communications systems.
Each pillar serves as a location marker for a quantum state that can interact with photons. Credit: University of Rochester illustration / Michael Osadciw
The research team has designed a nanoscale node made out of magnetic and semiconducting materials that could interact with other nodes, using laser light to emit and accept photons.
The development of such a quantum network, designed to take advantage of the physical properties of light and matter described by quantum mechanics, promises faster, more efficient ways to communicate, compute, and detect objects and materials than the networks currently used for computing and communications.
The node consists of an array of pillars a mere 120 nanometers high. The pillars are part of a platform containing atomically thin layers of semiconductor and magnetic materials.
The array is engineered so that each pillar serves as a location marker for a quantum state that can interact with photons, and those photons can in turn interact with other locations across the device, as well as with similar arrays elsewhere.
This potential to connect quantum nodes across a remote network capitalizes on entanglement, a phenomenon of quantum mechanics in which, at its most basic, the properties of separate particles remain linked at the subatomic scale.
“This is the beginnings of having a kind of register, if you like, where different spatial locations can store information and interact with photons,” says Nick Vamivakas, professor of quantum optics and quantum physics at Rochester.
Toward ‘miniaturizing a quantum computer’
The project builds on work the Vamivakas Lab has conducted in recent years using tungsten diselenide (WSe2) in so-called Van der Waals heterostructures. That work uses layers of atomically thin materials on top of each other to create or capture single photons.
The new device uses a novel arrangement in which WSe2 is draped over the pillars above an underlying, highly reactive layer of chromium triiodide (CrI3). Where the atomically thin layers, each about 12 microns across, touch, the CrI3 imparts an electric charge to the WSe2, creating a “hole” alongside each of the pillars.
In quantum physics, a hole is characterized by the absence of an electron. Each positively charged hole also has a binary north/south magnetic property associated with it, so that each is also a nanomagnet.
When the device is bathed in laser light, further reactions occur, turning the nanomagnets into individual optically active spin arrays that emit and interact with photons. Whereas classical information processing deals in bits that have values of either 0 or 1, spin states can encode both 0 and 1 at the same time, expanding the possibilities for information processing.
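The 0-and-1-at-once behavior described above is just quantum superposition. As a loose numerical sketch (not the authors' physical model), a single spin qubit can be written as two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1:

```python
import math

# A qubit (here, a hole-spin state) is a unit vector of two complex
# amplitudes: alpha for |0> and beta for |1>. Measurement yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.
alpha = 1 / math.sqrt(2)
beta = 1 / math.sqrt(2)

p0 = abs(alpha) ** 2  # probability of reading out 0
p1 = abs(beta) ** 2   # probability of reading out 1

# A valid state is normalized: the probabilities sum to 1.
assert abs(p0 + p1 - 1) < 1e-12
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")
```

In this equal superposition the state carries both outcomes at once, which is what lets n qubits explore 2^n amplitudes rather than a single n-bit value.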
“Being able to control hole spin orientation using an ultrathin, 12-micron-wide layer of CrI3 replaces the need for external magnetic fields from gigantic magnetic coils akin to those used in MRI systems,” says lead author and graduate student Arunabh Mukherjee. “This will go a long way toward miniaturizing a quantum computer based on single hole spins.”
Still to come: Entanglement at a distance?
Two major challenges confronted the researchers in creating the device.
One was creating an inert environment in which to work with the highly reactive CrI3. This was where the collaboration with Cornell University came into play.
“They have a lot of expertise with the chromium triiodide and since we were working with that for the first time, we coordinated with them on that aspect of it,” Vamivakas says. For example, fabrication of the CrI3 was done in nitrogen-filled glove boxes to avoid oxygen and moisture degradation.
The other challenge was determining just the right configuration of pillars to ensure that the holes and spin valleys associated with each pillar could be properly registered to eventually link to other nodes.
And therein lies the next major challenge: finding a way to send photons long distances through an optical fiber to other nodes, while preserving their properties of entanglement.
“We haven’t yet engineered the device to promote that kind of behavior,” Vamivakas says. “That’s down the road.”
The race is on to build the world’s first reliable and truly useful quantum computer, and the finish line is closer than you might think – we might even reach it this decade. It’s an exciting prospect, particularly as these super-powerful machines offer huge potential to almost every industry, from drug development to electric-vehicle battery design.
But quantum computers also pose a big security problem. With exponentially higher processing power, they will be able to smash through the public-key encryption standards widely relied on today, threatening the security of all digital information and communication.
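The reason public-key encryption is exposed is that its security rests on problems like integer factoring, which Shor's algorithm solves efficiently on a large quantum computer. A toy illustration (deliberately tiny numbers, not real cryptography) of how an RSA-style private key falls out of the factors:

```python
# RSA-style keys are safe only because factoring n = p*q is infeasible
# classically. Real keys use primes hundreds of digits long.
p, q = 61, 53
n = p * q                # public modulus (3233)
phi = (p - 1) * (q - 1)  # requires knowing the factors p and q
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent, derived from the factorization

msg = 42
cipher = pow(msg, e, n)          # anyone can encrypt with the public (e, n)
assert pow(cipher, d, n) == msg  # only the holder of d can decrypt

# A quantum attacker who factors n = 3233 back into (61, 53) recomputes
# phi and d, and can read every message ever encrypted under this key.
```

This is also why "harvest now, decrypt later" works: ciphertexts recorded today can be opened the day the attacker can factor the modulus.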
While it’s tempting to brush it under the carpet as “tomorrow’s problem”, the reality of the situation is much more urgent. That’s because quantum computers don’t just pose a threat to tomorrow’s sensitive information: they’ll be able to decrypt data that has been encrypted in the past, that’s being encrypted in the present, and that will be encrypted in the future (if quantum-resistant algorithms are not used).
It’s why the NSA warned, as early as 2015, that we “must act now” to defuse the threat, and why the US National Institute of Standards and Technology (NIST) is racing to standardize new post-quantum cryptographic solutions, so businesses can get a trusted safety net in place before the threat materializes.
From aviation to pharma: The industries at risk
The harsh reality is that no one is immune to the quantum threat. Whether it’s a security service, pharmaceutical company or nuclear power station, any organization holding sensitive information or intellectual property that needs to be protected in the long term has to take the issue seriously.
The stakes are high. For governments, a quantum attack could mean a hostile state gains access to sensitive information, compromising state security or revealing secrets that undermine political stability. For pharmaceuticals, on the other hand, a quantum computer could allow competitors to gain access to valuable intellectual property, hijacking a drug that has been in costly development for years. (As we’re seeing in the race for a COVID-19 vaccine, this IP can sometimes have significant geopolitical importance.)
Hardware and software are also vulnerable to attack. Within an industry like aviation, a quantum-empowered hacker would have the ability to forge the signature of a software update, push that update to a specific engine part, and then use that to alter the operations of the aircraft. Medical devices like pacemakers would be vulnerable to the same kind of attack, as would connected cars whose software is regularly updated from the cloud.
Though the list of scenarios goes on, the good news is that companies can ready themselves for the quantum threat using technologies available today. Here’s how:
1. Start the conversation early
Begin by promoting quantum literacy within your business to ensure that executive teams understand the severity and immediacy of the security threat. Faced with competing priorities, they may otherwise struggle to understand why this issue deserves immediate attention and investment.
It’s your job to make sure they understand what they’re up against. Identify specific risks that could materialize for your business and industry – what would a quantum attack look like, and what consequences would you be facing if sensitive information were to be decrypted?
Paint a vivid picture of the possible scenarios and calculate the cost that each one would have for your business, so everyone knows what’s at stake. By doing so, you’ll start to build a compelling business case for upgrading your organization’s information security, rather than assuming that this will be immediately obvious.
2. Work out what you’ve got and what you still need
Do a full audit of every place within your business where you are using cryptography, and make sure you understand why that is. Surprisingly, many companies have no idea of all the encryption they currently have in place or why, because the layers of protection have been built up in a siloed fashion over many years.
What cryptographic standards are you relying on today? What data are you protecting, and where? Try to pinpoint where you might be vulnerable. If you’re storing sensitive information in cloud-based collaboration software, for example, that may rely on public key cryptography, so won’t be quantum-secure.
As part of this audit, don’t forget to identify the places where data is in transit. However well your data is protected, it’s vulnerable when moving from one place to another. Make sure you understand how data is moving within your business – where from and to – so you can create a plan that addresses these weak points.
It’s also vital that you think about what industry regulations or standards you need to comply with, and where these come into play across the areas of your business. For industries like healthcare or finance, for example, there’s an added layer of regulation when it comes to information security, while privacy laws like the GDPR and CCPA will apply if you hold personal information relating to European or Californian citizens.
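A crude but practical starting point for such an audit is to scan a codebase for cryptographic identifiers and imports. The sketch below is a hypothetical example; the pattern list names a few common libraries and algorithms and would need extending for a real inventory:

```python
import re
from pathlib import Path

# Example patterns only: quantum-vulnerable public-key algorithms and a
# couple of common Python crypto entry points.
PATTERNS = re.compile(
    r"\b(RSA|ECDSA|ECDH|Diffie.?Hellman|from cryptography|import ssl|openssl)\b",
    re.IGNORECASE,
)

def find_crypto_usage(root):
    """Return (file, line number, line) for every match under `root`."""
    hits = []
    for path in Path(root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        for lineno, line in enumerate(text.splitlines(), 1):
            if PATTERNS.search(line):
                hits.append((str(path), lineno, line.strip()))
    return hits
```

The output is a first-pass map of where cryptography lives in your code; configuration files, TLS termination points, and third-party services still have to be inventoried separately.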
3. Build a long-term strategy for enhanced security
Once you’ve got a full view of what sensitive data you hold, you can start planning your migration to a quantum-ready architecture. How flexible is your current security infrastructure? How crypto-agile are your cryptography solutions? In order to migrate to new technology, do you need to rewrite everything, or could you make some straightforward switches?
Post-quantum encryption standards will be finalized by NIST in the next year and a half, but the process is already underway, and the direction of travel is becoming clearer. Now that finalist algorithms have been announced, businesses don’t need to wait to get quantum-secure: they simply need to design their security infrastructure to work with any of the shortlisted approaches that NIST is currently considering for standardization.
Deploying a hybrid solution – pairing existing solutions with one of the post-quantum schemes named as a NIST finalist – can be a good way to build resilience and flexibility into your security architecture. By doing this, you’ll be able to comply with whichever new industry standards are announced and remain fully protected against present and future threats in the meantime.
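One common hybrid pattern is to derive the session key from both a classical shared secret and a post-quantum one, so the session stays safe unless an attacker breaks both schemes. A minimal sketch, with stand-in byte strings where the real ECDH exchange and lattice-based KEM calls would go:

```python
import hashlib
import secrets

def hybrid_key(classical_secret: bytes, pq_secret: bytes) -> bytes:
    """Derive one session key from BOTH secrets. Compromising only the
    classical scheme (or only the post-quantum one) reveals nothing."""
    return hashlib.sha3_256(classical_secret + pq_secret).digest()

# Stand-ins for secrets produced by, e.g., an ECDH exchange and a
# NIST-finalist KEM (the actual key-exchange calls are omitted here).
ecdh_secret = secrets.token_bytes(32)
kem_secret = secrets.token_bytes(32)

session_key = hybrid_key(ecdh_secret, kem_secret)
assert len(session_key) == 32
```

Production hybrids use a proper KDF with context labels rather than a bare hash, but the principle is the same: the combined key is at least as strong as the stronger of its two inputs.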
Whatever you decide, remember that migration can take time – especially if your business is already built on a complex infrastructure that will be hard to unpick and rebuild. Put a solid plan in place before you begin and consider partnering with an expert in the field to speed up the process.
A risk we can’t see
Just because a risk hasn’t yet materialized, doesn’t mean it isn’t worth preparing for (a mindset that could have come in handy for the coronavirus pandemic, all things considered…).
The quantum threat is serious, and it’s urgent. The good thing is that we already have all the ingredients to get a safety net in place, and thanks to strong mathematical foundations, we can be confident in the knowledge that the algorithms being standardized by NIST will protect businesses from even the most powerful computers.
The next step? Making sure this cutting-edge technology gets out of the lab and into the hands of the organizations who need it most.
Researchers from the University of Ottawa, in collaboration with Ben-Gurion University of the Negev and Bar-Ilan University scientists, have been able to create optical framed knots in the laboratory that could potentially be applied in modern technologies.
Top view of the framed knots generated in this work
Their work opens the door to new methods of distributing secret cryptographic keys – used to encrypt and decrypt data, ensure secure communication and protect private information.
“This is fundamentally important, in particular from a topology-focused perspective, since framed knots provide a platform for topological quantum computations,” explained senior author, Professor Ebrahim Karimi, Canada Research Chair in Structured Light at the University of Ottawa.
“In addition, we used these non-trivial optical structures as information carriers and developed a security protocol for classical communication where information is encoded within these framed knots.”
The concept of framed knots
The researchers suggest a simple do-it-yourself lesson to help us better understand framed knots, three-dimensional objects that can also be described as surfaces.
“Take a narrow strip of paper and try to make a knot,” said first author Hugo Larocque, uOttawa alumnus and current PhD student at MIT.
“The resulting object is referred to as a framed knot and has very interesting and important mathematical features.”
The group tried to achieve the same result within an optical beam, which presents a higher level of difficulty. After a few tries (and knots that looked more like knotted strings), the group achieved what they were looking for: the knotted ribbon structure that defines a framed knot.
“In order to add this ribbon, our group relied on beam-shaping techniques manipulating the vectorial nature of light,” explained Hugo Larocque. “By modifying the oscillation direction of the light field along an “unframed” optical knot, we were able to assign a frame to the latter by “gluing” together the lines traced out by these oscillating fields.”
According to the researchers, structured light beams are being widely exploited for encoding and distributing information.
“So far, these applications have been limited to physical quantities which can be recognized by observing the beam at a given position,” said uOttawa Postdoctoral Fellow and co-author of this study, Dr. Alessio D’Errico.
“Our work shows that the number of twists in the ribbon orientation in conjunction with prime number factorization can be used to extract a so-called “braid representation” of the knot.”
“The structural features of these objects can be used to specify quantum information processing programs,” added Hugo Larocque. “In a situation where this program would want to be kept secret while disseminating it between various parties, one would need a means of encrypting this “braid” and later deciphering it.
“Our work addresses this issue by proposing to use our optical framed knot as an encryption object for these programs which can later be recovered by the braid extraction method that we also introduced.”
“For the first time, these complicated 3D structures have been exploited to develop new methods for the distribution of secret cryptographic keys. Moreover, there is a wide and strong interest in exploiting topological concepts in quantum computation, communication and dissipation-free electronics. Knots are described by specific topological properties too, which were not considered so far for cryptographic protocols.”
“Current technologies give us the possibility to manipulate, with high accuracy, the different features characterizing a light beam, such as intensity, phase, wavelength and polarization,” said Larocque.
“This makes it possible to encode and decode information with all-optical methods. Quantum and classical cryptographic protocols have been devised exploiting these different degrees of freedom.”
“Our work opens the way to the use of more complex topological structures hidden in the propagation of a laser beam for distributing secret cryptographic keys.”
“Moreover, the experimental and theoretical techniques we developed may help find new experimental approaches to topological quantum computation, which promises to surpass noise-related issues in current quantum computing technologies,” added Dr. Ebrahim Karimi.
Researchers from CSIRO’s Data61 and the Monash Blockchain Technology Centre have developed the world’s most efficient blockchain protocol that is both secure against quantum computers and protects the privacy of its users and their transactions.
The technology can be applied beyond cryptocurrencies, in areas such as digital health, banking, finance and government services, as well as services which may require accountability to prevent illegal use.
The protocol — a set of rules governing how a blockchain network operates — is called MatRiCT.
Cryptocurrencies vulnerable to attacks by quantum computers
The cryptocurrency market is currently valued at more than $325 billion, with an average of approximately $50 billion traded daily over the past year.
However, blockchain-based cryptocurrencies like Bitcoin and Ethereum are vulnerable to attacks by quantum computers, which are capable of performing complex calculations and processing substantial amounts of data to break blockchains, in significantly faster times than current computers.
“Quantum computing can compromise the signatures or keys used to authenticate transactions, as well as the integrity of blockchains themselves,” said Dr Muhammed Esgin, lead researcher at Monash University and Data61’s Distributed Systems Security Group. “Once this occurs, the underlying cryptocurrency could be altered, leading to theft, double spend or forgery, and users’ privacy may be jeopardised.
“Existing cryptocurrencies tend to either be quantum-safe or privacy-preserving, but for the first time our new protocol achieves both in a practical and deployable way.”
The MatRiCT protocol is based on hard lattice problems, which are quantum secure, and introduces three new key features: the shortest quantum-secure ring signature scheme to date, which authenticates activity and transactions using only the signature; a zero-knowledge proof method, which hides sensitive transaction information; and an auditability function, which could help prevent illegal cryptocurrency use.
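MatRiCT's zero-knowledge machinery operates over hidden commitments to transaction values. As a loose illustration of the hiding-and-binding idea only (far simpler than the lattice-based constructions the protocol actually uses), a hash commitment conceals an amount while preventing the committer from changing it later:

```python
import hashlib
import secrets

def commit(amount: int, blinding: bytes) -> bytes:
    # Hiding: without `blinding`, the digest reveals nothing about `amount`.
    # Binding: the committer cannot later open it to a different amount.
    return hashlib.sha256(amount.to_bytes(8, "big") + blinding).digest()

blinding = secrets.token_bytes(32)
c = commit(100, blinding)  # published on-chain instead of the raw amount

# Later "opening": a verifier checks the revealed values match the commitment.
assert c == commit(100, blinding)
assert c != commit(101, blinding)
```

Confidential-transaction schemes additionally prove in zero knowledge that the hidden amounts balance and are non-negative, which is where the heavy cryptography lives.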
Blockchain challenged by speed and energy consumption
Speed and energy consumption are significant challenges presented by blockchain technologies which can lead to inefficiencies and increased costs.
“The protocol is designed to address the inefficiencies in previous blockchain protocols, such as complex authentication procedures, thereby speeding up calculations and using less energy, leading to significant cost savings,” said Dr Ron Steinfeld, associate professor, co-author of the research and a quantum-safe cryptography expert at Monash University.
“Our new protocol is significantly faster and more efficient, as the identity signatures and proof required when conducting transactions are the shortest to date, thereby requiring less data communication, speeding up the transaction processing time, and reducing the amount of energy required to complete transactions.”
“Hcash will be incorporating the protocol into its own systems, transforming its existing cryptocurrency, HyperCash, into one that is both quantum safe and privacy protecting,” said Dr Joseph Liu, associate professor, Director of Monash Blockchain Technology Centre and HCash Chief Scientist.
The world is one step closer to having a totally secure internet and an answer to the growing threat of cyber-attacks, thanks to a team of international scientists who have created a multi-user quantum communication network which could transform how we communicate online.
The invention led by the University of Bristol has the potential to serve millions of users, is understood to be the largest-ever quantum network of its kind, and could be used to secure people’s online communication, particularly in these internet-led times accelerated by the COVID-19 pandemic.
By deploying a new technique that harnesses the laws of physics, the network can make messages completely safe from interception, while also overcoming major challenges that have previously limited this little-used but much-hyped technology.
Lead author Dr Siddarth Joshi, who headed the project at the university’s Quantum Engineering Technology (QET) Labs, said: “This represents a massive breakthrough and makes the quantum internet a much more realistic proposition. Until now, building a quantum network has entailed huge cost, time, and resource, as well as often compromising on its security which defeats the whole purpose.”
“Our solution is scalable, relatively cheap and, most important of all, impregnable. That means it’s an exciting game changer and paves the way for much more rapid development and widespread rollout of this technology.”
Protecting the future internet
The current internet relies on complex codes to protect information, but hackers are increasingly adept at outsmarting such systems, leading to cyber-attacks around the world that cause major privacy breaches and fraud running into trillions of pounds annually. With such costs projected to rise dramatically, the case for finding an alternative is even more compelling, and quantum technology has for decades been hailed as the revolutionary replacement for standard encryption techniques.
So far physicists have developed a form of secure encryption, known as quantum key distribution, in which particles of light, called photons, are transmitted. The process allows two parties to share, without risk of interception, a secret key used to encrypt and decrypt information. But to date this technique has only been effective between two users.
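The key-sifting step at the heart of BB84-style quantum key distribution can be sketched classically. The simulation below is an idealized toy (no noise, no eavesdropper, and classical random bits standing in for photon measurements): the two parties keep only the bits where their randomly chosen measurement bases happened to match.

```python
import secrets

n = 256
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]  # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# With matching bases Bob reads Alice's bit exactly; with mismatched bases
# his outcome is random, and that position is discarded during sifting.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Basis reconciliation: both sides keep only the matched-basis positions.
alice_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
bob_key   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
assert alice_key == bob_key  # the sifted keys agree
```

In the real protocol an eavesdropper measuring the photons disturbs them, raising the error rate on the sifted key and revealing the interception, which is what makes the shared key secure.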
“Until now efforts to expand the network have involved vast infrastructure and a system which requires the creation of another transmitter and receiver for every additional user. Sharing messages in this way, known as trusted nodes, is just not good enough because it uses so much extra hardware which could leak and would no longer be totally secure,” Dr Joshi said.
How the multi-user quantum communication network works
The team’s quantum technique applies a seemingly magical principle, called entanglement, which Albert Einstein described as “spooky action at a distance.” It exploits the power of two different particles placed in separate locations, potentially thousands of miles apart, to simultaneously mimic each other. This process presents far greater opportunities for quantum computers, sensors, and information processing.
“Instead of having to replicate the whole communication system, this latest methodology, called multiplexing, splits the light particles, emitted by a single system, so they can be received by multiple users efficiently,” Dr Joshi said.
The team created a network for eight users using just eight receiver boxes, whereas the former method would require a separate transmitter and receiver for every pair of users, in this case 56 boxes. As user numbers grow, the logistics become increasingly unviable: 100 users would require 9,900 receiver boxes.
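The scaling quoted above corresponds to every ordered pair of users needing its own transmitter-receiver hardware, i.e. n(n−1) boxes for n users, versus one receiver box per user under multiplexing:

```python
def trusted_node_boxes(n_users: int) -> int:
    # One transmitter/receiver per ordered pair of users: n * (n - 1).
    return n_users * (n_users - 1)

def multiplexed_boxes(n_users: int) -> int:
    # One receiver box per user; a single entangled-photon source is shared.
    return n_users

assert trusted_node_boxes(8) == 56      # figure quoted for 8 users
assert trusted_node_boxes(100) == 9900  # figure quoted for 100 users
assert multiplexed_boxes(8) == 8
```

The quadratic-versus-linear gap is what makes the multiplexed approach viable at the scale of thousands or millions of users.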
To demonstrate its functionality across distance, the receiver boxes were connected to optical fibres via different locations across Bristol and the ability to transmit messages via quantum communication was tested using the city’s existing optical fibre network.
“Besides being completely secure, the beauty of this new technique is its streamlined agility: it requires minimal hardware because it integrates with existing technology,” Dr Joshi said.
The team’s unique system also features traffic management, delivering better network control which allows, for instance, certain users to be prioritised with a faster connection.
Saving time and money
Whereas previous quantum systems have taken years to build, at a cost of millions or even billions of pounds, this network was created within months for less than £300,000. The financial advantages grow as the network expands, so while 100 users on previous quantum systems might cost in the region of £5 billion, Dr Joshi believes multiplexing technology could slash that to around £4.5 million, less than 1 per cent of the cost.
In recent years quantum cryptography has been successfully used to protect transactions between banking centres in China and secure votes at a Swiss election. Yet its wider application has been held back by the sheer scale of resources and costs involved.
“With these economies of scale, the prospect of a quantum internet for universal usage is much less far-fetched. We have proved the concept and by further refining our multiplexing methods to optimise and share resources in the network, we could be looking at serving not just hundreds or thousands, but potentially millions of users in the not too distant future,” Dr Joshi said.
“The ramifications of the COVID-19 pandemic have not only shown importance and potential of the internet, and our growing dependence on it, but also how its absolute security is paramount. Multiplexing entanglement could hold the vital key to making this security a much-needed reality.”
Collaborating institutions with the University of Bristol are the University of Leeds, Croatia’s Ruder Boskovic Institute (RBI) in Zagreb, Austria’s Institute for Quantum Optics and Quantum Information (IQOQI), in Vienna, and China’s National University of Defence Technology (NUDT) in Changsha.
Two UCLA computer scientists have shown that existing compilers, which tell quantum computers how to use their circuits to execute quantum programs, inhibit the computers’ ability to achieve optimal performance.
Specifically, their research has revealed that improving quantum compilation design could help achieve computation speeds up to 45 times faster than currently demonstrated.
Better quantum computer performance
The computer scientists created a family of benchmark quantum circuits with known optimal depths or sizes. In computer design, the smaller the circuit depth, the faster a computation can be completed.
Smaller circuits also imply more computation can be packed into the existing quantum computer. Quantum computer designers could use these benchmarks to improve design tools that could then find the best circuit design.
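Circuit depth is the number of sequential gate layers: gates on disjoint qubits can run in parallel, while gates sharing a qubit must run one after another. A minimal sketch (not the QUEKO benchmarks themselves) of computing depth for a gate list:

```python
def circuit_depth(gates):
    """Depth = longest chain of gates that must execute sequentially.
    Each gate is a tuple of the qubit indices it acts on."""
    finish = {}  # qubit index -> layer at which that qubit last becomes free
    depth = 0
    for qubits in gates:
        # This gate starts one layer after the latest of its qubits is free.
        layer = 1 + max((finish.get(q, 0) for q in qubits), default=0)
        for q in qubits:
            finish[q] = layer
        depth = max(depth, layer)
    return depth

# Gates on disjoint qubits share a layer; chained gates stack up.
assert circuit_depth([(0, 1), (2, 3)]) == 1
assert circuit_depth([(0, 1), (1, 2), (2, 3)]) == 3
```

Since a device only stays coherent for a fixed number of layers, a compiler that halves the depth of the same program effectively doubles the computation that fits within the coherence window, which is the leverage the UCLA benchmarks are meant to expose.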
“We believe in the ‘measure, then improve’ methodology,” said lead researcher Jason Cong, a Distinguished Chancellor’s Professor of Computer Science at UCLA Samueli School of Engineering.
“Now that we have revealed the large optimality gap, we are on the way to develop better quantum compilation tools, and we hope the entire quantum research community will as well.”
Cong and graduate student Daniel (Bochen) Tan tested their benchmarks in four of the most used quantum compilation tools.
Tan and Cong have made the benchmarks, named QUEKO, open source and available on the software repository GitHub.
Many issues yet to be addressed
Quantum computers utilize quantum mechanics to perform many computations simultaneously, which has the potential to make them exponentially faster and more powerful than today’s best supercomputers. But many issues need to be addressed before these devices can move out of the research lab.
For example, due to the sensitive nature of how quantum circuits work, tiny environmental changes, such as small temperature fluctuations, can interfere with quantum computation. When that happens, the quantum circuits are called decoherent — which is to say they have lost the information once encoded in them.
“If we can consistently halve the circuit depth by better layout synthesis, we effectively double the time it takes for a quantum device to become decoherent,” Cong said.
“This compilation research could effectively extend that time, and it would be the equivalent to a huge advancement in experimental physics and electrical engineering,” Cong added. “So we expect these benchmarks to motivate both academia and the industry to develop better layout synthesis tools, which in turn will help drive advances in quantum computing.”
How it all started
Cong and his colleagues led a similar effort in the early 2000s to optimize integrated circuit design in classical computers. That research effectively pushed two generations of advances in computer processing speeds, using only optimized layout design, which shortened the distance between the transistors that comprise the circuit. This cost-efficient improvement was achieved without any other major investments in technological advances, such as physically shrinking the circuits themselves.
“Quantum processors in existence today are extremely limited by environmental interference, which puts severe restrictions on the length of computations that can be performed,” said Mark Gyure, executive director of the UCLA Center for Quantum Science and Engineering, who was not involved in this study.
“That’s why the recent research results from Professor Cong’s group are so important because they have shown that most implementations of quantum circuits to date are likely extremely inefficient and more optimally compiled circuits could enable much longer algorithms to be executed. This could result in today’s processors solving much more interesting problems than previously thought. That’s an extremely important advance for the field and incredibly exciting.”
The U.S. Department of Energy (DOE) unveiled a report that lays out a blueprint strategy for the development of a national quantum internet. It provides a pathway to ensure the development of the National Quantum Initiative Act, which was signed into law by President Trump in December of 2018.
Around the world, consensus is building that a system to communicate using quantum mechanics represents one of the most important technological frontiers of the 21st century. Scientists now believe that the construction of a prototype will be within reach over the next decade.
In February of this year, DOE National Laboratories, universities, and industry met to develop the blueprint strategy of a national quantum internet, laying out the essential research to be accomplished, describing the engineering and design barriers, and setting near-term goals.
“The Department of Energy is proud to play an instrumental role in the development of the national quantum internet,” said U.S. Secretary of Energy Dan Brouillette. “By constructing this new and emerging technology, the United States continues with its commitment to maintain and expand our quantum capabilities.”
DOE’s 17 National Laboratories will serve as the backbone of the coming quantum internet, which will rely on the laws of quantum mechanics to control and transmit information more securely than ever before. Currently in its initial stages of development, the quantum internet could become a secure communications network and have a profound impact on areas critical to science, industry, and national security.
Crucial steps toward building such an internet are already underway in the Chicago region, which has become one of the leading global hubs for quantum research. In February of this year, scientists from DOE’s Argonne National Laboratory in Lemont, Illinois, and the University of Chicago entangled photons across a 52-mile “quantum loop” in the Chicago suburbs, successfully establishing one of the longest land-based quantum networks in the nation. That network will soon be connected to DOE’s Fermilab in Batavia, Illinois, establishing a three-node, 80-mile testbed.
“Decades from now, when we look back to the beginnings of the quantum internet, we’ll be able to say that the original nexus points were here in Chicago—at Fermilab, Argonne, and the University of Chicago,” said Nigel Lockyer, director of Fermilab. “As part of an existing scientific ecosystem, the DOE National Laboratories are in the best position to facilitate this integration.”
A range of unique abilities
One of the hallmarks of quantum transmissions is that they are exceedingly difficult to eavesdrop on as information passes between locations. Scientists plan to use that trait to make virtually unhackable networks. Early adopters could include industries such as banking and health services, with applications for national security and aircraft communications. Eventually, the use of quantum networking technology in mobile phones could have broad impacts on the lives of individuals around the world.
Scientists are also exploring how the quantum internet could expedite the exchange of vast amounts of data. If the components can be combined and scaled, society may be at the cusp of a breakthrough in data communication, according to the report.
Finally, creating networks of ultra-sensitive quantum sensors could allow engineers to better monitor and predict earthquakes—a longtime and elusive goal—or to search for underground deposits of oil, gas, or minerals. Such sensors could also have applications in health care and imaging.
A multi-lab, multi-institution effort
Creating a full-fledged prototype of a quantum internet will require intense coordination among U.S. Federal agencies—including DOE, the National Science Foundation, the Department of Defense, the National Institute of Standards and Technology, the National Security Agency, and NASA—along with National Laboratories, academic institutions, and industry.
The report lays out crucial research objectives, including building and then integrating quantum networking devices, perpetuating and routing quantum information, and correcting errors. Then, to put the nationwide network into place, there are four key milestones: verify secure quantum protocols over existing fiber networks, send entangled information across campuses or cities, expand the networks between cities, and finally expand between states, using quantum “repeaters” to amplify signals.
“The foundation of quantum networks rests on our ability to precisely synthesize and manipulate matter at the atomic scale, including the control of single photons,” said David Awschalom, Liew Family Professor in Molecular Engineering at the University of Chicago’s Pritzker School of Molecular Engineering, senior scientist at Argonne National Laboratory, and director of the Chicago Quantum Exchange. “Our National Laboratories house world-class facilities to image materials with subatomic resolution and state-of-the-art supercomputers to model their behavior. These powerful resources are critical to accelerating progress in quantum information science and engineering, and to leading this rapidly evolving field in collaboration with academic and corporate partners.”
Other National Laboratories are also driving advances in quantum networking and related technologies. For example, Stony Brook University and Brookhaven National Laboratory, working with the DOE’s Energy Sciences Network headquartered at Lawrence Berkeley National Laboratory, have established an 80-mile quantum network testbed and are actively expanding it in New York State and at Oak Ridge and Los Alamos National Laboratories. Other research groups are focused on developing quantum cryptography systems to keep sensitive information highly secure.
The race to protect sensitive electronic information against the threat of quantum computers has entered the home stretch.
Post-quantum cryptography standard
After spending more than three years examining new approaches to encryption and data protection that could defeat an assault from a quantum computer, the National Institute of Standards and Technology (NIST) has winnowed the 69 submissions it initially received down to a final group of 15.
NIST has now begun the third round of public review. This “selection round” will help the agency decide on the small subset of these algorithms that will form the core of the first post-quantum cryptography standard.
“At the end of this round, we will choose some algorithms and standardize them,” said NIST mathematician Dustin Moody. “We intend to give people tools that are capable of protecting sensitive information for the foreseeable future, including after the advent of powerful quantum computers.”
“We request that cryptographic experts everywhere focus their attention on these last algorithms,” Moody said. “We want the algorithms we eventually select to be as strong as possible.”
Classical computers have many strengths, but issues remain
Classical computers have many strengths, but they find some problems intractable — such as quickly factoring large numbers. Current cryptographic systems exploit this difficulty to protect the details of online bank transactions and other sensitive information.
Quantum computers could solve many of these previously intractable problems easily, and while the technology remains in its infancy, it will be able to defeat many current cryptosystems as it matures.
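The asymmetry between multiplying and factoring can be seen in a toy sketch (the primes here are illustrative; real RSA moduli are 2048 bits or more):

```python
def smallest_factor(n):
    """Find the smallest prime factor of n by naive trial division."""
    if n % 2 == 0:
        return 2
    f = 3
    while f * f <= n:
        if n % f == 0:
            return f
        f += 2
    return n  # n itself is prime

# Multiplying two primes is instant...
p, q = 1000003, 1000033  # illustrative small primes
n = p * q

# ...but recovering p from n already takes roughly half a million trial
# divisions here. At real key sizes, every known classical algorithm
# becomes astronomically slow -- which is what RSA-style security relies
# on, and what Shor's algorithm on a mature quantum computer would undo.
assert smallest_factor(n) == p
```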
Because the future capabilities of quantum computers remain an open question, the NIST team has taken a variety of mathematical approaches to safeguard encryption. The previous round’s group of 26 candidate algorithms were built on ideas that largely fell into three different families of mathematical approaches.
“Of the 15 that made the cut, 12 are from these three families, with the remaining three algorithms based on other approaches,” Moody said. “It’s important for the eventual standard to offer multiple avenues to encryption, in case somebody manages to break one of them down the road.”
New standard to specify one or more quantum-resistant algorithms
Cryptographic algorithms protect information in many ways, for example by creating digital signatures that certify an electronic document’s authenticity.
The new standard will specify one or more quantum-resistant algorithms each for digital signatures, public-key encryption and the generation of cryptographic keys, augmenting those in FIPS 186-4, Special Publication (SP) 800-56A Revision 3 and SP 800-56B Revision 2, respectively.
For this third round, the organizers have taken the novel step of dividing the remaining candidate algorithms into two groups they call tracks. The first track contains the seven algorithms that appear to have the most promise.
“We’re calling these seven the finalists,” Moody said. “For the most part, they’re general-purpose algorithms that we think could find wide application and be ready to go after the third round.”
The eight alternate algorithms in the second track are those that either might need more time to mature or are tailored to more specific applications. The review process will continue after the third round ends, and eventually some of these second-track candidates could become part of the standard.
Future consideration of more recently developed ideas
Because all of the candidates still in play are essentially survivors from the initial group of submissions from 2016, there will also be future consideration of more recently developed ideas, Moody said.
“The likely outcome is that at the end of this third round, we will standardize one or two algorithms for encryption and key establishment, and one or two others for digital signatures,” he said.
“But by the time we are finished, the review process will have been going on for five or six years, and someone may have had a good idea in the interim. So we’ll find a way to look at newer approaches too.”
Because of potential delays due to the COVID-19 pandemic, the third round has a looser schedule than past rounds. Moody said the review period will last about a year, after which NIST will set a comment deadline a few months later.
Following this roughly 18-month period, NIST will plan to release the initial standard for quantum-resistant cryptography in 2022.
To achieve long-term data protection in today’s fast-changing and uncertain world, companies need the ability to respond quickly to unforeseen events. Threats like quantum computing are getting more real while cryptographic algorithms are subject to decay or compromise. Without the ability to identify, manage and replace vulnerable keys and certificates quickly and easily, companies are at risk.
So, what do we mean when we talk about crypto-agility? Fundamentally, you will have achieved crypto-agility when your security systems are able to rapidly deploy and update algorithms, cryptographic primitives, and other encryption mechanisms. Going a step further, it means you have achieved complete control over cryptographic mechanisms – your public key infrastructure (PKI) and associated processes – and can quickly make whatever changes are needed without intense manual effort.
The replacement of manual processes with automated ones is critical to keeping up with accelerating change. As computing power and security technologies continue to evolve at a faster and faster pace, your existing cryptographic infrastructure is destined to become obsolete in a few years unless you can keep it upgraded to the latest technologies. Notably, threats continue to evolve as well.
Moreover, as the world transforms to depend on digital systems more fully, we’ve embedded cryptography deeply into virtually every communication system in the world. It’s no longer possible for cryptography to remain isolated from other critical systems. The vast interdependent nature of modern systems makes it imperative that IT teams have the ability to respond quickly – or face the risk of major outages and disruption.
Cryptographic standards like RSA, ECC, and AES that are in broad use today are constantly being updated with more advanced versions. Eventually governing bodies like NIST get in the act and mandate the use of the latest standards, with browser and cloud providers often raising the bar as well. To avoid becoming non-compliant, you must have the ability to quickly upgrade all your systems that rely on deprecated cryptography.
A robust, cryptographically agile infrastructure also brings other long-term benefits and plays a critical role in preventing security breaches. Achieving crypto-agility will make your operations teams more efficient, and eliminate unnecessary costs such as consulting fees, temporary staff, fines, or remediation costs.
Breach scenarios can unfold when a bad actor gains admin access to a certificate authority (CA), for instance, and may or may not have issued certificates. This uncertainty means that certificates from the impacted CA can no longer be trusted and all certs from that CA must be revoked and re-issued. Without crypto-agility and a clear understanding of your potential exposure, you’re looking at a costly all-hands-on-deck response to track and update hundreds or thousands of certs. And, of course, anytime you have humans involved in security response, you’re opening yourself to human error and further compromise and outages.
Quantum computing keeps getting closer
The looming threat of quantum computing – some say we could see 100,000x faster quantum computers as soon as 2025 – represents another compelling reason to focus on improving your crypto-agility. While all crypto algorithms are breakable on paper, the incredible computing power required for such a feat does not currently exist. That could change with quantum computers, which one day will be able to break most existing algorithms and hash functions in minutes or hours.
To avoid the doomsday scenario where every system in the world is potentially exposed to compromise, work is already underway toward quantum-safe cryptography. However, given how little we know about quantum computing and the inability to perform real-world testing, it’s safe to assume there will be considerable give and take before quantum-safe algorithms are widely available.
In the meantime, your cryptography, certificate management and key distribution systems must be agile enough to adapt to this very real emerging threat. The table below presents a scenario of the time and expense involved with swapping out existing cryptography for quantum-safe cryptography. In this scenario, with incomplete or partial automation most enterprises would be looking at a 15-month vulnerability period compared to just six days when a fully automated solution has been put in place.
A comparison of quantum doomsday mitigation scenarios
Crypto-agility is a complex topic at scale and working towards it requires a multifaceted approach. Changes need to be made to security setups in organizational policy, operating methods, and core technology and processes. Your PKI may need to be upgraded and enhanced to support rapid swaps of cryptography, and software development procedures may need to be revamped to incorporate a nimbler approach to cryptography – as opposed to being bolted on top of finished software.
The first step toward true crypto-agility is to understand the extent of your cryptographic exposure. This is accomplished by tracking down every digital certificate deployed across the organization and capturing details including algorithms and their size, the type of hashing/signature, validity period, where it’s located and how it can be used.
Once you have a complete inventory, you’ll then need to identify the vulnerable certificates by the type of cryptography in use and look for anomalies and potential problems. These can include certificates that use wildcards or IP address, certificates located on unauthorized or unintended systems as well as certificates abandoned on deprecated systems.
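As a sketch of that triage step, a short script can flag inventory records by algorithm strength and naming anomalies (the field names, thresholds, and sample data below are illustrative, not a standard schema):

```python
# Signature algorithms and key sizes considered weak -- illustrative policy
WEAK_SIG_ALGS = {'sha1WithRSAEncryption', 'md5WithRSAEncryption'}
MIN_RSA_BITS = 2048

def flag_certificate(cert):
    """Return a list of concerns for one certificate inventory record."""
    issues = []
    if cert['signature_algorithm'] in WEAK_SIG_ALGS:
        issues.append('weak signature algorithm')
    if cert['key_type'] == 'RSA' and cert['key_bits'] < MIN_RSA_BITS:
        issues.append('undersized RSA key')
    if cert['common_name'].startswith('*.'):
        issues.append('wildcard certificate')
    return issues

# Two hypothetical inventory records
inventory = [
    {'common_name': '*.legacy.example', 'key_type': 'RSA', 'key_bits': 1024,
     'signature_algorithm': 'sha1WithRSAEncryption'},
    {'common_name': 'api.example', 'key_type': 'EC', 'key_bits': 256,
     'signature_algorithm': 'ecdsa-with-SHA256'},
]
flagged = {c['common_name']: flag_certificate(c) for c in inventory}
# The legacy wildcard cert trips all three checks; the EC cert trips none.
```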
Finding your certificates and vulnerabilities isn’t enough by itself to deliver crypto-agility – you’re still looking at the aforementioned 15-month-long process if you need to swap everything out manually.
Here are three pillars of crypto-agility that will put your organization on the right path toward withstanding whatever the future holds:
#1 – Automate discovery and reporting. At the push of a button, you should be able to produce a full report of all your cryptographic assets. This will allow you to quickly identify vulnerable cryptography and to report anomalies. There are any number of tools available to help you do this, but ideally certificate reporting should just be incorporated into an automated PKI solution.
#2 – Automate PKI operations at scale. The ideal solution here is a fully automated Certificate Management System (CMS) that will manage the entire lifecycle of a certificate from creation to renewal. When the CMS is used to create a certificate it should have all the data it needs to not only monitor the certificate for expiration but automatically provision a replacement certificate without human intervention.
#3 – Be nimble. At an organizational and management level, your IT organization, from DevOps through to day-to-day operations staff, needs to be ready for threats and change. You should carefully evaluate and rethink all aspects of your PKI to identify areas that may lock you into a particular vendor or technology.
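The heart of pillar #2 is a scheduled lifecycle check. A minimal sketch (field names and renewal policy are illustrative; a real CMS would also provision the replacement certificate automatically):

```python
from datetime import datetime, timedelta

RENEW_WINDOW = timedelta(days=30)  # illustrative renewal policy

def certs_due_for_renewal(inventory, now):
    """Return inventory records that expire within the renewal window.
    A full CMS would feed these into automatic re-issuance."""
    return [c for c in inventory if c['not_after'] - now <= RENEW_WINDOW]

# Hypothetical inventory snapshot
now = datetime(2025, 1, 1)
inventory = [
    {'common_name': 'api.example', 'not_after': datetime(2025, 1, 20)},
    {'common_name': 'db.internal', 'not_after': datetime(2026, 6, 1)},
]
due = certs_due_for_renewal(inventory, now)
assert [c['common_name'] for c in due] == ['api.example']
```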
The risk of having a slow-to-respond cryptographic infrastructure is increasing daily, not only as digital transformation increases our dependency on interconnected systems but as external threats and technology evolve at an increasing pace. Looming above it all is the threat of quantum computing. Put it all together and it’s clear that the time to automate your PKI and move toward crypto-agility is at hand.
Programming quantum computers is becoming easier: computer scientists at ETH Zurich have designed the first programming language that can be used to program quantum computers as simply, reliably and safely as classical computers.
“Programming quantum computers is still a challenge for researchers,” says Martin Vechev, computer science professor in ETH’s Secure, Reliable and Intelligent Systems Lab (SRI), “which is why I’m so excited that we can now continue ETH Zurich’s tradition in the development of quantum computers and programming languages.”
He adds: “Our quantum programming language Silq allows programmers to utilize the potential of quantum computers better than with existing languages, because the code is more compact, faster, more intuitive and easier to understand for programmers.”
Quantum computing has been seeing increased attention over the last decade, since these computers, which function according to the principles of quantum physics, have enormous potential.
Today, most researchers believe that these computers will one day be able to solve certain problems faster than classical computers, since to perform their calculations they use entangled quantum states in which various bits of information overlap at a certain point in time. This means that in the future, quantum computers will be able to efficiently solve problems which classical computers cannot solve within a reasonable timeframe.
This quantum supremacy has still to be proven conclusively. However, some significant technical advances have been achieved recently. In late summer 2019, a quantum computer succeeded in solving a problem – albeit a very specific one – more quickly than the fastest classical computer.
For certain “quantum algorithms”, i.e. computational strategies, it is also known that they are faster than classical algorithms, which do not exploit the potential of quantum computers. To date, however, these algorithms still cannot be calculated on existing quantum hardware because quantum computers are currently still too error-prone.
Expressing the programmer’s intent
Utilizing the potential of quantum computation not only requires the latest technology, but also a quantum programming language to describe quantum algorithms. In principle, an algorithm is a “recipe” for solving a problem; a programming language describes the algorithm so that a computer can perform the necessary calculations.
Today, quantum programming languages are tied closely to specific hardware; in other words, they describe precisely the behavior of the underlying circuits. For programmers, these “hardware description languages” are cumbersome and error-prone, since the individual programming instructions must be extremely detailed and thus explicitly describe the minutiae needed to implement quantum algorithms.
This is where Vechev and his group come in with their development of Silq. “Silq is the first quantum programming language that is not designed primarily around the construction and functionality of the hardware, but on the mindset of the programmers when they want to solve a problem – without requiring them to understand every detail of the computer architecture and implementation,” says Benjamin Bichsel, a doctoral student in Vechev’s group who is supervising the development of Silq.
Computer scientists refer to computer languages that abstract from the technical details of the specific type of computer as high-level programming languages. Silq is the very first high-level programming language for quantum computers.
High-level programming languages are more expressive, meaning that they can describe even complex tasks and algorithms with less code. This makes them more comprehensible and easier to use for programmers. They can also be used with different computer architectures.
Eliminating errors through automatic uncomputation
The greatest innovation and simplification that Silq brings to quantum programming languages concerns a source of errors that has plagued quantum programming until now. A computer calculates a task in several intermediate steps, which creates intermediate results or temporary values.
To free up memory, classical computers automatically erase these values. Computer scientists refer to this as “garbage collection”, since the superfluous temporary values are disposed of.
In the case of quantum computers, this disposal is trickier due to quantum entanglement: the previously calculated values can interact with the current ones, interfering with the correct calculation. Accordingly, cleaning up such temporary values on quantum computers requires a more advanced technique of so-called uncomputation.
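A toy classical simulation (Python, not Silq; the basis-state representation is purely illustrative) shows why a temporary value must be uncomputed rather than simply dropped:

```python
from math import sqrt

def toffoli(state, c1, c2, t):
    """Apply a Toffoli (CCNOT) gate to a state given as {bitstring: amplitude}:
    flip the target bit t whenever both control bits c1, c2 are 1."""
    out = {}
    for bits, amp in state.items():
        b = list(bits)
        if b[c1] == '1' and b[c2] == '1':
            b[t] = '0' if b[t] == '1' else '1'
        key = ''.join(b)
        out[key] = out.get(key, 0) + amp
    return out

# Qubits [a, b, ancilla]: start with (|00> + |11>)/sqrt(2) on a,b, ancilla |0>
state = {'000': 1 / sqrt(2), '110': 1 / sqrt(2)}

# "Compute": store a AND b in the ancilla -- the ancilla is now entangled
# with the register, so discarding it here would corrupt the computation.
state = toffoli(state, 0, 1, 2)
assert state == {'000': 1 / sqrt(2), '111': 1 / sqrt(2)}

# "Uncompute": Toffoli is its own inverse, so applying it again resets the
# ancilla to |0> in every branch, disentangling it before it is discarded.
state = toffoli(state, 0, 1, 2)
assert all(bits[2] == '0' for bits in state)
```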
“Silq is the first programming language that automatically identifies and erases values that are no longer needed,” explains Bichsel. The computer scientists achieved this by applying their knowledge of classical programming languages: their automatic uncomputation method uses only programming commands that are free of any special quantum operations – they are “qfree”, as Vechev and Bichsel say.
“Silq is a major breakthrough in terms of optimising the programming of quantum computers; it is not the final phase of development,” says Vechev. There are still many open questions, but because Silq is easier to understand, Vechev and Bichsel hope to stimulate both the further development of quantum programming languages and the theory and development of new quantum algorithms.
“Our team of four has made the breakthrough after two years of work thanks to the combination of different expertise in language design, quantum physics and implementation. If other research and development teams embrace our innovations, it will be a great success,” says Bichsel.
European organizations have a false sense of security when it comes to protecting themselves, with only 68% seeing themselves as vulnerable, down from 86% in 2018, according to Thales.
Problems with implementing security basics
This confidence flies in the face of the findings of the survey of 509 European executives which reveals 52% of organizations were breached or failed a compliance audit in 2019, raising concerns as to why 20% intend to reduce data security spend in the next year.
The findings come as workers across Europe are working from home due to COVID-19, often using personal devices which don’t have the built-in security office systems do, significantly increasing risk to sensitive data.
Across the board, companies are racing to digitally transform and move more applications and data to the cloud; 37% of European organizations stated they are aggressively disrupting the markets they participate in or embedding digital capabilities to enable greater enterprise agility.
A key aspect of this transformation is in the cloud becoming the leading data environment. 46% of all data stored by European organizations is now stored in the cloud, and with 43% of that data in the cloud being described as sensitive, it is essential that it is kept safe.
As more sensitive data is stored in cloud environments, however, data security risks increase. This is of particular concern given that 100% of businesses surveyed report that at least some of the sensitive data they are storing in the cloud is not encrypted.
Only 54% of sensitive data in the cloud is protected by encryption and even less (44%) is protected by tokenisation, highlighting the disconnect between the level of investment companies are making into cybersecurity and the increasing threats they face.
Multi-cloud adoption complicates data security
Despite the multitude of threats, businesses feel that the complexity (40%) of their environments is holding their data security capabilities back.
Multi-cloud adoption is the main driver of this complexity; 80% of businesses are using more than one IaaS (Infrastructure as a Service) vendor, whilst 29% have more than 50 SaaS (Software as a Service) applications to manage.
Businesses also identified a lack of budget (30%), staff to manage (28%) and organization buy-in/low priority (25%) as other top blockers.
“Businesses are continuing to race towards digital transformation and many are increasingly reliant on complex cloud environments, without taking a zero-trust approach. Data is more at risk than ever, whilst organizations are unwittingly creating the perfect storm for hackers by not implementing the security basics,” commented Rob Elliss, EMEA Vice President for Data Security solutions at Thales.
“Unfortunately, this will result in increasing problems, particularly in a world where working remotely will be part of the new-normal, unless companies can step up to the plate when it comes to keeping data safe.”
Quantum(fying) the problem
Whilst organizations continue to focus on the threats of today, many are starting to turn their attention to the peril that quantum computing’s acceleration of processing power could bring. In fact, 93% of respondents are concerned quantum computing will lead to exploits being created that could expose the sensitive data they hold.
What’s more, 69% of European organizations expect quantum computing to affect their cryptographic operations in the next five years.
As a result, most organizations are reacting, with 31% planning to offset quantum computing threats by switching away from static encryption or symmetric cryptography. Furthermore, a similar amount (30%) plans to implement key management that supports quantum-safe random number generation.
“It is clear that businesses are aware of evolving threats they face and it’s reassuring to see them acknowledging some of the key steps they need to take – including moving away from static encryption and implementing quantum-proof key management.
“It’s critical, though, that organizations don’t just look at threats years away, but invest in their cybersecurity processes now and see it as an integral part of their digital transformation,” Elliss concluded.
Large enterprises have a major problem when it comes to preparing for the advent of quantum computing: few, if any, have a working knowledge of all the locations where cryptographic keys are being stored and used across applications, browsers, platforms, files and modules, as well as being shared with third parties and vendors.
Enterprises with tens or hundreds of thousands of employees require a massive technology base including computers, mobile devices, operating systems, applications, data, and network resources to keep operations running smoothly. Cryptography in all of its various forms is broadly used to encrypt and protect sensitive information as it moves across this vast landscape of systems and devices. Exactly which algorithms and cryptography methods are being used is virtually unknowable without a concerted effort to track down and compile a comprehensive inventory of the literally hundreds of crypto assets in use across an enterprise.
Most enterprise IT managers and chief security officers are well-acquainted with tracking software assets as a way to improve security. A good understanding of software versions can help with ensuring that updates and patches are applied before the next big vulnerability is discovered and systems get compromised. There’s a sense of urgency around patching software as new flaws and data breaches get discovered on a nearly daily basis.
Crypto systems, in contrast, are often perceived to already be hardened and less vulnerable than software applications. Changes to cryptography systems tend to happen slowly, so there is less immediacy. Organizations often take years to upgrade their cryptography, as with migrations from SHA-1 to SHA-2 or Triple Data Encryption Standard (TDES) to Advanced Encryption Standard (AES).
The lack of urgency concerning cryptography is one of the most significant problems facing most enterprises as they consider what steps they should be taking to survive in a post-quantum world. With Y2K, for instance, the deadline to revamp systems with two-digit date codes was obvious. That’s not the case here – the timeline is anything but certain. It could happen in two or three years, in 10-15 years, or never. At the current rate of advancement, most experts expect that functional quantum computers capable of breaking current-grade cryptography such as RSA will arrive within the next 10 years. Maybe. Or maybe not.
Uncertainty is a deal breaker for driving urgency. When there are 50,000 fires to fight on a daily basis, enterprises don’t have time to think about a fire that someone tells them is going to happen sometime in the future. It’s a matter of human nature. We all continue living our lives knowing that someday the sun will explode and life on Earth as we know it will be over. We tell ourselves, “Sure, someday quantum computers will arrive on the scene and I’ll deal with it when the time comes, but I’m too busy to think about it right now.”
If guarding against the threat of quantum were a simple matter of using different algorithms, a wait-and-see attitude might be sufficient. In reality, in the 40 years that asymmetric encryption technology has been in use, there has never been a threat to cryptography of this scale. There will be massive upheaval and disruption.
A sweeping crypto transition like this will happen at Internet scale. Making the move to quantum-resistant algorithms will be a complex process for the entire industry, involving countless systems and devices, and it will require intense engagement with partners and third-party vendors. It will take time and patience.
Every enterprise is different and the only way to know how your organization will fare in a post-quantum world is to gain an understanding of what systems are doing cryptographic signing or encryption. The ultimate goal is a listing of all the applications, systems and devices across the organization and its subsidiaries, detailing the type of cryptography and algorithms in use. You’ll also want to evaluate exposure to attack, the sensitivity of information that is being protected, and whether there’s support for crypto agility to determine if the system will need to be replaced by something more agile. Such information is often not immediately obvious and may require special tools, expert-level sleuthing and discussions with vendors to figure out. Given a general lack of urgency toward quantum, few enterprises are likely to invest the necessary resources for a comprehensive cryptography audit.
Quantum readiness: Focus on business-critical systems first
A much more practical approach – and one that business leaders will more likely find acceptable – is to focus on understanding the exposure of your more important, business-critical set of applications. For example, if you’re a bank, what systems do you have that allow you to operate daily as a bank? You’re not going to care about an employee website that sells Disneyland tickets. If that were to be turned off tomorrow, it wouldn’t be a problem. By focusing on business-critical systems, you’ve just overcome a major obstacle to getting started toward quantum readiness.
Once you have the ball rolling and business-critical systems identified, now comes the task of tracking down where and how those systems are using signing or encryption. Is that SQL database sitting on the network using certificates? How do I know? There’s no magical tool that can run in an environment and tell you everything. You’ll need to look at network ports and look for certificates and even then, you’ll only find a small portion.
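Live TLS endpoints are one place that can at least be probed with standard tooling. A sketch using Python’s ssl module (the host name is hypothetical, and the captured fields are one possible inventory schema):

```python
import socket
import ssl

def summarize_cert(cert, protocol, cipher):
    """Reduce a parsed peer certificate plus connection info to inventory fields."""
    return {
        'subject': dict(x[0] for x in cert.get('subject', ())),
        'issuer': dict(x[0] for x in cert.get('issuer', ())),
        'not_after': cert.get('notAfter'),  # expiry, e.g. 'Jan  1 00:00:00 2031 GMT'
        'protocol': protocol,               # e.g. 'TLSv1.3'
        'cipher': cipher,                   # negotiated cipher suite name
    }

def inspect_tls_endpoint(host, port=443, timeout=5):
    """Connect to one endpoint and capture its certificate details."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return summarize_cert(tls.getpeercert(), tls.version(),
                                  tls.cipher()[0])

# e.g. inspect_tls_endpoint('db.internal.example')  # hypothetical host
```

Probing ports only surfaces certificates presented over the network; file-based keystores, code-signing keys, and embedded devices still require other discovery methods.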
If your company makes widgets, you’ll likely decide that the systems you use to make widgets are business critical. Is encryption or signing enabled? If so, what type of cryptographic keys are in use, and can they be upgraded? Is there something in the documentation, or will I need to have a conversation with the vendor? It’s also important not to overlook systems that may not be business critical per se but could expose the organization to considerable risk. The video conferencing system used to discuss quarterly earnings could be a prime target, for instance.
Improving crypto agility
Even if you aren’t sure about post-quantum impact, having a list of all the systems and algorithms in use is important for other security controls and standards, as well as for knowing where your risks are. So even if quantum supremacy is never realized, it’s still a good process to go through – it’s not wasted, nor is it only for the doomsayers.
What’s more, cryptographic algorithms are constantly evolving. Having a list of the type of cryptography in use makes it relatively simple to move to stronger algorithms as needed. Researchers are constantly looking for ways to crack encryption algorithms and sometimes they are successful, such as the discovery of a significant flaw that caused all major browser vendors to flag SHA-1 certificates as unsafe, finally putting that outdated algorithm to bed.
A good understanding of cryptography also puts you in a better position with vendors. As quantum-safe algorithms and methods are developed, you can put pressure on vendors to implement them within a reasonable time frame, or if they refuse, you can move to different vendors. And to some degree, time is of the essence. Even before a quantum computer capable of breaking encryption arrives, malicious actors are already starting to harvest encrypted data hoping they can one day unlock a veritable treasure trove.
Despite the uncertainty surrounding the arrival of quantum computing, sitting back and waiting for the sky to fall is a sure recipe for disaster. Avoid the worst-case scenario by at least documenting how your organization uses cryptography across all business-critical systems.
Improved AI capabilities, accelerated business intelligence, and increased productivity and efficiency were the top expectations of organizations currently investing in cloud-based quantum computing technologies, according to IDC.
Users are very optimistic
Initial survey findings indicate that while cloud-based quantum computing is a young market, and allocated funds for quantum computing initiatives are limited (0-2% of IT budgets), end-users are optimistic that early investment will result in a competitive advantage.
The manufacturing, financial services, and security industries are currently leading the way by experimenting with more potential use cases, developing advanced prototypes, and being further along in their implementation status.
Easy access to quantum computing
Complex technology, skillset limitations, lack of available resources, and cost deter some organizations from investing in quantum computing technology. These factors, combined with broad interdisciplinary interest, have pushed quantum computing vendors to develop technology that addresses multiple end-user needs and skill levels.
The result is increased availability of cloud-based quantum computing technology that is more easily accessible and user friendly for new end users. Currently, the preferred types of quantum computing technologies employed across industries include quantum algorithms, cloud-based quantum computing, quantum networks, and hybrid quantum computing.
“Quantum computing is the future industry and infrastructure disruptor for organizations looking to use large amounts of data, artificial intelligence, and machine learning to accelerate real-time business intelligence and innovate product development. Many organizations — from many industries — are already experimenting with its potential,” said Heather West, senior research analyst, Infrastructure Systems, Platforms, and Technology at IDC.
In anticipation of his keynote at HITB Security Conference 2020 in Amsterdam, we talked to Jon Callas, a world-renowned cryptographer, software engineer, UX designer, and entrepreneur.
Before joining the ACLU as senior technology fellow, he was at Apple, where he helped design the encryption system to protect data stored on a Mac. Jon also worked on security, UX, and crypto for Kroll-O’Gara, Counterpane, and Entrust. He has launched or worked on the launches of many tools designed to encrypt and secure personal data, including PGP, Silent Circle, Blackphone, DKIM, ZRTP, Skein, and Threefish.
You’ve been in the cybersecurity industry for a long time, taking on a variety of roles. What advice would you give to those just entering this industry? What pitfalls can they expect?
There are things that have been true for technical people for decades and will continue to be true.
Expertise gets common, gets automated, and then the people pushing buttons on the automated tool think they are experts; they might be. About half the things you know will be obsolete after five years, so you’ll have to learn new things and maybe pivot your career.
The best thing to work on is always something that excites you. Everyone does a good job on things they like and a bad job on things that bore them. When (not if) you need to make a change, it might take a couple of years. A once-in-a-lifetime opportunity will come to you every year or two. If you miss this one, there will be another. And yet, the right opportunity never comes at the perfect time.
Technology changes, people are the same. People will always be lazy. They’ll always forget things and lose things. Assume stupidity over malice. Build your systems so they take advantage of people’s flaws when you can, or at least won’t be destroyed when they don’t know and don’t care.
Year after year, data breach losses continue to rise. What is the cybersecurity industry doing wrong? There’s plenty of innovation, yet most organizations fail at basic security hygiene.
I think you’re hitting on the exact thing. It’s closely related to what we were talking about before — people are lazy, stupid, and don’t want to spend money. They will want to know why they need to buy a lock if no one has broken in.
A cybersecurity company will have a brilliant idea, and that brilliant idea will be a solution to some problem, and often prevention would have worked better. Meanwhile, it’s really hard to sell prevention both as a company and as a cybersecurity group. It’s hard to show metrics about what was prevented.
Thus we have a kind of evolutionary process here. The companies we see being successful are the ones selling things people want to buy. There are a lot of companies selling things people need but don’t want to buy, and those companies struggle.
That’s why what we see of the cybersecurity industry is not addressing these basic issues. And yet, the organizations that are failing are failing because they don’t want to do those basic things.
I snark that CISO stands for Chief Intrusion Scapegoat Officer. The CISO is the person that you fire because the bad thing they said was going to happen unless measures were improved really happened. It’s their fault that measures weren’t improved, right? I know security officers who have left their job because they weren’t being listened to and knew that the inevitable breach would be blamed on them.
What’s your take on the global privacy erosion brought on by large social networks?
I’m really glad to see policy reactions coming from that. I like GDPR. I like CCPA (the new California privacy act). No, they’re not perfect. As time goes on, likely we need to tweak or come up with interpretations of the gray areas in each, but they’re good. We need both policy and technology to protect us, along with privacy norms. We technical people tend to scoff, but norms work.
Today, most web sites use TLS, and that’s a norm; we expect that a site will use TLS, and that expectation is a norm. The technical backing for that new norm is that browsers changed from presenting a lock icon for TLS to flagging its absence as not secure.
How do you expect encryption technologies to evolve in the next decade? What would you like to see implemented/created?
I expect that we’ll see a number of things sorted out in choices for post-quantum public key crypto, while still talking about the eventuality of quantum computers. I expect we’ll still be waiting for homomorphic encryption to be efficient enough for the uses we’d like, as well as waiting for multiparty computation to speed up more. I expect we’re still going to have law enforcement wanting to get into encryption, as well.
In related fronts, I’m hoping we’ll have more verification like certificate and key transparency, formally verified implementations of important algorithms, and a number of interesting new protocols.
I think that the important thing for us all to remember is that encryption is a technology that implicitly rearranges power. It is implicitly political as well as personal. I think that this is why everyone finds it alluring.
More than half (54%) of cybersecurity professionals have expressed concerns that quantum computing will outpace the development of other security tech, according to research from Neustar.
Keeping a watchful eye on developments, 74% of organizations admitted to paying close attention to the technology’s evolution, with 21% already experimenting with their own quantum computing strategies.
A further 35% of experts claimed to be in the process of developing a quantum strategy, while just 16% said they were not yet thinking about it. This shift in focus comes as the vast majority (73%) of cyber security professionals expect advances in quantum computing to overcome legacy technologies, such as encryption, within the next five years.
Almost all respondents (93%) believe the next-generation computers will overwhelm existing security technology, with just 7% under the impression that true quantum supremacy will never happen.
The impact of quantum computing on existing security tech
Despite expressing concerns that other technologies will be overshadowed, 87% of CISOs, CSOs, CTOs and security directors are excited about the potential positive impact of quantum computing. The remaining 13% were more cautious and under the impression that the technology would create more harm than good.
“At the moment, we rely on encryption, which is possible to crack in theory, but impossible to crack in practice, precisely because it would take so long to do so, over timescales of trillions or even quadrillions of years,” said Rodney Joffe, Chairman of NISC and Security CTO at Neustar.
“Without the protective shield of encryption, a quantum computer in the hands of a malicious actor could launch a cyberattack unlike anything we’ve ever seen.”
“For both today’s major attacks, and also the small-scale, targeted threats that we are seeing more frequently, it is vital that IT professionals begin responding to quantum immediately.
“The security community has already launched a research effort into quantum-proof cryptography, but information professionals at every organization holding sensitive data should have quantum on their radar.
“Quantum computing’s ability to solve our great scientific and technological challenges will also be its ability to disrupt everything we know about computer security. Ultimately, IT experts of every stripe will need to work to rebuild the algorithms, strategies, and systems that form our approach to cybersecurity,” added Joffe.
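Joffe’s “trillions or even quadrillions of years” framing is easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a classical attacker brute-forcing a 128-bit symmetric key at a generously fast trillion guesses per second (both figures are illustrative assumptions):

```python
# Expected classical brute-force time for a 128-bit symmetric key.
keyspace = 2 ** 128                   # number of possible keys
guesses_per_second = 10 ** 12         # a very generous classical rate
seconds_per_year = 60 * 60 * 24 * 365
# On average, an attacker searches half the keyspace before finding the key.
expected_years = keyspace / 2 / guesses_per_second / seconds_per_year
print(f"{expected_years:.2e} years")  # on the order of 10**18 years
```

Note that Grover’s algorithm would only square-root this search, which is why symmetric ciphers with long keys are considered comparatively quantum-resistant; the sharper threat is Shor’s algorithm against public-key schemes such as RSA and elliptic-curve cryptography.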
Changes in the cybersecurity landscape
The report also highlighted a steep two-year increase on the International Cyber Benchmarks Index. Calculated based on changes in the cybersecurity landscape – including the impact of cyberattacks and changing level of threat – November 2019 saw the highest score yet at 28.2. In November 2017, the benchmark sat at just 10.1, demonstrating an 18-point increase over the last couple of years.
During September – October 2019, security professionals ranked system compromise as the greatest threat to their organizations (22%), with DDoS attacks and ransomware following very closely behind (21%).
Last month, Google claimed to have achieved quantum supremacy—the overblown name given to the step of proving quantum computers can deliver something that a classical computer can’t. That claim is still a bit controversial, so it may yet turn out that we need a better demonstration.
Independently of the claim, it’s notable that both Google and its critics at IBM have chosen the same type of hardware as the basis of their quantum computing efforts. So has a smaller competitor called Rigetti. All of which indicates that the quantum-computing landscape has sort of stabilized over the last decade. We are now in the position where we can pick some likely winners and some definite losers.
Why are you a loser?
But why did the winners win and the losers lose?