
Quantum Software

Programming the defining technology of the 21st century.

Quantum Software: Projects

Critically, determining the timeframe for a scientific breakthrough to be incorporated into large-scale infrastructure is extremely difficult. Estimates for when the impact of quantum computing will be felt in industry range from two to thirty years [4,5] and depend not just on the advantage offered by quantum computers but also on the complexity of deployment. This is made more challenging by the early stage of quantum computing development, and in particular by the fact that the industry lacks reliable tools for estimating the quantum resources required for application deployment.

 

Quantum computing advantage emerges not from brute-force power, but from subtle differences in information processing that can occur for key bottleneck subroutines. Unfortunately, for real-world, commercially relevant applications, the best-case estimates for a quantum advantage typically call for millions to billions of qubits, achievable within wildly varying timeframes of 5 to 30+ years [3,4]. As quantum processors are still in their infancy, quantum application development and performance benchmarking cannot be modeled through a robust prototyping and application development framework in the same way as in traditional computing. Instead, we must rely more heavily on theoretical analysis and software tools to project where a quantum advantage might be found.
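
To make this concrete, the toy calculation below sketches how such a projection might be set up: a hypothetical subroutine with a quadratic quantum speedup is weighed against the slower clock speed and error-correction overhead of a fault-tolerant machine. All constants are illustrative assumptions, not measured figures.

```python
# A minimal sketch (illustrative numbers only) of how a quantum speedup in a
# bottleneck subroutine is weighed against hardware overheads when projecting
# where an advantage might appear.

def classical_time(n, ops_per_sec=1e9):
    """Classical cost model: assume the bottleneck subroutine scales linearly in n."""
    return n / ops_per_sec

def quantum_time(n, logical_ops_per_sec=1e4, overhead=1e3):
    """Quantum cost model: quadratically fewer operations (Grover-like), but each
    logical operation is slower and carries a constant error-correction overhead.
    All constants here are assumptions, not measured values."""
    return (n ** 0.5) * overhead / logical_ops_per_sec

# Double the problem size until the quantum model first wins.
n = 1
while quantum_time(n) >= classical_time(n):
    n *= 2
print(f"Projected crossover at roughly n ~ {n:.0e}")
```

Even under these optimistic assumptions the crossover problem size is enormous, which is exactly why careful resource estimation and performance analysis matter.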

Quantum Algorithms and Complexity.

How do we currently solve complex problems in a variety of market sectors, and what are the bottlenecks in these algorithms that prevent us from getting the answers we want? Can quantum algorithms bypass these bottlenecks and push us into regimes where complex simulation is either possible or cost-efficient?

Error Correction and Fault-tolerance.

Large-scale quantum computation requires quantum error correction and fault-tolerant protocols in order to run on imperfect hardware.

 

"A quantum computer is a large error-correcting machine, computation is just a byproduct". 

- Andrew Steane  

Because so much of what a quantum computer does is error correction, we have to compile and optimise algorithms to reduce the ultimate size of the quantum chipsets needed to run a commercially or scientifically useful quantum algorithm.
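
As a rough illustration of why this matters, the sketch below uses the common surface-code scaling heuristic p_L ≈ 0.1 (p/p_th)^((d+1)/2), together with a coarse footprint of roughly 2d² physical qubits per logical qubit, to show how reducing the logical operation count of a compiled algorithm shrinks the required code distance and chip size. The constants are illustrative assumptions only; real resource estimates are far more detailed.

```python
# Rough sketch of why compilation matters: fewer logical operations means a
# higher tolerable per-operation error rate, a smaller code distance, and a
# smaller chip. Constants are illustrative, not a real resource estimate.

def required_distance(p_phys, logical_ops, p_th=1e-2, budget=0.01):
    """Smallest odd code distance d such that the whole computation
    (logical_ops operations) fails with probability below `budget`."""
    target_per_op = budget / logical_ops
    d = 3
    while 0.1 * (p_phys / p_th) ** ((d + 1) / 2) > target_per_op:
        d += 2
    return d

def physical_qubits(logical_qubits, d):
    """Very coarse footprint: ~2*d^2 physical qubits per logical qubit."""
    return logical_qubits * 2 * d ** 2

for ops in (1e12, 1e10):   # before and after a 100x compilation improvement
    d = required_distance(p_phys=1e-3, logical_ops=ops)
    print(f"{ops:.0e} logical ops -> distance {d}, "
          f"{physical_qubits(1000, d):,} physical qubits for 1,000 logical qubits")
```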

Programming and Languages.

What are the data structures and languages most suitable for a quantum computer? How do we verify computations that we cannot classically simulate? Is it even correct for quantum programming to be constantly analogised with classical programming? These are the questions being addressed by researchers at the forefront of quantum data structures and programming.

Dynamic control and feedback.

Quantum computers output a huge amount of data, and they similarly require a huge number of classical control signals as input. Processing this classical data and feeding it back into the quantum computer is not a trivial task, especially for quantum computers that operate at GHz speeds and sit in refrigerators at temperatures close to absolute zero.

A large outstanding question is: even if we can build a million- or ten-million-qubit chipset, will we be able to control it and process the classical data that it produces?
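
A back-of-envelope estimate makes the scale of the problem clear. The numbers below (chip size, syndrome-extraction rate, measured fraction of qubits) are illustrative assumptions, not specifications of any real device.

```python
# Back-of-envelope estimate (all numbers are assumptions) of the classical data
# a large error-corrected quantum processor would stream to its decoders.

physical_qubits = 1_000_000        # assumed chip size
syndrome_rate_hz = 1_000_000       # assumed syndrome-extraction cycles per second
bits_per_measurement = 1           # one measurement outcome per ancilla per cycle
ancilla_fraction = 0.5             # roughly half the qubits are measured each cycle

bits_per_second = physical_qubits * ancilla_fraction * syndrome_rate_hz * bits_per_measurement
print(f"~{bits_per_second / 8e9:.1f} GB/s of syndrome data to decode in real time")
```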

Circuit design and optimisation.

How do we convert quantum algorithms into viable circuits that can run on actual quantum computers, optimised with respect to any number of possible metrics? What compilation techniques ensure faithful translation of an abstract algorithm to an actual implementation, and how do we verify this?
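
As a toy example of one such compilation pass, the sketch below cancels adjacent self-inverse gates acting on the same qubits, a simple "peephole" optimisation. The circuit representation and gate set here are illustrative assumptions rather than any real compiler's internal format.

```python
# Toy peephole optimisation pass: cancel adjacent self-inverse gates
# (H, X, Z, CNOT) acting on the same qubits.

SELF_INVERSE = {"H", "X", "Z", "CNOT"}

def cancel_adjacent(circuit):
    """circuit: list of (gate_name, qubit_tuple). Returns an equivalent,
    shorter circuit with adjacent self-inverse pairs removed."""
    out = []
    for gate in circuit:
        if out and out[-1] == gate and gate[0] in SELF_INVERSE:
            out.pop()          # the pair cancels to the identity
        else:
            out.append(gate)
    return out

circuit = [("H", (0,)), ("H", (0,)), ("CNOT", (0, 1)),
           ("X", (1,)), ("X", (1,)), ("CNOT", (0, 1))]
print(cancel_adjacent(circuit))   # -> [] : the whole circuit is the identity
```

Real compilers chain many such passes (gate cancellation, commutation, routing, scheduling) and score the result against metrics such as gate count, depth or expected error.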

Quantum firmware and control software.

Quantum computers need to be controlled, and they need software to perform these control operations. This software can have a big influence on how well the qubits and qubit gate operations perform. Better control leads to fewer errors and a reduction in the total size of the chipset needed to perform a computation.

Quantum control is an integral part of the software development stack, and integrated solutions for a variety of quantum hardware platforms will need to be developed, tested, validated and ultimately deployed at scale.
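
As a small illustration of what this software produces, the sketch below generates a Gaussian pulse envelope of the kind an arbitrary-waveform generator might play to drive a single-qubit gate. The amplitude, duration and sample rate are placeholder assumptions, not calibrated settings for any particular platform.

```python
# Illustrative sketch of the kind of waveform control software generates:
# a Gaussian pulse envelope for a single-qubit gate. All parameters are
# placeholder values, not calibrated settings.

import math

def gaussian_envelope(duration_ns=20, sample_rate_gsps=1.0, sigma_ns=4, amplitude=0.5):
    """Return amplitude samples for an arbitrary-waveform generator."""
    n_samples = int(duration_ns * sample_rate_gsps)
    mid = (n_samples - 1) / 2
    sigma = sigma_ns * sample_rate_gsps
    return [amplitude * math.exp(-0.5 * ((i - mid) / sigma) ** 2) for i in range(n_samples)]

samples = gaussian_envelope()
print(len(samples), max(samples))   # 20 samples, peak near 0.5
```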

There are several commonly known applications of quantum computing, each with its own challenges and timeframes:

  • Cybersecurity: The security of the internet is underpinned by public key cryptography. Public key cryptography is based on mathematical functions that are difficult to compute but easy to verify. Essential to understanding the threat posed by quantum computers to cybersecurity is establishing the size of quantum computer capable of attacking modern public-key cryptosystems. Currently, the best estimate for attacking RSA 2048 with a quantum computer requires a device with approximately 20 million qubits. While this is a large number, it has fallen by a factor of 1,000 in 10 years due to advances in algorithms, compiling, and error correction. If this can be further improved by an order of magnitude through theoretical developments, RSA 2048 would be under threat within a decade, assuming IBM's projected manufacturing timelines. Due to the danger posed by “harvest now, decrypt later” attacks there is an urgent need to deploy “post-quantum” security schemes that cannot be exploited by quantum computers. The US is pioneering the transition to quantum-safe cryptography via a presidential executive order and standardising post-quantum cryptography protocols. Australia must follow the US lead and set the requirements to update security protocols for critical systems and data.

  • Simulation: Quantum simulation is a set of algorithms for simulating the dynamics of quantum systems, which become classically intractable as the system size grows. Quantum simulation is also a foundation of quantum speedups for quantum chemistry, materials science and even quantum linear algebra. However, simulating practical problems that are unambiguously beyond the reach of classical computers requires quantum computers capable of full-scale error correction, with more than 2,000,000 physical qubits for problems of current interest in quantum chemistry. Despite the required resources, it is likely that quantum simulation problems will be the first to demonstrate the utility of quantum computation for industry-relevant problems at large scale, and there is some hope that some advantage over classical computers can be obtained without fault-tolerant devices.

  • Optimisation: Optimisation algorithms are utilised in almost all computational problems. Even seemingly distinct problems in machine learning and AI or computational chemistry are in practice solved through optimisation. From the point of view of computer science, most of these problems are NP-hard, i.e. not solvable in general by computers (quantum or classical) at scale. However, real-world scenarios tend to have additional structure that allows for practical solutions in many cases. Such applications are often developed through competitions such as the grid optimisation competition.

Given the potential impact of even mild improvements to optimisation algorithms, there has been a focus on quantum algorithms for optimisation since the 1990s. Current research indicates potential “quadratic” scaling improvements for quantum computers, a middling but nonetheless meaningful improvement. However, a recent study showed that these approaches are unlikely to produce an impactful quantum advantage before we have quantum computers consisting of well over one million qubits, which is not likely to occur for at least a decade and probably longer. Further research using the more advanced tools currently under development at QSI could lead to more reliable analyses or even superior quantum advantages that would reduce the timescale to impact.

  • Machine Learning and AI: Processing big data has been identified as a potential application for quantum computers, and there are data-processing tasks that can be performed more efficiently on a quantum computer. However, the input and output of the data remains a bottleneck for quantum computers, erasing most known quantum speedups for machine learning and AI. While quantum advantage in terms of computational time is not expected in the foreseeable future, there are examples where a quantum computer needs exponentially less data for learning. This is most pronounced when learning from quantum data, such as quantum states produced by multi-qubit quantum sensors.

  • Quantum benchmarking and performance analytics is a rapidly expanding field that both assesses the utility of quantum computing for simulation and builds new techniques to reduce the physical resources needed to realise these applications. Both the United States and Europe have initiated region-wide programs to build out the tools and capabilities to accelerate the ability of quantum computers to achieve advantage across all of these broad applications.

  • Quantum Key Distribution: Quantum Key Distribution (QKD) is often cited as the quantum solution to the vulnerabilities posed by quantum computers to cryptographic systems. By replacing computational hardness assumptions with a physical system based on quantum mechanics, we can guarantee, by the laws of physics, a method for distributing random but correlated bit-strings between two users that can be used to establish a secret key. Quantum Key Distribution remains the only commercially viable quantum protocol - in communications or computation - that is available today. However, the limited technological capability of QKD systems compared to ‘adequate’ non-quantum solutions that are also available has restricted the uptake of commercial QKD outside of a few select government and private sector actors. The notable exception to this is the investment by the Chinese Government in an entirely new, quantum-secured backbone communications system based upon dedicated fibre-optic quantum networks and quantum-enabled satellite systems.


From the software and theory perspective, we can see that the following key technologies will need to be developed alongside the evolution of core quantum technologies for the value to be realized:

  • Post-quantum cryptography. Given the proven potential for quantum computers to one day break certain existing cryptosystems like RSA, it is critical to develop cryptosystems that are impervious to attack by quantum computers. This field of research, known as post-quantum cryptography, should be continuously developed alongside quantum technologies.

  • Techniques in algorithmic benchmarking and testing. The establishment of benchmarks for algorithm development has been a key driver of success in operations research and optimization over the last few decades. The creation of algorithm benchmarks that integrate with quantum technologies, utilizing best practice in classical complexity theory and algorithms research, will be key for recognizing the utility of quantum information processing technologies.

  • Formal data and programming structures for quantum information processing. Current programming and data structures for quantum information technologies are built for NISQ hardware, not quantum computers that are capable of solving large-scale problems. In order to recognize the utility of quantum computers and networks, data structures and programming languages need to be developed to allow for efficient integration with classical programming languages and computing systems.

  • Compilation tools integrating quantum and classical systems. Programming languages and software tools that enable high-level programming languages to deploy subroutines on classical and quantum information processing hardware will be essential for developing and assessing high-impact quantum technologies. Today’s computers utilize sophisticated virtual layers that transition code from a compiler, to an optimized intermediate representation, to an instruction set, to a processor. In future quantum devices this will be made more complicated by the interplay between quantum and classical processor characteristics (see the sketch after this list).

  • A formal and scalable structure for quantum diagnostics to aid in de-bugging and software verification. While much research is being conducted into quantum error correction to handle hardware errors in quantum information processing systems, these devices will also need software diagnostics and software verification tools. Classical model checking and verification tools will need to be incorporated into the software systems that drive quantum devices to enable debugging tools comparable to the standards expected in today’s classical computing systems.

  • A basic level protocol stack for general purpose quantum communications. As quantum communication devices are integrated into current networks, protocols will need to be developed to allow for their efficient and secure use. As the complexity of deployed quantum devices increases, new protocols that can best leverage their capabilities will have to be created.

  • Hardware system certification protocols and automated testing protocols. As quantum devices become more sophisticated, the problem of certifying their quality and verifying that they are working as planned becomes extremely difficult. The same features that make quantum computers powerful also make them difficult to characterize and certify. New software-based methods to allow for the certification, verification, and calibration of devices at both the quantum level and via their integration with classical systems will need to be deployed.

  • Standardisation across both computation and communications. The development of standards across the entire technology stack in quantum computation and communications will enable competition in the development of the building blocks of these technologies. 
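
To illustrate the compilation pipeline mentioned above, the sketch below lowers a single high-level operation through a gate-level intermediate representation down to hardware-level instruction strings. Every layer, name and instruction here is a hypothetical assumption, used purely to show the shape of the pipeline rather than any real toolchain.

```python
# Minimal sketch (all names and layers are illustrative assumptions) of lowering
# a high-level quantum operation through an intermediate representation (IR)
# down to hardware-level instructions: compiler -> IR -> instruction set.

def to_ir(op):
    """High-level op -> gate-level IR. Here 'bell(a, b)' becomes H + CNOT."""
    if op[0] == "bell":
        a, b = op[1], op[2]
        return [("H", a), ("CNOT", a, b)]
    raise ValueError(f"unknown op: {op}")

def to_instructions(ir):
    """Gate-level IR -> made-up hardware instruction strings."""
    table = {"H": "PULSE_H q{}", "CNOT": "PULSE_CX q{} q{}"}
    return [table[g[0]].format(*g[1:]) for g in ir]

ir = to_ir(("bell", 0, 1))
print(to_instructions(ir))   # ['PULSE_H q0', 'PULSE_CX q0 q1']
```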

While there is a significant amount of excellent work throughout the quantum community related to algorithm development, analysis, optimisation and implementation, the sudden push of quantum computing out of the academic sector and into the commercial sector has created a mismatch between claims about what these machines can do and what is actually backed up by solid theoretical and/or experimental work. This has caused extensive confusion that is beginning to hamstring real development of both experimental quantum computing systems and theoretical work on when, and if, quantum computers can provide a real benefit to a specific problem or set of problems.

 

In the classical world, the field of computer benchmarking was designed to help assess both the hardware and software aspects of computing. This could mean confirming that the clock speed of a newly designed microprocessor is consistent with what is advertised, or subjecting a computer program to a series of tests to check that it is free of bugs, or that it performs to the level expected by the developer or a potential client. These areas of classical computer science are essentially non-existent in the quantum world.

 

Potential solutions to this problem are being investigated, and researchers are developing an initial structural framework for a subfield that has been dubbed Quantum Resource Estimation (QRE) and performance analytics. This framework sits between the quantum software and quantum hardware communities and acts to link these two areas of research together, providing continuity that will allow researchers and developers to quantify and optimise the hardware requirements of their programs, and allowing quantum hardware developers to understand the micro-architectural structures needed for high-impact applications.
