For the first time, scientists at Paderborn University have used large-scale high-performance computing (HPC) to analyse a quantum photonics experiment. Specifically, they performed the tomographic reconstruction of experimental data from a quantum detector, a device that measures individual photons (light particles). To achieve this, the researchers developed new HPC software. Their findings have now been published in the journal Quantum Science and Technology.
Quantum tomography on a megascale photonic quantum detector
High-resolution photon detectors are increasingly being used in quantum research. Characterising these devices precisely is crucial if they are to be used effectively for measurement purposes, and doing so has thus far been a challenge: it involves huge volumes of data that must be analysed without neglecting their quantum mechanical structure. Suitable tools for processing these data sets are particularly important for future applications. Whilst traditional approaches cannot simulate quantum systems exactly beyond a certain scale, Paderborn's scientists are using high-performance computing for characterisation and certification tasks.
“By developing open-source customized algorithms using HPC, we perform quantum tomography on a megascale quantum photonic detector,” explains physicist Timon Schapeler, who authored the paper with computer scientist Dr. Robert Schade and colleagues from PhoQS (Institute for Photonic Quantum Systems) and PC2 (Paderborn Center for Parallel Computing). PC2, an interdisciplinary research institute at Paderborn University, operates the HPC systems. The university is one of Germany's national high-performance computing centres and thus stands at the forefront of university-based high-performance computing.
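To give a rough sense of what detector tomography involves, the sketch below reconstructs the response of a simulated "click" detector from coherent-state probe data. This is an illustrative toy model, not the authors' HPC code: the Fock-space cutoff, probe intensities, detector efficiency and the use of a plain least-squares solve (rather than a large-scale, constraint-aware HPC solver) are all simplifying assumptions.

```python
import numpy as np
from math import factorial

def poisson_probs(mean, n_max):
    """Photon-number distribution of a coherent state with the given mean."""
    n = np.arange(n_max)
    return np.exp(-mean) * mean ** n / np.array([factorial(k) for k in n], dtype=float)

n_max = 20                                  # Fock-space cutoff (assumption)
probe_means = np.linspace(0.1, 8.0, 40)     # probe intensities (assumption)

# F[i, n] = probability that probe state i contains n photons
F = np.vstack([poisson_probs(m, n_max) for m in probe_means])

# Toy detector: a lossy click detector with efficiency eta, where
# P(click | n photons) = 1 - (1 - eta)**n  (assumed model)
eta = 0.6
true_povm_click = 1.0 - (1.0 - eta) ** np.arange(n_max)

# "Measured" click probabilities for each probe state
p_click = F @ true_povm_click

# Tomographic reconstruction: find the POVM element theta that best
# explains the observed statistics, i.e. minimise ||F @ theta - p_click||.
theta, *_ = np.linalg.lstsq(F, p_click, rcond=None)

# The reconstruction reproduces the observed click statistics
residual = np.linalg.norm(F @ theta - p_click)
print(residual)
```

In practice the inversion is ill-conditioned and real reconstructions add physicality constraints (non-negativity, completeness) and regularisation; scaling such a constrained fit to megapixel-scale detectors is precisely where HPC becomes necessary.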
‘Unprecedented scale’
“The findings are opening up entirely new horizons for the size of systems being analysed in the field of scalable quantum photonics. This has wider implications, for example for characterising photonic quantum computer hardware,” Schapeler continues. The researchers were able to complete the calculations describing a photon detector within just a few minutes, faster than ever before, even when huge quantities of data were involved. Schapeler: “This shows the unprecedented scale on which this tool can be used with quantum photonic systems. As far as we know, our work is the first contribution to the field of traditional high-performance computing enabling experimental quantum photonics at large scales. This field will become increasingly important when it comes to demonstrating quantum supremacy in quantum photonic experiments — and on a scale that cannot be calculated by conventional means.”
Shaping the future with fundamental research
Schapeler is a doctoral student in the Mesoscopic Quantum Optics research group headed by Professor Tim Bartley. This team conducts research into the fundamental physics of the quantum states of light and their applications. These states consist of tens, hundreds or thousands of photons. “The scale is crucial, as this illustrates the fundamental advantage that quantum systems hold over conventional ones. There is a clear benefit in many areas, including measurement technology, data processing and communications,” Bartley explains. Quantum research is one of Paderborn University's flagship fields, where respected experts conduct fundamental research to shape the specific applications of the future.
Further information about quantum research at Paderborn University is available at:
https://www.uni-paderborn.de/en/topic/quantum-research