Borrowing a page from high-energy physics and astronomy textbooks, a team of physicists and computer scientists has successfully adapted and applied a common error-reduction technique to the field of quantum computing.
In the world of subatomic particles and giant particle detectors, and distant galaxies and giant telescopes, scientists have learned to live, and to work, with uncertainty. They are often trying to tease out ultra-rare particle interactions from a massive tangle of other particle interactions and background "noise" that can complicate their hunt, or trying to filter out the effects of atmospheric distortions and interstellar dust to improve the resolution of astronomical imaging.
Detectors also have inherent limitations, such as an imperfect ability to record every particle interaction or to measure particles' energies exactly, which can cause data to be misread by the electronics they are connected to. Scientists therefore design complex filters, in the form of computer algorithms, to reduce the margin of error and return the most accurate results.
The problems of noise and physical defects, and the need for error-correction and error-mitigation algorithms that reduce the frequency and severity of errors, are also common in the fledgling field of quantum computing. A study published in the journal npj Quantum Information found that there appear to be some common solutions, too.
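The parallel is concrete enough to sketch in code. On quantum hardware, readout errors can be summarized by a calibration ("confusion") matrix measured on known input states, and the raw measurement counts can then be corrected much as detector data are. The following is a minimal illustrative sketch, not the study's actual code; the matrix and counts are invented numbers for a single qubit:

```python
import numpy as np

# Hypothetical single-qubit calibration matrix R, where R[i, j] is the
# probability of reading out state j when state i was prepared.
# E.g., a prepared |0> is misread as "1" 2% of the time here.
R = np.array([[0.98, 0.02],
              [0.05, 0.95]])

raw_counts = np.array([480.0, 520.0])  # toy measured counts of "0" and "1"

# Correct the readout noise: the measured counts satisfy R^T @ true = measured,
# so solve that linear system for the underlying ("true") counts.
true_counts = np.linalg.solve(R.T, raw_counts)
print(true_counts)
```

Direct matrix inversion like this can return unphysical results (such as negative counts) when the noise is large, which is one reason iterative unfolding methods of the kind used at particle detectors are attractive.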
Ben Nachman, a Berkeley Lab physicist who is involved with particle physics experiments at CERN as a member of Berkeley Lab’s ATLAS group, saw the quantum-computing connection while working on a particle physics calculation with Christian Bauer, a Berkeley Lab theoretical physicist who is a co-author of the study. ATLAS is one of the four giant particle detectors at CERN’s Large Hadron Collider, the largest and most powerful particle collider in the world.
"At ATLAS, we often have to 'unfold,' or correct for detector effects," said Nachman, the study's lead author. "People have been developing this technique for years."
In experiments at the LHC, protons collide about 1 billion times per second. To cope with this incredibly busy, "noisy" environment, and with intrinsic problems related to detectors' energy resolution and other factors, physicists use error-correcting "unfolding" techniques and other filters to winnow this particle jumble down to the most useful, accurate data.
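One widely used technique of this kind is iterative Bayesian unfolding, which repeatedly applies Bayes' theorem to redistribute measured counts back into the underlying "true" bins. As a rough sketch, assuming a response matrix whose rows give the probability of a true bin landing in each measured bin (all numbers invented for illustration):

```python
import numpy as np

def iterative_bayesian_unfold(response, measured, n_iter=10):
    """Iteratively estimate the true spectrum from a measured one.

    response[t, m] = P(measured bin m | true bin t); rows sum to 1
    (i.e., full efficiency is assumed for simplicity). Starts from a
    flat prior and applies Bayes' theorem repeatedly.
    """
    n_true = response.shape[0]
    truth = np.full(n_true, measured.sum() / n_true)  # flat starting prior
    for _ in range(n_iter):
        joint = response * truth[:, None]      # R[t, m] * prior[t]
        posterior = joint / joint.sum(axis=0)  # P(true t | measured m)
        truth = posterior @ measured           # redistribute measured counts
    return truth

# Toy example: a 3-bin spectrum smeared by bin-to-bin migration.
R = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
measured = np.array([260.0, 480.0, 260.0])
print(iterative_bayesian_unfold(R, measured))
```

Unlike direct inversion, this iterative procedure keeps the estimated counts non-negative, at the cost of some dependence on the starting prior and the number of iterations.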