Thresholds for the distributed surface code in the presence of memory decoherence
S.W. de Bone (TU Delft - QID/Elkouss Group, TU Delft - QuTech Advanced Research Centre, Centrum Wiskunde & Informatica (CWI))
C. E. Bradley (TU Delft - QuTech Advanced Research Centre, TU Delft - QID/Taminiau Lab, Pritzker School of Molecular Engineering)
Tim Hugo Taminiau (TU Delft - QuTech Advanced Research Centre, TU Delft - Quantum Internet Division)
David Elkouss (TU Delft - Quantum Computer Science, Okinawa Institute of Science and Technology Graduate University, TU Delft - QuTech Advanced Research Centre)
Abstract
In the search for scalable, fault-tolerant quantum computing, distributed quantum computers are promising candidates. These systems can be realized in large-scale quantum networks or condensed onto a single chip with closely situated nodes. We present a framework for numerical simulations of a memory channel using the distributed toric surface code, where each data qubit of the code is part of a separate node, and the error-detection performance depends on the quality of four-qubit Greenberger-Horne-Zeilinger (GHZ) states generated between the nodes. We quantitatively investigate the effect of memory decoherence and evaluate the advantage of GHZ creation protocols tailored to the level of decoherence. We do this by applying our framework to the particular case of color centers in diamond, employing models developed from the experimental characterization of nitrogen-vacancy centers. For diamond color centers, coherence times during entanglement generation are orders of magnitude lower than the coherence times of idling qubits. These coherence times are a limiting factor for applications, yet previous surface code simulations did not treat them as such. Introducing limited coherence times as a prominent noise factor makes it imperative to integrate realistic operation times into the simulations and to incorporate strategies for operation scheduling. Our model predicts gate and measurement error probability thresholds that are at least a factor of three lower than those found in prior work with more idealized noise models. We also find a threshold of 4 × 10² in the ratio between the entanglement generation and decoherence rates, setting a benchmark for experimental progress.
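To make the quantities in the abstract concrete, the sketch below is a minimal illustration, not the paper's simulation framework: it builds the four-qubit GHZ state, applies independent single-qubit dephasing as a simple stand-in for memory decoherence, and compares a hypothetical entanglement-generation/decoherence rate ratio against the reported 4 × 10² threshold. All numerical values are illustrative assumptions.

```python
import numpy as np

def ghz_state(n: int) -> np.ndarray:
    """Return the n-qubit GHZ state (|0...0> + |1...1>)/sqrt(2) as a vector."""
    psi = np.zeros(2**n)
    psi[0] = psi[-1] = 1 / np.sqrt(2)
    return psi

def dephase(rho: np.ndarray, qubit: int, p: float, n: int) -> np.ndarray:
    """Phase-flip channel with probability p on one qubit of an n-qubit density matrix."""
    z = np.diag([1.0, -1.0])
    op = np.array([[1.0]])
    for q in range(n):
        op = np.kron(op, z if q == qubit else np.eye(2))
    return (1 - p) * rho + p * op @ rho @ op

n = 4
psi = ghz_state(n)
rho = np.outer(psi, psi)

# Hypothetical per-qubit dephasing probability accumulated while a node waits
# for entanglement generation to succeed (illustrative value, not from the paper).
p_dephase = 0.01
for q in range(n):
    rho = dephase(rho, q, p_dephase, n)

# GHZ fidelity after decoherence; analytically (1 + (1 - 2p)^4) / 2 ~ 0.961.
fidelity = float(psi @ rho @ psi)
print(f"four-qubit GHZ fidelity after dephasing: {fidelity:.4f}")

# Hypothetical rates to illustrate the reported ratio threshold of 4 x 10^2.
entanglement_rate_hz = 5e3   # successful entanglement attempts per second (assumed)
decoherence_rate_hz = 10.0   # inverse coherence time (assumed)
print("above 4e2 ratio threshold:", entanglement_rate_hz / decoherence_rate_hz > 4e2)
```

The dephasing channel here is only a toy model; the paper's simulations use noise models fitted to nitrogen-vacancy center experiments and realistic operation scheduling.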