DOE PAGES
U.S. Department of Energy, Office of Scientific and Technical Information
  1. Ancilla-entangling Floquet kicks for accelerating quantum algorithms

    Quantum simulation with adiabatic annealing can provide insight into difficult problems that are intractable for classical computers. However, its performance deteriorates as systems scale up because the excitation gap shrinks, which places an annealing-rate bottleneck on achieving high success probability. Here, we accelerate quantum simulation using digital multiqubit gates that entangle the primary system qubits with ancillary qubits. The practical benefit originates from tuning the ancillary gauge degrees of freedom to enhance the quantum algorithm's original functionality in the system register. For simple but nontrivial short-ranged and infinitely long-ranged transverse-field Ising models, and for the hydrogen molecule model after qubit encoding, exact state-vector numerical simulations in a digital-analog setting show a one-hundred-percent improvement in time to solution together with higher accuracy. The findings are further supported by time-averaged Hamiltonian theory.
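The baseline these Floquet kicks accelerate is conventional adiabatic annealing of a transverse-field Ising system. Below is a minimal sketch of that baseline only (not the paper's ancilla-entangling protocol), using exact state-vector evolution to show how the final ground-state probability depends on the total annealing time T; the chain length, schedule, and step count are illustrative assumptions.

```python
# Minimal sketch (not the paper's ancilla-entangling protocol): exact state-vector
# annealing of a short-ranged transverse-field Ising chain, showing how the final
# ground-state probability depends on the total annealing time T. The system size,
# schedule, and step count are illustrative assumptions.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(single, site, n):
    """Embed a single-qubit operator on `site` of an n-qubit register."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, single if i == site else I2)
    return out

n = 4  # small chain so exact simulation stays cheap
H_driver = -sum(op_on(X, i, n) for i in range(n))                            # -sum_i X_i
H_problem = -sum(op_on(Z, i, n) @ op_on(Z, i + 1, n) for i in range(n - 1))  # -sum_i Z_i Z_{i+1}

def anneal(T, steps=200):
    """Evolve the driver ground state under H(s) = (1-s) H_driver + s H_problem."""
    psi = np.ones(2 ** n, dtype=complex) / np.sqrt(2 ** n)  # |+...+>, driver ground state
    dt = T / steps
    for k in range(steps):
        s = (k + 0.5) / steps
        psi = expm(-1j * ((1 - s) * H_driver + s * H_problem) * dt) @ psi
    # Probability of ending in the (degenerate) ground space of H_problem
    vals, vecs = np.linalg.eigh(H_problem)
    ground = vecs[:, np.isclose(vals, vals[0])]
    return float(np.sum(np.abs(ground.conj().T @ psi) ** 2))

for T in (1.0, 5.0, 20.0):
    print(f"T = {T:5.1f}  ground-state probability = {anneal(T):.3f}")
```

Slower anneals (larger T) recover the ground state with higher probability; that annealing-rate bottleneck is what the ancilla-entangling kicks are designed to relax.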
  2. Benchmarking characterization methods for noisy quantum circuits

    Effective methods for characterizing the noise in quantum computing devices are essential for programming and debugging circuit performance. Existing approaches vary in the information obtained as well as in the amount of quantum and classical resources required, with more information generally requiring more resources. Here we benchmark gate set tomography, Pauli channel noise reconstruction, and empirical direct characterization as methods for developing models that describe noisy quantum circuit performance on a 27-qubit superconducting transmon device. We evaluate these models by comparing the accuracy of noisy circuit simulations with the corresponding experimental observations. We find that the agreement of a noise model with experiment does not correlate with the information gained by characterization, and that the underlying circuit strongly influences the best choice of characterization approach. Empirical direct characterization scaled best of the methods we tested and produced the most accurate characterizations across our benchmarks.
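As a rough illustration of how a reconstructed Pauli channel noise model is used, the sketch below attaches a single-qubit Pauli error channel to a gate and compares ideal and noisy expectation values. The error rates are illustrative placeholders, not values characterized from the 27-qubit device, and this is not the paper's benchmarking pipeline.

```python
# Minimal sketch of the kind of noise model a Pauli channel reconstruction produces:
# each gate is followed by a probabilistic Pauli error acting on the density matrix.
# The error rates below are illustrative placeholders, not characterized values.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def pauli_channel(rho, p_x, p_y, p_z):
    """Apply a single-qubit Pauli channel to density matrix rho."""
    p_i = 1.0 - p_x - p_y - p_z
    return (p_i * rho
            + p_x * X @ rho @ X
            + p_y * Y @ rho @ Y
            + p_z * Z @ rho @ Z)

# Ideal vs. noisy single-qubit circuit: Hadamard acting on |0><0|.
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
ideal = H @ rho0 @ H.conj().T
noisy = pauli_channel(ideal, p_x=0.01, p_y=0.01, p_z=0.03)

expect_x = lambda r: float(np.real(np.trace(X @ r)))
print("ideal <X> =", round(expect_x(ideal), 4), " noisy <X> =", round(expect_x(noisy), 4))
```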
  3. Deep quantum circuit simulations of low-energy nuclear states

    Numerical simulation is an important method for verifying the quantum circuits used to simulate low-energy nuclear states. However, real-world applications of quantum computing in nuclear theory often generate deep quantum circuits that place demanding memory and processing requirements on conventional simulation methods. Here, we present advances in the high-performance numerical simulation of deep quantum circuits that efficiently verify the accuracy of low-energy nuclear physics applications. Our approach employs novel methods for accelerating the numerical simulation, including the management of simulated mid-circuit measurements to verify projection-based state-preparation circuits. We test these methods across a variety of high-performance computing systems, and our results show that circuits of up to 21 qubits and more than 115,000,000 gates can be simulated efficiently.
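One ingredient called out above is the handling of simulated mid-circuit measurements. The sketch below shows the basic idea on a small state vector (project onto the observed outcome and renormalize before continuing); it is a toy illustration, not the paper's high-performance implementation, and the qubit-ordering convention is an assumption.

```python
# Minimal sketch of a mid-circuit measurement in a state-vector simulation:
# project the state onto the observed outcome and renormalize, then apply the
# rest of the circuit. Qubit 0 is taken as the most significant bit of the index.
import numpy as np

def measure_qubit(psi, qubit, n, outcome=None, rng=None):
    """Z-basis measurement of `qubit` in an n-qubit state vector.
    Returns (outcome, post-measurement state)."""
    rng = rng or np.random.default_rng()
    psi = psi.reshape([2] * n)
    # Probability of observing 1 on the chosen qubit
    p1 = float(np.sum(np.abs(np.take(psi, 1, axis=qubit)) ** 2))
    if outcome is None:
        outcome = int(rng.random() < p1)
    # Project: zero out amplitudes inconsistent with the outcome
    proj = psi.copy()
    idx = [slice(None)] * n
    idx[qubit] = 1 - outcome
    proj[tuple(idx)] = 0.0
    return outcome, (proj / np.linalg.norm(proj)).reshape(-1)

# Example: force outcome 0 on qubit 0 of a Bell state; the state collapses to |00>.
bell = np.zeros(4, dtype=complex)
bell[0b00] = bell[0b11] = 1 / np.sqrt(2)
outcome, post = measure_qubit(bell, qubit=0, n=2, outcome=0)
print("outcome:", outcome, "post-measurement amplitudes:", np.round(post, 3))
```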
  4. Dimensionality Reduction with Variational Encoders Based on Subsystem Purification

    Efficient methods for encoding and compression are likely to pave the way toward efficient trainability in higher-dimensional Hilbert spaces and to overcoming the issue of barren plateaus. Here, we propose an alternative approach to variational autoencoders for reducing the dimensionality of states represented in higher-dimensional Hilbert spaces. To this end, we build a variational-algorithm-based autoencoder circuit that takes a dataset as input and optimizes the parameters of a Parameterized Quantum Circuit (PQC) ansatz to produce an output state that can be represented as a tensor product of two subsystems by minimizing $\mathrm{Tr}(\rho^2)$. The output of this circuit is passed through a series of controlled-swap gates and measurements to yield a state with half the number of qubits that retains the features of the input state, in the same spirit as dimension-reduction techniques used in classical algorithms. The resulting output is used for supervised learning to validate the encoding procedure. We use the Bars and Stripes (BAS) dataset on an 8 × 8 grid to create efficiently encoded states and report a classification accuracy of 95% on this dataset. The demonstrated example thus shows that the method reduces states represented in large Hilbert spaces while maintaining the features required for any subsequent machine learning algorithm.
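The quantity at the heart of subsystem purification is the purity $\mathrm{Tr}(\rho^2)$ of a subsystem's reduced density matrix, which equals 1 exactly when the state factorizes across the cut. The sketch below computes it directly from a state vector; it does not reproduce the paper's PQC, training loop, or the precise form of its loss.

```python
# Minimal sketch: subsystem purity Tr(rho_A^2) from a state vector. A product state
# across the A|B cut gives purity 1; a maximally entangled state gives 1/2 for one qubit.
import numpy as np

def subsystem_purity(psi, n_a, n_b):
    """Tr(rho_A^2) for the first n_a qubits of an (n_a + n_b)-qubit state vector."""
    m = psi.reshape(2 ** n_a, 2 ** n_b)
    rho_a = m @ m.conj().T          # reduced density matrix of subsystem A
    return float(np.real(np.trace(rho_a @ rho_a)))

# Product state |0>|+>  ->  purity 1 (disentangled across the cut)
plus = np.array([1, 1]) / np.sqrt(2)
product = np.kron(np.array([1, 0]), plus)
# Bell state  ->  purity 1/2 (maximally entangled across the cut)
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)

print("product state purity:", round(subsystem_purity(product, 1, 1), 6))
print("Bell state purity:   ", round(subsystem_purity(bell, 1, 1), 6))
```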
  5. Quantum Programming Paradigms and Description Languages

    This article offers a perspective on quantum computing programming languages, as well as on their emerging runtimes and algorithmic modalities. With the scientific high-performance computing (HPC) community as the target audience, we describe the current state of the art in the field and outline programming paradigms for scientific workflows. One take-home message is that significant work is still required to refine the notion of the quantum processing unit before it can be integrated into HPC environments. Programming for today's quantum computers is making significant strides toward modern HPC-compatible workflows, but key challenges still face the field.
  6. Integrating quantum computing resources into scientific HPC ecosystems

    Quantum Computing (QC) offers significant potential to enhance scientific discovery in fields such as quantum chemistry, optimization, and artificial intelligence. Yet QC faces challenges from the inherent noise of devices in the noisy intermediate-scale quantum (NISQ) era. This paper discusses the integration of QC as a computational accelerator within classical scientific high-performance computing (HPC) systems. By leveraging a broad spectrum of simulators and hardware technologies, we propose a hardware-agnostic framework for augmenting classical HPC with QC capabilities. Drawing on the HPC expertise of Oak Ridge National Laboratory (ORNL) and the HPC lifecycle management of the Department of Energy (DOE), our approach focuses on the strategic incorporation of QC capabilities and acceleration into existing scientific HPC workflows. This includes detailed analyses, benchmarks, and code optimization driven by the needs of the DOE and ORNL missions. Our comprehensive framework integrates hardware, software, workflows, and user interfaces to foster a synergistic environment for quantum and classical computing research. This paper outlines plans to unlock new computational possibilities, driving forward scientific inquiry and innovation in a wide array of research domains.
  7. Quantum-centric supercomputing for materials science: A perspective on challenges and future directions

    Computational models are an essential tool for the design, characterization, and discovery of novel materials. Computationally hard tasks in materials science stretch the limits of existing high-performance supercomputing centers, consuming much of their resources for simulation, analysis, and data processing. Quantum computing, on the other hand, is an emerging technology with the potential to accelerate many of the computational tasks needed for materials science. To do so, quantum technology must interact with conventional high-performance computing in several ways: validation of approximate results, identification of hard problems, and synergies in quantum-centric supercomputing. Here, we provide a perspective on how quantum-centric supercomputing can help address critical computational problems in materials science, on the challenges that must be faced to solve representative use cases, and on suggested new directions.
  8. Approximate Boltzmann distributions in quantum approximate optimization

    Approaches to compute or estimate the output probability distributions from the quantum approximate optimization algorithm (QAOA) are needed to assess the likelihood that it will obtain a quantum computational advantage. We analyze output from QAOA circuits solving 7200 random MaxCut instances, with $n = 14$–$23$ qubits and depth parameter $p \le 12$, and find that the average basis-state probabilities follow approximate Boltzmann distributions: the average probabilities scale exponentially with their energy (cut value), with a peak at the optimal solution. Furthermore, we describe the rate of exponential scaling, or effective temperature, in terms of a series with leading-order term $T \sim C_{\min}/(n\sqrt{p})$, where $C_{\min}$ is the optimal solution energy. Using this scaling, we generate approximate output distributions with up to 38 qubits and find that these give accurate accounts of important performance metrics in cases we can simulate exactly.
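The sketch below illustrates the approximate-Boltzmann picture: for a small random MaxCut instance it builds basis-state probabilities proportional to exp(cut/T) with the leading-order effective temperature quoted above. The graph, sign convention, and normalization are illustrative assumptions, not the paper's fitted model.

```python
# Minimal sketch of the approximate-Boltzmann output model: probabilities proportional
# to exp(cut(x)/T) with T ~ C_min / (n sqrt(p)). Signs and normalization are illustrative.
import itertools
import numpy as np

rng = np.random.default_rng(7)
n, p = 10, 4
# Random graph: each possible edge kept with probability 1/2
edges = [(i, j) for i, j in itertools.combinations(range(n), 2) if rng.random() < 0.5]

def cut_value(bits):
    return sum(1 for i, j in edges if bits[i] != bits[j])

cuts = np.array([cut_value(bits) for bits in itertools.product((0, 1), repeat=n)])
c_opt = cuts.max()                      # optimal (maximum) cut value
T = c_opt / (n * np.sqrt(p))            # leading-order effective temperature

weights = np.exp(cuts / T)
probs = weights / weights.sum()

print("optimal cut:", c_opt)
print("effective temperature:", round(float(T), 3))
print("probability mass on optimal cuts:", round(float(probs[cuts == c_opt].sum()), 4))
print("expected cut under model:", round(float(probs @ cuts), 3))
```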
  9. Modeling Singlet Fission on a Quantum Computer

    We demonstrate a practical application of quantum computing by using it to investigate the linear H4 molecule as a simple model for singlet fission. We use the Peeters-Devreese-Soldatov energy functional to calculate the necessary energetics based on moments of the Hamiltonian estimated on the quantum computer. To reduce the number of required measurements, we use several independent strategies: 1) reduction of the size of the relevant Hilbert space by tapering off qubits; 2) measurement optimization via rotations to eigenbases shared by groups of qubit-wise commuting Pauli strings; and 3) parallel execution of multiple state-preparation and measurement operations using all 20 qubits available on the Quantinuum H1-1 quantum hardware. Our results meet the energetic requirements for singlet fission, are in excellent agreement with exact transition energies (for the chosen one-particle basis), and outperform classical methods considered computationally feasible for singlet fission candidates.
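The raw ingredients of the Peeters-Devreese-Soldatov approach are the Hamiltonian moments $\langle H^k \rangle$. The sketch below computes them classically from a state vector for an arbitrary toy two-qubit Hamiltonian (on hardware they are assembled from measured Pauli expectation values); the step that converts moments into transition energies, and the actual H4 Hamiltonian, are not reproduced here.

```python
# Minimal sketch: Hamiltonian moments <psi|H^k|psi> from a state vector.
# The Hamiltonian below is an arbitrary two-qubit toy example, not the H4 model.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

# Toy two-qubit Hamiltonian written as a sum of Pauli strings
H = (0.5 * np.kron(Z, Z) - 0.3 * np.kron(X, I2)
     - 0.3 * np.kron(I2, X) + 0.2 * np.kron(Z, I2))

def moments(psi, H, k_max):
    """Return [<H^0>, <H^1>, ..., <H^k_max>] for state vector psi."""
    out, v = [], psi.copy()
    for _ in range(k_max + 1):
        out.append(float(np.real(np.vdot(psi, v))))
        v = H @ v
    return out

psi = np.zeros(4, dtype=complex)
psi[0] = 1.0                                        # |00> trial state
print("moments <H^k>, k = 0..4:", [round(m, 4) for m in moments(psi, H, 4)])
```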
  10. Solving MaxCut with quantum imaginary time evolution

    We introduce a method to solve the MaxCut problem efficiently based on quantum imaginary time evolution (QITE). We employ a linear Ansatz for unitary updates and an initial state with no entanglement, together with an imaginary-time-dependent Hamiltonian that interpolates between a given graph and a subgraph with two edges excised. We apply the method to thousands of randomly selected graphs with up to fifty vertices and show that the algorithm converges to the maximum cut with a performance of 93% or higher for all considered graphs. Our results compare favorably with the performance of classical algorithms such as the greedy and Goemans–Williamson algorithms. We also discuss the overlap of the final QITE state with the ground state as a performance metric, a quantum feature not shared by classical algorithms.
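Underlying QITE is the imaginary-time flow $|\psi(\tau)\rangle \propto e^{-\tau H}|\psi(0)\rangle$, which concentrates amplitude on the ground state, i.e. on the maximum cut when $H$ is the MaxCut cost Hamiltonian. The sketch below applies this flow exactly to a small illustrative graph; the paper's linear unitary Ansatz and interpolating Hamiltonian are not reproduced.

```python
# Minimal sketch of the imaginary-time dynamics behind QITE: exact evolution under a
# diagonal MaxCut cost Hamiltonian H|x> = -cut(x)|x>, starting from the uniform state.
import itertools
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]   # small illustrative graph
n = 4

cuts = np.array([sum(1 for i, j in edges if x[i] != x[j])
                 for x in itertools.product((0, 1), repeat=n)], dtype=float)
H_diag = -cuts                                      # ground state = maximum cut
best = cuts.max()

psi0 = np.ones(2 ** n) / np.sqrt(2 ** n)            # unentangled uniform start
for tau in (0.0, 0.5, 1.0, 2.0, 4.0):
    amp = psi0 * np.exp(-tau * H_diag)              # e^{-tau H} |psi0>
    amp /= np.linalg.norm(amp)
    probs = amp ** 2
    print(f"tau = {tau:3.1f}  P(max cut) = {probs[cuts == best].sum():.3f}  "
          f"<cut> = {probs @ cuts:.3f}")
```

As the imaginary time grows, probability mass flows onto the maximum-cut bitstrings, which is the convergence behavior the abstract quantifies for the unitary-update approximation.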
...

Search: All Records, Creator / Author: "Humble, Travis"
