
Electronic structure problem

Overview

We seek the energy eigenstates (or thermal states) of the Hamiltonian used to describe the electrons in molecules or material systems. The electrons interact with each other, in addition to fields produced by the nuclei (which are typically assumed to be fixed in position, and classical) and any external applied fields.

In simulations of a finite-sized system, there is no clear distinction between a "molecule" and a "material"; materials may be viewed as an extended molecule, typically with a repeating underlying atomic structure. In materials we are additionally concerned with extrapolating finite-size properties to the thermodynamic limit by repeating the simulation at a range of system sizes. This enables the measurement of thermodynamic properties, such as phase diagrams. For molecular systems, we are interested in measuring microscopic properties, such as excitation energies, reaction rates, dipole moments, or nuclear forces.

One may also consider time evolution under the electronic structure Hamiltonian; this is a less well-studied problem in both classical and quantum settings, likely due to the high costs of classical simulations. As such, we will predominantly focus on static properties, commenting on dynamics simulations where relevant.

Actual end-to-end problem(s) solved

The Hamiltonian of a system consisting of \(K\) nuclei and \(\eta\) electrons interacting via the Coulomb interaction is (in atomic units)

\[\begin{align} H = -\sum_{i=1}^\eta \frac{(\nabla_{i})^2}{2} - \sum_{I=1}^K \frac{(\nabla_{I})^2}{2M_I} - \sum_{i,I}\frac{Z_I}{|r_{i}-R_{I}|} +\frac{1}{2}\sum_{i\neq j}\frac{1}{|r_{i}-r_{j}|} + \frac{1}{2}\sum_{I\neq J}\frac{Z_IZ_J}{|R_{I}-R_{J}|} \end{align}\]

where \(\nabla\) is the derivative operator, \(r_{i}\) gives the position of the \(i\)th electron, and \(R_{I}\) and \(Z_I\) give the position and charge of the \(I\)th nucleus. It is often appropriate to make the Born–Oppenheimer approximation, fixing the positions of the nuclei, which are treated as classical particles. The resulting electronic Hamiltonian at a fixed nuclear configuration is given by

\[\begin{equation} \label{Eq:BornOppElectronic} H(\{R_{I}\}) = -\sum_i\frac{(\nabla_{i})^2}{2} - \sum_{i,I}\frac{Z_I}{|r_{i}-R_{I}|} + \frac{1}{2}\sum_{i\neq j}\frac{1}{|r_{i}-r_{j}|} + V(\{R_{I}\}) \end{equation}\]

where \(V(\{R_{I}\})\) is the constant offset from the nuclear repulsion energy. This Hamiltonian can be projected onto a basis set \(\{\phi_i(r)\}_{i=1}^N\) of electron spin orbital functions or grid points, and solved for the electronic eigenstates \(\ket{E_i}\) or thermal state \(\rho \propto e^{-\beta H}\). We note that for many molecules, the ground state of the electronic structure Hamiltonian is a good approximation for the thermal state at room temperature. This can be contrasted with the vibrational structure of molecules, where excited states are also populated at room temperature. When simulating dynamics, it is necessary to use a basis set that is sufficiently flexible (or adaptive) to accurately describe the states at all times (for example, many chemical basis sets are highly optimized for ground state calculations and so are less suitable for dynamics calculations).
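
As a concrete illustration of this projection step, the following minimal sketch builds the one- and two-electron integrals \(h_{ij}\) and \(h_{ijkl}\) (defined below) for a hydrogen molecule in a small Gaussian basis; for simplicity it works with spatial rather than spin orbitals. It assumes the open-source PySCF package is available; any electronic structure package exposing molecular integrals could be used instead.

```python
# Minimal sketch (assumes PySCF is installed): project the Born-Oppenheimer
# Hamiltonian for H2 onto a small Gaussian basis and extract the integrals
# that define the first- and second-quantized Hamiltonians below.
import numpy as np
from pyscf import gto, scf, ao2mo

mol = gto.M(atom="H 0 0 0; H 0 0 0.74", basis="sto-3g")  # fixed nuclei (Angstrom)
mf = scf.RHF(mol).run()                                   # mean-field orbitals phi_i

n = mf.mo_coeff.shape[1]
# One-electron part: kinetic energy plus electron-nucleus attraction.
h1 = mf.mo_coeff.T @ mf.get_hcore() @ mf.mo_coeff
# Two-electron Coulomb integrals as a full 4-index tensor. PySCF returns them
# in chemists' notation (ij|kl); reorder the indices if the physicists'
# convention used in the text is required.
h2 = ao2mo.restore(1, ao2mo.kernel(mol, mf.mo_coeff), n)
e_nuc = mol.energy_nuc()                                  # constant V({R_I})

print(h1.shape, h2.shape, e_nuc)
```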

The electronic energy is the largest contribution to the energy of molecular/material systems in ambient conditions, and dictates the equilibrium structure and motion of the nuclei. As a result, the electronic energy eigenstates (or thermal states) often provide a good description of a wide range of system properties. Preparing the desired electronic state for a given nuclear configuration is typically the first step in learning properties of the system. We then measure the expectation values of observables with respect to these states. Properties of interest for molecular systems include:

  • Energy values, potentially across a range of nuclear configurations (for electronic excitation energies at a fixed nuclear geometry, determining molecular geometries by computing the electronic ground state energy at different geometries, and finding reaction pathways & rates by computing energy differences between a sequence of geometries involved in a reaction).
  • Determining transition probabilities between states (for reactions and optical properties).
  • Differential changes in electronic energy in response to an applied field, for example, electronic or magnetic dipole moments, polarizability.
  • Calculating forces on the nuclei, for use in molecular dynamics calculations (used in a range of applications, including protein folding and calculating drug molecule binding affinities).

Properties of interest for materials include:

  • Energy densities for given system parameters (to determine phase diagrams).
  • Thermodynamic properties (magnetization, thermal/electrical conductivity, bulk modulus).
  • Particle densities and correlation functions between positions.

In order to understand how these observables vary as the system parameters (i.e. nuclear positions, atomic doping, temperature, applied field etc.) are changed, the desired state may need to be prepared and measured a number of times.

In dynamics simulations, one may consider how the system evolves in response to a perturbation such as that induced by an ultrafast laser pulse [1, 2, 3], or in particle scattering interactions.

Dominant resource cost/complexity

Mapping the problem to qubits:

We discretize the electron positions by projecting onto a basis of spin orbitals. The discretization error typically decays as \(1/N\) where \(N\) is the number of spin orbitals used [4, 5] and is limited by the resolution of singularities in the Coulomb interaction at charge coalescences. A variety of functional forms have been considered for the electron orbitals (see Table 1). The optimal choice will be system dependent and must consider:

  • The resolution of the orbital (improved by matching the character of local vs delocalized physics in the system to that of the orbital).
  • The cost of computing the Hamiltonian, either in classical precomputation or (if required) coherently on a quantum device (see "Accessing the Hamiltonian," below).
  • The properties of the resulting Hamiltonian (number of terms, 1-norm, locality of terms, etc.) which determine the cost of accessing the Hamiltonian in algorithms.

We can represent electronic states on a quantum computer using either first or second quantized representations.

  • For \(\eta\) electrons in \(N\) spin orbitals, first quantization uses \(\eta\) registers, which each contain \(\log_2(N)\) qubits; each register enumerates which orbital its corresponding electron is in, and the wavefunction must then be antisymmetrized to respect fermionic constraints [6]. The Hamiltonian of Eq. \(\eqref{Eq:BornOppElectronic}\) in first quantization can be written as
    \[\begin{equation} H = \sum_\alpha^\eta \sum_{i,j}^N h_{ij} \ket{i}\bra{j}_\alpha + \frac{1}{2} \sum_{\alpha \neq \beta}^\eta \sum_{i,j,k,l}^N h_{ijkl} \ket{i}\bra{l}_\alpha \otimes \ket{j}\bra{k}_\beta \end{equation}\]

    with one- and two-electron integrals

    \[\begin{align} h_{ij} & = \int dr \phi_i^*(r) \left(-\frac{(\nabla)^2}{2} - \sum_I \frac{Z_I}{|r - R_I|} \right) \phi_j(r) \\ h_{ijkl} & = \int dr_1 dr_2 \frac{\phi_i^*(r_1) \phi_j^*(r_2) \phi_k(r_2) \phi_l(r_1)}{|r_1 - r_2|}. \end{align}\]
  • In second quantization, antisymmetry is stored in the operators, which obey fermionic anticommutation relations. The Hamiltonian of Eq. \(\eqref{Eq:BornOppElectronic}\) in second quantization can be written as
    \[\begin{equation} H = \sum_{i,j}^N h_{ij} a_i^\dag a_j + \frac{1}{2} \sum_{i,j,k,l}^N h_{ijkl} a_i^\dag a_j^\dag a_k a_l. \end{equation}\]

    Under the commonly used Jordan–Wigner mapping (other mappings have also been studied, see [7] for discussion) we require \(N\) qubits, where each qubit stores the occupancy of the corresponding spin orbital. These mappings induce a mapping of the Hamiltonian (and other observables) to qubit operators.

| Representation | Gaussians | Plane waves | Bloch/Wannier functions | Grids |
| --- | --- | --- | --- | --- |
| First quantized | [8]¹ | [9, 10] | Not yet studied | [11, 12, 10] |
| Second quantized | [13] | [14] | [15, 16] | [14, Appendix A] |

Table 1: Representative references (chosen for their explicit discussion of the choice of representation) showing the use of different basis functions in quantum algorithms for the electronic structure problem. Note that this is not intended to be a complete list of all works that have used these basis sets.
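
To make the second-quantized representation and the Jordan–Wigner mapping concrete, the sketch below builds a toy two-orbital Hamiltonian with placeholder coefficients (not the integrals of any real molecule) and maps it to qubit operators. It assumes the OpenFermion library is available.

```python
# Sketch (assumes OpenFermion is installed): map a toy second-quantized
# Hamiltonian with placeholder coefficients to Pauli operators via the
# Jordan-Wigner transformation, so each qubit stores an orbital occupancy.
from openfermion import FermionOperator, jordan_wigner, count_qubits

h = FermionOperator()
h += FermionOperator("0^ 0", -1.25)        # h_00 a_0^dag a_0
h += FermionOperator("1^ 1", -0.47)        # h_11 a_1^dag a_1
h += FermionOperator("1^ 0^ 0 1", 0.67)    # a two-electron term (placeholder value)

qubit_h = jordan_wigner(h)                 # a sum of Pauli strings on 2 qubits
print(count_qubits(qubit_h), "qubits")
print(qubit_h)
```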

Accessing the Hamiltonian:

Quantum algorithms for the electronic structure problem require access to the Hamiltonian. This is typically provided by block-encoding or Hamiltonian simulation. For some approaches, it may be necessary to compute Hamiltonian coefficients (molecular integrals) or matrix elements coherently [12, 8, 17, 18, 9, 10], or load them from a quantum memory [19, 20, 21]. As this access is often a dominant contribution to the cost of quantum algorithms, significant effort has been spent on methods of factorizing the electronic structure Hamiltonian to reduce the resources required for accessing it coherently [22, 19, 20, 21, 16]. Some data-loading routines provide the ability to trade gate count for additional ancilla qubits, leading to a larger logical qubit count than required to store the system wavefunction (see the section on loading classical data for additional details).
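
The 1-norm of the Hamiltonian coefficients is one of the properties that sets the cost of this access: for the standard linear-combination-of-unitaries construction, the block-encoding normalization \(\alpha\) is roughly this 1-norm, and it determines how many times the block-encoding must be queried. The sketch below illustrates this bookkeeping with randomly generated placeholder integrals; in practice the integrals would come from an electronic structure package and the factorizations cited above would be used to reduce the norm.

```python
# Sketch: compute the coefficient 1-norm ("lambda") of a second-quantized
# Hamiltonian, which roughly sets the block-encoding normalization alpha for
# linear-combination-of-unitaries constructions. Random placeholder integrals.
import numpy as np

rng = np.random.default_rng(0)
n = 20                               # number of spin orbitals (placeholder)
h1 = rng.normal(size=(n, n))
h1 = (h1 + h1.T) / 2
h2 = 0.1 * rng.normal(size=(n, n, n, n))

lam = np.abs(h1).sum() + 0.5 * np.abs(h2).sum()
print(f"coefficient 1-norm lambda ~ {lam:.1f}")
# Factorized representations (low rank, tensor hypercontraction, ...) aim to
# reduce this norm and the cost of loading the coefficients coherently.
```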

State preparation:

Solving the electronic structure problem on a quantum computer reduces to the task of preparing a desired state, and measuring observables. The state to be prepared is typically an energy eigenstate, a thermal state, or a time evolved state.

  • Energy eigenstates: In the following discussion, we refer to the overlap \(\gamma = |\braket{\psi}{E_j}|\) between a desired eigenstate \(\ket{E_j}\) and a given initial state \(\ket{\psi}\), and the minimum gap \(\Delta\) between the desired energy eigenvalue and other energy eigenvalues. Below, we list several methods for preparing energy eigenstates, or approximations to them.
  • Approximate eigenstates: Approximate eigenstates obtained from a classical calculation can be prepared as quantum trial states using the methods of [23, 24], which scale as \(\mathcal{O}\left( ND \right)\), where \(D\) is the number of Slater determinants in the trial state. These states can be used as input for the methods below.
  • Eigenstate filtering: Methods such as those in [25, 26] filter out undesired eigenstates using spectral window functions applied via quantum singular value transformation (QSVT) to a block-encoding of the Hamiltonian. The complexity to prepare the ground state (to infidelity \(\epsilon\), with failure probability less than \(\theta\)) using this approach scales as \(\widetilde{\mathcal{O}}\left( \frac{\alpha}{\gamma \Delta} \log(\theta^{-1} \epsilon^{-1}) \right)\) calls to an \((\alpha, m, 0)\)-block-encoding of the Hamiltonian (where \(\alpha \geq \nrm{H}\) is a normalization factor of the block-encoding). For comparison to related methods, we refer the reader to [27, 26].
  • Adiabatic state preparation (ASP): ASP can be used to prepare a target eigenstate (typically the ground state) by evolving from the corresponding easy-to-prepare eigenstate of an initial Hamiltonian \(H(0)\) to the full electronic structure Hamiltonian \(H(1)\). Time evolution can be implemented using algorithms for Hamiltonian simulation. The total evolution time is typically chosen according to the heuristic \(T \gg \max_{0 \leq s \leq 1} \nrm{\frac{dH}{ds}} / \Delta(s)^2\) where \(s\) describes the adiabatic path \(H(s)\) and \(\Delta(s)\) is the spectral gap of \(H(s)\). It is difficult to analytically bound this complexity for molecular systems (see e.g., [28]) motivating numerical studies on small molecules [29, 30, 31, 32].
  • Quantum phase estimation (QPE): The above techniques all provide methods of preparing approximate eigenstates, in some cases using promises on the gap \(\Delta\), or by exploiting pre-existing knowledge of the energy eigenvalue. Given an approximate eigenstate, we can use QPE to project into the desired eigenstate and provide an estimate of the eigenenergy. QPE makes \(\mathcal{O}\left( \gamma^{-2} \epsilon^{-1} \right)\) calls to a unitary \(U\) encoding the spectrum of the Hamiltonian, where \(\gamma = |\braket{\psi}{E_j}|\) is the overlap between the state \(\ket{\psi}\) input to quantum phase estimation, and the desired energy eigenstate \(\ket{E_j}\), and \(\epsilon\) is the desired precision in the energy estimate. It is possible to improve the complexity to \(\mathcal{O}\left( \gamma^{-1} \epsilon^{-1} \right)\) using amplitude amplification, or to \(\mathcal{O}\left( \gamma^{-2} \Delta^{-1} + \epsilon^{-1} \right)\) by exploiting knowledge of the gap \(\Delta\) between the energy eigenstates to perform rejection sampling [6]. The unitary encoding the Hamiltonian is typically either \(U \approx e^{-iHt}\) (the approximation error must be balanced against the error from QPE) implemented via Hamiltonian simulation, or a quantum walk operator \(W\) which acts like \(e^{i\arccos{H}}\) and can be implemented via qubitization [33, 6] (note that if phase estimation is performed on a qubitization operator, the output state will have the form \(\frac{1}{\sqrt{2}}(\ket{E_j}\ket{0} \pm \ket{\phi_j 0^\perp})\), which reduces the success probability of obtaining the desired eigenstate by 50% [6]). The costs to implement \(U\) are inherited from the method used, based on the properties (commutativity, locality, number of terms, 1-norm, cost of coherently calculating coefficients) of the Hamiltonian in the chosen spin orbital basis.
  • Thermal states: Several quantum algorithms have been proposed for preparing thermal states [34, 35, 36, 37]. The most efficient algorithms typically make repeated calls to a block-encoding of the Hamiltonian. The complexity of these methods for concrete electronic structure problems of interest has not yet been determined. Thermal states could also be used as an approximation to the ground state, by choosing the temperature to be sufficiently low compared to the gap between the ground and first excited state [37].
  • Time evolved states: A time evolved state can be prepared using Hamiltonian simulation algorithms, up to an error \(\epsilon\). While many proposed quantum algorithms for chemistry simulation have considered using Hamiltonian simulation as a subroutine in quantum phase estimation, these have typically considered the use of Gaussian basis functions, which are not sufficiently flexible to accurately describe the time dynamics of the electrons. Classical algorithms for this task typically consider grid- or plane wave–based methods for dynamics simulations. Reference [38] compared the costs of Trotter-based methods [12] and prior work in the interaction picture [18, 9, 39] against classical mean-field methods, finding large polynomial speedups, even for this apples-to-oranges comparison.
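
To give a feel for how the quantities in the bullets above combine, the following back-of-the-envelope sketch estimates the cost of qubitization-based QPE using the commonly quoted \(\sim \pi \lambda / (2\epsilon)\) query count per run together with \(\sim 1/\gamma^2\) expected repetitions; all input numbers are illustrative placeholders rather than values taken from the references.

```python
# Back-of-the-envelope sketch: ground-state energy estimation via QPE on a
# qubitization walk operator. All inputs are illustrative placeholders.
import math

lam = 1500.0    # block-encoding normalization (coefficient 1-norm), in Hartree
eps = 1.6e-3    # target precision ("chemical accuracy", ~1.6 mHa)
gamma = 0.5     # overlap of the trial state with the desired eigenstate

walk_calls = math.ceil(math.pi * lam / (2 * eps))  # per QPE run (commonly used estimate)
repetitions = math.ceil(1 / gamma**2)              # expected runs without amplitude amplification

print(f"~{walk_calls:.2e} walk-operator calls per run, ~{repetitions} runs expected")
```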

Measuring observables:

In a fault-tolerant computation, it is preferable to measure observables through phase estimation-like approaches, rather than direct measurement averaging, as the former is asymptotically more efficient and can be made robust to logical errors through repetition and majority voting. Measurement schemes have been developed which achieve this using overlap estimation [40] (which can be viewed as a special case of amplitude estimation) or the approach of [41, 42] based on the quantum gradient estimation algorithm of [43]. Both approaches require access to a state preparation unitary \(U_\psi\), and its inverse². The algorithm based on overlap estimation can be formulated as performing amplitude estimation on \(U_O\), a unitary block-encoding of the observable \(O\) with subnormalization factor \(\alpha_O\). The complexity to compute the expectation value to precision \(\epsilon\) is \(\mathcal{O}\left( \alpha_O/\epsilon \right)\) calls to \(U_O\) and \(U_\psi\) (or the reflection \(R_\psi = I - 2 \ket{\psi}\bra{\psi}\)) and their inverses. This approach has been considered in the context of measuring: correlation functions, density of states, and linear response properties (all in [44]), and energy gradients with respect to various parameters (which can be used to compute forces or dipole moments, and for which a range of estimation strategies are possible) [45, 46].

The gradient-based algorithm simultaneously computes the value of \(M\) (noncommuting) observables \(O_j\) by making \(\widetilde{\mathcal{O}}\left( M^{1/2}/\epsilon \right)\) calls to \(U_\psi, U_\psi^\dag\) (or \(R_\psi\)) and either \(\widetilde{\mathcal{O}}\left( M^{3/2}/\epsilon \right)\) calls to gates of the form \(e^{i x O_j}\) [41] or \(\widetilde{\mathcal{O}}\left( M/\epsilon \right)\) calls to a block-encoding of the observables [42]. The algorithm also requires \(\mathcal{O}\left( M \log(1/\epsilon) \right)\) additional qubits. This approach has been considered in the context of measuring nuclear forces [45], fermionic reduced density matrices [41] and dynamic correlation functions [41].
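
As a rough point of comparison between these phase-estimation-like schemes and direct sampling of the observable, the sketch below counts the required repetitions in each case; the variance and subnormalization values are placeholders.

```python
# Sketch: number of state preparations needed to estimate <O> to precision eps
# by direct sampling versus amplitude-estimation-style readout. The variance
# and subnormalization below are illustrative placeholders.
import math

eps = 1e-3
var_O = 1.0       # single-shot variance of O (placeholder)
alpha_O = 2.0     # subnormalization of the block-encoding U_O (placeholder)

direct_shots = math.ceil(var_O / eps**2)    # ~ Var(O) / eps^2 repetitions
ae_calls = math.ceil(alpha_O / eps)         # ~ alpha_O / eps calls to U_O and U_psi

print(f"direct sampling: ~{direct_shots:.1e} shots")
print(f"amplitude estimation: ~{ae_calls:.1e} calls")
```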

Existing error corrected resource estimates

There are a large number of resource estimates for performing phase estimation to learn the ground state energies of molecular or material systems, which we list in Table 2 and Table 3. These resource estimates use compilation methods described in the fault-tolerant quantum computing section. We also note the existence of a software package that provides features for calculating the non-Clifford costs of quantum phase estimation for the electronic structure problem [47]. There are currently no results that provide resource estimates for solving a full end-to-end application (see caveats below).

| Molecule(s) | References | Number of logical qubits | Number of \(T\)/Toffoli gates |
| --- | --- | --- | --- |
| FeMo-co (Nitrogen fixation) | [28, 19, 20, 21, 48, 47] | \(2196\) [21], \(\sim 193\) [48] | \(3.2 \times 10^{10}\) [21], \(\sim 5 \times 10^{11}\) [48] |
| Cytochrome P450 (Biological drug metabolizing enzyme) | [49] | \(1434\) | \(7.8 \times 10^{9}\) |
| Lithium-ion battery molecules | [50, 9] | \((10^4 - 10^5)\) [50], \((2000 - 3000)\) [9] | \((10^{12} - 10^{14})\) [50], \((10^{11} - 10^{12})\) [9] |
| Chromium dimer | [51] | \(\sim 1300\) | \(\sim 10^{10}\) |
| Ruthenium catalyst (CO\(_2\) fixation) | [20] | \(\sim 4000\) | \(\sim 3 \times 10^{10}\) |
| Ibrutinib (drug molecule) | [52] | \(2207\) | \(1.1 \times 10^{10}\) |

Table 2: Fault-tolerant resource estimates for quantum phase estimation applied to a range of molecular systems. The presented gate counts are for a single run of the phase estimation circuit. QPE must be run a number of times if the overlap is \(< 1\), and to account for rounding errors in phase estimation [53]. The molecules presented can have different numbers of electrons, orbitals, and classical simulation complexities, and so the results may not be directly comparable, even within a single row of the table.

| Material(s) | References | Number of logical qubits | Number of \(T\)/Toffoli gates |
| --- | --- | --- | --- |
| Homogeneous electron gas (Prototypical model) | [54, 55, 56, 9] | \((1500 - 5000)\) [9], \(\sim(100 - 1000)\) [54, 56] | \((10^9 - 10^{14})\) [9], \(\sim(10^8 - 10^{11})\) [54, 56] |
| Lithium-ion battery materials | [57, 58, 16] | \((2375 - 6652)\) [57], \(10^4\) [58], \((10^5 - 10^6)\) [16] | \((5 \times 10^{12} - 5 \times 10^{14})\) [57], \(10^{15}\) [58], \((10^{12} - 10^{14})\) [16] |
| Condensed phase elements (Lithium, Diamond, etc.) | [54, 55] | \(128\) [55] | \((10^8 - 10^{11})\) [55] |
| Transition metal catalysts (Nickel/Palladium oxide) | [15] | \(10^4 - 10^5\) | \(10^{10} - 10^{13}\) |

Table 3: Fault-tolerant resource estimates for quantum phase estimation applied to a range of material systems. The presented gate counts are for a single run of the phase estimation circuit. QPE must be run a number of times if the overlap is \(< 1\), and to account for rounding errors in phase estimation [53]. The systems presented in a given row may be different chemical compounds, and/or can have different numbers of electrons, orbitals, and classical simulation complexities, and so the results may not be directly comparable.

There have been comparatively few studies of the fault-tolerant resources required for the simulation of chemical dynamics. Recent work has computed the resources required to calculate the energy loss of charged particles moving through a medium ("stopping power"), as pertaining to nuclear fusion experiments [59]. End-to-end resource estimates were determined, including the costs of initial state preparation, measurement of observables, and repetitions across a range of parameters. The resource estimates for the end-to-end task ranged from \(\sim 2000\) logical qubits and \(\mathcal{O}\left( 10^{13} \right)\) Toffoli gates, to \(\sim 30000\) logical qubits and \(\mathcal{O}\left( 10^{17} \right)\) Toffoli gates.

Caveats

Existing resource estimates typically consider only a single run of phase estimation and assume that we have access to the desired energy eigenstate. As outlined above, both phase estimation and eigenstate filtering scale as \(\Omega\left( \gamma^{-1} \Delta^{-1} \right)\) when we have a lower bound on the gap. The "orthogonality catastrophe" suggests that the overlap of simple trial states with the desired eigenstate will decay exponentially as a function of system size. It is still an open question [23, 31] as to whether initial states with nonexponentially vanishing overlaps can be prepared for systems of interest. This issue may become more pressing for materials systems as we scale to the thermodynamic limit. In general, we know that the problem of finding the ground state of electronic structure Hamiltonians is QMA-hard [60], but it is not yet known if these complexity theoretic statements provide intuition for physically realistic Hamiltonians.

As noted above, to accurately resolve the system, a large basis set must be used (the discretization error decays as \(1/N\) where \(N\) is the number of spin orbitals considered). In practice, one typically repeats the calculation using increasingly accurate basis sets and then extrapolates to the continuum limit. Most quantum resource estimates to date have considered basis sets of the minimal allowable size (for exceptions, see [50, 9, 51, 56, 57, 58, 16, 59]), and so underestimate the resources required to achieve sufficiently accurate results to be informative.
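
The extrapolation to the continuum limit can be illustrated with a short sketch that fits the assumed \(E(N) \approx E_\infty + a/N\) form of the basis set error to a sequence of calculations; the energies below are made-up numbers used only for illustration.

```python
# Sketch: extrapolate finite-basis energies to the continuum limit, assuming
# the discretization error decays as 1/N. The energies are made-up numbers.
import numpy as np

N = np.array([50, 100, 200, 400], dtype=float)    # numbers of spin orbitals
E = np.array([-1.110, -1.140, -1.155, -1.1625])   # hypothetical energies (Hartree)

# Least-squares fit of E(N) = E_inf + a / N.
A = np.column_stack([np.ones_like(N), 1.0 / N])
(E_inf, a), *_ = np.linalg.lstsq(A, E, rcond=None)
print(f"extrapolated E_inf ~ {E_inf:.4f} Ha (fit coefficient a = {a:.3f})")
```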

The end-to-end applications typically solved in the electronic structure problem can require between tens (structure determination) and millions (molecular dynamics) of energy evaluations—each with different Hamiltonian parameters that may require preparing a new state to be measured. For example, a recent analysis of quantum algorithms applied to pharmaceutical chemistry [61] highlighted that to calculate the binding affinity between a drug molecule and its target (free energy differences) requires sampling a range of thermodynamic configurations, resulting in millions to billions of single-point energy evaluations. This introduces a large overhead when preparing a different state for each configuration and measuring its energy [45], although alternative approaches may provide more favorable scaling [62].
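
The scale of this overhead can be conveyed by a one-line estimate: multiplying a per-run gate count of the order quoted in Table 2 by an assumed number of single-point energy evaluations, as in the sketch below (both numbers are illustrative).

```python
# Sketch: naive end-to-end cost of a workflow that repeats state preparation
# and QPE for many nuclear configurations. Both inputs are illustrative.
toffolis_per_energy = 1e10   # order of one QPE run (cf. Table 2)
energy_evaluations = 1e6     # assumed number of single-point energies required

print(f"~{toffolis_per_energy * energy_evaluations:.0e} Toffoli gates in total")
```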

Comparable classical complexity and challenging instance sizes

The cost of exact diagonalization of the electronic structure Hamiltonian scales exponentially with the number of electrons and basis set size. As such, classical approaches to the electronic structure problem typically utilize a range of approximations that reduce their complexity to polynomial in an approximation parameter but introduce a (potentially uncontrolled) deviation from the exact ground state, leading to a bias in energy estimates and/or the expectation values of other observables. Approaches include: Hartree–Fock, density functional theory, perturbation theory, configuration interaction methods, coupled cluster methods, quantum Monte Carlo techniques, and tensor network approaches. The cheapest approaches can be applied to thousands of orbitals, but can be qualitatively inaccurate for strongly correlated systems. The most expensive approaches are more effective for strongly correlated systems, but their higher computational cost limits their applicability to roughly 100 spin orbitals. For example, [49] found that a density matrix renormalization group (DMRG) calculation performed on an 86 spin orbital active space of the Cytochrome P450 enzyme molecule referenced in Table 2 required around 50 hours, using 32 threads, 48 GB of RAM, and 235 GB of disk memory. We also refer to [63] for a comparison of 20 first-principles many-body electronic structure methods applied to a test set of seven transition metal atoms and their ions and monoxides.

Due to their extended nature, material systems are most commonly targeted with density functional theory (DFT). DFT can be applied to systems with thousands of electrons and orbitals, but can lead to uncontrolled energy bias in strongly correlated systems. Quantum Monte Carlo and tensor network methods have been successfully applied to prototypical models of material systems, and are becoming increasingly practical for more realistic models. We refer to [64, 65, 66, 67] for cutting edge benchmarks of classical electronic structure methods on hydrogen chains and Hubbard models scaling to the thermodynamic limit, which act as simplified models for real materials.

Speedup

It is nontrivial to determine the speedup of quantum algorithms for the electronic structure problem over their classical counterparts. If we consider the subtask of determining energy eigenstates, then for speedup greater than polynomial to be achieved, we require:

  • The ability to prepare a trial state with nonexponentially vanishing overlap with the ground state as the system size increases.
  • Polynomially scaling classical algorithms having an exponential growth in their approximation parameter (e.g., bond dimension, number of excitations) as the system size increases.

Whether these two requirements can coexist in systems of interest is an active area of research [31]. Even if exponential speedups are not available, it may be the case that quantum algorithms provide polynomial speedups over exact classical algorithms—and potentially over approximate classical algorithms.

From a complexity theoretic viewpoint, we know that simulating the dynamics of a quantum system is a BQP-complete problem [68]. Combined with the observed difficulty of classically simulating the time evolution of electronic structure Hamiltonians, this may be taken as evidence for the possibility of an exponential speedup when simulating dynamics. In [38] quantum algorithms for simulating the dynamics of electrons in a grid or plane-wave basis [12, 18, 9] were compared against classical methods for mean-field dynamics. Large polynomial speedups were observed, ranging from superquadratic to seventh power in the salient parameters, depending on the relation between \(N\) and \(\eta\).

NISQ implementations

Solving the electronic structure problem is one of the most widely studied and touted NISQ applications. The primary NISQ approach is the variational quantum eigensolver (VQE). There have been a number of experimental demonstrations on small molecules, e.g., Refs. [69, 70], as well as proposals to simulate material systems [71, 72]. Related methods, such as quantum computing assisted quantum Monte Carlo [73], have also been developed. Nevertheless, current device noise rates are too high to run circuits deep enough to outperform classical electronic structure methods. There is currently no evidence that heuristic NISQ approaches will be able to scale to large system sizes and provide advantage over classical methods. There have also been proposals to simulate the electronic structure problem using analog quantum simulators [74], though to the best of our knowledge, these have not yet been experimentally demonstrated.
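
For illustration, the sketch below carries out the VQE loop on a toy one-qubit "Hamiltonian" with placeholder coefficients, using a classical statevector to evaluate the energy; on hardware, the expectation value would instead be estimated from repeated measurements of the parameterized circuit.

```python
# Minimal VQE-style sketch on a toy one-qubit Hamiltonian with placeholder
# coefficients. The energy is evaluated with a classical statevector; a real
# VQE estimates it from measurements on a noisy quantum device.
import numpy as np
from scipy.optimize import minimize

Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
H = 0.5 * Z + 0.3 * X                          # placeholder Hamiltonian

def energy(theta):
    # Single-parameter Ry ansatz applied to |0>.
    psi = np.array([np.cos(theta[0] / 2.0), np.sin(theta[0] / 2.0)])
    return float(psi @ H @ psi)

result = minimize(energy, x0=[0.1])            # classical outer loop
print(result.fun, "vs exact", np.linalg.eigvalsh(H)[0])
```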

Outlook

Solving the electronic structure problem has repeatedly been identified as one of the most promising applications for quantum computers. Nevertheless, the discussion above highlights a number of challenges that current quantum approaches must overcome to become practical. Most notably, once the approximations typically made in resource estimates are removed (i.e., once the cost of initial state preparation is included, nonminimal basis sets are used, and repetitions for correctness checking and for sampling a range of parameters are accounted for), a large number of logical qubits and \(T\)/Toffoli gates is required. A major difficulty is that, unlike problems such as factoring, the end-to-end electronic structure problem typically requires solving a large number of closely related problem instances.

Solving the electronic structure problem for materials is likely to be more difficult than for molecules for both classical and quantum algorithms. This is predominantly due to the larger system sizes considered. First quantized quantum algorithms may provide a promising approach to efficiently represent the large system sizes required, and their natural use of a plane wave basis is well suited to periodic material systems [9]. Nevertheless, additional developments are required to understand how to best apply these algorithms to real systems [58].

Bibliography

  1. Bern Kohler, Jeffrey L. Krause, Ferenc Raksi, Kent R. Wilson, Vladislav V. Yakovlev, Robert M. Whitnell, and YiJing Yan. Controlling the future of matter. Accounts of Chemical Research, 28(3):133–140, 1995. URL: https://doi.org/10.1021/ar00051a006, arXiv:https://doi.org/10.1021/ar00051a006, doi:10.1021/ar00051a006.

  2. A. Assion, T. Baumert, M. Bergt, T. Brixner, B. Kiefer, V. Seyfried, M. Strehle, and G. Gerber. Control of chemical reactions by feedback-optimized phase-shaped femtosecond laser pulses. Science, 282(5390):919–922, 1998. URL: https://www.science.org/doi/abs/10.1126/science.282.5390.919, arXiv:https://www.science.org/doi/pdf/10.1126/science.282.5390.919, doi:10.1126/science.282.5390.919.

  3. Ferenc Krausz and Misha Ivanov. Attosecond physics. Reviews of Modern Physics, 81:163–234, 2 2009. URL: https://link.aps.org/doi/10.1103/RevModPhys.81.163, doi:10.1103/RevModPhys.81.163.

  4. Asger Halkier, Trygve Helgaker, Poul Jørgensen, Wim Klopper, Henrik Koch, Jeppe Olsen, and Angela K. Wilson. Basis-set convergence in correlated calculations on Ne, N\(_2\), and H\(_2\)O. Chemical Physics Letters, 286(3):243–252, 1998. URL: https://www.sciencedirect.com/science/article/pii/S0009261498001110, doi:10.1016/S0009-2614(98)00111-0.

  5. James J. Shepherd, Andreas Grüneis, George H. Booth, Georg Kresse, and Ali Alavi. Convergence of many-body wave-function expansions using a plane-wave basis: from homogeneous electron gas to solid state systems. Physical Review B, 86:035111, 7 2012. arXiv: https://arxiv.org/abs/1202.4990. URL: https://link.aps.org/doi/10.1103/PhysRevB.86.035111, doi:10.1103/PhysRevB.86.035111.

  6. Dominic W. Berry, Mária Kieferová, Artur Scherer, Yuval R. Sanders, Guang Hao Low, Nathan Wiebe, Craig Gidney, and Ryan Babbush. Improved techniques for preparing eigenstates of fermionic hamiltonians. npj Quantum Information, 4(1):22, 5 2018. arXiv: https://arxiv.org/abs/1711.10460. URL: https://doi.org/10.1038/s41534-018-0071-5, doi:10.1038/s41534-018-0071-5.

  7. Sam McArdle, Suguru Endo, Alán Aspuru-Guzik, Simon C. Benjamin, and Xiao Yuan. Quantum computational chemistry. Reviews of Modern Physics, 92:015003, 3 2020. arXiv: https://arxiv.org/abs/1808.10402. URL: https://link.aps.org/doi/10.1103/RevModPhys.92.015003, doi:10.1103/RevModPhys.92.015003.

  8. Ryan Babbush, Dominic W Berry, Yuval R Sanders, Ian D Kivlichan, Artur Scherer, Annie Y Wei, Peter J Love, and Alán Aspuru-Guzik. Exponentially more precise quantum simulation of fermions in the configuration interaction representation. Quantum Science and Technology, 3(1):015006, 2017. arXiv: https://arxiv.org/abs/1506.01029. doi:10.1088/2058-9565/aa9463.

  9. Yuan Su, Dominic W Berry, Nathan Wiebe, Nicholas Rubin, and Ryan Babbush. Fault-tolerant quantum simulations of chemistry in first quantization. PRX Quantum, 2(4):040332, 2021. arXiv: https://arxiv.org/abs/2105.12767. doi:10.1103/PRXQuantum.2.040332.

  10. Hans Hon Sang Chan, Richard Meister, Tyson Jones, David P. Tew, and Simon C. Benjamin. Grid-based methods for chemistry simulations on a quantum computer. Science Advances, 9(9):eabo7484, 2023. arXiv: https://arxiv.org/abs/2202.05864. URL: https://www.science.org/doi/abs/10.1126/sciadv.abo7484, arXiv:https://www.science.org/doi/pdf/10.1126/sciadv.abo7484, doi:10.1126/sciadv.abo7484.

  11. Ian D Kivlichan, Nathan Wiebe, Ryan Babbush, and Alán Aspuru-Guzik. Bounding the costs of quantum simulation of many-body physics in real space. Journal of Physics A: Mathematical and Theoretical, 50(30):305301, 6 2017. arXiv: https://arxiv.org/abs/1608.05696. URL: https://dx.doi.org/10.1088/1751-8121/aa77b8, doi:10.1088/1751-8121/aa77b8.

  12. Ivan Kassal, Stephen P. Jordan, Peter J. Love, Masoud Mohseni, and Alán Aspuru-Guzik. Polynomial-time quantum algorithm for the simulation of chemical dynamics. Proceedings of the National Academy of Sciences, 105(48):18681–18686, 2008. arXiv: https://arxiv.org/abs/0801.2986. URL: https://www.pnas.org/doi/abs/10.1073/pnas.0808245105, arXiv:https://www.pnas.org/doi/pdf/10.1073/pnas.0808245105, doi:10.1073/pnas.0808245105.

  13. James D. Whitfield, Jacob Biamonte, and Alán Aspuru-Guzik. Simulation of electronic structure hamiltonians using quantum computers. Molecular Physics, 109(5):735–750, 2011. arXiv: https://arxiv.org/abs/1001.3855. URL: https://doi.org/10.1080/00268976.2011.552441, arXiv:https://doi.org/10.1080/00268976.2011.552441, doi:10.1080/00268976.2011.552441.

  14. Ryan Babbush, Nathan Wiebe, Jarrod McClean, James McClain, Hartmut Neven, and Garnet Kin-Lic Chan. Low-depth quantum simulation of materials. Physical Review X, 8(1):011044, 2018. URL: https://doi.org/10.1103/PhysRevX.8.011044, doi:10.1103/PhysRevX.8.011044.

  15. Aleksei V. Ivanov, Christoph Sünderhauf, Nicole Holzmann, Tom Ellaby, Rachel N. Kerber, Glenn Jones, and Joan Camps. Quantum computation for periodic solids in second quantization. Physical Review Research, 5:013200, 3 2023. arXiv: https://arxiv.org/abs/2210.02403. URL: https://link.aps.org/doi/10.1103/PhysRevResearch.5.013200, doi:10.1103/PhysRevResearch.5.013200.

  16. Nicholas C Rubin, Dominic W Berry, Fionn D Malone, Alec F White, Tanuj Khattar, A Eugene DePrince III, Sabrina Sicolo, Michael Kühn, Michael Kaicher, Joonho Lee, and others. Fault-tolerant quantum simulation of materials using bloch orbitals. arXiv: https://arxiv.org/abs/2302.05531, 2023.

  17. Ryan Babbush, Dominic W Berry, Ian D Kivlichan, Annie Y Wei, Peter J Love, and Alán Aspuru-Guzik. Exponentially more precise quantum simulation of fermions in second quantization. New Journal of Physics, 18(3):033032, 3 2016. arXiv: https://arxiv.org/abs/1506.01020. URL: https://dx.doi.org/10.1088/1367-2630/18/3/033032, doi:10.1088/1367-2630/18/3/033032.

  18. Ryan Babbush, Dominic W. Berry, Jarrod R. McClean, and Hartmut Neven. Quantum simulation of chemistry with sublinear scaling in basis size. npj Quantum Information, 5(1):92, 11 2019. arXiv: https://arxiv.org/abs/1807.09802. URL: https://doi.org/10.1038/s41534-019-0199-y, doi:10.1038/s41534-019-0199-y.

  19. Dominic W. Berry, Craig Gidney, Mario Motta, Jarrod R. McClean, and Ryan Babbush. Qubitization of arbitrary basis quantum chemistry leveraging sparsity and low rank factorization. Quantum, 3:208, 12 2019. arXiv: https://arxiv.org/abs/1902.02134. URL: https://doi.org/10.22331/q-2019-12-02-208, doi:10.22331/q-2019-12-02-208.

  20. Vera von Burg, Guang Hao Low, Thomas Häner, Damian S. Steiger, Markus Reiher, Martin Roetteler, and Matthias Troyer. Quantum computing enhanced computational catalysis. Physical Review Research, 3(3):033055, 2021. arXiv: https://arxiv.org/abs/2007.14460. doi:10.1103/PhysRevResearch.3.033055.

  21. Joonho Lee, Dominic W Berry, Craig Gidney, William J Huggins, Jarrod R McClean, Nathan Wiebe, and Ryan Babbush. Even more efficient quantum computations of chemistry through tensor hypercontraction. PRX Quantum, 2(3):030305, 2021. arXiv: https://arxiv.org/abs/2011.03494. doi:10.1103/PRXQuantum.2.030305.

  22. Mario Motta, Erika Ye, Jarrod R McClean, Zhendong Li, Austin J Minnich, Ryan Babbush, and Garnet Kin Chan. Low rank representations for quantum simulation of electronic structure. npj Quantum Information, 7(1):1–7, 2021. arXiv: https://arxiv.org/abs/1808.02625. URL: https://www.nature.com/articles/s41534-021-00416-z, doi:https://doi.org/10.1038/s41534-021-00416-z.

  23. Norm M. Tubman, Carlos Mejuto-Zaera, Jeffrey M. Epstein, Diptarka Hait, Daniel S. Levine, William Huggins, Zhang Jiang, Jarrod R. McClean, Ryan Babbush, Martin Head-Gordon, and K. Birgitta Whaley. Postponing the orthogonality catastrophe: efficient state preparation for electronic structure simulations on quantum devices. arXiv: https://arxiv.org/abs/1809.05523, 2018.

  24. Kenji Sugisaki, Shigeaki Nakazawa, Kazuo Toyota, Kazunobu Sato, Daisuke Shiomi, and Takeji Takui. Quantum chemistry on quantum computers: a method for preparation of multiconfigurational wave functions on quantum computers without performing post-hartree–fock calculations. ACS Central Science, 5(1):167–175, 2019. URL: https://doi.org/10.1021/acscentsci.8b00788, arXiv:https://doi.org/10.1021/acscentsci.8b00788, doi:10.1021/acscentsci.8b00788.

  25. Lin Lin and Yu Tong. Optimal polynomial based quantum eigenstate filtering with application to solving quantum linear systems. Quantum, 4:361, 2020. arXiv: https://arxiv.org/abs/1910.14596. doi:10.22331/q-2020-11-11-361.

  26. Lin Lin and Yu Tong. Near-optimal ground state preparation. Quantum, 4:372, 2020. arXiv: https://arxiv.org/abs/2002.12508. doi:10.22331/q-2020-12-14-372.

  27. Yimin Ge, Jordi Tura, and J. Ignacio Cirac. Faster ground state preparation and high-precision ground energy estimation with fewer qubits. Journal of Mathematical Physics, 60(2):022202, 2019. arXiv: https://arxiv.org/abs/1712.03193. doi:10.1063/1.5027484.

  28. Markus Reiher, Nathan Wiebe, Krysta M. Svore, Dave Wecker, and Matthias Troyer. Elucidating reaction mechanisms on quantum computers. Proceedings of the National Academy of Sciences, 114(29):7555–7560, 2017. arXiv: https://arxiv.org/abs/1605.03590. URL: https://www.pnas.org/doi/abs/10.1073/pnas.1619152114, arXiv:https://www.pnas.org/doi/pdf/10.1073/pnas.1619152114, doi:10.1073/pnas.1619152114.

  29. Libor Veis and Jiří Pittner. Adiabatic state preparation study of methylene. The Journal of Chemical Physics, 140(21):214111, 2014. arXiv: https://arxiv.org/abs/1401.3186. URL: https://doi.org/10.1063/1.4880755, doi:10.1063/1.4880755.

  30. Vladimir Kremenetski, Carlos Mejuto-Zaera, Stephen J. Cotton, and Norm M. Tubman. Simulation of adiabatic quantum computing for molecular ground states. The Journal of Chemical Physics, 155(23):234106, 2021. arXiv: https://arxiv.org/abs/2103.12059. URL: https://doi.org/10.1063/5.0060124, arXiv:https://doi.org/10.1063/5.0060124, doi:10.1063/5.0060124.

  31. Seunghoon Lee, Joonho Lee, Huanchen Zhai, Yu Tong, Alexander M. Dalzell, Ashutosh Kumar, Phillip Helms, Johnnie Gray, Zhi-Hao Cui, Wenyuan Liu, Michael Kastoryano, Ryan Babbush, John Preskill, David R. Reichman, Earl T. Campbell, Edward F. Valeev, Lin Lin, and Garnet Kin-Lic Chan. Evaluating the evidence for exponential quantum advantage in ground-state quantum chemistry. Nature Communications, 14(1):1952, 2023. arXiv: https://arxiv.org/abs/2208.02199. URL: https://doi.org/10.1038/s41467-023-37587-6, doi:10.1038/s41467-023-37587-6.

  32. Kenji Sugisaki, Kazuo Toyota, Kazunobu Sato, Daisuke Shiomi, and Takeji Takui. Adiabatic state preparation of correlated wave functions with nonlinear scheduling functions and broken-symmetry wave functions. Communications Chemistry, 5(1):84, 7 2022. URL: https://doi.org/10.1038/s42004-022-00701-8, doi:10.1038/s42004-022-00701-8.

  33. David Poulin, Alexei Kitaev, Damian S. Steiger, Matthew B. Hastings, and Matthias Troyer. Quantum algorithm for spectral measurement with a lower gate count. Physical Review Letters, 121:010501, 7 2018. arXiv: https://arxiv.org/abs/1711.11025. URL: https://link.aps.org/doi/10.1103/PhysRevLett.121.010501, doi:10.1103/PhysRevLett.121.010501.

  34. David Poulin and Pawel Wocjan. Sampling from the thermal quantum gibbs state and evaluating partition functions with a quantum computer. Physical Review Letters, 103(22):220502, 2009. arXiv: https://arxiv.org/abs/0905.2199. doi:10.1103/PhysRevLett.103.220502.

  35. Anirban Narayan Chowdhury and Rolando D. Somma. Quantum algorithms for gibbs sampling and hitting-time estimation. Quantum Information and Computation, 17(1&2):41–64, 2017. arXiv: https://arxiv.org/abs/1603.02940. doi:10.26421/QIC17.1-2.

  36. K. Temme, T. J. Osborne, K. G. Vollbrecht, D. Poulin, and F. Verstraete. Quantum metropolis sampling. Nature, 471(7336):87–90, 3 2011. arXiv: https://arxiv.org/abs/0911.3635. doi:10.1038/nature09770.

  37. Chi-Fang Chen, Michael J. Kastoryano, Fernando G. S. L. Brandão, and András Gilyén. Quantum thermal state preparation. arXiv: https://arxiv.org/abs/2303.18224, 2023.

  38. Ryan Babbush, William J. Huggins, Dominic W. Berry, Shu Fay Ung, Andrew Zhao, David R. Reichman, Hartmut Neven, Andrew D. Baczewski, and Joonho Lee. Quantum simulation of exact electron dynamics can be more efficient than classical mean-field methods. Nature Communications, 14(1):4058, 7 2023. URL: https://doi.org/10.1038/s41467-023-39024-0, doi:10.1038/s41467-023-39024-0.

  39. Guang Hao Low and Nathan Wiebe. Hamiltonian simulation in the interaction picture. arXiv: https://arxiv.org/abs/1805.00675, 2018.

  40. Emanuel Knill, Gerardo Ortiz, and Rolando D. Somma. Optimal quantum measurements of expectation values of observables. Physical Review A, 75:012328, 1 2007. arXiv: https://arxiv.org/abs/quant-ph/0607019. URL: https://link.aps.org/doi/10.1103/PhysRevA.75.012328, doi:10.1103/PhysRevA.75.012328.

  41. William J. Huggins, Kianna Wan, Jarrod McClean, Thomas E. O'Brien, Nathan Wiebe, and Ryan Babbush. Nearly optimal quantum algorithm for estimating multiple expectation values. Physical Review Letters, 129:240501, 12 2022. arXiv: https://arxiv.org/abs/2111.09283. URL: https://link.aps.org/doi/10.1103/PhysRevLett.129.240501, doi:10.1103/PhysRevLett.129.240501.

  42. Joran van Apeldoorn, Arjan Cornelissen, András Gilyén, and Giacomo Nannicini. Quantum tomography using state-preparation unitaries. In Proceedings of the 34th ACM-SIAM Symposium on Discrete Algorithms (SODA), 1265–1318. 2023. arXiv: https://arxiv.org/abs/2207.08800. doi:10.1137/1.9781611977554.ch47.

  43. András Gilyén, Srinivasan Arunachalam, and Nathan Wiebe. Optimizing quantum optimization algorithms via faster quantum gradient computation. In Proceedings of the 30th ACM-SIAM Symposium on Discrete Algorithms (SODA), 1425–1444. 2019. arXiv: https://arxiv.org/abs/1711.00465. doi:10.1137/1.9781611975482.87.

  44. Patrick Rall. Quantum algorithms for estimating physical quantities using block encodings. Physical Review A, 102:022408, 8 2020. arXiv: https://arxiv.org/abs/2004.06832. URL: https://link.aps.org/doi/10.1103/PhysRevA.102.022408, doi:10.1103/PhysRevA.102.022408.

  45. Thomas E. O'Brien, Michael Streif, Nicholas C. Rubin, Raffaele Santagati, Yuan Su, William J. Huggins, Joshua J. Goings, Nikolaj Moll, Elica Kyoseva, Matthias Degroote, Christofer S. Tautermann, Joonho Lee, Dominic W. Berry, Nathan Wiebe, and Ryan Babbush. Efficient quantum computation of molecular forces and other energy gradients. Physical Review Research, 4:043210, 12 2022. arXiv: https://arxiv.org/abs/2111.12437. URL: https://link.aps.org/doi/10.1103/PhysRevResearch.4.043210, doi:10.1103/PhysRevResearch.4.043210.

  46. Mark Steudtner, Sam Morley-Short, William Pol, Sukin Sim, Cristian L Cortes, Matthias Loipersberger, Robert M Parrish, Matthias Degroote, Nikolaj Moll, Raffaele Santagati, and others. Fault-tolerant quantum computation of molecular observables. arXiv: https://arxiv.org/abs/2303.14118, 2023.

  47. Pablo A. M. Casares, Roberto Campos, and M. A. Martin-Delgado. Tfermion: a non-clifford gate cost assessment library of quantum phase estimation algorithms for quantum chemistry. Quantum, 6:768, 7 2022. arXiv: https://arxiv.org/abs/2110.05899. URL: https://doi.org/10.22331/q-2022-07-20-768, doi:10.22331/q-2022-07-20-768.

  48. Kianna Wan, Mario Berta, and Earl T. Campbell. Randomized quantum algorithm for statistical phase estimation. Physical Review Letters, 129:030503, 7 2022. arXiv: https://arxiv.org/abs/2110.12071. URL: https://link.aps.org/doi/10.1103/PhysRevLett.129.030503, doi:10.1103/PhysRevLett.129.030503.

  49. Joshua J Goings, Alec White, Joonho Lee, Christofer S Tautermann, Matthias Degroote, Craig Gidney, Toru Shiozaki, Ryan Babbush, and Nicholas C Rubin. Reliably assessing the electronic structure of cytochrome p450 on today's classical computers and tomorrow's quantum computers. Proceedings of the National Academy of Sciences, 119(38):e2203533119, 2022. arXiv: https://arxiv.org/abs/2202.01244. doi:10.1073/pnas.2203533119.

  50. Isaac H. Kim, Ye-Hua Liu, Sam Pallister, William Pol, Sam Roberts, and Eunseok Lee. Fault-tolerant resource estimate for quantum chemical simulations: case study on li-ion battery electrolyte molecules. Physical Review Research, 4:023019, 4 2022. arXiv: https://arxiv.org/abs/2104.10653. URL: https://link.aps.org/doi/10.1103/PhysRevResearch.4.023019, doi:10.1103/PhysRevResearch.4.023019.

  51. Vincent E Elfving, Benno W Broer, Mark Webber, Jacob Gavartin, Mathew D Halls, K Patrick Lorton, and A Bochevarov. How will quantum computers provide an industrially relevant computational advantage in quantum chemistry? arXiv: https://arxiv.org/abs/2009.12472, 2020.

  52. Nick S. Blunt, Joan Camps, Ophelia Crawford, Róbert Izsák, Sebastian Leontica, Arjun Mirani, Alexandra E. Moylett, Sam A. Scivier, Christoph Sünderhauf, Patrick Schopf, Jacob M. Taylor, and Nicole Holzmann. Perspective on the current state-of-the-art of quantum computing for drug discovery applications. Journal of Chemical Theory and Computation, 18(12):7001–7023, 2022. arXiv: https://arxiv.org/abs/2206.00551. doi:10.1021/acs.jctc.2c00574.

  53. Michael A. Nielsen and Isaac L. Chuang. Quantum computation and quantum information. Cambridge University Press, 2000. doi:10.1017/CBO9780511976667.

  54. Ryan Babbush, Craig Gidney, Dominic W. Berry, Nathan Wiebe, Jarrod McClean, Alexandru Paler, Austin Fowler, and Hartmut Neven. Encoding electronic spectra in quantum circuits with linear t complexity. Physical Review X, 8(4):041015, 2018. arXiv: https://arxiv.org/abs/1805.03662. doi:10.1103/PhysRevX.8.041015.

  55. Ian D. Kivlichan, Craig Gidney, Dominic W. Berry, Nathan Wiebe, Jarrod McClean, Wei Sun, Zhang Jiang, Nicholas Rubin, Austin Fowler, Alán Aspuru-Guzik, Hartmut Neven, and Ryan Babbush. Improved fault-tolerant quantum simulation of condensed-phase correlated electrons via trotterization. Quantum, 4:296, 7 2020. arXiv: https://arxiv.org/abs/1902.10673. URL: https://doi.org/10.22331/q-2020-07-16-296, doi:10.22331/q-2020-07-16-296.

  56. Sam McArdle, Earl Campbell, and Yuan Su. Exploiting fermion number in factorized decompositions of the electronic structure hamiltonian. Physical Review A, 105:012403, 1 2022. arXiv: https://arxiv.org/abs/2107.07238. URL: https://link.aps.org/doi/10.1103/PhysRevA.105.012403, doi:10.1103/PhysRevA.105.012403.

  57. Alain Delgado, Pablo A. M. Casares, Roberto dos Reis, Modjtaba Shokrian Zini, Roberto Campos, Norge Cruz-Hernández, Arne-Christian Voigt, Angus Lowe, Soran Jahangiri, M. A. Martin-Delgado, Jonathan E. Mueller, and Juan Miguel Arrazola. Simulating key properties of lithium-ion batteries with a fault-tolerant quantum computer. Physical Review A, 106:032428, 9 2022. arXiv: https://arxiv.org/abs/2204.11890. URL: https://link.aps.org/doi/10.1103/PhysRevA.106.032428, doi:10.1103/PhysRevA.106.032428.

  58. Modjtaba Shokrian Zini, Alain Delgado, Roberto dos Reis, Pablo Antonio Moreno Casares, Jonathan E. Mueller, Arne-Christian Voigt, and Juan Miguel Arrazola. Quantum simulation of battery materials using ionic pseudopotentials. Quantum, 7:1049, 7 2023. arXiv: https://arxiv.org/abs/2302.07981. URL: https://doi.org/10.22331/q-2023-07-10-1049, doi:10.22331/q-2023-07-10-1049.

  59. Nicholas C Rubin, Dominic W Berry, Alina Kononov, Fionn D Malone, Tanuj Khattar, Alec White, Joonho Lee, Hartmut Neven, Ryan Babbush, and Andrew D Baczewski. Quantum computation of stopping power for inertial fusion target design. arXiv: https://arxiv.org/abs/2308.12352, 2023.

  60. James Daniel Whitfield, Peter John Love, and Alan Aspuru-Guzik. Computational complexity in electronic structure. Phys. Chem. Chem. Phys., 15:397–411, 2013. arXiv: https://arxiv.org/abs/1208.3334. URL: http://dx.doi.org/10.1039/C2CP42695A, doi:10.1039/C2CP42695A.

  61. Raffaele Santagati, Alan Aspuru-Guzik, Ryan Babbush, Matthias Degroote, Leticia Gonzalez, Elica Kyoseva, Nikolaj Moll, Markus Oppel, Robert M Parrish, Nicholas C Rubin, and others. Drug design on quantum computers. arXiv: https://arxiv.org/abs/2301.04114, 2023.

  62. Sophia Simon, Raffaele Santagati, Matthias Degroote, Nikolaj Moll, Michael Streif, and Nathan Wiebe. Improved precision scaling for simulating coupled quantum-classical dynamics. arXiv: https://arxiv.org/abs/2307.13033, 2023.

  63. Kiel T. Williams, Yuan Yao, Jia Li, Li Chen, Hao Shi, Mario Motta, Chunyao Niu, Ushnish Ray, Sheng Guo, Robert J. Anderson, Junhao Li, Lan Nguyen Tran, Chia-Nan Yeh, Bastien Mussard, Sandeep Sharma, Fabien Bruneval, Mark van Schilfgaarde, George H. Booth, Garnet Kin-Lic Chan, Shiwei Zhang, Emanuel Gull, Dominika Zgid, Andrew Millis, Cyrus J. Umrigar, and Lucas K. Wagner. Direct comparison of many-body methods for realistic electronic hamiltonians. Physical Review X, 10:011041, 2 2020. arXiv: https://arxiv.org/abs/1910.00045. URL: https://link.aps.org/doi/10.1103/PhysRevX.10.011041, doi:10.1103/PhysRevX.10.011041.

  64. J. P. F. LeBlanc, Andrey E. Antipov, Federico Becca, Ireneusz W. Bulik, Garnet Kin-Lic Chan, Chia-Min Chung, Youjin Deng, Michel Ferrero, Thomas M. Henderson, Carlos A. Jiménez-Hoyos, E. Kozik, Xuan-Wen Liu, Andrew J. Millis, N. V. Prokof'ev, Mingpu Qin, Gustavo E. Scuseria, Hao Shi, B. V. Svistunov, Luca F. Tocchio, I. S. Tupitsyn, Steven R. White, Shiwei Zhang, Bo-Xiao Zheng, Zhenyue Zhu, and Emanuel Gull. Solutions of the two-dimensional hubbard model: benchmarks and results from a wide range of numerical algorithms. Physical Review X, 5:041041, 12 2015. arXiv: https://arxiv.org/abs/1505.02290. URL: https://link.aps.org/doi/10.1103/PhysRevX.5.041041, doi:10.1103/PhysRevX.5.041041.

  65. Mario Motta, David M. Ceperley, Garnet Kin-Lic Chan, John A. Gomez, Emanuel Gull, Sheng Guo, Carlos A. Jiménez-Hoyos, Tran Nguyen Lan, Jia Li, Fengjie Ma, Andrew J. Millis, Nikolay V. Prokof'ev, Ushnish Ray, Gustavo E. Scuseria, Sandro Sorella, Edwin M. Stoudenmire, Qiming Sun, Igor S. Tupitsyn, Steven R. White, Dominika Zgid, and Shiwei Zhang. Towards the solution of the many-electron problem in real materials: equation of state of the hydrogen chain with state-of-the-art many-body methods. Physical Review X, 7:031059, 9 2017. arXiv: https://arxiv.org/abs/1705.01608. URL: https://link.aps.org/doi/10.1103/PhysRevX.7.031059, doi:10.1103/PhysRevX.7.031059.

  66. Mario Motta, Claudio Genovese, Fengjie Ma, Zhi-Hao Cui, Randy Sawaya, Garnet Kin-Lic Chan, Natalia Chepiga, Phillip Helms, Carlos Jiménez-Hoyos, Andrew J. Millis, Ushnish Ray, Enrico Ronca, Hao Shi, Sandro Sorella, Edwin M. Stoudenmire, Steven R. White, and Shiwei Zhang. Ground-state properties of the hydrogen chain: dimerization, insulator-to-metal transition, and magnetic phases. Physical Review X, 10:031058, 9 2020. arXiv: https://arxiv.org/abs/1911.01618. URL: https://link.aps.org/doi/10.1103/PhysRevX.10.031058, doi:10.1103/PhysRevX.10.031058.

  67. Thomas Schäfer, Nils Wentzell, Fedor Šimkovic, Yuan-Yao He, Cornelia Hille, Marcel Klett, Christian J. Eckhardt, Behnam Arzhang, Viktor Harkov, Fran ç çois-Marie Le Régent, Alfred Kirsch, Yan Wang, Aaram J. Kim, Evgeny Kozik, Evgeny A. Stepanov, Anna Kauch, Sabine Andergassen, Philipp Hansmann, Daniel Rohe, Yuri M. Vilk, James P. F. LeBlanc, Shiwei Zhang, A.-M. S. Tremblay, Michel Ferrero, Olivier Parcollet, and Antoine Georges. Tracking the footprints of spin fluctuations: a multimethod, multimessenger study of the two-dimensional hubbard model. Physical Review X, 11:011058, 3 2021. arXiv: https://arxiv.org/abs/2006.10769. URL: https://link.aps.org/doi/10.1103/PhysRevX.11.011058, doi:10.1103/PhysRevX.11.011058.

  68. Seth Lloyd. Universal quantum simulators. Science, 273(5278):1073–1078, 1996. doi:10.1126/science.273.5278.1073.

  69. Abhinav Kandala, Antonio Mezzacapo, Kristan Temme, Maika Takita, Markus Brink, Jerry M. Chow, and Jay M. Gambetta. Hardware-efficient variational quantum eigensolver for small molecules and quantum magnets. Nature, 549(7671):242–246, 9 2017. arXiv: https://arxiv.org/abs/1704.05018. URL: https://doi.org/10.1038/nature23879, doi:10.1038/nature23879.

  70. Google AI Quantum, Frank Arute, Kunal Arya, Ryan Babbush, Dave Bacon, Joseph C. Bardin, Rami Barends, Sergio Boixo, Michael Broughton, Bob B. Buckley, David A. Buell, Brian Burkett, Nicholas Bushnell, Yu Chen, Zijun Chen, Benjamin Chiaro, Roberto Collins, William Courtney, Sean Demura, Andrew Dunsworth, Edward Farhi, Austin Fowler, Brooks Foxen, Craig Gidney, Marissa Giustina, Rob Graff, Steve Habegger, Matthew P. Harrigan, Alan Ho, Sabrina Hong, Trent Huang, William J. Huggins, Lev Ioffe, Sergei V. Isakov, Evan Jeffrey, Zhang Jiang, Cody Jones, Dvir Kafri, Kostyantyn Kechedzhi, Julian Kelly, Seon Kim, Paul V. Klimov, Alexander Korotkov, Fedor Kostritsa, David Landhuis, Pavel Laptev, Mike Lindmark, Erik Lucero, Orion Martin, John M. Martinis, Jarrod R. McClean, Matt McEwen, Anthony Megrant, Xiao Mi, Masoud Mohseni, Wojciech Mruczkiewicz, Josh Mutus, Ofer Naaman, Matthew Neeley, Charles Neill, Hartmut Neven, Murphy Yuezhen Niu, Thomas E. O'Brien, Eric Ostby, Andre Petukhov, Harald Putterman, Chris Quintana, Pedram Roushan, Nicholas C. Rubin, Daniel Sank, Kevin J. Satzinger, Vadim Smelyanskiy, Doug Strain, Kevin J. Sung, Marco Szalay, Tyler Y. Takeshita, Amit Vainsencher, Theodore White, Nathan Wiebe, Z. Jamie Yao, Ping Yeh, and Adam Zalcman. Hartree–fock on a superconducting qubit quantum computer. Science, 369(6507):1084–1089, 2020. arXiv: https://arxiv.org/abs/2004.04174. URL: https://www.science.org/doi/abs/10.1126/science.abb9811, arXiv:https://www.science.org/doi/pdf/10.1126/science.abb9811, doi:10.1126/science.abb9811.

  71. Nobuyuki Yoshioka, Takeshi Sato, Yuya O. Nakagawa, Yu-ya Ohnishi, and Wataru Mizukami. Variational quantum simulation for periodic materials. Physical Review Research, 4:013052, 1 2022. arXiv: https://arxiv.org/abs/2008.09492. URL: https://link.aps.org/doi/10.1103/PhysRevResearch.4.013052, doi:10.1103/PhysRevResearch.4.013052.

  72. David Zsolt Manrique, Irfan T Khan, Kentaro Yamamoto, Vijja Wichitwechkarn, and David Munoz Ramo. Momentum-space unitary coupled cluster and translational quantum subspace expansion for periodic systems on quantum computers. arXiv: https://arxiv.org/abs/2008.08694, 2020.

  73. William J. Huggins, Bryan A. O'Gorman, Nicholas C. Rubin, David R. Reichman, Ryan Babbush, and Joonho Lee. Unbiasing fermionic quantum monte carlo with a quantum computer. Nature, 603(7901):416–420, 3 2022. arXiv: https://arxiv.org/abs/2106.16235. URL: https://doi.org/10.1038/s41586-021-04351-z, doi:10.1038/s41586-021-04351-z.

  74. Javier Argüello-Luengo, Alejandro González-Tudela, Tao Shi, Peter Zoller, and J. Ignacio Cirac. Analogue quantum chemistry simulation. Nature, 574(7777):215–218, 10 2019. arXiv: https://arxiv.org/abs/1807.09228. URL: https://doi.org/10.1038/s41586-019-1614-4, doi:10.1038/s41586-019-1614-4.


  1. This reference is not technically a first quantized representation, as antisymmetry is stored in the operators rather than the wavefunction, but it stores states in an analogously compressed way to first quantized representations. 

  2. Note that it can be substantially cheaper to directly execute the reflection \(R_\psi = I - 2 \ket{\psi}\bra{\psi}\) used in both methods, rather than through the use of \(U_\psi\), as the complexity of \(R_\psi\) does not depend on the overlap \(\gamma\) that appears in state preparation—see [26] for additional discussion.