
Showing papers by "William G. Macready published in 2012"


Posted Content
TL;DR: The architecture of the D-Wave One machine is described, and ways to circumvent the limitations of the Ising mapping using a "blackbox" approach based on ideas from probabilistic computing are discussed.
Abstract: In this article, we show how to map a sampling of the hardest artificial intelligence problems in space exploration onto equivalent Ising models that can then be attacked using quantum annealing implemented in the D-Wave machine. We overview existing results and propose new Ising model implementations for quantum annealing. We review supervised and unsupervised learning algorithms for classification and clustering, with applications to feature identification and anomaly detection. We introduce algorithms for data fusion and image matching for remote sensing applications. We overview planning problems for space exploration mission applications and algorithms for diagnostics and recovery with applications to deep space missions. We describe combinatorial optimization algorithms for task assignment in the context of autonomous unmanned exploration. Finally, we discuss ways to circumvent the limitations of the Ising mapping using a "blackbox" approach based on ideas from probabilistic computing. We also describe the architecture of the D-Wave One machine and report its benchmarks. Results on a random ensemble of problems with up to 96 qubits show improved scaling of the median core quantum annealing time compared with classical algorithms; whether this scaling persists for larger problem sizes is an open question. We then review previous D-Wave One benchmarking studies for solving binary classification problems with a quantum boosting algorithm, which is shown to outperform AdaBoost. We review quantum algorithms for structured learning for multi-label classification and introduce a hybrid classical/quantum approach for learning the weights. D-Wave One benchmarking studies for learning structured labels on four different data sets show better performance than an independent Support Vector Machine approach with a linear kernel.
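To make the Ising mapping concrete, here is a minimal, purely illustrative sketch of the change of variables between a QUBO objective over x ∈ {0,1}^n and the Ising form over spins s ∈ {-1,+1}^n that quantum annealers accept. The toy matrix Q and the brute-force check are assumptions for illustration, not one of the paper's problem instances.

```python
import itertools
import numpy as np

Q = np.array([[ 1.0, -2.0,  0.5],
              [-2.0,  3.0, -1.0],
              [ 0.5, -1.0, -2.0]])        # toy symmetric QUBO matrix (illustrative)

def qubo_energy(x):
    return x @ Q @ x                      # minimize over x in {0,1}^n

# Change of variables x = (1 + s) / 2 yields Ising fields h, couplings J, plus a constant.
h = 0.5 * Q.sum(axis=1)                   # local fields
J = 0.25 * Q.copy()
np.fill_diagonal(J, 0.0)                  # pairwise couplings
offset = 0.25 * (Q.sum() + np.trace(Q))

def ising_energy(s):
    return offset + h @ s + s @ J @ s     # minimize over s in {-1,+1}^n

# The two encodings agree on every configuration, so they share the same optimum.
for x in itertools.product([0.0, 1.0], repeat=3):
    x = np.array(x)
    assert np.isclose(qubo_energy(x), ising_energy(2 * x - 1))
print("QUBO and Ising energies match on all 8 configurations")
```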

65 citations


Proceedings Article
17 Nov 2012
TL;DR: QBoost is proposed as an iterative training algorithm in which a subset of weak classifiers is selected by solving a hard optimization problem in each iteration, and quantum Monte Carlo simulations provide evidence that adiabatic quantum optimization can handle the discrete optimization problems generated by QBoost.
Abstract: We introduce a novel discrete optimization method for training in the context of a boosting framework for large scale binary classifiers. The motivation is to cast the training problem into the format required by existing adiabatic quantum hardware. First we provide theoretical arguments concerning the transformation of an originally continuous optimization problem into one with discrete variables of low bit depth. Next we propose QBoost as an iterative training algorithm in which a subset of weak classifiers is selected by solving a hard optimization problem in each iteration. A strong classifier is incrementally constructed by concatenating the subsets of weak classifiers. We supplement the findings with experiments on one synthetic and two natural data sets and compare against the performance of existing boosting algorithms. Finally, by conducting a quantum Monte Carlo simulation we gather evidence that adiabatic quantum optimization is able to handle the discrete optimization problems generated by QBoost.
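As a rough illustration of the selection step described above, the following classical sketch builds the kind of quadratic objective over binary weak-classifier weights that a QBoost-style iteration minimizes. The synthetic data, the regularization weight lam, and the exhaustive solver (standing in for the quantum hardware) are all assumptions for illustration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, K, lam = 200, 8, 0.05                  # samples, weak classifiers, regularizer
y = rng.choice([-1.0, 1.0], size=N)       # binary labels
H = np.sign(rng.normal(size=(N, K)) + 0.4 * y[:, None])  # weak outputs in {-1,+1}

def qboost_objective(w):
    """Quadratic objective over binary inclusion weights w in {0,1}^K."""
    strong = H @ w / K                    # normalized vote of the selected subset
    return np.sum((strong - y) ** 2) + lam * np.sum(w)

# Exhaustive search over 2^K subsets stands in for the annealer at this toy size.
best = min(itertools.product([0, 1], repeat=K),
           key=lambda w: qboost_objective(np.array(w)))
print("selected weak classifiers:", best)
```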

59 citations


Patent
06 Jul 2012
TL;DR: A quantum processor is operated as a sample generator providing low-energy samples from a probability distribution with high probability, and the probability distribution is shaped to assign relative probabilities to samples based on their corresponding objective function values until the samples converge on a minimum of the objective function.
Abstract: Quantum processor based techniques minimize an objective function for example by operating the quantum processor as a sample generator providing low-energy samples from a probability distribution with high probability. The probability distribution is shaped to assign relative probabilities to samples based on their corresponding objective function values until the samples converge on a minimum for the objective function. Problems having a number of variables and/or a connectivity between variables that does not match that of the quantum processor may be solved. Interaction with the quantum processor may be via a digital computer. The digital computer stores a hierarchical stack of software modules to facilitate interacting with the quantum processor via various levels of programming environment, from a machine language level up to an end-use applications level.
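A classical stand-in can illustrate the sample-and-shape loop the abstract describes: candidate bit-strings are drawn with a bias toward low objective values and the distribution is progressively tightened around the best samples. The toy QUBO objective, the temperature schedule, and the Metropolis-style sampler below are illustrative assumptions, not the patented hardware procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 12
Q = rng.normal(size=(n, n))
Q = (Q + Q.T) / 2                          # toy symmetric QUBO matrix

def objective(x):
    return x @ Q @ x

x = rng.integers(0, 2, size=n)             # initial candidate bit-string
best_x, best_f = x.copy(), objective(x)
for T in np.geomspace(2.0, 0.05, 2000):    # gradually sharpen the distribution
    flip = rng.integers(n)
    x_new = x.copy()
    x_new[flip] ^= 1
    delta = objective(x_new) - objective(x)
    if delta < 0 or rng.random() < np.exp(-delta / T):   # favor low-energy samples
        x = x_new
        if objective(x) < best_f:
            best_x, best_f = x.copy(), objective(x)
print("best objective found:", best_f)
```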

39 citations


Journal ArticleDOI
TL;DR: This paper calculates median adiabatic times, determined by the minimum gap during adiabatic quantum optimization, for an NP-hard Ising spin glass instance class with up to 128 binary variables, and shows that if the adiabatic time scale were to determine the computation time, adiabatic quantum optimization would be significantly superior to the considered classical solvers for median spin glass problems of at least up to 128 qubits.
Abstract: Adiabatic quantum optimization offers a new method for solving hard optimization problems. In this paper we calculate median adiabatic times (in seconds) determined by the minimum gap during the adiabatic quantum optimization for an NP-hard Ising spin glass instance class with up to 128 binary variables. Using parameters obtained from a realistic superconducting adiabatic quantum processor, we extract the minimum gap and matrix elements using high performance Quantum Monte Carlo simulations on a large-scale Internet-based computing platform. We compare the median adiabatic times with the median running times of two classical solvers and find that, for the considered problem sizes, the adiabatic times for the simulated processor architecture are about 4 and 6 orders of magnitude shorter than the two classical solvers' times. This shows that if the adiabatic time scale were to determine the computation time, adiabatic quantum optimization would be significantly superior to those classical solvers for median spin glass problems of at least up to 128 qubits. We also discuss important additional constraints that affect the performance of a realistic system.
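For context, the adiabatic time scale discussed here is conventionally tied to the minimum spectral gap; a commonly quoted form of the adiabatic condition (notation illustrative, not necessarily the paper's exact estimator) is:

```latex
% Runtime T of the interpolating Hamiltonian H(s), s = t/T, should satisfy
T \;\gg\; \frac{\max_{s\in[0,1]}\bigl|\langle 1(s)|\,\mathrm{d}H/\mathrm{d}s\,|0(s)\rangle\bigr|}{g_{\min}^{2}},
\qquad
g_{\min} \;=\; \min_{s\in[0,1]}\bigl[E_{1}(s)-E_{0}(s)\bigr]
```

Here |0(s)⟩ and |1(s)⟩ are the instantaneous ground and first excited states and E_0(s), E_1(s) their energies; the inverse-square dependence on g_min is why the minimum gap and matrix elements extracted from the Quantum Monte Carlo simulations dominate the median adiabatic times reported above.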

29 citations


Posted Content
09 Jan 2012
TL;DR: The experimental implementation correctly determines the Ramsey numbers R(3, 3) and R(m, 2) for 4 ≤ m ≤ 8, and this computation is the largest experimental implementation of a scientifically meaningful quantum algorithm done to date.
Abstract: Ramsey theory is a highly active research area in mathematics that studies the emergence of order in large disordered structures. It has found applications in mathematics, theoretical computer science, information theory, and classical error correcting codes. Ramsey numbers mark the threshold at which order first appears and are notoriously difficult to calculate due to their explosive rate of growth. Recently, a quantum algorithm has been proposed that calculates the two-color Ramsey numbers R(m,n). Here we present results of an experimental implementation of this algorithm based on quantum annealing and show that it correctly determines the Ramsey numbers R(3, 3) and R(m, 2) for 4 ≤ m ≤ 8. The R(8, 2) computation used 84 qubits of which 28 were computational qubits. This computation is the largest experimental implementation of a scientifically meaningful quantum algorithm that has been done to date.
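The smallest result quoted above can be checked directly on a classical machine. The sketch below is a brute-force enumeration, unrelated to the paper's quantum-annealing implementation, verifying that R(3, 3) = 6: some 2-coloring of the edges of K_5 avoids monochromatic triangles, while no 2-coloring of K_6 does.

```python
from itertools import combinations, product

def has_mono_triangle(n, coloring):
    """coloring maps each edge (i, j) with i < j of K_n to a color 0 or 1."""
    for a, b, c in combinations(range(n), 3):
        if coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]:
            return True
    return False

def every_coloring_has_mono_triangle(n):
    edges = list(combinations(range(n), 2))
    for bits in product((0, 1), repeat=len(edges)):
        if not has_mono_triangle(n, dict(zip(edges, bits))):
            return False          # found a triangle-free 2-coloring
    return True

print(every_coloring_has_mono_triangle(5))  # False -> R(3, 3) > 5
print(every_coloring_has_mono_triangle(6))  # True  -> R(3, 3) <= 6
```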

27 citations


Posted Content
14 Nov 2012
TL;DR: In this paper, a case study for encoding related combinatorial optimization problems in a form suitable for adiabatic quantum optimization is presented, where the authors demonstrate how to constrain and embed lattice heteropolymer problems using several strategies, each striking a unique balance between number of constraints, complexity of constraints and number of variables.
Abstract: Optimization problems associated with the interaction of linked particles are at the heart of polymer science, protein folding and other important problems in the physical sciences. In this review we explain how to recast these problems as constraint satisfaction problems such as linear programming, maximum satisfiability, and pseudo-boolean optimization. By encoding problems this way, one can leverage substantial insight and powerful solvers from the computer science community which studies constraint programming for diverse applications such as logistics, scheduling, artificial intelligence, and circuit design. We demonstrate how to constrain and embed lattice heteropolymer problems using several strategies. Each strikes a unique balance between number of constraints, complexity of constraints, and number of variables. Finally, we show how to reduce the locality of couplings in these energy functions so they can be realized as Hamiltonians on existing quantum annealing machines. We intend that this review be used as a case study for encoding related combinatorial optimization problems in a form suitable for adiabatic quantum optimization.
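As one illustrative flavor of the encodings discussed in this review, the sketch below writes an "each bead occupies exactly one lattice site" constraint as a quadratic penalty that can be added to a pseudo-boolean / QUBO objective. The tiny 2-bead, 3-site lattice, the penalty weight P, and the omission of the self-avoidance constraint are assumptions for illustration, not the paper's encodings.

```python
import itertools
import numpy as np

n_beads, n_sites, P = 2, 3, 10.0       # penalty weight P chosen to dominate the objective

def one_hot_penalty(x):
    """x[i][j] = 1 if bead i occupies site j; penalize anything but exactly one site per bead."""
    x = np.asarray(x).reshape(n_beads, n_sites)
    return P * np.sum((x.sum(axis=1) - 1) ** 2)

# Every assignment with zero penalty places each bead on exactly one site.
feasible = [bits for bits in itertools.product([0, 1], repeat=n_beads * n_sites)
            if one_hot_penalty(bits) == 0]
print(len(feasible))                   # 3 sites per bead -> 3 * 3 = 9 feasible placements
```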

23 citations


Book ChapterDOI
TL;DR: This review explains how to recast combinatorial optimization problems as constraint satisfaction problems such as linear programming, maximum satisfiability, and pseudo-boolean optimization, and shows how to constrain and embed lattice heteropolymer problems using several strategies.
Abstract: Optimization problems associated with the interaction of linked particles are at the heart of polymer science, protein folding and other important problems in the physical sciences. In this review we explain how to recast these problems as constraint satisfaction problems such as linear programming, maximum satisfiability, and pseudo-boolean optimization. By encoding problems this way, one can leverage substantial insight and powerful solvers from the computer science community which studies constraint programming for diverse applications such as logistics, scheduling, artificial intelligence, and circuit design. We demonstrate how to constrain and embed lattice heteropolymer problems using several strategies. Each strikes a unique balance between number of constraints, complexity of constraints, and number of variables. Finally, we show how to reduce the locality of couplings in these energy functions so they can be realized as Hamiltonians on existing quantum annealing machines. We intend that this review be used as a case study for encoding related combinatorial optimization problems in a form suitable for adiabatic quantum optimization.
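The locality reduction mentioned at the end of the abstract can also be illustrated in a few lines: a cubic pseudo-boolean term is rewritten with an ancilla variable and a Rosenberg-style quadratic penalty so that only pairwise couplings remain. This is one standard construction, not necessarily the paper's exact one; the coefficient c and penalty weight P are illustrative.

```python
from itertools import product

c, P = 2.0, 10.0                                    # term coefficient, penalty weight

def cubic(x1, x2, x3):
    return c * x1 * x2 * x3                         # 3-local term to be reduced

def quadratic_with_ancilla(x1, x2, x3, y):
    # Ancilla y is meant to equal x1 * x2; the Rosenberg penalty vanishes iff it does.
    penalty = x1 * x2 - 2 * x1 * y - 2 * x2 * y + 3 * y
    return c * y * x3 + P * penalty                 # only 2-local terms remain

# Minimizing over the ancilla reproduces the cubic term for every binary input.
assert all(cubic(x1, x2, x3) == min(quadratic_with_ancilla(x1, x2, x3, y) for y in (0, 1))
           for x1, x2, x3 in product((0, 1), repeat=3))
print("2-local reduction matches the cubic term on all inputs")
```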

22 citations