2016 IEEE High Performance
Extreme Computing Conference
(HPEC ‘16)
Twentieth Annual HPEC Conference
13 - 15 September 2016
Westin Hotel, Waltham, MA USA
Quantum Tools & Information Theory 2
3:00-4:40 in Eden Vale C3
Chair: Steve Reinhardt / D-Wave
Thursday, September 15
Parameter Setting for Quantum Annealers
Kristen L. Pudenz, Lockheed Martin Aeronautics
We develop and apply several strategies for setting physical
parameters on quantum annealers for application problems that do
not fit natively on the hardware graph. The strategies are tested with
a culled random set of mixed satisfiability problems, yielding results
that generalize to guidelines regarding which parameter setting
strategies to use for different classes of problems, and how to choose
other necessary hardware quantities as well. Alternate methods of
changing the hardware implementation of an application problem are
also considered and their utility discussed.
Abstractions Considered Helpful: A Tools Architecture for
Quantum Annealers
Michael Booth, Edward Dahl, Mark Furtney, Steven P. Reinhardt, D-
Wave Systems, Inc.
Today’s usable quantum computers, variously known as adiabatic
quantum computers or quantum annealers and exemplified by the D-
Wave 2X™ system, have an instruction set architecture foreign to
mainstream classical computers and thus require a new class of
programming tools to enable their widespread use. We submit that
well-chosen abstractions, each balancing the ability of high- and low-
level tools to use it, will play an essential role in fostering a vibrant
ecosystem of such new tools. We propose the virtual quadratic
unconstrained binary optimization (vQUBO) problem as one such
abstraction and describe our experience in implementing and using it.
As one step toward an effective quantum computing ecosystem, we
invite other tool developers to create complementary tools that map
from user problems to the vQUBO form for end-to-end usability and
performance.
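The QUBO abstraction the authors propose can be made concrete with a toy instance; a minimal Python sketch (brute-force enumeration over a coefficient matrix of my own choosing, not D-Wave's API):

```python
import itertools

# A QUBO instance: minimize sum of Q[i,j] * x[i] * x[j] over binary x.
# Toy upper-triangular coefficients chosen for illustration only.
Q = {(0, 0): -1.0, (1, 1): -1.0, (0, 1): 2.0}

def qubo_energy(x, Q):
    """Energy of binary assignment x under QUBO coefficients Q."""
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

def brute_force_qubo(Q, n):
    """Enumerate all 2^n binary vectors; return a minimizer and its energy."""
    best = min(itertools.product([0, 1], repeat=n),
               key=lambda x: qubo_energy(x, Q))
    return best, qubo_energy(best, Q)

best, energy = brute_force_qubo(Q, 2)
print(best, energy)  # the coupling term penalizes setting both bits
```

A quantum annealer searches this same energy landscape in hardware; the exponential cost of the brute-force loop is what motivates mapping problems to the (v)QUBO form.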
An Approach to Big Data Inspired by Statistical Mechanics
John A. Cortese, MIT Lincoln Laboratory
A family of techniques in physics known as statistical mechanics is
useful for describing the macroscopic properties of materials
composed of a large (Avogadro's number, ~10^24) number of atoms. This talk
applies the same approach to the analysis of big data problems. The
initial problem examined is that of classification, specifically binary
hypothesis testing, in the big data, high dimensional scenario. The
lessons learned from the binary hypothesis testing problem are
extended to other signal processing paradigms.
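The classical binary hypothesis test the talk builds on can be sketched with a likelihood-ratio threshold test; a minimal Python example assuming equal-variance Gaussian classes (means and thresholds of my own choosing, not from the talk):

```python
# Binary hypothesis test: H0 ~ N(mu0, sigma^2) vs H1 ~ N(mu1, sigma^2).
# Illustrative parameters only.
mu0, mu1, sigma = 0.0, 2.0, 1.0

def log_likelihood_ratio(x):
    """log p(x|H1) - log p(x|H0) for equal-variance Gaussians."""
    return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)

def decide(x, threshold=0.0):
    """Return 1 (accept H1) if the LLR exceeds the threshold, else 0."""
    return 1 if log_likelihood_ratio(x) > threshold else 0

# Samples near mu1 are classified as H1, samples near mu0 as H0.
print(decide(1.9), decide(0.1))  # prints "1 0"
```

In the high-dimensional, big-data regime the talk addresses, the statistical-mechanics view studies how quantities like this log-likelihood ratio concentrate as the number of samples and dimensions grows large.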
Associative Array Model of SQL, NoSQL, and NewSQL
Databases
Jeremy Kepner^1,2,3, Vijay Gadepally^1,2, Dylan Hutchison^4, Hayden
Jananthan^3,5, Timothy Mattson^6, Siddharth Samsi^1, Albert Reuther^1
^1 MIT Lincoln Laboratory, ^2 MIT Computer Science & AI Laboratory,
^3 MIT Mathematics Department, ^4 University of Washington Computer
Science Department, ^5 Vanderbilt University Mathematics Department,
^6 Intel Corporation
The success of SQL, NoSQL, and NewSQL databases is a reflection
of their ability to provide significant functionality and performance
benefits for specific domains, such as financial transactions, internet
search, and data analysis. The BigDAWG polystore seeks to provide
a mechanism to allow applications to transparently achieve the
benefits of diverse databases while insulating applications from the
details of these databases. Associative arrays provide a common
approach to the mathematics found in different databases: sets
(SQL), graphs (NoSQL), and matrices (NewSQL). This work presents
the SQL relational model in terms of associative arrays and identifies
the key mathematical properties that are preserved within SQL.
These properties include associativity, commutativity, distributivity,
identities, annihilators, and inverses. Performance measurements on
distributivity and associativity show the impact these properties can
have on associative array operations. These results demonstrate that
associative arrays could provide a mathematical model for polystores
to optimize the exchange of data and the execution of queries.
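The algebraic properties the abstract names (associativity, commutativity, distributivity, identities, annihilators) can be checked on a toy associative-array encoding; a minimal Python sketch using dicts (my own encoding, not the authors' implementation):

```python
# Associative arrays as sparse key->value maps over (row, column) keys.
# Element-wise operations illustrate the algebraic properties above.

def add(A, B):
    """Element-wise sum; a missing key acts as the additive identity 0."""
    return {k: A.get(k, 0) + B.get(k, 0) for k in A.keys() | B.keys()}

def mul(A, B):
    """Element-wise product; a missing key annihilates the entry to 0."""
    return {k: A[k] * B[k] for k in A.keys() & B.keys()}

A = {('alice', 'x'): 2, ('bob', 'y'): 3}
B = {('alice', 'x'): 5, ('carol', 'z'): 7}
C = {('alice', 'x'): 1, ('bob', 'y'): 4}

assert add(A, B) == add(B, A)                       # commutativity
assert add(add(A, B), C) == add(A, add(B, C))       # associativity
assert mul(A, add(B, C)) == add(mul(A, B), mul(A, C))  # distributivity
```

Because these identities hold regardless of which backing database stores A, B, and C, a polystore can reorder or regroup such operations when planning cross-database queries.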