The Fields Institute for Research in Mathematical Sciences -- 20th Anniversary Year
Workshop on Mathematical Methods of Quantum Tomography
February 19-22, 2013
Fields Institute, 222 College St., Toronto
Robin Blume-Kohout
(Los Alamos National Lab) - Thursday 9:35am
Adaptive Gate-Set Tomography
Quantum information hardware needs to be characterized and calibrated.
This is the job of quantum state and process tomography, but standard
tomographic methods have an Achilles heel: to characterize an
unknown process, they rely on a set of absolutely calibrated measurements.
Many technologies (e.g., solid-state qubits) admit only a single
native measurement basis, and other bases are measured using unitary
control. So tomography becomes circular -- tomographic protocols
are using gates to calibrate themselves! Gate-set tomography confronts
this problem head-on and resolves it by treating gates relationally.
We abandon all assumptions about what a given gate operation does,
and characterize entire universal gate sets from the ground up
using only the observed statistics of an [unknown] 2-outcome measurement
after various strings of [unknown] gate operations. The accuracy
and reliability of the resulting estimates depend critically on
which gate strings are used, and benefit greatly from adaptivity.
We demonstrate gate-set tomography and quantify the accuracy with
which the individual gates can be estimated.
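To make the relational likelihood concrete, here is a minimal numerical
sketch (not the speaker's code; the single-qubit superoperator
representation and all names are illustrative assumptions): the predicted
probability of a gate string involves only the unknown state, the unknown
gates, and the unknown 2-outcome effect, and the estimate is whatever
jointly maximizes the likelihood of the observed counts.

    # Minimal sketch (assumed representation, not the speaker's code):
    # predicted outcome probabilities for gate strings, with the input
    # state rho, the 2-outcome effect E and every gate G_k all unknown.
    # Vectors and gate matrices live in a superoperator (e.g. Pauli
    # transfer) representation of a single qubit.
    import numpy as np

    def string_probability(rho_vec, effect_vec, gates, string):
        """Probability of the '+' outcome after applying the gate string."""
        v = rho_vec.copy()
        for k in string:                       # apply the gates in order
            v = gates[k] @ v
        return float(effect_vec @ v)

    def neg_log_likelihood(rho_vec, effect_vec, gates, data):
        """data: list of (string, n_plus, n_total) observed counts."""
        nll = 0.0
        for string, n_plus, n_total in data:
            p = string_probability(rho_vec, effect_vec, gates, string)
            p = min(max(p, 1e-12), 1 - 1e-12)  # clip for numerical safety
            nll -= n_plus * np.log(p) + (n_total - n_plus) * np.log(1 - p)
        return nll

The dependence of the resulting accuracy on which strings appear in the
data is what the adaptive choice of gate strings exploits.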
Agata Branczyk (University of Toronto)
- Tuesday 2:00pm
Holistic Tomography
Quantum state tomography is the characterization of a quantum
state by repeated state preparation and measurement. It relies
on the ability to prepare well-characterized unitary operations
to change the measurement basis. Conversely, quantum process tomography
is the characterization of a quantum process, relying on the preparation
of well-characterized quantum states. While parallels between
state and process tomography are well-known, the two have largely
been treated independently. In the last year, however, we saw
the development of a number of more holistic approaches, where
unknown parameters in both the state and process are treated on
an equal footing [1-5]. I will discuss advances made in these
approaches, focusing on our results for unitary processes, and
look to the future with a discussion of open problems and possible
future directions.
[1] A. M. Branczyk, D. H. Mahler, L. A. Rozema, A. Darabi, A. M. Steinberg,
and D. F. V. James, "Self-calibrating quantum state tomography",
New Journal of Physics 14, 085003 (2012).
[2] D. Mogilevtsev, J. Rehácek, and Z. Hradil, "Self-calibration
for self-consistent tomography", New Journal of Physics 14, 095001 (2012).
[3] S. T. Merkel, J. M. Gambetta, J. A. Smolin, S. Poletto, A. D. Córcoles,
B. R. Johnson, C. A. Ryan, and M. Steffen, "Self-Consistent
Quantum Process Tomography", arXiv:1211.0322 [quant-ph] (2012).
[4] N. Quesada, A. M. Branczyk, and D. F. V. James, "Self-calibrating
tomography of multi-dimensional systems", arXiv:1212.0556 [quant-ph] (2012).
[5] S. Straupe, D. Ivanov, A. Kalinkin, I. Bobrov, S. P. Kulik, and
D. Mogilevtsev, "Self-calibrating Tomography for Angular Schmidt Modes
in Spontaneous Parametric Down-Conversion", arXiv:1112.3806 [quant-ph] (2013).
Maria Chekhova (Max-Planck Institute
for the Science of Light) - Wednesday 11:00am
Polarization quantum tomography of macroscopic Bell states
We study the polarization properties of macroscopic Bell states,
which are multi-photon analogues of two-photon polarization-entangled
Bell states. The states are produced using two orthogonally polarized
non-degenerate optical parametric amplifiers coherently pumped
by strong picosecond pulses. For one of the macroscopic Bell states,
the singlet one, all three Stokes observables S1,2,3 have noise
suppressed below the shot-noise level. For each of the other three
states, the triplet ones, only one Stokes observable has noise
suppressed. These and other polarization properties are revealed
by reconstructing the quasiprobability W(S1,S2,S3), which is the
polarization analog of the Wigner function. The reconstruction
procedure is similar to the classical 3D inverse Radon transformation.
It is performed on the histograms obtained by measuring the probability
distributions of various Stokes observables. For comparison, we
also reconstruct the polarization quasiprobability of a pseudo-coherent
state and study the robustness of the reconstruction against the
state displacement in the Stokes space.
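Schematically (standard notation, added here for orientation rather than
quoted from the talk), the measured histogram of the Stokes observable
along a unit direction n is the marginal of the polarization
quasiprobability along that direction,

    p_{\vec{n}}(S) = \int W(S_1,S_2,S_3)\, \delta(S - \vec{S}\cdot\vec{n})\, dS_1\, dS_2\, dS_3 ,

and collecting such marginals for sufficiently many directions allows W
to be recovered by the 3D inverse Radon transformation, just as in
classical tomography.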
Matthias Christandl (ETH Zurich)
- Tuesday 9:35am
Reliable quantum state tomography
Quantum state tomography is the task of inferring the state of
a quantum system by appropriate measurements. Since the frequency
distributions of the outcomes of any finite number of measurements
will generally deviate from their asymptotic limits, the estimates
computed by standard methods do not in general coincide with the
true state, and therefore have no operational significance unless
their accuracy is defined in terms of error bounds. Here we show
that quantum state tomography, together with an appropriate data
analysis procedure, yields reliable and tight error bounds, specified
in terms of confidence regions - a concept originating from classical
statistics. Confidence regions are subsets of the state space
in which the true state lies with high probability, independently
of any prior assumption on the distribution of the possible states.
Our method for computing confidence regions can be applied to
arbitrary measurements including fully coherent ones; it is practical
and particularly well suited for tomography on systems consisting
of a small number of qubits, which are currently in the focus
of interest in experimental quantum information science.
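Stated formally (a standard definition, given here for orientation), a
confidence region at level 1 - epsilon is a rule that maps the
measurement record B^n to a subset C(B^n) of state space such that

    \Pr_{B^n}\bigl[\rho \in C(B^n)\bigr] \ge 1 - \varepsilon \quad \text{for every state } \rho ,

so the guarantee holds whatever the true state is, with no prior
distribution over states assumed.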
Rafal Demkowicz-Dobrzanski (University
of Warsaw) - Thursday 11:00am
All you need is squeezing! Optimal schemes for realistic quantum
metrology.
The presence of decoherence makes the quantum precision enhancement
offered by quantum metrology less spectacular than in idealized
scenarios. Nevertheless, the quantum gain may still be important
from a practical point of view in, e.g., atomic clocks or
gravitational-wave detectors. Interestingly, while decoherence should
certainly be regarded as a nuisance, it has a positive aspect, namely
that when decoherence is taken into account, simple quantum metrological
protocols based on squeezed states efficiently reach the fundamental
theoretical limits on precision. Therefore, no more sophisticated
quantum states or measurements are necessary for practical quantum
metrology.
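For orientation, the scalings at stake (a representative example rather
than a quotation from the abstract): the standard quantum limit and the
Heisenberg limit for estimating a phase \varphi with N particles are

    \Delta\varphi_{\rm SQL} \sim 1/\sqrt{N} , \qquad \Delta\varphi_{\rm HL} \sim 1/N ,

while with uncorrelated decoherence, e.g. photon loss with transmission
\eta in an interferometer, the asymptotically attainable precision is
only a constant factor beyond shot noise,

    \Delta\varphi \gtrsim \sqrt{(1-\eta)/(\eta N)} ,

and it is bounds of this form that squeezed-state protocols essentially
saturate.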
Jens Eisert (Freie Universität
Berlin) - Wednesday 9:00am
Progress on quantum compressed sensing
Intuitively, if a density operator has small rank, then it should
be easier to estimate from experimental data, since in this case
only a few eigenvectors need to be learned. Quantum compressed
sensing provides a rigorous formalism confirming this intuition.
This talk will report recent progress in tomography based on compressed
sensing. First, it is shown that a low-rank density matrix
can be estimated using fewer copies of the state, i.e., the sample
complexity of tomography decreases with the rank. Second, we show
that unknown low-rank states can be reconstructed from an incomplete
set of measurements, using techniques from compressed sensing
and matrix completion. We will further elaborate on continuous-variable
instances of this idea, and sketch how similar ideas can be used
for feasible tomography in quantum optical systems, e.g., in systems
of ultracold atoms in optical lattices.
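As a minimal illustration of the reconstruction step (a sketch assuming
the cvxpy convex-optimization package, not the speakers' code), a
two-qubit rank-one state can be recovered from an incomplete set of
Pauli expectation values by minimizing the trace over positive matrices,
which for positive semidefinite matrices is the same as nuclear-norm
minimization:

    # Sketch: compressed-sensing-style reconstruction of a low-rank
    # density matrix from an incomplete set of Pauli expectation values.
    import itertools
    import numpy as np
    import cvxpy as cp

    paulis = [np.eye(2), np.array([[0, 1], [1, 0]]),
              np.array([[0, -1j], [1j, 0]]), np.diag([1.0, -1.0])]
    P2 = [np.kron(a, b) for a, b in itertools.product(paulis, repeat=2)]

    psi = np.zeros(4); psi[0] = psi[3] = 1 / np.sqrt(2)   # (|00> + |11>)/sqrt(2)
    rho_true = np.outer(psi, psi.conj())                  # rank-1 target

    sampled = P2[1::2]                                    # incomplete subset (no identity)
    y = [np.real(np.trace(P @ rho_true)) for P in sampled]

    X = cp.Variable((4, 4), hermitian=True)
    constraints = [X >> 0]
    constraints += [cp.real(cp.trace(P @ X)) == v for P, v in zip(sampled, y)]
    cp.Problem(cp.Minimize(cp.real(cp.trace(X))), constraints).solve()
    print(np.round(X.value, 3))    # here the constraints pin down the Bell state

In this noiseless toy example the sampled constraints already single out
the Bell state; with noisy data one would replace the equality
constraints by a tolerance or a least-squares penalty.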
Berge Englert (Centre for Quantum
Technologies, Singapore) - Tuesday 9:00am
Maximum-likelihood regions and smallest credible regions
Rather than point estimators -- single states of a quantum system that
represent one's best guess for the given data -- we consider optimal
regions of estimators. As the natural counterpart of the popular
maximum-likelihood point estimator, we introduce the maximum-likelihood
region -- the region of largest likelihood among all regions of
the same size.
Here, the size of a region is its prior probability. Another concept
is the smallest credible region -- the smallest region with pre-chosen
posterior probability. For both optimization problems, the optimal
region has constant likelihood on its boundary. We discuss criteria
for assigning prior probabilities to regions, and illustrate the
concepts and methods with several examples.
Collaborators: Jiangwei Shang, Hui Khoon Ng, Arun Sehrawat, Xikun Li
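In symbols (a schematic restatement, using standard notation), for data
D with point likelihood L(D|\rho) and prior element (d\rho), the size
and the posterior content of a region R are

    s_R = \int_R (d\rho) , \qquad c_R = \frac{\int_R L(D|\rho)\,(d\rho)}{\int L(D|\rho)\,(d\rho)} ,

and for both problems, largest likelihood at fixed size and smallest
size at fixed credibility, the optimizer is a bounded-likelihood region
{ \rho : L(D|\rho) \ge \lambda }, which is why the optimal region has
constant likelihood on its boundary.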
Alessandro Ferraro (University College
London) - Friday 2:00pm
Reconstructing the quantum state of oscillator networks with a single
qubit
A minimal scheme to reconstruct arbitrary states of networks
composed of quantum oscillators is introduced. The scheme uses
minimal resources in the sense that it i) requires only the interaction
between a single-qubit probe and one constituent of the network; ii)
provides the reconstructed state directly from the data, avoiding
any tomographic transformation; iii) involves the tuning of only
one coupling parameter. In addition, a number of quantum properties
can be extracted without full reconstruction of the state. The
scheme can be used for probing quantum simulations of anharmonic
many-body systems and quantum computations with continuous variables.
Experimental implementation with trapped ions is also discussed
and shown to be within reach of current technology.
Ref.: T. Tufarelli, A. Ferraro, M. S. Kim, S. Bose, Phys. Rev.
A 85, 032334 (2012)
Steve Flammia (University of Sydney)
- Wednesday 9:35am
The Sample Complexity of Tomography and New Estimators for Compressed
Sensing
We will present a new theoretical analysis of quantum tomography,
focusing on efficient use of measurements, statistical sampling,
and classical computation. Our results are severalfold. Using
Liu's restricted isometry property (RIP) for low-rank matrices
via Pauli measurements, we obtain near-optimal error bounds for
the realistic situation where the data contain noise due to finite
statistics, and the density matrix is full-rank with decaying
eigenvalues. We also obtain upper bounds on the sample complexity
of compressed tomography, and almost-matching lower bounds on
the sample complexity of any procedure using adaptive sequences
of Pauli measurements.
We will additionally show numerical simulations that compare the
performance of two compressed sensing estimators with standard
maximum-likelihood estimation (MLE). We find that the compressed
sensing estimators consistently produce higher-fidelity state
reconstructions than MLE given comparable experimental resources.
In addition, the use of an incomplete set of measurements leads
to faster classical processing with no loss of accuracy.
Finally, we show how to certify the accuracy of a low-rank estimate
using direct fidelity estimation, and we describe a method for
compressed quantum process tomography that works for processes
with small Kraus rank, and requires only Pauli eigenstate preparations
and Pauli measurements.
Joint work with David Gross, Yi-Kai Liu, and Jens Eisert, New
J. Phys. 14, 095022 (2012).
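As a side note on the certification step, here is a minimal sketch of
direct fidelity estimation for a pure two-qubit target (an assumed toy
example, not the authors' code): Pauli expectation values are
importance-sampled according to the target's characteristic function,
and the sampled ratios average to the fidelity.

    # Sketch of direct fidelity estimation against a pure target state.
    import itertools
    import numpy as np

    paulis = [np.eye(2), np.array([[0, 1], [1, 0]]),
              np.array([[0, -1j], [1j, 0]]), np.diag([1.0, -1.0])]
    P2 = [np.kron(a, b) for a, b in itertools.product(paulis, repeat=2)]
    d = 4

    psi = np.zeros(4); psi[0] = psi[3] = 1 / np.sqrt(2)
    target = np.outer(psi, psi.conj())
    rho = 0.9 * target + 0.1 * np.eye(4) / 4        # "experimental" state

    chi_t = np.array([np.real(np.trace(P @ target)) for P in P2]) / np.sqrt(d)
    chi_r = np.array([np.real(np.trace(P @ rho)) for P in P2]) / np.sqrt(d)
    weights = chi_t ** 2                            # sums to 1 for a pure target

    rng = np.random.default_rng(0)
    idx = rng.choice(len(P2), size=200, p=weights)  # importance sampling
    estimate = np.mean(chi_r[idx] / chi_t[idx])     # estimates tr(rho @ target)
    print(estimate, np.real(np.trace(rho @ target)))

In an actual experiment each sampled expectation value chi_r would
itself be estimated from a finite number of Pauli measurements, which is
where the sample-complexity analysis enters.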
Marco Genovese (INRIM) - Thursday
2:00pm
Quantum Tomography of the POVM of PNR detectors: different methods.
The rapid development of quantum systems has enabled a wide range
of novel and innovative technologies, from quantum information
processing to quantum metrology and imaging, mainly based on optical
systems.
Precise characterization of quantum resources, i.e., states, operations,
and detectors, plays a critical role in the development of such
technologies. In this talk we will present recent experiments
addressed to reconstruct the POVM of photon number resolving (PNR)
detectors and their application. In the first case [1] we will
discuss the reconstruction of a TES POVM based on the use of a
quorum of coherent states. In the second case [2] we present the
first experimental characterization of quantum properties of an
unknown PNR detector, a detector tree, that takes advantage of
a quantum resource, i.e., an ancillary state. This quantum-assisted
reconstruction method requires no prior information about the
detector under test, and it converges to a final result more stably
and quickly. This is achieved by exploiting the strong quantum
correlations of twin beams generated in a parametric down-conversion
process: one beam is characterized by a quantum tomographer, while
the other is used to calibrate the unknown detector. Finally, we will
discuss the experimental realisation of a self-consistent quantum
tomography of a PNR detector POVM and its application [3].
[1] G. Brida, L. Ciavarella, I. Degiovanni, M. Genovese, L. Lolli,
M. Mingolla, F. Piacentini, M. Rajteri, E. Taralli, and M. Paris,
New J. Phys. 14, 085001 (2012).
[2] G. Brida, L. Ciavarella, I. P. Degiovanni, M. Genovese, A. Migdall,
M. G. Mingolla, M. G. A. Paris, F. Piacentini, and S. V. Polyakov,
Phys. Rev. Lett. 108, 253601 (2012).
[3] G. Brida et al., work to be submitted.
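To illustrate the coherent-state-quorum idea in the simplest setting (a
toy click/no-click model with an assumed efficiency, not the TES
analysis of ref. [1]), the outcome probabilities on coherent probes are
linear in the diagonal POVM elements, so the POVM can be recovered by
constrained least squares:

    # Sketch: reconstructing the diagonal (phase-insensitive) POVM of a
    # photon-number-resolving detector from coherent-state probes, using
    # p(n|alpha) = sum_m pi[n, m] * Poisson(m; |alpha|^2).
    import numpy as np
    from scipy.optimize import nnls
    from scipy.stats import poisson

    M = 20                                        # Fock-space truncation
    alphas = np.linspace(0.1, 3.0, 30)            # probe amplitudes
    F = np.array([poisson.pmf(np.arange(M), a ** 2) for a in alphas])

    eta = 0.6                                     # toy detector efficiency
    pi_true = np.array([(1 - eta) ** np.arange(M),         # "no click"
                        1 - (1 - eta) ** np.arange(M)])     # "click"
    data = F @ pi_true.T                          # simulated p(n|alpha)

    pi_est = np.array([nnls(F, data[:, n])[0] for n in range(2)])
    print(np.round(pi_est[:, :6], 3))             # compare with pi_true[:, :6]

Real reconstructions regularize this inversion, since the Poissonian
kernel makes it increasingly ill-conditioned at high photon numbers.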
Lane P. Hughston (University College
London) - Wednesday 2:35pm
Universal Tomographic Measurements
The purpose of this paper is to introduce certain rather general
classes of operations in quantum mechanics that one can regard
as "universal quantum measurements" (UQMs) in the sense that
they are applicable to all quantum systems and involve the specification
of only a minimal amount of structure on the system. The first
class of UQM that we consider involves the Hilbert space of the
system together with the specification of the state of the system --
no further structure is brought into play. We call operations
of this type "universal tomographic measurements", since
given the statistics of the outcomes of such measurements it is
possible to reconstruct the state of the system. The second class
of UQM that we consider is also universal in the sense that the
only additional structure involved is the specification of the
Hamiltonian. Apart from the usual projective measurements of the
energy, one is also led to a number of other less familiar measurement
operations, which we discuss. Among these, for example, is a direct
measurement of the "expected" energy of the system.
Finally, we consider the large class of UQMs that can be constructed
when the universal tomographic measurements on one or more quantum
systems are lifted, through appropriate embeddings, to induce
certain associated operations on the embedding space.
As an example, one can make a measurement of the direction in
space along which the spin of a spin-s particle is oriented
(s = 1/2, 1, 3/2, ...). In this case the additional structure involves
the embedding of CP^1 as a rational curve of degree 2s in CP^{2s}.
As another example, we show how one can construct a universal
disentangling measurement, the outcome of which, when applied
to a mixed state of an entangled composite system, is a disentangled
product of pure constituent states.
Joint work with Dorje C. Brody
Andrei Klimov (Universidad de Guadalajara)
- Friday 11:00am
Distribution of quantum fluctuations in the macroscopic limit
of N qubit systems
We analyze the structure of quasidistribution functions of N-qubit
systems projected onto the space of symmetric measurements in the
asymptotic limit of a large number of qubits. We discuss the
possibility of reconstructing the Q-function in this 3-dimensional
space of symmetric measurements from higher-order moments. Analytical
expressions for the projected Q-function are found for different
classes of both factorized and entangled multi-qubit states. Macroscopic
features of discrete quasidistribution functions are also discussed.
Alex Lvovsky (University of Calgary)
- Friday 9:00am
Three ways to skin a cat: numerical approaches in coherent-state quantum
process tomography (Lecture Notes)
Coherent-state quantum process tomography (csQPT) is a method
for completely characterizing a quantum-optical 'black box' by
probing it with coherent states and carrying out homodyne measurements
on the output. While conceptually simple, practical implementation
of csQPT faces a few challenges associated with the highly singular
nature of the Glauber-Sudarshan function as well as the infinite
nature of the set of coherent states required for probing. To
date, three different approaches have been developed to address
these challenges. The presentation will describe these approaches
and their relative advantages and disadvantages.
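The conceptual starting point of csQPT can be written in one line
(standard notation, added for orientation): expanding the input in the
Glauber-Sudarshan representation,
\rho = \int P_\rho(\alpha)\,|\alpha\rangle\langle\alpha|\,d^2\alpha,
linearity of the process \mathcal{E} gives

    \mathcal{E}(\rho) = \int P_\rho(\alpha)\, \mathcal{E}\bigl(|\alpha\rangle\langle\alpha|\bigr)\, d^2\alpha ,

so the outputs for coherent-state inputs determine the process; the
numerical approaches compared in the talk differ in how they handle the
singular P function and the fact that only finitely many probe
amplitudes are available.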
Paolo Mataloni (Università
degli Studi di Roma "La Sapienza") - Thursday 2:35pm
Tomographic reconstruction of integrated waveguide optical devices
Integrated photonics is a fundamental instrument for the realization
of linear optical quantum circuits. Integrated waveguide optical
devices of increasing complexity, possibly reconfigurable, play
a fundamental role for the realization of a number of experiments
implementing quantum logic gates, enabling quantum simulations,
and coding photon states for quantum communication.
An integrated photonic device is usually based on an interferometric
network, given by an array of suitably chosen directional couplers,
and is described by a unitary matrix that allows one to map the
distribution of amplitudes and phases of the quantum optical field
traveling therein. Efficiently reconstructing the transfer matrix of a
complex optical network is essential in many cases, for instance when
one wants to perform different unitary transformations in a controlled
way. This task becomes more and more difficult as the complexity
grows, and may be highly non-trivial when the unavoidable optical
losses of the device need to be considered.
I will present the results obtained in our laboratory in characterizing
the operation of novel integrated optical devices, built using the
three-dimensional capability of the femtosecond laser writing technique,
and in realizing their tomographic reconstruction.
Dimitry Mogilevtsev (Universidade
Federal do ABC) - Tuesday 2:35pm
Self-calibrating tomography
The possibility of simultaneous tomography of a signal state and of the
measurement set-up will be discussed and illustrated with a number of
practical examples.
Tobias Moroder (University of Siegen)
- Wednesday 2:00pm
Detection of systematic errors in quantum tomography experiments
When systematic errors are ignored in an experiment, the subsequent
analysis of its results becomes questionable. We develop tests
to identify systematic errors in experiments where only a finite
amount of data is recorded and apply these tests to tomographic
data taken in an ion-trap experiment. We put particular emphasis
on quantum state tomography and present two detection methods:
the first employs linear inequalities while the second is based
on the generalized likelihood ratio.
Joint work with Matthias Kleinmann, Philipp Schindler, Thomas Monz,
Otfried Gühne, and Rainer Blatt.
Reference: arXiv:1204.3644
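Schematically (a generic form of such a test, not quoted from the
paper), the generalized likelihood ratio compares the best fit within
the assumed model, i.e. quantum states measured by the assumed
apparatus, with the best unconstrained fit of the observed frequencies,

    \lambda(D) = \frac{\sup_{\rho} L(D\,|\,\rho)}{\sup_{p} L(D\,|\,p)} \le 1 ,

and data for which -2 \log\lambda is much larger than expected from
statistical fluctuations alone signal a systematic error.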
Joshua Nunn (University of Oxford) -
Thursday 11:35am
Optimal experiment design for quantum state tomography: Fair,
precise, and minimal tomography
Given an experimental setup and a fixed number of measurements,
how should one take data to optimally reconstruct the state of
a quantum system? The problem of optimal experiment design (OED)
for quantum state tomography was first broached by Kosut et al.
[R. Kosut, I. Walmsley, and H. Rabitz, e-print arXiv:quant-ph/0411093
(2004)]. Here we provide efficient numerical algorithms for finding
the optimal design, and analytic results for the case of 'minimal
tomography'. We also introduce the average OED, which is independent
of the state to be reconstructed, and the optimal design for tomography
(ODT), which minimizes tomographic bias. Monte Carlo simulations
confirm the utility of our results for qubits. Finally, we adapt
our approach to deal with constrained techniques such as maximum-likelihood
estimation. We find that these are less amenable to optimization
than cruder reconstruction methods, such as linear inversion.
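For reference, a standard way of phrasing such design problems (not
necessarily the exact criterion of the talk): distribute the
measurement budget among settings by choosing fractions
\lambda_i \ge 0 with \sum_i \lambda_i = 1 so as to minimize a scalar
summary of the estimator covariance, for instance

    \min_{\lambda}\ \operatorname{tr}\Bigl[\bigl(\textstyle\sum_i \lambda_i F_i\bigr)^{-1}\Bigr] ,

where F_i is the Fisher information contributed by setting i; the
'average' design mentioned above removes the dependence on the unknown
state, roughly by averaging such an objective over the states to be
reconstructed.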
Jaroslav Rehácek (Palacky University)
- Tuesday 4:05pm
A little bit different quantum state tomography
Any quantum tomography scheme combines prior information about
the measured system and apparatus with the results of the measurement.
Here we show how to play with these ingredients. While standard
tomography requires perfect knowledge of the measurement apparatus
and the choice of a finite reconstruction subspace, the proposed
data pattern tomography allows us to relax some of these conditions.
Guided by optical analogies, tomography can be done even without
precise knowledge of the measurement, in a way resembling classical
image processing, where the recorded response to a sufficiently rich
family of reference states provides a quantum mechanical analog of
the optical response function. This will be suggested as a toolbox
for time-multiplexed detection devices. Besides this,
quantum homodyne tomography will be revisited from the point of
view of informational completeness and all this will provide an
interesting link between the tomography of discrete and continuous
variable systems.
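A minimal numerical sketch of the data-pattern step (illustrative only;
the random matrices stand in for real detector responses): the signal's
recorded pattern is expanded in the recorded patterns of the reference
states, and the same coefficients are then applied to the known
reference density matrices.

    # Sketch: "data pattern" fitting with the detector response left
    # uncharacterized.  Columns of R are the recorded responses to known
    # reference states; the unknown signal's response is expanded in them.
    import numpy as np

    rng = np.random.default_rng(1)
    n_outcomes, n_refs = 12, 8
    R = rng.random((n_outcomes, n_refs))       # reference data patterns
    x_true = rng.normal(size=n_refs)
    signal = R @ x_true                        # measured pattern of the signal

    x, *_ = np.linalg.lstsq(R, signal, rcond=None)
    # rho_signal is then approximated by sum_j x[j] * rho_ref[j],
    # with the reference density matrices known beforehand.
    print(np.allclose(x, x_true))              # True: the expansion is recovered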
Barry C Sanders (University of Calgary)
- Thursday 9:00am
Artificial-Intelligence Reinforcement Learning for Quantum Metrology
with Adaptive Measurements
Quantum metrology aims for measurements of quantum channel (or
process) parameters that are more precise than allowed by classical
partition noise (shot noise) given a fixed number of input particles.
Specifically, the imprecision of the process-parameter estimate
scales inversely with the square root of the number of particles
in the classical domain, and up to inversely with the number of
particles in the quantum domain by exploiting entanglement
between particles. Quantum adaptive-measurement schemes employ
entangled-particle inputs and sequential measurements of output
particles with feedback control on the channel in order to maximize
the knowledge gain from the subsequent particles being sequentially
processed. Quantum-adaptive approaches have the advantage that
input states are expected to be easier to make experimentally
than for non-adaptive schemes.
We are interested in devising input states and adaptive feedback
control on the processes to beat the standard quantum-measurement
limit in real-world scenarios with noise, decoherence and particle
losses. As such procedures are difficult to find even in ideal
cases, the usual method of clever guessing is inadequate for this
purpose. Instead we employ artificial-intelligence machine learning
to find adaptive-measurement procedures that beat the standard
quantum limit. I will discuss our approaches, based on reinforcement
learning and evolutionary computation, to finding procedures for
adaptive interferometric phase estimation, and show that machine
learning has enabled us to find procedures that outperform the
previously known best cases in the ideal noiseless, decoherence-free,
lossless scenario, as well as to devise robust procedures for noisy,
decoherent, lossy scenarios.
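For orientation, a schematic form of the feedback rule in adaptive
interferometric phase estimation (stated from the general literature on
such schemes, not quoted from the talk): after the k-th single-particle
detection result u_k in {0,1}, the controllable phase is updated as

    \Phi_k = \Phi_{k-1} - (-1)^{u_k}\,\Delta_k ,

and the "procedure" that the learning algorithm searches for is the
sequence of increments (\Delta_1, ..., \Delta_N), scored by how sharply
the final phase estimate is peaked.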
Marcus Silva (Raytheon BBN Technologies)
- Friday 9:35am
Process tomography without preparation and measurement errors
Randomized benchmarking (RB) can be used to estimate the fidelity
to Clifford group operations in a manner that is robust against
preparation and measurement errors --- thus allowing for a more
accurate quantification of the error compared to standard quantum
process tomography protocols. In this talk we will show how to
combine multiple RB experiments to reconstruct the unital part
of any trace preserving quantum process, while maintaining robustness
against preparation and measurement errors. This enables, in particular,
the reconstruction of any unitary or random unitary (e.g. dephasing)
error, and the quantification of the fidelity to any unitary operation.
Authors: Marcus P. da Silva (1), Shelby Kimmel (2), Colm A. Ryan
(1), Blake Johnson (1), Thomas Ohki (1)
(1) Quantum Information Processing Group, Raytheon BBN Technologies,
Cambridge, MA, USA
(2) Center for Theoretical Physics, MIT, Cambridge, MA
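In the Pauli-Liouville (transfer-matrix) picture the "unital part" has a
simple meaning: a trace-preserving map \Lambda is represented as

    R_\Lambda = \begin{pmatrix} 1 & 0^{\mathsf T} \\ \tau & T \end{pmatrix} ,

where the block T, the unital part, maps traceless Pauli components
into one another and \tau is the non-unital displacement. The
robustness claimed above rests on the standard feature of randomized
benchmarking that preparation and measurement errors change only the
amplitude and offset of the fidelity decay curves, not the decay rates
from which T is inferred.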
Denis Sych (Max-Planck Institute for the Science
of Light) - Friday 11:35am
Informational completeness of continuous-variable measurements
Measurement lies at the very heart of quantum information. A
set of measurements whose outcome probabilities are sufficient
to determine an arbitrary quantum state is called informationally
complete, while the process of reconstructing the state itself
is broadly called quantum tomography. In our work, we consider
continuous-variable measurements and prove that homodyne tomography
turns out to be informationally complete when the number of independent
quadrature measurements is equal to the dimensionality of the density
matrix in the Fock representation. Using this as our thread, we
examine the completeness of other schemes, when the continuous-variable
observations are truncated to discrete finite-dimensional subspaces.
Joint work with J. Rehácek, Z. Hradil, G. Leuchs, and L.
L. Sánchez-Soto
Geza Toth (University of the Basque Country
UPV/EHU) - Friday 2:35pm
Permutationally Invariant Quantum Tomography and State Reconstruction
We present a scalable method for the tomography of large multiqubit
quantum registers. It acquires information about the permutationally
invariant part of the density operator, which is a good approximation
to the true state in many relevant cases. Our method gives the
best measurement strategy to minimize the experimental effort
as well as the uncertainties of the reconstructed density matrix.
We apply our method to the experimental tomography of a photonic
four-qubit symmetric Dicke state. We also discuss how to obtain
a physical density matrix in a scalable way based on maximum likelihood
and least mean square fitting.
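The object being reconstructed is the permutationally invariant part
of the state,

    \rho_{\mathrm{PI}} = \frac{1}{N!} \sum_{\pi \in S_N} \Pi_\pi\, \rho\, \Pi_\pi^{\dagger} ,

where \Pi_\pi permutes the N qubits; since \rho_{\mathrm{PI}} is fixed
by a number of parameters that grows only polynomially in N, both the
number of local measurement settings and the post-processing can be
kept scalable, which is what the method exploits.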
Peter S Turner (University of Tokyo)
- Monday 11:00am
t-designs in quantum optics
This talk is meant to be a brief review of the concept of t-designs,
with applications in quantum information theory, particularly
in the realm of quantum optics. I will discuss both vector ("state")
and matrix ("unitary") t-designs, and try to give some
feeling for why they are interesting both mathematically and physically.
I will then summarize some recent research regarding the surprising
non-existence of Gaussian vector 2-designs, with implications
for continuous variable tomography, as well as outline current
work aimed at implementing matrix designs in photonic circuits
in collaboration with experimentalists.
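For reference, the defining property: a weighted set of states
{(p_i, |\psi_i\rangle)} is a (state) t-design if

    \sum_i p_i \bigl(|\psi_i\rangle\langle\psi_i|\bigr)^{\otimes t} = \int d\psi\, \bigl(|\psi\rangle\langle\psi|\bigr)^{\otimes t} ,

with the integral taken over the unitarily invariant (Fubini-Study)
measure; unitary t-designs are defined analogously by matching the Haar
average of U^{\otimes t} \otimes (U^*)^{\otimes t}. A design therefore
reproduces all moments up to order t, which is what makes finite
designs useful stand-ins for uniformly random states or unitaries in
tomography and benchmarking.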
Guoyong Xiang (University of Science
and Technology of China) - Wednesday 11:35am
Experimental verification of an entangled state with finite data
Entanglement is an irreplaceable resource for quantum information
science, and because of the superposition principle, quantum states
are highly complex and difficult to identify. How to verify the
entanglement of a quantum state has therefore attracted significant
attention, and several methods have been developed for this task.
Recently, entangled states of more and more particles have been
prepared on a wide variety of physical systems, but most of the time
only finitely many copies of the state of interest are available, so
these methods can only give probabilistic conclusions. Attempts have
been made to reduce the number of measurements and improve the
precision, but the problem has not been solved fundamentally, so we
need to consider how well the entanglement of a quantum state can be
verified with finite copies.
Blume-Kohout and coauthors recently proposed a method to assign a
confidence to entanglement verification with finite data [1]. They
propose a reliable way to quantify the weight of evidence for (or
against) entanglement, based on a likelihood ratio test. In this work,
we implement an experiment demonstrating that entangled states become
easier to distinguish as the concurrence increases, while states on
the boundary between entangled and separable states are hard to
distinguish. We then use the likelihood ratio to quantify the
entanglement of a quantum state, and the experiment shows that the
confidence increases as more data are collected.
[1] R. Blume-Kohout, J. O. S. Yin, and S. J. van Enk, Phys. Rev.
Lett. 105, 170501 (2010)
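Schematically, the likelihood-ratio statistic of ref. [1] compares the
best separable explanation of the data with the best explanation
overall,

    \lambda(D) = \frac{\max_{\rho \in \mathrm{SEP}} L(D\,|\,\rho)}{\max_{\rho} L(D\,|\,\rho)} ,

so that a small value of \lambda constitutes strong evidence for
entanglement; the observations above about concurrence and about the
amount of data are statements about how quickly \lambda decreases.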
Karol Zyczkowski (Jagiellonian
University) - Wednesday 4:05pm
Measurements on random states and numerical shadow
When reconstructing quantum states from measured data, one usually
assumes that the measurement procedure is optimized. Here we analyze
the complementary problem of interpretation of data obtained from
'measurements on random samples'. We assume that the observable
A is given, while the input states of dimension N are distributed
uniformly with respect to the natural, unitary invariant, Fubini--Study
measure. We show a link between the possible outcomes of a measurement
of A on k copies of the same random pure state and the mathematical
notion of the numerical range of A. Repeating such a measurement
for several samples of random states leads to the probability
distribution P(x), which coincides with the numerical shadow of
the observable A. We show, in particular, that for any observable
A acting on a single qubit, N=2, the distribution P(x) is constant
in a certain interval. For higher dimensions the distributions
obtained can be interpreted as projections of the set of pure
quantum states on a line. In the case of a coincidence measurement
of any two observables, A_1 and A_2, one arrives at distributions
equivalent to the shadow of the set of quantum states projected
onto a plane.
Joint work with Piotr Gawron, Jaroslaw Miszczak, and Zbigniew Puchala
(Gliwice, Poland).
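The distribution P(x) can be sampled directly; here is a minimal sketch
(not the speakers' code; matplotlib is assumed for the histogram):

    # Sketch: sampling the numerical shadow of an observable A, i.e. the
    # distribution of <psi|A|psi> over Fubini-Study random pure states.
    import numpy as np
    import matplotlib.pyplot as plt

    def random_pure_states(dim, n, rng):
        z = rng.normal(size=(n, dim)) + 1j * rng.normal(size=(n, dim))
        return z / np.linalg.norm(z, axis=1, keepdims=True)

    rng = np.random.default_rng(0)
    A = np.diag([0.0, 1.0])                   # single-qubit observable, N = 2
    psi = random_pure_states(2, 100_000, rng)
    x = np.real(np.einsum('ni,ij,nj->n', psi.conj(), A, psi))

    plt.hist(x, bins=60, density=True)        # flat on [0, 1], as stated above
    plt.xlabel('x'); plt.ylabel('P(x)')
    plt.show()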
Poster Abstracts
Gittsovich, Oleg (Institute for Quantum Computing, University of Waterloo)
Reliable Entanglement Verification
Any experiment attempting to verify the presence of entanglement
in a physical system can only generate a finite amount of data. The
statement that entanglement was present in the system can thus never
be issued with certainty, requiring instead a statistical analysis of
the data. Because entanglement plays a central role in the performance
of quantum devices, it is crucial to make statistical claims in entanglement
verification experiments that are reliable and have a clear interpretation.
In this work, we apply recent results by M. Christandl and R. Renner
to construct a reliable entanglement verification procedure based on
the concept of confidence regions. The statements made do not require
the specification of a prior distribution, the assumption of independent
measurements nor the assumption of an independent and identically distributed
(i.i.d.) source of states. Moreover, we develop numerical tools that
are necessary to employ this approach in practice, rendering the procedure
ready to be applied to current experiments. We demonstrate this technique
by analyzing the data of a photonic experiment generating two-photon
states whose entanglement is verified with the use of an accessible
nonlinear witness.
Godoy, Sergio (Pontificia Universidad Catolica de Chile)
Local sampling of phase space by unbalanced optical homodyning
Authors: S. Godoy, S. Wallentowitz, B. Seifert
Goyeneche, Dardo (Center for Optics and Photonics)
Efficient tomography for pure quantum states
Hou, Zhibo (Key Laboratory of Quantum Information)
Quantum State Reconstruction via Linear Model Theory
Johnston, Nathaniel (University of Waterloo)
Uniqueness of Quantum States Compatible with Given Measurement Results
Kimmel, Shelby & da Silva, Marcus P. (Massachusetts Institute of Technology)
Robust Randomized Benchmarking Beyond Cliffords
Kueng, R. & Gross, D. (University of Freiburg)
Qubit Stabilizer States are Complex Projective 3-designs
Magesan, Easwar (Massachusetts Institute of Technology)
Efficient Measurement of Gate Error by Interleaved Randomized Benchmarking
Mahler, Dylan (University of Toronto)
Experimental Demonstration of Adaptive Tomography
Quesada, Nicolás (University of Toronto)
Simultaneous State and Process Tomography of Qudits
by N. Quesada, A. M. Branczyk and D. F. V. James
Rapcan, Peter (Institute of Physics, Slovak Academy of Sciences)
Direct Estimation of Decoherence Rates (joint work/poster with Dr. Mario Ziman)
Salazar, Roberto (University of Concepción)
Quantum Bayesianism for an infinite countable set of dimensions
Quantum Bayesianism is a novel mathematical formulation of quantum
mechanics whose origin lies in optimal tomographic schemes. Usually
this formulation is implemented by replacing Born's rule with a
Bayesian modified rule. Until now this has been done by using a
special kind of POVM known as a SIC-POVM. The main problem of this
formulation is that the existence of SIC-POVMs in every finite
dimension is still unknown, although conjectured. Here we show that a
different set of POVMs allows the replacement of Born's rule by a
Bayesian modified rule. In particular, we show that such POVMs exist
for an infinite countable set of finite dimensions.
Schwemmer, Christian (Max-Planck-Institut für Quantenoptik)
Permutationally Invariant Tomography of Symmetric Dicke States
Sedlak, Michal (Palacky University)
Discrimination of quantum measurements
Sugiyama, Takanori (University of Tokyo)
Error analysis of maximum likelihood estimator in one-qubit state
tomography with finite data
Tanaka, Fuyuhiko (University of Tokyo)
Minimax estimation of density operators for general parametric models
Ziman, Mario (Institute of Physics SAS)
Direct estimations of decoherence parameters