Quantum probability, measurement, information, and computation

My recent paper studies a joint probability for non-commuting quantum observables. It is known that no precise and additive joint probability exists for such observables; there are only quasi-probabilities (e.g. the Wigner function), which become negative for certain states and thereby lose their physical meaning. Here

A. Allahverdyan, Imprecise probability for non-commuting observables, New J. Phys. 17, 085005 (2015).

I show that a unique (under certain meaningful conditions) and positive probability can be defined for a pair of non-commuting observables, but this probability ought to be imprecise and non-additive. Though such probabilities are known in computer science, they have so far not found real physical applications. Hopefully the situation will change, and imprecise probabilities will eventually be employed in classical physics as well, e.g. in statistical mechanics.
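The two notions above can be illustrated numerically. The sketch below is not the construction of the paper; it uses the standard Margenau-Hill quasi-probability for the non-commuting pair sigma_x, sigma_z (with a state chosen purely for illustration) to show negativity, and a generic credal-set construction with hypothetical numbers to show how a lower probability stays non-negative at the price of additivity:

```python
import numpy as np

# --- 1. A quasi-probability that goes negative -------------------------
# Margenau-Hill distribution P(a,b) = Re <psi| Px_a Pz_b |psi> for the
# non-commuting pair sigma_x, sigma_z.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)
Px = {+1: (I2 + sx) / 2, -1: (I2 - sx) / 2}  # eigenprojectors of sigma_x
Pz = {+1: (I2 + sz) / 2, -1: (I2 - sz) / 2}  # eigenprojectors of sigma_z

theta = 3 * np.pi / 8  # a state that is an eigenstate of neither observable
psi = np.array([np.cos(theta), np.sin(theta)], dtype=complex)

mh = {(a, b): float(np.real(psi.conj() @ Px[a] @ Pz[b] @ psi))
      for a in (+1, -1) for b in (+1, -1)}
total = sum(mh.values())     # normalized to 1 ...
negative = min(mh.values())  # ... but one entry is negative for this state

# --- 2. An imprecise (non-additive) probability ------------------------
# Credal set: two hypothetical ordinary distributions over three outcomes.
# The lower envelope is non-negative but only superadditive.
credal = [{0: 0.5, 1: 0.1, 2: 0.4},
          {0: 0.1, 1: 0.5, 2: 0.4}]

def lower(event):
    """Lower probability: worst case over the credal set."""
    return min(sum(p[w] for w in event) for p in credal)

A, B = {0}, {1}  # disjoint events
gap = lower(A | B) - (lower(A) + lower(B))  # > 0: strictly non-additive
print(negative, total, gap)
```

Both halves make the same point from opposite sides: demanding a precise, additive joint distribution forces negativity, while keeping positivity forces one to give up additivity.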

Since 2000 we have produced several models of the quantum measurement process. This activity is reviewed here

Our most recent achievement is a model-independent theory of ideal quantum measurements that stands on the same footing as thermodynamics, i.e. it follows from very few macroscopic statements that can be verified microscopically. This theory is summarized in our preprint

Here is the definition of the quantum measurement problem and a short summary of that research.

One of the main challenges in the foundations of quantum theory is still the measurement problem: Why and how can we make well-defined statements about an individual quantum system at the end of an ideal measurement, despite the fact that quantum probabilities are not (and cannot be) defined on any sample space?

Two aspects of this definition of the measurement problem are crucial: 

(1) Quantum theory does not involve the same probabilities as classical physics, where probabilities can be assigned to the individual elements of a statistical ensemble; 

(2) Experiments provide information about individual quantum systems, and this should be explained by a theory of the quantum measurement process. 

Research on such topics culminated in well-known no-go theorems showing in which sense the measurement problem cannot, or should not, be resolved within quantum mechanics. The problem also motivated several extensions of quantum mechanics, e.g. the objective collapse theories.

We propose a resolution of the measurement problem via ideas of statistical mechanics. Within our approach the measurement process acquires a thermodynamic status: It can be viewed as the establishment of a generalized equilibrium for the apparatus A coupled to the tested system S. The analysis of the dynamical process which leads to this outcome is based on the methods of quantum statistical mechanics and involves two steps. The first one deals with the dynamics of the density matrix of S+A, which describes a large ensemble of identically prepared runs; it yields the desired generalized equilibrium form for the final density matrix. However, a quantum ambiguity forbids inferring properties of individual systems from the knowledge of the density matrix of the ensemble to which they belong, so this step is not sufficient to account for the occurrence of a well-defined outcome in each individual run of the ensemble.

Therefore, in a second step, a stronger result is established, still within standard quantum statistical mechanics, concerning all possible subensembles of runs: Their associated density matrices are shown to relax towards the required structure owing to a specific mechanism that acts near the end of the process. A solution of the measurement problem is thus achieved.
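The "quantum ambiguity" invoked above is the non-uniqueness of ensemble decompositions of a density matrix, which can be checked directly. A minimal sketch (a textbook qubit example, not our measurement model): two physically distinct ensembles of pure states yield exactly the same density matrix, so the density matrix alone cannot tell us which pure state any individual run is in.

```python
import numpy as np

# Two different statistical ensembles of pure qubit states:
#   E1: |0> and |1>, each with weight 1/2
#   E2: |+> and |->, each with weight 1/2
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
ketp = (ket0 + ket1) / np.sqrt(2)
ketm = (ket0 - ket1) / np.sqrt(2)

def rho(states, weights):
    """Density matrix of a statistical mixture of pure states."""
    return sum(w * np.outer(s, s.conj()) for s, w in zip(states, weights))

rho1 = rho([ket0, ket1], [0.5, 0.5])
rho2 = rho([ketp, ketm], [0.5, 0.5])

# Run by run the ensembles are distinct, yet both give rho = I/2:
print(np.allclose(rho1, rho2))  # True
```

This is why the first step (relaxation of the ensemble density matrix) must be supplemented by the second step concerning all subensembles.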