I am currently working on quantum stochastic calculus under the direction of Jan Wehr. Quantum stochastic calculus is a non-commutative generalization of the classical theory of stochastic processes, from which the classical theory can be recovered. It makes heavy use of quantum mechanics (particularly second quantization), functional analysis, stochastic differential equations, and general probability theory. It is used to model open quantum systems: quantum systems interacting with a larger system (usually the environment) which has many more degrees of freedom and whose dynamics are accounted for probabilistically. Essentially every real quantum system is an open quantum system, so this theory has applications ranging from the concrete, e.g. quantum computation, to the fundamental, e.g. the measurement problem.

The measurement problem in quantum mechanics arises from a tension between the unitary Schrödinger dynamics and the projection postulate, which describes the collapse of the wave function. It is the projection postulate which produces, from a range of possible outcomes with certain probabilities, an actual outcome, a process known as objectification. Unfortunately, there is no universally agreed upon description of how these projections arise, and accounting for them is the purpose of the various interpretations of quantum mechanics. One approach I have worked on recently is that of stochastic collapse models. In this theory, collapse happens through an omnipresent but indeterministic collapse process arising from a coupling of the system to a classically stochastic field. The following animation shows this collapse process, driven by Poisson noise (as in the GRW interpretation) acting on the quantum harmonic oscillator. In red is the probability density; multiple peaks in this function roughly mean the particle is in multiple places at once. In blue and green are the real and imaginary parts of the wavefunction. The initial state is an energy eigenstate in which the particle can occupy roughly four different places with high probability, but as the projection process acts on the wavefunction the particle can occupy only roughly one place, and that place oscillates left and right like a classical oscillator. This shows how a purely quantum system in superposition becomes objectified and begins to behave classically via the indeterministic collapse process.
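The animation itself is not reproduced here, but the basic collapse mechanism it illustrates can be sketched in a few lines. Below is a minimal, hypothetical Python/NumPy sketch of a single GRW-style localization event: the collapse center is sampled from the probability density (a common simplification of the full GRW prescription), and the wavefunction is then multiplied by a Gaussian of width `sigma` and renormalized. The function name `grw_jump` and all parameter values are illustrative assumptions, not the actual code behind the animation.

```python
import numpy as np

def grw_jump(psi, x, sigma=1.0, rng=None):
    """Apply one GRW-style localization event to a discretized wavefunction.

    A collapse center is sampled from the probability density |psi|^2
    (a simplification of the full GRW rule), then psi is multiplied by
    a Gaussian of width sigma centered there and renormalized.
    """
    if rng is None:
        rng = np.random.default_rng()
    dx = x[1] - x[0]
    prob = np.abs(psi) ** 2 * dx
    center = rng.choice(x, p=prob / prob.sum())
    psi = psi * np.exp(-(x - center) ** 2 / (4 * sigma ** 2))
    return psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

# A two-peak superposition: the particle is "in two places at once".
x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
psi = np.exp(-(x - 4) ** 2) + np.exp(-(x + 4) ** 2)
psi = psi / np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

collapsed = grw_jump(psi, x, sigma=1.0, rng=np.random.default_rng(0))
# After the jump the density is concentrated near one peak or the other.
mean = np.sum(x * np.abs(collapsed) ** 2) * dx
var = np.sum((x - mean) ** 2 * np.abs(collapsed) ** 2) * dx
```

In the full model these jumps occur at Poisson-distributed times, interleaved with unitary Schrödinger evolution; the sketch isolates just the localization step.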

The first thing that can be done with these models is to use quantum stochastic calculus to make them fully quantum by replacing the classical noise with quantum noises. The other interesting result we found is that the history of the system is encoded into the distribution of the noise, so that a kind of observer part of the total wavefunction keeps track of all the events of the system, much as Hugh Everett, the originator of the many-worlds interpretation, originally envisioned. But this is just one take on the measurement problem that I find interesting.

I am also interested in how David Bohm and Basil Hiley's ideas about the implicate and explicate order can be applied to the measurement problem. For them, quantum mechanics is described by an algebra, which provides an underlying (implicate) order to reality. This order is not fully realized in our manifested (explicate) order, which captures something akin to a projection of this underlying structure. This is related to the conspicuous position bias of our universe, even though quantum mechanics a priori expresses no preference for any representation. The idea of implicate and explicate orders can be seen in the following animation. The animation is generated by simple operations on a few polynomials. What is plotted are the roots of these polynomials, which undergo much more complicated and entangled motion. Here the implicate order is the underlying algebra of polynomials and the operations that act on it, while the explicate order is the order reflected in the roots of a few of these polynomials.
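The contrast between simple algebraic operations and complicated root motion can be sketched directly. Below is a minimal, hypothetical Python/NumPy illustration (not the code behind the animation): the "implicate" operation is just a linear blend of two polynomials' coefficient vectors, while the "explicate" roots trace out nonlinear curves in the complex plane as the blend parameter varies.

```python
import numpy as np

# Two cubics given by coefficients, highest degree first (illustrative choices).
p = np.array([1.0, 0.0, -1.0, 0.0])   # x^3 - x, roots -1, 0, 1
q = np.array([1.0, 0.0, 0.0, -1.0])   # x^3 - 1, roots are the cube roots of unity

# The algebraic (implicate) operation is trivially simple: blend coefficients.
# The motion of the roots (explicate) is nonlinear and entangled.
ts = np.linspace(0.0, 1.0, 5)
trajectories = [np.roots((1 - t) * p + t * q) for t in ts]
```

Plotting each set of roots in the complex plane for finely spaced `t` produces the kind of entangled curves the animation shows, even though the underlying operation is a straight line in coefficient space.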

It may very well be the case that the measurement problem is not a fundamental problem of quantum mechanics, but a problem of our explicate order. Indeed, studying stochastic collapse models has led to the question of just what is the nature of a realization of a noise process (and of a unique trajectory for our universe), as it seems the purely quantum picture describes the collection of outcomes, but not the individual outcomes themselves. In this sense only one history is ever explicated from the purely (implicate) quantum state, at least as far as we are able to observe.