The Quantum Regression Theorem For Out-of-time-ordered Correlation Functions

Philip Daniel Blocher, Klaus Mølmer · Physical Review A · 2018

We derive an extension of the quantum regression theorem for calculating out-of-time-order correlation functions in Markovian open quantum systems. While such correlation functions have so far mostly been applied in the analysis of many-body physics, we demonstrate that they appear naturally in optical detection schemes with interferometric delay lines, and we apply our extended quantum regression theorem to calculate the nontrivial photon-counting fluctuations in split and recombined signals from a quantum light source.
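To make the starting point concrete, the sketch below illustrates the *standard* quantum regression theorem that the paper extends, not the paper's out-of-time-order version: for a Markovian open system, a time-ordered correlation such as ⟨σ₊(t)σ₋(0)⟩ is obtained by applying σ₋ to the steady state, propagating the result with the Lindblad map, and tracing against σ₊. The model (a driven, decaying two-level emitter) and all parameter values are illustrative assumptions using only NumPy/SciPy.

```python
import numpy as np
from scipy.linalg import expm

# Driven, decaying two-level emitter (illustrative model and parameters,
# not taken from the paper).
sm = np.array([[0, 1], [0, 0]], dtype=complex)  # lowering operator sigma_-
sp = sm.conj().T                                # raising operator sigma_+
I2 = np.eye(2, dtype=complex)

omega = 1.0   # Rabi drive strength (assumed)
gamma = 0.5   # spontaneous-decay rate (assumed)
H = 0.5 * omega * (sm + sp)
L = np.sqrt(gamma) * sm

def liouvillian(H, jump_ops):
    """Row-major-vectorized Lindblad generator: d vec(rho)/dt = Lv @ vec(rho).
    Uses vec(A rho B) = (A kron B^T) vec(rho) for row-stacked vec."""
    Lv = -1j * (np.kron(H, I2) - np.kron(I2, H.T))
    for J in jump_ops:
        JdJ = J.conj().T @ J
        Lv += np.kron(J, J.conj()) - 0.5 * (np.kron(JdJ, I2) + np.kron(I2, JdJ.T))
    return Lv

Lv = liouvillian(H, [L])

# Steady state = null vector of the Liouvillian, normalized to unit trace.
w, v = np.linalg.eig(Lv)
rho_ss = v[:, np.argmin(np.abs(w))].reshape(2, 2)
rho_ss = rho_ss / np.trace(rho_ss)

def correlation(t):
    """<sigma_+(t) sigma_-(0)> via the standard quantum regression theorem:
    propagate sigma_- rho_ss with exp(Lv t), then trace against sigma_+."""
    evolved = (expm(Lv * t) @ (sm @ rho_ss).reshape(-1)).reshape(2, 2)
    return np.trace(sp @ evolved)
```

At t = 0 this reduces to Tr[σ₊σ₋ρ_ss], the steady-state excited population, and it relaxes toward ⟨σ₊⟩ss⟨σ₋⟩ss on the timescale set by γ. The paper's contribution is that operator orderings which cannot be brought into this time-ordered form (out-of-time-order correlators) require an extension of this recipe.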
