Oliver Club
We consider a statistical model of n-mode quantum Gaussian states that are shift-invariant and gauge-invariant. Such models can be regarded as analogs of classical Gaussian stationary time series, parametrized by their spectral density. Defining an appropriate quantum spectral density as the parameter, we establish that the quantum time series model is asymptotically equivalent to a classical Gaussian white noise (or regression) model in which the signal is a transformation of the quantum spectral density. Asymptotic equivalence is established in the sense of the quantum deficiency distance between statistical models (experiments). This result simplifies statistical inference, as the approximating model is classical (non-quantum), has no dependence structure, and the parameter is the mean of a Gaussian process. If time permits, we will also identify a quantum analog of the periodogram and discuss optimal nonparametric estimation of the quantum spectral density.
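As a rough orientation (notation ours, not taken from the talk), the deficiency distance between two experiments and the standard form of a Gaussian white noise model can be sketched as follows, where g_f denotes a placeholder for the transformation of the quantum spectral density f:

```latex
% Le Cam deficiency distance between experiments E and F:
% one-sided deficiency, optimized over randomizations (Markov kernels) T
\[
  \delta(\mathcal{E},\mathcal{F})
    = \inf_{T}\,\sup_{\theta}\,
      \bigl\| T P_\theta - Q_\theta \bigr\|_{\mathrm{TV}},
  \qquad
  \Delta(\mathcal{E},\mathcal{F})
    = \max\bigl\{\delta(\mathcal{E},\mathcal{F}),\,
                 \delta(\mathcal{F},\mathcal{E})\bigr\}.
\]
% Classical Gaussian white noise model with signal g_f
% (the transformed quantum spectral density) observed at noise level 1/sqrt(n):
\[
  dY(t) = g_f(t)\,dt + \tfrac{1}{\sqrt{n}}\,dW(t),
  \qquad t \in [0,1],
\]
% where W is standard Brownian motion; asymptotic equivalence means
% Delta between the quantum model and this model tends to 0 as n -> infinity.
```

Two experiments with vanishing deficiency distance admit the same asymptotic risk for any bounded loss, which is why inference can be transferred to the simpler white noise model.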