Mathematical tools
We keep a close eye on advanced tools developed by mathematicians, for instance:
- Concentration of measure. This property can be stated informally as follows: a function depending on many independent random variables, but not too much on any single one of them, is essentially constant and equal to its mean value almost everywhere [1,2].
The simplest way to understand this phenomenon is to consider a physical experiment whose measurement outcome is disturbed by random noise. Averaging over \(N\) measurements taken under stationary conditions increases the signal-to-noise ratio by a factor \(\sqrt{N}\). This is the simplest illustration of the central limit theorem.
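This \(\sqrt{N}\) improvement is easy to check numerically. The sketch below (in Python with NumPy; the signal value and noise level are arbitrary illustrative choices) averages \(N\) noisy measurements many times and measures the spread of the averaged estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
signal, noise_std, N, trials = 1.0, 1.0, 100, 20000

# Each trial: N measurements of `signal` corrupted by Gaussian noise.
measurements = signal + noise_std * rng.standard_normal((trials, N))
averages = measurements.mean(axis=1)

# The spread of the averaged estimate is noise_std / sqrt(N),
# i.e. the signal-to-noise ratio improves by a factor sqrt(N).
print(averages.mean())  # close to the true signal, 1.0
print(averages.std())   # close to noise_std / sqrt(N) = 0.1
```

With \(N = 100\), the fluctuation of the average is ten times smaller than that of a single measurement, as the central limit theorem predicts.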
The very same phenomenon appears in a thermodynamic system away from criticality, where the additivity of extensive quantities makes them self-averaging: any macroscopic additive observable is the sum of many uncorrelated mesoscopic contributions and therefore obeys a central limit theorem.
Surprisingly, this self-averaging occurs not only for the empirical averages just mentioned, but also for highly nonlinear functions such as a reduced density matrix (encoding the state of a subsystem) considered as a function of the interaction Hamiltonian alone, all other parameters being fixed: the sum of the subsystem Hamiltonians, the initial state, and the strength of the interaction. This phenomenon is the starting point of our statistical method for tackling the many-body problem (see Typicality of quantum dynamics).
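A toy illustration of concentration for a nonlinear observable (a sketch in Python with NumPy; the choice of function is ours, simpler than the reduced-density-matrix setting above): the normalized Euclidean norm \(\|x\|/\sqrt{n}\) is a nonlinear function of \(n\) independent Gaussian coordinates, none of which dominates, and it concentrates sharply around 1 as \(n\) grows:

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 5000
stds = {}

for n in (10, 100, 1000):
    # n independent standard Gaussian coordinates per trial.
    x = rng.standard_normal((trials, n))
    norms = np.linalg.norm(x, axis=1) / np.sqrt(n)
    stds[n] = norms.std()
    # The fluctuation of this nonlinear observable shrinks like 1/sqrt(2n):
    # the function is "essentially constant" for large n.
    print(n, norms.mean(), norms.std())
```

The mean stays pinned near 1 while the fluctuations shrink with \(n\), even though the norm is not a simple empirical average of the coordinates.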
- Free probability. The theory of free probability [3,4] extends the natural concept of independence for real random variables to the level of operators (or, approximately, of very large matrices). Consider for instance two very large Hermitian random matrices \(\hat{A}\) and \(\hat{B}\): if the eigenvectors of, say, \(\hat{A}\) are isotropically distributed in the eigenbasis of \(\hat{B}\), then \(\hat{A}\) and \(\hat{B}\) are in generic position with respect to one another: they are said to be free. In this case, the density of states (DOS) of \(\hat{A}+\hat{B}\) depends only on the DOS of \(\hat{A}\) and that of \(\hat{B}\): it is their free convolution. The free convolution plays, for free random operators, the role that the classical convolution plays for independent real random variables. In summary, free probability theory provides a statistical answer to the problem of understanding the behavior of the eigenvalues and eigenvectors of a large Hermitian matrix \(\hat{A}\) when another Hermitian matrix \(\hat{B}\) is added to it.
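A small numerical experiment makes the point (a sketch in Python with NumPy; the \(\pm 1\) spectra are our illustrative choice). Take \(\hat{A}\) and \(\hat{B}\) with identical spectra of eigenvalues \(\pm 1\), but with the eigenbasis of \(\hat{B}\) rotated by a random orthogonal matrix, so that the two matrices are asymptotically free. Classically, sums of independent \(\pm 1\) variables take only the values \(-2, 0, 2\); the spectrum of \(\hat{A}+\hat{B}\) instead spreads over the whole interval \([-2,2]\), following the free convolution of the two \(\pm 1\) laws (the arcsine law):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000

# A: Hermitian matrix with eigenvalues +1 (half) and -1 (half).
spectrum = np.concatenate([np.ones(n // 2), -np.ones(n // 2)])
A = np.diag(spectrum)

# B: same spectrum, but in a randomly rotated (approximately Haar) basis,
# so that A and B are in generic position: asymptotically free.
Q, R = np.linalg.qr(rng.standard_normal((n, n)))
Q = Q * np.sign(np.diag(R))  # sign fix so Q is Haar-distributed
B = Q @ np.diag(spectrum) @ Q.T

eigs = np.linalg.eigvalsh(A + B)
# The DOS of A + B fills [-2, 2] (arcsine law), not the classical
# three-point law {-2, 0, 2} of a sum of independent +-1 variables.
print(eigs.min(), eigs.max())
print(np.mean((np.abs(eigs) > 0.25) & (np.abs(eigs) < 1.75)))
```

A large fraction of the eigenvalues of \(\hat{A}+\hat{B}\) lands strictly between the classical values, which is exactly what the free convolution predicts for these two spectra.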