Commit 5327579: Update week9.do.txt
mhjensen committed Mar 14, 2024
1 parent 1a7993f

Showing 1 changed file with 4 additions and 302 deletions: doc/src/week9/week9.do.txt
@@ -7,9 +7,9 @@ DATE: March 11-15
===== Overview of week 11, March 11-15 =====
!bblock Topics
o Reminder from last week about statistical observables, the central limit theorem and bootstrapping, see notes from last week
- o Resampling TechniquesL Blocking
- o Discussion of onebody densities
- o Start discussion on optimization and parallelization
+ o Resampling Techniques, emphasis on Blocking
+ o Discussion of onebody densities (whiteboard notes)
+ o Start discussion on optimization and parallelization for Python and C++
#* "Video of lecture TBA":"https://youtu.be/"
#* "Handwritten notes":"https://github.com/CompPhysics/ComputationalPhysics2/blob/gh-pages/doc/HandWrittenNotes/2024/NotesMarch22.pdf"
!eblock
@@ -167,7 +167,7 @@ in terms of a function
f_d=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n-d}\tilde{x}_{ik}\tilde{x}_{i(k+d)}.
\]
!et
- We note that for $d=$ we have
+ We note that for $d=0$ we have
!bt
\[
f_0=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n}\tilde{x}_{ik}^2=\sigma^2!
@@ -187,304 +187,6 @@ We introduce then a correlation function $\kappa_d=f_d/\sigma^2$. Note that $\ka
The code here shows the evolution of $\kappa_d$ as a function of $d$ for a series of random numbers. We see that the function $\kappa_d$ approaches $0$ as $d\rightarrow \infty$.
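A minimal sketch of such a computation (our own illustration; the course code itself is not reproduced in this excerpt), assuming $n$ uncorrelated uniform random numbers and the mean-subtracted products that define $f_d$, could read:

!bc pycod
import numpy as np

# Sketch: autocorrelation function kappa_d for a series of random numbers.
# Subtract the sample mean and estimate f_d from products of samples a
# distance d apart, normalized so that kappa_0 = 1.
n = 10000
x = np.random.rand(n)
xt = x - np.mean(x)          # mean-subtracted samples
var = np.sum(xt**2) / n      # sample variance

kappa = np.empty(100)
for d in range(100):
    kappa[d] = np.sum(xt[:n-d] * xt[d:]) / (n - d) / var

print(kappa[:5])  # kappa_0 = 1; for d > 0 the values fluctuate around 0
!ec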


!split
===== Statistics, wrapping up from last week =====
!bblock
Let us analyze the problem by splitting up the correlation term into
partial sums of the form:
!bt
\[
f_d = \frac{1}{n-d}\sum_{k=1}^{n-d}(x_k - \bar x_n)(x_{k+d} - \bar x_n)
\]
!et
The correlation term of the error can now be rewritten in terms of
$f_d$
!bt
\[
\frac{2}{n}\sum_{k<l} (x_k - \bar x_n)(x_l - \bar x_n) =
2\sum_{d=1}^{n-1} f_d
\]
!et
The value of $f_d$ reflects the correlation between measurements
separated by the distance $d$ in the sample. Notice that for
$d=0$, $f_0$ is just the sample variance, $\mathrm{var}(x)$. If we divide $f_d$
by $\mathrm{var}(x)$, we arrive at the so-called *autocorrelation function*
!bt
\[
\kappa_d = \frac{f_d}{\mathrm{var}(x)}
\]
!et
which gives us a useful measure of pairwise correlations
starting always at $1$ for $d=0$.
!eblock


!split
===== Statistics, final expression =====
!bblock
The sample error can now be
written in terms of the autocorrelation function:

!bt
\begin{align}
\mathrm{err}_X^2 &= \frac{1}{n}\mathrm{var}(x)+\frac{2}{n}\cdot\mathrm{var}(x)\sum_{d=1}^{n-1}\frac{f_d}{\mathrm{var}(x)}\nonumber\\
&= \left(1+2\sum_{d=1}^{n-1}\kappa_d\right)\frac{1}{n}\mathrm{var}(x)\nonumber\\
&= \frac{\tau}{n}\cdot\mathrm{var}(x)
\end{align}

!et
and we see that $\mathrm{err}_X$ can be expressed in terms of the
uncorrelated sample variance times a correction factor $\tau$ which
accounts for the correlation between measurements. We call this
correction factor the *autocorrelation time*:
!bt
\begin{equation}
\tau = 1+2\sum_{d=1}^{n-1}\kappa_d
label{eq:autocorrelation_time}
\end{equation}
!et
!eblock

!split
===== Statistics, effective number of correlations =====
!bblock
For a correlation-free experiment, $\tau$
equals 1.

We can interpret a sequential
correlation as an effective reduction of the number of measurements by
a factor $\tau$. The effective number of measurements becomes:
!bt
\[
n_\mathrm{eff} = \frac{n}{\tau}
\]
!et
Neglecting the autocorrelation time $\tau$ will always cause our
simple uncorrelated estimate of $\mathrm{err}_X^2\approx \mathrm{var}(x)/n$ to
be smaller than the true sample error. The estimate of the error will be
too *good*. On the other hand, the calculation of the full
autocorrelation time poses an efficiency problem if the set of
measurements is very large.
!eblock
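To make this concrete, here is a small sketch (our own illustration, not part of the original notes) using a first-order autoregressive chain $x_{k+1}=ax_k+\xi_k$, whose exact autocorrelation time is $\tau=(1+a)/(1-a)$:

!bc pycod
import numpy as np

# Sketch: naive versus tau-corrected error estimate for AR(1) data,
# x_{k+1} = a*x_k + noise, which has kappa_d = a^d and tau = (1+a)/(1-a).
rng = np.random.default_rng(2024)
n, a = 100000, 0.9
x = np.empty(n)
x[0] = rng.normal()
for k in range(n - 1):
    x[k+1] = a * x[k] + rng.normal()

xt = x - np.mean(x)
var = np.sum(xt**2) / n

# tau = 1 + 2*sum_d kappa_d, truncating the sum before noise dominates
dmax = 1000
kappa = np.array([np.sum(xt[:n-d] * xt[d:]) / (n - d)
                  for d in range(1, dmax)]) / var
tau = 1.0 + 2.0 * np.sum(kappa)

print(f"naive error     : {np.sqrt(var / n):.5f}")
print(f"corrected error : {np.sqrt(tau * var / n):.5f}")
print(f"tau = {tau:.1f} (exact {(1 + a)/(1 - a):.1f}), n_eff = {n/tau:.0f}")
!ec

The naive estimate is smaller than the corrected one by a factor $\sqrt{\tau}\approx 4.4$ here, which is precisely the *too good* error discussed above.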






!split
===== Can we understand this? Time Auto-correlation Function =====
!bblock

The so-called time-displacement autocorrelation $\phi(t)$ for a quantity $\mathbf{M}$ is given by
!bt
\[
\phi(t) = \int dt' \left[\mathbf{M}(t')-\langle \mathbf{M} \rangle\right]\left[\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle\right],
\]
!et
which can be rewritten as
!bt
\[
\phi(t) = \int dt' \left[\mathbf{M}(t')\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle^2\right],
\]
!et
where $\langle \mathbf{M} \rangle$ is the average value and
$\mathbf{M}(t)$ its instantaneous value. We can discretize this function as follows, using our
set of computed values $\mathbf{M}(t)$ for a set of discretized times (in our case the Monte Carlo cycles, each corresponding to moving all electrons)
!bt
\[
\phi(t) = \frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\mathbf{M}(t'+t)
-\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\times
\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t'+t).
label{eq:phitf}
\]
!et
!eblock
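A direct transcription of this discretized estimator (a sketch under the normalization written above, with a random series standing in for the measured $\mathbf{M}(t)$) could read:

!bc pycod
import numpy as np

# Sketch: discretized time-displacement autocorrelation phi(t) for a
# series M(0), ..., M(tmax-1), following the estimator above.
def phi(M, t):
    tmax = len(M)
    norm = tmax - t
    mm = np.sum(M[:norm] * M[t:t+norm]) / norm   # <M(t')M(t'+t)>
    m1 = np.sum(M[:norm]) / norm                 # <M(t')>
    m2 = np.sum(M[t:t+norm]) / norm              # <M(t'+t)>
    return mm - m1 * m2

M = np.random.rand(5000)   # stand-in for measured values M(t)
print([round(phi(M, t), 5) for t in (0, 1, 10)])  # var(M), then ~0
!ec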


!split
===== Time Auto-correlation Function =====
!bblock

One should be careful with times close to $t_{\mathrm{max}}$: there the upper limit of the sums
becomes small and we end up averaging over a rather small time interval. This means that the statistical
error in $\phi(t)$ due to the random nature of the fluctuations in $\mathbf{M}(t)$ can become large.

One should therefore choose $t \ll t_{\mathrm{max}}$.

Note that the variable $\mathbf{M}$ can be any observable of interest.



The time-correlation function gives a measure of the correlation between the various values of the variable
at a time $t'$ and a time $t'+t$. If we multiply the values of $\mathbf{M}$ at these two different times,
we will get a positive contribution if they are fluctuating in the same direction, or a negative value
if they fluctuate in the opposite direction. If we then integrate over time, or use the discretized version, the time-correlation function $\phi(t)$ should take a non-zero value if the fluctuations are
correlated, else it should gradually go to zero. For times far apart
the different values of $\mathbf{M}$ are most likely
uncorrelated and $\phi(t)$ should be zero.
!eblock




!split
===== Time Auto-correlation Function =====
!bblock
We can derive the correlation time by observing that our Metropolis algorithm is based on a random
walk in the space of all possible spin configurations.
Our probability
distribution function $\mathbf{\hat{w}}(t)$ after a given number of time steps $t$ could be written as
!bt
\[
\mathbf{\hat{w}}(t) = \mathbf{\hat{W}^t\hat{w}}(0),
\]
!et
with $\mathbf{\hat{w}}(0)$ the distribution at $t=0$ and $\mathbf{\hat{W}}$ representing the
transition probability matrix.
We can always expand $\mathbf{\hat{w}}(0)$ in terms of the right eigenvectors
$\mathbf{\hat{v}}_i$ of $\mathbf{\hat{W}}$ as
!bt
\[
\mathbf{\hat{w}}(0) = \sum_i\alpha_i\mathbf{\hat{v}}_i,
\]
!et
resulting in
!bt
\[
\mathbf{\hat{w}}(t) = \mathbf{\hat{W}}^t\mathbf{\hat{w}}(0)=\mathbf{\hat{W}}^t\sum_i\alpha_i\mathbf{\hat{v}}_i=
\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i,
\]
!et
with $\lambda_i$ the $i^{\mathrm{th}}$ eigenvalue corresponding to
the eigenvector $\mathbf{\hat{v}}_i$.
!eblock
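A toy illustration of this expansion (our own sketch, with an arbitrary $3\times 3$ column-stochastic matrix that is not from the notes):

!bc pycod
import numpy as np

# Sketch: w(t) = W^t w(0) relaxes towards the eigenvector of W with the
# largest eigenvalue (lambda_0 = 1 for a stochastic matrix).
W = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])   # columns sum to 1
w = np.array([1.0, 0.0, 0.0])     # start in a single configuration

eigvals, eigvecs = np.linalg.eig(W)
steady = eigvecs[:, np.argmax(eigvals.real)].real
steady /= steady.sum()            # normalize to a probability distribution

for t in range(50):
    w = W @ w                     # one application of the transition matrix

print(w)        # converged distribution after 50 steps
print(steady)   # dominant eigenvector: the same steady state
!ec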




!split
===== Time Auto-correlation Function =====
!bblock
If we assume that $\lambda_0$ is the largest eigenvalue we see that in the limit $t\rightarrow \infty$,
$\mathbf{\hat{w}}(t)$ becomes proportional to the corresponding eigenvector
$\mathbf{\hat{v}}_0$. This is our steady state or final distribution.

We can relate this property to an observable like the mean energy.
With the probability $\mathbf{\hat{w}}(t)$ (which in our case is the squared trial wave function) we
can write the expectation values as
!bt
\[
\langle \mathbf{M}(t) \rangle = \sum_{\mu} \mathbf{\hat{w}}(t)_{\mu}\mathbf{M}_{\mu},
\]
!et
or as the scalar product
!bt
\[
\langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m},
\]
!et
with $\mathbf{m}$ being the vector whose elements are the values of $\mathbf{M}_{\mu}$ in its
various microstates $\mu$.
!eblock


!split
===== Time Auto-correlation Function =====

!bblock

We rewrite this relation as
!bt
\[
\langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}=\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i\cdot\mathbf{m}.
\]
!et
If we define $m_i=\mathbf{\hat{v}}_i\cdot\mathbf{m}$ as the expectation value of
$\mathbf{M}$ in the $i^{\mathrm{th}}$ eigenstate we can rewrite the last equation as
!bt
\[
\langle \mathbf{M}(t) \rangle = \sum_i\lambda_i^t\alpha_im_i.
\]
!et
Since in the limit $t\rightarrow \infty$ the mean value is dominated by
the largest eigenvalue $\lambda_0$, we can rewrite the last equation as
!bt
\[
\langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\lambda_i^t\alpha_im_i.
\]
!et
We define the quantity
!bt
\[
\tau_i=-\frac{1}{\log{\lambda_i}},
\]
!et
and rewrite the last expectation value as
!bt
\[
\langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\alpha_im_ie^{-t/\tau_i}.
label{eq:finalmeanm}
\]
!et
!eblock
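Continuing the toy transition matrix from the sketch above (still our own illustration), the subdominant eigenvalues translate directly into the relaxation times $\tau_i$:

!bc pycod
import numpy as np

# Sketch: correlation times tau_i = -1/log(lambda_i) from the subdominant
# eigenvalues of the toy transition matrix used above.
W = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
lam = np.sort(np.linalg.eigvals(W).real)[::-1]   # lambda_0 >= lambda_1 >= ...
tau = -1.0 / np.log(lam[1:])                     # skip lambda_0 = 1

print(lam)   # [1.0, 0.7, 0.7]
print(tau)   # tau_1 = -1/log(0.7), roughly 2.8 Monte Carlo steps
!ec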


!split
===== Time Auto-correlation Function =====
!bblock

The quantities $\tau_i$ are the correlation times for the system. They also control the autocorrelation function
discussed above. The longest correlation time is obviously set by the second largest
eigenvalue $\lambda_1$ through $\tau_1=-1/\log{\lambda_1}$, which normally defines the correlation time. For large times, this is the
only correlation time that survives. If the higher eigenvalues of the transition matrix are well separated from
$\lambda_1$ and we simulate long enough, $\tau_1$ may well define the correlation time.
In other cases we may not be able to extract a reliable result for $\tau_1$.
Coming back to the time correlation function $\phi(t)$ we can present a more general definition in terms
of the mean magnetizations $ \langle \mathbf{M}(t) \rangle$. Recalling that the mean value is equal
to $ \langle \mathbf{M}(\infty) \rangle$ we arrive at the expectation values
!bt
\[
\phi(t) =\langle \mathbf{M}(0)-\mathbf{M}(\infty)\rangle \langle \mathbf{M}(t)-\mathbf{M}(\infty)\rangle,
\]
!et
resulting in
!bt
\[
\phi(t) =\sum_{i,j\ne 0}m_i\alpha_im_j\alpha_je^{-t/\tau_i},
\]
!et
which is appropriate for all times.
!eblock



!split
===== Correlation Time =====
!bblock

If the correlation function decays exponentially
!bt
\[ \phi (t) \sim \exp{(-t/\tau)}\]
!et
then the exponential correlation time can be computed as the average
!bt
\[ \tau_{\mathrm{exp}} = -\left\langle \frac{t}{\log{\left|\frac{\phi(t)}{\phi(0)}\right|}} \right\rangle. \]
!et
If the decay is exponential, then
!bt
\[ \int_0^{\infty} dt \phi(t) = \int_0^{\infty} dt \phi(0)\exp{(-t/\tau)} = \tau \phi(0),\]
!et
which suggests another measure of correlation
!bt
\[ \tau_{\mathrm{int}} = \sum_k \frac{\phi(k)}{\phi(0)}, \]
!et
called the integrated correlation time.
!eblock
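As a quick numerical check (our sketch, using a synthetic exponentially decaying $\phi$ rather than simulation data), both estimators recover $\tau$ when the decay really is exponential:

!bc pycod
import numpy as np

# Sketch: tau_exp and tau_int for a synthetic phi(t) = exp(-t/tau).
tau_true = 5.0
t = np.arange(1, 50)
phi = np.exp(-t / tau_true)      # phi(0) = 1 by construction

tau_exp = np.mean(-t / np.log(np.abs(phi)))   # -<t/log|phi(t)/phi(0)|>
tau_int = np.sum(phi)                         # sum_k phi(k)/phi(0), k >= 1

print(tau_exp)   # 5.0, exact for a pure exponential
print(tau_int)   # about tau - 1/2 when the sum starts at k = 1
!ec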




!split