diff --git a/doc/pub/week9/html/._week9-bs000.html b/doc/pub/week9/html/._week9-bs000.html index 1b8da37c..7c08e6d5 100644 --- a/doc/pub/week9/html/._week9-bs000.html +++ b/doc/pub/week9/html/._week9-bs000.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -234,7 +187,7 @@

    March 11-15

  • 9
  • 10
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs001.html b/doc/pub/week9/html/._week9-bs001.html index 1eacb3d8..af5485f6 100644 --- a/doc/pub/week9/html/._week9-bs001.html +++ b/doc/pub/week9/html/._week9-bs001.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -197,9 +150,9 @@

    Overview of week 11, March 11-15
    1. Reminder from last week about statistical observables, the central limit theorem and bootstrapping, see notes from last week
    - 2. Resampling TechniquesL Blocking
    - 3. Discussion of onebody densities
    - 4. Start discussion on optimization and parallelization
    + 2. Resampling Techniques, emphasis on Blocking
    + 3. Discussion of onebody densities (whiteboard notes)
    + 4. Start discussion on optimization and parallelization for Python and C++
    @@ -234,7 +187,7 @@

    Overview of week 11, Mar
  • 10
  • 11
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs002.html b/doc/pub/week9/html/._week9-bs002.html index dc285143..2a20b565 100644 --- a/doc/pub/week9/html/._week9-bs002.html +++ b/doc/pub/week9/html/._week9-bs002.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -221,7 +174,7 @@

    Why resampling methods ?

  • 11
  • 12
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs003.html b/doc/pub/week9/html/._week9-bs003.html index 1956aeef..bbccd386 100644 --- a/doc/pub/week9/html/._week9-bs003.html +++ b/doc/pub/week9/html/._week9-bs003.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -226,7 +179,7 @@

    Statistical analysis

  • 12
  • 13
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs004.html b/doc/pub/week9/html/._week9-bs004.html index b42e6c14..5d054a97 100644 --- a/doc/pub/week9/html/._week9-bs004.html +++ b/doc/pub/week9/html/._week9-bs004.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -222,7 +175,7 @@

    And why do we use such me
  • 13
  • 14
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs005.html b/doc/pub/week9/html/._week9-bs005.html index 327cc47c..17b52a60 100644 --- a/doc/pub/week9/html/._week9-bs005.html +++ b/doc/pub/week9/html/._week9-bs005.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -238,7 +191,7 @@

    Central limit theorem

  • 14
  • 15
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs006.html b/doc/pub/week9/html/._week9-bs006.html index 0306a87a..62708c8d 100644 --- a/doc/pub/week9/html/._week9-bs006.html +++ b/doc/pub/week9/html/._week9-bs006.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -233,7 +186,7 @@

    Running many measurements

    15
  • 16
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs007.html b/doc/pub/week9/html/._week9-bs007.html index 653ba9f0..dfdd76e6 100644 --- a/doc/pub/week9/html/._week9-bs007.html +++ b/doc/pub/week9/html/._week9-bs007.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -231,7 +184,7 @@

    Adding more definitions

  • 16
  • 17
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs008.html b/doc/pub/week9/html/._week9-bs008.html index 380eaaf8..9a90b8c7 100644 --- a/doc/pub/week9/html/._week9-bs008.html +++ b/doc/pub/week9/html/._week9-bs008.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -226,7 +179,7 @@

    Further rewriting

  • 17
  • 18
  • ...
  • -
  • 29
  • +
  • 19
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs009.html b/doc/pub/week9/html/._week9-bs009.html index 03ac1598..60153053 100644 --- a/doc/pub/week9/html/._week9-bs009.html +++ b/doc/pub/week9/html/._week9-bs009.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -235,8 +188,6 @@

    The covariance term

  • 17
  • 18
  • 19
  • -
  • ...
  • -
  • 29
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs010.html b/doc/pub/week9/html/._week9-bs010.html index 0136b32a..3a7972c3 100644 --- a/doc/pub/week9/html/._week9-bs010.html +++ b/doc/pub/week9/html/._week9-bs010.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -203,7 +156,7 @@

    Rewriting the covariance term
    $$ f_d=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n-d}\tilde{x}_{ik}\tilde{x}_{i(k+d)}. $$
    - We note that for \( d= \) we have
    + We note that for \( d=0 \) we have
    $$ f_0=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n}\tilde{x}_{ik}\tilde{x}_{i(k)}=\sigma^2! $$
    @@ -232,9 +185,6 @@

    Rewriting the covariance t
  • 17
  • 18
  • 19
  • -
  • 20
  • -
  • ...
  • -
  • 29
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs011.html b/doc/pub/week9/html/._week9-bs011.html index 89741b2a..b24b2cf9 100644 --- a/doc/pub/week9/html/._week9-bs011.html +++ b/doc/pub/week9/html/._week9-bs011.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code form last week
  • +
  • Resampling analysis
  • @@ -224,10 +177,6 @@

    Introducing the cor
  • 17
  • 18
  • 19
  • -
  • 20
  • -
  • 21
  • -
  • ...
  • -
  • 29
  • »
  • diff --git a/doc/pub/week9/html/._week9-bs012.html b/doc/pub/week9/html/._week9-bs012.html index 64bfb706..3b928c98 100644 --- a/doc/pub/week9/html/._week9-bs012.html +++ b/doc/pub/week9/html/._week9-bs012.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code from last week
  • +
  • Resampling analysis
  • @@ -191,40 +144,28 @@

     

     

     

    -

    Statistics, wrapping up from last week

    -
    -
    - -

    Let us analyze the problem by splitting up the correlation term into -partial sums of the form: -

    -$$ -f_d = \frac{1}{n-d}\sum_{k=1}^{n-d}(x_k - \bar x_n)(x_{k+d} - \bar x_n) -$$ +

    Resampling methods: Blocking

    -

    The correlation term of the error can now be rewritten in terms of -\( f_d \) +

The blocking method was made popular by Flyvbjerg and Petersen (1989)
+and has become one of the standard ways to estimate
+\( V(\widehat{\theta}) \) for exactly one \( \widehat{\theta} \), namely
+\( \widehat{\theta} = \overline{X} \).

    -$$ -\frac{2}{n}\sum_{k < l} (x_k - \bar x_n)(x_l - \bar x_n) = -2\sum_{d=1}^{n-1} f_d -$$ -

    The value of \( f_d \) reflects the correlation between measurements -separated by the distance \( d \) in the sample samples. Notice that for -\( d=0 \), \( f \) is just the sample variance, \( \mathrm{var}(x) \). If we divide \( f_d \) -by \( \mathrm{var}(x) \), we arrive at the so called autocorrelation function +

    Assume \( n = 2^d \) for some integer \( d>1 \) and \( X_1,X_2,\cdots, X_n \) is a stationary time series to begin with. +Moreover, assume that the time series is asymptotically uncorrelated. We switch to vector notation by arranging \( X_1,X_2,\cdots,X_n \) in an \( n \)-tuple. Define:

    $$ -\kappa_d = \frac{f_d}{\mathrm{var}(x)} +\begin{align*} +\hat{X} = (X_1,X_2,\cdots,X_n). +\end{align*} $$ -

    which gives us a useful measure of pairwise correlations -starting always at \( 1 \) for \( d=0 \). +

The strength of the blocking method shows itself when the number of
+observations \( n \) is large. For large \( n \), the complexity of dependent
+bootstrapping scales poorly, whereas the blocking method does not;
+moreover, it becomes more accurate the larger \( n \) is.

    -
    -
    -

    @@ -247,11 +188,6 @@

    Statistics, wrappi
  • diff --git a/doc/pub/week9/html/._week9-bs013.html b/doc/pub/week9/html/._week9-bs013.html index 228538a8..024ec8f8 100644 --- a/doc/pub/week9/html/._week9-bs013.html +++ b/doc/pub/week9/html/._week9-bs013.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code from last week
  • +
  • Resampling analysis
  • @@ -191,40 +144,40 @@

     

     

     

    -

    Statistics, final expression

    -
    -
    - -

    The sample error can now be -written in terms of the autocorrelation function: +

    Blocking Transformations

    +

We now define
+blocking transformations. The idea is to take the mean of subsequent
+pairs of elements from \( \vec{X} \) and form a new vector
+\( \vec{X}_1 \). Continuing in the same way, taking the mean of
+subsequent pairs of elements of \( \vec{X}_1 \), we obtain \( \vec{X}_2 \), and
+so on.
+Define \( \vec{X}_i \) recursively by:

    $$ -\begin{align} -\mathrm{err}_X^2 &= -\frac{1}{n}\mathrm{var}(x)+\frac{2}{n}\cdot\mathrm{var}(x)\sum_{d=1}^{n-1} -\frac{f_d}{\mathrm{var}(x)}\nonumber\\ &=& -\left(1+2\sum_{d=1}^{n-1}\kappa_d\right)\frac{1}{n}\mathrm{var}(x)\nonumber\\ -&=\frac{\tau}{n}\cdot\mathrm{var}(x) +\begin{align} +(\vec{X}_0)_k &\equiv (\vec{X})_k \nonumber \\ +(\vec{X}_{i+1})_k &\equiv \frac{1}{2}\Big( (\vec{X}_i)_{2k-1} + +(\vec{X}_i)_{2k} \Big) \qquad \text{for all} \qquad 1 \leq i \leq d-1 \tag{1} -\end{align} - +\end{align} $$ -

    and we see that \( \mathrm{err}_X \) can be expressed in terms the -uncorrelated sample variance times a correction factor \( \tau \) which -accounts for the correlation between measurements. We call this -correction factor the autocorrelation time: +

The quantity \( \vec{X}_k \) is
+subject to \( k \) blocking transformations. We now have \( d \) vectors
+\( \vec{X}_0, \vec{X}_1,\cdots,\vec X_{d-1} \) containing the subsequent
+averages of observations. It turns out that if the components of
+\( \vec{X} \) form a stationary time series, then the components of
+\( \vec{X}_i \) form a stationary time series as well, for all \( 0 \leq i \leq d-1 \).

    -$$ -\begin{equation} -\tau = 1+2\sum_{d=1}^{n-1}\kappa_d -\tag{2} -\end{equation} -$$ -
    -
    +

    We can then compute the autocovariance, the variance, sample mean, and +number of observations for each \( i \). +Let \( \gamma_i, \sigma_i^2, +\overline{X}_i \) denote the autocovariance, variance and average of the +elements of \( \vec{X}_i \) and let \( n_i \) be the number of elements of +\( \vec{X}_i \). It follows by induction that \( n_i = n/2^i \). +
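As a concrete illustration, the pairwise-mean transformation above can be sketched in a few lines of Python (a minimal sketch of our own; the helper name `block_transform` is not from the course code):

```python
import numpy as np

def block_transform(x):
    # (X_{i+1})_k = ((X_i)_{2k-1} + (X_i)_{2k}) / 2
    return 0.5 * (x[0::2] + x[1::2])

x0 = np.arange(16, dtype=float)     # n = 2^4 observations
x1 = block_transform(x0)            # n_1 = n/2 elements
x2 = block_transform(x1)            # n_2 = n/4 elements
print(len(x1), len(x2))             # prints: 8 4
print(np.mean(x0) == np.mean(x2))   # prints: True, the sample mean is preserved
```

Each application halves the number of elements, so \( n_i = n/2^i \), while the sample mean is left unchanged, in agreement with the derivation below for \( \overline{X}_{j+1} = \overline{X}_j \).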

    @@ -246,12 +199,6 @@

    Statistics, final expression
  • diff --git a/doc/pub/week9/html/._week9-bs014.html b/doc/pub/week9/html/._week9-bs014.html index 3467db18..0950bd6f 100644 --- a/doc/pub/week9/html/._week9-bs014.html +++ b/doc/pub/week9/html/._week9-bs014.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code from last week
  • +
  • Resampling analysis
  • @@ -191,32 +144,25 @@

     

     

     

    -

    Statistics, effective number of correlations

    -
    -
    - -

    For a correlation free experiment, \( \tau \) -equals 1. -

    +

    Blocking Transformations

    -

    We can interpret a sequential -correlation as an effective reduction of the number of measurements by -a factor \( \tau \). The effective number of measurements becomes: +

Using the
+definition of the blocking transformation and the distributive
+property of the covariance, it follows that, with \( h = |i-j| \),
+we can write

$$
-n_\mathrm{eff} = \frac{n}{\tau}
+\begin{align}
+\gamma_{k+1}(h) &= \mathrm{cov}\left( ({X}_{k+1})_{i}, ({X}_{k+1})_{j} \right) \nonumber \\
+&= \frac{1}{4}\mathrm{cov}\left( ({X}_{k})_{2i-1} + ({X}_{k})_{2i}, ({X}_{k})_{2j-1} + ({X}_{k})_{2j} \right) \nonumber \\
+&= \frac{1}{2}\gamma_{k}(2h) + \frac{1}{2}\gamma_k(2h+1) \quad \text{if } h = 0
+\tag{2}\\
+&= \frac{1}{4}\gamma_k(2h-1) + \frac{1}{2}\gamma_k(2h) + \frac{1}{4}\gamma_k(2h+1) \quad \text{else}
+\tag{3}
+\end{align}
$$

    To neglect the autocorrelation time \( \tau \) will always cause our -simple uncorrelated estimate of \( \mathrm{err}_X^2\approx \mathrm{var}(x)/n \) to -be less than the true sample error. The estimate of the error will be -too good. On the other hand, the calculation of the full -autocorrelation time poses an efficiency problem if the set of -measurements is very large. -

    -
    -
    - +

Since \( \hat{X} \) is asymptotically uncorrelated by assumption, \( \hat{X}_k \) is also asymptotically uncorrelated. Let us turn our attention to the variance of the sample mean \( V(\overline{X}) \).
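The recursion relating \( \gamma_{k+1} \) to \( \gamma_k \) can be checked numerically. The sketch below (our own illustration, not part of the lecture code) generates a long stationary AR(1) series, applies one blocking transformation, and compares the sample autocovariance of the blocked series at lag \( h \) with the right-hand side of the recursion:

```python
import numpy as np

rng = np.random.default_rng(2024)

# stationary, asymptotically uncorrelated AR(1) series: x_t = phi*x_{t-1} + eps_t
phi, n = 0.5, 2**18
eps = rng.standard_normal(n)
x = np.empty(n)
x[0] = eps[0] / np.sqrt(1.0 - phi**2)
for t in range(1, n):
    x[t] = phi * x[t-1] + eps[t]

def gamma(y, h):
    # sample autocovariance at lag h
    yb = y - y.mean()
    return np.mean(yb[:len(yb) - h] * yb[h:])

xb = 0.5 * (x[0::2] + x[1::2])   # one blocking transformation

h = 1
lhs = gamma(xb, h)
rhs = 0.25*gamma(x, 2*h - 1) + 0.5*gamma(x, 2*h) + 0.25*gamma(x, 2*h + 1)
# lhs and rhs agree up to sampling noise and O(1/n) edge effects
```

For \( h \geq 1 \) the two estimates track each other closely for large \( n \), which is exactly the content of the recursion derived above.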

    @@ -237,13 +183,6 @@

    Statistics,
  • diff --git a/doc/pub/week9/html/._week9-bs015.html b/doc/pub/week9/html/._week9-bs015.html index c2bcb27a..0735ecb8 100644 --- a/doc/pub/week9/html/._week9-bs015.html +++ b/doc/pub/week9/html/._week9-bs015.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code from last week
  • +
  • Resampling analysis
  • @@ -191,34 +144,24 @@

     

     

     

    -

    Can we understand this? Time Auto-correlation Function

    -
    -
    - - -

    The so-called time-displacement autocorrelation \( \phi(t) \) for a quantity \( \mathbf{M} \) is given by

    +

    Blocking Transformations, getting there

    +

    We have

    $$ -\phi(t) = \int dt' \left[\mathbf{M}(t')-\langle \mathbf{M} \rangle\right]\left[\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle\right], +\begin{align} +V(\overline{X}_k) = \frac{\sigma_k^2}{n_k} + \underbrace{\frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h)}_{\equiv e_k} = \frac{\sigma^2_k}{n_k} + e_k \quad \text{if} \quad \gamma_k(0) = \sigma_k^2. +\tag{4} +\end{align} $$ -

    which can be rewritten as

    +

    The term \( e_k \) is called the truncation error:

    $$ -\phi(t) = \int dt' \left[\mathbf{M}(t')\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle^2\right], +\begin{equation} +e_k = \frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h). +\tag{5} +\end{equation} $$ -

    where \( \langle \mathbf{M} \rangle \) is the average value and -\( \mathbf{M}(t) \) its instantaneous value. We can discretize this function as follows, where we used our -set of computed values \( \mathbf{M}(t) \) for a set of discretized times (our Monte Carlo cycles corresponding to moving all electrons?) -

    -$$ -\phi(t) = \frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\mathbf{M}(t'+t) --\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\times -\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t'+t). -\tag{3} -$$ -
    -
    - +

    We can show that \( V(\overline{X}_i) = V(\overline{X}_j) \) for all \( 0 \leq i \leq d-1 \) and \( 0 \leq j \leq d-1 \).

    @@ -238,14 +181,6 @@

    Ca
  • diff --git a/doc/pub/week9/html/._week9-bs016.html b/doc/pub/week9/html/._week9-bs016.html index 067f82ab..df515c77 100644 --- a/doc/pub/week9/html/._week9-bs016.html +++ b/doc/pub/week9/html/._week9-bs016.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code from last week
  • +
  • Resampling analysis
  • @@ -191,31 +144,34 @@

     

     

     

    -

    Time Auto-correlation Function

    -
    -
    - +

    Blocking Transformations, final expressions

    -

    One should be careful with times close to \( t_{\mathrm{max}} \), the upper limit of the sums -becomes small and we end up integrating over a rather small time interval. This means that the statistical -error in \( \phi(t) \) due to the random nature of the fluctuations in \( \mathbf{M}(t) \) can become large. -

    - -

    One should therefore choose \( t \ll t_{\mathrm{max}} \).

    +

    We can then wrap up

    +$$ +\begin{align} +n_{j+1} \overline{X}_{j+1} &= \sum_{i=1}^{n_{j+1}} (\hat{X}_{j+1})_i = \frac{1}{2}\sum_{i=1}^{n_{j}/2} (\hat{X}_{j})_{2i-1} + (\hat{X}_{j})_{2i} \nonumber \\ +&= \frac{1}{2}\left[ (\hat{X}_j)_1 + (\hat{X}_j)_2 + \cdots + (\hat{X}_j)_{n_j} \right] = \underbrace{\frac{n_j}{2}}_{=n_{j+1}} \overline{X}_j = n_{j+1}\overline{X}_j. +\tag{6} +\end{align} +$$ -

    Note that the variable \( \mathbf{M} \) can be any expectation values of interest.

    +

    By repeated use of this equation we get \( V(\overline{X}_i) = V(\overline{X}_0) = V(\overline{X}) \) for all \( 0 \leq i \leq d-1 \). This has the consequence that

    +$$ +\begin{align} +V(\overline{X}) = \frac{\sigma_k^2}{n_k} + e_k \qquad \text{for all} \qquad 0 \leq k \leq d-1. \tag{7} +\end{align} +$$ -

    The time-correlation function gives a measure of the correlation between the various values of the variable -at a time \( t' \) and a time \( t'+t \). If we multiply the values of \( \mathbf{M} \) at these two different times, -we will get a positive contribution if they are fluctuating in the same direction, or a negative value -if they fluctuate in the opposite direction. If we then integrate over time, or use the discretized version of, the time correlation function \( \phi(t) \) should take a non-zero value if the fluctuations are -correlated, else it should gradually go to zero. For times a long way apart -the different values of \( \mathbf{M} \) are most likely -uncorrelated and \( \phi(t) \) should be zero. +

Flyvbjerg and Petersen demonstrated that the sequence
+\( \{e_k\}_{k=0}^{d-1} \) is decreasing, and conjectured that the term
+\( e_k \) can be made as small as we would like by making \( k \) (and hence
+\( d \)) sufficiently large. That the sequence is indeed decreasing was later
+proved in the Master of Science thesis of Marius Jonsson (UiO, 2018).
+This means we can apply blocking transformations until
+\( e_k \) is sufficiently small, and then estimate \( V(\overline{X}) \) by
+\( \widehat{\sigma}^2_k/n_k \).

    -
    -
    +

    For an elegant solution and proof of the blocking method, see the recent article of Marius Jonsson (former MSc student of the Computational Physics group).
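The recipe just described, block repeatedly and read off \( \widehat{\sigma}^2_k/n_k \) at each level, can be sketched as follows (a simplified illustration of our own; the function name `blocking_variances` is not from the course code, and the automated choice of truncation level \( k \) follows Jonsson's article):

```python
import numpy as np

def blocking_variances(x):
    # sigma_k^2 / n_k for each blocking level k = 0, 1, ..., d-1
    x = np.asarray(x, dtype=float)
    out = []
    while len(x) >= 2:
        out.append(x.var() / len(x))      # sample variance over n_k
        x = 0.5 * (x[0::2] + x[1::2])     # blocking transformation
    return out

# toy correlated data: every observation duplicated, so neighbours are
# perfectly correlated and the naive k=0 estimate is too small
rng = np.random.default_rng(7)
data = np.repeat(rng.standard_normal(2**9), 2)   # n = 2^10
est = blocking_variances(data)
# est[1] == 2*est[0]: one blocking step averages out the duplication and
# doubles the error estimate; further levels should roughly plateau
```

When the estimates \( \widehat{\sigma}^2_k/n_k \) stop increasing with \( k \), the truncation error \( e_k \) is negligible and the plateau value estimates \( V(\overline{X}) \).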

    @@ -234,15 +190,6 @@

    Time Auto-correlation Fun
  • diff --git a/doc/pub/week9/html/._week9-bs017.html b/doc/pub/week9/html/._week9-bs017.html index 8994cff3..637e3df8 100644 --- a/doc/pub/week9/html/._week9-bs017.html +++ b/doc/pub/week9/html/._week9-bs017.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • The covariance term
  • Rewriting the covariance term
  • Introducing the correlation function
  • -
  • Statistics, wrapping up from last week
  • -
  • Statistics, final expression
  • -
  • Statistics, effective number of correlations
  • -
  • Can we understand this? Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Time Auto-correlation Function
  • -
  • Correlation Time
  • -
  • Resampling methods: Blocking
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations
  • -
  • Blocking Transformations, getting there
  • -
  • Blocking Transformations, final expressions
  • -
  • Example code form last week
  • -
  • Resampling analysis
  • +
  • Resampling methods: Blocking
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations
  • +
  • Blocking Transformations, getting there
  • +
  • Blocking Transformations, final expressions
  • +
  • Example code from last week
  • +
  • Resampling analysis
  • @@ -191,38 +144,243 @@

     

     

     

    -

    Time Auto-correlation Function

    -
    -
    - -

    We can derive the correlation time by observing that our Metropolis algorithm is based on a random -walk in the space of all possible spin configurations. -Our probability -distribution function \( \mathbf{\hat{w}}(t) \) after a given number of time steps \( t \) could be written as -

    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}^t\hat{w}}(0), -$$ - -

    with \( \mathbf{\hat{w}}(0) \) the distribution at \( t=0 \) and \( \mathbf{\hat{W}} \) representing the -transition probability matrix. -We can always expand \( \mathbf{\hat{w}}(0) \) in terms of the right eigenvectors of -\( \mathbf{\hat{v}} \) of \( \mathbf{\hat{W}} \) as -

    -$$ - \mathbf{\hat{w}}(0) = \sum_i\alpha_i\mathbf{\hat{v}}_i, -$$ - -

    resulting in

    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}}^t\mathbf{\hat{w}}(0)=\mathbf{\hat{W}}^t\sum_i\alpha_i\mathbf{\hat{v}}_i= -\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i, -$$ - -

    with \( \lambda_i \) the \( i^{\mathrm{th}} \) eigenvalue corresponding to -the eigenvector \( \mathbf{\hat{v}}_i \). -

    +

Example code from last week

    + + +
    +
    +
    +
    +
    +
    # 2-electron VMC code for 2dim quantum dot with importance sampling
+# Using gaussian rng for new positions and Metropolis-Hastings
    +# Added energy minimization
    +from math import exp, sqrt
    +from random import random, seed, normalvariate
    +import numpy as np
    +import matplotlib.pyplot as plt
    +from mpl_toolkits.mplot3d import Axes3D
    +from matplotlib import cm
    +from matplotlib.ticker import LinearLocator, FormatStrFormatter
    +from scipy.optimize import minimize
    +import sys
    +import os
    +
    +# Where to save data files
    +PROJECT_ROOT_DIR = "Results"
    +DATA_ID = "Results/EnergyMin"
    +
    +if not os.path.exists(PROJECT_ROOT_DIR):
    +    os.mkdir(PROJECT_ROOT_DIR)
    +
    +if not os.path.exists(DATA_ID):
    +    os.makedirs(DATA_ID)
    +
    +def data_path(dat_id):
    +    return os.path.join(DATA_ID, dat_id)
    +
    +outfile = open(data_path("Energies.dat"),'w')
    +
    +
    +# Trial wave function for the 2-electron quantum dot in two dims
    +def WaveFunction(r,alpha,beta):
    +    r1 = r[0,0]**2 + r[0,1]**2
    +    r2 = r[1,0]**2 + r[1,1]**2
    +    r12 = sqrt((r[0,0]-r[1,0])**2 + (r[0,1]-r[1,1])**2)
    +    deno = r12/(1+beta*r12)
    +    return exp(-0.5*alpha*(r1+r2)+deno)
    +
    +# Local energy  for the 2-electron quantum dot in two dims, using analytical local energy
    +def LocalEnergy(r,alpha,beta):
    +    
    +    r1 = (r[0,0]**2 + r[0,1]**2)
    +    r2 = (r[1,0]**2 + r[1,1]**2)
    +    r12 = sqrt((r[0,0]-r[1,0])**2 + (r[0,1]-r[1,1])**2)
    +    deno = 1.0/(1+beta*r12)
    +    deno2 = deno*deno
    +    return 0.5*(1-alpha*alpha)*(r1 + r2) +2.0*alpha + 1.0/r12+deno2*(alpha*r12-deno2+2*beta*deno-1.0/r12)
    +
    +# Derivate of wave function ansatz as function of variational parameters
    +def DerivativeWFansatz(r,alpha,beta):
    +    
    +    WfDer  = np.zeros((2), np.double)
    +    r1 = (r[0,0]**2 + r[0,1]**2)
    +    r2 = (r[1,0]**2 + r[1,1]**2)
    +    r12 = sqrt((r[0,0]-r[1,0])**2 + (r[0,1]-r[1,1])**2)
    +    deno = 1.0/(1+beta*r12)
    +    deno2 = deno*deno
    +    WfDer[0] = -0.5*(r1+r2)
    +    WfDer[1] = -r12*r12*deno2
    +    return  WfDer
    +
    +# Setting up the quantum force for the two-electron quantum dot, recall that it is a vector
    +def QuantumForce(r,alpha,beta):
    +
    +    qforce = np.zeros((NumberParticles,Dimension), np.double)
    +    r12 = sqrt((r[0,0]-r[1,0])**2 + (r[0,1]-r[1,1])**2)
    +    deno = 1.0/(1+beta*r12)
+    qforce[0,:] = -2*alpha*r[0,:] + 2*(r[0,:]-r[1,:])*deno*deno/r12
+    qforce[1,:] = -2*alpha*r[1,:] + 2*(r[1,:]-r[0,:])*deno*deno/r12
    +    return qforce
    +    
    +
    +# Computing the derivative of the energy and the energy 
    +def EnergyDerivative(x0):
    +
    +    
    +    # Parameters in the Fokker-Planck simulation of the quantum force
    +    D = 0.5
    +    TimeStep = 0.05
    +    # positions
    +    PositionOld = np.zeros((NumberParticles,Dimension), np.double)
    +    PositionNew = np.zeros((NumberParticles,Dimension), np.double)
    +    # Quantum force
    +    QuantumForceOld = np.zeros((NumberParticles,Dimension), np.double)
    +    QuantumForceNew = np.zeros((NumberParticles,Dimension), np.double)
    +
    +    energy = 0.0
    +    DeltaE = 0.0
    +    alpha = x0[0]
    +    beta = x0[1]
    +    EnergyDer = 0.0
    +    DeltaPsi = 0.0
    +    DerivativePsiE = 0.0 
    +    #Initial position
    +    for i in range(NumberParticles):
    +        for j in range(Dimension):
    +            PositionOld[i,j] = normalvariate(0.0,1.0)*sqrt(TimeStep)
    +    wfold = WaveFunction(PositionOld,alpha,beta)
    +    QuantumForceOld = QuantumForce(PositionOld,alpha, beta)
    +
    +    #Loop over MC MCcycles
    +    for MCcycle in range(NumberMCcycles):
    +        #Trial position moving one particle at the time
    +        for i in range(NumberParticles):
    +            for j in range(Dimension):
    +                PositionNew[i,j] = PositionOld[i,j]+normalvariate(0.0,1.0)*sqrt(TimeStep)+\
    +                                       QuantumForceOld[i,j]*TimeStep*D
    +            wfnew = WaveFunction(PositionNew,alpha,beta)
    +            QuantumForceNew = QuantumForce(PositionNew,alpha, beta)
    +            GreensFunction = 0.0
    +            for j in range(Dimension):
    +                GreensFunction += 0.5*(QuantumForceOld[i,j]+QuantumForceNew[i,j])*\
    +	                              (D*TimeStep*0.5*(QuantumForceOld[i,j]-QuantumForceNew[i,j])-\
    +                                      PositionNew[i,j]+PositionOld[i,j])
    +      
    +            GreensFunction = exp(GreensFunction)
    +            ProbabilityRatio = GreensFunction*wfnew**2/wfold**2
    +            #Metropolis-Hastings test to see whether we accept the move
    +            if random() <= ProbabilityRatio:
    +                for j in range(Dimension):
    +                    PositionOld[i,j] = PositionNew[i,j]
    +                    QuantumForceOld[i,j] = QuantumForceNew[i,j]
    +                wfold = wfnew
    +        DeltaE = LocalEnergy(PositionOld,alpha,beta)
    +        DerPsi = DerivativeWFansatz(PositionOld,alpha,beta)
    +        DeltaPsi += DerPsi
    +        energy += DeltaE
    +        DerivativePsiE += DerPsi*DeltaE
    +            
    +    # We calculate mean values
    +    energy /= NumberMCcycles
    +    DerivativePsiE /= NumberMCcycles
    +    DeltaPsi /= NumberMCcycles
    +    EnergyDer  = 2*(DerivativePsiE-DeltaPsi*energy)
    +    return EnergyDer
    +
    +
    +# Computing the expectation value of the local energy 
    +def Energy(x0):
    +    # Parameters in the Fokker-Planck simulation of the quantum force
    +    D = 0.5
    +    TimeStep = 0.05
    +    # positions
    +    PositionOld = np.zeros((NumberParticles,Dimension), np.double)
    +    PositionNew = np.zeros((NumberParticles,Dimension), np.double)
    +    # Quantum force
    +    QuantumForceOld = np.zeros((NumberParticles,Dimension), np.double)
    +    QuantumForceNew = np.zeros((NumberParticles,Dimension), np.double)
    +
    +    energy = 0.0
    +    DeltaE = 0.0
    +    alpha = x0[0]
    +    beta = x0[1]
    +    #Initial position
    +    for i in range(NumberParticles):
    +        for j in range(Dimension):
    +            PositionOld[i,j] = normalvariate(0.0,1.0)*sqrt(TimeStep)
    +    wfold = WaveFunction(PositionOld,alpha,beta)
    +    QuantumForceOld = QuantumForce(PositionOld,alpha, beta)
    +
    +    #Loop over MC MCcycles
    +    for MCcycle in range(NumberMCcycles):
    +        #Trial position moving one particle at the time
    +        for i in range(NumberParticles):
    +            for j in range(Dimension):
    +                PositionNew[i,j] = PositionOld[i,j]+normalvariate(0.0,1.0)*sqrt(TimeStep)+\
    +                                       QuantumForceOld[i,j]*TimeStep*D
    +            wfnew = WaveFunction(PositionNew,alpha,beta)
    +            QuantumForceNew = QuantumForce(PositionNew,alpha, beta)
    +            GreensFunction = 0.0
    +            for j in range(Dimension):
    +                GreensFunction += 0.5*(QuantumForceOld[i,j]+QuantumForceNew[i,j])*\
    +                                      (D*TimeStep*0.5*(QuantumForceOld[i,j]-QuantumForceNew[i,j])-\
    +                                      PositionNew[i,j]+PositionOld[i,j])
    +      
    +            GreensFunction = exp(GreensFunction)
    +            ProbabilityRatio = GreensFunction*wfnew**2/wfold**2
    +            #Metropolis-Hastings test to see whether we accept the move
    +            if random() <= ProbabilityRatio:
    +                for j in range(Dimension):
    +                    PositionOld[i,j] = PositionNew[i,j]
    +                    QuantumForceOld[i,j] = QuantumForceNew[i,j]
    +                wfold = wfnew
    +        DeltaE = LocalEnergy(PositionOld,alpha,beta)
    +        energy += DeltaE
    +        if Printout:
    +            outfile.write('%f\n' %(energy/(MCcycle+1.0)))
    +    # We calculate mean values
    +    energy /= NumberMCcycles
    +    return energy
    +
    +#Here starts the main program with variable declarations
    +NumberParticles = 2
    +Dimension = 2
    +# seed the random number generator
    +seed()
    +# Monte Carlo cycles for parameter optimization
    +Printout = False
    +NumberMCcycles= 10000
    +# guess for variational parameters
    +x0 = np.array([0.9,0.2])
    +# Using the BFGS quasi-Newton method to find the optimal variational parameters
    +res = minimize(Energy, x0, method='BFGS', jac=EnergyDerivative, options={'gtol': 1e-4,'disp': True})
    +x0 = res.x
    +# Compute the energy again with the optimal parameters and an increased number of Monte Carlo cycles
    +NumberMCcycles= 2**19
    +Printout = True
    +FinalEnergy = Energy(x0)
    +EResult = np.array([FinalEnergy,FinalEnergy])
    +outfile.close()
    +#nice printout with Pandas
    +import pandas as pd
    +from pandas import DataFrame
    +data ={'Optimal Parameters':x0, 'Final Energy':EResult}
    +frame = pd.DataFrame(data)
    +print(frame)
    +
    +
    +
    +
    +
    +
    +
    +
    +
    +
    +
    +
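The driver above couples a stochastic energy estimate and its gradient to `scipy.optimize.minimize`. The same call pattern can be illustrated on a deterministic toy energy with a known minimum; this is a minimal sketch, not the course code, and the quadratic surface with its minimum at \( (\alpha,\beta)=(0.5, 1.0) \) is invented purely for illustration:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for the stochastic Energy/EnergyDerivative pair:
# a deterministic "energy" surface with its minimum at (0.5, 1.0)
def toy_energy(x0):
    alpha, beta = x0
    return (alpha - 0.5)**2 + 2.0*(beta - 1.0)**2

def toy_gradient(x0):
    alpha, beta = x0
    return np.array([2.0*(alpha - 0.5), 4.0*(beta - 1.0)])

# same driver call as in the variational Monte Carlo program above
res = minimize(toy_energy, np.array([0.9, 0.2]), method='BFGS',
               jac=toy_gradient, options={'gtol': 1e-4})
print(res.x)
```

With a noisy Monte Carlo gradient the optimizer behaves less cleanly than on this toy surface; increasing NumberMCcycles reduces the statistical noise in both the energy and its derivative.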
    @@ -242,16 +400,6 @@

    Time Auto-correlation Fun
  • diff --git a/doc/pub/week9/html/._week9-bs018.html b/doc/pub/week9/html/._week9-bs018.html index 765ad862..925c9cbe 100644 --- a/doc/pub/week9/html/._week9-bs018.html +++ b/doc/pub/week9/html/._week9-bs018.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • @@ -191,32 +144,88 @@

     

     

     

    -

    Time Auto-correlation Function

    -
    -
    - -

    If we assume that \( \lambda_0 \) is the largest eigenvector we see that in the limit \( t\rightarrow \infty \), -\( \mathbf{\hat{w}}(t) \) becomes proportional to the corresponding eigenvector -\( \mathbf{\hat{v}}_0 \). This is our steady state or final distribution. -

    +

    Resampling analysis

    -

    We can relate this property to an observable like the mean energy. -With the probabilty \( \mathbf{\hat{w}}(t) \) (which in our case is the squared trial wave function) we -can write the expectation values as +

    The next step is then to use the above data sets and perform a +resampling analysis using the blocking method. +The blocking code, based on the article of Marius Jonsson, is given here.

    -$$ - \langle \mathbf{M}(t) \rangle = \sum_{\mu} \mathbf{\hat{w}}(t)_{\mu}\mathbf{M}_{\mu}, -$$ -

    or as the scalar of a vector product

    -$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}, -$$ -

    with \( \mathbf{m} \) being the vector whose elements are the values of \( \mathbf{M}_{\mu} \) in its -various microstates \( \mu \). -

    + +
    +
    +
    +
    +
    +
    # Common imports
    +import os
    +
    +# Where to save the figures and data files
    +DATA_ID = "Results/EnergyMin"
    +
    +def data_path(dat_id):
    +    return os.path.join(DATA_ID, dat_id)
    +
    +infile = open(data_path("Energies.dat"),'r')
    +
    +from numpy import log2, zeros, mean, var, sum, loadtxt, arange, array, cumsum, dot, transpose, diagonal, sqrt
    +from numpy.linalg import inv
    +
    +def block(x):
    +    # preliminaries
    +    n = len(x)
    +    d = int(log2(n))
    +    s, gamma = zeros(d), zeros(d)
    +    mu = mean(x)
    +
    +    # estimate the auto-covariance and variances 
    +    # for each blocking transformation
    +    for i in arange(0,d):
    +        n = len(x)
    +        # estimate the lag-1 autocovariance of x at this blocking level
    +        gamma[i] = sum( (x[0:(n-1)]-mu)*(x[1:n]-mu) )/n
    +        # estimate the variance of x
    +        s[i] = var(x)
    +        # perform a blocking transformation: average neighbouring pairs
    +        x = 0.5*(x[0::2] + x[1::2])
    +   
    +    # generate the test statistic M_k from the theorem
    +    M = (cumsum( ((gamma/s)**2*2**arange(1,d+1)[::-1])[::-1] )  )[::-1]
    +
    +    # magic numbers: the 99% quantiles of the chi-squared distribution
    +    # with k+1 degrees of freedom, for k = 0, 1, ..., 29
    +    q = array([6.634897, 9.210340, 11.344867, 13.276704, 15.086272, 16.811894, 18.475307, 20.090235, 21.665994, 23.209251, 24.724970, 26.216967, 27.688250, 29.141238, 30.577914, 31.999927, 33.408664, 34.805306, 36.190869, 37.566235, 38.932173, 40.289360, 41.638398, 42.979820, 44.314105, 45.641683, 46.962942, 48.278236, 49.587884, 50.892181])
    +
    +    # use magic to determine when we should have stopped blocking
    +    for k in arange(0,d):
    +        if(M[k] < q[k]):
    +            break
    +    if (k >= d-1):
    +        print("Warning: Use more data")
    +    return mu, s[k]/2**(d-k)
    +
    +
    +x = loadtxt(infile)
    +# unpack into fresh names to avoid shadowing numpy's mean and var imported above
    +(mean_x, var_x) = block(x)
    +std = sqrt(var_x)
    +import pandas as pd
    +from pandas import DataFrame
    +data ={'Mean':[mean_x], 'STDev':[std]}
    +frame = pd.DataFrame(data,index=['Values'])
    +print(frame)
    +
    +
    +
    +
    +
    +
    +
    +
    +
    +
    +
    +
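The effect the blocking transformations correct for can be demonstrated on synthetic data: for a correlated series, the naive estimate \( \mathrm{var}(x)/n \) underestimates the true error of the mean, while repeated pair-averaging drives the block means toward independence and the estimate toward its plateau. A minimal, self-contained sketch; the AR(1) process and its coefficient are assumptions, used only as a stand-in for the correlated energy samples in Energies.dat:

```python
import numpy as np

rng = np.random.default_rng(2024)

# Synthetic correlated data: an AR(1) process x_k = phi*x_{k-1} + eps_k,
# a hypothetical stand-in for correlated Monte Carlo energy samples
phi, n = 0.9, 2**14
x = np.empty(n)
x[0] = rng.standard_normal()
for k in range(1, n):
    x[k] = phi*x[k-1] + rng.standard_normal()

# naive error estimate of the mean, ignoring all correlations
naive = np.var(x)/n

# repeated blocking transformations: average neighbouring pairs,
# the same update used inside block() above
y, estimates = x.copy(), [naive]
while len(y) >= 64:
    y = 0.5*(y[0::2] + y[1::2])
    estimates.append(np.var(y)/len(y))

# the blocked estimates grow toward the true squared error, which for
# an AR(1) process exceeds the naive estimate by roughly (1+phi)/(1-phi)
print(naive, max(estimates))
```

For \( \phi=0.9 \) the autocorrelation time is about \( (1+\phi)/(1-\phi)=19 \), so the plateau sits roughly an order of magnitude above the naive estimate, which is exactly the underestimate the automatic blocking test guards against.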
    @@ -235,18 +244,6 @@

    Time Auto-correlation Fun
  • diff --git a/doc/pub/week9/html/week9-bs.html b/doc/pub/week9/html/week9-bs.html index 1b8da37c..7c08e6d5 100644 --- a/doc/pub/week9/html/week9-bs.html +++ b/doc/pub/week9/html/week9-bs.html @@ -62,43 +62,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -163,23 +126,13 @@
  • @@ -234,7 +187,7 @@

    March 11-15

  • diff --git a/doc/pub/week9/html/week9-reveal.html b/doc/pub/week9/html/week9-reveal.html index 4035850c..bdd02a2a 100644 --- a/doc/pub/week9/html/week9-reveal.html +++ b/doc/pub/week9/html/week9-reveal.html @@ -201,9 +201,9 @@

    Overview of week 11, March 11-15

    1. Reminder from last week about statistical observables, the central limit theorem and bootstrapping, see notes from last week
    2. -

    3. Resampling TechniquesL Blocking
    4. -

    5. Discussion of onebody densities
    6. -

    7. Start discussion on optimization and parallelization +

    8. Resampling Techniques, emphasis on Blocking
    9. +

    10. Discussion of onebody densities (whiteboard notes)
    11. +

    12. Start discussion on optimization and parallelization for Python and C++
    @@ -406,7 +406,7 @@

    Rewriting the covariance term

    $$

     
    -

    We note that for \( d= \) we have

    +

    We note that for \( d=0 \) we have

     
    $$ f_0=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n}\tilde{x}_{ik}\tilde{x}_{i(k)}=\sigma^2! @@ -429,362 +429,6 @@

    Introducing the correlation functi

    The code here shows the evolution of \( \kappa_d \) as a function of \( d \) for a series of random numbers. We see that the function \( \kappa_d \) approaches \( 0 \) as \( d\rightarrow \infty \).

    -
    -

    Statistics, wrapping up from last week

    -
    - -

    -

    Let us analyze the problem by splitting up the correlation term into -partial sums of the form: -

    -

     
    -$$ -f_d = \frac{1}{n-d}\sum_{k=1}^{n-d}(x_k - \bar x_n)(x_{k+d} - \bar x_n) -$$ -

     
    - -

    The correlation term of the error can now be rewritten in terms of -\( f_d \) -

    -

     
    -$$ -\frac{2}{n}\sum_{k < l} (x_k - \bar x_n)(x_l - \bar x_n) = -2\sum_{d=1}^{n-1} f_d -$$ -

     
    - -

    The value of \( f_d \) reflects the correlation between measurements -separated by the distance \( d \) in the sample samples. Notice that for -\( d=0 \), \( f \) is just the sample variance, \( \mathrm{var}(x) \). If we divide \( f_d \) -by \( \mathrm{var}(x) \), we arrive at the so called autocorrelation function -

    -

     
    -$$ -\kappa_d = \frac{f_d}{\mathrm{var}(x)} -$$ -

     
    - -

    which gives us a useful measure of pairwise correlations -starting always at \( 1 \) for \( d=0 \). -

    -
    -
    - -
    -

    Statistics, final expression

    -
    - -

    -

    The sample error can now be -written in terms of the autocorrelation function: -

    - -

     
    -$$ -\begin{align} -\mathrm{err}_X^2 &= -\frac{1}{n}\mathrm{var}(x)+\frac{2}{n}\cdot\mathrm{var}(x)\sum_{d=1}^{n-1} -\frac{f_d}{\mathrm{var}(x)}\nonumber\\ &=& -\left(1+2\sum_{d=1}^{n-1}\kappa_d\right)\frac{1}{n}\mathrm{var}(x)\nonumber\\ -&=\frac{\tau}{n}\cdot\mathrm{var}(x) -\tag{1} -\end{align} - -$$ -

     
    - -

    and we see that \( \mathrm{err}_X \) can be expressed in terms the -uncorrelated sample variance times a correction factor \( \tau \) which -accounts for the correlation between measurements. We call this -correction factor the autocorrelation time: -

    -

     
    -$$ -\begin{equation} -\tau = 1+2\sum_{d=1}^{n-1}\kappa_d -\tag{2} -\end{equation} -$$ -

     
    -

    -
    - -
    -

    Statistics, effective number of correlations

    -
    - -

    -

    For a correlation free experiment, \( \tau \) -equals 1. -

    - -

    We can interpret a sequential -correlation as an effective reduction of the number of measurements by -a factor \( \tau \). The effective number of measurements becomes: -

    -

     
    -$$ -n_\mathrm{eff} = \frac{n}{\tau} -$$ -

     
    - -

    To neglect the autocorrelation time \( \tau \) will always cause our -simple uncorrelated estimate of \( \mathrm{err}_X^2\approx \mathrm{var}(x)/n \) to -be less than the true sample error. The estimate of the error will be -too good. On the other hand, the calculation of the full -autocorrelation time poses an efficiency problem if the set of -measurements is very large. -

    -
    -
    - -
    -

    Can we understand this? Time Auto-correlation Function

    -
    - -

    - -

    The so-called time-displacement autocorrelation \( \phi(t) \) for a quantity \( \mathbf{M} \) is given by

    -

     
    -$$ -\phi(t) = \int dt' \left[\mathbf{M}(t')-\langle \mathbf{M} \rangle\right]\left[\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle\right], -$$ -

     
    - -

    which can be rewritten as

    -

     
    -$$ -\phi(t) = \int dt' \left[\mathbf{M}(t')\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle^2\right], -$$ -

     
    - -

    where \( \langle \mathbf{M} \rangle \) is the average value and -\( \mathbf{M}(t) \) its instantaneous value. We can discretize this function as follows, where we used our -set of computed values \( \mathbf{M}(t) \) for a set of discretized times (our Monte Carlo cycles corresponding to moving all electrons?) -

    -

     
    -$$ -\phi(t) = \frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\mathbf{M}(t'+t) --\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\times -\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t'+t). -\tag{3} -$$ -

     
    -

    -
    - -
    -

    Time Auto-correlation Function

    -
    - -

    - -

    One should be careful with times close to \( t_{\mathrm{max}} \), the upper limit of the sums -becomes small and we end up integrating over a rather small time interval. This means that the statistical -error in \( \phi(t) \) due to the random nature of the fluctuations in \( \mathbf{M}(t) \) can become large. -

    - -

    One should therefore choose \( t \ll t_{\mathrm{max}} \).

    - -

    Note that the variable \( \mathbf{M} \) can be any expectation values of interest.

    - -

    The time-correlation function gives a measure of the correlation between the various values of the variable -at a time \( t' \) and a time \( t'+t \). If we multiply the values of \( \mathbf{M} \) at these two different times, -we will get a positive contribution if they are fluctuating in the same direction, or a negative value -if they fluctuate in the opposite direction. If we then integrate over time, or use the discretized version of, the time correlation function \( \phi(t) \) should take a non-zero value if the fluctuations are -correlated, else it should gradually go to zero. For times a long way apart -the different values of \( \mathbf{M} \) are most likely -uncorrelated and \( \phi(t) \) should be zero. -

    -
    -
    - -
    -

    Time Auto-correlation Function

    -
    - -

    -

    We can derive the correlation time by observing that our Metropolis algorithm is based on a random -walk in the space of all possible spin configurations. -Our probability -distribution function \( \mathbf{\hat{w}}(t) \) after a given number of time steps \( t \) could be written as -

    -

     
    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}^t\hat{w}}(0), -$$ -

     
    - -

    with \( \mathbf{\hat{w}}(0) \) the distribution at \( t=0 \) and \( \mathbf{\hat{W}} \) representing the -transition probability matrix. -We can always expand \( \mathbf{\hat{w}}(0) \) in terms of the right eigenvectors of -\( \mathbf{\hat{v}} \) of \( \mathbf{\hat{W}} \) as -

    -

     
    -$$ - \mathbf{\hat{w}}(0) = \sum_i\alpha_i\mathbf{\hat{v}}_i, -$$ -

     
    - -

    resulting in

    -

     
    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}}^t\mathbf{\hat{w}}(0)=\mathbf{\hat{W}}^t\sum_i\alpha_i\mathbf{\hat{v}}_i= -\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i, -$$ -

     
    - -

    with \( \lambda_i \) the \( i^{\mathrm{th}} \) eigenvalue corresponding to -the eigenvector \( \mathbf{\hat{v}}_i \). -

    -
    -
    - -
    -

    Time Auto-correlation Function

    -
    - -

    -

    If we assume that \( \lambda_0 \) is the largest eigenvector we see that in the limit \( t\rightarrow \infty \), -\( \mathbf{\hat{w}}(t) \) becomes proportional to the corresponding eigenvector -\( \mathbf{\hat{v}}_0 \). This is our steady state or final distribution. -

    - -

    We can relate this property to an observable like the mean energy. -With the probabilty \( \mathbf{\hat{w}}(t) \) (which in our case is the squared trial wave function) we -can write the expectation values as -

    -

     
    -$$ - \langle \mathbf{M}(t) \rangle = \sum_{\mu} \mathbf{\hat{w}}(t)_{\mu}\mathbf{M}_{\mu}, -$$ -

     
    - -

    or as the scalar of a vector product

    -

     
    -$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}, -$$ -

     
    - -

    with \( \mathbf{m} \) being the vector whose elements are the values of \( \mathbf{M}_{\mu} \) in its -various microstates \( \mu \). -

    -
    -
    - -
    -

    Time Auto-correlation Function

    - -
    - -

    - -

    We rewrite this relation as

    -

     
    -$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}=\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i\mathbf{m}_i. -$$ -

     
    - -

    If we define \( m_i=\mathbf{\hat{v}}_i\mathbf{m}_i \) as the expectation value of -\( \mathbf{M} \) in the \( i^{\mathrm{th}} \) eigenstate we can rewrite the last equation as -

    -

     
    -$$ - \langle \mathbf{M}(t) \rangle = \sum_i\lambda_i^t\alpha_im_i. -$$ -

     
    - -

    Since we have that in the limit \( t\rightarrow \infty \) the mean value is dominated by the -the largest eigenvalue \( \lambda_0 \), we can rewrite the last equation as -

    -

     
    -$$ - \langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\lambda_i^t\alpha_im_i. -$$ -

     
    - -

    We define the quantity

    -

     
    -$$ - \tau_i=-\frac{1}{log\lambda_i}, -$$ -

     
    - -

    and rewrite the last expectation value as

    -

     
    -$$ - \langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\alpha_im_ie^{-t/\tau_i}. -\tag{4} -$$ -

     
    -

    -
    - -
    -

    Time Auto-correlation Function

    -
    - -

    - -

    The quantities \( \tau_i \) are the correlation times for the system. They control also the auto-correlation function -discussed above. The longest correlation time is obviously given by the second largest -eigenvalue \( \tau_1 \), which normally defines the correlation time discussed above. For large times, this is the -only correlation time that survives. If higher eigenvalues of the transition matrix are well separated from -\( \lambda_1 \) and we simulate long enough, \( \tau_1 \) may well define the correlation time. -In other cases we may not be able to extract a reliable result for \( \tau_1 \). -Coming back to the time correlation function \( \phi(t) \) we can present a more general definition in terms -of the mean magnetizations $ \langle \mathbf{M}(t) \rangle$. Recalling that the mean value is equal -to $ \langle \mathbf{M}(\infty) \rangle$ we arrive at the expectation values -

    -

     
    -$$ -\phi(t) =\langle \mathbf{M}(0)-\mathbf{M}(\infty)\rangle \langle \mathbf{M}(t)-\mathbf{M}(\infty)\rangle, -$$ -

     
    - -

    resulting in

    -

     
    -$$ -\phi(t) =\sum_{i,j\ne 0}m_i\alpha_im_j\alpha_je^{-t/\tau_i}, -$$ -

     
    - -

    which is appropriate for all times.

    -
    -
    - -
    -

    Correlation Time

    -
    - -

    - -

    If the correlation function decays exponentially

    -

     
    -$$ \phi (t) \sim \exp{(-t/\tau)}$$ -

     
    - -

    then the exponential correlation time can be computed as the average

    -

     
    -$$ \tau_{\mathrm{exp}} = -\langle \frac{t}{log|\frac{\phi(t)}{\phi(0)}|} \rangle. $$ -

     
    - -

    If the decay is exponential, then

    -

     
    -$$ \int_0^{\infty} dt \phi(t) = \int_0^{\infty} dt \phi(0)\exp{(-t/\tau)} = \tau \phi(0),$$ -

     
    - -

    which suggests another measure of correlation

    -

     
    -$$ \tau_{\mathrm{int}} = \sum_k \frac{\phi(k)}{\phi(0)}, $$ -

     
    - -

    called the integrated correlation time.

    -
    -
    -

    Resampling methods: Blocking

    @@ -829,7 +473,7 @@

    Blocking Transformations

    (\vec{X}_0)_k &\equiv (\vec{X})_k \nonumber \\ (\vec{X}_{i+1})_k &\equiv \frac{1}{2}\Big( (\vec{X}_i)_{2k-1} + (\vec{X}_i)_{2k} \Big) \qquad \text{for all} \qquad 1 \leq i \leq d-1 -\tag{5} +\tag{1} \end{align} $$

     
    @@ -865,9 +509,9 @@

    Blocking Transformations

    \gamma_{k+1}(h) &= cov\left( ({X}_{k+1})_{i}, ({X}_{k+1})_{j} \right) \nonumber \\ &= \frac{1}{4}cov\left( ({X}_{k})_{2i-1} + ({X}_{k})_{2i}, ({X}_{k})_{2j-1} + ({X}_{k})_{2j} \right) \nonumber \\ &= \frac{1}{2}\gamma_{k}(2h) + \frac{1}{2}\gamma_k(2h+1) \hspace{0.1cm} \mathrm{h = 0} -\tag{6}\\ +\tag{2}\\ &=\frac{1}{4}\gamma_k(2h-1) + \frac{1}{2}\gamma_k(2h) + \frac{1}{4}\gamma_k(2h+1) \quad \mathrm{else} -\tag{7} +\tag{3} \end{align} $$

     
    @@ -882,7 +526,7 @@

    Blocking Transformations, gettin $$ \begin{align} V(\overline{X}_k) = \frac{\sigma_k^2}{n_k} + \underbrace{\frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h)}_{\equiv e_k} = \frac{\sigma^2_k}{n_k} + e_k \quad \text{if} \quad \gamma_k(0) = \sigma_k^2. -\tag{8} +\tag{4} \end{align} $$

     
    @@ -892,7 +536,7 @@

    Blocking Transformations, gettin $$ \begin{equation} e_k = \frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h). -\tag{9} +\tag{5} \end{equation} $$

     
    @@ -909,7 +553,7 @@

    Blocking Transformations, fi \begin{align} n_{j+1} \overline{X}_{j+1} &= \sum_{i=1}^{n_{j+1}} (\hat{X}_{j+1})_i = \frac{1}{2}\sum_{i=1}^{n_{j}/2} (\hat{X}_{j})_{2i-1} + (\hat{X}_{j})_{2i} \nonumber \\ &= \frac{1}{2}\left[ (\hat{X}_j)_1 + (\hat{X}_j)_2 + \cdots + (\hat{X}_j)_{n_j} \right] = \underbrace{\frac{n_j}{2}}_{=n_{j+1}} \overline{X}_j = n_{j+1}\overline{X}_j. -\tag{10} +\tag{6} \end{align} $$

     
    @@ -918,7 +562,7 @@

    Blocking Transformations, fi

     
    $$ \begin{align} -V(\overline{X}) = \frac{\sigma_k^2}{n_k} + e_k \qquad \text{for all} \qquad 0 \leq k \leq d-1. \tag{11} +V(\overline{X}) = \frac{\sigma_k^2}{n_k} + e_k \qquad \text{for all} \qquad 0 \leq k \leq d-1. \tag{7} \end{align} $$

     
    diff --git a/doc/pub/week9/html/week9-solarized.html b/doc/pub/week9/html/week9-solarized.html index 01806e3f..405b3368 100644 --- a/doc/pub/week9/html/week9-solarized.html +++ b/doc/pub/week9/html/week9-solarized.html @@ -89,43 +89,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -193,9 +156,9 @@

    Overview of week 11, March 11-15

    1. Reminder from last week about statistical observables, the central limit theorem and bootstrapping, see notes from last week
    2. -
    3. Resampling TechniquesL Blocking
    4. -
    5. Discussion of onebody densities
    6. -
    7. Start discussion on optimization and parallelization +
    8. Resampling Techniques, emphasis on Blocking
    9. +
    10. Discussion of onebody densities (whiteboard notes)
    11. +
    12. Start discussion on optimization and parallelization for Python and C++
    @@ -370,7 +333,7 @@

    Rewriting the covariance term

    f_d=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n-d}\tilde{x}_{ik}\tilde{x}_{i(k+d)}. $$ -

    We note that for \( d= \) we have

    +

    We note that for \( d=0 \) we have

    $$ f_0=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n}\tilde{x}_{ik}\tilde{x}_{i(k)}=\sigma^2! $$ @@ -388,312 +351,6 @@

    Introducing the correlation functi

    The code here shows the evolution of \( \kappa_d \) as a function of \( d \) for a series of random numbers. We see that the function \( \kappa_d \) approaches \( 0 \) as \( d\rightarrow \infty \).

    -









    -

    Statistics, wrapping up from last week

    -
    - -

    -

    Let us analyze the problem by splitting up the correlation term into -partial sums of the form: -

    -$$ -f_d = \frac{1}{n-d}\sum_{k=1}^{n-d}(x_k - \bar x_n)(x_{k+d} - \bar x_n) -$$ - -

    The correlation term of the error can now be rewritten in terms of -\( f_d \) -

    -$$ -\frac{2}{n}\sum_{k < l} (x_k - \bar x_n)(x_l - \bar x_n) = -2\sum_{d=1}^{n-1} f_d -$$ - -

    The value of \( f_d \) reflects the correlation between measurements -separated by the distance \( d \) in the sample samples. Notice that for -\( d=0 \), \( f \) is just the sample variance, \( \mathrm{var}(x) \). If we divide \( f_d \) -by \( \mathrm{var}(x) \), we arrive at the so called autocorrelation function -

    -$$ -\kappa_d = \frac{f_d}{\mathrm{var}(x)} -$$ - -

    which gives us a useful measure of pairwise correlations -starting always at \( 1 \) for \( d=0 \). -

    -
    - - -









    -

    Statistics, final expression

    -
    - -

    -

    The sample error can now be -written in terms of the autocorrelation function: -

    - -$$ -\begin{align} -\mathrm{err}_X^2 &= -\frac{1}{n}\mathrm{var}(x)+\frac{2}{n}\cdot\mathrm{var}(x)\sum_{d=1}^{n-1} -\frac{f_d}{\mathrm{var}(x)}\nonumber\\ &=& -\left(1+2\sum_{d=1}^{n-1}\kappa_d\right)\frac{1}{n}\mathrm{var}(x)\nonumber\\ -&=\frac{\tau}{n}\cdot\mathrm{var}(x) -\label{_auto1} -\end{align} - -$$ - -

    and we see that \( \mathrm{err}_X \) can be expressed in terms the -uncorrelated sample variance times a correction factor \( \tau \) which -accounts for the correlation between measurements. We call this -correction factor the autocorrelation time: -

    -$$ -\begin{equation} -\tau = 1+2\sum_{d=1}^{n-1}\kappa_d -\label{eq:autocorrelation_time} -\end{equation} -$$ -
    - - -









    -

    Statistics, effective number of correlations

    -
    - -

    -

    For a correlation free experiment, \( \tau \) -equals 1. -

    - -

    We can interpret a sequential -correlation as an effective reduction of the number of measurements by -a factor \( \tau \). The effective number of measurements becomes: -

    -$$ -n_\mathrm{eff} = \frac{n}{\tau} -$$ - -

    To neglect the autocorrelation time \( \tau \) will always cause our -simple uncorrelated estimate of \( \mathrm{err}_X^2\approx \mathrm{var}(x)/n \) to -be less than the true sample error. The estimate of the error will be -too good. On the other hand, the calculation of the full -autocorrelation time poses an efficiency problem if the set of -measurements is very large. -

    -
    - - -









    -

    Can we understand this? Time Auto-correlation Function

    -
    - -

    - -

    The so-called time-displacement autocorrelation \( \phi(t) \) for a quantity \( \mathbf{M} \) is given by

    -$$ -\phi(t) = \int dt' \left[\mathbf{M}(t')-\langle \mathbf{M} \rangle\right]\left[\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle\right], -$$ - -

    which can be rewritten as

    -$$ -\phi(t) = \int dt' \left[\mathbf{M}(t')\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle^2\right], -$$ - -

    where \( \langle \mathbf{M} \rangle \) is the average value and -\( \mathbf{M}(t) \) its instantaneous value. We can discretize this function as follows, where we used our -set of computed values \( \mathbf{M}(t) \) for a set of discretized times (our Monte Carlo cycles corresponding to moving all electrons?) -

    -$$ -\phi(t) = \frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\mathbf{M}(t'+t) --\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\times -\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t'+t). -\label{eq:phitf} -$$ -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    - -

    One should be careful with times close to \( t_{\mathrm{max}} \), the upper limit of the sums -becomes small and we end up integrating over a rather small time interval. This means that the statistical -error in \( \phi(t) \) due to the random nature of the fluctuations in \( \mathbf{M}(t) \) can become large. -

    - -

    One should therefore choose \( t \ll t_{\mathrm{max}} \).

    - -

    Note that the variable \( \mathbf{M} \) can be any expectation values of interest.

    - -

    The time-correlation function gives a measure of the correlation between the various values of the variable -at a time \( t' \) and a time \( t'+t \). If we multiply the values of \( \mathbf{M} \) at these two different times, -we will get a positive contribution if they are fluctuating in the same direction, or a negative value -if they fluctuate in the opposite direction. If we then integrate over time, or use the discretized version of, the time correlation function \( \phi(t) \) should take a non-zero value if the fluctuations are -correlated, else it should gradually go to zero. For times a long way apart -the different values of \( \mathbf{M} \) are most likely -uncorrelated and \( \phi(t) \) should be zero. -

    -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    -

    We can derive the correlation time by observing that our Metropolis algorithm is based on a random -walk in the space of all possible spin configurations. -Our probability -distribution function \( \mathbf{\hat{w}}(t) \) after a given number of time steps \( t \) could be written as -

    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}^t\hat{w}}(0), -$$ - -

with \( \mathbf{\hat{w}}(0) \) the distribution at \( t=0 \) and \( \mathbf{\hat{W}} \) representing the -transition probability matrix. -We can always expand \( \mathbf{\hat{w}}(0) \) in terms of the right eigenvectors -\( \mathbf{\hat{v}}_i \) of \( \mathbf{\hat{W}} \) as -

    -$$ - \mathbf{\hat{w}}(0) = \sum_i\alpha_i\mathbf{\hat{v}}_i, -$$ - -

    resulting in

    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}}^t\mathbf{\hat{w}}(0)=\mathbf{\hat{W}}^t\sum_i\alpha_i\mathbf{\hat{v}}_i= -\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i, -$$ - -

    with \( \lambda_i \) the \( i^{\mathrm{th}} \) eigenvalue corresponding to -the eigenvector \( \mathbf{\hat{v}}_i \). -

    -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    -

If we assume that \( \lambda_0 \) is the largest eigenvalue we see that in the limit \( t\rightarrow \infty \), -\( \mathbf{\hat{w}}(t) \) becomes proportional to the corresponding eigenvector -\( \mathbf{\hat{v}}_0 \). This is our steady state or final distribution. -

    - -

We can relate this property to an observable like the mean energy. -With the probability \( \mathbf{\hat{w}}(t) \) (which in our case is the squared trial wave function) we -can write the expectation values as -

    -$$ - \langle \mathbf{M}(t) \rangle = \sum_{\mu} \mathbf{\hat{w}}(t)_{\mu}\mathbf{M}_{\mu}, -$$ - -

or as a scalar product of vectors

    -$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}, -$$ - -

    with \( \mathbf{m} \) being the vector whose elements are the values of \( \mathbf{M}_{\mu} \) in its -various microstates \( \mu \). -

    -
    - - -









    -

    Time Auto-correlation Function

    - -
    - -

    - -

    We rewrite this relation as

-$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}=\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i\mathbf{m}. -$$ - -

If we define \( m_i=\mathbf{\hat{v}}_i\mathbf{m} \) as the expectation value of -\( \mathbf{M} \) in the \( i^{\mathrm{th}} \) eigenstate we can rewrite the last equation as -

    -$$ - \langle \mathbf{M}(t) \rangle = \sum_i\lambda_i^t\alpha_im_i. -$$ - -

Since we have that in the limit \( t\rightarrow \infty \) the mean value is dominated by the -largest eigenvalue \( \lambda_0 \), we can rewrite the last equation as -

    -$$ - \langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\lambda_i^t\alpha_im_i. -$$ - -

    We define the quantity

-$$ - \tau_i=-\frac{1}{\log\lambda_i}, -$$ - -

    and rewrite the last expectation value as

    -$$ - \langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\alpha_im_ie^{-t/\tau_i}. -\label{eq:finalmeanm} -$$ -
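As an illustration of this eigenvalue picture, the sketch below computes the \( \lambda_i \), the steady state and the correlation times \( \tau_i=-1/\log\lambda_i \) for a small, made-up column-stochastic transition matrix; the entries of `W` are purely illustrative and not taken from any particular model.

```python
import numpy as np

# a made-up 3-state column-stochastic transition matrix W (columns sum to 1)
W = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.2],
              [0.1, 0.1, 0.7]])

lam, V = np.linalg.eig(W)
order = np.argsort(-lam.real)              # largest eigenvalue first
lam, V = lam.real[order], V.real[:, order]

# lambda_0 = 1 for a stochastic matrix; its eigenvector, normalized to unit
# total probability, is the steady state distribution
steady = V[:, 0] / V[:, 0].sum()

# correlation times tau_i = -1/log(lambda_i) for the decaying modes i != 0
tau = -1.0 / np.log(lam[1:])

print("eigenvalues      :", lam)
print("steady state     :", steady)
print("correlation times:", tau)
```

The slowest mode, governed by the second largest eigenvalue, gives the longest \( \tau_i \) and hence dominates the approach to the steady state.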
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    - -

The quantities \( \tau_i \) are the correlation times for the system. They also control the auto-correlation function -discussed above. The longest correlation time \( \tau_1 \) corresponds to the second largest -eigenvalue \( \lambda_1 \) and normally defines the correlation time discussed above. For large times, this is the -only correlation time that survives. If higher eigenvalues of the transition matrix are well separated from -\( \lambda_1 \) and we simulate long enough, \( \tau_1 \) may well define the correlation time. -In other cases we may not be able to extract a reliable result for \( \tau_1 \). -Coming back to the time correlation function \( \phi(t) \) we can present a more general definition in terms -of the mean magnetizations \( \langle \mathbf{M}(t) \rangle \). Recalling that the mean value is equal -to \( \langle \mathbf{M}(\infty) \rangle \) we arrive at the expectation values -

    -$$ -\phi(t) =\langle \mathbf{M}(0)-\mathbf{M}(\infty)\rangle \langle \mathbf{M}(t)-\mathbf{M}(\infty)\rangle, -$$ - -

    resulting in

-$$ -\phi(t) =\sum_{i,j\ne 0}m_i\alpha_im_j\alpha_je^{-t/\tau_j}, -$$ - -

    which is appropriate for all times.

    -
    - - -









    -

    Correlation Time

    -
    - -

    - -

    If the correlation function decays exponentially

    -$$ \phi (t) \sim \exp{(-t/\tau)}$$ - -

    then the exponential correlation time can be computed as the average

-$$ \tau_{\mathrm{exp}} = -\langle \frac{t}{\log|\frac{\phi(t)}{\phi(0)}|} \rangle. $$ - -

    If the decay is exponential, then

    -$$ \int_0^{\infty} dt \phi(t) = \int_0^{\infty} dt \phi(0)\exp{(-t/\tau)} = \tau \phi(0),$$ - -

    which suggests another measure of correlation

    -$$ \tau_{\mathrm{int}} = \sum_k \frac{\phi(k)}{\phi(0)}, $$ - -

    called the integrated correlation time.
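Both estimates can be checked on a synthetic, exactly exponential \( \phi(t) \); the decay time \( \tau=5 \) below is an assumed value chosen only so the answer is known in advance.

```python
import numpy as np

tau_true = 5.0                       # assumed decay time of the synthetic phi
t = np.arange(1, 30)
ratio = np.exp(-t / tau_true)        # phi(t)/phi(0) for a pure exponential decay

# exponential correlation time: average of -t / log|phi(t)/phi(0)|;
# exact for a pure exponential, every term equals tau_true
tau_exp = np.mean(-t / np.log(np.abs(ratio)))

# integrated correlation time: sum_k phi(k)/phi(0); for a pure exponential
# the geometric sum gives 1/(1 - exp(-1/tau)), roughly tau + 1/2
k = np.arange(0, 200)
tau_int = np.sum(np.exp(-k / tau_true))

print(tau_exp, tau_int)
```

The small difference between the two is a discretization effect: the sum over integer \( k \) slightly overshoots the continuum integral \( \tau\,\phi(0) \).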

    -
    - -









    Resampling methods: Blocking

    @@ -734,7 +391,7 @@

    Blocking Transformations

    (\vec{X}_0)_k &\equiv (\vec{X})_k \nonumber \\ (\vec{X}_{i+1})_k &\equiv \frac{1}{2}\Big( (\vec{X}_i)_{2k-1} + (\vec{X}_i)_{2k} \Big) \qquad \text{for all} \qquad 1 \leq i \leq d-1 -\label{_auto2} +\label{_auto1} \end{align} $$ @@ -767,9 +424,9 @@

    Blocking Transformations

    \gamma_{k+1}(h) &= cov\left( ({X}_{k+1})_{i}, ({X}_{k+1})_{j} \right) \nonumber \\ &= \frac{1}{4}cov\left( ({X}_{k})_{2i-1} + ({X}_{k})_{2i}, ({X}_{k})_{2j-1} + ({X}_{k})_{2j} \right) \nonumber \\ &= \frac{1}{2}\gamma_{k}(2h) + \frac{1}{2}\gamma_k(2h+1) \hspace{0.1cm} \mathrm{h = 0} -\label{_auto3}\\ +\label{_auto2}\\ &=\frac{1}{4}\gamma_k(2h-1) + \frac{1}{2}\gamma_k(2h) + \frac{1}{4}\gamma_k(2h+1) \quad \mathrm{else} -\label{_auto4} +\label{_auto3} \end{align} $$ @@ -781,7 +438,7 @@

    Blocking Transformations, gettin $$ \begin{align} V(\overline{X}_k) = \frac{\sigma_k^2}{n_k} + \underbrace{\frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h)}_{\equiv e_k} = \frac{\sigma^2_k}{n_k} + e_k \quad \text{if} \quad \gamma_k(0) = \sigma_k^2. -\label{_auto5} +\label{_auto4} \end{align} $$ @@ -789,7 +446,7 @@

    Blocking Transformations, gettin $$ \begin{equation} e_k = \frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h). -\label{_auto6} +\label{_auto5} \end{equation} $$ @@ -803,7 +460,7 @@

    Blocking Transformations, fi \begin{align} n_{j+1} \overline{X}_{j+1} &= \sum_{i=1}^{n_{j+1}} (\hat{X}_{j+1})_i = \frac{1}{2}\sum_{i=1}^{n_{j}/2} (\hat{X}_{j})_{2i-1} + (\hat{X}_{j})_{2i} \nonumber \\ &= \frac{1}{2}\left[ (\hat{X}_j)_1 + (\hat{X}_j)_2 + \cdots + (\hat{X}_j)_{n_j} \right] = \underbrace{\frac{n_j}{2}}_{=n_{j+1}} \overline{X}_j = n_{j+1}\overline{X}_j. -\label{_auto7} +\label{_auto6} \end{align} $$ diff --git a/doc/pub/week9/html/week9.html b/doc/pub/week9/html/week9.html index 17b57fa2..d0446a80 100644 --- a/doc/pub/week9/html/week9.html +++ b/doc/pub/week9/html/week9.html @@ -166,43 +166,6 @@ 2, None, 'introducing-the-correlation-function'), - ('Statistics, wrapping up from last week', - 2, - None, - 'statistics-wrapping-up-from-last-week'), - ('Statistics, final expression', - 2, - None, - 'statistics-final-expression'), - ('Statistics, effective number of correlations', - 2, - None, - 'statistics-effective-number-of-correlations'), - ('Can we understand this? Time Auto-correlation Function', - 2, - None, - 'can-we-understand-this-time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Time Auto-correlation Function', - 2, - None, - 'time-auto-correlation-function'), - ('Correlation Time', 2, None, 'correlation-time'), ('Resampling methods: Blocking', 2, None, @@ -270,9 +233,9 @@
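The blocking transformation above, averaging neighbouring pairs so that \( n_{k+1}=n_k/2 \) while the sample mean is left invariant, can be sketched as follows; the Gaussian test data is an illustrative assumption.

```python
import numpy as np

def block(x):
    """One blocking transformation: average neighbouring pairs, halving n."""
    return 0.5 * (x[0::2] + x[1::2])

rng = np.random.default_rng(3)
x = rng.normal(size=2 ** 14)         # uncorrelated toy data

# the sample mean is invariant under blocking; for uncorrelated data the
# naive variance of the mean, sigma_k^2 / n_k, stays roughly constant, while
# for correlated data it would grow toward a plateau at the true error
means, errs = [], []
while len(x) >= 2:
    means.append(np.mean(x))
    errs.append(np.var(x) / len(x))
    x = block(x)

print(means[0], means[-1])
print(errs[0], errs[-1])
```

In a real analysis one blocks correlated measurements and reads off the error where `errs` flattens out, i.e. where each block has become effectively independent.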

    Overview of week 11, March 11-15

1. Reminder from last week about statistical observables, the central limit theorem and bootstrapping, see notes from last week
-2. Resampling TechniquesL Blocking
-3. Discussion of onebody densities
-4. Start discussion on optimization and parallelization
+2. Resampling Techniques, emphasis on Blocking
+3. Discussion of onebody densities (whiteboard notes)
+4. Start discussion on optimization and parallelization for Python and C++
    @@ -447,7 +410,7 @@

    Rewriting the covariance term

    f_d=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n-d}\tilde{x}_{ik}\tilde{x}_{i(k+d)}. $$ -

    We note that for \( d= \) we have

    +

    We note that for \( d=0 \) we have

    $$ f_0=\frac{2}{mn}\sum_{i=1}^{m} \sum_{k=1}^{n}\tilde{x}_{ik}\tilde{x}_{i(k)}=\sigma^2! $$ @@ -465,312 +428,6 @@

    Introducing the correlation functi

    The code here shows the evolution of \( \kappa_d \) as a function of \( d \) for a series of random numbers. We see that the function \( \kappa_d \) approaches \( 0 \) as \( d\rightarrow \infty \).

    -









    -

    Statistics, wrapping up from last week

    -
    - -

    -

    Let us analyze the problem by splitting up the correlation term into -partial sums of the form: -

    -$$ -f_d = \frac{1}{n-d}\sum_{k=1}^{n-d}(x_k - \bar x_n)(x_{k+d} - \bar x_n) -$$ - -

    The correlation term of the error can now be rewritten in terms of -\( f_d \) -

    -$$ -\frac{2}{n}\sum_{k < l} (x_k - \bar x_n)(x_l - \bar x_n) = -2\sum_{d=1}^{n-1} f_d -$$ - -

The value of \( f_d \) reflects the correlation between measurements -separated by the distance \( d \) in the samples. Notice that for -\( d=0 \), \( f_0 \) is just the sample variance, \( \mathrm{var}(x) \). If we divide \( f_d \) -by \( \mathrm{var}(x) \), we arrive at the so-called autocorrelation function -

    -$$ -\kappa_d = \frac{f_d}{\mathrm{var}(x)} -$$ - -

    which gives us a useful measure of pairwise correlations -starting always at \( 1 \) for \( d=0 \). -
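A short numpy sketch of \( f_d \) and \( \kappa_d \) follows; the AR(1) test data with \( \kappa_d\approx 0.8^d \) is an assumption chosen so the expected answer is known.

```python
import numpy as np

def kappa(x, dmax):
    """Autocorrelation function kappa_d = f_d / var(x), for d = 0, ..., dmax."""
    n = len(x)
    xm = x - np.mean(x)
    var = np.mean(xm ** 2)
    # f_d = (1/(n-d)) * sum_k (x_k - mean)(x_{k+d} - mean)
    return np.array([np.mean(xm[:n - d] * xm[d:]) / var
                     for d in range(dmax + 1)])

# correlated toy data: AR(1) with rho = 0.8, so kappa_d ~ 0.8^d
rng = np.random.default_rng(1)
x = np.empty(20000)
x[0] = rng.normal()
for i in range(1, len(x)):
    x[i] = 0.8 * x[i - 1] + rng.normal()

kd = kappa(x, 20)
tau = 1 + 2 * kd[1:].sum()           # truncated autocorrelation time estimate
print(kd[:4])
print(tau)
```

For this process the exact autocorrelation time is \( (1+0.8)/(1-0.8)=9 \), which the truncated estimate approaches from below.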

    -
    - - -









    -

    Statistics, final expression

    -
    - -

    -

    The sample error can now be -written in terms of the autocorrelation function: -

- -$$ -\begin{align} -\mathrm{err}_X^2 &= -\frac{1}{n}\mathrm{var}(x)+\frac{2}{n}\cdot\mathrm{var}(x)\sum_{d=1}^{n-1} -\frac{f_d}{\mathrm{var}(x)}\nonumber\\ &= -\left(1+2\sum_{d=1}^{n-1}\kappa_d\right)\frac{1}{n}\mathrm{var}(x)\nonumber\\ -&=\frac{\tau}{n}\cdot\mathrm{var}(x) -\label{_auto1} -\end{align} - -$$ - -

    and we see that \( \mathrm{err}_X \) can be expressed in terms the -uncorrelated sample variance times a correction factor \( \tau \) which -accounts for the correlation between measurements. We call this -correction factor the autocorrelation time: -

    -$$ -\begin{equation} -\tau = 1+2\sum_{d=1}^{n-1}\kappa_d -\label{eq:autocorrelation_time} -\end{equation} -$$ -
    - - -









    -

    Statistics, effective number of correlations

    -
    - -

    -

    For a correlation free experiment, \( \tau \) -equals 1. -

    - -

    We can interpret a sequential -correlation as an effective reduction of the number of measurements by -a factor \( \tau \). The effective number of measurements becomes: -

    -$$ -n_\mathrm{eff} = \frac{n}{\tau} -$$ - -

Neglecting the autocorrelation time \( \tau \) will always cause our -simple uncorrelated estimate \( \mathrm{err}_X^2\approx \mathrm{var}(x)/n \) to -be less than the true sample error. The estimate of the error will be -too good. On the other hand, the calculation of the full -autocorrelation time poses an efficiency problem if the set of -measurements is very large. -
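The underestimate can be made concrete with correlated test data whose autocorrelation time is known exactly: an AR(1) process with \( \tau=(1+\rho)/(1-\rho)=9 \), used here purely as a stand-in for Monte Carlo measurements.

```python
import numpy as np

# AR(1) samples standing in for correlated MC measurements; for this process
# the autocorrelation time is known exactly: tau = (1 + rho)/(1 - rho) = 9
rng = np.random.default_rng(7)
rho, n = 0.8, 100_000
x = np.empty(n)
x[0] = rng.normal()
for i in range(1, n):
    x[i] = rho * x[i - 1] + rng.normal()

var = np.var(x)
tau = (1 + rho) / (1 - rho)

naive_err2 = var / n               # uncorrelated estimate: too small by a factor tau
corrected_err2 = tau * var / n     # equivalent to using n_eff = n / tau
n_eff = n / tau

print(naive_err2, corrected_err2, n_eff)
```

With \( \tau=9 \) the naive error bar is a factor \( \sqrt{9}=3 \) too small, even though the simulation produced \( 10^5 \) samples: only about \( 1.1\times 10^4 \) of them are effectively independent.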

    -
    - - -









    -

    Can we understand this? Time Auto-correlation Function

    -
    - -

    - -

    The so-called time-displacement autocorrelation \( \phi(t) \) for a quantity \( \mathbf{M} \) is given by

    -$$ -\phi(t) = \int dt' \left[\mathbf{M}(t')-\langle \mathbf{M} \rangle\right]\left[\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle\right], -$$ - -

    which can be rewritten as

    -$$ -\phi(t) = \int dt' \left[\mathbf{M}(t')\mathbf{M}(t'+t)-\langle \mathbf{M} \rangle^2\right], -$$ - -

where \( \langle \mathbf{M} \rangle \) is the average value and -\( \mathbf{M}(t) \) its instantaneous value. We can discretize this function as follows, using our -set of computed values \( \mathbf{M}(t) \) at a set of discretized times (our Monte Carlo cycles, each corresponding to a move of all the electrons) -

    -$$ -\phi(t) = \frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\mathbf{M}(t'+t) --\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t')\times -\frac{1}{t_{\mathrm{max}}-t}\sum_{t'=0}^{t_{\mathrm{max}}-t}\mathbf{M}(t'+t). -\label{eq:phitf} -$$ -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    - -

One should be careful with times close to \( t_{\mathrm{max}} \): there the upper limit of the sums -becomes small and we end up integrating over a rather small time interval. This means that the statistical -error in \( \phi(t) \) due to the random nature of the fluctuations in \( \mathbf{M}(t) \) can become large. -

    - -

    One should therefore choose \( t \ll t_{\mathrm{max}} \).

    - -

Note that the variable \( \mathbf{M} \) can be any observable of interest.

    - -

The time-correlation function gives a measure of the correlation between the values of the variable -at a time \( t' \) and a time \( t'+t \). If we multiply the values of \( \mathbf{M} \) at these two different times, -we get a positive contribution if they fluctuate in the same direction, or a negative value -if they fluctuate in opposite directions. If we then integrate over time, or use the discretized version, the time correlation function \( \phi(t) \) takes a non-zero value if the fluctuations are -correlated; otherwise it gradually goes to zero. For times far apart -the different values of \( \mathbf{M} \) are most likely -uncorrelated and \( \phi(t) \) should be zero. -

    -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    -

    We can derive the correlation time by observing that our Metropolis algorithm is based on a random -walk in the space of all possible spin configurations. -Our probability -distribution function \( \mathbf{\hat{w}}(t) \) after a given number of time steps \( t \) could be written as -

    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}^t\hat{w}}(0), -$$ - -

with \( \mathbf{\hat{w}}(0) \) the distribution at \( t=0 \) and \( \mathbf{\hat{W}} \) representing the -transition probability matrix. -We can always expand \( \mathbf{\hat{w}}(0) \) in terms of the right eigenvectors -\( \mathbf{\hat{v}}_i \) of \( \mathbf{\hat{W}} \) as -

    -$$ - \mathbf{\hat{w}}(0) = \sum_i\alpha_i\mathbf{\hat{v}}_i, -$$ - -

    resulting in

    -$$ - \mathbf{\hat{w}}(t) = \mathbf{\hat{W}}^t\mathbf{\hat{w}}(0)=\mathbf{\hat{W}}^t\sum_i\alpha_i\mathbf{\hat{v}}_i= -\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i, -$$ - -

    with \( \lambda_i \) the \( i^{\mathrm{th}} \) eigenvalue corresponding to -the eigenvector \( \mathbf{\hat{v}}_i \). -

    -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    -

If we assume that \( \lambda_0 \) is the largest eigenvalue we see that in the limit \( t\rightarrow \infty \), -\( \mathbf{\hat{w}}(t) \) becomes proportional to the corresponding eigenvector -\( \mathbf{\hat{v}}_0 \). This is our steady state or final distribution. -

    - -

We can relate this property to an observable like the mean energy. -With the probability \( \mathbf{\hat{w}}(t) \) (which in our case is the squared trial wave function) we -can write the expectation values as -

    -$$ - \langle \mathbf{M}(t) \rangle = \sum_{\mu} \mathbf{\hat{w}}(t)_{\mu}\mathbf{M}_{\mu}, -$$ - -

or as a scalar product of vectors

    -$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}, -$$ - -

    with \( \mathbf{m} \) being the vector whose elements are the values of \( \mathbf{M}_{\mu} \) in its -various microstates \( \mu \). -

    -
    - - -









    -

    Time Auto-correlation Function

    - -
    - -

    - -

    We rewrite this relation as

-$$ - \langle \mathbf{M}(t) \rangle = \mathbf{\hat{w}}(t)\mathbf{m}=\sum_i\lambda_i^t\alpha_i\mathbf{\hat{v}}_i\mathbf{m}. -$$ - -

If we define \( m_i=\mathbf{\hat{v}}_i\mathbf{m} \) as the expectation value of -\( \mathbf{M} \) in the \( i^{\mathrm{th}} \) eigenstate we can rewrite the last equation as -

    -$$ - \langle \mathbf{M}(t) \rangle = \sum_i\lambda_i^t\alpha_im_i. -$$ - -

Since we have that in the limit \( t\rightarrow \infty \) the mean value is dominated by the -largest eigenvalue \( \lambda_0 \), we can rewrite the last equation as -

    -$$ - \langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\lambda_i^t\alpha_im_i. -$$ - -

    We define the quantity

-$$ - \tau_i=-\frac{1}{\log\lambda_i}, -$$ - -

    and rewrite the last expectation value as

    -$$ - \langle \mathbf{M}(t) \rangle = \langle \mathbf{M}(\infty) \rangle+\sum_{i\ne 0}\alpha_im_ie^{-t/\tau_i}. -\label{eq:finalmeanm} -$$ -
    - - -









    -

    Time Auto-correlation Function

    -
    - -

    - -

The quantities \( \tau_i \) are the correlation times for the system. They also control the auto-correlation function -discussed above. The longest correlation time \( \tau_1 \) corresponds to the second largest -eigenvalue \( \lambda_1 \) and normally defines the correlation time discussed above. For large times, this is the -only correlation time that survives. If higher eigenvalues of the transition matrix are well separated from -\( \lambda_1 \) and we simulate long enough, \( \tau_1 \) may well define the correlation time. -In other cases we may not be able to extract a reliable result for \( \tau_1 \). -Coming back to the time correlation function \( \phi(t) \) we can present a more general definition in terms -of the mean magnetizations \( \langle \mathbf{M}(t) \rangle \). Recalling that the mean value is equal -to \( \langle \mathbf{M}(\infty) \rangle \) we arrive at the expectation values -

    -$$ -\phi(t) =\langle \mathbf{M}(0)-\mathbf{M}(\infty)\rangle \langle \mathbf{M}(t)-\mathbf{M}(\infty)\rangle, -$$ - -

    resulting in

-$$ -\phi(t) =\sum_{i,j\ne 0}m_i\alpha_im_j\alpha_je^{-t/\tau_j}, -$$ - -

    which is appropriate for all times.

    -
    - - -









    -

    Correlation Time

    -
    - -

    - -

    If the correlation function decays exponentially

    -$$ \phi (t) \sim \exp{(-t/\tau)}$$ - -

    then the exponential correlation time can be computed as the average

-$$ \tau_{\mathrm{exp}} = -\langle \frac{t}{\log|\frac{\phi(t)}{\phi(0)}|} \rangle. $$ - -

    If the decay is exponential, then

    -$$ \int_0^{\infty} dt \phi(t) = \int_0^{\infty} dt \phi(0)\exp{(-t/\tau)} = \tau \phi(0),$$ - -

    which suggests another measure of correlation

    -$$ \tau_{\mathrm{int}} = \sum_k \frac{\phi(k)}{\phi(0)}, $$ - -

    called the integrated correlation time.

    -
    - -









    Resampling methods: Blocking

    @@ -811,7 +468,7 @@

    Blocking Transformations

    (\vec{X}_0)_k &\equiv (\vec{X})_k \nonumber \\ (\vec{X}_{i+1})_k &\equiv \frac{1}{2}\Big( (\vec{X}_i)_{2k-1} + (\vec{X}_i)_{2k} \Big) \qquad \text{for all} \qquad 1 \leq i \leq d-1 -\label{_auto2} +\label{_auto1} \end{align} $$ @@ -844,9 +501,9 @@

    Blocking Transformations

    \gamma_{k+1}(h) &= cov\left( ({X}_{k+1})_{i}, ({X}_{k+1})_{j} \right) \nonumber \\ &= \frac{1}{4}cov\left( ({X}_{k})_{2i-1} + ({X}_{k})_{2i}, ({X}_{k})_{2j-1} + ({X}_{k})_{2j} \right) \nonumber \\ &= \frac{1}{2}\gamma_{k}(2h) + \frac{1}{2}\gamma_k(2h+1) \hspace{0.1cm} \mathrm{h = 0} -\label{_auto3}\\ +\label{_auto2}\\ &=\frac{1}{4}\gamma_k(2h-1) + \frac{1}{2}\gamma_k(2h) + \frac{1}{4}\gamma_k(2h+1) \quad \mathrm{else} -\label{_auto4} +\label{_auto3} \end{align} $$ @@ -858,7 +515,7 @@

    Blocking Transformations, gettin $$ \begin{align} V(\overline{X}_k) = \frac{\sigma_k^2}{n_k} + \underbrace{\frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h)}_{\equiv e_k} = \frac{\sigma^2_k}{n_k} + e_k \quad \text{if} \quad \gamma_k(0) = \sigma_k^2. -\label{_auto5} +\label{_auto4} \end{align} $$ @@ -866,7 +523,7 @@

    Blocking Transformations, gettin $$ \begin{equation} e_k = \frac{2}{n_k} \sum_{h=1}^{n_k-1}\left( 1 - \frac{h}{n_k} \right)\gamma_k(h). -\label{_auto6} +\label{_auto5} \end{equation} $$ @@ -880,7 +537,7 @@

    Blocking Transformations, fi \begin{align} n_{j+1} \overline{X}_{j+1} &= \sum_{i=1}^{n_{j+1}} (\hat{X}_{j+1})_i = \frac{1}{2}\sum_{i=1}^{n_{j}/2} (\hat{X}_{j})_{2i-1} + (\hat{X}_{j})_{2i} \nonumber \\ &= \frac{1}{2}\left[ (\hat{X}_j)_1 + (\hat{X}_j)_2 + \cdots + (\hat{X}_j)_{n_j} \right] = \underbrace{\frac{n_j}{2}}_{=n_{j+1}} \overline{X}_j = n_{j+1}\overline{X}_j. -\label{_auto7} +\label{_auto6} \end{align} $$ diff --git a/doc/pub/week9/ipynb/ipynb-week9-src.tar.gz b/doc/pub/week9/ipynb/ipynb-week9-src.tar.gz index fae386c8..27d96613 100644 Binary files a/doc/pub/week9/ipynb/ipynb-week9-src.tar.gz and b/doc/pub/week9/ipynb/ipynb-week9-src.tar.gz differ diff --git a/doc/pub/week9/ipynb/week9.ipynb b/doc/pub/week9/ipynb/week9.ipynb index 25303b9e..c9d937ec 100644 --- a/doc/pub/week9/ipynb/week9.ipynb +++ b/doc/pub/week9/ipynb/week9.ipynb @@ -2,7 +2,7 @@ "cells": [ { "cell_type": "markdown", - "id": "12750b68", + "id": "112079bb", "metadata": { "editable": true }, @@ -14,7 +14,7 @@ }, { "cell_type": "markdown", - "id": "b1da46d5", + "id": "01330897", "metadata": { "editable": true }, @@ -27,7 +27,7 @@ }, { "cell_type": "markdown", - "id": "53456ea2", + "id": "f02c162a", "metadata": { "editable": true }, @@ -37,11 +37,11 @@ "\n", "1. Reminder from last week about statistical observables, the central limit theorem and bootstrapping, see notes from last week\n", "\n", - "2. Resampling TechniquesL Blocking \n", + "2. Resampling Techniques, emphasis on Blocking \n", "\n", - "3. Discussion of onebody densities\n", + "3. Discussion of onebody densities (whiteboard notes)\n", "\n", - "4. Start discussion on optimization and parallelization\n", + "4. 
Start discussion on optimization and parallelization for Python and C++\n", "\n", "\n", "\n", @@ -54,7 +54,7 @@ }, { "cell_type": "markdown", - "id": "2ee9e574", + "id": "fdd2e018", "metadata": { "editable": true }, @@ -71,7 +71,7 @@ }, { "cell_type": "markdown", - "id": "43a452a7", + "id": "b0b7d10d", "metadata": { "editable": true }, @@ -90,7 +90,7 @@ }, { "cell_type": "markdown", - "id": "44388dbc", + "id": "90baf4e8", "metadata": { "editable": true }, @@ -108,7 +108,7 @@ }, { "cell_type": "markdown", - "id": "ff862041", + "id": "17222598", "metadata": { "editable": true }, @@ -125,7 +125,7 @@ }, { "cell_type": "markdown", - "id": "7018aa59", + "id": "906c2dfa", "metadata": { "editable": true }, @@ -137,7 +137,7 @@ }, { "cell_type": "markdown", - "id": "b3e53daa", + "id": "12e3c52f", "metadata": { "editable": true }, @@ -147,7 +147,7 @@ }, { "cell_type": "markdown", - "id": "0e5f0485", + "id": "6c6118fa", "metadata": { "editable": true }, @@ -159,7 +159,7 @@ }, { "cell_type": "markdown", - "id": "a9836a57", + "id": "887c2e0a", "metadata": { "editable": true }, @@ -172,7 +172,7 @@ }, { "cell_type": "markdown", - "id": "723c0ccc", + "id": "acafea2b", "metadata": { "editable": true }, @@ -187,7 +187,7 @@ }, { "cell_type": "markdown", - "id": "8f034b21", + "id": "e35d683b", "metadata": { "editable": true }, @@ -199,7 +199,7 @@ }, { "cell_type": "markdown", - "id": "5ab3ae28", + "id": "2284213e", "metadata": { "editable": true }, @@ -209,7 +209,7 @@ }, { "cell_type": "markdown", - "id": "4dd47830", + "id": "38fc350c", "metadata": { "editable": true }, @@ -221,7 +221,7 @@ }, { "cell_type": "markdown", - "id": "6519d734", + "id": "e6a10e33", "metadata": { "editable": true }, @@ -231,7 +231,7 @@ }, { "cell_type": "markdown", - "id": "8aa49cf6", + "id": "77708cbe", "metadata": { "editable": true }, @@ -243,7 +243,7 @@ }, { "cell_type": "markdown", - "id": "9868626f", + "id": "a7d8f255", "metadata": { "editable": true }, @@ -255,7 +255,7 @@ }, { "cell_type": "markdown", - 
"id": "64f8e4fb", + "id": "0efe976e", "metadata": { "editable": true }, @@ -265,7 +265,7 @@ }, { "cell_type": "markdown", - "id": "1cae23a5", + "id": "b1a1fbc4", "metadata": { "editable": true }, @@ -277,7 +277,7 @@ }, { "cell_type": "markdown", - "id": "6ee57ff3", + "id": "c1520556", "metadata": { "editable": true }, @@ -287,7 +287,7 @@ }, { "cell_type": "markdown", - "id": "e11a07c7", + "id": "d558f855", "metadata": { "editable": true }, @@ -299,7 +299,7 @@ }, { "cell_type": "markdown", - "id": "3aba5d10", + "id": "0ddd692b", "metadata": { "editable": true }, @@ -311,7 +311,7 @@ }, { "cell_type": "markdown", - "id": "5d0798e9", + "id": "10116eff", "metadata": { "editable": true }, @@ -326,7 +326,7 @@ }, { "cell_type": "markdown", - "id": "d2f74fcf", + "id": "e6854707", "metadata": { "editable": true }, @@ -336,7 +336,7 @@ }, { "cell_type": "markdown", - "id": "038736f0", + "id": "428e0cc5", "metadata": { "editable": true }, @@ -348,7 +348,7 @@ }, { "cell_type": "markdown", - "id": "280977f2", + "id": "95748e29", "metadata": { "editable": true }, @@ -362,7 +362,7 @@ }, { "cell_type": "markdown", - "id": "c5e164f5", + "id": "c16de7af", "metadata": { "editable": true }, @@ -381,7 +381,7 @@ }, { "cell_type": "markdown", - "id": "45c10537", + "id": "4c34dcab", "metadata": { "editable": true }, @@ -393,7 +393,7 @@ }, { "cell_type": "markdown", - "id": "24888c29", + "id": "2fa3ac90", "metadata": { "editable": true }, @@ -405,7 +405,7 @@ }, { "cell_type": "markdown", - "id": "73204442", + "id": "ba09ecf3", "metadata": { "editable": true }, @@ -415,7 +415,7 @@ }, { "cell_type": "markdown", - "id": "772baecb", + "id": "121cd516", "metadata": { "editable": true }, @@ -427,17 +427,17 @@ }, { "cell_type": "markdown", - "id": "9e7bf14e", + "id": "758c4303", "metadata": { "editable": true }, "source": [ - "We note that for $d=$ we have" + "We note that for $d=0$ we have" ] }, { "cell_type": "markdown", - "id": "85596d96", + "id": "5f24791c", "metadata": { "editable": true }, @@ 
-449,7 +449,7 @@ }, { "cell_type": "markdown", - "id": "69b75edd", + "id": "31ff5de6", "metadata": { "editable": true }, @@ -461,7 +461,7 @@ }, { "cell_type": "markdown", - "id": "e7e93793", + "id": "40587bca", "metadata": { "editable": true }, @@ -475,7 +475,7 @@ }, { "cell_type": "markdown", - "id": "48c0c753", + "id": "c6af6233", "metadata": { "editable": true }, @@ -485,753 +485,7 @@ }, { "cell_type": "markdown", - "id": "d3ba363d", - "metadata": { - "editable": true - }, - "source": [ - "## Statistics, wrapping up from last week\n", - "Let us analyze the problem by splitting up the correlation term into\n", - "partial sums of the form:" - ] - }, - { - "cell_type": "markdown", - "id": "53d50ad9", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "f_d = \\frac{1}{n-d}\\sum_{k=1}^{n-d}(x_k - \\bar x_n)(x_{k+d} - \\bar x_n)\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "79b06934", - "metadata": { - "editable": true - }, - "source": [ - "The correlation term of the error can now be rewritten in terms of\n", - "$f_d$" - ] - }, - { - "cell_type": "markdown", - "id": "f45277c1", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\frac{2}{n}\\sum_{k\n", - "
    \n", - "\n", - "$$\n", - "\\begin{equation} \n", - "=\\frac{\\tau}{n}\\cdot\\mathrm{var}(x)\n", - "\\label{_auto1} \\tag{1}\n", - "\\end{equation}\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "27082b05", - "metadata": { - "editable": true - }, - "source": [ - "and we see that $\\mathrm{err}_X$ can be expressed in terms the\n", - "uncorrelated sample variance times a correction factor $\\tau$ which\n", - "accounts for the correlation between measurements. We call this\n", - "correction factor the *autocorrelation time*:" - ] - }, - { - "cell_type": "markdown", - "id": "ebba6d5b", - "metadata": { - "editable": true - }, - "source": [ - "\n", - "
    \n", - "\n", - "$$\n", - "\\begin{equation}\n", - "\\tau = 1+2\\sum_{d=1}^{n-1}\\kappa_d\n", - "\\label{eq:autocorrelation_time} \\tag{2}\n", - "\\end{equation}\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "cd7bc744", - "metadata": { - "editable": true - }, - "source": [ - "## Statistics, effective number of correlations\n", - "For a correlation free experiment, $\\tau$\n", - "equals 1.\n", - "\n", - "We can interpret a sequential\n", - "correlation as an effective reduction of the number of measurements by\n", - "a factor $\\tau$. The effective number of measurements becomes:" - ] - }, - { - "cell_type": "markdown", - "id": "c0c2d545", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "n_\\mathrm{eff} = \\frac{n}{\\tau}\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "7feb6f1b", - "metadata": { - "editable": true - }, - "source": [ - "To neglect the autocorrelation time $\\tau$ will always cause our\n", - "simple uncorrelated estimate of $\\mathrm{err}_X^2\\approx \\mathrm{var}(x)/n$ to\n", - "be less than the true sample error. The estimate of the error will be\n", - "too *good*. On the other hand, the calculation of the full\n", - "autocorrelation time poses an efficiency problem if the set of\n", - "measurements is very large." - ] - }, - { - "cell_type": "markdown", - "id": "6a5bebd4", - "metadata": { - "editable": true - }, - "source": [ - "## Can we understand this? 
Time Auto-correlation Function\n", - "\n", - "The so-called time-displacement autocorrelation $\\phi(t)$ for a quantity $\\mathbf{M}$ is given by" - ] - }, - { - "cell_type": "markdown", - "id": "7bb653ea", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\phi(t) = \\int dt' \\left[\\mathbf{M}(t')-\\langle \\mathbf{M} \\rangle\\right]\\left[\\mathbf{M}(t'+t)-\\langle \\mathbf{M} \\rangle\\right],\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "796d4279", - "metadata": { - "editable": true - }, - "source": [ - "which can be rewritten as" - ] - }, - { - "cell_type": "markdown", - "id": "d6b3c534", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\phi(t) = \\int dt' \\left[\\mathbf{M}(t')\\mathbf{M}(t'+t)-\\langle \\mathbf{M} \\rangle^2\\right],\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "d0daae4a", - "metadata": { - "editable": true - }, - "source": [ - "where $\\langle \\mathbf{M} \\rangle$ is the average value and\n", - "$\\mathbf{M}(t)$ its instantaneous value. We can discretize this function as follows, where we used our\n", - "set of computed values $\\mathbf{M}(t)$ for a set of discretized times (our Monte Carlo cycles corresponding to moving all electrons?)" - ] - }, - { - "cell_type": "markdown", - "id": "fe972039", - "metadata": { - "editable": true - }, - "source": [ - "\n", - "
    \n", - "\n", - "$$\n", - "\\phi(t) = \\frac{1}{t_{\\mathrm{max}}-t}\\sum_{t'=0}^{t_{\\mathrm{max}}-t}\\mathbf{M}(t')\\mathbf{M}(t'+t)\n", - "-\\frac{1}{t_{\\mathrm{max}}-t}\\sum_{t'=0}^{t_{\\mathrm{max}}-t}\\mathbf{M}(t')\\times\n", - "\\frac{1}{t_{\\mathrm{max}}-t}\\sum_{t'=0}^{t_{\\mathrm{max}}-t}\\mathbf{M}(t'+t).\n", - "\\label{eq:phitf} \\tag{3}\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "577486f0", - "metadata": { - "editable": true - }, - "source": [ - "## Time Auto-correlation Function\n", - "\n", - "One should be careful with times close to $t_{\\mathrm{max}}$, the upper limit of the sums \n", - "becomes small and we end up integrating over a rather small time interval. This means that the statistical\n", - "error in $\\phi(t)$ due to the random nature of the fluctuations in $\\mathbf{M}(t)$ can become large.\n", - "\n", - "One should therefore choose $t \\ll t_{\\mathrm{max}}$.\n", - "\n", - "Note that the variable $\\mathbf{M}$ can be any expectation values of interest.\n", - "\n", - "The time-correlation function gives a measure of the correlation between the various values of the variable \n", - "at a time $t'$ and a time $t'+t$. If we multiply the values of $\\mathbf{M}$ at these two different times,\n", - "we will get a positive contribution if they are fluctuating in the same direction, or a negative value\n", - "if they fluctuate in the opposite direction. If we then integrate over time, or use the discretized version of, the time correlation function $\\phi(t)$ should take a non-zero value if the fluctuations are \n", - "correlated, else it should gradually go to zero. For times a long way apart \n", - "the different values of $\\mathbf{M}$ are most likely \n", - "uncorrelated and $\\phi(t)$ should be zero." 
- ] - }, - { - "cell_type": "markdown", - "id": "edfc3447", - "metadata": { - "editable": true - }, - "source": [ - "## Time Auto-correlation Function\n", - "We can derive the correlation time by observing that our Metropolis algorithm is based on a random\n", - "walk in the space of all possible spin configurations. \n", - "Our probability \n", - "distribution function $\\mathbf{\\hat{w}}(t)$ after a given number of time steps $t$ can be written as" - ] - }, - { - "cell_type": "markdown", - "id": "8c9cc31a", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\mathbf{\\hat{w}}(t) = \\mathbf{\\hat{W}^t\\hat{w}}(0),\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "e7743323", - "metadata": { - "editable": true - }, - "source": [ - "with $\\mathbf{\\hat{w}}(0)$ the distribution at $t=0$ and $\\mathbf{\\hat{W}}$ representing the \n", - "transition probability matrix. \n", - "We can always expand $\\mathbf{\\hat{w}}(0)$ in terms of the right eigenvectors \n", - "$\\mathbf{\\hat{v}}_i$ of $\\mathbf{\\hat{W}}$ as" - ] - }, - { - "cell_type": "markdown", - "id": "fa73b616", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\mathbf{\\hat{w}}(0) = \\sum_i\\alpha_i\\mathbf{\\hat{v}}_i,\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "4eadcd46", - "metadata": { - "editable": true - }, - "source": [ - "resulting in" - ] - }, - { - "cell_type": "markdown", - "id": "ca749461", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\mathbf{\\hat{w}}(t) = \\mathbf{\\hat{W}}^t\\mathbf{\\hat{w}}(0)=\\mathbf{\\hat{W}}^t\\sum_i\\alpha_i\\mathbf{\\hat{v}}_i=\n", - "\\sum_i\\lambda_i^t\\alpha_i\\mathbf{\\hat{v}}_i,\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "5b5685d2", - "metadata": { - "editable": true - }, - "source": [ - "with $\\lambda_i$ the $i^{\\mathrm{th}}$ eigenvalue corresponding to \n", - "the eigenvector $\\mathbf{\\hat{v}}_i$."
- ] - }, - { - "cell_type": "markdown", - "id": "3a5e6c97", - "metadata": { - "editable": true - }, - "source": [ - "## Time Auto-correlation Function\n", - "If we assume that $\\lambda_0$ is the largest eigenvalue we see that in the limit $t\\rightarrow \\infty$,\n", - "$\\mathbf{\\hat{w}}(t)$ becomes proportional to the corresponding eigenvector \n", - "$\\mathbf{\\hat{v}}_0$. This is our steady state or final distribution. \n", - "\n", - "We can relate this property to an observable like the mean energy.\n", - "With the probability $\\mathbf{\\hat{w}}(t)$ (which in our case is the squared trial wave function) we\n", - "can write the expectation values as" - ] - }, - { - "cell_type": "markdown", - "id": "59db208d", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\langle \\mathbf{M}(t) \\rangle = \\sum_{\\mu} \\mathbf{\\hat{w}}(t)_{\\mu}\\mathbf{M}_{\\mu},\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "0a820f73", - "metadata": { - "editable": true - }, - "source": [ - "or as a scalar product of vectors" - ] - }, - { - "cell_type": "markdown", - "id": "b0c94774", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\langle \\mathbf{M}(t) \\rangle = \\mathbf{\\hat{w}}(t)\\mathbf{m},\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "f79b2798", - "metadata": { - "editable": true - }, - "source": [ - "with $\\mathbf{m}$ being the vector whose elements are the values of $\\mathbf{M}_{\\mu}$ in its \n", - "various microstates $\\mu$."
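The decay toward the steady state can be made concrete with a toy two-state transition matrix (our own example, not from the notes): the eigenvalue $\lambda_0=1$ carries the stationary distribution, while the second eigenvalue sets the correlation time $\tau_1=-1/\log{\lambda_1}$.

```python
import numpy as np

# Toy 2-state transition matrix (columns sum to 1, so probability is conserved).
W = np.array([[0.8, 0.3],
              [0.2, 0.7]])

lam, vecs = np.linalg.eig(W)
print(np.sort(lam))          # eigenvalues 0.5 and 1.0

w = np.array([1.0, 0.0])     # start entirely in state 0
for _ in range(50):
    w = W @ w                # w(t) = W^t w(0)
print(w)                     # approaches the steady state (0.6, 0.4)

tau1 = -1.0 / np.log(np.sort(lam)[0])
print(tau1)                  # correlation time set by the second eigenvalue
```

Because $\lambda_1^t = e^{-t/\tau_1}$, the distance to the steady state shrinks by a factor $e$ every $\tau_1$ steps.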
- ] - }, - { - "cell_type": "markdown", - "id": "4c53b179", - "metadata": { - "editable": true - }, - "source": [ - "## Time Auto-correlation Function\n", - "\n", - "We rewrite this relation as" - ] - }, - { - "cell_type": "markdown", - "id": "013fe1b9", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\langle \\mathbf{M}(t) \\rangle = \\mathbf{\\hat{w}}(t)\\mathbf{m}=\\sum_i\\lambda_i^t\\alpha_i\\mathbf{\\hat{v}}_i\\mathbf{m}.\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "03ea9dcb", - "metadata": { - "editable": true - }, - "source": [ - "If we define $m_i=\\mathbf{\\hat{v}}_i\\mathbf{m}$ as the expectation value of\n", - "$\\mathbf{M}$ in the $i^{\\mathrm{th}}$ eigenstate we can rewrite the last equation as" - ] - }, - { - "cell_type": "markdown", - "id": "f562d158", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\langle \\mathbf{M}(t) \\rangle = \\sum_i\\lambda_i^t\\alpha_im_i.\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "a379dbdb", - "metadata": { - "editable": true - }, - "source": [ - "Since we have that in the limit $t\\rightarrow \\infty$ the mean value is dominated by \n", - "the largest eigenvalue $\\lambda_0$, we can rewrite the last equation as" - ] - }, - { - "cell_type": "markdown", - "id": "4827c0d3", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\langle \\mathbf{M}(t) \\rangle = \\langle \\mathbf{M}(\\infty) \\rangle+\\sum_{i\\ne 0}\\lambda_i^t\\alpha_im_i.\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "e729d2fb", - "metadata": { - "editable": true - }, - "source": [ - "We define the quantity" - ] - }, - { - "cell_type": "markdown", - "id": "b8830395", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\tau_i=-\\frac{1}{\\log{\\lambda_i}},\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "cbe751ca", - "metadata": { - "editable": true - }, - "source": [ - "and rewrite the last expectation value as" - ] - }, - { - 
"cell_type": "markdown", - "id": "cf7f0699", - "metadata": { - "editable": true - }, - "source": [ - "\n", - "
    \n", - "\n", - "$$\n", - "\\langle \\mathbf{M}(t) \\rangle = \\langle \\mathbf{M}(\\infty) \\rangle+\\sum_{i\\ne 0}\\alpha_im_ie^{-t/\\tau_i}.\n", - "\\label{eq:finalmeanm} \\tag{4}\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "60345a06", - "metadata": { - "editable": true - }, - "source": [ - "## Time Auto-correlation Function\n", - "\n", - "The quantities $\\tau_i$ are the correlation times for the system. They also control the auto-correlation function \n", - "discussed above. The longest correlation time $\\tau_1$ is obviously given by the second largest\n", - "eigenvalue $\\lambda_1$, and normally defines the correlation time discussed above. For large times, this is the \n", - "only correlation time that survives. If higher eigenvalues of the transition matrix are well separated from \n", - "$\\lambda_1$ and we simulate long enough, $\\tau_1$ may well define the correlation time. \n", - "In other cases we may not be able to extract a reliable result for $\\tau_1$. \n", - "Coming back to the time correlation function $\\phi(t)$ we can present a more general definition in terms\n", - "of the mean magnetizations $ \\langle \\mathbf{M}(t) \\rangle$. 
Recalling that the mean value is equal \n", - "to $ \\langle \\mathbf{M}(\\infty) \\rangle$ we arrive at" - ] - }, - { - "cell_type": "markdown", - "id": "88881ebf", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\phi(t) =\\langle \\mathbf{M}(0)-\\mathbf{M}(\\infty)\\rangle \\langle \\mathbf{M}(t)-\\mathbf{M}(\\infty)\\rangle,\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "383fbdc9", - "metadata": { - "editable": true - }, - "source": [ - "resulting in" - ] - }, - { - "cell_type": "markdown", - "id": "8df6f5ff", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\phi(t) =\\sum_{i,j\\ne 0}m_i\\alpha_im_j\\alpha_je^{-t/\\tau_i},\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "ba2b0887", - "metadata": { - "editable": true - }, - "source": [ - "which is appropriate for all times." - ] - }, - { - "cell_type": "markdown", - "id": "c6abff3f", - "metadata": { - "editable": true - }, - "source": [ - "## Correlation Time\n", - "\n", - "If the correlation function decays exponentially" - ] - }, - { - "cell_type": "markdown", - "id": "a8a8d079", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\phi (t) \\sim \\exp{(-t/\\tau)}\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "caf13e28", - "metadata": { - "editable": true - }, - "source": [ - "then the exponential correlation time can be computed as the average" - ] - }, - { - "cell_type": "markdown", - "id": "a2f481d1", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\tau_{\\mathrm{exp}} = -\\langle \\frac{t}{\\log{|\\frac{\\phi(t)}{\\phi(0)}|}} \\rangle.\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "b324d7c1", - "metadata": { - "editable": true - }, - "source": [ - "If the decay is exponential, then" - ] - }, - { - "cell_type": "markdown", - "id": "da8faf42", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\int_0^{\\infty} dt \\phi(t) = \\int_0^{\\infty} dt 
\\phi(0)\\exp{(-t/\\tau)} = \\tau \\phi(0),\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "93d041f0", - "metadata": { - "editable": true - }, - "source": [ - "which suggests another measure of correlation" - ] - }, - { - "cell_type": "markdown", - "id": "43a0dfc5", - "metadata": { - "editable": true - }, - "source": [ - "$$\n", - "\\tau_{\\mathrm{int}} = \\sum_k \\frac{\\phi(k)}{\\phi(0)},\n", - "$$" - ] - }, - { - "cell_type": "markdown", - "id": "64917ea7", - "metadata": { - "editable": true - }, - "source": [ - "called the integrated correlation time." - ] - }, - { - "cell_type": "markdown", - "id": "b94310aa", + "id": "0b88af71", "metadata": { "editable": true }, @@ -1249,7 +503,7 @@ }, { "cell_type": "markdown", - "id": "4e4e15a5", + "id": "94b1edde", "metadata": { "editable": true }, @@ -1263,7 +517,7 @@ }, { "cell_type": "markdown", - "id": "f3f20865", + "id": "af97e0d7", "metadata": { "editable": true }, @@ -1276,7 +530,7 @@ }, { "cell_type": "markdown", - "id": "da75ce5b", + "id": "7feaa251", "metadata": { "editable": true }, @@ -1293,7 +547,7 @@ }, { "cell_type": "markdown", - "id": "cb8a392e", + "id": "9d5024aa", "metadata": { "editable": true }, @@ -1305,26 +559,26 @@ }, { "cell_type": "markdown", - "id": "42b4359c", + "id": "08342d94", "metadata": { "editable": true }, "source": [ "\n", - "
    \n", + "
    \n", "\n", "$$\n", "\\begin{equation} \n", "(\\vec{X}_{i+1})_k \\equiv \\frac{1}{2}\\Big( (\\vec{X}_i)_{2k-1} +\n", "(\\vec{X}_i)_{2k} \\Big) \\qquad \\text{for all} \\qquad 1 \\leq i \\leq d-1\n", - "\\label{_auto2} \\tag{5}\n", + "\\label{_auto1} \\tag{1}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "cfa30335", + "id": "35d66f14", "metadata": { "editable": true }, @@ -1346,7 +600,7 @@ }, { "cell_type": "markdown", - "id": "294a10f5", + "id": "bd0d6c0d", "metadata": { "editable": true }, @@ -1361,7 +615,7 @@ }, { "cell_type": "markdown", - "id": "eac0df17", + "id": "c7771171", "metadata": { "editable": true }, @@ -1373,7 +627,7 @@ }, { "cell_type": "markdown", - "id": "862ec0bf", + "id": "b013c68e", "metadata": { "editable": true }, @@ -1385,43 +639,43 @@ }, { "cell_type": "markdown", - "id": "0aa4d113", + "id": "5aba9b30", "metadata": { "editable": true }, "source": [ "\n", - "
    \n", + "
    \n", "\n", "$$\n", "\\begin{equation} \n", "= \\frac{1}{2}\\gamma_{k}(2h) + \\frac{1}{2}\\gamma_k(2h+1) \\hspace{0.1cm} \\mathrm{h = 0} \n", - "\\label{_auto3} \\tag{6}\n", + "\\label{_auto2} \\tag{2}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "d640122e", + "id": "80359834", "metadata": { "editable": true }, "source": [ "\n", - "
    \n", + "
    \n", "\n", "$$\n", "\\begin{equation} \n", "=\\frac{1}{4}\\gamma_k(2h-1) + \\frac{1}{2}\\gamma_k(2h) + \\frac{1}{4}\\gamma_k(2h+1) \\quad \\mathrm{else}\n", - "\\label{_auto4} \\tag{7}\n", + "\\label{_auto3} \\tag{3}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "9d7801b1", + "id": "b4609263", "metadata": { "editable": true }, @@ -1431,7 +685,7 @@ }, { "cell_type": "markdown", - "id": "d05e9e9d", + "id": "4d7c4cf9", "metadata": { "editable": true }, @@ -1442,25 +696,25 @@ }, { "cell_type": "markdown", - "id": "2f2f325b", + "id": "bcb7a18f", "metadata": { "editable": true }, "source": [ "\n", - "
    \n", + "
    \n", "\n", "$$\n", "\\begin{equation}\n", "V(\\overline{X}_k) = \\frac{\\sigma_k^2}{n_k} + \\underbrace{\\frac{2}{n_k} \\sum_{h=1}^{n_k-1}\\left( 1 - \\frac{h}{n_k} \\right)\\gamma_k(h)}_{\\equiv e_k} = \\frac{\\sigma^2_k}{n_k} + e_k \\quad \\text{if} \\quad \\gamma_k(0) = \\sigma_k^2. \n", - "\\label{_auto5} \\tag{8}\n", + "\\label{_auto4} \\tag{4}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "a601982d", + "id": "3c0415c5", "metadata": { "editable": true }, @@ -1470,25 +724,25 @@ }, { "cell_type": "markdown", - "id": "6a040ab9", + "id": "b917d6e7", "metadata": { "editable": true }, "source": [ "\n", - "
    \n", + "
    \n", "\n", "$$\n", "\\begin{equation}\n", "e_k = \\frac{2}{n_k} \\sum_{h=1}^{n_k-1}\\left( 1 - \\frac{h}{n_k} \\right)\\gamma_k(h). \n", - "\\label{_auto6} \\tag{9}\n", + "\\label{_auto5} \\tag{5}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "193ba02d", + "id": "8a3a843d", "metadata": { "editable": true }, @@ -1498,7 +752,7 @@ }, { "cell_type": "markdown", - "id": "586b9bb1", + "id": "ff26532f", "metadata": { "editable": true }, @@ -1510,7 +764,7 @@ }, { "cell_type": "markdown", - "id": "2e3905dc", + "id": "53d488ff", "metadata": { "editable": true }, @@ -1522,25 +776,25 @@ }, { "cell_type": "markdown", - "id": "9a78cc1b", + "id": "12616630", "metadata": { "editable": true }, "source": [ "\n", - "
    \n", + "
    \n", "\n", "$$\n", "\\begin{equation} \n", "= \\frac{1}{2}\\left[ (\\hat{X}_j)_1 + (\\hat{X}_j)_2 + \\cdots + (\\hat{X}_j)_{n_j} \\right] = \\underbrace{\\frac{n_j}{2}}_{=n_{j+1}} \\overline{X}_j = n_{j+1}\\overline{X}_j. \n", - "\\label{_auto7} \\tag{10}\n", + "\\label{_auto6} \\tag{6}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "6fd0fad1", + "id": "97fd701f", "metadata": { "editable": true }, @@ -1550,7 +804,7 @@ }, { "cell_type": "markdown", - "id": "b1eb4a9d", + "id": "4bb4df84", "metadata": { "editable": true }, @@ -1560,14 +814,14 @@ "\n", "$$\n", "\\begin{equation}\n", - "V(\\overline{X}) = \\frac{\\sigma_k^2}{n_k} + e_k \\qquad \\text{for all} \\qquad 0 \\leq k \\leq d-1. \\label{eq:convergence} \\tag{11}\n", + "V(\\overline{X}) = \\frac{\\sigma_k^2}{n_k} + e_k \\qquad \\text{for all} \\qquad 0 \\leq k \\leq d-1. \\label{eq:convergence} \\tag{7}\n", "\\end{equation}\n", "$$" ] }, { "cell_type": "markdown", - "id": "25ce5032", + "id": "763e56cf", "metadata": { "editable": true }, @@ -1585,7 +839,7 @@ }, { "cell_type": "markdown", - "id": "dca9731a", + "id": "7df111e3", "metadata": { "editable": true }, @@ -1596,7 +850,7 @@ { "cell_type": "code", "execution_count": 1, - "id": "70f1c821", + "id": "917cbd60", "metadata": { "collapsed": false, "editable": true @@ -1825,7 +1079,7 @@ }, { "cell_type": "markdown", - "id": "0f6b721c", + "id": "d6173775", "metadata": { "editable": true }, @@ -1840,7 +1094,7 @@ { "cell_type": "code", "execution_count": 2, - "id": "6f4a3393", + "id": "ba6dda27", "metadata": { "collapsed": false, "editable": true diff --git a/doc/pub/week9/pdf/week9-beamer.pdf b/doc/pub/week9/pdf/week9-beamer.pdf index f6504635..b5b72c17 100644 Binary files a/doc/pub/week9/pdf/week9-beamer.pdf and b/doc/pub/week9/pdf/week9-beamer.pdf differ diff --git a/doc/pub/week9/pdf/week9.pdf b/doc/pub/week9/pdf/week9.pdf index f9098890..fe2780cd 100644 Binary files a/doc/pub/week9/pdf/week9.pdf and b/doc/pub/week9/pdf/week9.pdf differ