<!DOCTYPE html>
<html>
<head>
<title>Predicting Outcomes With Markov Chains</title>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<link rel="stylesheet" href="css/bootstrap.min.css">
<link rel="stylesheet" href="css/style.css">
</head>
<body>
<div class="container">
<div class="page-header">
<h1 id="title">Predicting Outcomes With Markov Chains <small class="pull-right" id="names">Alex, Bryce, Daniel, Katrina, Tae</small></h1>
</div>
<div class="col-md-4">
<div class="colored-heading rounded-corners">
<h2 class="text-center">INTRODUCTION</h2>
</div>
<ul>
<li>One of the most widely used applications of Linear Algebra</li>
<li>Describes a sequence of possible events in which the probability of each event depends only on the state attained in the previous event</li>
<li>A basic Markov Chain consists of:</li>
<ul>
<li>an initial state x<sub>0</sub>, which serves as the basis for all future states</li>
<li>a stochastic matrix P, an n×n matrix whose columns are probability vectors, where n is the number of states</li>
</ul>
<li>A probability vector is a vector of nonnegative entries that add up to 1</li>
<li>The resulting Markov chain is thus formed where x<sub>1</sub> = Px<sub>0</sub>, x<sub>2</sub> = Px<sub>1</sub>, ..., x<sub>k+1</sub> = Px<sub>k</sub> for k = 0, 1, 2, ... (see the sketch after this list)</li>
<li>Can be used to predict the outcome of a future event</li>
<ul>
<li>Reliable, but not always the best method</li>
</ul>
</ul>
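<p>
As a quick illustration (not part of the original poster), the Python sketch below iterates the recurrence x<sub>k+1</sub> = Px<sub>k</sub>; the 2×2 matrix is a made-up example, chosen only so that its columns are probability vectors.
</p>
<pre><code># Minimal sketch of a Markov chain iteration x_{k+1} = P x_k.
# The matrix below is hypothetical; its columns are probability vectors.
import numpy as np

P = np.array([[0.9, 0.3],
              [0.1, 0.7]])   # column-stochastic: each column sums to 1
x = np.array([1.0, 0.0])     # initial state x0

for k in range(5):
    x = P @ x                # next state in the chain
    print(f"x{k + 1} = {x}")
</code></pre>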
<p>
<strong>Objectives:</strong>
<ol>
<li>Understand the nature of the Markov chain and its stability in different situations</li>
<li>Understand how the probabilities evolve as the chain progresses</li>
</ol>
</p>
<div class="colored-heading rounded-corners">
<h2 class="text-center">METHODOLOGY</h2>
</div>
<h3 class="methodology-subheading text-center">EXAMPLE #1 - THIS WEEK'S FORECAST</h3>
<p>
Suppose that tomorrow’s weather depends only on today’s weather, and not on any previous
day. Given the tendencies of December weather in Long Beach, the following stochastic matrix
was created using data from December 2012 in the Farmer’s Almanac.
</p>
<p><img class="img-rounded img-responsive" src="images/weather_1.png"></p>
<p>Today’s weather (Monday, December 2, 2013), determined by observation:</p>
<p><img class="img-rounded img-responsive" src="images/weather_2.png"></p>
<p>
To predict the weather k days ahead, raise P to the k-th power and multiply by today’s state vector; e.g., one would
calculate x<sub>4</sub> = P<sup>4</sup>x<sub>0</sub> to find Friday (x<sub>4</sub>):</p>
<p><img class="img-rounded img-responsive" src="images/weather_4.png"></p>
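<p>
The matrix values shown in the images above are not reproduced here, so the sketch below uses a hypothetical two-state (sunny/rainy) matrix, an assumption rather than the poster's December 2012 data, purely to demonstrate the x<sub>4</sub> = P<sup>4</sup>x<sub>0</sub> computation.
</p>
<pre><code># Friday forecast as x4 = P^4 x0. The 2x2 matrix is a placeholder,
# NOT the December 2012 Long Beach data from the poster's image.
import numpy as np

P = np.array([[0.8, 0.4],    # column j: tomorrow's distribution given state j today
              [0.2, 0.6]])   # states: (sunny, rainy)
x0 = np.array([1.0, 0.0])    # observed Monday: sunny

x4 = np.linalg.matrix_power(P, 4) @ x0
print(x4)                    # (sunny, rainy) probabilities for Friday
</code></pre>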
</div><!-- .col -->
<div class="col-md-4">
<h3 class="methodology-subheading text-center">EXAMPLE #2 - GAMBLIN' YOUR LUNCH MONEY</h3>
<p>
There exists a generous gambling game: on any turn you have a .48 probability of winning $1
and a .52 probability of losing $1. The game is free to play, but you must hold between $1 and $4
to take a turn. You start with $1, and the game stops when you are either broke (lose) or have $5 (win).
</p>
<p>
<strong>Important questions to answer:</strong>
<ul>
<li>Should you play?</li>
<li>What is the probability you win $5 or go broke?</li>
<li>How do your chances of winning change if you start with more money?</li>
</ul>
</p>
<p>Stochastic Matrix for the generous gambling machine:</p>
<p><img class="img-rounded img-responsive" src="images/stochastic_matrix.png"></p>
<p><strong>What happens when you start with $1?</strong></p>
<p>Starting with $1 is represented as x<sub>0</sub> = [0 1 0 0 0 0]<sup>T</sup>.</p>
<p><img class="img-rounded img-responsive" src="images/one_dollar.png"></p>
<p><strong>What happens when you start with $4?</strong></p>
<p>Starting with $4 is represented as x<sub>0</sub> = [0 0 0 0 1 0]<sup>T</sup>.</p>
<p><img class="img-rounded img-responsive" src="images/four_dollars.png"></p>
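<p>
The transition matrix here is fully determined by the rules above, so the sketch below rebuilds it in Python and estimates the long-run (absorption) distribution for both starting amounts by raising P to a large power; treating 1000 steps as "long run" is our assumption.
</p>
<pre><code># Gambler's-ruin chain over states $0..$5; $0 and $5 are absorbing.
import numpy as np

p, q = 0.48, 0.52
P = np.zeros((6, 6))
P[0, 0] = 1.0                # broke: game over
P[5, 5] = 1.0                # reached $5: game over
for i in range(1, 5):        # from $1..$4 you keep playing
    P[i + 1, i] = p          # win $1 with probability .48
    P[i - 1, i] = q          # lose $1 with probability .52

for start in (1, 4):
    x0 = np.zeros(6)
    x0[start] = 1.0          # x0 = e_start: begin with $1 or $4
    x = np.linalg.matrix_power(P, 1000) @ x0
    print(f"start ${start}: P(broke) = {x[0]:.3f}, P(win $5) = {x[5]:.3f}")
</code></pre>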
</div><!-- .col -->
<div class="col-md-4">
<div class="colored-heading rounded-corners">
<h2 class="text-center">RESULTS & OBSERVATIONS</h2>
</div>
<ul>
<li>Using December 2012 weather data to predict a 5-day forecast for a week in December 2013 was not accurate</li>
<ul>
<li>Trying to predict weather from past events or states is, at best, flawed conjecture</li>
<li>The weather in December 2013 is not necessarily the same type of weather as in December 2012</li>
</ul>
<li>One should only play the "generous" gambling game if the starting stake is high</li>
</ul>
<p><img class="img-rounded img-responsive" src="images/gambling_state_diagram.png"></p>
<p><strong>Although the odds of winning seem appealing, playing is not worth it unless you "go big or go home", as the check below shows.</strong></p>
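<p>
As a check on this claim, the snippet below evaluates the classical gambler's-ruin win probability P(win | start $i) = (1 - r<sup>i</sup>)/(1 - r<sup>5</sup>) with r = q/p; the closed form is standard theory rather than something stated on the poster, but it agrees with the matrix computation in Example #2.
</p>
<pre><code># Win probability (reach $5 before $0) for each starting amount,
# via the standard gambler's-ruin closed form with p = .48, q = .52.
p, q = 0.48, 0.52
r = q / p
for i in range(1, 5):
    win = (1 - r**i) / (1 - r**5)
    print(f"start ${i}: P(win $5) = {win:.3f}")
</code></pre>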
<div class="colored-heading rounded-corners">
<h2 class="text-center">CONCLUSION</h2>
</div>
<p>
<ul>
<li>Markov chains are good for predicting events like gambling outcomes, but not weather</li>
<ul>
<li>Weather has too many variables</li>
<li>Gambling machines and games usually have set probabilities</li>
</ul>
<li>Using a Markov chain for weather is only reliable for short-term prediction</li>
<li>Using a Markov chain for gambling shows the following:</li>
<ul>
<li>The more money you start with, the higher your probability of winning</li>
<li>The less money you start with, the lower your probability of winning</li>
</ul>
</ul>
<li>Ultimately, the accuracy of the Markov Chain depends on the consistency of the system</li>
</ul>
</p>
<div class="colored-heading rounded-corners">
<h2 class="text-center">SUMMARY</h2>
</div>
<ul>
<li>A stochastic matrix is a square matrix whose columns are probability vectors based on previous data</li>
<li>A Markov chain is a sequence of probability vectors used to predict possible outcomes, such that x<sub>k+1</sub> = Px<sub>k</sub> for k = 0, 1, 2, ...</li>
<li>Overall, Markov chains can estimate future outcomes in business, chemistry, engineering, physics, and elsewhere</li>
</ul>
<div class="colored-heading rounded-corners">
<h2 class="text-center">SOURCES</h2>
</div>
<ul class="list-unstyled">
<li>Lay, David C., Linear Algebra and Its Applications, 4th ed., Pearson, 2012</li>
<li>
<a href="http://www.farmersalmanac.com/weather-history/search-results/">Farmer's Almanac</a>
</li>
<li>
<a href="http://www.bandgap.cs.rice.edu/classes/comp140/f08/Module%206/Markovchainsandprediction.pdf">Markov Chains and Prediction - Devika Subramanian</a>
</li>
<li>
<a href="http://www.americanscientist.org/include/popup_fullImage.aspx?key=NBukCW/BNunUamTp2cJe3afjG72GEDXaDHer5fzRR0lgqj2hg9Qtsw==">Markov Chain Illustration - American Scientist</a>
</li>
<li>
<a href="http://www.iasri.res.in/ebook/FET/Chap%2013_Markov%20chain_lecture_Ranjit.pdf">Forecasting Using Markov Chain - Ranjit Kumar Paul</a>
</li>
<li>
<a href="http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter11.pdf">Markov Chains</a>
</li>
<li>
<a href="http://courses.washington.edu/inde411/MarkovChains.pdf">Markov Chains Lecture - Zelda Zabinsky</a>
</li>
</ul>
</div><!-- .col -->
</div><!-- .container -->
</body>
</html>