
Commit bb0468e

author
buruzaemon
committed
Starting work on Lecture 19
1 parent 7492077 commit bb0468e

7 files changed: +275 −16 lines changed

.gitignore

100644 → 100755
File mode changed.

LICENSE

100644 → 100755
File mode changed.

Lecture_18.ipynb

+155 −16
@@ -31,7 +31,7 @@
3131
"* $M'''(0) = \\mathbb{E}(X^3)$\n",
3232
"* ... and so on ...\n",
3333
"\n",
34-
"But even though finding derivatives of $\\frac{1}{1-t}$ is not all that bad, it is nevertheless annoying busywork. But since we know that the $n^{th}$ moment is the coefficient of the $n^{th}$ term of the Taylor Expansion of $X$, we can leverage that fact instead.\n",
34+
"Even though finding derivatives of $\\frac{1}{1-t}$ is not all that bad, it is nevertheless annoying busywork. But since we know that the $n^{th}$ moment is the coefficient of the $n^{th}$ term of the Taylor Expansion of $X$, we can leverage that fact instead.\n",
3535
"\n",
3636
"\\begin{align}\n",
3737
" \\frac{1}{1-t} &= \\sum_{n=0}^{\\infty} t^n &\\quad \\text{ for } |t| < 1 \\\\ \n",
@@ -123,9 +123,9 @@
123123
"collapsed": true
124124
},
125125
"source": [
126-
"## MGF for $\\mathcal{Pois}(\\lambda)$\n",
126+
"## MGF for $Pois(\\lambda)$\n",
127127
"\n",
128-
"Let $X \\sim \\mathcal{Pois}(\\lambda)$; now let's consider MGFs and how to use them to find sums of random variables (convolutions).\n",
128+
"Let $X \\sim Pois(\\lambda)$; now let's consider MGFs and how to use them to find sums of random variables (convolutions).\n",
129129
"\n",
130130
"\\begin{align}\n",
131131
" M(t) &= \\mathbb{E}(e^{tX}) \\\\\n",
@@ -136,7 +136,7 @@
136136
"\\end{align}\n",
137137
"\n",
138138
"\n",
139-
"Now let's let $Y \\sim \\mathcal{Pois}(\\mu)$, and it is independent of $X$. Find the distribution of $(X + Y)$.\n",
139+
"Now let's let $Y \\sim Pois(\\mu)$, and it is independent of $X$. Find the distribution of $(X + Y)$.\n",
140140
"\n",
141141
"You may recall that with MGFs, all we need to do is _multiply_ the MGFs.\n",
142142
"\n",
@@ -159,28 +159,167 @@
159159
"cell_type": "markdown",
160160
"metadata": {},
161161
"source": [
162-
"## Joint Distribution\n",
162+
"### Some definitions\n",
163163
"\n",
164-
"Let $X, Y$ be both Bernoulli.\n",
164+
"In the most basic case of two r.v. in a joint distribution, consider both r.v.'s _together_:\n",
165+
"\n",
166+
"> **Joint CDF**\n",
167+
">\n",
168+
"> In the general case, the joint CDF fof two r.v.'s is $ F(x,y) = P(X \\le x, Y \\le y)$\n",
169+
"\n",
170+
"<p/> \n",
171+
"\n",
172+
"> **Joint PDF**\n",
173+
"> $f(x, y)$ such that, in the *continuous* case $P((X,Y) \\in B) = \\iint_B f(x,y)\\,dx\\,dy$\n",
174+
"\n",
175+
"#### Joint PMF\n",
176+
"\n",
177+
"$f(x, y)$ such that, in the *discrete* case\n",
178+
"\n",
179+
"\\begin{align}\n",
180+
" P(X=x, Y=y)\n",
181+
"\\end{align}\n",
182+
"\n",
183+
"We also can consider a single r.v. of a joint distribution:\n",
184+
"\n",
185+
"#### Independence and Joint Distributions\n",
186+
"\n",
187+
"$X, Y$ are independent iff $F(x,y) = F_X(x) \\, F_Y(y)$. \n",
188+
"\n",
189+
"\\begin{align}\n",
190+
" P(X=x, Y=y) &= P(X=x) \\, P(Y=y) &\\quad \\text{discrete case} \\\\\n",
191+
" \\\\\\\\\n",
192+
" f(x, y) &= f_X(x) \\, f_Y(y) &\\quad \\text{continuous case}\n",
193+
"\\end{align}\n",
194+
"\n",
195+
"... with the caveat that this must be so for *all* $\\text{x, y} \\in \\mathbb{R}$\n",
196+
"\n",
197+
"#### Marginals (and how to get them)\n",
198+
"\n",
199+
"$P(X \\le x)$ is the *marginal distribution* of $X$, where we consider one r.v. at a time.\n",
200+
"\n",
201+
"In the case of a two-r.v. joint distribution, we can get the marginals by using the joint distribution itself:\n",
202+
"\n",
203+
"\\begin{align}\n",
204+
" P(X=x) &= \\sum_y P(X=x, Y=y) &\\quad \\text{marginal PMF, discrete case, for } x \\\\\n",
205+
" \\\\\\\\\n",
206+
" f_Y(y) &= \\int_{-\\infty}^{\\infty} f_{(X,Y)}(x,y) \\, dx &\\quad \\text{marginal PDF, continuous case, for } y\n",
207+
"\\end{align}"
208+
]
209+
},
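To make the marginal formulas concrete, here is a small sketch (assuming `numpy`; the joint PMF values are made up for illustration) that recovers both marginal PMFs from a joint PMF stored as a 2-D array:

```python
# Sketch: marginal PMFs from a joint PMF table (hypothetical values),
# assuming numpy is available.
import numpy as np

# joint[i, j] = P(X = x_i, Y = y_j); rows index X, columns index Y.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.15, 0.25, 0.20]])

assert np.isclose(joint.sum(), 1.0)   # a valid joint PMF sums to 1

p_X = joint.sum(axis=1)   # sum over y: marginal PMF of X
p_Y = joint.sum(axis=0)   # sum over x: marginal PMF of Y
print(p_X, p_Y)
```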
210+
{
211+
"cell_type": "markdown",
212+
"metadata": {},
213+
"source": [
214+
"## Example: Discrete Joint Distribution\n",
215+
"\n",
216+
"Let $X, Y$ be both Bernoulli. $X$ and $Y$ may be independent; or they might be dependent. They may or may not have the same $p$. But they are both related in the form of a *joint distribution*.\n",
217+
"\n",
218+
"We can lay out this joint distribution in a $2 \\times 2$ contigency table like below:\n",
165219
"\n",
166220
"\n",
167221
"| | $Y=0$ | $Y=1$ |\n",
168-
"|-------|-------|-------|\n",
169-
"| $X=0$ | | |\n",
170-
"| $X=1$ | | |\n",
171-
"-------------------------\n",
222+
"|-------|:-----:|:-----:|\n",
223+
"| $X=0$ | 2/6 | 1/6 |\n",
224+
"| $X=1$ | 2/6 | 1/6 |\n",
172225
"\n",
173-
"#### Joint CDF\n",
226+
"In order to be a joint distribution, all of the values in our contigency table must be positive; and they must all sum up to 1. The example above shows such a PMF.\n",
174227
"\n",
175-
"$F(x,y) = P(X \\le x, Y \\le y)$\n",
228+
"Let's add the marginals for $X$ and $Y$ to our $2 \\times 2$ contigency table:\n",
176229
"\n",
177-
"#### Joint PMF\n",
178230
"\n",
179-
"$P(X=x, Y=y)$ in the discrete case\n",
231+
"| | $Y=0$ | $Y=1$ | ... |\n",
232+
"|:-----:|:-----:|:-----:|:-----:|\n",
233+
"| $X=0$ | 2/6 | 1/6 | 3/6 |\n",
234+
"| $X=1$ | 2/6 | 1/6 | 3/6 |\n",
235+
"| ... | 4/6 | 2/6 | |\n",
236+
"\n",
237+
"\n",
238+
"Observe how in our example, we have:\n",
239+
"\n",
240+
"\\begin{align}\n",
241+
" P(X=0,Y=0) &= P(X=0) \\, P(Y=0) \\\\\n",
242+
" &= 3/6 \\times 4/6 = 12/36 &= \\boxed{2/6} \\\\\n",
243+
" \\\\\n",
244+
" P(X=0,Y=1) &= P(X=0) \\, P(Y=1) \\\\\n",
245+
" &= 3/6 \\times 2/6 = 6/36 &= \\boxed{1/6} \\\\\n",
246+
" P(X=1,Y=0) &= P(X=1) \\, P(Y=0) \\\\\n",
247+
" &= 3/6 \\times 4/6 = 12/36 &= \\boxed{2/6} \\\\\n",
248+
" \\\\\n",
249+
" P(X=1,Y=1) &= P(X=1) \\, P(Y=1) \\\\\n",
250+
" &= 3/6 \\times 2/6 = 6/36 &= \\boxed{1/6} \\\\\n",
251+
"\\end{align}\n",
252+
"\n",
253+
"and so you can see that $X$ and $Y$ are independent.\n",
180254
"\n",
181-
"#### Marginal CDF\n",
255+
"Now here's an example of a two r.v. joint distribution where $X$ and $Y$ are _dependent_; check it out for yourself.\n",
182256
"\n",
183-
"$P(X \\le x)$ is the *marginal distribution* of $X$ "
257+
"| | $Y=0$ | $Y=1$ |\n",
258+
"|:-----:|:-----:|:-----:|\n",
259+
"| $X=0$ | 1/3 | 0 |\n",
260+
"| $X=1$ | 1/3 | 1/3 |"
261+
]
262+
},
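Both tables above can be checked programmatically (a sketch assuming `numpy`): $X$ and $Y$ are independent exactly when the joint PMF equals the outer product of its marginals, i.e. the outer product of the row sums and the column sums.

```python
# Sketch: independence check for the two contingency tables above,
# assuming numpy is available.
import numpy as np

indep = np.array([[2, 1],
                  [2, 1]]) / 6   # first table
dep   = np.array([[1, 0],
                  [1, 1]]) / 3   # second table

for name, joint in [("first table", indep), ("second table", dep)]:
    p_X = joint.sum(axis=1)      # marginal of X (row sums)
    p_Y = joint.sum(axis=0)      # marginal of Y (column sums)
    independent = np.allclose(joint, np.outer(p_X, p_Y))
    print(name, "-> independent:", independent)
```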
263+
{
264+
"cell_type": "markdown",
265+
"metadata": {},
266+
"source": [
267+
"## Example: Continuous Joint Distribution\n",
268+
"\n",
269+
"Now say we had Uniform distributions on a square such that $x,y \\in [0,1]$. \n",
270+
"\n",
271+
"The joint PDF would be constant on/within the square; and 0 outside.\n",
272+
"\n",
273+
"\\begin{align}\n",
274+
" \\text{joint PDF} &=\n",
275+
" \\begin{cases}\n",
276+
" c &\\quad \\text{if } 0 \\le x \\le 1 \\text{, } 0 \\le y \\le 1 \\\\\n",
277+
" \\\\\n",
278+
" 0 &\\quad \\text{otherwise}\n",
279+
" \\end{cases}\n",
280+
"\\end{align}\n",
281+
"\n",
282+
"![title](images/L1801.png)\n",
283+
"\n",
284+
"In 1-dimension space, if you integrate $1$ over some interval you get the _length_ of that interval.\n",
285+
"\n",
286+
"In 2-dimension space, if you integrate $1$ over some region, you get the _area_ of that region.\n",
287+
"\n",
288+
"Normalizing $c$, we know that $c = \\frac{1}{area} = 1$.\n",
289+
"\n",
290+
"Marginally, $X$ and $Y$ are independent $\\mathcal{Unif}(1)$."
291+
]
292+
},
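A small simulation sketch of this (assuming `numpy`; not part of the notebook): sample points uniformly on the unit square and confirm that each coordinate looks $Unif(0,1)$ and that the coordinates are uncorrelated.

```python
# Sketch: uniform points on the unit square have Unif(0,1) marginals,
# and the coordinates are independent; assumes numpy is available.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 1.0, size=(100_000, 2))
x, y = pts[:, 0], pts[:, 1]

# Unif(0,1) has mean 1/2 and variance 1/12 (about 0.0833).
print(x.mean(), x.var())

# Independence implies zero correlation (the converse does not hold in general).
print(np.corrcoef(x, y)[0, 1])
```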
293+
{
294+
"cell_type": "markdown",
295+
"metadata": {},
296+
"source": [
297+
"## Example: Dependent, Continuous Joint Distribution\n",
298+
"\n",
299+
"Now say we had Uniform distributions on a _disc_ such that $x^2 + y^2 \\le 1$. \n",
300+
"\n",
301+
"\n",
302+
"#### Joint PDF\n",
303+
"\n",
304+
"In this case, the joint PDF is $Unif$ over the area of a disc centered at the origin with radius 1.\n",
305+
"\n",
306+
"\\begin{align}\n",
307+
" \\text{joint PDF} &=\n",
308+
" \\begin{cases}\n",
309+
" \\frac{1}{\\pi} &\\quad \\text{if } x^2 + y^2 \\le 1 \\\\\n",
310+
" \\\\\n",
311+
" 0 &\\quad \\text{otherwise}\n",
312+
" \\end{cases}\n",
313+
"\\end{align}\n",
314+
"\n",
315+
"#### Marginal PDF"
316+
]
317+
},
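As a sketch of what this marginal works out to (the standard computation for a uniform density on the unit disc; the figure below presumably plots it), integrate out $y$ over the chord at a fixed $x$:

\begin{align}
  f_X(x) &= \int_{-\sqrt{1-x^2}}^{\sqrt{1-x^2}} \frac{1}{\pi} \, dy = \frac{2}{\pi} \sqrt{1-x^2} &\quad \text{for } -1 \le x \le 1
\end{align}

Note that this is not a Uniform distribution, and since the range of $Y$ depends on $x$, $X$ and $Y$ are dependent here.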
318+
{
319+
"cell_type": "markdown",
320+
"metadata": {},
321+
"source": [
322+
"![title](images/L1802.png)"
184323
]
185324
},
186325
{

Lecture_19.ipynb

+120
@@ -0,0 +1,120 @@
1+
{
2+
"cells": [
3+
{
4+
"cell_type": "markdown",
5+
"metadata": {},
6+
"source": [
7+
"# Joint, Conditional and Marginal Distributions; 2D LOTUS; Expected Distance between Uniforms; Chicken-Egg Problem"
8+
]
9+
},
10+
{
11+
"cell_type": "markdown",
12+
"metadata": {},
13+
"source": [
14+
"## Joint, Conditional and Marginal Distributions\n",
15+
"\n",
16+
"### Joint CDF\n",
17+
"A joint CDF is simply where we are dealing with multiple random variables. As an example, a case where we have two random variables $X, Y$, the joint CDF of two random variables $X, Y$ can be expressed as:\n",
18+
"\n",
19+
"\\begin{align}\n",
20+
" F(x,y) &= P(X \\le x, Y \\le y)\n",
21+
"\\end{align}\n",
22+
"\n",
23+
"Note that the random variables may be discrete, continuous, or a mixture of both.\n",
24+
"\n",
25+
"### Joint PDF\n",
26+
"The joint PDF, in the case of _continuous_ random variables, is what you would _integrate_ to get the joint CDF. Continuing with our example of two (continous) random variables, we have:\n",
27+
"\n",
28+
"\\begin{align}\n",
29+
" f(x,y) &= \\frac{\\partial^2}{\\partial{x}\\partial{y}} F(x,y)\n",
30+
"\\end{align}\n",
31+
"\n",
32+
"Conversely, if we want to know the probability of $X,Y$ in some set $A$, we _integrate_ the density to get that probability.\n",
33+
"\n",
34+
"\\begin{align}\n",
35+
" P\\left((X,Y) \\in A\\right) &= \\iint\\limits_{A} f(x,y) \\, dxdy\n",
36+
"\\end{align}\n",
37+
"\n",
38+
"Integrate holding on variable constant, and then do the other. The key thing is to be sure to get the _limits of integration_ correct.\n",
39+
"\n",
40+
"### Marginal PDF\n",
41+
"The _marginal PDF of $X$_ is obtained by _integrating out the $Y$_. Recall the $X,Y$ contigency table and the definition of marginal.\n",
42+
"\n",
43+
"\\begin{align}\n",
44+
" \\int_{-\\infty}^{\\infty} f(x,y) \\, dy\n",
45+
"\\end{align}\n",
46+
"\n",
47+
"Notice that by keeping $X$ constant and _integrating over all $Y$_, the marginal PDF of $X$ no longer depends on $Y$.\n",
48+
"\n",
49+
"And we can do vice-versa for the marginal PDF of $Y$, but keeping $Y$ constant and _integrating over all $X$_.\n",
50+
"\n",
51+
"Do not forget that taking the marginal PDF of $X$ and then integrating over all $X$, we should get 1.0.\n",
52+
"\n",
53+
"\\begin{align}\n",
54+
" \\int\\limits_{-\\infty}^{\\infty} \\int\\limits_{-\\infty}^{\\infty} f(x,y) \\, dx dy &= 1.0\n",
55+
"\\end{align}\n",
56+
"\n",
57+
"### Conditional PDF\n",
58+
"\n",
59+
"Given that we know $X$, what is the appropriate PDF for $Y$?\n",
60+
"\n",
61+
"Well, we can apply what we know about _conditional probability_ to get a conditional PDF.\n",
62+
"\n",
63+
"\\begin{align}\n",
64+
" f_{Y|X} (y|x) &= \\frac{f_{XY}(x,y)}{f_{X}(x)} \\\\\n",
65+
" &= \\frac{f_{X|Y}(x,y) \\, f_{Y}(y)}{f_{X}(x)}\n",
66+
"\\end{align}\n",
67+
"\n",
68+
"This is completely analogous to conditional probability. \n",
69+
"\n",
70+
"### Indepence\n",
71+
"\n",
72+
"$X,Y$ are independent if\n",
73+
"\n",
74+
"\\begin{align}\n",
75+
" f_{X,Y}(x,y) &= f_{X}(x) \\, f_{Y}(y)\n",
76+
"\\end{align}\n",
77+
"\n",
78+
"for _all_ $x,y$, in the case of continous PDFs. \n",
79+
"\n",
80+
"See how we can say the same thing using CDFs:\n",
81+
"\n",
82+
"\\begin{align}\n",
83+
" F(x,y) &= F(x) \\, F(y)\n",
84+
"\\end{align}\n",
85+
"\n",
86+
"for _all_ $x,y$. "
87+
]
88+
},
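To tie the marginal and conditional PDF formulas together, here is a hypothetical worked example (not from the lecture; it assumes `scipy` and uses the made-up joint PDF $f(x,y) = x + y$ on the unit square):

```python
# Sketch: marginal and conditional PDFs by numerical integration for the
# hypothetical joint PDF f(x, y) = x + y on the unit square; assumes scipy.
from scipy import integrate

f = lambda y, x: x + y   # dblquad integrates func(y, x), inner variable first

# The joint PDF integrates to 1 over the unit square.
total, _ = integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
print(total)   # ~ 1.0

# Marginal of X: integrate out y.  Analytically, f_X(x) = x + 1/2.
f_X = lambda x: integrate.quad(lambda y: x + y, 0, 1)[0]
print(f_X(0.3))   # ~ 0.8

# Conditional PDF of Y given X = x0: f_{Y|X}(y|x0) = f(x0, y) / f_X(x0).
x0 = 0.3
f_Y_given_X = lambda y: (x0 + y) / f_X(x0)
print(integrate.quad(f_Y_given_X, 0, 1)[0])   # ~ 1.0, so it is a valid density in y
```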
89+
{
90+
"cell_type": "code",
91+
"execution_count": null,
92+
"metadata": {
93+
"collapsed": true
94+
},
95+
"outputs": [],
96+
"source": []
97+
}
98+
],
99+
"metadata": {
100+
"kernelspec": {
101+
"display_name": "Python 3",
102+
"language": "python",
103+
"name": "python3"
104+
},
105+
"language_info": {
106+
"codemirror_mode": {
107+
"name": "ipython",
108+
"version": 3
109+
},
110+
"file_extension": ".py",
111+
"mimetype": "text/x-python",
112+
"name": "python",
113+
"nbconvert_exporter": "python",
114+
"pygments_lexer": "ipython3",
115+
"version": "3.5.1"
116+
}
117+
},
118+
"nbformat": 4,
119+
"nbformat_minor": 0
120+
}

README.md

100644 → 100755
File mode changed.

images/L1801.png

16 KB

images/L1802.png

17.8 KB
