|
31 | 31 | "* $M'''(0) = \\mathbb{E}(X^3)$\n",
|
32 | 32 | "* ... and so on ...\n",
|
33 | 33 | "\n",
|
34 |
| - "But even though finding derivatives of $\\frac{1}{1-t}$ is not all that bad, it is nevertheless annoying busywork. But since we know that the $n^{th}$ moment is the coefficient of the $n^{th}$ term of the Taylor Expansion of $X$, we can leverage that fact instead.\n", |
| 34 | + "Even though finding derivatives of $\\frac{1}{1-t}$ is not all that bad, it is nevertheless annoying busywork. But since we know that the $n^{th}$ moment is the coefficient of the $n^{th}$ term of the Taylor Expansion of $X$, we can leverage that fact instead.\n", |
35 | 35 | "\n",
|
36 | 36 | "\\begin{align}\n",
|
37 | 37 | " \\frac{1}{1-t} &= \\sum_{n=0}^{\\infty} t^n &\\quad \\text{ for } |t| < 1 \\\\ \n",
|
|
123 | 123 | "collapsed": true
|
124 | 124 | },
|
125 | 125 | "source": [
|
126 |
| - "## MGF for $\\mathcal{Pois}(\\lambda)$\n", |
| 126 | + "## MGF for $Pois(\\lambda)$\n", |
127 | 127 | "\n",
|
128 |
| - "Let $X \\sim \\mathcal{Pois}(\\lambda)$; now let's consider MGFs and how to use them to find sums of random variables (convolutions).\n", |
| 128 | + "Let $X \\sim Pois(\\lambda)$; now let's consider MGFs and how to use them to find sums of random variables (convolutions).\n", |
129 | 129 | "\n",
|
130 | 130 | "\\begin{align}\n",
|
131 | 131 | " M(t) &= \\mathbb{E}(e^{tX}) \\\\\n",
|
|
136 | 136 | "\\end{align}\n",
|
137 | 137 | "\n",
|
138 | 138 | "\n",
|
139 |
| - "Now let's let $Y \\sim \\mathcal{Pois}(\\mu)$, and it is independent of $X$. Find the distribution of $(X + Y)$.\n", |
| 139 | + "Now let's let $Y \\sim Pois(\\mu)$, and it is independent of $X$. Find the distribution of $(X + Y)$.\n", |
140 | 140 | "\n",
|
141 | 141 | "You may recall that with MGFs, all we need to do is _multiply_ the MGFs.\n",
|
142 | 142 | "\n",
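|  | + "(Why multiplying works: independence lets the expectation factor.)\n", |
|  | + "\n", |
|  | + "\\begin{align}\n", |
|  | + "    M_{X+Y}(t) &= \\mathbb{E}(e^{t(X+Y)}) \\\\\n", |
|  | + "    &= \\mathbb{E}(e^{tX} \\, e^{tY}) \\\\\n", |
|  | + "    &= \\mathbb{E}(e^{tX}) \\, \\mathbb{E}(e^{tY}) &\\quad \\text{by independence} \\\\\n", |
|  | + "    &= M_X(t) \\, M_Y(t)\n", |
|  | + "\\end{align}\n", |
|  | + "\n", |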
|
|
159 | 159 | "cell_type": "markdown",
|
160 | 160 | "metadata": {},
|
161 | 161 | "source": [
|
162 |
| - "## Joint Distribution\n", |
| 162 | + "### Some definitions\n", |
163 | 163 | "\n",
|
164 |
| - "Let $X, Y$ be both Bernoulli.\n", |
| 164 | + "In the most basic case of two r.v. in a joint distribution, consider both r.v.'s _together_:\n", |
| 165 | + "\n", |
| 166 | + "> **Joint CDF**\n", |
| 167 | + ">\n", |
| 168 | + "> In the general case, the joint CDF fof two r.v.'s is $ F(x,y) = P(X \\le x, Y \\le y)$\n", |
| 169 | + "\n", |
| 170 | + "<p/> \n", |
| 171 | + "\n", |
| 172 | + "> **Joint PDF**\n", |
| 173 | + "> $f(x, y)$ such that, in the *continuous* case $P((X,Y) \\in B) = \\iint_B f(x,y)\\,dx\\,dy$\n", |
| 174 | + "\n", |
| 175 | + "#### Joint PMF\n", |
| 176 | + "\n", |
| 177 | + "$f(x, y)$ such that, in the *discrete* case\n", |
| 178 | + "\n", |
| 179 | + "\\begin{align}\n", |
| 180 | + " P(X=x, Y=y)\n", |
| 181 | + "\\end{align}\n", |
| 182 | + "\n", |
| 183 | + "We also can consider a single r.v. of a joint distribution:\n", |
| 184 | + "\n", |
| 185 | + "#### Independence and Joint Distributions\n", |
| 186 | + "\n", |
| 187 | + "$X, Y$ are independent iff $F(x,y) = F_X(x) \\, F_Y(y)$. \n", |
| 188 | + "\n", |
| 189 | + "\\begin{align}\n", |
| 190 | + " P(X=x, Y=y) &= P(X=x) \\, P(Y=y) &\\quad \\text{discrete case} \\\\\n", |
| 191 | + " \\\\\\\\\n", |
| 192 | + " f(x, y) &= f_X(x) \\, f_Y(y) &\\quad \\text{continuous case}\n", |
| 193 | + "\\end{align}\n", |
| 194 | + "\n", |
| 195 | + "... with the caveat that this must be so for *all* $\\text{x, y} \\in \\mathbb{R}$\n", |
| 196 | + "\n", |
| 197 | + "#### Marginals (and how to get them)\n", |
| 198 | + "\n", |
| 199 | + "$P(X \\le x)$ is the *marginal distribution* of $X$, where we consider one r.v. at a time.\n", |
| 200 | + "\n", |
| 201 | + "In the case of a two-r.v. joint distribution, we can get the marginals by using the joint distribution itself:\n", |
| 202 | + "\n", |
| 203 | + "\\begin{align}\n", |
| 204 | + " P(X=x) &= \\sum_y P(X=x, Y=y) &\\quad \\text{marginal PMF, discrete case, for } x \\\\\n", |
| 205 | + " \\\\\\\\\n", |
| 206 | + " f_Y(y) &= \\int_{-\\infty}^{\\infty} f_{(X,Y)}(x,y) \\, dx &\\quad \\text{marginal PDF, continuous case, for } y\n", |
| 207 | + "\\end{align}" |
| 208 | + ] |
| 209 | + }, |
| 210 | + { |
| 211 | + "cell_type": "markdown", |
| 212 | + "metadata": {}, |
| 213 | + "source": [ |
| 214 | + "## Example: Discrete Joint Distribution\n", |
| 215 | + "\n", |
| 216 | + "Let $X, Y$ be both Bernoulli. $X$ and $Y$ may be independent; or they might be dependent. They may or may not have the same $p$. But they are both related in the form of a *joint distribution*.\n", |
| 217 | + "\n", |
| 218 | + "We can lay out this joint distribution in a $2 \\times 2$ contigency table like below:\n", |
165 | 219 | "\n",
|
166 | 220 | "\n",
|
167 | 221 | "| | $Y=0$ | $Y=1$ |\n",
|
168 |
| - "|-------|-------|-------|\n", |
169 |
| - "| $X=0$ | | |\n", |
170 |
| - "| $X=1$ | | |\n", |
171 |
| - "-------------------------\n", |
| 222 | + "|-------|:-----:|:-----:|\n", |
| 223 | + "| $X=0$ | 2/6 | 1/6 |\n", |
| 224 | + "| $X=1$ | 2/6 | 1/6 |\n", |
172 | 225 | "\n",
|
173 |
| - "#### Joint CDF\n", |
| 226 | + "In order to be a joint distribution, all of the values in our contigency table must be positive; and they must all sum up to 1. The example above shows such a PMF.\n", |
174 | 227 | "\n",
|
175 |
| - "$F(x,y) = P(X \\le x, Y \\le y)$\n", |
| 228 | + "Let's add the marginals for $X$ and $Y$ to our $2 \\times 2$ contigency table:\n", |
176 | 229 | "\n",
|
177 |
| - "#### Joint PMF\n", |
178 | 230 | "\n",
|
179 |
| - "$P(X=x, Y=y)$ in the discrete case\n", |
| 231 | + "| | $Y=0$ | $Y=1$ | ... |\n", |
| 232 | + "|:-----:|:-----:|:-----:|:-----:|\n", |
| 233 | + "| $X=0$ | 2/6 | 1/6 | 3/6 |\n", |
| 234 | + "| $X=1$ | 2/6 | 1/6 | 3/6 |\n", |
| 235 | + "| ... | 4/6 | 2/6 | |\n", |
| 236 | + "\n", |
| 237 | + "\n", |
| 238 | + "Observe how in our example, we have:\n", |
| 239 | + "\n", |
| 240 | + "\\begin{align}\n", |
| 241 | + " P(X=0,Y=0) &= P(X=0) \\, P(Y=0) \\\\\n", |
| 242 | + " &= 3/6 \\times 4/6 = 12/36 &= \\boxed{2/6} \\\\\n", |
| 243 | + " \\\\\n", |
| 244 | + " P(X=0,Y=1) &= P(X=0) \\, P(Y=1) \\\\\n", |
| 245 | + " &= 3/6 \\times 2/6 = 6/36 &= \\boxed{1/6} \\\\\n", |
| 246 | + " P(X=1,Y=0) &= P(X=1) \\, P(Y=0) \\\\\n", |
| 247 | + " &= 3/6 \\times 4/6 = 12/36 &= \\boxed{2/6} \\\\\n", |
| 248 | + " \\\\\n", |
| 249 | + " P(X=1,Y=1) &= P(X=1) \\, P(Y=1) \\\\\n", |
| 250 | + " &= 3/6 \\times 2/6 = 6/36 &= \\boxed{1/6} \\\\\n", |
| 251 | + "\\end{align}\n", |
| 252 | + "\n", |
| 253 | + "and so you can see that $X$ and $Y$ are independent.\n", |
180 | 254 | "\n",
|
181 |
| - "#### Marginal CDF\n", |
| 255 | + "Now here's an example of a two r.v. joint distribution where $X$ and $Y$ are _dependent_; check it out for yourself.\n", |
182 | 256 | "\n",
|
183 |
| - "$P(X \\le x)$ is the *marginal distribution* of $X$ " |
| 257 | + "| | $Y=0$ | $Y=1$ |\n", |
| 258 | + "|:-----:|:-----:|:-----:|\n", |
| 259 | + "| $X=0$ | 1/3 | 0 |\n", |
| 260 | + "| $X=1$ | 1/3 | 1/3 |" |
| 261 | + ] |
| 262 | + }, |
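|  | + { |
|  | + "cell_type": "markdown", |
|  | + "metadata": {}, |
|  | + "source": [ |
|  | + "A quick numerical check of the two tables above (a minimal sketch, assuming `numpy` is available; not part of the original notes): compute the marginals by summing over the joint PMF, then test whether the joint factors into their outer product." |
|  | + ] |
|  | + }, |
|  | + { |
|  | + "cell_type": "code", |
|  | + "execution_count": null, |
|  | + "metadata": {}, |
|  | + "outputs": [], |
|  | + "source": [ |
|  | + "import numpy as np\n", |
|  | + "\n", |
|  | + "# joint PMF from the first table (rows: X=0,1; columns: Y=0,1)\n", |
|  | + "joint = np.array([[2/6, 1/6],\n", |
|  | + "                  [2/6, 1/6]])\n", |
|  | + "\n", |
|  | + "# marginals: sum out the other r.v.\n", |
|  | + "p_x = joint.sum(axis=1)   # P(X=x) = [3/6, 3/6]\n", |
|  | + "p_y = joint.sum(axis=0)   # P(Y=y) = [4/6, 2/6]\n", |
|  | + "\n", |
|  | + "# independent iff joint == outer product of the marginals, everywhere\n", |
|  | + "print(np.allclose(joint, np.outer(p_x, p_y)))   # True\n", |
|  | + "\n", |
|  | + "# the dependent example fails the same test\n", |
|  | + "dep = np.array([[1/3, 0  ],\n", |
|  | + "                [1/3, 1/3]])\n", |
|  | + "print(np.allclose(dep, np.outer(dep.sum(axis=1), dep.sum(axis=0))))   # False\n" |
|  | + ] |
|  | + }, |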
| 263 | + { |
| 264 | + "cell_type": "markdown", |
| 265 | + "metadata": {}, |
| 266 | + "source": [ |
| 267 | + "## Example: Continuous Joint Distribution\n", |
| 268 | + "\n", |
| 269 | + "Now say we had Uniform distributions on a square such that $x,y \\in [0,1]$. \n", |
| 270 | + "\n", |
| 271 | + "The joint PDF would be constant on/within the square; and 0 outside.\n", |
| 272 | + "\n", |
| 273 | + "\\begin{align}\n", |
| 274 | + " \\text{joint PDF} &=\n", |
| 275 | + " \\begin{cases}\n", |
| 276 | + " c &\\quad \\text{if } 0 \\le x \\le 1 \\text{, } 0 \\le y \\le 1 \\\\\n", |
| 277 | + " \\\\\n", |
| 278 | + " 0 &\\quad \\text{otherwise}\n", |
| 279 | + " \\end{cases}\n", |
| 280 | + "\\end{align}\n", |
| 281 | + "\n", |
| 282 | + "\n", |
| 283 | + "\n", |
| 284 | + "In 1-dimension space, if you integrate $1$ over some interval you get the _length_ of that interval.\n", |
| 285 | + "\n", |
| 286 | + "In 2-dimension space, if you integrate $1$ over some region, you get the _area_ of that region.\n", |
| 287 | + "\n", |
| 288 | + "Normalizing $c$, we know that $c = \\frac{1}{area} = 1$.\n", |
| 289 | + "\n", |
| 290 | + "Marginally, $X$ and $Y$ are independent $\\mathcal{Unif}(1)$." |
| 291 | + ] |
| 292 | + }, |
| 293 | + { |
| 294 | + "cell_type": "markdown", |
| 295 | + "metadata": {}, |
| 296 | + "source": [ |
| 297 | + "## Example: Dependent, Continuous Joint Distribution\n", |
| 298 | + "\n", |
| 299 | + "Now say we had Uniform distributions on a _disc_ such that $x^2 + y^2 \\le 1$. \n", |
| 300 | + "\n", |
| 301 | + "\n", |
| 302 | + "#### Joint PDF\n", |
| 303 | + "\n", |
| 304 | + "In this case, the joint PDF is $Unif$ over the area of a disc centered at the origin with radius 1.\n", |
| 305 | + "\n", |
| 306 | + "\\begin{align}\n", |
| 307 | + " \\text{joint PDF} &=\n", |
| 308 | + " \\begin{cases}\n", |
| 309 | + " \\frac{1}{\\pi} &\\quad \\text{if } x^2 + y^2 \\le 1 \\\\\n", |
| 310 | + " \\\\\n", |
| 311 | + " 0 &\\quad \\text{otherwise}\n", |
| 312 | + " \\end{cases}\n", |
| 313 | + "\\end{align}\n", |
| 314 | + "\n", |
| 315 | + "#### Marginal PDF" |
| 316 | + ] |
| 317 | + }, |
| 318 | + { |
| 319 | + "cell_type": "markdown", |
| 320 | + "metadata": {}, |
| 321 | + "source": [ |
| 322 | + "" |
184 | 323 | ]
|
185 | 324 | },
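|  | + { |
|  | + "cell_type": "markdown", |
|  | + "metadata": {}, |
|  | + "source": [ |
|  | + "A Monte Carlo sanity check of the marginal above (a minimal sketch, assuming `numpy`; not part of the original notes): rejection-sample uniform points on the disc and compare an empirical probability for $X$ against the closed form." |
|  | + ] |
|  | + }, |
|  | + { |
|  | + "cell_type": "code", |
|  | + "execution_count": null, |
|  | + "metadata": {}, |
|  | + "outputs": [], |
|  | + "source": [ |
|  | + "import numpy as np\n", |
|  | + "\n", |
|  | + "rng = np.random.default_rng(42)\n", |
|  | + "pts = rng.uniform(-1, 1, size=(200_000, 2))\n", |
|  | + "inside = pts[(pts**2).sum(axis=1) <= 1]   # rejection sampling: Unif on the disc\n", |
|  | + "\n", |
|  | + "empirical = (inside[:, 0] <= 0.5).mean()\n", |
|  | + "\n", |
|  | + "# closed form: P(X <= 0.5) = (1/pi) * [x*sqrt(1-x^2) + arcsin(x)] from -1 to 0.5\n", |
|  | + "theory = (0.5 * np.sqrt(0.75) + np.arcsin(0.5)) / np.pi + 0.5\n", |
|  | + "\n", |
|  | + "print(empirical, theory)   # both should be near 0.80\n" |
|  | + ] |
|  | + }, |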
|
186 | 325 | {
|
|