class: center, middle, inverse, title-slide

# Discrete Random Variables

### Dr. Dogucu

---

layout: true

<div class="my-header"></div>

<div class="my-footer"> Copyright © <a href="https://mdogucu.ics.uci.edu">Dr. Mine Dogucu</a>. <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA 4.0</a></div>

---

## Discrete Random Variables

A numeric outcome of a random process is called a random variable.

--

A **discrete random variable** has a countable number of possible outcomes.

--

We will use capital letters towards the end of the alphabet, such as `\(X, Y, Z\)`, to represent random variables. We will use the corresponding lowercase letters `\(x, y, z\)` to represent observed outcomes.

---

## Discrete Random Variable

Random process: 3 flips of a single coin

--

Sample space: `\(\{HHH, HHT, HTH, THH, HTT, THT, TTH, TTT\}\)`

--

Let `\(X\)` be a discrete random variable that represents the total number of heads in 3 coin flips.

---

## Discrete Random Variable

<table align = "center">
<tr> <td>x</td> <td>0</td> <td>1</td> <td>2</td> <td>3</td> </tr>
<tr> <td>P(X = x) = f(x)</td> <td>0.125</td> <td>0.375</td> <td>0.375</td> <td>0.125</td> </tr>
</table>

--

`\(P(X = 2) = f(2) = \frac{3}{8} = 0.375\)`

---

## Probability Mass Function

For a discrete random variable `\(X\)`, the distribution of all possible values of `\(x\)` can be shown with a probability mass function (pmf).

<img src="slide-2-discrete-rv_files/figure-html/unnamed-chunk-1-1.png" style="display: block; margin: auto;" />

---

## Probability Mass Function (pmf)

<table align = "center">
<tr> <td>x</td> <td>0</td> <td>1</td> <td>2</td> <td>3</td> </tr>
<tr> <td>P(X = x) = f(x)</td> <td>0.125</td> <td>0.375</td> <td>0.375</td> <td>0.125</td> </tr>
</table>

`\(P(X = 2) = f(2) = \frac{3}{8} = 0.375\)`

--

## Cumulative Distribution Function (cdf)

<table align = "center">
<tr> <td>x</td> <td>0</td> <td>1</td> <td>2</td> <td>3</td> </tr>
<tr> <td>P(X ≤ x) = F(x)</td> <td>0.125</td> <td>0.5</td> <td>0.875</td> <td>1</td> </tr>
</table>

`\(P(X \leq 2) = F(2) = \frac{7}{8} = 0.875\)`

---

## Cumulative Distribution Function

<img src="slide-2-discrete-rv_files/figure-html/unnamed-chunk-2-1.png" style="display: block; margin: auto;" />

---

## Expected Value

If we repeat this random process of 3 coin flips many times, how many heads would we expect to see on average?

<table align = "center">
<tr> <td>x</td> <td>0</td> <td>1</td> <td>2</td> <td>3</td> </tr>
<tr> <td>P(X = x) = f(x)</td> <td>0.125</td> <td>0.375</td> <td>0.375</td> <td>0.125</td> </tr>
</table>

`\(E(X) = \sum_{S} x f(x)\)`

--

`\(E(X) = (0 \cdot 0.125) + (1 \cdot 0.375) + (2 \cdot 0.375) + (3 \cdot 0.125)\)`

`\(E(X) = \mu = 1.5\)`

---

## Variance

Variance is the average squared deviation from the average.

---

## Variance

~~Variance is the average squared deviation from the average.~~

Variance is the expected squared deviation from the expected value.

`$$Var(X) = E[(X - E(X))^2]$$`

--

`$$Var(X) = E[(X - \mu)^2]$$`

--

`$$Var(X) = E(X^2) - [E(X)]^2$$`

---

## Variance

<table align = "center">
<tr> <td>x</td> <td>0</td> <td>1</td> <td>2</td> <td>3</td> </tr>
<tr> <td>P(X = x) = f(x)</td> <td>0.125</td> <td>0.375</td> <td>0.375</td> <td>0.125</td> </tr>
</table>

`\(Var(X) = E(X^2) - [E(X)]^2\)`

--

`\(E(X^2) = \sum_{S} x^2 f(x)\)`

--

`\(E(X^2) = (0^2 \cdot 0.125) + (1^2 \cdot 0.375) + (2^2 \cdot 0.375) + (3^2 \cdot 0.125) = 3\)`

--

`\(Var(X) = 3 - 1.5^2 = 0.75\)`
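---

## Checking the Calculations in R

As a quick check, here is a minimal R sketch (the object names `x`, `f_x`, and so on are just illustrative) that encodes the pmf above and reproduces the cdf, `\(E(X) = 1.5\)`, and `\(Var(X) = 0.75\)`.

```r
x   <- 0:3                            # possible values of X
f_x <- c(0.125, 0.375, 0.375, 0.125)  # pmf: f(x) = P(X = x)

F_x <- cumsum(f_x)                    # cdf: F(x) = P(X <= x)

e_x   <- sum(x * f_x)                 # E(X)   = sum over S of x f(x)
e_x2  <- sum(x^2 * f_x)               # E(X^2) = sum over S of x^2 f(x)
var_x <- e_x2 - e_x^2                 # Var(X) = E(X^2) - [E(X)]^2

F_x    # 0.125 0.500 0.875 1.000
e_x    # 1.5
var_x  # 0.75
```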
---

## Linear Combination of Random Variables

Let `\(X\)` and `\(Y\)` represent random variables and `\(a\)` and `\(b\)` represent constants. Then

`$$E(aX + bY) = a \cdot E(X) + b \cdot E(Y)$$`

--

and, if `\(X\)` and `\(Y\)` are independent,

`$$Var(aX + bY) = a^2 \cdot Var(X) + b^2 \cdot Var(Y)$$`
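---

## Linear Combinations: A Quick Simulation

One way to see these rules at work is a small simulation sketch in R (the seed, sample size, choice of `\(Y\)` as an independent variable with the same pmf as `\(X\)`, and the constants `a` and `b` are all arbitrary illustrative choices): draw independent values of `\(X\)` and `\(Y\)` and compare the sample mean and variance of `\(aX + bY\)` with the formulas.

```r
set.seed(108)                                    # arbitrary seed, for reproducibility
n   <- 1e6
f_x <- c(0.125, 0.375, 0.375, 0.125)             # pmf of the 3-coin-flip example

x <- sample(0:3, n, replace = TRUE, prob = f_x)  # draws of X
y <- sample(0:3, n, replace = TRUE, prob = f_x)  # independent draws of Y

a <- 2
b <- -1

mean(a * x + b * y)  # close to a E(X) + b E(Y)         = 2(1.5) + (-1)(1.5)  = 1.5
var(a * x + b * y)   # close to a^2 Var(X) + b^2 Var(Y) = 4(0.75) + 1(0.75)   = 3.75
```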