Binomial Distribution Derivation
Goldman Sachs
Can you compute the expected value and variance of a binomial $Bin(n,p)$ distribution? Recall that it is a sum of $n$ independent Bernoulli trials, each of which takes value 1 (win) with probability $p$ and 0 (loss) with probability $1-p$.
Answer
Let's assume you don't remember any distribution-specific formulas and derive these from the ground up. We are asked for the expected value and variance of a binomial distribution, which is itself a sum of independent variables. We know that expectation is additive, i.e. $E(X+Y) = E(X) + E(Y)$, and so is variance in the case of independent variables (since the covariance term is 0), i.e. $Var(X+Y) = Var(X) + Var(Y)$. So if we compute the expected value and variance of a single Bernoulli trial, we can sum them up to get the binomial's.
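To make the variance claim explicit, expand the variance of a sum; independence makes the covariance term vanish:
$$Var(X+Y) = Var(X) + Var(Y) + 2\,Cov(X,Y) = Var(X) + Var(Y) \quad \text{since } Cov(X,Y) = 0.$$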
A Bernoulli trial $X$ takes value 1 with probability $p$ and 0 with probability $1-p$, so its expected value is $E(X) = p\cdot 1 + (1-p)\cdot 0 = p$. Its variance is $Var(X) = E(X^2) - E(X)^2$. We already know the second term, since we know the expectation. As for the first term, notice that for a Bernoulli trial $X^2 = X$: if $X=1$ then $X^2=1$, and if $X=0$ then $X^2=0$. Therefore $Var(X) = E(X) - E(X)^2 = p - p^2 = p(1-p)$.
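A quick numerical sanity check of these Bernoulli moments (a minimal sketch, assuming NumPy is available; the seed and $p=0.3$ are arbitrary illustrative choices):

```python
import numpy as np

# Simulate many Bernoulli(p) trials and compare the sample mean and variance
# against the derived formulas E(X) = p and Var(X) = p(1 - p).
rng = np.random.default_rng(0)               # seed chosen arbitrarily for reproducibility
p = 0.3
x = rng.binomial(n=1, p=p, size=1_000_000)   # Bernoulli(p) is Bin(1, p)

print(x.mean())   # ~0.30  (E(X) = p)
print(x.var())    # ~0.21  (Var(X) = p(1 - p) = 0.3 * 0.7)
```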
Now that we know the Bernoulli trial's moments, we simply sum to get the binomial's. Assuming $Y\sim Bin(n,p)$, we can write $Y = \sum_{i=1}^n X_i$ with the $X_i$ independent Bernoulli trials. Then
$$E(Y) = \sum_{i=1}^n E(X_i) = np$$
$$Var(Y) = \sum_{i=1}^n Var(X_i) = np(1-p)$$
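And a sketch mirroring the final step: build $Y$ as a sum of $n$ independent Bernoulli trials and compare the sample moments to $np$ and $np(1-p)$ (again assuming NumPy; $n=20$, $p=0.3$ are illustrative values):

```python
import numpy as np

# Each row holds n independent Bernoulli(p) trials; the row sum is one Bin(n, p) draw.
rng = np.random.default_rng(1)
n, p = 20, 0.3
trials = rng.binomial(n=1, p=p, size=(1_000_000, n))
y = trials.sum(axis=1)

print(y.mean(), n * p)            # ~6.0 vs 6.0  (E(Y) = np)
print(y.var(), n * p * (1 - p))   # ~4.2 vs 4.2  (Var(Y) = np(1 - p))
```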