Normal distributions

The distribution of a sum of independent normally distributed random variables also follows a normal distribution. This is a rather special property of normally distributed variables; see here for a detailed proof of this result. In particular, consider two independent random variables \(X\) and \(Y\), where \(X\sim N(\mu_X, \sigma^2_X)\) and \(Y\sim N(\mu_Y, \sigma^2_Y)\) (i.e., means \(\mu_X\) and \(\mu_Y\) and standard deviations \(\sigma_X\) and \(\sigma_Y\)). Then, their sum \(Z = X+Y\) is also normally distributed, with \(Z \sim N(\mu_X + \mu_Y, \sigma^2_X + \sigma^2_Y)\). The most straightforward proof of this is the geometric one (see link above).
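
As a quick numerical illustration (not a substitute for the proof above), one can sample two independent normal variables and check that their sum has the predicted mean and variance. The sketch below uses NumPy; the parameter values are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (arbitrary choices)
mu_x, sigma_x = 1.0, 2.0    # X ~ N(1, 4)
mu_y, sigma_y = -0.5, 1.5   # Y ~ N(-0.5, 2.25)

n = 1_000_000
x = rng.normal(mu_x, sigma_x, n)
y = rng.normal(mu_y, sigma_y, n)
z = x + y

# Theory: Z ~ N(mu_x + mu_y, sigma_x^2 + sigma_y^2)
print(z.mean(), mu_x + mu_y)             # sample mean vs. 0.5
print(z.var(), sigma_x**2 + sigma_y**2)  # sample variance vs. 6.25
```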

Let us now move from the probability density to the cumulative distribution function. Briefly (see here for more detailed information), the probability density of the normal distribution is \[f(x|\mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[-\frac{(x-\mu)^2}{2\sigma^2}\right]\] This probability density tells you that if you want to know the probability that a random sample \(X\sim N(\mu,\sigma^2)\) falls between the values \(x_0 < x_1\), it is given by \[P(x_0\le X \le x_1) = \int_{x_0}^{x_1}f(t|\mu, \sigma^2)\;dt\] If we take the special case where \(x_0\to-\infty\), then we have the cumulative distribution function, \[P(X \le x) = \int_{-\infty}^x f(t|\mu, \sigma^2)\;dt = \frac{1}{2} \left[1 + \mathrm{erf}\left(\frac{x-\mu}{\sigma\sqrt{2}}\right) \right]\] where \(\mathrm{erf}\) is the error function. The cumulative distribution function goes to 0 as \(x\to-\infty\) and to 1 as \(x\to\infty\).
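
As a sanity check on the \(\mathrm{erf}\) expression, the sketch below evaluates the closed-form CDF and compares it against a direct numerical integration of the density and against SciPy's built-in normal CDF. The parameter values are arbitrary example choices:

```python
import math
from scipy import integrate, stats

mu, sigma = 0.0, 1.0   # example parameters (arbitrary)
x = 1.3

def pdf(t):
    """Normal density f(t | mu, sigma^2)."""
    return math.exp(-(t - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# CDF via the error function
cdf_erf = 0.5 * (1 + math.erf((x - mu) / (sigma * math.sqrt(2))))

# CDF via numerical integration of the density from -infinity to x
cdf_quad, _ = integrate.quad(pdf, -math.inf, x)

# All three values should agree (about 0.9032 for these parameters)
print(cdf_erf, cdf_quad, stats.norm.cdf(x, loc=mu, scale=sigma))
```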