The Bernoulli distribution is a discrete probability distribution in which every experiment asks a question that can be answered only with yes or no. In other words, the random variable is 1 with probability p or 0 with probability (1 - p). Such an experiment is called a Bernoulli trial. A pass-or-fail exam, for example, can be modeled by a Bernoulli distribution.
A binomial distribution with n = 1 is a Bernoulli distribution. Because this distribution is very easy to understand, it is used as a basis for deriving more complex distributions. The Bernoulli distribution describes events that can have only two outcomes, that is, success or failure.
Because of this simplicity, the Bernoulli distribution is used to model real-life yes/no experiments in many different types of applications.
A discrete probability distribution wherein the random variable can only have 2 possible outcomes is known as a Bernoulli Distribution. If in a Bernoulli trial the random variable takes on the value of 1, it means that this is a success. The probability of success is given by \(p\). Similarly, if the value of the random variable is 0, it indicates failure. The probability of failure is \(q = 1 - p\). Bernoulli distribution can be used to derive a binomial distribution, geometric distribution, and negative binomial distribution.
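The connection to the binomial distribution can be seen directly: summing \(n\) independent Bernoulli trials yields a binomial random variable. A minimal simulation sketch in pure Python (the function names here are illustrative, not from any particular library):

```python
import random

def bernoulli_trial(p):
    """One Bernoulli trial: 1 (success) with probability p, else 0 (failure)."""
    return 1 if random.random() < p else 0

def binomial_sample(n, p):
    """A Binomial(n, p) draw: the number of successes in n independent Bernoulli trials."""
    return sum(bernoulli_trial(p) for _ in range(n))

random.seed(42)
samples = [binomial_sample(10, 0.5) for _ in range(20_000)]
# The sample mean should be close to n * p = 10 * 0.5 = 5.
print(sum(samples) / len(samples))
```

Each draw is an integer between 0 and n, and the empirical mean of many draws settles near n · p, as the binomial distribution predicts.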
Suppose there is an experiment where you flip a fair coin. If the outcome of the flip is heads, then you win. This means that the probability of getting heads is \(p = \frac{1}{2}\). If \(X\) is the random variable following a Bernoulli Distribution, we get \(P(X = 1) = p = \frac{1}{2}\).
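The coin-flip experiment above can be simulated directly. A short sketch in pure Python (the helper name is illustrative), where the observed frequency of heads approaches \(P(X = 1) = p = \frac{1}{2}\):

```python
import random

def flip_fair_coin():
    """One Bernoulli trial with p = 1/2: returns 1 for heads (win), 0 for tails."""
    return 1 if random.random() < 0.5 else 0

random.seed(0)
flips = [flip_fair_coin() for _ in range(10_000)]
# The observed frequency of heads should be close to P(X = 1) = 0.5.
print(sum(flips) / len(flips))
```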
A Bernoulli random variable, \(X\), is also known as an indicator variable: if an event results in success, then \(X = 1\), and if the outcome is a failure, then \(X = 0\). We write \(X \sim \text{Bernoulli}(p)\), where \(p\) is the parameter. The Bernoulli distribution is described by its probability mass function (PMF) and cumulative distribution function (CDF).
The probability mass function (PMF) for a Bernoulli distribution is:
\[ f(x, p) = \begin{cases} p & \text{if } x = 1 \\ 1 - p & \text{if } x = 0 \end{cases} \]
We can also express this formula as:
\[ f(x, p) = p^x (1 - p)^{1 - x}, \quad x \in \{0, 1\} \]
The cumulative distribution function (CDF) of a Bernoulli random variable \(X\) is defined as:
\[ F(x, p) = \begin{cases} 0 & \text{if } x < 0 \\ 1 - p & \text{if } 0 \leq x < 1 \\ 1 & \text{if } x \geq 1 \end{cases} \]
The mean (expected value) and variance of a Bernoulli distribution are derived as follows:
We know that:
\[ P(X = 1) = p \quad \text{and} \quad P(X = 0) = q = 1 - p \]
The expected value (mean) is given by:
\[ E[X] = P(X = 1) \cdot 1 + P(X = 0) \cdot 0 = p \cdot 1 + (1 - p) \cdot 0 = p \]
The variance is calculated as:
\[ \text{Var}(X) = E[X^2] - (E[X])^2 \]
Since \(X\) takes only the values 0 and 1, we have \(X^2 = X\), so \(E[X^2] = E[X] = p\). Substituting, we get:
\[ \text{Var}(X) = p - p^2 = p(1 - p) = p \cdot q \]
Thus, the variance of a Bernoulli distribution is \( \text{Var}(X) = p(1 - p) \).
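The derivation above can be checked numerically. A short sketch in pure Python (the function names are illustrative) that computes the mean and variance directly from the PMF and confirms \( \text{Var}(X) = p(1 - p) \):

```python
def bernoulli_pmf(x, p):
    """Bernoulli PMF in compact form: p**x * (1 - p)**(1 - x) for x in {0, 1}."""
    return p ** x * (1 - p) ** (1 - x)

def bernoulli_mean_var(p):
    """Mean and variance computed directly from the PMF."""
    mean = sum(x * bernoulli_pmf(x, p) for x in (0, 1))               # E[X]
    second_moment = sum(x**2 * bernoulli_pmf(x, p) for x in (0, 1))   # E[X^2]
    return mean, second_moment - mean ** 2                            # Var = E[X^2] - (E[X])^2

p = 0.3
mean, var = bernoulli_mean_var(p)
print(mean)                             # 0.3, i.e. E[X] = p
print(abs(var - p * (1 - p)) < 1e-12)   # True: Var(X) = p(1 - p)
```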
The Bernoulli distribution is simple and hence widely used across many industries. Some applications of the Bernoulli distribution are given below: