The Bernoulli distribution is one of the simplest probability distributions. It is based on the idea of a Bernoulli trial: an experiment with only two possible outcomes, "success", which occurs with probability p, and "failure", which occurs with probability 1 - p. The random variable X that equals 1 for a success and 0 for a failure has a Bernoulli(p) distribution.
Parameter | Range | Description |
---|---|---|
p | 0 ≤ p ≤ 1 | Probability of success |
Property | Value |
---|---|
Probability Mass Function | P(X = x) = p^x (1 - p)^(1 - x) |
Support | x ∈ {0, 1} |
Mean | p |
Variance | p(1 - p) |
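As a quick sketch (plain Python; the function name is illustrative), the probability mass function can be evaluated directly from its formula:

```python
def bernoulli_pmf(x, p):
    """PMF of a Bernoulli(p) variable: p^x * (1 - p)^(1 - x) for x in {0, 1}."""
    if x not in (0, 1):
        return 0.0  # outside the support
    return p ** x * (1 - p) ** (1 - x)

# For p = 0.3: P(X = 1) = 0.3 and P(X = 0) = 0.7.
print(bernoulli_pmf(1, 0.3))
print(bernoulli_pmf(0, 0.3))
```

Since x is either 0 or 1, the formula simply selects p when x = 1 and 1 - p when x = 0.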
Example | p |
---|---|
A fair coin is tossed. Let X = 1 for heads and X = 0 for tails. | 0.5000 |
A fair six-sided die is thrown. Let X = 1 if a 6 is thrown, and 0 otherwise. | 0.1667 |
The probability that an LED light bulb will fail this year is 0.2. Let X = 1 if the bulb fails this year and X = 0 if the bulb continues working. | 0.2000 |
X ~ Bernoulli(p)
E(X) = p, Var(X) = p(1 - p)
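These two formulas can be checked empirically: a sketch (plain Python, p = 0.3 chosen arbitrarily) that compares the sample mean and variance of many simulated trials against p and p(1 - p):

```python
import random

random.seed(42)
p = 0.3
n = 100_000

# Simulate n Bernoulli(p) trials.
samples = [1 if random.random() < p else 0 for _ in range(n)]

mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n

print(f"sample mean ≈ {mean:.3f}  (theory: p = {p})")
print(f"sample var  ≈ {var:.3f}  (theory: p(1-p) = {p * (1 - p):.3f})")
```

With 100,000 trials, both sample statistics should land within a few thousandths of the theoretical values.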
The expected value can be changed directly by dragging left or right on the chart. It can also be set using the box below the chart, which shows the mean ± one standard deviation. As the expected value approaches 0 or 1, the variance approaches zero.
The illustration above shows an experiment with only two possible outcomes:
- ❌ X = 0 indicates "failure", which occurs with probability 1 - p.
- ✅ X = 1 indicates "success", which occurs with probability p.
The simulation above shows a number U chosen uniformly at random from the interval [0, 1]. The light blue line shows the Bernoulli(p) random variable X, which equals 1 if U falls in the green interval of length p and 0 if U falls in the grey interval of length 1 - p. The histogram accumulates the results of each simulation.
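The mechanism the simulation uses is the inverse-transform idea: draw U uniformly and threshold it at p. A minimal sketch of that procedure (plain Python; the function name and trial count are illustrative):

```python
import random
from collections import Counter

def bernoulli_trial(p):
    """Draw U uniformly from [0, 1]; return 1 if U lands in the interval of length p."""
    u = random.random()
    return 1 if u < p else 0

# Accumulate a histogram over repeated trials, as the applet does.
random.seed(0)
p = 0.5
counts = Counter(bernoulli_trial(p) for _ in range(10_000))
print(counts)  # roughly 5,000 successes and 5,000 failures for p = 0.5
```

Because U is uniform, the event U < p has probability exactly p, so the threshold test produces a valid Bernoulli(p) draw.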