What is a Bernoulli Distribution?

Janvi Kumari Last Updated : 22 Nov, 2024
8 min read

A key idea in data science and statistics is the Bernoulli distribution, named for the Swiss mathematician Jacob Bernoulli. It is crucial to probability theory and a foundational element for more intricate statistical models, ranging from machine learning algorithms to customer behaviour prediction. In this article, we will discuss the Bernoulli distribution in detail.

Read on!


What is a Bernoulli distribution?

A Bernoulli distribution is a discrete probability distribution representing a random variable with only two possible outcomes. Usually, these results are denoted by the terms “success” and “failure,” or alternatively, by the numbers 1 and 0.

Let X be a random variable. Then X is said to follow a Bernoulli distribution with success probability p, written X ∼ Bernoulli(p), if

P(X = 1) = p and P(X = 0) = 1 − p, where 0 ≤ p ≤ 1.

The Probability mass function of the Bernoulli distribution

Let X be a random variable following a Bernoulli distribution:

X ∼ Bernoulli(p).

Then, the probability mass function of X is

P(X = x) = p^x (1 − p)^(1 − x) for x ∈ {0, 1}, i.e. P(X = 1) = p and P(X = 0) = 1 − p.

This follows directly from the definition given above.
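
As a quick illustration, the PMF can be evaluated with scipy.stats (the value p = 0.7 below is an arbitrary choice for the example):

from scipy.stats import bernoulli

p = 0.7  # arbitrary success probability for the example
print("P(X = 1):", bernoulli.pmf(1, p))  # equals p (0.7, up to floating-point rounding)
print("P(X = 0):", bernoulli.pmf(0, p))  # equals 1 - p (0.3, up to floating-point rounding)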

Cumulative Distribution Function for Bernoulli Distribution

The Cumulative Distribution Function (CDF) of a Bernoulli distribution gives the probability that the random variable X is less than or equal to a given value x. It is defined piecewise as:

F(x) = 0 for x < 0, F(x) = 1 − p for 0 ≤ x < 1, and F(x) = 1 for x ≥ 1.

Steps to Compute Cumulative Distribution Function

For a Bernoulli random variable X:

  • If x=0, F(0) = P(X ≤ 0) = P(X=0) = 1−p.
  • If x=1, F(1) = P(X ≤ 1) = P(X=0) + P(X=1) = 1.

Python Implementation of Cumulative Distribution Function

You can compute the CDF of a Bernoulli distribution using the scipy.stats library:

from scipy.stats import bernoulli
# Given probability of success
p = 0.7
# CDF calculation
cdf_0 = bernoulli.cdf(0, p)  # CDF at X = 0
cdf_1 = bernoulli.cdf(1, p)  # CDF at X = 1
# Display results
print("CDF at X = 0 (F(0)): ", cdf_0)
print("CDF at X = 1 (F(1)): ", cdf_1)
Running this prints F(0) = 1 − p = 0.3 and F(1) = 1.0 (possibly with small floating-point rounding).

Mean of the Bernoulli Distribution

Let X be a random variable following a Bernoulli distribution:

X ∼ Bernoulli(p).

Then, the mean or expected value of X is

E[X] = p.

Proof: The expected value is the probability-weighted average of all possible values:

E[X] = Σ_x x · P(X = x).

Since there are only two possible outcomes for a Bernoulli random variable, we have:

E[X] = 0 · P(X = 0) + 1 · P(X = 1) = 0 · (1 − p) + 1 · p = p.

Sources: https://en.wikipedia.org/wiki/Bernoulli_distribution#Mean.
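
As a quick sanity check of this result, here is a minimal simulation sketch using scipy.stats (p = 0.3 and the sample size are arbitrary choices for the example):

from scipy.stats import bernoulli

p = 0.3
samples = bernoulli.rvs(p, size=100_000, random_state=0)  # draw Bernoulli(p) samples
print("Sample mean:     ", samples.mean())  # should be close to p
print("Theoretical mean:", p)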


Variance of the Bernoulli distribution

Let X be a random variable following a Bernoulli distribution:

X ∼ Bernoulli(p).

Then, the variance of X is

Var(X) = p(1 − p).

Proof: The variance is the probability-weighted average of the squared deviation from the expected value across all possible values,

Var(X) = Σ_x (x − E[X])² · P(X = x),

and can also be written in terms of expected values:

Var(X) = E[X²] − (E[X])²    (1)

The mean of a Bernoulli random variable is

E[X] = p    (2)

and the mean of a squared Bernoulli random variable is

E[X²] = 0² · (1 − p) + 1² · p = p    (3)

Combining Equations (1), (2) and (3), we have:

Var(X) = E[X²] − (E[X])² = p − p² = p(1 − p).
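
A minimal numerical check of this identity (p = 0.9 is just an example value):

p = 0.9
# E[X] and E[X^2] computed directly from the two outcomes 0 and 1
ex = 0 * (1 - p) + 1 * p          # E[X] = p
ex2 = 0**2 * (1 - p) + 1**2 * p   # E[X^2] = p
print("Var(X) via E[X^2] - (E[X])^2:", ex2 - ex**2)  # 0.09, up to floating-point rounding
print("Var(X) via p(1 - p):        ", p * (1 - p))   # 0.09, up to floating-point rounding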

Bernoulli Distribution vs Binomial Distribution

The Bernoulli distribution is a special case of the Binomial distribution where the number of trials n=1. Here’s a detailed comparison between the two:

| Aspect | Bernoulli Distribution | Binomial Distribution |
|---|---|---|
| Purpose | Models the outcome of a single trial of an event. | Models the outcome of multiple trials of the same event. |
| Representation | X∼Bernoulli(p), where p is the probability of success. | X∼Binomial(n,p), where n is the number of trials and p is the probability of success in each trial. |
| Mean | E[X]=p | E[X]=n⋅p |
| Variance | Var(X)=p(1−p) | Var(X)=n⋅p⋅(1−p) |
| Support | Outcomes are X∈{0,1}, representing failure (0) and success (1). | Outcomes are X∈{0,1,2,…,n}, representing the number of successes in n trials. |
| Special Case Relationship | A Bernoulli distribution is a special case of the Binomial distribution when n=1. | A Binomial distribution generalizes the Bernoulli distribution for n>1. |
| Example | If the probability of winning a game is 60%, the Bernoulli distribution can model whether you win (1) or lose (0) in a single game. | If the probability of winning a game is 60%, the Binomial distribution can model the probability of winning exactly 3 out of 5 games. |
(Figure: PMF of a Bernoulli distribution with p=0.6 (left) and a Binomial distribution with n=5, p=0.6 (right).)

The Bernoulli distribution (left) models the outcome of a single trial with two possible outcomes: 0 (failure) or 1 (success). In this example, with p=0.6, there is a 40% chance of failure (P(X=0)=0.4) and a 60% chance of success (P(X=1)=0.6). The graph shows two bars, one for each outcome, whose heights correspond to the respective probabilities.

The Binomial distribution (right) represents the number of successes across multiple trials (in this case, n=5 trials). It shows the probability of observing each possible number of successes, ranging from 0 to 5. The number of trials n and the success probability p=0.6 determine the distribution’s shape. Here, the highest probability occurs at X=3, indicating that achieving exactly 3 successes out of 5 trials is most likely. The probabilities for fewer (X=0,1,2) or more (X=4,5) successes fall off on either side of the mean E[X]=n⋅p=3.
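
To make the relationship concrete, here is a small sketch with scipy.stats that reproduces the “3 wins out of 5 games” example and shows that a Binomial distribution with n = 1 reduces to a Bernoulli distribution:

from scipy.stats import bernoulli, binom

p = 0.6
# Bernoulli: a single game, win (1) or lose (0)
print("P(win a single game):      ", bernoulli.pmf(1, p))  # 0.6
# Binomial: number of wins in n = 5 games
print("P(exactly 3 wins out of 5):", binom.pmf(3, 5, p))   # about 0.3456
# Binomial with n = 1 is just Bernoulli(p)
print("Binomial(1, p) at k = 1:   ", binom.pmf(1, 1, p))   # 0.6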


Use of Bernoulli Distributions in Real-world Applications

The Bernoulli distribution is widely used in real-world applications involving binary outcomes. In machine learning, it is central to binary classification problems, where data must be assigned to one of two classes. Among the examples are the following (a minimal classifier sketch follows the list):

  • Email spam detection (spam or not spam)
  • Financial transaction fraud detection (legal or fraudulent)
  • Diagnosis of disease based on symptoms (absent or present)
  • Medical Testing: Determining if a treatment is effective (positive/negative result).
  • Gaming: Modeling outcomes of a single event, such as win or lose.
  • Churn Analysis: Predicting if a customer will leave a service or stay.
  • Sentiment Analysis: Classifying text as positive or negative.
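
As a minimal, illustrative sketch of the first example, here is spam detection framed as a Bernoulli problem using scikit-learn's BernoulliNB classifier, which treats each binary feature as Bernoulli-distributed (the tiny dataset is made up purely for illustration):

from sklearn.naive_bayes import BernoulliNB

# Each row holds binary features of an email,
# e.g. [contains "free", contains "winner", has attachment]
X = [[1, 1, 0],
     [1, 0, 1],
     [0, 0, 0],
     [0, 1, 0],
     [0, 0, 1]]
y = [1, 1, 0, 0, 0]  # 1 = spam, 0 = not spam

model = BernoulliNB()
model.fit(X, y)
print(model.predict([[1, 1, 1]]))  # predicted class for a new email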

Why Use the Bernoulli Distribution?

  • Simplicity: It’s ideal for scenarios where only two possible outcomes exist.
  • Building Block: The Bernoulli distribution serves as the foundation for the Binomial and other advanced distributions.
  • Interpretable: Real-world outcomes like success/failure, pass/fail, or yes/no fit naturally into its framework.

Numerical Example on Bernoulli Distribution:

A factory produces light bulbs. Each light bulb has a 90% chance of passing the quality test (p=0.9) and a 10% chance of failing (1−p=0.1). Let X be the random variable that represents the outcome of the quality test:

  • X=1: The bulb passes.
  • X=0: The bulb fails.

Problem:

  1. What is the probability that the bulb passes the test?
  2. What is the expected value E[X]?
  3. What is the variance Var(X)?

Solution:

  1. Probability of Passing the Test: Using the Bernoulli PMF:

P(X=1) = p^1 (1−p)^(1−1) = p = 0.9

So, the probability of passing is 0.9 (90%).

  2. Expected Value E[X]

E[X]=p.

Here, p=0.9, so

E[X]=0.9.

This means the average success rate is 0.9 (90%).

  3. Variance Var(X)

Var(X)=p(1−p)

Here, p=0.9:

Var(X)=0.9(1−0.9)=0.9⋅0.1=0.09.

The variance is 0.09.

Final Answer:

  1. Probability of passing: 0.9 (90%).
  2. Expected value: 0.9.
  3. Variance: 0.09.

This example shows how the Bernoulli distribution models single binary events like a quality test outcome.

Now let’s see how this problem can be solved in Python.

Implementation 

Step 1: Install the necessary library

You need to install matplotlib and scipy if you haven’t already:

pip install matplotlib scipy

Step 2: Import the packages

Now, import the necessary packages for the plot and Bernoulli distribution.

import matplotlib.pyplot as plt
from scipy.stats import bernoulli

Step 3: Define the probability of success

Set the given probability of success for the Bernoulli distribution.

p = 0.9

Step 4: Calculate the PMF for success and failure

Calculate the probability mass function (PMF) for both the “Fail” (X=0) and “Pass” (X=1) outcomes.

probabilities = [bernoulli.pmf(0, p), bernoulli.pmf(1, p)]

Step 5: Set labels for the outcomes

Define the labels for the outcomes (“Fail” and “Pass”).

outcomes = ['Fail (X=0)', 'Pass (X=1)']

Step 6: Calculate the expected value

The expected value (mean) for the Bernoulli distribution is simply the probability of success.

expected_value = p  # Mean of Bernoulli distribution

Step 7: Calculate the variance

The variance of a Bernoulli distribution is calculated using the formula Var[X]=p(1−p)

variance = p * (1 - p)  # Variance formula

Step 8: Display the results

Print the calculated probabilities, expected value, and variance.

print("Probability of Passing (X = 1):", probabilities[1])
print("Probability of Failing (X = 0):", probabilities[0])
print("Expected Value (E[X]):", expected_value)
print("Variance (Var[X]):", variance)

Output:

The probabilities of passing and failing are 0.9 and 0.1, the expected value is 0.9, and the variance is 0.09 (the printed values may show small floating-point rounding).

Step 9: Plotting the probabilities

Create a bar plot for the probabilities of failure and success using matplotlib.

bars = plt.bar(outcomes, probabilities, color=['red', 'green'])

Step 10: Add title and labels to the plot

Set the title and labels for the x-axis and y-axis of the plot.

plt.title(f'Bernoulli Distribution (p = {p})')
plt.xlabel('Outcome')
plt.ylabel('Probability')

Step 11: Add labels to the legend

Add labels for each bar to the legend, showing the probabilities for “Fail” and “Pass”.

bars[0].set_label(f'Fail (X=0): {probabilities[0]:.2f}')
bars[1].set_label(f'Pass (X=1): {probabilities[1]:.2f}')

Step 12: Display the legend

Show the legend on the plot.

plt.legend()

Step 13: Show the plot

Finally, display the plot.

plt.show()
(The resulting bar chart shows two bars: Fail (X=0) with probability 0.10 and Pass (X=1) with probability 0.90.)

This step-by-step breakdown allows you to create the plot and calculate the necessary values for the Bernoulli distribution.
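
For convenience, here are the steps above combined into a single script:

import matplotlib.pyplot as plt
from scipy.stats import bernoulli

p = 0.9  # probability of passing the quality test

# PMF for failure (X=0) and success (X=1)
probabilities = [bernoulli.pmf(0, p), bernoulli.pmf(1, p)]
outcomes = ['Fail (X=0)', 'Pass (X=1)']

expected_value = p        # mean of a Bernoulli distribution
variance = p * (1 - p)    # variance of a Bernoulli distribution

print("Probability of Passing (X = 1):", probabilities[1])
print("Probability of Failing (X = 0):", probabilities[0])
print("Expected Value (E[X]):", expected_value)
print("Variance (Var[X]):", variance)

bars = plt.bar(outcomes, probabilities, color=['red', 'green'])
plt.title(f'Bernoulli Distribution (p = {p})')
plt.xlabel('Outcome')
plt.ylabel('Probability')
bars[0].set_label(f'Fail (X=0): {probabilities[0]:.2f}')
bars[1].set_label(f'Pass (X=1): {probabilities[1]:.2f}')
plt.legend()
plt.show()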

Conclusion

The Bernoulli distribution is a key idea in statistics that models scenarios with two possible outcomes: success or failure. It is employed in many different applications, such as quality testing, customer behaviour prediction, and binary classification in machine learning. Key characteristics of the distribution, such as the probability mass function (PMF), expected value, and variance, aid in the comprehension and analysis of such binary events. Once you are proficient with the Bernoulli distribution, you can build more intricate models, such as the Binomial distribution.

Frequently Asked Questions

Q1. Can the Bernoulli distribution handle multiple outcomes? 

Ans. No, it only handles two outcomes (success or failure). For more than two outcomes, other distributions, like the multinomial distribution, are used.

Q2. What are some examples of Bernoulli trials? 

Ans. Some examples of Bernoulli trials are:
1. Tossing a coin (heads or tails)
2. Passing a quality test (pass or fail)

Q3. What is the Bernoulli distribution? 

Ans. The Bernoulli distribution is a discrete probability distribution representing a random variable with two possible outcomes: success (1) and failure (0). It is defined by the probability of success, denoted by p.

Q4. What distinguishes the Binomial distribution from the Bernoulli distribution? 

Ans. The Bernoulli distribution is a special case of the Binomial distribution in which the number of trials (n) equals 1. The Binomial distribution models several trials, whereas the Bernoulli distribution models just one.

Q5. Is Bernoulli Distribution discrete or continuous?

Ans. Bernoulli Distribution is a discrete probability distribution because it deals with binary outcomes (0 or 1).

Q6. What is the practical use of Bernoulli Distribution in machine learning?

Ans. Bernoulli Distribution is often used in classification problems, especially in binary classifiers like logistic regression. It can also model binary outcomes in data, such as predicting whether a customer will make a purchase.
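
As a small illustration of that connection, here is a minimal sketch of the average Bernoulli negative log-likelihood (binary cross-entropy) that logistic regression minimises, assuming NumPy arrays y of 0/1 labels and p of predicted probabilities:

import numpy as np

def bernoulli_nll(y, p):
    """Average Bernoulli negative log-likelihood (binary cross-entropy)."""
    y = np.asarray(y, dtype=float)
    p = np.asarray(p, dtype=float)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Example: true labels vs. predicted probabilities of class 1
print(bernoulli_nll([1, 0, 1], [0.9, 0.2, 0.7]))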

Q7. Why is the Bernoulli Distribution important?

Ans. The Bernoulli distribution is fundamental in probability and statistics because it models binary outcomes, such as success/failure or yes/no scenarios. It forms the basis for more complex distributions like the Binomial and is widely used in machine learning, hypothesis testing, and decision-making processes.

Hi, I am Janvi, a passionate data science enthusiast currently working at Analytics Vidhya. My journey into the world of data began with a deep curiosity about how we can extract meaningful insights from complex datasets.
