# Bernoulli Distribution – Explanation & Examples

*The definition of the Bernoulli distribution is:*

**“The Bernoulli distribution is a discrete probability distribution that describes the probability of a random variable with only two outcomes.”**

*In this topic, we will discuss the Bernoulli distribution from the following aspects:*

- What is a Bernoulli distribution?
- When to use Bernoulli distribution?
- Bernoulli distribution formula.
- Practice questions.
- Answer key.

## 1. What is a Bernoulli distribution?

**The Bernoulli distribution** is a discrete probability distribution that describes the probability of a random variable with only two outcomes.

In the random process called a Bernoulli trial, the random variable takes one outcome, called a success, with probability p, or the other outcome, called a failure, with probability q = 1-p.

The success outcome is denoted as 1 and the failure outcome is denoted as 0.

The Bernoulli distribution is a special case of the binomial distribution in which a single trial is conducted; conversely, a binomial random variable is the sum of repeated independent Bernoulli trials.

**The Bernoulli distribution was named after the Swiss mathematician Jacob Bernoulli**.
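The success/failure rule above can be written as a small probability mass function (a minimal Python sketch; the function name `bernoulli_pmf` is our own, not a standard library function):

```python
def bernoulli_pmf(x: int, p: float) -> float:
    """Probability mass function of a Bernoulli(p) random variable.

    Returns p for x = 1 (success) and 1 - p for x = 0 (failure).
    """
    if x == 1:
        return p
    if x == 0:
        return 1 - p
    raise ValueError("a Bernoulli variable only takes the values 0 or 1")

# A fair coin: both outcomes have probability 0.5.
print(bernoulli_pmf(1, 0.5))  # 0.5
print(bernoulli_pmf(0, 0.5))  # 0.5
```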

### – Example 1

Tossing a coin can result in only two possible outcomes (head or tail). We call one of these outcomes (head) a success and the other (tail), a failure.

The probability of success (p) or head is 0.5 for a fair coin. The probability of failure (q) or tail = 1-p = 1-0.5 = 0.5.

*If we denote head as 1 and tail as 0, we can plot this Bernoulli distribution as follows:*

*We have two outcomes:*

- Tail or 0 with a probability of 0.5.
- Head or 1, also with a probability of 0.5.

This is an **example of a probability mass function** where we have the probability for each outcome.
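The long-run behavior of this fair coin can be checked with a quick simulation (a minimal Python sketch; the sample size of 10,000 and the seed are arbitrary choices):

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Simulate many Bernoulli trials for a fair coin (p = 0.5):
# 1 = head (success), 0 = tail (failure).
tosses = [1 if random.random() < 0.5 else 0 for _ in range(10_000)]

p_hat = sum(tosses) / len(tosses)  # estimated P(head)
q_hat = 1 - p_hat                  # estimated P(tail)
print(p_hat, q_hat)                # both should be close to 0.5
```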

### – Example 2

We have an unfair coin where the probability of success (p) or head is 0.8 and the probability of failure (q) or tail = 1-p = 1-0.8 = 0.2.

*If we denote head as 1 and tail as 0, we can plot this Bernoulli distribution as follows:*

*We have two outcomes:*

- Tail or 0 with a probability of 0.2.
- Head or 1 with a probability of 0.8.

### – Example 3

The prevalence of a certain disease in the general population is 10%.

If we randomly select a person from this population, we can have only two possible outcomes (diseased or healthy person). We call one of these outcomes (diseased person) success and the other (healthy person), a failure.

The probability of success (p) or diseased person is 10% or 0.1. So, the probability of failure (q) or healthy person = 1-p = 1-0.1 = 0.9.

*If we denote diseased person as 1 and healthy person as 0, we can plot this Bernoulli distribution as follows:*

*We have two outcomes:*

- A healthy person or 0 with a probability of 0.9.
- A diseased person or 1 with a probability of 0.1.

### – Example 4

In the above example, with a disease prevalence of 10%, suppose we are instead interested in healthy persons: we call a healthy person a success and a diseased person a failure.

The probability of success (p) or healthy person is 90% or 0.9. So, the probability of failure (q) or diseased person = 1-p = 1-0.9 = 0.1.

*If we denote a healthy person as 1 and diseased person as 0, we can plot this Bernoulli distribution as follows:*

*We have two outcomes:*

- A healthy person or 1 with a probability of 0.9.
- A diseased person or 0 with a probability of 0.1.

## 2. When to use Bernoulli distribution?

*For a random variable to be described by the Bernoulli distribution:*

- The random variable can take only one of two possible outcomes. We call one of these outcomes a success and the other, a failure.
- The probability of success, denoted by p, is the same in every Bernoulli trial.
- The trials are independent, meaning that the outcome in one trial does not affect the outcome in other trials.

We can **determine the Bernoulli distribution** from the results of different Bernoulli trials.

### – Example 1

You are tossing a coin. The random variable equals 1 if you get a head and 0 if you get a tail.

*You tossed the coin 100 times and got the following results:*

0 1 0 1 1 0 1 1 1 0 1 0 1 1 0 1 0 0 0 1 1 1 1 1 1 1 1 1 0 0 1 1 1 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 0 1 1 0 1 0 0 0 1 0 1 1 1 0 1 1 1 0 0 0 0 1 0 0 0 1 0 1 0 0 1 1 1 0 0 1 0 1 0 0 1 0 0 1.

What is the Bernoulli distribution for this coin?

You can use that data to estimate the probability mass function (or the probability distribution) for tossing this coin.

1. We construct a frequency table for each outcome.

| Outcome | Frequency |
|---------|-----------|
| 0       | 53        |
| 1       | 47        |

2. Add another column for the probability of each outcome, where probability = frequency/total number of data points = frequency/100.

| Outcome | Frequency | Probability |
|---------|-----------|-------------|
| 0       | 53        | 0.53        |
| 1       | 47        | 0.47        |

The probabilities are >= 0 and sum to 1.

The coin is likely fair: the probability of heads is approximately equal to the probability of tails, both close to 0.5. We do not get exactly 50 heads and 50 tails because of randomness in the process, but the data give a good approximation of a fair coin's success probability of 0.5.

3. Use the table to plot the Bernoulli distribution for that coin:

*We have two outcomes:*

- Head or 1 with a probability of 0.47.
- Tail or 0 with a probability of 0.53.
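The three counting steps can be reproduced in Python with `collections.Counter`, using the 100 toss results listed above:

```python
from collections import Counter

# The 100 toss results from the example (1 = head, 0 = tail).
results = ("0 1 0 1 1 0 1 1 1 0 1 0 1 1 0 1 0 0 0 1 "
           "1 1 1 1 1 1 1 1 0 0 1 1 1 1 0 0 1 0 0 0 "
           "0 0 0 0 0 0 0 0 0 1 0 0 1 0 1 0 0 1 1 0 "
           "1 0 0 0 1 0 1 1 1 0 1 1 1 0 0 0 0 1 0 0 "
           "0 1 0 1 0 0 1 1 1 0 0 1 0 1 0 0 1 0 0 1")
tosses = [int(x) for x in results.split()]

# Step 1: frequency of each outcome.
counts = Counter(tosses)

# Step 2: probability = frequency / total number of data points.
n = len(tosses)
probs = {outcome: count / n for outcome, count in counts.items()}
print(counts)  # Counter({0: 53, 1: 47})
print(probs)   # {0: 0.53, 1: 0.47}
```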

### – Example 2

You screened 50 individuals from a certain population for the presence of hypertension and got the following results:

| ID | Condition    |
|----|--------------|
| 1  | normotensive |
| 2  | normotensive |
| 3  | normotensive |
| 4  | normotensive |
| 5  | normotensive |
| 6  | normotensive |
| 7  | normotensive |
| 8  | normotensive |
| 9  | normotensive |
| 10 | normotensive |
| 11 | hypertensive |
| 12 | normotensive |
| 13 | normotensive |
| 14 | normotensive |
| 15 | normotensive |
| 16 | normotensive |
| 17 | normotensive |
| 18 | normotensive |
| 19 | normotensive |
| 20 | hypertensive |
| 21 | normotensive |
| 22 | normotensive |
| 23 | normotensive |
| 24 | hypertensive |
| 25 | normotensive |
| 26 | normotensive |
| 27 | normotensive |
| 28 | normotensive |
| 29 | normotensive |
| 30 | normotensive |
| 31 | hypertensive |
| 32 | normotensive |
| 33 | normotensive |
| 34 | normotensive |
| 35 | normotensive |
| 36 | normotensive |
| 37 | normotensive |
| 38 | normotensive |
| 39 | normotensive |
| 40 | normotensive |
| 41 | normotensive |
| 42 | normotensive |
| 43 | normotensive |
| 44 | normotensive |
| 45 | normotensive |
| 46 | normotensive |
| 47 | normotensive |
| 48 | normotensive |
| 49 | normotensive |
| 50 | normotensive |

What is the estimated Bernoulli distribution for hypertension in this population?

1. We construct a frequency table for each outcome.

| Outcome      | Frequency |
|--------------|-----------|
| hypertensive | 4         |
| normotensive | 46        |

2. Add another column for the probability of each outcome. Since we are interested in hypertension, we denote hypertensive persons as 1 and normotensive persons as 0, where probability = frequency/total number of data points = frequency/50.

| Outcome | Frequency | Probability |
|---------|-----------|-------------|
| 1       | 4         | 0.08        |
| 0       | 46        | 0.92        |

The probabilities are >= 0 and sum to 1.

3. Use the table to plot the Bernoulli distribution for hypertension:
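The same estimation can be sketched in Python; the hypertensive IDs (11, 20, 24, and 31) are read off the screening table above:

```python
from collections import Counter

# Reconstruct the 50 screening results: IDs 11, 20, 24, and 31
# were hypertensive, all others were normotensive.
conditions = ["hypertensive" if i in {11, 20, 24, 31} else "normotensive"
              for i in range(1, 51)]

# Code hypertensive = 1 (success) and normotensive = 0 (failure).
coded = [1 if c == "hypertensive" else 0 for c in conditions]

counts = Counter(coded)
n = len(coded)
probs = {outcome: count / n for outcome, count in counts.items()}
print(probs)  # estimated P(hypertensive) = 4/50 = 0.08, P(normotensive) = 0.92
```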