Bernoulli and the Law of Large Numbers
Analogies Between the Bernoulli Process Simulations and the Law of Large Numbers
Both homework projects explore how randomness reveals order through repetition. In both cases, we simulate a Bernoulli process — a sequence of independent trials where each trial has only two possible outcomes, labeled success (1) or failure (0), with a fixed success probability \(p\). Over many repetitions, the average of these outcomes approaches \(p\), illustrating the Law of Large Numbers (LLN).
Even though the two simulations may differ in setup (for example, one might focus on the frequency of successes, and another on cumulative probabilities or distributions), they share the same mathematical backbone: the Bernoulli distribution and its relationship with combinatorics.
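As a concrete illustration, here is a minimal Bernoulli-process sketch (not taken from either project; the values \(p = 0.3\) and 10,000 trials are arbitrary choices):

```python
import random

def simulate_bernoulli(n_trials, p, seed=42):
    """Simulate n_trials independent Bernoulli(p) trials, returning a list of 0/1 outcomes."""
    rng = random.Random(seed)
    return [1 if rng.random() < p else 0 for _ in range(n_trials)]

# Illustrative values only; any p in [0, 1] and any trial count work the same way.
p = 0.3
trials = simulate_bernoulli(n_trials=10_000, p=p)
empirical_mean = sum(trials) / len(trials)
print(f"success probability p = {p}, empirical frequency = {empirical_mean:.4f}")
```

The empirical frequency printed at the end should land close to \(p\), which is exactly the behavior both projects examine.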
Mathematical Similarities
1. Bernoulli Process and Binomial Coefficients
The number of successes in a fixed number of independent Bernoulli trials follows the binomial distribution.
The probability of getting exactly k successes in n trials is:
$$
P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}
$$
The term \(\binom{n}{k}\) (the binomial coefficient) counts the number of ways \(k\) successes can occur in \(n\) trials.
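This formula translates directly into code (a sketch using Python's standard library; the values n = 10, k = 3, p = 0.5 are arbitrary examples):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Illustrative values only.
print(binomial_pmf(k=3, n=10, p=0.5))  # 120 * 0.5**10 ≈ 0.1172
```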
2. Pascal’s Triangle Connection
The coefficients \(\binom{n}{k}\) form Pascal’s triangle.
Row \(n\) of Pascal’s triangle lists \(\binom{n}{0}, \binom{n}{1}, \dots, \binom{n}{n}\): the number of ways to obtain each possible count of successes in a Bernoulli process with \(n\) trials.
Its symmetry and the recursive rule \(\binom{n}{k} = \binom{n-1}{k-1} + \binom{n-1}{k}\) mirror how probabilities build up trial by trial: each new trial either adds a success or does not.
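That recursion is short enough to write out explicitly (a sketch; the number of rows shown is arbitrary):

```python
def pascal_rows(n_rows):
    """Build the first n_rows rows of Pascal's triangle via C(n, k) = C(n-1, k-1) + C(n-1, k)."""
    rows = [[1]]
    for _ in range(n_rows - 1):
        prev = rows[-1]
        # Each interior entry is the sum of the two entries above it.
        rows.append([1] + [prev[i] + prev[i + 1] for i in range(len(prev) - 1)] + [1])
    return rows

for row in pascal_rows(6):
    print(row)
# Row n lists C(n, 0) ... C(n, n): the number of ways to get k successes in n trials.
```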
3. Binomial Expansion
The binomial theorem states that:
$$
(p + q)^n = \sum_{k=0}^n \binom{n}{k} p^k q^{n-k}
$$
If we interpret \(p\) as the probability of success and \(q = 1 - p\) as the probability of failure, the expansion sums to \((p + q)^n = 1\): its terms account for every possible number of successes, so the total probability over all outcomes is 1 — a direct algebraic analogue of the Bernoulli process.
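This identity is easy to check numerically (a sketch; n = 20 and p = 0.37 are arbitrary choices — the sum is 1 for any n and p):

```python
from math import comb, isclose

def total_probability(n, p):
    """Sum of C(n, k) * p**k * (1 - p)**(n - k) over k = 0..n; equals (p + (1 - p))**n = 1."""
    q = 1 - p
    return sum(comb(n, k) * p**k * q**(n - k) for k in range(n + 1))

# Illustrative values only.
assert isclose(total_probability(n=20, p=0.37), 1.0)
print("binomial probabilities sum to 1")
```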
Differences Between the Two Simulations
- One simulation may focus on empirical frequencies — repeating the experiment many times and showing that the sample mean converges to the expected probability \(E[X] = p\).
- The other might emphasize distributional convergence — how the shape of the histogram of outcomes (the binomial distribution) stabilizes and resembles the theoretical curve as \(n\) grows.
Both reflect the LLN, but from different perspectives (a short simulation sketch contrasting them follows this list):
- The first emphasizes convergence of averages (a numerical law).
- The second emphasizes convergence of distributions (a probabilistic law).
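A minimal sketch contrasting the two perspectives (the parameters n, p, and the number of repetitions are arbitrary choices, not taken from either project):

```python
import random
from math import comb
from collections import Counter

def count_successes(n, p, rng):
    """One experiment: number of successes in n Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if rng.random() < p)

rng = random.Random(0)
n, p, repeats = 10, 0.5, 20_000  # illustrative values only

counts = [count_successes(n, p, rng) for _ in range(repeats)]

# Perspective 1: convergence of averages (per-trial success frequency -> p).
print(f"average success frequency: {sum(counts) / (repeats * n):.4f}  (expected {p})")

# Perspective 2: convergence of distributions (empirical vs. theoretical binomial pmf).
empirical = Counter(counts)
for k in range(n + 1):
    theoretical = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"k={k:2d}  empirical={empirical[k] / repeats:.4f}  theoretical={theoretical:.4f}")
```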
Combinatorial and Fibonacci Relationships
The binomial coefficients show how the possible outcomes spread across success counts as the number of trials grows: there are \(2^n\) possible outcome sequences for \(n\) trials, and indeed \(\sum_{k=0}^{n} \binom{n}{k} = 2^n\).
Interestingly, certain paths through Pascal’s triangle relate to the Fibonacci sequence: summing along its shallow diagonals yields Fibonacci numbers.
This reveals how both sequences (binomial coefficients and Fibonacci numbers) are connected through recursive combinatorial relationships.
For example:
$$
F_{n} = \sum_{k=0}^{\lfloor (n-1)/2 \rfloor} \binom{n - k - 1}{k}
$$
This expresses Fibonacci numbers as sums of binomial coefficients — another bridge between probabilistic and combinatorial worlds.
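The identity can be checked directly (a sketch, using the convention \(F_1 = F_2 = 1\); the range tested is arbitrary):

```python
from math import comb

def fibonacci(n):
    """Iterative Fibonacci with F_1 = F_2 = 1."""
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return a

def diagonal_sum(n):
    """Sum of C(n - k - 1, k) for k = 0..floor((n - 1) / 2): a shallow diagonal of Pascal's triangle."""
    return sum(comb(n - k - 1, k) for k in range((n - 1) // 2 + 1))

for n in range(1, 11):
    assert fibonacci(n) == diagonal_sum(n)
print("F_n equals the diagonal sum for n = 1..10")
```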
Law of Large Numbers (LLN) Revisited
The LLN states that as the number of trials \(n \to \infty\), the sample mean \(\bar{X}_n\) converges to the expected value \(p\):
$$
\bar{X}_n = \frac{1}{n} \sum_{i=1}^n X_i \xrightarrow{n \to \infty} p
$$
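To see this convergence numerically, one can watch the gap \(|\bar{X}_n - p|\) as \(n\) grows (a sketch; p = 0.7 and the sample sizes are arbitrary choices):

```python
import random

def sample_mean(n, p, rng):
    """Sample mean of n independent Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if rng.random() < p) / n

rng = random.Random(1)
p = 0.7  # illustrative value
for n in (10, 100, 1_000, 10_000, 100_000):
    mean = sample_mean(n, p, rng)
    # The gap is random for any fixed n, but it tends to shrink as n grows.
    print(f"n={n:6d}  sample mean={mean:.4f}  |mean - p|={abs(mean - p):.4f}")
```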
In both homework projects, this principle appears either as a visual convergence (in simulations) or as an algebraic property (in probability formulas).
The LLN is thus the bridge connecting empirical randomness with theoretical stability — just as the structure of Pascal’s triangle bridges the combinatorics of individual trials with the overall shape of the resulting probability distribution.