Law of Large Numbers
What is the Law of Large Numbers (LLN)?
The Law of Large Numbers (LLN) is a fundamental concept in probability and statistics.
It states that as the number of independent trials of a random experiment increases, the sample average of the outcomes will get closer and closer to the expected (theoretical) value.
How It Works
Suppose you have a random variable X with an expected value (mean) of μ.
If you perform n independent trials and calculate the average of these trials, the LLN guarantees that as n approaches infinity, the average will converge to μ.
In simpler terms: more trials = results closer to the true average.
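This convergence is easy to see in a quick simulation. The sketch below (an illustrative example, not part of the original text) rolls a fair six-sided die, whose expected value is μ = 3.5, and prints the sample mean for increasing numbers of trials:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def sample_mean(n):
    """Average of n independent rolls of a fair six-sided die (mu = 3.5)."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (10, 1_000, 100_000):
    print(f"n = {n:>7}: sample mean = {sample_mean(n):.4f}")
```

With a handful of rolls the mean can land well away from 3.5; by 100,000 rolls it sits very close to it, exactly as the LLN predicts.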
Simple Example
Imagine flipping a fair coin. The theoretical probability of getting heads is 0.5.
- If you flip the coin only 10 times, you might get 7 heads and 3 tails, a proportion of 0.7 heads, which is far from the expected 0.5.
- If you flip the coin 10,000 times, the proportion of heads will approach 0.5.
This demonstrates that variability in small samples is high, but it stabilizes as the sample size grows.
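The coin-flip example above can be reproduced in a few lines. This is a minimal sketch (the function name and sample sizes are illustrative) comparing the proportion of heads in a small sample versus a large one:

```python
import random

random.seed(0)  # reproducible run

def heads_proportion(n):
    """Fraction of heads in n flips of a fair coin."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

print(f"10 flips:     {heads_proportion(10):.2f}")      # can be far from 0.5
print(f"10,000 flips: {heads_proportion(10_000):.4f}")  # settles near 0.5
```

Running this repeatedly with different seeds shows the small-sample proportion bouncing around while the large-sample proportion stays pinned near 0.5.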
Applications of LLN
- Predicting long-term outcomes in games of chance, financial markets, or natural phenomena.
- Forming the basis for many statistical and machine learning techniques.
- Understanding the difference between short-term variability and long-term stability.
- Ensuring reliability in simulations and experiments by using sufficiently large sample sizes.
Mathematical Note
Formally, if X₁, X₂, ..., Xₙ are independent and identically distributed random variables with expected value μ, then the sample mean:
X̄ₙ = (X₁ + X₂ + ... + Xₙ) / n
converges to μ as n → ∞.
This can be stated in two forms: weak (convergence in probability) and strong (almost sure convergence).
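For reference, the two forms are standard textbook statements and can be written precisely as:

```latex
\text{Weak LLN:}\quad \forall \varepsilon > 0,\;
\lim_{n \to \infty} P\!\left(\left|\bar{X}_n - \mu\right| > \varepsilon\right) = 0

\text{Strong LLN:}\quad
P\!\left(\lim_{n \to \infty} \bar{X}_n = \mu\right) = 1
```

The weak form says large deviations of the sample mean from μ become arbitrarily improbable; the strong form says the sequence of sample means converges to μ on almost every realization.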