Monte Carlo simulation is a computational technique that uses random sampling to estimate numerical results for problems that are analytically intractable or too complex for closed-form solutions. Named after the Monte Carlo casino in Monaco, the method transforms deterministic mathematical problems into probabilistic experiments by generating thousands to millions of random scenarios and aggregating their outcomes. At its core, Monte Carlo relies on the Law of Large Numbers—as sample size increases, the sample mean converges to the expected value—making it invaluable across finance, physics, engineering, and data science. The key insight: rather than solving an equation directly, you simulate the randomness inherent in the system and let statistics reveal the answer.
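As a minimal illustration of the idea (not from the tables below): estimate π by sampling points uniformly in the unit square and counting how many land inside the quarter circle. The NumPy `default_rng` generator and the seed are illustrative choices.

```python
import numpy as np

# Estimate pi via random sampling: the fraction of uniform points in the
# unit square that fall inside the quarter circle approximates pi/4.
rng = np.random.default_rng(seed=0)  # seeded for reproducibility

n = 1_000_000
x = rng.random(n)
y = rng.random(n)
inside = (x**2 + y**2) <= 1.0

pi_estimate = 4.0 * inside.mean()  # area ratio times 4
print(pi_estimate)  # close to 3.14159; error shrinks like O(n^{-1/2})
```

By the Law of Large Numbers, `inside.mean()` converges to the true area π/4 as `n` grows; quadrupling `n` roughly halves the error.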
What This Cheat Sheet Covers
This cheat sheet spans 10 focused tables and 60 indexed concepts. Below is a complete table-by-table outline, from foundational concepts through advanced details.
Table 1: Random Sampling Fundamentals
| Technique | Example | Description |
|---|---|---|
| Inverse Transform Sampling | `u = rng.random()`<br>`x = F_inv(u)` | Generates samples from any distribution by applying the inverse CDF F^{-1} to uniform random numbers; works because F(X) \sim \text{Unif}(0,1) for any continuous random variable X. |
| Rejection Sampling | `x = q.rvs()`<br>`u = rng.random()`<br>`if u <= f(x)/(M*q(x)): accept` | Samples from target density f(x) using proposal q(x) with envelope M \cdot q(x) \geq f(x); accepts each draw with probability \frac{f(x)}{M \cdot q(x)}. |
| Basic Monte Carlo Estimation | `samples = [f(x) for x in X]`<br>`estimate = mean(samples)` | Estimates \mathbb{E}[f(X)] by averaging f(X_i) over N independent samples; the simplest Monte Carlo estimator, with convergence rate O(N^{-1/2}). |
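The first two rows can be sketched concretely. As worked examples (the target distributions are illustrative choices, not part of the table): inverse transform sampling applied to an Exponential(λ) target, whose CDF inverts in closed form, and rejection sampling applied to a Beta(2, 2) target with a uniform proposal.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# --- Inverse transform sampling ---
# Example target: Exponential(lam) with CDF F(x) = 1 - exp(-lam*x),
# so F^{-1}(u) = -ln(1 - u) / lam.
lam = 2.0
u = rng.random(100_000)
exp_samples = -np.log(1.0 - u) / lam
print(exp_samples.mean())  # near the true mean 1/lam = 0.5

# --- Rejection sampling ---
# Example target: Beta(2, 2) density f(x) = 6 x (1 - x) on [0, 1].
# Proposal: Uniform(0, 1), i.e. q(x) = 1. Envelope constant M = 1.5
# (the maximum of f, attained at x = 0.5), so M * q(x) >= f(x) everywhere.
def f(x):
    return 6.0 * x * (1.0 - x)

M = 1.5
x = rng.random(100_000)        # draws from the proposal q
u = rng.random(100_000)
accepted = x[u <= f(x) / M]    # accept with probability f(x) / (M * q(x))
print(accepted.mean())         # near the Beta(2, 2) mean of 0.5
```

The rejection sampler accepts a fraction of roughly 1/M of its draws, so a tighter envelope (smaller M) wastes fewer proposals.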