Learn the mathematics of chance and use it to draw precise conclusions about possible outcomes of uncertain events. Analyze real-world data using mathematically rigorous techniques.

1.1.1. Bayes' Theorem
1.1.2. Extending Bayes' Theorem
1.1.3. The Law of Total Probability (Extended)
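A quick numerical illustration of these opening topics (all numbers in this hypothetical two-machine example are invented): Bayes' theorem gives the posterior probability, with the denominator expanded by the law of total probability.

```python
# Hypothetical factory: machine A makes 60% of parts with a 2% defect rate,
# machine B makes 40% with a 5% defect rate.
# Bayes' theorem: P(A | defect) = P(defect | A) P(A) / P(defect),
# where P(defect) comes from the law of total probability.

p_a, p_b = 0.60, 0.40            # priors P(A), P(B)
p_def_a, p_def_b = 0.02, 0.05    # likelihoods P(defect | A), P(defect | B)

p_def = p_def_a * p_a + p_def_b * p_b    # law of total probability
p_a_given_def = p_def_a * p_a / p_def    # Bayes' theorem
```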

1.2.1. Probability Density Functions of Continuous Random Variables
1.2.2. Calculating Probabilities With Continuous Random Variables
1.2.3. Continuous Random Variables Over Infinite Domains
1.2.4. Cumulative Distribution Functions for Continuous Random Variables
1.2.5. Median, Quartiles and Percentiles of Continuous Random Variables
1.2.6. Finding the Mode of a Continuous Random Variable
1.2.7. Approximating Discrete Random Variables as Continuous
1.2.8. Simulating Random Observations
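Simulating random observations (the last topic above) is commonly done by inverse-transform sampling: if U is Uniform(0, 1) and F is an invertible CDF, then F⁻¹(U) has CDF F. A minimal sketch for the exponential distribution (the rate value is arbitrary):

```python
import math
import random

# Inverse-CDF sampling for Exponential(lam):
# F(x) = 1 - exp(-lam * x), so F^{-1}(u) = -ln(1 - u) / lam.

def exponential_sample(lam):
    u = random.random()
    return -math.log(1.0 - u) / lam

random.seed(0)
lam = 2.0
samples = [exponential_sample(lam) for _ in range(100_000)]
mean = sum(samples) / len(samples)   # should be close to 1 / lam = 0.5
```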

1.3.1. One-to-One Transformations of Discrete Random Variables
1.3.2. Many-to-One Transformations of Discrete Random Variables
1.3.3. The Distribution Function Method
1.3.4. The Change-of-Variables Method for Continuous Random Variables
1.3.5. The Distribution Function Method With Many-to-One Transformations

1.4.1. Convergence in Distribution
1.4.2. Convergence in Probability
1.4.3. Almost Sure Convergence

2.5.1. Expected Values of Discrete Random Variables
2.5.2. Variance of Discrete Random Variables
2.5.3. Properties of Expectation for Discrete Random Variables
2.5.4. Properties of Variance for Discrete Random Variables
2.5.5. Moments of Continuous Random Variables
2.5.6. Expected Values of Continuous Random Variables
2.5.7. Variance of Continuous Random Variables
2.5.8. The Rule of the Lazy Statistician
2.5.9. Skewness of Continuous Random Variables
2.5.10. Skewness of Discrete Random Variables
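The discrete-moment formulas in this section can be checked directly on a small pmf (the loaded-die probabilities below are invented for illustration):

```python
# Mean, variance, and skewness of a discrete random variable from its pmf.
pmf = {1: 0.10, 2: 0.10, 3: 0.15, 4: 0.15, 5: 0.20, 6: 0.30}

mean = sum(x * p for x, p in pmf.items())                # E[X]
var = sum((x - mean) ** 2 * p for x, p in pmf.items())   # E[(X - mu)^2]
skew = sum((x - mean) ** 3 * p for x, p in pmf.items()) / var ** 1.5
# Mass piled on the high faces pulls the left tail out, so skew is negative.
```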

2.6.1. Moment-Generating Functions
2.6.2. Calculating Moments Using Moment-Generating Functions
2.6.3. Calculating Variance and Standard Deviation Using Moment-Generating Functions
2.6.4. Identifying Discrete Distributions From Moment-Generating Functions
2.6.5. Identifying Continuous Distributions From Moment-Generating Functions
2.6.6. Properties of Moment-Generating Functions
2.6.7. Further Properties of Moment-Generating Functions
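The key fact of this section is that the n-th moment is the n-th derivative of the MGF at t = 0. A small numerical check for a Bernoulli(p) variable, whose MGF has the closed form M(t) = (1 − p) + p·eᵗ (the value of p is arbitrary):

```python
import math

p = 0.3

def M(t):
    # MGF of Bernoulli(p)
    return (1 - p) + p * math.exp(t)

# Approximate M'(0) and M''(0) with central finite differences.
h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)             # ~ E[X] = p
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # ~ E[X^2] = p (since X^2 = X)
var = m2 - m1 ** 2                        # ~ p * (1 - p) = 0.21
```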

3.7.1. The Discrete Uniform Distribution
3.7.2. Mean and Variance of Discrete Uniform Distributions
3.7.3. Modeling With Discrete Uniform Distributions

3.8.1. The Bernoulli Distribution
3.8.2. Modeling With the Bernoulli Distribution
3.8.3. Mean and Variance of the Bernoulli Distribution

3.9.1. The Binomial Distribution
3.9.2. Modeling With the Binomial Distribution
3.9.3. Mean and Variance of the Binomial Distribution
3.9.4. The Cumulative Distribution Function of the Binomial Distribution

3.10.1. The Poisson Distribution
3.10.2. Modeling With the Poisson Distribution
3.10.3. Mean and Variance of the Poisson Distribution
3.10.4. The Cumulative Distribution Function of the Poisson Distribution
3.10.5. The Poisson Approximation of the Binomial Distribution
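The Poisson approximation (topic 3.10.5) works when n is large and p is small, with λ = np. A sketch comparing the two pmfs pointwise (n and p below are chosen only to make the approximation regime visible):

```python
import math

def binom_pmf(k, n, p):
    return math.comb(n, k) * p ** k * (1 - p) ** (n - k)

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

n, p = 1000, 0.002          # large n, small p
lam = n * p                 # lam = 2.0
max_gap = max(abs(binom_pmf(k, n, p) - poisson_pmf(k, lam))
              for k in range(15))   # largest pointwise pmf discrepancy
```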

3.11.1. The Geometric Distribution
3.11.2. Modeling With the Geometric Distribution
3.11.3. Mean and Variance of the Geometric Distribution

3.12.1. The Negative Binomial Distribution
3.12.2. Modeling With the Negative Binomial Distribution
3.12.3. Mean and Variance of the Negative Binomial Distribution

3.13.1. The Hypergeometric Distribution
3.13.2. Modeling With the Hypergeometric Distribution

4.14.1. The Continuous Uniform Distribution
4.14.2. Mean and Variance of Continuous Uniform Distributions
4.14.3. Modeling With Continuous Uniform Distributions

4.15.1. The Standard Normal Distribution
4.15.2. Symmetry Properties of the Standard Normal Distribution
4.15.3. The Z-Score
4.15.4. The Normal Distribution
4.15.5. Mean and Variance of the Normal Distribution
4.15.6. Percentage Points of the Standard Normal Distribution
4.15.7. Modeling With the Normal Distribution
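The z-score and standard normal CDF topics above fit in a few lines, since Φ can be written with the error function: Φ(z) = (1 + erf(z/√2))/2 (the μ, σ, and x values below are hypothetical):

```python
import math

def phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 100.0, 15.0
x = 130.0
z = (x - mu) / sigma       # z-score: 2.0
p_below = phi(z)           # P(X <= 130) for X ~ Normal(100, 15^2), ~ 0.9772
```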

4.16.1. Normal Approximations of Binomial Distributions
4.16.2. The Normal Approximation of the Poisson Distribution

4.17.1. The Exponential Distribution
4.17.2. Modeling With the Exponential Distribution
4.17.3. Mean and Variance of the Exponential Distribution

4.18.1. The Chi-Square Distribution
4.18.2. Computing Chi-Square Probabilities From the Normal Distribution
4.18.3. The Student's T-Distribution

4.19.1. The Gamma Function
4.19.2. The Gamma Distribution
4.19.3. Modeling With the Gamma Distribution

5.20.1. Joint Distributions for Discrete Random Variables
5.20.2. The Joint CDF of Two Discrete Random Variables
5.20.3. Marginal Distributions for Discrete Random Variables
5.20.4. Independence of Discrete Random Variables
5.20.5. Conditional Distributions for Discrete Random Variables
5.20.6. The Trinomial Distribution
5.20.7. The Multinomial Distribution

5.21.1. Joint Distributions for Continuous Random Variables
5.21.2. Marginal Distributions for Continuous Random Variables
5.21.3. Independence of Continuous Random Variables
5.21.4. Conditional Distributions for Continuous Random Variables
5.21.5. The Joint CDF of Two Continuous Random Variables
5.21.6. Properties of the Joint CDF of Two Continuous Random Variables
5.21.7. The Bivariate Normal Distribution
5.21.8. The Multivariate Normal Distribution

5.22.1. Linear Combinations of Binomial Random Variables
5.22.2. Linear Combinations of Poisson Random Variables
5.22.3. Linear Combinations of Chi-Square Random Variables
5.22.4. Combining Two Normally Distributed Random Variables
5.22.5. Combining Multiple Normally Distributed Random Variables
5.22.6. I.I.D. Normal Random Variables

5.23.1. Expected Values of Sums and Products of Random Variables
5.23.2. Variance of Sums of Independent Random Variables
5.23.3. Computing Expected Values From Joint Distributions
5.23.4. Conditional Expectation for Discrete Random Variables
5.23.5. The Law of Iterated Expectations
5.23.6. Conditional Variance for Discrete Random Variables
5.23.7. The Law of Total Variance
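The two laws closing this section, E[X] = E[E[X | Y]] and Var(X) = E[Var(X | Y)] + Var(E[X | Y]), can be verified on a small joint pmf (the table below is invented for the check):

```python
# Hypothetical joint pmf p(x, y) for X, Y in {0, 1}.
joint = {(0, 0): 0.1, (1, 0): 0.3, (0, 1): 0.2, (1, 1): 0.4}

p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, 0.0) + p            # marginal pmf of Y

def cond_moment(y, k):
    # E[X^k | Y = y] computed from the joint pmf
    return sum(p * x ** k for (x, yy), p in joint.items() if yy == y) / p_y[y]

e_x = sum(p * x for (x, y), p in joint.items())        # direct E[X]
e_e = sum(p_y[y] * cond_moment(y, 1) for y in p_y)     # E[E[X | Y]]

var_x = sum(p * x ** 2 for (x, y), p in joint.items()) - e_x ** 2
e_cond_var = sum(p_y[y] * (cond_moment(y, 2) - cond_moment(y, 1) ** 2)
                 for y in p_y)                          # E[Var(X | Y)]
var_cond_mean = (sum(p_y[y] * cond_moment(y, 1) ** 2 for y in p_y)
                 - e_e ** 2)                            # Var(E[X | Y])
```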

5.24.1. The Covariance of Two Random Variables
5.24.2. Variance of Sums of Random Variables
5.24.3. The Covariance Matrix
5.24.4. The Correlation Coefficient for Two Random Variables
5.24.5. Interpreting the Correlation Coefficient
5.24.6. The Sample Covariance Matrix
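Sample covariance and the correlation coefficient, computed straight from their definitions with n − 1 in the denominator (the paired data below are invented; y is roughly 2x, so r should be near +1):

```python
import math

xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / (n - 1)
sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs) / (n - 1))
sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / (n - 1))
r = cov / (sd_x * sd_y)   # Pearson correlation coefficient
```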

5.25.1. The Change-of-Variables Method for Two Random Variables
5.25.2. The Beta Distribution
5.25.3. The F-Distribution

5.26.1. The Uniqueness Property of MGFs
5.26.2. MGFs of Linear Combinations of Random Variables

6.27.1. The Sample Mean
6.27.2. Statistics and Sampling Distributions
6.27.3. The Sample Variance
6.27.4. Variance of Sample Means
6.27.5. Sample Means From Normal Populations
6.27.6. The Central Limit Theorem
6.27.7. Sampling Proportions From Finite Populations
6.27.8. Point Estimates of Population Proportions
6.27.9. Finite Population Correction Factors
6.27.10. Applications of the Central Limit Theorem
6.27.11. Distributions of Sample Variances
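The Central Limit Theorem and the variance of sample means can be seen empirically: means of n i.i.d. Uniform(0, 1) draws have expectation 1/2 and variance (1/12)/n (the n and replication count below are arbitrary simulation settings):

```python
import random
import statistics

random.seed(42)
n, reps = 50, 20_000
means = [statistics.fmean(random.random() for _ in range(n))
         for _ in range(reps)]

grand_mean = statistics.fmean(means)       # ~ 0.5
var_of_means = statistics.variance(means)  # ~ (1/12)/50 ~ 0.00167
```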

6.28.1. Biased vs. Unbiased Estimators
6.28.2. Consistent Estimators

6.29.1. Spearman's Rank Correlation Coefficient
6.29.2. Multiple Regression

6.30.1. Estimating a Mean
6.30.2. Estimating a Proportion for a Large Population
6.30.3. Estimating a Proportion for a Small Population

6.31.1. The Method of Moments

6.32.1. Likelihood Functions for Discrete Probability Distributions
6.32.2. Log-Likelihood Functions for Discrete Probability Distributions
6.32.3. Likelihood Functions for Continuous Probability Distributions
6.32.4. Log-Likelihood Functions for Continuous Probability Distributions
6.32.5. Maximum Likelihood Estimation
6.32.6. Properties of Maximum Likelihood Estimators
6.32.7. Consistency of Maximum Likelihood Estimators
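Maximum likelihood estimation in miniature: for Bernoulli(p) data the log-likelihood is s·log p + (n − s)·log(1 − p) with s successes in n trials, maximized at p̂ = s/n. A grid search over p confirms the closed form (the 0/1 data below are invented):

```python
import math

data = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # hypothetical observations

def log_likelihood(p):
    s, n = sum(data), len(data)
    return s * math.log(p) + (n - s) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]       # p in (0, 1)
p_hat_grid = max(grid, key=log_likelihood)      # numeric maximizer
p_hat_closed = sum(data) / len(data)            # closed-form MLE: 0.7
```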

7.33.1. One-Tailed Hypothesis Tests
7.33.2. Two-Tailed Hypothesis Tests
7.33.3. Type I and Type II Errors in Hypothesis Testing
7.33.4. Hypothesis Tests For the Rate of a Poisson Distribution
7.33.5. Hypothesis Tests For the Rate of a Poisson Distribution Using Critical Regions
7.33.6. Hypothesis Tests For the Proportion of a Binomial Distribution
7.33.7. Hypothesis Tests For the Proportion of a Binomial Distribution Using Critical Regions
7.33.8. Hypothesis Tests for One Mean: Known Population Variance
7.33.9. Hypothesis Tests for One Mean: Unknown Population Variance
7.33.10. Hypothesis Tests Using the Poisson Approximation of the Binomial Distribution
7.33.11. Hypothesis Tests Using the Normal Approximation of the Binomial Distribution
7.33.12. Hypothesis Tests Using the Normal Approximation of the Poisson Distribution
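The one-mean test with known population variance (topic 7.33.8) reduces to a z-statistic and a normal tail probability. A two-tailed sketch (the null mean, σ, n, and observed mean below are all hypothetical):

```python
import math

def phi(z):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu_0, sigma, n = 50.0, 4.0, 25   # hypothetical null mean, known sd, sample size
x_bar = 52.0                     # hypothetical observed sample mean

z = (x_bar - mu_0) / (sigma / math.sqrt(n))   # (52 - 50) / (4/5) = 2.5
p_value = 2.0 * (1.0 - phi(abs(z)))           # two-tailed, ~ 0.0124
```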

7.34.1. The Size and Power of a Test
7.34.2. The Power Function
7.34.3. The Quality of Estimators

7.35.1. Hypothesis Tests for Two Means: Known Population Variances
7.35.2. Hypothesis Tests for Two Means: Equal But Unknown Population Variances
7.35.3. Hypothesis Tests for Two Means: Unequal and Unknown Population Variances
7.35.4. Hypothesis Tests for Differences in Proportions
7.35.5. Hypothesis Tests for Two Means: Paired-Sample Z-Test
7.35.6. Hypothesis Tests for Two Means: Paired-Sample T-Test
7.35.7. Hypothesis Testing With Correlation Coefficients

8.36.1. Confidence Intervals for One Mean: Known Population Variance
8.36.2. Confidence Intervals for One Mean: Unknown Population Variance
8.36.3. Confidence Intervals for Proportions
8.36.4. Confidence Intervals for Proportions With Finite Populations
8.36.5. Confidence Intervals for Variances
8.36.6. Confidence Intervals for Slope Parameters in Linear Regression
8.36.7. Confidence Intervals for Intercept Parameters in Linear Regression
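The first interval in this section, for one mean with known population variance, is x̄ ± z₍α/2₎ · σ/√n. A 95% sketch (the sample summary values are hypothetical; 1.959964 is the 97.5th percentile of the standard normal):

```python
import math

x_bar, sigma, n = 52.0, 4.0, 25   # hypothetical sample mean, known sd, size
z = 1.959964                      # z_{0.025} for a 95% interval

half_width = z * sigma / math.sqrt(n)
ci = (x_bar - half_width, x_bar + half_width)   # ~ (50.43, 53.57)
```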

8.37.1. Confidence Intervals for Two Means: Known Population Variances
8.37.2. Confidence Intervals for Two Means: Equal But Unknown Population Variances
8.37.3. Confidence Intervals for Two Means: Unequal and Unknown Population Variances
8.37.4. Confidence Intervals for Differences in Proportions
8.37.5. Confidence Intervals for Two Means: Paired-Sample Z-Test
8.37.6. Confidence Intervals for Two Means: Paired-Sample T-Test

9.38.1. Introduction to Goodness of Fit
9.38.2. Testing Discrete Uniform Distribution Models Using Goodness-of-Fit
9.38.3. Testing Binomial Distribution Models Using Goodness-of-Fit: Part One
9.38.4. Testing Binomial Distribution Models Using Goodness-of-Fit: Part Two
9.38.5. Testing Poisson Distribution Models Using Goodness-of-Fit
9.38.6. Testing Uniform Distribution Models Using Goodness-of-Fit
9.38.7. Testing Normal Distribution Models Using Goodness-of-Fit
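A goodness-of-fit test for a discrete uniform model in miniature: the chi-square statistic Σ(O − E)²/E compares observed category counts to the counts a fair die would produce (the observed counts below are invented):

```python
# Hypothetical counts from 120 die rolls; a fair die expects 20 per face.
observed = [18, 22, 16, 25, 17, 22]
n = sum(observed)        # 120
expected = n / 6         # expected count per face under the uniform model

chi_sq = sum((o - expected) ** 2 / expected for o in observed)

# The 5% critical value for 6 - 1 = 5 degrees of freedom is about 11.07;
# chi_sq ~ 3.1 here, so the uniform model is not rejected.
```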

9.39.1. Tests for Homogeneity Using Contingency Tables
9.39.2. Tests for Independence Using Contingency Tables

9.40.1. Order Statistics
9.40.2. Confidence Intervals for Quantiles and Percentiles
9.40.3. The Wilcoxon Tests
9.40.4. The Runs Test for Randomness
9.40.5. The Kolmogorov-Smirnov Goodness-of-Fit Test

10.41.1. Hypothesis Tests for One Variance
10.41.2. Hypothesis Tests for Two Variances
10.41.3. One-Factor Analysis of Variance
10.41.4. Two-Factor Analysis of Variance

11.42.1. Posterior Distributions Under the Non-Informative Prior
11.42.2. Posterior Distributions Under an Informative Prior
11.42.3. Maximum a Posteriori Estimation
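The Bayesian topics above come together in the Beta-Bernoulli conjugate pair: with prior Beta(a, b) and s successes in n trials, the posterior is Beta(a + s, b + n − s), whose mode gives the MAP estimate. A sketch under the non-informative uniform prior (the data counts are hypothetical):

```python
a, b = 1.0, 1.0    # Beta(1, 1): the non-informative (uniform) prior
s, n = 7, 10       # hypothetical data: 7 successes in 10 trials

a_post, b_post = a + s, b + n - s                 # posterior Beta(8, 4)
post_mean = a_post / (a_post + b_post)            # 8/12 ~ 0.667
map_est = (a_post - 1) / (a_post + b_post - 2)    # posterior mode: 0.7
# Under the uniform prior the MAP estimate coincides with the MLE s/n.
```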