Probability – The Science of Uncertainty and Data

Overview

The world is full of uncertainty: accidents, storms, unruly financial markets, noisy communications. The world is also full of data. Probabilistic modeling and the related field of statistical inference are the keys to analyzing data and making scientifically sound predictions.

Probabilistic models use the language of mathematics. But instead of relying on the traditional “theorem-proof” format, we develop the material in an intuitive, yet rigorous and mathematically precise, manner. Furthermore, while the applications are multiple and evident, we emphasize the basic concepts and methodologies that are universally applicable.

The course covers all of the basic probability concepts, including:

  • multiple discrete or continuous random variables, expectations, and conditional distributions
  • laws of large numbers
  • the main tools of Bayesian inference
  • an introduction to random processes (Poisson processes and Markov chains)

The contents of this course are heavily based on the corresponding MIT class, Introduction to Probability, which has been offered and continuously refined for more than 50 years. It is a challenging class, but it will enable you to apply the tools of probability theory to real-world applications or to your research.

This course is part of the MITx MicroMasters Program in Statistics and Data Science. Master the skills needed to be an informed and effective practitioner of data science. You will complete this course and three others from MITx, at a pace and level of rigor similar to an on-campus MIT course, and then take a virtually proctored exam to earn your MicroMasters credential. This academic credential demonstrates your proficiency in data science and can accelerate your path toward an MIT PhD or a Master’s degree at other universities. To learn more about this program, please visit https://micromasters.mit.edu/ds/.

Syllabus

Unit 1: Probability models and axioms

  • Probability models and axioms
  • Mathematical background: Sets; sequences, limits, and series; (un)countable sets.

Unit 2: Conditioning and independence

  • Conditioning and Bayes’ rule
  • Independence
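
As a quick, self-contained illustration of Bayes’ rule from this unit (the test-accuracy numbers below are invented for the example, not taken from the course), a few lines of Python are enough to carry out the update:

    # Hypothetical numbers: a test for a condition that affects 1% of a population.
    p_condition = 0.01          # prior P(condition)
    p_pos_given_cond = 0.95     # sensitivity P(positive | condition)
    p_pos_given_no_cond = 0.05  # false-positive rate P(positive | no condition)

    # Total probability: P(positive)
    p_pos = (p_pos_given_cond * p_condition
             + p_pos_given_no_cond * (1 - p_condition))

    # Bayes' rule: P(condition | positive)
    p_cond_given_pos = p_pos_given_cond * p_condition / p_pos
    print(f"P(condition | positive test) = {p_cond_given_pos:.3f}")  # about 0.161

Even with an accurate test, the posterior probability stays modest because the condition is rare in the prior.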

Unit 3: Counting

  • Counting
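
As a small sketch of the counting topic (the committee scenario is made up for the example), the binomial coefficient “n choose k” can be computed from the formula and checked by brute-force enumeration:

    import itertools
    import math

    # Number of ways to choose a 3-person committee from 10 people:
    # "n choose k" = n! / (k! (n-k)!)
    n, k = 10, 3
    by_formula = math.comb(n, k)

    # Brute-force check: enumerate every k-element subset.
    by_enumeration = sum(1 for _ in itertools.combinations(range(n), k))

    print(by_formula, by_enumeration)  # 120 120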

Unit 4: Discrete random variables

  • Probability mass functions and expectations
  • Variance; Conditioning on an event; Multiple random variables
  • Conditioning on a random variable; Independence of random variables

Unit 5: Continuous random variables

  • Probability density functions
  • Conditioning on an event; Multiple random variables
  • Conditioning on a random variable; Independence; Bayes’ rule
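
To illustrate probability density functions (the exponential density and interval here are chosen only for the example), a probability is an integral of the density; a crude numerical integration matches the closed form:

    import math

    # Exponential density with parameter lam (chosen for the example):
    # f(x) = lam * exp(-lam * x) for x >= 0.
    lam = 1.5

    def pdf(x):
        return lam * math.exp(-lam * x)

    # P(a <= X <= b) = integral of f over [a, b]; approximate with a Riemann sum.
    a, b, steps = 0.5, 2.0, 100000
    dx = (b - a) / steps
    approx = sum(pdf(a + (i + 0.5) * dx) for i in range(steps)) * dx

    # Closed form for the exponential: P(a <= X <= b) = exp(-lam*a) - exp(-lam*b).
    exact = math.exp(-lam * a) - math.exp(-lam * b)
    print(round(approx, 6), round(exact, 6))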

Unit 6: Further topics on random variables

  • Derived distributions
  • Sums of independent random variables; Covariance and correlation
  • Conditional expectation and variance revisited; Sum of a random number of independent random variables
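
As a sketch of sums of independent random variables and covariance (the distributions and sample size are arbitrary choices for this example), independent samples have covariance near zero, so the variance of the sum is close to the sum of the variances:

    import random
    import statistics

    random.seed(0)

    # Independent samples (distributions chosen arbitrarily for the example).
    n = 100000
    xs = [random.uniform(0, 1) for _ in range(n)]
    ys = [random.gauss(0, 2) for _ in range(n)]
    sums = [x + y for x, y in zip(xs, ys)]

    # Sample covariance: average of (X - E[X])(Y - E[Y]); near 0 when X, Y independent.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)

    print(round(cov, 4))                                 # close to 0
    print(round(statistics.variance(sums), 3),           # var(X + Y)
          round(statistics.variance(xs) + statistics.variance(ys), 3))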

Unit 7: Bayesian inference

  • Introduction to Bayesian inference
  • Linear models with normal noise
  • Least mean squares (LMS) estimation
  • Linear least mean squares (LLMS) estimation
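
As a sketch of the linear-normal model and LMS estimation from this unit (the prior and noise parameters are assumed for the example), with X = Θ + W and both Θ and W normal, the LMS estimate E[Θ | X = x] is a precision-weighted average of the prior mean and the observation:

    # Observation model (assumed for the example): X = Theta + W,
    # Theta ~ Normal(mu0, sigma0^2) prior, W ~ Normal(0, sigma_w^2) noise.
    mu0, sigma0 = 0.0, 2.0      # prior mean and standard deviation of Theta
    sigma_w = 1.0               # noise standard deviation
    x = 3.0                     # a single observed value of X

    # For this normal-normal model, the posterior of Theta given X = x is normal.
    # The LMS (minimum mean squared error) estimate is the posterior mean.
    prec0 = 1.0 / sigma0**2
    prec_w = 1.0 / sigma_w**2
    posterior_var = 1.0 / (prec0 + prec_w)
    lms_estimate = posterior_var * (prec0 * mu0 + prec_w * x)

    print(f"LMS estimate E[Theta | X = x] = {lms_estimate:.3f}")  # 2.400
    print(f"posterior variance = {posterior_var:.3f}")            # 0.800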

Unit 8: Limit theorems and classical statistics

  • Inequalities, convergence, and the Weak Law of Large Numbers
  • The Central Limit Theorem (CLT)
  • An introduction to classical statistics
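
To see the Weak Law of Large Numbers and the CLT numerically (this simulation is only an illustration, not part of the course materials), sample means of i.i.d. Uniform(0, 1) variables concentrate around 1/2, and their standardized versions have roughly zero mean and unit standard deviation:

    import random
    import statistics

    random.seed(0)

    # X_i i.i.d. Uniform(0, 1): mean 1/2, variance 1/12.
    def sample_mean(n):
        return sum(random.random() for _ in range(n)) / n

    # WLLN: the sample mean approaches 0.5 as n grows.
    for n in (10, 1000, 100000):
        print(n, round(sample_mean(n), 4))

    # CLT: standardized sample means (n = 30) over many trials
    # have roughly zero mean and unit standard deviation.
    n, trials = 30, 5000
    sigma = (1 / 12) ** 0.5
    z = [(sample_mean(n) - 0.5) / (sigma / n**0.5) for _ in range(trials)]
    print(round(statistics.mean(z), 3), round(statistics.stdev(z), 3))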

Unit 9: Bernoulli and Poisson processes

  • The Bernoulli process
  • The Poisson process
  • More on the Poisson process
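
As a sketch of the Poisson process (the rate and time horizon are arbitrary choices for this example), interarrival times of a rate-λ Poisson process are i.i.d. Exponential(λ), so the process can be simulated by accumulating exponential gaps:

    import random

    random.seed(0)
    rate = 2.0      # lambda: expected arrivals per unit time (assumed for the example)
    horizon = 10.0  # simulate arrivals in [0, horizon]

    # Interarrival times are i.i.d. Exponential(rate), so accumulate them.
    arrivals = []
    t = random.expovariate(rate)
    while t <= horizon:
        arrivals.append(t)
        t += random.expovariate(rate)

    # The number of arrivals in [0, horizon] is Poisson(rate * horizon),
    # so on average we expect about rate * horizon = 20 arrivals here.
    print(len(arrivals), [round(a, 2) for a in arrivals[:5]])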

Unit 10 (Optional): Markov chains

  • Finite-state Markov chains
  • Steady-state behavior of Markov chains
  • Absorption probabilities and expected time to absorption
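
For the steady-state topic (the two-state chain below is a made-up example), one way to approximate the steady-state probabilities is to multiply an initial distribution by the transition matrix until it stops changing:

    # A made-up two-state chain: state 0 = "sunny", state 1 = "rainy".
    # P[i][j] is the probability of moving from state i to state j.
    P = [[0.9, 0.1],
         [0.5, 0.5]]

    def step(dist, P):
        """One step of the chain: new_j = sum over i of dist_i * P[i][j]."""
        n = len(P)
        return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

    # Start from an arbitrary distribution and iterate until it converges.
    dist = [1.0, 0.0]
    for _ in range(200):
        dist = step(dist, P)

    print([round(p, 4) for p in dist])  # about [0.8333, 0.1667]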
