Cheatsheet

Probability Basics

  • Probability of an event occurring, \( P(E) = \frac{\text{Number of favorable outcomes}}{\text{Total number of outcomes}} \)

  • For any event E,

    • \[ P(E) \geq 0 \]
    • \[ P(\Omega) = 1 \]
    • \[ P(E_1 \cup E_2 \cup ...) = \sum_{i = 1}^{\infty}P(E_i) \text{ for mutually exclusive } E_i \]
    • \[P(E) = 1 - P(E')\]
  • Total number of outcomes = \(x^y\), where \(x\) is the number of possible outcomes per trial and \(y\) is the number of trials

  • Probability of an event not occurring, \( P(E') = 1 - P(E) \)

  • If F is a subset of E, then \( P(F) \leq P(E) \)

Partition of State Space \(\Omega\)

  • If \(B\subset \Omega\) and \(A_1,A_2,\cdots, A_N\) is a partition of \(\Omega\) then, \( B = (B \cap A_1) \cup (B \cap A_2) \cup \cdots \cup (B \cap A_N) \)

  • If \(A_i\cap A_j= \emptyset\) for \(i \neq j\) then \( P(B) = \sum_{i=1}^{N} P(B \cap A_i) \)

  • Combining all these gives \( P(B) = \sum_{i=1}^{N} P(B|A_i)\,P(A_i) \)
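A numeric sketch of this decomposition over a partition (the three-supplier numbers are made up for illustration):

```python
# Partition example: a part comes from exactly one of three suppliers
# A_1, A_2, A_3 (a partition of Omega); B is the event "part is defective".
priors = [0.5, 0.3, 0.2]            # P(A_i); a partition, so these sum to 1
defect_given = [0.01, 0.02, 0.05]   # P(B | A_i), assumed numbers

# P(B) decomposes over the partition: sum of P(B | A_i) P(A_i).
p_defect = sum(p_ai * p_b for p_ai, p_b in zip(priors, defect_given))
assert abs(sum(priors) - 1.0) < 1e-12
assert abs(p_defect - 0.021) < 1e-12
```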

Chain Rule

\[ P(X_1, X_2, ..., X_n) = P(X_1 | X_2,...,X_n)P(X_2,...,X_n) \]
\[ P(A,B,C) = P(A|B,C)P(B,C) \]
\[ = P(A|B,C)P(B|C)P(C) = P(B|A,C)P(C|A)P(A) \]
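The chain rule can be verified by brute force on a small joint distribution; a sketch with made-up weights over three binary variables:

```python
from itertools import product

# A made-up joint distribution over three binary variables A, B, C.
weights = [1, 2, 3, 4, 5, 6, 7, 8]
total = sum(weights)
joint = {abc: w / total for abc, w in zip(product([0, 1], repeat=3), weights)}

def marg(**fixed):
    """Marginal probability of the assignments in `fixed` (keys a, b, c)."""
    keys = ('a', 'b', 'c')
    return sum(p for outcome, p in joint.items()
               if all(outcome[keys.index(k)] == v for k, v in fixed.items()))

# Chain rule: P(a, b, c) = P(a | b, c) * P(b | c) * P(c)
a, b, c = 1, 0, 1
lhs = joint[(a, b, c)]
rhs = (marg(a=a, b=b, c=c) / marg(b=b, c=c)) \
      * (marg(b=b, c=c) / marg(c=c)) \
      * marg(c=c)
assert abs(lhs - rhs) < 1e-12
```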

Odds

If probability \(P(E) = p\), then the odds in favor of E are \( \frac{p}{1-p} \), i.e. \(p : (1-p)\).

Union and Intersection

  • A and B (\(P(A \cap B)\)) is the set of outcomes in both A and B, which implies \(P(A \cap B) \leq P(A)\) and \(P(A \cap B) \leq P(B)\)
  • A or B (\(P(A \cup B)\)) is the set of outcomes in A or B (or both), which implies \(P(A \cup B) \geq P(A)\) and \(P(A \cup B) \geq P(B)\)
  • If Events A and B are mutually exclusive, \(P(A \cap B) = 0\)
  • If A and B are complementary events, \(P(A \cap B) = 0\) and \(P(A) + P(B) = 1\)
  • Addition Rule

    \[ P(A \ or \ B) = P(A) + P(B) - P(A \ and \ B) \]
    \[ P(A \cup B) = P(A) + P(B) - P(A \cap B) \]
    • For mutually exclusive events, \( P(A \cup B) = P(A) + P(B) \)
    • For any three events, \( P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(A \cap C) - P(B \cap C) + P(A \cap B \cap C) \)
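The addition rule can be checked with explicit outcome sets; a sketch on a single die roll (a hypothetical example, not from the text):

```python
# Addition rule: P(A or B) = P(A) + P(B) - P(A and B), events as sets.
omega = set(range(1, 7))   # one fair six-sided die
A = {2, 4, 6}              # even roll
B = {4, 5, 6}              # roll greater than 3

def P(event):
    return len(event) / len(omega)

# General case: the intersection is subtracted once (4/6 here).
assert abs(P(A | B) - (P(A) + P(B) - P(A & B))) < 1e-12

# Mutually exclusive events: the intersection term vanishes.
C = {1}                    # roll a 1 (disjoint from A)
assert P(A & C) == 0
assert abs(P(A | C) - (P(A) + P(C))) < 1e-12
```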

Independent and Dependent Events

  • Multiplication Rule

    • For independent events,
    \[ P(A \ and \ B) = P(A \cap B) = P(A) P(B) \]
    \[ P(A \cup B) = P(A) + P(B) - P(A) P(B) \]
    \[ P(X, Y|Z) = P(X|Z)P(Y|Z) \]
    • For more than two independent events
    \[ P(A_1 \cap A_2 \cap...\cap A_n) = P(A_1) \cdot P(A_2) \cdot ... \cdot P(A_n) \]
    • For dependent events,
    \[ P(A \ and \ B) = P(A \cap B) = P(A) P(B|A) \]
    \[ P(A \cup B) = P(A) + P(B) - P(A) P(B|A) \]
    • For any three dependent events, \( P(A \cap B \cap C) = P(A)\,P(B|A)\,P(C|A \cap B) \)
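Both cases of the multiplication rule can be checked by enumerating two dice rolls; a sketch (the specific events are assumptions for illustration):

```python
from itertools import product

# Enumerate all 36 ordered outcomes of two fair dice.
omega = list(product(range(1, 7), repeat=2))

def P(pred):
    return sum(1 for w in omega if pred(w)) / len(omega)

first_is_6 = lambda w: w[0] == 6          # first die shows 6
second_even = lambda w: w[1] % 2 == 0     # second die is even
sum_ge_10 = lambda w: w[0] + w[1] >= 10   # total at least 10

# Independent events: P(A and B) = P(A) P(B).
both = P(lambda w: first_is_6(w) and second_even(w))
assert abs(both - P(first_is_6) * P(second_even)) < 1e-12

# Dependent events: P(A and C) = P(A) P(C|A); P(A) P(C) does NOT match.
joint = P(lambda w: first_is_6(w) and sum_ge_10(w))   # 3/36
c_given_a = joint / P(first_is_6)                     # P(C|A) = 1/2
assert abs(joint - P(first_is_6) * c_given_a) < 1e-12
assert abs(joint - P(first_is_6) * P(sum_ge_10)) > 1e-3   # dependence
```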

Law of Total Probability

\[ P(Y) = P(Y \cap X) + P(Y \cap X') \]
\[ P(Y) = P(Y|X)P(X) + P(Y|X')P(X') \]

If \(X_1, X_2,...,X_n\) are mutually exclusive and exhaustive, then

\[ P(Y) = \sum_{i=1}^{n} P(Y|X_i)P(X_i) \]

For a continuous parameter \(\theta\) in the range \([a,b]\) and discrete random data \(x\),

\[ P(x) = \int_a^b P(x|\theta)\,f(\theta)\,d\theta \]

where \(f(\theta)\) is the prior density of \(\theta\).
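The continuous case can be checked numerically. A sketch assuming a uniform prior on \(\theta\) and a binomial likelihood, where the integral has the known closed form \(1/(n+1)\):

```python
from math import comb

# P(x) = integral over [0, 1] of P(x | theta) f(theta) d(theta), with
# f(theta) = 1 (uniform prior) and
# P(x | theta) = C(n, x) theta^x (1 - theta)^(n - x)  (binomial likelihood).
# This integral equals 1 / (n + 1) for every x in 0..n.
def marginal_p(x, n, steps=20_000):
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        theta = (i + 0.5) * h   # midpoint rule
        total += comb(n, x) * theta**x * (1 - theta)**(n - x) * h
    return total

n = 10
for x in range(n + 1):
    assert abs(marginal_p(x, n) - 1 / (n + 1)) < 1e-6
```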

Bayes' Theorem

\[ P(A|B) = \frac{P(A \cap B)}{P(B)} = \frac{P(B|A) P(A)}{P(B)} = \frac{P(B|A)P(A)}{P(B|A)P(A) + P(B|A')P(A')}\]
  • For any three events, \( P(A|B \cap C) = \frac{P(B|A \cap C)\,P(A|C)}{P(B|C)} \)

  • Bayesian Updating

    Posterior Probability = \(\frac{\text{Likelihood} \times \text{Prior}}{\text{Sum of products of Likelihood and Prior (also known as Normalizer)}}\)
    where Likelihood is P(Data|Hypothesis), i.e. P(B|A), and Posterior = P(Hypothesis|Data), i.e. P(A|B)
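A numeric sketch of one Bayesian update (the diagnostic-test numbers are assumptions for illustration):

```python
# Hypothetical diagnostic test: prevalence P(A) = 1%, sensitivity
# P(B|A) = 0.95, false-positive rate P(B|A') = 0.05; B = "test positive".
prior = 0.01
likelihood = 0.95        # P(Data | Hypothesis)
likelihood_alt = 0.05    # P(Data | alternative hypothesis)

# Normalizer = sum of (likelihood x prior) over the competing hypotheses.
normalizer = likelihood * prior + likelihood_alt * (1 - prior)
posterior = likelihood * prior / normalizer     # P(Hypothesis | Data)

# Despite the positive test, the posterior is only about 16%,
# because the prior is so small.
assert abs(posterior - 0.0095 / 0.059) < 1e-12
```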

Bayes Factor

\[ \frac{P(D|H)}{P(D|H')} \]

where D is the Data and H is the hypothesis.

Beta Distribution

Probability Density Function (PDF) of the Beta Distribution is given by

\[ f(x; \alpha, \beta) = \frac{x^{\alpha - 1}(1-x)^{\beta - 1}}{B(\alpha, \beta)}, \quad B(\alpha, \beta) = \frac{\Gamma(\alpha)\Gamma(\beta)}{\Gamma(\alpha + \beta)} \]

where \(x\) is the probability of success and \(\alpha\) and \(\beta\) are the prior numbers of successes and failures respectively
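A direct implementation sketch of the Beta PDF using Python's `math.gamma` for the normalizing Beta function:

```python
from math import gamma

# Beta PDF: f(x; a, b) = x^(a-1) (1-x)^(b-1) / B(a, b),
# where B(a, b) = Gamma(a) Gamma(b) / Gamma(a + b) normalizes the density.
def beta_pdf(x, a, b):
    norm = gamma(a) * gamma(b) / gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / norm

# Beta(1, 1) is the uniform distribution on [0, 1].
assert abs(beta_pdf(0.3, 1, 1) - 1.0) < 1e-12

# Any Beta PDF integrates to 1 (midpoint rule, Beta(2, 5) here).
steps = 100_000
area = sum(beta_pdf((i + 0.5) / steps, 2, 5) for i in range(steps)) / steps
assert abs(area - 1.0) < 1e-6
```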

Probability Distributions for Random Variables

Binomial Probability

\[ P(x) = \binom nx \, p^x (1-p)^{n-x} \]

where \(x\) is the exact number of successes we want, \(n\) is the number of independent trials, \(p\) is the probability of success on a single trial, and \(\binom nx\) is the combination \({}^{n}C_{x}\)

If \(X\) and \(Y\) are independent random variables and \(X \sim Bin(n,p)\) and \(Y \sim Bin(m,p)\), then \(X + Y \sim Bin(n+m,\,p)\)
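Both facts can be checked directly with `math.comb`; a short sketch:

```python
from math import comb

# Binomial PMF: P(X = x) = C(n, x) p^x (1 - p)^(n - x)
def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# P(exactly 3 heads in 5 fair coin flips) = C(5, 3) / 2^5 = 10/32.
assert abs(binom_pmf(3, 5, 0.5) - 10 / 32) < 1e-12

# If X ~ Bin(n, p) and Y ~ Bin(m, p) are independent, X + Y ~ Bin(n+m, p):
# the convolution of the two PMFs matches the Bin(n+m, p) PMF.
n, m, p, k = 4, 3, 0.3, 2
conv = sum(binom_pmf(i, n, p) * binom_pmf(k - i, m, p)
           for i in range(max(0, k - m), min(n, k) + 1))
assert abs(conv - binom_pmf(k, n + m, p)) < 1e-12
```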

Geometric Probability

\[ P(n) = (1-p)^{n-1}\,p \]

where \(n\) is the number of attempts required to get the first success and \(p\) is the probability of success.

The probability of success in less than \(n\) attempts is given by \( 1 - (1-p)^{n-1} \)

The probability of success in at most \(n\) attempts is given by \( 1 - (1-p)^{n} \)

The probability of success in more than \(n\) attempts is given by \( (1-p)^{n} \)

The probability of success in at least \(n\) attempts is given by \( (1-p)^{n-1} \)
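These tail formulas follow from summing the geometric PMF; a quick numeric check (the values of \(p\) and \(n\) are arbitrary):

```python
# Geometric PMF: P(first success on attempt n) = (1 - p)^(n - 1) p.
def geom_pmf(n, p):
    return (1 - p) ** (n - 1) * p

p, n = 0.25, 6
cdf = sum(geom_pmf(k, p) for k in range(1, n + 1))   # success in <= n attempts

assert abs(cdf - (1 - (1 - p) ** n)) < 1e-12                            # at most n
assert abs((1 - cdf) - (1 - p) ** n) < 1e-12                            # more than n
assert abs((cdf - geom_pmf(n, p)) - (1 - (1 - p) ** (n - 1))) < 1e-12   # less than n
assert abs((1 - cdf + geom_pmf(n, p)) - (1 - p) ** (n - 1)) < 1e-12     # at least n
```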

Poisson Probability

\[ P(x) = \frac{\lambda^x \cdot e^{-\lambda}}{x!} \]

where \(x\) is the number of events observed during any particular time interval of length \(t\), \(\lambda = \alpha t\), and \(\alpha\) is the rate of the event process, i.e. the expected number of events occurring in unit time

If \(n \to \infty\) and \(p \to 0\) in such a way that \(np\) approaches a value \(\lambda > 0\) (i.e. in any binomial experiment in which \(n\) is large and \(p\) is small), then \(b(x;n,p) \approx p(x; \lambda)\). This approximation can be safely applied when \(n \gt 50\) and \(np \lt 5\), taking \(\lambda = np\).

So we can express the Poisson probability of a binomial random variable as

\[ P(x) \approx \frac{(np)^x \cdot e^{-np}}{x!} \]
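The approximation can be checked numerically; a sketch comparing the exact binomial PMF with the Poisson PMF at \(n = 1000\), \(p = 0.002\) (arbitrary values satisfying the rule of thumb):

```python
from math import comb, exp, factorial

# Poisson PMF: P(x) = lambda^x e^(-lambda) / x!
def poisson_pmf(x, lam):
    return lam ** x * exp(-lam) / factorial(x)

# Exact binomial PMF for comparison.
def binom_pmf(x, n, p):
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 1000, 0.002      # n large, p small
lam = n * p             # lambda = np = 2

# The two PMFs agree closely point by point.
for x in range(8):
    assert abs(binom_pmf(x, n, p) - poisson_pmf(x, lam)) < 5e-3
```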