
Probability Formula Sheet

This page will help you quickly revise the formulas and concepts of probability for various exams.

Probability is a mathematical concept that quantifies the likelihood or chance of an event occurring, expressed as a number between 0 and 1, where 0 indicates impossibility and 1 indicates certainty.

Neetesh Kumar | June 01, 2024

1. Some Basic Terms and Concepts:

(a) An Experiment: An action or operation resulting in two or more outcomes is called an experiment.

(b) Sample Space: The set of all possible outcomes of an experiment is called the sample space, denoted by S. An element of S is called a sample point.

(c) Event: Any subset of the sample space is an event.

(d) Simple Event: An event is simple if it is a singleton subset of the sample space S.

(e) Compound Events: The joint occurrence of two or more simple events.

(f) Equally Likely Events: Several simple events are equally likely if there is no reason for one event to occur in preference to any other event.

(g) Exhaustive Events: All the possible outcomes taken together in which an experiment can result are said to be exhaustive.

(h) Mutually Exclusive or Disjoint Events: If two events cannot occur simultaneously, they are mutually exclusive. If A and B are mutually exclusive, then A \cap B = \emptyset.

(i) Complement of an Event: The complement of an event A, denoted by \bar{A}, A', or A^C, is the set of all sample points of the space other than the sample points in A.
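As a quick illustration of these terms, here is a minimal Python sketch; the single roll of a fair six-sided die and the particular events are illustrative assumptions, not part of the formula sheet.

```python
# Sample space for one roll of a fair six-sided die (illustrative)
S = {1, 2, 3, 4, 5, 6}

A = {2}               # simple event: "roll a 2" (singleton subset of S)
B = {2, 4, 6}         # compound event: "roll an even number"
B_complement = S - B  # complement of B: {1, 3, 5}

# A and B_complement are mutually exclusive (disjoint)
print(A & B_complement)  # set() -> empty intersection
```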

2. Mathematical Definition of Probability:

Let the outcomes of an experiment consist of n exhaustive, mutually exclusive, and equally likely cases. Then the sample space S has n sample points. If an event A consists of m sample points (0 \leq m \leq n), then the probability of event A, denoted by P(A), is defined to be \frac{m}{n}, i.e. P(A) = \frac{m}{n}.

Let S = \{ a_1, a_2, \ldots, a_n \} be the sample space. Then:

(a) P(S) = \frac{n}{n} = 1, corresponding to the certain event.

(b) P(\emptyset) = \frac{0}{n} = 0, corresponding to the null (impossible) event \emptyset.

(c) If A_i = \{ a_i \}, i = 1, \ldots, n, then A_i is the event corresponding to the single sample point a_i, and P(A_i) = \frac{1}{n}.

(d) 0 \leq P(A) \leq 1
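For example, the definition P(A) = \frac{m}{n} can be applied directly by counting sample points; the sketch below (assuming one roll of a fair die, an illustrative choice) computes the probability of rolling an even number.

```python
# P(A) = m / n with equally likely outcomes (illustrative die roll)
S = {1, 2, 3, 4, 5, 6}              # n = 6 sample points
A = {x for x in S if x % 2 == 0}    # event "even number", m = 3

P_A = len(A) / len(S)
print(P_A)  # 0.5
```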

3. Odds Against and Odds in Favour of an Event:

Let there be m + n equally likely, mutually exclusive, and exhaustive cases, out of which an event A occurs in m cases and does not occur in n cases.
Then by definition, the probability of the occurrence of event A = P(A) = \frac{m}{m+n}

The probability of non-occurrence of event A = P(A') = \frac{n}{m+n}

P(A) : P(A') = m : n

Thus, the odds in favor of the occurrence of event A are defined by m : n, i.e. P(A) : P(A'), and the odds against the occurrence of event A are defined by n : m, i.e. P(A') : P(A).
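As a worked example (the deck-of-cards setup is an illustrative assumption), the odds in favour of drawing an ace from a standard 52-card pack follow directly from m and n:

```python
# Odds for drawing an ace from a standard 52-card deck (illustrative)
m = 4        # cases in which the event occurs (the 4 aces)
n = 52 - 4   # cases in which it does not occur

P_A = m / (m + n)         # probability of occurrence = 4/52 = 1/13
odds_in_favour = (m, n)   # 4 : 48, i.e. 1 : 12
odds_against = (n, m)     # 48 : 4, i.e. 12 : 1
print(P_A, odds_in_favour, odds_against)
```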

4. Addition Theorem:

(a) If A and B are any events in S, then P(A \cup B) = P(A) + P(B) - P(A \cap B)
Since the probability of an event is a non-negative number, it follows that
P(A \cup B) \leq P(A) + P(B)

For three events A, B, and C in S, we have
P(A \cup B \cup C) = P(A) + P(B) + P(C) - P(A \cap B) - P(B \cap C) - P(C \cap A) + P(A \cap B \cap C)

The general form of addition theorem (Principle of Inclusion-Exclusion):

For n events A_1, A_2, A_3, \ldots, A_n in S, we have

P(A_1 \cup A_2 \cup A_3 \cup \ldots \cup A_n) = \sum_{i} P(A_i) - \sum_{i < j} P(A_i \cap A_j) + \sum_{i < j < k} P(A_i \cap A_j \cap A_k) - \ldots + (-1)^{n-1} P(A_1 \cap A_2 \cap \ldots \cap A_n)

(b) If A and B are mutually exclusive, then P(A \cap B) = 0, so that
P(A \cup B) = P(A) + P(B).
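The inclusion-exclusion formula can be verified numerically by brute force; the sketch below does this for three illustrative events on a single die roll (the events chosen are assumptions made up for the example).

```python
from itertools import combinations

# Brute-force check of inclusion-exclusion on one die roll (illustrative events)
S = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3}   # "at most 3"
B = {2, 4, 6}   # "even"
C = {1, 3, 5}   # "odd"

P = lambda E: len(E) / len(S)
events = [A, B, C]

lhs = P(A | B | C)
rhs = sum((-1) ** (k + 1) * sum(P(set.intersection(*combo))
                                for combo in combinations(events, k))
          for k in range(1, len(events) + 1))
print(lhs, rhs)  # both 1.0 here, since A, B, C together cover S
```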

5. Conditional Probability:

If A and B are any events in S, then the conditional probability of B relative to A, i.e. the probability of occurrence of B when A has occurred, is given by

P(\frac{B}{A}) = \frac{P(B \cap A)}{P(A)} \text{, if } P(A) \neq 0
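A short numerical example (the die-roll events are illustrative assumptions): among the even outcomes of a die, the chance of exceeding 3 is 2/3, which is exactly P(B \cap A)/P(A).

```python
# Conditional probability on one die roll (illustrative events)
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # "even"
B = {4, 5, 6}   # "greater than 3"

P = lambda E: len(E) / len(S)

P_B_given_A = P(B & A) / P(A)   # P(B/A) = P(B ∩ A) / P(A)
print(P_B_given_A)              # 0.666...: of {2, 4, 6}, two outcomes exceed 3
```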

6. Multiplication Theorem:

  • Independent Events:
    If A and B are two independent events, then the occurrence of B has no effect on the probability of A (and vice versa).

  • When events are independent:
    P(\frac{A}{B}) = P(A) \text{ and } P(\frac{B}{A}) = P(B)
    Then P(A \cap B) = P(A) \cdot P(B), i.e. P(AB) = P(A) \cdot P(B)

  • When events are not independent:
    The probability of the simultaneous happening of two events A and B is equal to the
    probability of A multiplied by the conditional probability of B with respect to A
    (or the probability of B multiplied by the conditional probability of A with respect to B), i.e.
    P(A \cap B) = P(A) \cdot P(\frac{B}{A}) \text{ or } P(B) \cdot P(\frac{A}{B})
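A standard worked example (the card-drawing setup is an illustrative assumption): drawing two aces in a row without replacement uses the dependent form, while drawing with replacement uses the independent form.

```python
# Multiplication theorem: drawing two cards (illustrative)
# A = "first card is an ace", B = "second card is an ace"

# Without replacement the events are dependent: P(A ∩ B) = P(A) · P(B/A)
P_A = 4 / 52
P_B_given_A = 3 / 51            # one ace already removed from the pack
print(P_A * P_B_given_A)        # 1/221 ≈ 0.0045

# With replacement the draws are independent: P(A ∩ B) = P(A) · P(B)
print((4 / 52) * (4 / 52))      # 1/169 ≈ 0.0059
```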

7. Total Probability Theorem:

Let an event A of an experiment occur in conjunction with n mutually exclusive and exhaustive events B_1, B_2, B_3, \ldots, B_n.
The total probability of the occurrence of event A is:
P(A) = P(AB_1) + P(AB_2) + \ldots + P(AB_n) = \sum_{i=1}^{n} P(AB_i)
Thus,
P(A) = P(B_1) P(\frac{A}{B_1}) + P(B_2) P(\frac{A}{B_2}) + \ldots + P(B_n) P(\frac{A}{B_n}) = \sum_{i=1}^{n} P(B_i) P(\frac{A}{B_i})
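The sketch below applies the theorem to an illustrative two-urn setup; the urn contents and selection probabilities are assumptions made up for the example.

```python
# Total probability: a white ball drawn from one of two urns chosen at random
# B1 = "urn 1 chosen", B2 = "urn 2 chosen"; A = "a white ball is drawn"
P_B = [0.5, 0.5]              # P(B1), P(B2): mutually exclusive and exhaustive
P_A_given_B = [3 / 5, 1 / 4]  # urn 1 holds 3 white of 5 balls; urn 2 holds 1 of 4

P_A = sum(pb * pa for pb, pa in zip(P_B, P_A_given_B))
print(P_A)                    # 0.5*0.6 + 0.5*0.25 = 0.425
```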

8. Bayes' Theorem or Inverse Probability:

Let A_1, A_2, \ldots, A_n be n mutually exclusive and exhaustive events of the sample space S,
and let A be an event that can occur with any of the events A_i. Then

P(\frac{A_i}{A}) = \frac{P(A_i)P(\frac{A}{A_i})}{\sum_{j=1}^{n} P(A_j)P(\frac{A}{A_j})}
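Continuing the same illustrative two-urn setup from the previous sketch, Bayes' theorem gives the posterior probability of each urn once a white ball has been observed.

```python
# Bayes' theorem: which urn did the white ball most likely come from? (illustrative)
P_B = [0.5, 0.5]              # prior probabilities P(B1), P(B2)
P_A_given_B = [3 / 5, 1 / 4]  # likelihoods P(A/B1), P(A/B2)

total = sum(pb * pa for pb, pa in zip(P_B, P_A_given_B))          # P(A)
posterior = [pb * pa / total for pb, pa in zip(P_B, P_A_given_B)] # P(Bi/A)
print(posterior)  # [≈0.706, ≈0.294]: urn 1 is the more likely source
```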

9. Binomial Distribution for Repeated Trials:

Binomial Experiment:
Any experiment with only two outcomes is known as a binomial experiment.
The outcomes of such an experiment are known as success and failure.
The probability of success is denoted by p, and the probability of failure is denoted by q. Then
p + q = 1

If the binomial experiment is repeated n times, then

(a) Probability of exactly r successes in n trials: P(X = r) = \binom{n}{r} p^r q^{n-r}

(b) Probability of at most r successes in n trials: P(X \leq r) = \displaystyle\sum_{l=0}^r \binom{n}{l} p^l q^{n-l}

(c) Probability of at least r successes in n trials: P(X \geq r) = \displaystyle\sum_{l=r}^n \binom{n}{l} p^l q^{n-l}

(d) Probability of having the first success at the r-th trial: P(X = r) = p q^{r-1}

The binomial distribution's mean, variance, and standard deviation are np, npq, and \sqrt{npq} respectively.

Note:
(p + q)^n = \binom{n}{0} q^n + \binom{n}{1} pq^{n-1} + \ldots + \binom{n}{r} p^r q^{n-r} + \ldots + \binom{n}{n} p^n
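These binomial formulas can be evaluated with Python's standard library; the sketch below (n = 10 tosses of a fair coin, an illustrative choice) computes the point, cumulative, and tail probabilities together with the mean, variance, and standard deviation.

```python
from math import comb, sqrt

# Binomial probabilities for n = 10 tosses of a fair coin (illustrative)
n, p = 10, 0.5
q = 1 - p

P_exactly = lambda r: comb(n, r) * p**r * q**(n - r)

print(P_exactly(4))                               # P(X = 4)  ≈ 0.2051
print(sum(P_exactly(l) for l in range(0, 5)))     # P(X <= 4) ≈ 0.3770
print(sum(P_exactly(l) for l in range(4, n + 1))) # P(X >= 4) ≈ 0.8281
print(n * p, n * p * q, sqrt(n * p * q))          # mean, variance, std. deviation
```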

10. Poisson Distribution:

The limiting form of the binomial distribution under the following conditions is known as the Poisson distribution:

(a) The number of trials is indefinitely large, i.e. n \rightarrow \infty.

(b) The probability of success in each trial is indefinitely small, i.e. p \rightarrow 0.

(c) np = \lambda, a finite positive number.

Then, the Poisson distribution is given by: P(X = r) = \frac{e^{-\lambda} \lambda^r}{r!}

The Poisson distribution's mean, variance, and standard deviation are \lambda, \lambda, and \sqrt{\lambda} respectively.
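The limiting behaviour can be seen numerically: for large n and small p the binomial probabilities are close to the Poisson ones with \lambda = np. The values n = 1000 and p = 0.002 below are illustrative assumptions.

```python
from math import exp, factorial, comb

# Poisson approximation to a binomial with large n and small p (illustrative)
n, p = 1000, 0.002
lam = n * p                                          # λ = np = 2

poisson = lambda r: exp(-lam) * lam**r / factorial(r)
binomial = lambda r: comb(n, r) * p**r * (1 - p)**(n - r)

for r in range(4):
    print(r, round(binomial(r), 4), round(poisson(r), 4))  # values nearly agree
```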

11. Continuous Probability Distributions:

(a) Uniform Distribution: A continuous random variable X is said to follow a uniform distribution over (a, b) if its probability density function is given by

f(x) = \frac{1}{b-a} \text{ for } a \leq x \leq b \text{ and } f(x) = 0 \text{ elsewhere.}

The mean and variance of the uniform distribution are \frac{a+b}{2} and \frac{(b-a)^2}{12} respectively.

(b) Normal Distribution: A continuous random variable X is said to follow a normal distribution with parameters \mu and \sigma^2 if its probability density function is given by

f(x) = \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{(x-\mu)^2}{2\sigma^2}} \text{ for } -\infty < x < \infty.

The normal distribution's mean, variance, and standard deviation are \mu, \sigma^2, and \sigma respectively.

  • A Probability Distribution spells out how a total probability of 1 is distributed over several values of a random variable.
  • If p represents a person's chance of success in any venture and M is the sum of money they will receive in case of success, then their expectation or probable value = pM
  • Mean of Binomial Probability Distribution (BPD) = np ; variance of BPD = npq.
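The stated means and variances can be checked empirically by simulation; the sketch below does this for a uniform distribution on an illustrative interval (a, b) = (2, 8) and evaluates the normal density at its mean for an illustrative \sigma.

```python
import random
from math import sqrt, pi, exp
from statistics import mean, pvariance

# Empirical check of the uniform-distribution formulas (illustrative simulation)
a, b = 2.0, 8.0
samples = [random.uniform(a, b) for _ in range(100_000)]
print(mean(samples), (a + b) / 2)             # both ≈ 5.0
print(pvariance(samples), (b - a) ** 2 / 12)  # both ≈ 3.0

# Normal density evaluated at its mean: f(μ) = 1 / (σ√(2π))
mu, sigma = 0.0, 1.5
f = lambda x: exp(-((x - mu) ** 2) / (2 * sigma**2)) / (sigma * sqrt(2 * pi))
print(f(mu), 1 / (sigma * sqrt(2 * pi)))      # both ≈ 0.2660
```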

12. Standard Normal Distribution:

A normal distribution with a mean of 0 and a standard deviation of 1 is called a standard normal distribution. Its probability density function is given by
\phi(x) = \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}} \text{ for } -\infty < x < \infty.

If X follows a normal distribution with mean \mu and standard deviation \sigma,
then the random variable Z = \frac{X - \mu}{\sigma} follows a standard normal distribution.
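A short example of standardising (the mean 100, standard deviation 15, and x = 130 are illustrative assumptions); the standard normal CDF \Phi is expressed through math.erf.

```python
from math import sqrt, erf

# Standardising X ~ N(μ, σ²) and using the standard normal distribution (illustrative)
mu, sigma = 100.0, 15.0
x = 130.0
z = (x - mu) / sigma                 # Z = (X - μ)/σ = 2.0

Phi = lambda t: 0.5 * (1 + erf(t / sqrt(2)))   # standard normal CDF
print(z, Phi(z))                     # 2.0, ≈ 0.9772 -> P(X <= 130) ≈ 97.7%
```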

13. Exponential Distribution:

A continuous random variable X is said to follow an exponential distribution with parameter \lambda if its probability density function is given by
f(x) = \lambda e^{-\lambda x} \text{ for } x \geq 0 \text{ and } f(x) = 0 \text{ for } x < 0.

The mean and variance of the exponential distribution are \frac{1}{\lambda} and \frac{1}{\lambda^2} respectively.
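A simulation-based check of these two formulas (the rate \lambda = 0.5 is an illustrative assumption):

```python
import random
from statistics import mean, pvariance

# Empirical check of the exponential-distribution formulas (illustrative simulation)
lam = 0.5
samples = [random.expovariate(lam) for _ in range(100_000)]

print(mean(samples), 1 / lam)          # both ≈ 2.0
print(pvariance(samples), 1 / lam**2)  # both ≈ 4.0
```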

14. Some Important Points to Note:

(a) Let A and B be two events. Then (a few of these identities are spot-checked in the sketch after this list):

  • P(A) + P(\bar{A}) = 1
  • P(A+B) = 1 - P(\bar{A}\bar{B})
  • P(\frac{A}{B}) = \frac{P(AB)}{P(B)}
  • P(A+B) = P(AB) + P(\bar{A}B) + P(A\bar{B})
  • A \subset B \implies P(A) \le P(B)
  • P(\bar{A}B) = P(B) - P(AB)
  • P(AB) \le \min\{P(A), P(B)\} \le \max\{P(A), P(B)\} \le P(A+B) \le P(A) + P(B)
  • P(AB) = P(A) + P(B) - P(A+B)
  • P(Exactly one of A, B) = P(\bar{A}B) + P(A\bar{B}) = P(A) + P(B) - 2P(AB) = P(A+B) - P(AB)
  • P(Neither A nor B) = P(\bar{A}\bar{B}) = 1 - P(A+B)
  • P(\bar{A} + \bar{B}) = 1 - P(AB)
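A minimal Python spot-check of a few of the identities above, using two illustrative events on one die roll (the specific events are assumptions made up for the example):

```python
# Spot-check of a few identities on one die roll (illustrative events)
S = {1, 2, 3, 4, 5, 6}
A = {1, 2, 3, 4}   # "at most 4"
B = {3, 4, 5}      # "between 3 and 5"

P = lambda E: len(E) / len(S)
A_bar, B_bar = S - A, S - B

print(P(A | B), P(A & B) + P(A_bar & B) + P(A & B_bar))         # P(A+B) decomposition
print(P(A & B_bar) + P(A_bar & B), P(A) + P(B) - 2 * P(A & B))  # exactly one event
print(P(A_bar & B_bar), 1 - P(A | B))                           # neither A nor B
```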

(b) Number of exhaustive cases of tossing n coins simultaneously (or of tossing a coin n times) = 2^n

(c) Number of exhaustive cases of throwing n dice simultaneously (or throwing one die n times) = 6^n

(d) Playing Cards

  • Total cards: 52 (26 red, 26 black)
  • Four suits: Hearts, Diamonds, Spades, Clubs - 13 cards each
  • Court cards: 12 (4 kings, 4 queens, 4 jacks)
  • Honour cards: 16 (4 aces, 4 kings, 4 queens, 4 jacks)

(e) Probability regarding n letters and their envelopes:
If n letters are placed at random into n addressed envelopes (one letter per envelope), then (see the brute-force check after this list):

  • Probability that all letters are in the right envelopes = \frac{1}{n!}
  • Probability that not all letters are in the right envelopes = 1 - \frac{1}{n!}
  • Probability that no letter is in the right envelope = \frac{1}{2!} - \frac{1}{3!} + \frac{1}{4!} - \ldots + (-1)^n \frac{1}{n!}
  • Probability that exactly r letters are in the right envelopes = \frac{1}{r!}\left[\frac{1}{2!} - \frac{1}{3!} + \frac{1}{4!} - \ldots + (-1)^{n-r}\frac{1}{(n-r)!}\right]
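The sketch below verifies the first and third formulas by brute force over all n! placements for an illustrative n = 5 (small enough to enumerate):

```python
from itertools import permutations
from math import factorial

# Brute-force check of the letters-and-envelopes formulas for n = 5 (illustrative)
n = 5
perms = list(permutations(range(n)))   # each permutation = one random placement

def right_count(p):
    """Number of letters that land in their own envelope."""
    return sum(1 for i, v in enumerate(p) if i == v)

p_all_right = sum(right_count(p) == n for p in perms) / len(perms)
p_none_right = sum(right_count(p) == 0 for p in perms) / len(perms)

print(p_all_right, 1 / factorial(n))                           # both 1/120
print(p_none_right,
      sum((-1) ** k / factorial(k) for k in range(2, n + 1)))  # both ≈ 0.3667
```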

Related Pages:
Permutation Formula Sheet
Relation Formula Sheet
Vector operation Calculators
Vector Formula sheet