Probability Formulas

Want to solve probability problems at a faster pace? The smartest way is to take help from the Probability Formulas. If you don’t remember the basic Probability Formulas, check out the cheat sheet and tables provided here. You can also refer to the Probability Formulas list furnished below, with examples, for an easy understanding of the concept.

List of Basic Probability Formulas with Examples

This is the section where you can find the Probability Formulas list. Have a glance at the list before you start your probability calculations. If you revise these Probability Formulas regularly, you can solve all probability problems easily and effortlessly.

1. Definitions

(i) Trial and Event:
An experiment is called a trial if it results in any one of the possible outcomes, and the possible outcomes are called events.
e.g. Tossing a fair coin is a trial, and turning up a head or a tail are events.

(ii) Exhaustive Events:
Total possible outcomes of an experiment are called its exhaustive events.
e.g. The throw of a die has 6 exhaustive cases, because any one of the six numbers 1, 2, 3, 4, 5, 6 may turn up.

(iii) Favourable Events:
Those outcomes of a trial in which a given event may happen are called favourable cases for that event. e.g. If a die is thrown, the number of cases favourable to getting a particular number, say 4, is 1, while the cases favourable to getting an even number are 2, 4 and 6, i.e. 3 cases.

(iv) Equally likely events:
Two or more events are said to be equally likely if they have the same number of favourable cases.
e.g. In the throw of a die, getting 1 or 2 or 3 or 4 or 5 or 6 are six equally likely events.

(v) Mutually exclusive or disjoint events:
Two or more events are said to be mutually exclusive if the occurrence of one prevents or precludes the occurrence of the others. In other words, they cannot occur together.
e.g. In the throw of a die, getting 1 or 2 or 3 or 4 or 5 or 6 are six mutually exclusive events.

(vi) Simple and compound events:
If in an experiment only one event can happen at a time, it is called a simple event. If two or more events happen together, they constitute a compound event.

e.g. If we draw a card from a well-shuffled pack, getting the queen of spades is a simple event, and if two coins A and B are tossed together, getting ‘H’ from A and ‘T’ from B is a compound event.

(vii) Independent and Dependent Events:
Two or more events are said to be independent if the happening of one does not affect the others. On the other hand, if the happening of one event affects (partially or totally) another event, they are said to be dependent events.
Note:
Students often find it difficult to distinguish between independent and mutually exclusive events and get confused. These events differ as follows:

  • Independent events are always taken from different experiments, while mutually exclusive events are taken from only one experiment.
  • Independent events can happen together, but mutually exclusive events cannot; only one of them can happen at a time.
  • Independent events are connected by the word “and”, while mutually exclusive events are connected by the word “or”.

(viii) Sample Space:
The set of all possible outcomes of a trial is called its sample space. It is generally denoted by S, and each outcome of the trial is said to be a sample point of S. e.g.

  • If a die is thrown once, then its sample space is S = {1, 2, 3, 4, 5, 6}
  • If two coins are tossed together, then the sample space is S = {HH, HT, TH, TT}

2. Mathematical definition of Probability

Let there be n exhaustive, mutually exclusive and equally likely cases for an experiment, of which m are favourable to an event A. Then the probability of happening of the event A is defined as the ratio m/n, which is denoted by P(A). Thus
P(A) = \(\frac{m}{n}\) = \(\frac{\text { No. of cases favourable to } \mathrm{A}}{\text { Total no. of exhaustive cases }}\)
Further, if \(\overline { A } \) denotes the negative of A, i.e. the event that A does not happen,
then for the above m and n we have
\(P(\bar{A})=\frac{n-m}{n}=1-\frac{m}{n}\) = 1 – P(A)
∴ P(A) + P(\(\overline { A } \)) = 1 & always 0 ≤ P(A) ≤ 1
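
To see this definition in action, here is a minimal Python sketch (the event “an even number turns up on a die” is just an illustrative choice) that counts favourable and exhaustive cases:

    # Classical probability: P(A) = favourable cases / exhaustive cases
    sample_space = [1, 2, 3, 4, 5, 6]                  # exhaustive cases for one throw of a die
    event_A = [x for x in sample_space if x % 2 == 0]  # favourable cases: an even number turns up

    m = len(event_A)        # no. of favourable cases
    n = len(sample_space)   # no. of exhaustive cases
    P_A = m / n
    P_not_A = 1 - P_A
    print(P_A, P_not_A, P_A + P_not_A)   # 0.5 0.5 1.0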

3. Odds for an event

If an event A happens in m out of a total of n exhaustive cases, then we can say that
The probability of event A, P(A) = \(\frac{m}{n}\) and P(\(\overline { A } \)) = 1 – \(\frac{m}{n}=\frac{n-m}{n}\)
∴ odds in favour of A = \(\frac{P(A)}{P(\bar{A})}=\frac{m / n}{(n-m) / n}=\frac{m}{n-m}\)
and odds against A = \(\frac{P(\bar{A})}{P(A)}=\frac{(n-m) / n}{m / n}=\frac{n-m}{m}\)
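
The same counts give the odds directly. A small sketch (the event “a number greater than 4 turns up”, with m = 2 favourable cases out of n = 6, is only an example):

    from fractions import Fraction

    n = 6   # exhaustive cases for one throw of a die
    m = 2   # favourable cases for "a number greater than 4" (5 or 6)

    odds_in_favour = Fraction(m, n - m)   # m : (n - m)
    odds_against = Fraction(n - m, m)     # (n - m) : m
    print(odds_in_favour, odds_against)   # 1/2 and 2, i.e. odds 1 : 2 in favour and 2 : 1 against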

4. Addition theorem of Probability

(i) When events are mutually exclusive:
If A and B are mutually exclusive events then
n (A ∩ B) = 0 ⇒ P (A ∩ B) = 0
∴ P (A ∪ B) = P (A) + P (B)

(ii) When events are not mutually exclusive:
If A & B are two events which are not mutually exclusive then
P(A ∪ B) = P (A) + P (B) – P (A ∩ B)
or
P(A + B) = P (A) + P (B) – P (AB)
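
The second case can be checked by brute force on a single die; in this sketch the events A (“an even number”) and B (“a number greater than 3”) are arbitrary examples and are not mutually exclusive:

    # Addition theorem: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
    S = set(range(1, 7))               # sample space of one die
    A = {x for x in S if x % 2 == 0}   # {2, 4, 6}
    B = {x for x in S if x > 3}        # {4, 5, 6}

    def P(E):
        return len(E) / len(S)

    print(P(A | B), P(A) + P(B) - P(A & B))   # both ≈ 0.667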

5. Multiplication theorem of Probability

(i) When events are independent:
If P(A/B) = P(A) and P(B/A) = P(B), then
P(A ∩ B) =P(A).P(B) or P(AB) = P(A). P(B)

(ii) When events are not independent:
P(A ∩ B)= P(A) . P(B/A) or P(B) . P(A/B)
or
P(AB) = P(A) . P(B/A) or P(B). P(A/B)
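
As a quick check of case (i), here is a sketch with two fair coins (the particular events chosen are assumptions for illustration):

    from itertools import product

    S = list(product("HT", repeat=2))   # sample space of two coins: HH, HT, TH, TT
    A = [s for s in S if s[0] == "H"]   # head on the first coin
    B = [s for s in S if s[1] == "T"]   # tail on the second coin
    AB = [s for s in S if s in A and s in B]

    P = lambda E: len(E) / len(S)
    print(P(AB), P(A) * P(B))           # 0.25 0.25, so P(AB) = P(A).P(B)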

6. Probability of at least one of the n Independent Events

If p1, p2, p3, …, pn are the probabilities of n independent events A1, A2, A3, …, An, then the probability that at least one of these events happens is
1 – [(1 – p1) (1 – p2) ……. (1 – pn)], i.e.
P(A1 + A2 + A3 + ….+ An) = 1 \(-P\left(\bar{A}_{1}\right) P\left(\bar{A}_{2}\right) P\left(\bar{A}_{3}\right) \ldots P\left(\bar{A}_{n}\right)\)
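
A direct computation of this formula (the three probabilities below are made-up example values):

    from math import prod

    p = [0.2, 0.5, 0.1]   # probabilities of n independent events (illustrative values)

    # P(at least one happens) = 1 - (1 - p1)(1 - p2)...(1 - pn)
    print(1 - prod(1 - pi for pi in p))   # 1 - 0.8 * 0.5 * 0.9 = 0.64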

7. Conditional Probability

If A and B are dependent events, then the probability of B when A has happened is called the conditional probability of B with respect to A, and it is denoted by P(B/A). It may be seen that \(P\left(\frac{B}{A}\right)=\frac{P(A B)}{P(A)}\)
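
The ratio P(AB)/P(A) can be verified by counting outcomes; a sketch with two dice (the events used here, “first die shows 6” and “total exceeds 9”, are only examples):

    from itertools import product

    S = list(product(range(1, 7), repeat=2))          # all 36 outcomes of two dice
    A = [s for s in S if s[0] == 6]                   # first die shows 6
    AB = [s for s in S if s[0] == 6 and sum(s) > 9]   # first die 6 and total > 9

    P = lambda E: len(E) / len(S)
    print(P(AB) / P(A))    # P(B/A) = 3/6 = 0.5 (totals 10, 11, 12)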

8. Binomial distribution for repeated trials

Let an experiment be repeated n times, and let the probability of happening of an event (called success) in one trial be p and of its not happening (called failure) be q = 1 – p. Then, by the binomial theorem,
\((q+p)^{n}=q^{n}+{ }^{n} C_{1} q^{n-1} p+\ldots+{ }^{n} C_{r} q^{n-r} p^{r}+\ldots+p^{n}\)
Now probability of
(i) Occurrence of the event exactly r times = \({ }^{n} C_{r} q^{n-r} p^{r}\)
(ii) Occurrence of the event at least r times = \({ }^{n} C_{r} q^{n-r} p^{r}+\ldots+p^{n}\)
(iii) Occurrence of the event at the most r times = \(q^{n}+{ }^{n} C_{1} q^{n-1} p+\ldots+{ }^{n} C_{r} q^{n-r} p^{r}\)
(iv) Mean of the Binomial distribution = np
(v) Variance of the binomial distribution = npq and standard deviation = \(\sqrt{\mathrm{npq}}\)
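
A short sketch of the binomial formulas above (the values of n, p and r are arbitrary examples):

    from math import comb, sqrt

    n, p, r = 10, 0.3, 4
    q = 1 - p
    pmf = lambda k: comb(n, k) * q ** (n - k) * p ** k   # exactly k successes

    print(pmf(r))                                 # exactly r times
    print(sum(pmf(k) for k in range(r, n + 1)))   # at least r times
    print(sum(pmf(k) for k in range(0, r + 1)))   # at most r times
    print(n * p, n * p * q, sqrt(n * p * q))      # mean, variance, standard deviation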

9. Some Important results

(a) Let A and B be two events, then
(i) P (A) + P (\(\overline { A } \)) = 1

(ii) P (A + B) = 1 – P(\(\overline { A } \)\(\overline { B } \))

(iii) P (A/B) = \(\frac{P(A B)}{P(B)}\)

(iv) P(A + B) = P(AB) + P(\(\overline { A } \) B) + P(A \(\overline { B } \))

(v) A ⊂ B ⇒ P (A) ≤ P (B)

(vi) P (\(\overline { A } \) B) = P (B) – P (AB)

(vii) P (AB) ≤ P (A) ≤ P (A + B) ≤ P (A) + P (B), and similarly P (AB) ≤ P (B) ≤ P (A + B)

(viii) P (AB) = P (A) + P (B) – P (A + B)

(ix) P (Exactly one of A, B occurs) = P(A \(\overline { B } \)) + P(\(\overline { A } \) B) = P (A) + P (B) – 2P (AB) = P (A + B) – P (AB)

(x) P (\(\overline { A } \) + \(\overline { B } \)) = 1 – P (AB)

(xi) P (neither A nor B) = P (\(\overline { A } \) \(\overline { B } \)) = 1 – P (A + B)

(xii) If E1, E2, E3 are three events then
P(E1 ∪ E2 ∪ E3) = P(E1) + P(E2) + P(E3) – P(E1 ∩ E2) – P(E2 ∩ E3) – P(E3 ∩ E1) + P(E1 ∩ E2 ∩ E3)

(xiii) P(At least two of E1, E2, E3 occur)
= P (E1 ∩ E2) + P(E2 ∩ E3) + P(E3 ∩ E1) – 2P(E1 ∩ E2 ∩ E3)

(xiv) P(Exactly two of E1, E2, E3 occur)
= P(E1 ∩ E2) + P(E2 ∩ E3) + P(E3 ∩ E1) – 3P(E1 ∩ E2 ∩ E3)

(xv) P(Exactly one of E1, E2, E3 occurs)
= P(E1) + P(E2) + P(E3) – 2P(E1 ∩ E2) – 2P(E2 ∩ E3) – 2P(E3 ∩ E1) + 3P (E1 ∩ E2 ∩ E3)
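
Identities (xii)–(xv) are easy to verify by enumeration. A minimal sketch on one throw of a die, with three arbitrarily chosen example events:

    # Brute-force check of (xii)-(xiv) on a single die
    S = set(range(1, 7))
    E1, E2, E3 = {1, 2, 3}, {2, 3, 4}, {3, 4, 5, 6}   # example events

    P = lambda E: len(E) / len(S)
    pair = P(E1 & E2) + P(E2 & E3) + P(E3 & E1)
    triple = P(E1 & E2 & E3)
    count = lambda x: sum(x in E for E in (E1, E2, E3))   # in how many events x lies

    print(P(E1 | E2 | E3), P(E1) + P(E2) + P(E3) - pair + triple)   # (xii): both 1.0
    print(P({x for x in S if count(x) >= 2}), pair - 2 * triple)    # (xiii): both 0.5
    print(P({x for x in S if count(x) == 2}), pair - 3 * triple)    # (xiv): both ≈ 0.333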

(xvi) Multinomial theorem:
If a die has m faces marked 1, 2, 3, …, m and n such dice are thrown, then the probability that the sum of the numbers on the upper faces equals r is given by the coefficient of \(x^{r}\) in \(\left(\frac{x+x^{2}+\ldots+x^{m}}{m}\right)^{n}\)
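
The required coefficient can be extracted numerically; the sketch below simply counts the favourable outcomes, which (divided by \(m^{n}\)) equals the stated coefficient. The values m = 6, n = 3, r = 10 are example choices:

    from itertools import product
    from fractions import Fraction

    m, n, r = 6, 3, 10    # m-faced dice, n dice, target sum r (example values)

    favourable = sum(1 for faces in product(range(1, m + 1), repeat=n) if sum(faces) == r)
    print(Fraction(favourable, m ** n))   # 27/216 = 1/8 for three ordinary dice totalling 10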

(xvii) Total probability theorem:
The probability that one of several mutually exclusive events A1, A2, A3, …, An will happen is the sum of the probabilities of the separate events; in symbols,
P(A1 + A2 + A3 + ……… + An) = P(A1) + P(A2) + ……….. + P(An)

(xviii) Let A1, A2……….An, An+1 be (n + 1) events such that
P(A1 ∩ A2 ∩……..∩ An) > 0 then
\(P\left(\bigcap_{j=1}^{n+1} A_{j}\right)=P\left(A_{1}\right) P\left(\frac{A_{2}}{A_{1}}\right) P\left(\frac{A_{3}}{A_{1} \cap A_{2}}\right) \ldots P\left(\frac{A_{n+1}}{A_{1} \cap A_{2} \ldots \cap A_{n}}\right)\)

(b) Number of exhaustive cases of tossing n coins simultaneously (or of tossing a coin n times) = \(2^{n}\)

(c) Number of exhaustive cases of throwing n dice simultaneously (or of throwing one die n times) = \(6^{n}\)

(d) Playing Cards

  • Total: 52 (26 red, 26 black)
  • Four suits: Heart, Diamond, Spade, Club – 13 cards each
  • Court Cards: 12 (4 kings, 4 queens, 4 jacks)
  • Honour Cards: 16 (4 aces, 4 kings, 4 queens, 4 jacks)

(e) Probability regarding n letters and their envelopes:
If n letters corresponding to n envelopes are placed in the envelopes at random, then (a quick brute-force check follows this list):

  • Probability that all letters are in right envelopes = \(\frac{1}{n !}\)
  • Probability that all letters are not in right envelopes = 1 – \(\frac{1}{n !}\)
  • Probability that no letter is in the right envelope
    = \(\frac{1}{2 !}-\frac{1}{3 !}+\frac{1}{4 !}-\ldots \ldots+(-1)^{n} \frac{1}{n !}\)
  • Probability that exactly r letters are in the right envelopes
    = \(\frac{1}{r !}\left[\frac{1}{2 !}-\frac{1}{3 !}+\frac{1}{4 !}-\ldots \ldots+(-1)^{n-r} \frac{1}{(n-r) !}\right]\)
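
As mentioned above, these formulas can be checked by brute force for a small n (n = 4 below is an example value):

    from itertools import permutations
    from math import factorial

    n = 4                                  # number of letters/envelopes (example value)
    perms = list(permutations(range(n)))   # every way of placing the letters

    all_right = sum(all(p[i] == i for i in range(n)) for p in perms) / factorial(n)
    none_right = sum(all(p[i] != i for i in range(n)) for p in perms) / factorial(n)
    print(all_right)    # 1/4! ≈ 0.0417
    print(none_right)   # 1/2! - 1/3! + 1/4! = 0.375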

* Rule of elimination (total probability):
If B1, B2, …, Bn are mutually exclusive and exhaustive events, then for any event A
P(A) = P(B1) P(A/B1) + P(B2) P(A/B2) + …. + P(Bn) P(A/Bn)
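
A numeric sketch of this rule; the two-box setup and all numbers below are assumptions used purely for illustration:

    from fractions import Fraction

    # Two boxes chosen at random; A = "a white ball is drawn"
    P_B = [Fraction(1, 2), Fraction(1, 2)]            # P(B1), P(B2)
    P_A_given_B = [Fraction(3, 5), Fraction(1, 4)]    # 3 white of 5 in box 1, 1 white of 4 in box 2

    P_A = sum(pb * pa for pb, pa in zip(P_B, P_A_given_B))
    print(P_A)    # 1/2 * 3/5 + 1/2 * 1/4 = 17/40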

* Mutually independent events:
Three events A, B and C are said to be mutually independent if they are pairwise independent, i.e. P(A ∩ B) = P(A) P(B), P(B ∩ C) = P(B) P(C) and P(A ∩ C) = P(A) P(C), and if in addition P(A ∩ B ∩ C) = P(A) P(B) P(C)

* Bayes’ theorem:
If E1, E2, ……, En are n mutually exclusive and exhaustive events such that P(Ei) > 0 (1 ≤ i ≤ n) and E is any event, then for 1 ≤ k ≤ n
\(P\left(\frac{E_{k}}{E}\right)=\frac{P\left(E_{k}\right) P\left(E / E_{k}\right)}{\sum_{i=1}^{n} P\left(E_{i}\right) P\left(E / E_{i}\right)}\)
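
Continuing the same two-box example used for the rule of elimination (all numbers are again made up), Bayes’ theorem tells us how likely each box is once a white ball has been drawn:

    from fractions import Fraction

    P_E = [Fraction(1, 2), Fraction(1, 2)]          # P(E1), P(E2): which box was chosen
    P_W_given_E = [Fraction(3, 5), Fraction(1, 4)]  # P(E/Ek) with E = "a white ball is drawn"

    denominator = sum(pe * pw for pe, pw in zip(P_E, P_W_given_E))
    print(P_E[0] * P_W_given_E[0] / denominator)    # P(E1/E) = (3/10) / (17/40) = 12/17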

Seek help from Onlinecalculator.guru to get a good grip on the formulas for all math concepts, including these Probability Formulas.