Ch3: Conditional probability and independence

Conditional probability

Conditional probability is the probability of one event occurring, given that one or more other events have occurred.

Definition: conditional probability

If $A$ and $B$ are events and $P(B) > 0$, then we define:

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}$$

as the (conditional) probability of $A$ under the condition $B$ (or: the (conditional) probability of $A$ given $B$).
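To make the definition concrete, here is a minimal Python sketch (the die example is our own illustration, not from the notes): it computes $P(A \mid B)$ by counting outcomes in a uniform sample space.

```python
from fractions import Fraction

# One roll of a fair die as a uniform sample space.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # "the outcome is even"
B = {4, 5, 6}   # "the outcome is at least 4"

def P(event):
    return Fraction(len(event), len(S))

# Definition: P(A | B) = P(A ∩ B) / P(B)
p_A_given_B = P(A & B) / P(B)
print(p_A_given_B)  # 2/3 -- of the outcomes {4, 5, 6}, two are even
```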

From this definition, the following multiplication rule follows:

Multiplication Rule

Independent events: $P(X \cap Y) = P(X)\,P(Y)$

Dependent events: $P(X \cap Y) = P(Y)\,P(X \mid Y)$
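As a sketch of the dependent-events form (the card scenario is our own assumption, not from the notes): when drawing two cards without replacement, the probability that both are aces follows from $P(X \cap Y) = P(X)\,P(Y \mid X)$, the same rule with the roles of the two events swapped.

```python
from fractions import Fraction

# X = "first card is an ace", Y = "second card is an ace",
# drawing twice without replacement from a standard 52-card deck.
p_X = Fraction(4, 52)            # 4 aces among 52 cards
p_Y_given_X = Fraction(3, 51)    # 3 aces left among 51 cards

# Multiplication rule for dependent events: P(X ∩ Y) = P(X) * P(Y | X)
p_both_aces = p_X * p_Y_given_X
print(p_both_aces)  # 1/221
```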

Law of total probability and Bayes' rule

The law of total probability states that if $\{S_i\}$ is a partition of the sample space $S$ such that $P(S_i) > 0$ for all $i$, then for each event $A$ we have:

$$P(A) = P(A \mid S_1)P(S_1) + P(A \mid S_2)P(S_2) + \cdots = \sum_i P(A \mid S_i)P(S_i)$$
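A small worked sketch of the law (the two-machine setup is a hypothetical example, not from the notes): production is partitioned over two machines, and the overall defect probability is the weighted sum of the per-machine defect rates.

```python
from fractions import Fraction

# Partition: S1 = "made by machine 1" (60%), S2 = "made by machine 2" (40%).
p_S1, p_S2 = Fraction(3, 5), Fraction(2, 5)

# A = "item is defective"; assumed defect rates conditional on the machine.
p_A_given_S1, p_A_given_S2 = Fraction(1, 100), Fraction(3, 100)

# Law of total probability: P(A) = P(A | S1) P(S1) + P(A | S2) P(S2)
p_A = p_A_given_S1 * p_S1 + p_A_given_S2 * p_S2
print(p_A)  # 9/500 = 0.018
```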

Bayes' rule

$$P(X \mid Y) = \frac{P(X \cap Y)}{P(Y)}$$

Note that this rule can be used in combination with the multiplication rule, for example with dependent events:

$$P(X \cap Y) = P(Y)\,P(X \mid Y) \;\Longrightarrow\; \frac{P(X \cap Y)}{P(Y)} = \frac{P(Y)\,P(X \mid Y)}{P(Y)} = P(X \mid Y)$$
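Continuing the hypothetical two-machine example from above, this combination is what lets us reverse the conditioning: the probability that a defective item came from machine 1 follows from the definition, the multiplication rule, and the law of total probability together.

```python
from fractions import Fraction

# Same hypothetical setup: two machines, assumed per-machine defect rates.
p_S1, p_S2 = Fraction(3, 5), Fraction(2, 5)
p_A_given_S1, p_A_given_S2 = Fraction(1, 100), Fraction(3, 100)

# Denominator via the law of total probability.
p_A = p_A_given_S1 * p_S1 + p_A_given_S2 * p_S2

# Definition + multiplication rule:
#   P(S1 | A) = P(S1 ∩ A) / P(A) = P(A | S1) P(S1) / P(A)
p_S1_given_A = p_A_given_S1 * p_S1 / p_A
print(p_S1_given_A)  # 1/3 -- a third of the defects come from machine 1
```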

Independence of events and random variables

Independence

The events $A$ and $B$ are independent when: $P(A \cap B) = P(A)\,P(B)$
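For instance (our own illustration, not from the notes): with two independent rolls of a fair die, "the first roll is a six" and "the second roll is even" satisfy the product condition, as a quick enumeration confirms.

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs of two fair die rolls (36 outcomes).
S = set(product(range(1, 7), repeat=2))
A = {s for s in S if s[0] == 6}        # "first roll is a six"
B = {s for s in S if s[1] % 2 == 0}    # "second roll is even"

def P(event):
    return Fraction(len(event), len(S))

# Independence: P(A ∩ B) = P(A) * P(B)
print(P(A & B), P(A) * P(B))    # 1/12 1/12
print(P(A & B) == P(A) * P(B))  # True
```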

Bernoulli trials

A series of experiments is called a series of Bernoulli experiments, or trials, if

  1. each experiment has two possible outcomes, often denoted 'success' and 'failure',
  2. the experiments are independent, and
  3. the probability of success is the same for each experiment.

From this, the binomial formula follows: if $X$ is the number of successes in $n$ Bernoulli experiments with success probability $p$, then $P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}$, where $k = 0, 1, 2, \ldots, n$.
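A direct transcription of the binomial formula in Python (the coin-flip numbers and the name `binomial_pmf` are just our illustration), using `math.comb` for the binomial coefficient:

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for the number of successes X in n Bernoulli trials."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Example: exactly 3 heads in 10 fair coin flips.
print(binomial_pmf(3, 10, 0.5))  # 0.1171875

# Sanity check: the probabilities over k = 0, ..., n sum to 1.
print(sum(binomial_pmf(k, 10, 0.5) for k in range(11)))  # 1.0
```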

If we conduct Bernoulli trials with success probability p until a success occurs and X is the number of required trials, then:

$$P(X = k) = (1-p)^{k-1}\,p, \quad \text{where } k = 1, 2, 3, \ldots$$
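The same translation for this geometric formula (the die example and the name `geometric_pmf` are our own): the probability that the first success occurs on exactly the $k$-th trial.

```python
def geometric_pmf(k, p):
    """P(X = k): first success on trial k, after k - 1 failures."""
    return (1 - p)**(k - 1) * p

# Example: rolling a fair die until the first six (p = 1/6).
print(geometric_pmf(1, 1/6))  # 1/6 ≈ 0.1667
print(geometric_pmf(3, 1/6))  # (5/6)^2 * (1/6) ≈ 0.1157
```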

Remember that in the binomial formula, $p^{k}(1-p)^{n-k}$ is the probability that the first $k$ trials are successes and the last $n-k$ are failures, and $\binom{n}{k}$ counts the number of possible orderings of $k$ successes and $n-k$ failures.