•The word probability is used to indicate the likelihood that some event will happen. For example, ‘there is a high probability that it will rain today’. Probabilistic reasoning also arises in predicting solar and lunar eclipses by astronomical calculations, in tossing a coin, in population growth, and in the fluctuations of the prices of a share or a stock.
If an event can happen in ‘a’ ways, and fail to happen in ‘b’ ways, then the probability of its happening ‘p’ is given by:

p = a / (a + b)

Similarly, the probability of the failure of the event to happen, denoted by ‘q’, will be:

q = b / (a + b) = 1 - p
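This odds-style definition gives p = a/(a + b) and q = b/(a + b). A minimal sketch, with illustrative counts for ‘a’ and ‘b’:

```python
from fractions import Fraction

# Illustrative counts: the event can happen in a ways and fail in b ways
a, b = 3, 1
p = Fraction(a, a + b)   # probability of the event happening
q = Fraction(b, a + b)   # probability of the event failing
print(p, q)              # 3/4 1/4
print(p + q == 1)        # True: the two probabilities always sum to one
```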
Trial: The experiments whose results are not unique, even if repeated under identical conditions, are called trials. For example, tossing a coin or throwing dice are trials.
Event : Any possible outcome of a random experiment is called an event. For example, occurrence of head or tail in tossing a coin is an event. Similarly, a new born baby being a boy or a girl is also an event. In simple terms - performing an experiment is a trial and its outcome is an event.
Exhaustive Events (Sample Space): The total number of possible events in any trial is known as the exhaustive events. For example, the exhaustive events in:
Tossing a coin = 2 (H & T),
Tossing two coins = 4 (HH, HT, TH, TT),
Tossing n coins = 2^n, and
Throwing n dice = 6^n.
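The counts 2^n and 6^n above can be verified by enumerating the sample spaces directly; a small sketch for an illustrative n:

```python
from itertools import product

n = 3  # illustrative number of coins / dice

# Exhaustive events for tossing n coins: 2^n outcomes
coin_space = list(product("HT", repeat=n))
print(len(coin_space))   # 8, i.e. 2**3

# Exhaustive events for throwing n dice: 6^n outcomes
dice_space = list(product(range(1, 7), repeat=n))
print(len(dice_space))   # 216, i.e. 6**3
```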
Favourable Events: The number of favourable events is the number of outcomes which entail the happening of the event in question. For example, the number of favourable events corresponding to the occurrence of at least one head in tossing two coins is 3 (H1H2, H1T2, T1H2).
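The two-coin example above can be checked by listing the exhaustive events and counting those favourable to "at least one head":

```python
from itertools import product

# Exhaustive events for tossing two coins
outcomes = list(product("HT", repeat=2))
print(len(outcomes))     # 4: HH, HT, TH, TT

# Favourable events for "at least one head"
favourable = [o for o in outcomes if "H" in o]
print(len(favourable))   # 3: HH, HT, TH
```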
Equally Likely Events: Two or more events are said to be equally likely if one cannot be expected to occur in preference to the other. For example, in tossing a balanced coin, the events H and T are equally likely, but for a coin waxed on one side, the events of occurrence of H and T are not equally likely.
Simple and Compound Events: The occurrence of a single event is known as simple event, but when two or more simple events occur in connection with each other, their joint occurrence is called the compound event.
For example, if from a pack of cards a king is drawn and then a queen is drawn, then each of these events (drawing of a king or a queen) is a simple event, and the event of their joint occurrence, i.e., drawing of both a king and a queen, is a compound event.
Mutually Exclusive (Disjoint) Events: Events are said to be mutually exclusive when they cannot occur simultaneously in the same experiment. In other words, the occurrence of one event excludes the possibility of the occurrence of the others. If any one outcome happens, all the others fail to happen. All simple events are mutually exclusive.
For example, if a coin is tossed, either H or T can occur but not both. Similarly, in throwing a die, all the six faces are mutually exclusive.
Independent Events: A set of events is said to be independent if the occurrence of any event does not affect the chance of the occurrence of any other event of the set.
For example, when two coins are tossed together or separately, the occurrence of H or T in the first coin does not affect the probability of the occurrence of H or T in the second coin.
Dependent Events: Two or more events are said to be dependent if the occurrence of one affects the occurrence of the others.
For example, if we draw a card from a pack and then draw another card without replacing the first, the result of the second draw depends on the first. However, drawing the second card after replacing the first makes the second draw independent of the first.
Mathematical probability is based on the assumption that the sample space (exhaustive events) is finite and the elementary events are equi-probable (equally likely). It is obtained on the principle of indifference or insufficient reason.
In the cases of tossing a coin, throwing a die or drawing a card from the pack of cards, the probabilities are calculated on deductive reasoning even before any trial or experiment is conducted.
Since we have prior knowledge of the probabilities in such cases, it is also known as a priori probability. Here we obtain the probabilities of the events under the assumption that the outcomes of these are finite and events are equally likely.
We can define classical probability as:
If an experiment results in ‘n’ exhaustive, equally likely and mutually exclusive events and ‘m’ of them are favourable to an event (E), then the probability (p) of happening of the event (E) is the ratio of the number of favourable events to the exhaustive number of events.
Mathematically,

p = p(E) = m / n

Further, since the number of events favourable to the non-happening of the event (E) is n - m, the probability (q) of non-happening of the event is given by

q = (n - m) / n = 1 - m/n = 1 - p
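With m favourable out of n exhaustive, equally likely events, p = m/n and q = (n - m)/n = 1 - p. A sketch using the card-drawing setting mentioned earlier (drawing a king is the illustrative event):

```python
from fractions import Fraction

n = 52  # exhaustive events: cards in a standard pack
m = 4   # favourable events: the four kings
p = Fraction(m, n)        # probability of drawing a king
q = Fraction(n - m, n)    # probability of not drawing a king
print(p)           # 1/13
print(q)           # 12/13
print(p + q == 1)  # True
```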
Statistical (Empirical) Probability: It is applied in most research fields. In many biological problems it is not possible to determine all the equally likely cases before actual trials are made.
Here the probability is determined from a set of observations, e.g., in a random sample of 100 individuals, 5 individuals are found to be of AB blood group, giving us p(AB group) = 5/100 = 0.05. We cannot get the probability of AB blood group individuals unless a random sample is studied. This estimate of probability is known as statistical or empirical probability.
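The idea that empirical probability is estimated from repeated trials can be sketched with a simple simulation of coin tosses (the trial count and seed are illustrative):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Estimate p(H) empirically from repeated simulated tosses
trials = 10_000
heads = sum(random.random() < 0.5 for _ in range(trials))
p_empirical = heads / trials
print(p_empirical)  # close to the a priori value 0.5
```

With more trials the empirical estimate settles ever closer to the a priori probability, which is the sense in which the two notions of probability agree.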
Addition Law of Probability: sometimes also referred to as the "OR" law.
(a) When events are mutually exclusive
The probability of occurrence of one or the other event of a set of equally likely and mutually exclusive events is the sum of the separate probabilities of the occurrence of the separate events of the set. Thus,

p(E1 or E2) = p(E1) + p(E2)
For example, in tossing a coin the events H and T are equally likely and mutually exclusive. The probability of getting a head on tossing a coin is given by:

p(H) = ½

Similarly, the probability of getting a tail is given as:

p(T) = ½
Then the probability of getting H or T on tossing a coin is given by,
p(H or T) = p(H) + p(T) = ½ + ½ = 1
(b) When events are not mutually exclusive
If E1 and E2 are not mutually exclusive events, then the probability of occurrence of either of them is obtained as the sum of their individual probabilities minus the probability of their simultaneous occurrence. Therefore:
p(E1 or E2) = p(E1) + p(E2) - p(E1E2)
Similarly for three events, E1, E2 and E3, we have:
p(E1 or E2 or E3) = p(E1) + p(E2) + p(E3) - p(E1E2) - p(E1E3) - p(E2E3) + p(E1E2E3)
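The addition law for non-mutually-exclusive events can be verified by direct enumeration; a sketch with two illustrative events on a single die throw (even face, and face greater than 3), which extends to three events in the same way:

```python
from fractions import Fraction

die = set(range(1, 7))
E1 = {x for x in die if x % 2 == 0}  # even face: {2, 4, 6}
E2 = {x for x in die if x > 3}       # face above 3: {4, 5, 6}

def p(event):
    """Classical probability: favourable over exhaustive outcomes."""
    return Fraction(len(event), len(die))

lhs = p(E1 | E2)                  # p(E1 or E2), by direct enumeration
rhs = p(E1) + p(E2) - p(E1 & E2)  # addition law for overlapping events
print(lhs == rhs, lhs)            # True 2/3
```

Note that E1 and E2 overlap on {4, 6}, so simply adding p(E1) + p(E2) would double-count those outcomes; subtracting p(E1E2) corrects for this.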
Multiplication Law of Probability: sometimes also known as the "AND" law.
(a) When events are independent
The probability of simultaneous (or joint) occurrence of ‘n’ independent events is given by the product of their separate probabilities. Symbolically:
p(E1.E2….En) = p(E1) X p(E2) X….X p(En)
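As a sketch of the independent case, the probability that two dice both show a six is (1/6) × (1/6), which can be cross-checked against the 36 equally likely outcomes:

```python
from fractions import Fraction
from itertools import product

# Independent events: each die shows a six
p_six = Fraction(1, 6)
print(p_six * p_six)  # 1/36

# Cross-check by enumerating all 36 equally likely outcomes
outcomes = list(product(range(1, 7), repeat=2))
both_six = sum(1 for o in outcomes if o == (6, 6))
print(Fraction(both_six, len(outcomes)))  # 1/36
```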
(b) When events are dependent
The probability of simultaneous occurrence of two dependent events is given by the product of the probability of occurrence of the first event and the conditional probability of occurrence of the second event, given that the first event has already occurred.
Or, in other words,
The probability of simultaneous (or joint) occurrence of two dependent events E1 and E2 is given by the product of unconditional probability of event E1 and conditional probability of event E2 supposing that E1 has already occurred.
p(E1E2) = p(E1) X p(E2/E1)
p(E1E2…En) = p(E1) X p(E2/E1) X p(E3/E1E2) X ….. X p(En/E1E2….En-1)
Here p(E2/E1) is called the conditional probability of E2 given E1 has already occurred.
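The card-drawing example from the section on dependent events can be sketched with the multiplication law: drawing two kings without replacement uses the conditional probability p(E2/E1) = 3/51, whereas drawing with replacement makes the two draws independent.

```python
from fractions import Fraction

# Two cards drawn without replacement: p(both are kings)
p_E1 = Fraction(4, 52)           # first card is a king
p_E2_given_E1 = Fraction(3, 51)  # second is a king, given the first was
print(p_E1 * p_E2_given_E1)      # 1/221

# With replacement the draws become independent instead
print(Fraction(4, 52) * Fraction(4, 52))  # 1/169
```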