PROBABILITY

Chapter 1: Combinatorics

1.1 Factorials:

N! (read "N factorial") is the product of the first N positive integers. It can be expressed as: \[N! = N \times (N - 1)\times (N - 2)\times (N - 3) \times ... \times 3 \times 2 \times 1\] Example: \[4! = 4 \times 3 \times 2 \times 1 = 24\]
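This is easy to check numerically. A minimal Python sketch (`math.factorial` is the standard-library version):

```python
import math

def factorial(n):
    """Product of the first n positive integers (0! is defined as 1)."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial(4))        # 24, matching the worked example
print(math.factorial(4))   # the standard-library equivalent
```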

1.2 Permutations:

A permutation is an ordering of a set of objects; here we count the number of ways a set of objects can be ordered. The following example shows the 24 permutations of the 4 objects 1, 2, 3 and 4: \[1\,2\,3\,4\;\;\;\;2\,1\,3\,4\;\;\;\;3\,1\,2\,4\;\;\;\;4\,1\,2\,3 \] \[1\,2\,4\,3\;\;\;\;2\,1\,4\,3\;\;\;\;3\,1\,4\,2\;\;\;\;4\,1\,3\,2 \] \[1\,3\,2\,4\;\;\;\;2\,3\,1\,4\;\;\;\;3\,2\,1\,4\;\;\;\;4\,2\,1\,3 \] \[1\,3\,4\,2\;\;\;\;2\,3\,4\,1\;\;\;\;3\,2\,4\,1\;\;\;\;4\,2\,3\,1 \] \[1\,4\,2\,3\;\;\;\;2\,4\,1\,3\;\;\;\;3\,4\,1\,2\;\;\;\;4\,3\,1\,2 \] \[1\,4\,3\,2\;\;\;\;2\,4\,3\,1\;\;\;\;3\,4\,2\,1\;\;\;\;4\,3\,2\,1 \] An interesting observation about these permutations: each of the 4 columns collects the permutations that start with the same number, and if that first number is removed from every entry in a column, we end up with the six permutations of the remaining 3 numbers. The columns in this example can thus be thought of as groups of permutations.

The general case of a permutation can be represented by the recurrence: \[ P_N = N \cdot P_{N - 1}\] which unrolls to: \[P_N = N!\]
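The claim that \(P_N = N!\) can be checked by brute-force enumeration (a quick Python sketch using `itertools.permutations`):

```python
from itertools import permutations

# Enumerate all orderings of 1,2,3,4 -- there should be 4! = 24 of them.
perms = list(permutations([1, 2, 3, 4]))
print(len(perms))  # 24

# The column observation above: dropping the first number from every
# permutation starting with 1 leaves the 3! = 6 permutations of {2,3,4}.
group_1 = sorted(set(p[1:] for p in perms if p[0] == 1))
print(len(group_1))  # 6
```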

1.3 Ordered Sets, Repetitions Allowed:

Imagine a box containing 5 balls A, B, C, D, E. You pull a ball out of the box, record its name and put it back in the box (repetition). You pull a second ball (it might be the same ball as the first one), record its name next to the name of the first one (order) and put it back. This case has two characteristics:

  1. The same element can appear more than once.
  2. The order in which elements appear matters, meaning that \(A\,B\) is not the same as \(B\,A\).

Rule: There are \(N^n\) different ordered sets of balls resulting from \(n\) picks (trials) from a box containing \(N\) balls.

Examples: Flipping a coin a number of times, rolling a die repeatedly.
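The \(N^n\) rule can be verified by listing every ordered outcome with `itertools.product` (a sketch for the 5-ball box with 2 picks):

```python
from itertools import product

# 5 balls, 2 picks with replacement, order matters: N**n = 5**2 = 25 outcomes.
balls = ["A", "B", "C", "D", "E"]
picks = list(product(balls, repeat=2))
print(len(picks))  # 25

# Order matters: AB and BA are counted as different outcomes.
print(("A", "B") in picks and ("B", "A") in picks)  # True
```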

1.4 Ordered Sets, Repetitions Not Allowed:

How many different ordered sets of \(n\) objects can be drawn from a set of \(N\) objects without repetition? This is partly similar to the previous case, except that elements can't repeat (you can't put a ball back in the box).

Rule: The number of ordered sets of \(n\) objects from \(N\) objects without repetition is denoted by \(_NP_n\) and, given that \(n \leq N\), follows this pattern: \[ _NP_n = N(N - 1)(N - 2) ... (N - (n-1)) \] This is equivalent to: \[ _NP_n = {N! \over (N - n)!} \]

Examples: Drawing a committee of 4 people from 13 candidates where each member assumes a specific position. Once a candidate is picked, s/he can't be re-picked.
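For the committee example, the rule gives \(13 \times 12 \times 11 \times 10\). A quick check (Python; `math.perm` computes \(N!/(N-n)!\) directly):

```python
import math
from itertools import permutations

# Ordered draws of n=4 positions from N=13 candidates, no repetition.
count = len(list(permutations(range(13), 4)))
print(count)             # 17160 = 13 * 12 * 11 * 10
print(math.perm(13, 4))  # same value via N! / (N - n)!
```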

1.5 Unordered Sets, Repetitions Not Allowed:

How many different unordered sets of \(n\) objects can be drawn from a set of \(N\) objects without repetition? This scenario builds on the previous one. Because order doesn't matter, \(ABC = ACB = BAC = CBA\) all count as one set. To eliminate the repeated sets containing the same objects in different orderings, we simply divide the result from 1.4 by \(n!\), since each unordered set appears in \(n!\) orderings: \[ {{_N P_n} \over {n!}} = {{N!}\over{n!(N-n)!}}\]

Rule: The number of unordered sets of \(n \) objects, where repetition is not allowed, from a given set of \(N \) objects is the binomial coefficient, denoted by: \[ _NC_n = {\binom N n} = {{N!}\over{n!(N -n)!}} \] \(\binom N n\) is read "\(N\) choose \(n\)", while the \( C\) in \( _NC_n\) stands for "combinations".

Interesting observation: any two binomial coefficients with the same upper number \( N\) whose lower numbers add up to \( N\) have the same value, hence \[ {\binom 6 4} = {\binom 6 2} = 15\]

Easy way to compute binomial coefficients: write as many descending factors of the upper number as the lower number, then divide by the factorial of the lower number. For example, in \[ {\binom 6 4} \] the upper number is 6 and the lower number is 4. This results in: \[ {{6 \cdot 5 \cdot 4 \cdot 3}\over{4!}} = 15\]
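Checking the shortcut numerically (Python's `math.comb` computes binomial coefficients directly):

```python
from math import comb, factorial

# The shortcut: 4 descending factors of 6, divided by 4!.
shortcut = (6 * 5 * 4 * 3) // factorial(4)
print(shortcut)                # 15

# Symmetry from the observation above: C(6,4) == C(6,2).
print(comb(6, 4), comb(6, 2))  # 15 15
```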

Examples:

1.7 Unordered Sets, Repetitions Allowed:

How many different unordered sets of \(n\) objects can be drawn from a set of \(N\) objects when repetitions are allowed? This is not as common as the previous two scenarios. It can be solved by the stars-and-bars analogy, where each star represents an object and the bars separate groups of identical objects: for example, let's draw \( n = 5\) from the set \(\{A, B, C\}\), where \( N = 3\). One of the possibilities is: \[A, A, A, B, B\] This one can be represented by: \[\star\,\,\star\,\,\star\,|\,\star\,\,\star\,|\] and: \[ B,B,B,B,C\] can be represented by: \[ |\,\star\,\,\star\,\,\star\,\,\star\,|\,\star\] Notice that there is always one bar fewer than \(N\), because obviously one bar is enough to separate two groups and 2 bars are enough to separate 3 groups. Each arrangement of \(n\) stars and \(N - 1\) bars corresponds to exactly one unordered draw, so we choose which \(N - 1\) of the \(n + (N - 1)\) positions hold bars. This stars-and-bars trick results in the following rule. Rule: The number of unordered sets of \(n \) objects with replacement from a given set of \(N \) objects is: \[ _NU_n = {\binom {n + (N - 1)} {N - 1} } \] where \(U\) stands for "unordered".
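The stars-and-bars count can be cross-checked against direct enumeration (`itertools.combinations_with_replacement` generates exactly these unordered draws):

```python
from math import comb
from itertools import combinations_with_replacement

# Unordered draws of n=5 from {A,B,C} (N=3), repetition allowed.
multisets = list(combinations_with_replacement("ABC", 5))
print(len(multisets))            # 21

# Stars-and-bars formula: C(n + N - 1, N - 1) = C(7, 2).
print(comb(5 + (3 - 1), 3 - 1))  # 21
```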

Examples:

1.8 Binomial Coefficients:

1.8.1 Coins and Pascal Triangles:

Constructing Pascal's triangle can be achieved through successive, incremental coin-flip experiments. The tip of the triangle corresponds to zero flips, the row below it to one flip, and so on. As we move away from the two sides of the triangle (it's essentially an isosceles triangle) toward the middle of a row, the number of ways of getting that count of heads increases, and vice versa. If you flip a coin 6 times, there are 20 ways that 3 heads can be scored, while there is only 1 way to get 0 heads or 6 heads.

Basically, Pascal's triangle is a triangle of binomial coefficients. The upper numbers of these binomial coefficients increase as you move downwards; each row of BCs shares the same upper number, and the lower numbers within that row increase from left to right.

There are \( \binom n k \) ways that \( k\) heads can come up in \( n\) coin flips! This also applies to the committee example, where a committee of \( k\) members can be drawn from a group of \( n\) people.
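Both claims are easy to verify by generating the triangle's rows from binomial coefficients (a Python sketch):

```python
from math import comb

# Row n of Pascal's triangle holds C(n, 0) .. C(n, n).
triangle = [[comb(n, k) for k in range(n + 1)] for n in range(7)]
for row in triangle:
    print(row)

# 6 coin flips: 20 ways to get exactly 3 heads, 1 way to get 0 (or 6) heads.
print(triangle[6][3], triangle[6][0], triangle[6][6])  # 20 1 1
```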

1.8.2 \( (a + b)^n\) and Pascal Triangles:

The sum of a row in PT (Pascal's Triangle) is as follows: \[ {\binom n 0} + {\binom n 1} + {\binom n 2} + ... + {\binom n {n - 1}} + {\binom n n} = 2^n\] This subsection is about binomial expansion, Algebra 2!! What the lambda!!?
Basically, the BC is used to count the number of ways \( k\) things can be chosen from \( n\) things, unordered and without repetition, which is exactly why it appears as the coefficient of \( a^{n-k}b^k\) in: \[ (a + b)^n = \sum_{k=0}^{n} {\binom n k}\, a^{n-k}\, b^k\]
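Both facts (a row of Pascal's triangle sums to \(2^n\), and its coefficients expand \((a+b)^n\)) can be checked numerically:

```python
from math import comb

# Sum of row n: each of the 2**n sequences of n coin flips is counted once.
n = 6
row_sum = sum(comb(n, k) for k in range(n + 1))
print(row_sum)  # 64 == 2**6

# Binomial expansion: (a + b)**n == sum of C(n, k) * a**(n-k) * b**k.
a, b = 3, 2
expansion = sum(comb(n, k) * a**(n - k) * b**k for k in range(n + 1))
print(expansion == (a + b)**n)  # True
```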


-------------------------

Chapter 2: Probability:

2.1 Preliminary:

Probability is represented by a fraction: the number of desired outcomes over the total number of possible outcomes, as in: \[ P = { {number \,of \,desired \,outcomes} \over {total\, number\, of \,possible \,outcomes}}\] A lot of rambling! The gist: as the number of trials gets larger and larger, the observed frequencies of the desired outcomes get closer and closer to their probabilities (the law of large numbers).

2.2 Rules of Probability:

The rules of probability apply to multiple events rather than a single event like rolling a die. These rules deal with questions like: what's the likelihood of both events occurring, or what's the probability of either event happening?

2.2.1 AND: The Intersection Probability or \( P(A \cap B)\):

Let A and B represent two events, something like the two tosses in tossing a coin twice. What is the probability that both events occur (the joint probability)? The two events can be either independent or dependent.

Independent events:

This includes rolling 2 dice: the two dice have no effect on each other. Another example is picking a single card that's both a king and a heart; being a king and being a heart are independent events.

Rule: If \( A\) and \( B\) are independent, then: \[ P(A \, and \, B) = P(A)\cdot P(B)\] This is reminiscent of the ordered sets with repetitions allowed from chapter 1. It follows the exact same logic.
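The product rule can be confirmed by brute force over all 36 outcomes of two dice (a sketch; `fractions.Fraction` keeps the arithmetic exact):

```python
from fractions import Fraction
from itertools import product

# A = first die shows 1, B = second die shows 3.
outcomes = list(product(range(1, 7), repeat=2))
joint = Fraction(sum(1 for d1, d2 in outcomes if d1 == 1 and d2 == 3),
                 len(outcomes))
print(joint)                            # 1/36
print(Fraction(1, 6) * Fraction(1, 6))  # P(A) * P(B), the same value
```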

Dependent events:

This includes pulling a ball out of a box and NOT returning it, then pulling a second ball. If the first ball is blue, the probability that the second ball is blue as well has changed. Event \( B\) is affected by event \( A\).

Rule: If \( A\) and \( B\) are dependent, then the probability of both of them occurring is: \[ P(A \, and \, B) = P(A)\cdot P(B\,|\,A)\] where \( P(B\,|\,A)\) stands for the probability that \( B\) occurs given that \( A\) has already occurred. This is a conditional probability because a given condition is assumed, namely \( A\). It's read as "The probability of \( B\), given \( A\)".
Example: We have a box with 2 red balls and 3 blue balls. Picking a red ball as event A and then picking a blue ball as event B can be represented as: \[ P(Red_1\, and \, Blue_2) = P(Red_1) \cdot P(Blue_2\,|\,Red_1)= {2 \over 5} \cdot {3\over 4} = {3 \over 10} \] Notice that the denominator decreased from 5 to 4 because the first ball was not returned.

This is reminiscent of the ordered sets without repetitions from chapter 1. It follows the exact same logic.
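The ball example can likewise be brute-forced over all 20 ordered draws:

```python
from fractions import Fraction
from itertools import permutations

# Box with 2 red and 3 blue balls; draw twice without replacement.
balls = ["R1", "R2", "B1", "B2", "B3"]
draws = list(permutations(balls, 2))  # 5 * 4 = 20 ordered draws
red_then_blue = sum(1 for a, b in draws
                    if a.startswith("R") and b.startswith("B"))
print(Fraction(red_then_blue, len(draws)))  # 6/20 = 3/10
print(Fraction(2, 5) * Fraction(3, 4))      # P(Red1) * P(Blue2 | Red1)
```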

2.2.2 OR: The Union Probability, \( P(A \,or\, B)\) or \( P(A \cup B)\):

What is the probability of getting a 1 or a 3 (or both) when rolling 2 dice? What is the probability of getting a 1 or a 3 on a single roll of one die, where both can't happen? There are two situations of this type: exclusive and nonexclusive events:

Exclusive Events:

Two events are exclusive if they preclude each other, meaning they can't both happen at the same time.
Rule: If two events \( A\) and \( B\) are exclusive, the probability that either of them occurs is the sum of their individual probabilities, as in: \[P(A \,or\, B) = P(A) \,+\, P(B) \]

Nonexclusive Events:

Two events are nonexclusive if they can both happen: either A occurs, or B occurs, or both. An example is rolling a die with A = getting an even number and B = getting a multiple of 3. Because the two are nonexclusive, 6 is a result of both A and B.
Rule: If two events \( A\) and \( B\) are nonexclusive, the probability of either one or both of them occurring is equal to: \[P(A \,or\, B) = P(A) \,+\, P(B)\,-\, P(A \, and \, B)\] We subtract \( P(A \, and \, B)\) because we don't want to double count it. It's enough that either \( A\) or \( B\) occurs.
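The die example from above works out like this (A = even, B = multiple of 3; 6 sits in the overlap):

```python
from fractions import Fraction

die = range(1, 7)
A = {x for x in die if x % 2 == 0}  # {2, 4, 6}
B = {x for x in die if x % 3 == 0}  # {3, 6}

union = Fraction(len(A | B), 6)
formula = Fraction(len(A), 6) + Fraction(len(B), 6) - Fraction(len(A & B), 6)
print(union, formula)  # 2/3 2/3 -- subtracting P(A and B) avoids double-counting 6
```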

(In)dependence and (Non)exclusiveness:

Independent and Exclusive: NO (if \( P(A)\) and \( P(B)\) are nonzero, exclusivity forces \( P(A \, and \, B) = 0 \neq P(A)\cdot P(B)\))
Independent and Nonexclusive: YES
Dependent and Exclusive: YES (exclusive events are maximally dependent: one occurring rules out the other)
Dependent and Nonexclusive: YES

2.3 Examples:

It's claimed that probability can only be understood through examples.

2.3.1 The Art of "Not":

It might be advisable in some situations to calculate the complement of a probability, i.e. \( P(not \, A)\) instead of \( P(A)\) to make life easier.
What is, for example, the probability of getting at least one 6 after rolling 3 dice? Instead of going through some byzantine voodoo, you can calculate the probability of getting no 6 at all and subtract it from 1. We know that: \[ P(A) = 1 \, - \, P(not \, A) \] In our example, the probability of getting no 6 in 3 rolls is \( ({5 \over 6})^3\), so: \[ 1 - ({5 \over 6})^3 = {1- {125 \over 216}} = {91 \over 216}\]
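The complement trick can be double-checked by brute force over all \(6^3 = 216\) outcomes:

```python
from fractions import Fraction
from itertools import product

# P(at least one 6 in 3 rolls) via the complement...
complement = 1 - Fraction(5, 6) ** 3
print(complement)  # 91/216

# ...and by direct enumeration of all 216 outcomes.
rolls = list(product(range(1, 7), repeat=3))
direct = Fraction(sum(1 for r in rolls if 6 in r), len(rolls))
print(direct)      # 91/216
```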

2.3.2 Picking Seats:

Example 1 (Middle in the middle): We have 3 people and 3 chairs. What is the probability that the person with the middle height is seated in the middle seat? Suggested solutions include:
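One candidate solution is brute force over all \(3! = 6\) equally likely seatings (by symmetry the answer should come out to 1/3):

```python
from fractions import Fraction
from itertools import permutations

# People labeled by height rank: 0 = shortest, 1 = middle, 2 = tallest.
seatings = list(permutations([0, 1, 2]))           # 3! = 6 seatings
favorable = sum(1 for s in seatings if s[1] == 1)  # middle height in middle seat
print(Fraction(favorable, len(seatings)))          # 2/6 = 1/3
```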

Ignoring the rest of the examples for now and jumping directly to Bayes' theorem.

Bayes' Theorem:

There are three forms of the famed Bayes' theorem:

The explicit form seems to be the more useful one, at least for me. The following example is probably a great explanation of how Bayes' theorem works.

False positive: A hospital conducts a test to see if a person has a certain disease. We know the following:

The question is: If the test is positive, what is the chance that the person has the disease?
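The notes don't list the actual numbers, so the prevalence, sensitivity, and false-positive rate below are hypothetical placeholders, chosen only to show how the explicit form of Bayes' theorem, \( P(D\,|\,+) = {P(+\,|\,D)\,P(D) \over P(+\,|\,D)\,P(D) + P(+\,|\,not\,D)\,P(not\,D)}\), would be applied:

```python
from fractions import Fraction

# Hypothetical numbers (NOT from the notes):
p_d = Fraction(1, 100)                # P(disease): prevalence
p_pos_given_d = Fraction(99, 100)     # P(positive | disease): sensitivity
p_pos_given_not_d = Fraction(5, 100)  # P(positive | no disease): false positives

# Explicit form: expand P(positive) over the disease / no-disease cases.
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(p_d_given_pos)  # 1/6 -- with these numbers, most positives are false
```

The counterintuitive punch line of the false-positive example: even with an accurate test, a rare disease means a positive result is still more likely to be a false alarm than a true detection.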