Renormalize files
@ -1,66 +1,66 @@
Lecture Topic: Probability

Experiment: An act where the outcome is subject to uncertainty

Sample space: The set of all possible outcomes of an experiment

Example:
- Flip a coin: $S = \{H, T\}$
- Throw a die: $S = \{1, 2, 3, 4, 5, 6\}$

Events: An event is a collection (subset) of outcomes contained in the sample space S.

e.g. the sample space for throwing two dice:

$$
S =
\begin{Bmatrix}
1,1 & 1,2 & 1,3 & 1,4 & 1,5 & 1,6 \\
2,1 & 2,2 & 2,3 & 2,4 & 2,5 & 2,6 \\
3,1 & 3,2 & 3,3 & 3,4 & 3,5 & 3,6 \\
4,1 & 4,2 & 4,3 & 4,4 & 4,5 & 4,6 \\
5,1 & 5,2 & 5,3 & 5,4 & 5,5 & 5,6 \\
6,1 & 6,2 & 6,3 & 6,4 & 6,5 & 6,6
\end{Bmatrix}
$$

A = the event that the first and second elements are the same:

$A = \{(1,1), (2,2), (3,3), (4,4), (5,5), (6,6)\}$

Union: The union of two events A and B ($A \cup B$) is the set of all outcomes that are in A, in B, or in both.

Complement: The complement of an event A, denoted A', is the set of all outcomes in S that are not in A.

Mutually Exclusive Events: When A and B have no common outcomes, they are said to be mutually exclusive or disjoint events, i.e. $P(A \cap B) = 0$.

Axioms:

1. For any event A, $P(A) \geq 0$
2. $P(S) = 1$
3. (a) If $A_1, A_2, ..., A_k$ is a finite collection of mutually exclusive events, then $$P(A_1 \cup A_2 \cup ... \cup A_k) = \sum^{k}_{i=1} P(A_i)$$
   (b) If $A_1, A_2, A_3, ...$ is an infinite collection of mutually exclusive events, then $$P(A_1 \cup A_2 \cup A_3 \cup ...) = \sum^{\infty}_{i=1} P(A_i)$$

Propositions:

1. For any event A, $P(A) = 1 - P(A')$
2. If A and B are mutually exclusive, then $P(A \cap B) = 0$
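
As a quick worked check of proposition 1 (assuming, purely for illustration, that all 36 outcomes in the two-dice sample space above are equally likely):

$$
P(A) = \frac{6}{36} = \frac{1}{6}, \qquad P(A') = 1 - P(A) = \frac{5}{6}
$$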

Permutation: Any ordered sequence of k objects taken from a set of n distinct objects is called a permutation of size k of the objects. The number of permutations of size k that can be constructed from n objects is denoted by $P_{k,n}$ and calculated by:

$$
P_{k,n} = \frac{n!}{(n-k)!}
$$

Combination: Given a set of n distinct objects, any unordered subset of k of the objects is called a combination. The number of combinations of size k that can be formed from n distinct objects is denoted by $C_{k,n}$ and calculated by:

$$
C_{k,n} = \frac{n!}{k!(n-k)!} = \begin{pmatrix}n \\ k \end{pmatrix}
$$

(NEED TO REVIEW THIS)

Example: S = {1, 2, 3, 4, 5}

$A_1$ = {1,2,3}, $A_2$ = {2,3,5}, $A_3$ = {2,3,1}

($A_1$ and $A_3$ contain the same elements, so they are the same combination but different orderings.)

Permutations of size 3 from 5 objects:

$P_{3,5} = \frac{5!}{(5-3)!} = \frac{120}{2} = 60$

Combinations of size 3 from 5 objects:

$C_{3,5} = \frac{5!}{3!(5-3)!} = \frac{P_{3,5}}{3!} = \frac{60}{6} = 10$

Since $0! = 1$, taking all 5 out of 5 objects gives $P_{5,5} = \frac{5!}{(5-5)!} = \frac{120}{1} = 120$ ordered arrangements.
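
These counts can be double-checked with Python's standard library (a minimal sketch; n = 5 and k = 3 are just the values from this example):

```python
import math

n, k = 5, 3

# Permutations of size k from n distinct objects: n! / (n - k)!
print(math.perm(n, k))   # 60

# Combinations of size k from n distinct objects: n! / (k! (n - k)!)
print(math.comb(n, k))   # 10

# 0! = 1, so ordering all n objects gives n! arrangements
print(math.perm(n, n))   # 120
```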
@ -1,2 +1,2 @@
Lecture Topic:
@ -1,5 +1,5 @@
Lecture Topic: Conditional Probability

\| (vertical bar) = "given that"

e.g. P(A|B) = the probability of A given that B has occurred
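
For reference, the defining formula (standard, though it is not written out in these notes):

$$
P(A|B) = \frac{P(A \cap B)}{P(B)}, \qquad P(B) > 0
$$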

# Question 3
@ -1,17 +1,17 @@
Lecture Topic: Random Variable

Random Variable: A variable whose value depends on the outcome of an experiment, so a probability is attributed to each value
Regular Variable: A variable that is more intrinsic, like height
Discrete Variable: A random variable whose set of possible values is finite, or countably infinite (each value is its own distinct element, e.g. a first value, a second value, etc.)

Probability Mass Function (pmf): The pmf of a discrete rv is defined for every number x by $p(x) = P(X = x)$

Mass function criteria:
- All probabilities have to be between 0 and 1
- The total of the probabilities has to equal 1

Expected Value: $E(X) = \sum x \cdot p(x)$, the sum over the entire set of values of each value times its probability
Variance: $V(X) = E(X^2) - [E(X)]^2$, i.e. calculate the expected value of the squared variable, then subtract the squared expected value
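
A minimal sketch of these definitions in Python, using a fair six-sided die as the pmf (the die example is an illustration, not from the lecture):

```python
# pmf of a fair six-sided die: p(x) = 1/6 for x = 1..6
pmf = {x: 1 / 6 for x in range(1, 7)}

# Mass function criteria: each p(x) is in [0, 1] and the total is 1
assert all(0 <= p <= 1 for p in pmf.values())
assert abs(sum(pmf.values()) - 1) < 1e-9

# Expected value: E(X) = sum of x * p(x)
ex = sum(x * p for x, p in pmf.items())

# Variance: V(X) = E(X^2) - [E(X)]^2
ex2 = sum(x ** 2 * p for x, p in pmf.items())
var = ex2 - ex ** 2

print(ex, var)   # 3.5 and roughly 2.9167
```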
@ -1,24 +1,24 @@
Lecture Topic: Binomial Distribution

# Requirements of Binomial Experiments

- A fixed number (n) of independent trials
- Each trial has only two possible outcomes: success (S) and failure (F)
- The success probability (p) is the same for every trial

## Formula

The pmf of a binomial rv $X$ depends on two parameters $n$ and $p$. We denote the pmf by $b(x; n,p)$:

$$b(x;n,p) = \begin{pmatrix} n \\ x \end{pmatrix} p^x(1-p)^{n-x}$$

$x = 0, 1, 2, ..., n$

If $X \sim b(x; n,p)$, then

1. $E(X) = np$
2. $V(X) = np(1-p)$
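
A small sketch that evaluates this pmf and checks the mean and variance numerically (n = 10 and p = 0.3 are arbitrary illustration values, not from the lecture):

```python
import math

n, p = 10, 0.3  # arbitrary illustration values

# b(x; n, p) = C(n, x) * p^x * (1 - p)^(n - x)
def b(x, n, p):
    return math.comb(n, x) * p ** x * (1 - p) ** (n - x)

xs = range(n + 1)

print(sum(b(x, n, p) for x in xs))            # ~1.0, total probability
print(sum(x * b(x, n, p) for x in xs))        # ~3.0 = n * p
print(sum(x ** 2 * b(x, n, p) for x in xs)
      - (n * p) ** 2)                         # ~2.1 = n * p * (1 - p)
```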

# Examples

Examples in posted pdf
@ -1,5 +1,5 @@
Lecture Topic: Poisson

Was too interested in configuring neovim, lmao

Look at slides for info, this one seemed important
@ -1,3 +1,3 @@
Lecture Topic:

Listened to lecturer and didn't take notes
@ -1,9 +1,9 @@
Lecture Topic: Continuous r.v.

Midterm Review session by math learning center
Midterm review will be on Feb 16, Friday

To evaluate the probability of an event for a continuous random variable X, you integrate its probability density function over the corresponding interval

The integral of the density function from $-\infty$ to $+\infty$ must equal 1
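
As a worked illustration (the density $f(x) = 2x$ on $[0, 1]$ is a made-up example, not from the lecture):

$$
\int_{-\infty}^{\infty} f(x)\,dx = \int_0^1 2x\,dx = 1,
\qquad
P(a \le X \le b) = \int_a^b f(x)\,dx,
\quad \text{e.g. } P\left(0 \le X \le \tfrac{1}{2}\right) = \int_0^{1/2} 2x\,dx = \tfrac{1}{4}
$$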
@ -1,8 +1,8 @@
Lecture Topic: Some Questions

Lecture will be short as lecturer needs to leave at 10:00

First thing to do when solving a normal distribution problem is to convert x to Z (standardize it):

$$Z = \frac{x-\mu}{\sigma}$$

The standard normal distribution has mean 0 and variance 1, so its standard deviation is also 1; Z follows this distribution.
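
A minimal sketch of the conversion, using Python's statistics.NormalDist (mu = 100, sigma = 15, x = 130 are made-up illustration values):

```python
from statistics import NormalDist

mu, sigma, x = 100, 15, 130   # made-up illustration values

z = (x - mu) / sigma          # convert x to Z
print(z)                      # 2.0

# P(X <= x) for X ~ N(mu, sigma) equals P(Z <= z) for the standard normal
print(NormalDist(mu, sigma).cdf(x))  # ~0.9772
print(NormalDist(0, 1).cdf(z))       # same value
```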
@ -1 +1 @@
Lecture Topic: Binomial Distribution
@ -1,36 +1,36 @@
Lecture Topic: Midterm Review 1

Will be Wednesday, Feb 21, 2024
Covers Lectures 1-13, Chapters 1-3
All multiple choice; 18-20 questions would fit 50 mins, so expect 16-17 since it is 45 minutes
45 minutes, show up early
Choose the closest answer if your answer does not line up exactly
YOU NEED A CALCULATOR FOR THIS

For your formula sheet you can put anything on it: 1 regular size sheet (11x8.5), both sides are allowed to be written on.

Questions on Bayes' theorem will likely take longer than others, so look for questions that are easier to solve first

## Basic Statistics

Mean - $\frac{\sum x}{n}$
Median - the middle-most value; with an even number of values it is the average of the two middle values
Mode - the most frequent value
Quartiles
- Q1 = 25th percentile, the middle of the data between the minimum and the median
- Q2 = median
- Q3 = 75th percentile, the middle of the data between the median and the maximum
- Q4 = 100% (the maximum)

## Variance

IQR - interquartile range = Q3 - Q1
Variance
- $\frac{1}{n-1} \sum (x_i - \bar{x})^2$ or $\frac{\sum(x_i - \bar{x})^2}{n-1}$
- shortcut form: $\frac{1}{n-1} \left( \sum x_i^2 - n\bar{x}^2 \right)$
- must be non-negative
std deviation - square root of the variance, $\sqrt{V(x)}$
Upper and lower fence - outlier limits (the usual rule is 1.5 times the IQR)
- UF = Q3 + 1.5 IQR
- LF = Q1 - 1.5 IQR
Z score - $\frac{x - \mu}{\sigma}$
Coefficient of variation - CV = $\frac{\sigma}{\mu} \times 100$
If sigma is not known, replace it with S, the sample standard deviation
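
A small sketch that computes these quantities for a made-up sample (the data values are purely for illustration):

```python
import statistics as st

data = [2, 4, 4, 5, 7, 9, 11, 30]      # made-up sample; 30 is a deliberate outlier

mean = st.mean(data)
median = st.median(data)
q1, q2, q3 = st.quantiles(data, n=4)   # quartiles
iqr = q3 - q1                          # interquartile range

var = st.variance(data)                # sample variance, divides by n - 1
sd = st.stdev(data)                    # square root of the variance

uf = q3 + 1.5 * iqr                    # upper fence
lf = q1 - 1.5 * iqr                    # lower fence
outliers = [x for x in data if x < lf or x > uf]

cv = sd / mean * 100                   # coefficient of variation, using S for sigma
z_scores = [(x - mean) / sd for x in data]

print(mean, median, (q1, q2, q3), iqr, var, sd, outliers, round(cv, 1))
```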
@ -1,3 +1,3 @@
Lecture Topic: Exam Review 2

Things to study: distributions, total probability theorem
@ -1,14 +1,14 @@
Exam Review:
Calculator required
Formula sheet: 1 page, 2 sided
Statistical tables provided
Everything is multiple choice
Put the scantron sheet in the booklet when returning it
Everything can be on the exam, including regression analysis
Class Wednesday is an optional question session
36-40 questions expected; stuff after confidence intervals may have many multiple choice per question

When solving a normal distribution problem we convert X to Z

We do this by subtracting mu from X and dividing by sigma: $Z = \frac{X - \mu}{\sigma}$

He basically just went over lecture slides again