Algebra & Calculus Review
(42 cards)
\(\ln e^{x} =\)
\(=x\)
\(e^{x\ln b} =\)
\( =b^x\)
Double Integral
Given a continuous function \(f(x,y)\) on the rectangular region bounded by \(x=a\), \(x=b\), \(y=c\), and \(y=d\), what is the definite integral of \(f\)?
Can be expressed in two ways:
\(\int_{a}^{b} \int_{c}^{d} f(x,y) \,dy\,dx \)
\(\int_{c}^{d} \int_{a}^{b} f(x,y) \,dx\,dy \)
\(\int a^x \,dx\)
\(\frac{a^x}{\ln a}\)
Derivative of \(a^x\)
\(a^x \cdot \ln a\)
Derivative of \(e^{g(x)}\)
\(g'(x)\cdot e^{g(x)}\)
Increasing geometric series sum
\(1 + 2r + 3r^2 + \cdots = \)
\(\frac{1}{(1-r)^2}\)
Derivative of \(\cos x\)
\(-\sin x\)
\(\sum_{x=0}^{\infty} \frac{a^x}{x!} =\)
\(= e^a\)
Derivative of \(e^x\)
\(e^x\)
\(b^{\log _{b} y} =\)
\(=y\)
Arithmetic Progression
Sum of the first \(n\) terms of the series \(a + (a + d) + (a + 2d) +\cdots\)
\(na + d \cdot \frac{n(n-1)}{2}\)
Integration by Parts
A technique of integration based on the product rule to find \(\int f(x) \cdot g'(x) \,dx\)
\(\int f(x) \cdot g'(x) \,dx = f(x) \cdot g(x) - \int f'(x) \cdot g(x) \,dx\)
Useful if \(f'(x) \cdot g(x)\) has an easier antiderivative to find than \(f(x) \cdot g'(x)\). May be necessary to apply integration by parts more than once to simplify an integral.
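A quick symbolic check of this identity (an illustrative sketch, assuming sympy is available; the example \(\int x e^x \,dx\) is not part of the card):

    # Verify integration by parts on ∫ x·e^x dx with f(x) = x, g'(x) = e^x,
    # so that f'(x)·g(x) = e^x is easy to integrate.
    import sympy as sp

    x = sp.symbols('x')
    f, g = x, sp.exp(x)                                    # g(x) = e^x, so g'(x) = e^x
    by_parts = f * g - sp.integrate(sp.diff(f, x) * g, x)  # f·g - ∫ f'·g dx
    direct = sp.integrate(f * sp.diff(g, x), x)            # ∫ f·g' dx computed directly
    print(sp.simplify(by_parts - direct))                  # 0, the two results agree
    print(direct)                                          # x*exp(x) - exp(x)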
One-to-one function
A function where each element of the range is paired with exactly one element of the domain (only one x value for each y value)
Partial Differentiation
Given function \(f(x,y)\), what is the partial derivative of \(f\) with respect to \(x\) at the point \((x_{0},y_{0})\)?
\(\frac{\partial f}{\partial x}\) is found by differentiating \(f\) with respect to \(x\) and regarding \(y\) as constant, then substituting the values \(x = x_{0}\) and \(y = y_{0}\)
if \(G(x) = \int_{a}^{h(x)} f(u) \,du\)
\(G'(x) =\)
\(= f[h(x)]\cdot h'(x)\)
Product rule
(derivative of \(g(x)\cdot h(x)\))
\(g'(x)\cdot h(x) + g(x)\cdot h'(x)\)
\(\int \sin x \,dx\)
\(-\cos x\)
Inverse of exponential function \(f(x) = b^x\)
\(x = \log_{b} y\)
Integration of \(f\) on \([a,b]\) when \(f\) is not defined at \(a\) or \(b\) or when \(a\) or \(b\) is \(\pm \infty\)
Integration over an infinite interval is defined by taking limits \( \int_{a}^{\infty} f(x) \,dx = \lim\limits_{b \to \infty} \int_{a}^{b} f(x) \,dx\)
If \(f\) is not defined or is discontinuous at \(x = a\) then \( \int_{a}^{b} f(x) \,dx = \lim\limits_{c \to a+} \int_{c}^{b} f(x) \,dx\)
If \(f\) is undefined or discontinuous at point \(x = c\) then \( \int_{a}^{b} f(x) \,dx = \int_{a}^{c} f(x) \,dx + \int_{c}^{b} f(x) \,dx\)
Quotient Rule
(derivative of \(\frac{g(x)}{h(x)}\))
\(\frac{h(x)g'(x)-g(x)h'(x)}{[h(x)]^2}\)
\(\int \cos x \,dx\)
\(\sin x\)
\( \int_{0}^{\infty} x^{n}e^{-cx} \,dx =\)
\(= \frac{n!}{c^{n+1}}\)
L'Hospital's First Rule
For limits of the form \(\lim\limits_{x \to c} \frac{f(x)}{g(x)}\)
IF
(i) \(\lim\limits_{x \to c} f(x) = \lim\limits_{x \to c} g(x) = 0\) and
(ii) \(f'(c)\) exists and
(iii) \(g'(c)\) exists and \(\neq\) 0
(note that 0 can be replaced with \(\pm \infty\))
THEN
\(\lim\limits_{x \to c} \frac{f(x)}{g(x)} = \frac{f'(c)}{g'(c)}\)
Derivative of \(\ln x\)
\(\frac{1}{x}\)
\(e^{\ln y}=\)
\(=y\)
Derivative of \(ln(g(x))\)
\(\frac{g'(x)}{g(x)}\)
if \(G(x) = \int_{g(x)}^{h(x)} f(u) \,du\)
\(G'(x) =\)
\(G'(x) = f[h(x)]\cdot h'(x) - f[g(x)]\cdot g'(x)\)
Integration by Substitution
To find \(\int f(x) \,dx\) we make the substitution \(u = g(x)\) for an appropriate function \(g(x)\), chosen so that the resulting integral in \(u\) has an antiderivative easier to find than the original.
The differential \(du\) is defined as \(du = g'(x)dx\) and we try to write \(\int f(x) \,dx\) as an integral with respect to the variable \(u\).
For example, to find \(\int(x^3-1)^{4/3}x^2 \,dx\), we let \(u = x^3 - 1\) so that \(du = 3x^2dx\) or equivalently, \(\frac{1}{3}\cdot du = x^2dx\) and the integral can be rewritten as \(\int u^{\frac{4}{3}}\cdot \frac{1}{3} \,du\)
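The hand computation above can be verified by differentiating the resulting antiderivative \(\frac{u^{7/3}}{7} = \frac{(x^3-1)^{7/3}}{7}\); a minimal check, assuming sympy:

    import sympy as sp

    x = sp.symbols('x')
    # From the card: ∫ u^(4/3)·(1/3) du = u^(7/3)/7 with u = x³ - 1
    antideriv = (x**3 - 1)**sp.Rational(7, 3) / 7
    integrand = (x**3 - 1)**sp.Rational(4, 3) * x**2
    print(sp.simplify(sp.diff(antideriv, x) - integrand))  # 0, so the antiderivative is correct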
if \(G(x) = \int_{g(x)}^{b} f(u) \,du\)
\(G'(x) =\)
\(G'(x) = -f[g(x)]\cdot g'(x)\)
Quadratic Formula
\(x = \frac{-b\pm \sqrt{b^2-4ac}}{2a}\)
Chain Rule
(derivative of \(g(h(x))\))
\(g'(h(x)) \cdot h'(x)\)
\(\int xe^{ax} \,dx\)
\(\frac{xe^{ax}}{a} - \frac{e^{ax}}{a^2}\)
\(\log _{b} \frac{y}{z} =\)
\(= \log _{b} y - \log _{b} z\)
Derivative of \(\log _{b} x\)
\(\frac{1}{x\ln b}\)
Geometric Progression
Sum of the first \(n\) terms \(a+ar+ar^2+\cdots + ar^{n-1}\)
Infinite series sum \(a+ar+ar^2+\cdots\)
Sum of the first \(n\) terms \(= a\cdot \frac{r^n-1}{r-1}\)
Infinite series sum \(= \frac{a}{1-r}\)
L'Hospital's Second Rule
For limits of the form \(\lim\limits_{x \to c} \frac{f(x)}{g(x)}\)
IF
(i) \(\lim\limits_{x \to c} f(x) = \lim\limits_{x \to c} g(x) = 0\) and
(ii) \(f\) and \(g\) are differentiable near \(c\) and
(iii) \(\lim\limits_{x \to c} \frac{f'(x)}{g'(x)}\) exists
(note that 0 can be replaced with \(\pm \infty\))
THEN
\(\lim\limits_{x \to c} \frac{f(x)}{g(x)} = \lim\limits_{x \to c} \frac{f'(x)}{g'(x)}\)
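An illustrative example (not from the card): \(\lim\limits_{x \to 0} \frac{1-\cos x}{x^2}\) is a \(\frac{0}{0}\) form; one application of the rule gives \(\lim\limits_{x \to 0} \frac{\sin x}{2x}\), a second gives \(\lim\limits_{x \to 0} \frac{\cos x}{2} = \frac{1}{2}\). A quick check, assuming sympy:

    import sympy as sp

    x = sp.symbols('x')
    print(sp.limit((1 - sp.cos(x)) / x**2, x, 0))  # 1/2
    print(sp.limit(sp.sin(x) / (2 * x), x, 0))     # 1/2, after one application of the rule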
\(\log_{b} y =\)
\(= \frac{\ln y}{\ln b}\)
Power rule
(derivative of \(cx^n\))
\(cnx^{n-1}\)
\(\int \frac{1}{x} \,dx\)
\(\ln x\)
Derivative of \(\sin x\)
\(\cos x\)
\(\log _{b} yz =\)
\(\log _{b} y + \log _{b} z\)
Basic Probability
(9 cards)
\((A\cup B)' =\)
\((A\cup B)' = A'\cap B'\)
Mutually exclusive events
Cannot occur simultaneously. No sample points in common. Also referred to as disjoint or as having empty intersection.
Event
A collection of sample points, a subset of the probability space. We say "event A has occurred" if the experimental outcome was one of the sample points in A.
Sample Point and Sample Space
A sample point is the simple outcome of a random experiment. The sample space or probability space is the collection of all possible sample points (outcomes) related to a specified experiment.
\(A\cup (B_{1}\cap B_{2}\cap \cdots B_{n}) =\)
\(A\cup (B_{1}\cap B_{2}\cap \cdots B_{n}) = (A\cup B_{1})\cap (A\cup B_{2})\cap \cdots \cap (A\cup B_{n})\)
Complement of event A
All sample points in the probability space that are not in A.
A'
\(A\cap (B_{1}\cup B_{2}\cup \cdots B_{n}) =\)
\(A\cap (B_{1}\cup B_{2}\cup \cdots B_{n}) = (A\cap B_{1})\cup (A\cap B_{2})\cup \cdots \cup (A\cap B_{n})\)
Partition of event A
Events \(C_{1}, C_{2}, ..., C_{n}\) form a partition of event A if the \(C_{i}\) are mutually exclusive and exhaustive of event A.
\((A\cap B)' =\)
\((A\cap B)' = A'\cup B'\)
Conditional Probability and Independence
(9 cards)
\(P[B|A] =\)
\(\frac{P[B\cap A]}{P[A]}\)
Bayes' rule and Theorem
For events A and B, P[A|B] =
(usually used to turn around conditioning of events A and B)
\(P[A|B] = \frac{P[A\cap B]}{P[B\cap A] + P[B\cap A']} = \frac{P[B|A]\cdot P[A]}{P[B|A]\cdot P[A] + P[B|A']\cdot P[A']}\)
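A numeric illustration with assumed values (not from the card): a test signals \(B\) with \(P[B|A] = 0.99\) for a condition \(A\) of prevalence \(P[A] = 0.01\), and has false-positive rate \(P[B|A'] = 0.05\):

    # Bayes' rule: turn P[B|A] into P[A|B] via the law of total probability.
    p_A, p_B_given_A, p_B_given_Ac = 0.01, 0.99, 0.05   # assumed values
    p_B = p_B_given_A * p_A + p_B_given_Ac * (1 - p_A)  # P[B|A]P[A] + P[B|A']P[A']
    print(p_B_given_A * p_A / p_B)                      # P[A|B] ≈ 0.1667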
Relationship between independent events A and B
\(P[A\cap B] = P[A]\cdot P[B]\)
Law of Total Probability
For events A and B, P[B] =
\(P[B] = P[B|A]\cdot P[A] + P[B|A']\cdot P[A']\)
\(P[A'|B] =\)
\(1 - P[A|B]\)
\(P[A\cup B|C] =\)
\(P[A\cup B|C] = P[A|C] + P[B|C] - P[A\cap B|C]\)
\(P[B\cap A] =\)
\(P[B|A]\cdot P[A]\)
\(P[A_{1} \cap A_{2}\cap \cdots \cap A_{n}] = \)
\(P[A_{1}]\cdot P[A_{2}|A_{1}]\cdot P[A_{3}|A_{1}\cap A_{2}]\cdots P[A_{n}|A_{1}\cap A_{2}\cap \cdots \cap A_{n-1}]\)
Combinatorial Principles, Permutations and Combinations
(6 cards)
Multinomial Theorem
In the power series expansion of \((t_{1} + t_{2} +\cdots + t_{s} )^N\) the coefficient of \(t_{1}^{k_{1}}\cdot t_{2}^{k_{2}}\cdots t_{s}^{k_{s}}\) is
\(\binom{N}{k_{1}, k_{2}, \cdots k_{s}} = \frac{N!}{k_{1}!\cdot k_{2}! \cdots k_{s}!}\)
For example, in the expansion of \((1+x+y)^4\), the coefficient of \(xy^2\) is the coefficient of \(1^{1}x^{1}y^{2}\) which is \(\frac{4!}{1!\cdot 1!\cdot 2!} = 12\)
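The worked example can be confirmed by brute-force expansion (a sketch, assuming sympy):

    import sympy as sp

    x, y = sp.symbols('x y')
    expansion = sp.expand((1 + x + y)**4)
    print(expansion.coeff(x, 1).coeff(y, 2))  # 12, matching 4!/(1!·1!·2!)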
Number of orderings of \(n\) objects with \(n_{1}\) of Type 1, \(n_{2}\) of Type 2, ..., \(n_{t}\) of Type \(t\)
\(\frac{n!}{n_{1}!\cdot n_{2}!\cdots n_{t}!}\)
Binomial Theorem
Power series expansion of \((1+t)^N\)
The coefficient of \(t^k\) is \(\binom{N}{k}\) so that \((1+t)^N = \sum_{k=0}^{\infty}\binom{N}{k}\cdot t^k = 1 + Nt + \frac{N(N-1)}{2}t^2 + \frac{N(N-1)(N-2)}{6}t^3 + \cdots\)
Ways of choosing a subset of size \(k = k_{1} + k_{2} + \cdots + k_{t}\) from \(n = n_{1} + n_{2} + \cdots + n_{t}\) objects, taking \(k_{1}\) of the \(n_{1}\) objects of Type 1, \(k_{2}\) of the \(n_{2}\) objects of Type 2, ..., and \(k_{t}\) of the \(n_{t}\) objects of Type \(t\)
\(\binom{n_{1}}{k_{1}}\cdot \binom{n_{2}}{k_{2}}\cdots \binom{n_{t}}{k_{t}}\)
Number of permutations of size \(k\) out of \(n\) distinct objects
Denoted \(_{n}P_{k}\)
\(\frac{n!}{(n-k)!}\)
Number of combinations of size \(k\) out of \(n\) distinct objects
Denoted \(\binom{n}{k}\) or \(_{n}C_{k}\)
\(\frac{n!}{k!\cdot (n-k)!}\)
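Both counts are available in Python's standard library, which gives a quick sanity check of the formulas (illustrative values \(n = 5\), \(k = 3\)):

    import math

    n, k = 5, 3
    print(math.perm(n, k))  # 60 = 5!/(5-3)!
    print(math.comb(n, k))  # 10 = 5!/(3!·2!)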
Random Variables and Probability Distributions
(5 cards)
Probability function of a discrete random variable
Usually denoted \(p(x), f_{X},\) or \(p_{x}\)
The probability that the value \(x\) occurs
Must satisfy:
(i) \(0 ≤ p(x) ≤ 1\) for all \(x\), and
(ii) \(\sum_{x} p(x) = 1\)
Survival function \(S(x)\)
The complement of the cumulative distribution function
\(S(x) = 1 - F(x) = P[X > x]\)
Probability density function (pdf)
Usually denoted \(f(x)\) or \(f_{X} (x)\)
Probabilities related to X are found by integrating the density function over an interval.
\(P[X\in (a,b)] = P[a < X < b] = \int_{a}^{b} f(x) dx\)
Must satisfy:
(i) \(f(x) ≥ 0\) for all \(x\), and
(ii) \(\int_{-\infty}^{\infty} f(x) dx = 1\)
Cumulative distribution function \(F(x)\) or \(F_{X} (x)\)
\(F(x) = P[X ≤ x]\)
For a discrete random variable, \(F(x) = \sum_{w ≤ x} p(w)\)
For a continuous random variable, \(F(x) = \int_{-\infty}^{x} f(t)dt\), and \(F(x)\) is a continuous, differentiable, non-decreasing function.
Expectation and Other Distribution Parameters
(23 cards)
Finding \(Var[X]\) using \(M_{X}(t)\) and logarithms
\(Var[X] = \frac{d^2}{dt^2} \ln[M_{X}(t)]\big|_{t=0}\)
\(E[aX+b]\)
\(= aE[X]+b\)
Skewness of a distribution
\(\frac{E[(X-μ)^3]}{σ^3}\)
If the skewness is positive, the distribution is said to be skewed to the right. If it is negative, it is skewed to the left.
Standard Deviation of \(X\)
\(σ_{X}\)
\(= \sqrt{Var[X]}\)
Finding \(E[X]\) using \(M_{X}(t)\) and logarithms
\(E[X] = \frac{d}{dt} \ln[M_{X}(t)]\big|_{t=0}\)
A mixture of distributions: expected values and moment generating functions for density function \(f(x) = a_{1}f_{1}(x) + a_{2}f_{2}(x) + \cdots + a_{k}f_{k}(x)\)
\(E[X^{n}] = a_{1}E[X_{1}^{n}] + a_{2}E[X_{2}^{n}] + \cdots + a_{k}E[X_{k}^{n}]\)
\(M_{X}(t) = a_{1}M_{X_1}(t) + a_{2}M_{X_2}(t) + \cdots + a_{k}M_{X_k}(t)\)
If X is a random variable defined on the interval \([a,b]\), \(E[X] =\)
\(E[X] = a + \int_{a}^{b}[1 - F(x)]dx\)
\(n\)-th moment of \(X\)
\(E[X^n]\)
Coefficient of variation of X
\(\frac{σ_{x}}{μ_{x}}\)
\(Var[aX+b]\)
\(Var[aX+b] = a^2 Var[X]\)
Median of distribution \(X\)
The point \(M\) for which \(P[X ≤ M] = .5\)
Taylor expansion \(e^y =\)
\(e^y = 1 + y + \frac{y^2}{2!} + \frac{y^3}{3!} + \cdots\)
Taylor series expansion of \(M_{X}(t)\) about the point \(t=0\)
\(M_{X}(t) = \sum_{k=0}^{\infty} \frac{t^k}{k!} E[X^k] = 1 + t\cdot E[X] + \frac{t^2}{2!} \cdot E[X^2] + \frac{t^3}{3!} \cdot E[X^3] + \cdots\)
For a discrete distribution with probability function \(p_{k}\),
\(M_{X}(t) = e^{tx_1}\cdot p_{1} + e^{tx_2}\cdot p_{2} + e^{tx_3}\cdot p_{3} + \cdots\)
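A minimal check for an assumed Bernoulli(\(\frac{1}{2}\)) example, where \(M_{X}(t) = \frac{1 + e^t}{2}\) and every moment equals \(\frac{1}{2}\) (sketch, assuming sympy):

    import sympy as sp

    t = sp.symbols('t')
    M = (1 + sp.exp(t)) / 2             # MGF of Bernoulli(1/2): e^{0·t}·(1/2) + e^{1·t}·(1/2)
    print(sp.diff(M, t).subs(t, 0))     # 1/2 = E[X]
    print(sp.diff(M, t, 2).subs(t, 0))  # 1/2 = E[X^2]
    print(sp.series(M, t, 0, 3))        # 1 + t/2 + t**2/4 + O(t**3), matching the expansion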
Chebyshev's inequality
For random variable \(X\) with mean \(μ_{x}\) and standard deviation \(σ_{x}\)
For any real number \(r > 0\)...
\(P[|X - μ_{x}| > rσ_{x}] ≤ \frac{1}{r^2}\)
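An empirical illustration by simulation (assumed exponential sample, not part of the card): the observed tail fraction must sit below the Chebyshev bound \(\frac{1}{r^2}\):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.exponential(scale=1.0, size=100_000)
    r, mu, sigma = 2, x.mean(), x.std()
    frac = np.mean(np.abs(x - mu) > r * sigma)  # fraction more than 2σ from the mean
    print(frac, frac <= 1 / r**2)               # ≈ 0.05, True (the bound is 0.25)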
Mode of a distribution
Any point \(m\) at which the probability or density function \(f(x)\) is maximized.
\(n\)-th central moment of \(X\) about the mean \(μ\)
\(E[(X-μ)^n]\)
Percentiles of distribution
For \(0 < p < 1\), the 100\(p\)-th percentile of the distribution of \(X\) is the number \(c_{p}\) which satisfies:
\(P[X ≤ c_{p}] ≥ p\)
\(P[X ≥ c_{p}] ≤ 1 - p\)
For constants \(a_{1}, a_{2}\), and \(b\) and functions \(h_{1}\) and \(h_{2}\),
\(E[a_{1}h_{1}(X) + a_{2}h_{2}(X) + b]\)
\(E[a_{1}h_{1}(X) + a_{2}h_{2}(X) + b] = a_{1}E[h_{1}(X)] + a_{2}E[h_{2}(X)] + b\)
Expected value of a random variable X
\(E[X], μ_{x}\)
For a discrete random variable:
\(E[X] = \sum x\cdot p(x) = x_{1}\cdot p(x_{1}) + x_{2}\cdot p(x_{2}) + \cdots\)
For a continuous random variable:
\(E[X] = \int_{-\infty}^{\infty} x\cdot f(x)dx\)
(Interval of integration is the interval of non-zero density for \(X\))
Expected value of function \(h(x)\)
\(E[h(x)]\)
For a discrete random variable:
\(E[h(x)] = \sum_{x} h(x)\cdot p(x)\)
For a continuous random variable with density function \(f(x)\):
\(E[h(x)] = \int_{-\infty}^{\infty} h(x)\cdot f(x)dx\)
Jensen's Inequality
For function \(h\) and random variable \(X\)
If \(h''(x) ≥ 0\) at all points \(x\) with non-zero probability for \(X\), then \(E[h(X)] ≥ h(E[X])\). If \(h''(x) > 0\) at all points \(x\) with non-zero probability for \(X\), then \(E[h(X)] > h(E[X])\).
The inequality reverses for \(h''(x) ≤ 0\)
Moment generating function of \(X\)
\(M_{X}(t)\)
\(M_{X}(t) = E[e^{tX}]\)
(i) It is always true that \(M_{X}(0) = 1\)
(ii) Moments of X can be found by successive derivatives of \(M_{X}(t)\), e.g. \(M_{X}'(0) = E[X]\), \(M_{X}''(0) = E[X^2]\), etc.
\(Var[X]\)
\(σ^2\) or \(σ_{x}^2\)
\(= E[(X-μ_{x})^2] = E[X^2] - (E[X])^2 = E[X^2] - μ_{x}^2\)
Frequently Used Discrete Distributions
(4 cards)
Uniform distribution on \(N\) points
\(p(x) = \frac{1}{N}\) for \(x = 1, 2, ..., N\)
\(E[X] = \frac{N+1}{2}\)
\(Var[X] = \frac{N^2-1}{12}\)
Poisson distribution with parameter λ
Often used as a model for counting the number of events of a certain type that occur in a certain period of time. (For example, if \(X\) represents the number of customers arriving for service at a bank in a one-hour period with mean λ, then the number arriving for service in two hours has a Poisson distribution with parameter 2λ.)
\(p(x) = \frac{e^{-λ}λ^x}{x!}\) for \(x = 0, 1, 2, ...\)
\(E[X] = Var[X] = λ\)
Binomial distribution with parameters \(n\) and \(p\)
A single trial of an experiment results in success with probability \(p\) or failure with probability \(1-p=q\). If \(n\) independent trials of the experiment are performed and \(X\) is the number of successes that occur, \(X\) is said to have a binomial distribution, denoted \(X \sim B(n,p)\).
\(p(x) = \binom{n}{x}p^x (1-p)^{n-x}\) for \(x = 0, 1, 2, ..., n\), the probability of exactly \(x\) successes in \(n\) trials.
\(E[X] = np\)
\(Var[X] = np(1-p)\)
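A sanity check of these formulas against scipy (assuming scipy is available; illustrative values \(n = 10\), \(p = 0.3\)):

    from scipy.stats import binom

    n, p = 10, 0.3
    mean, var = binom.stats(n, p, moments='mv')
    print(mean, var)           # 3.0 and 2.1, i.e. np and np(1-p)
    print(binom.pmf(4, n, p))  # C(10,4)·0.3^4·0.7^6 ≈ 0.2001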
Geometric distribution with parameter \(p\)
\(X\) represents the number of failures until the first success of an experiment with probability \(p\) of success.
\(p(x) = (1-p)^x p\) for \(x = 0, 1, 2, ...\)
\(E[X] = \frac{1-p}{p}\)
\(Var[X] = \frac{1-p}{p^2}\)
Frequently Used Continuous Distributions
(11 cards)
Link between exponential and Poisson distribution
Let \(X\) represent the time between successive occurrences of some type of event, where \(X\) has an exponential distribution with mean \(\frac{1}{λ}\) and time is measured in some appropriate units (seconds, minutes, hours, days, etc.)
Let \(N\) represent the number of events that have occurred when one unit of time has elapsed. Then \(N\) will be a random variable that has a Poisson distribution with mean \(λ\).
Integer correction for normal approximation of discrete random variables
If \(X\) is discrete and integer-valued, then integer correction may be applied in the following way:
\(P[n ≤ X ≤ m]\) is approximated by using a normal variable \(Y\) with the same mean and variance as \(X\) and finding the probability \(P[n -\frac{1}{2} ≤ Y ≤ m +\frac{1}{2}]\)
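A worked illustration with assumed numbers: \(X \sim B(100, 0.5)\), so \(Y \sim N(50, 25)\); the corrected approximation of \(P[45 ≤ X ≤ 55]\) is very close to the exact value (sketch, assuming scipy):

    from scipy.stats import binom, norm

    exact = binom.cdf(55, 100, 0.5) - binom.cdf(44, 100, 0.5)  # P[45 ≤ X ≤ 55]
    approx = norm.cdf(55.5, loc=50, scale=5) - norm.cdf(44.5, loc=50, scale=5)
    print(exact, approx)                                       # both ≈ 0.73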
Minimum of a collection of exponential random variables
If independent random variables \(Y_{1}\), \(Y_{2}\), ..., \(Y_{n}\) have exponential distributions with means \(\frac{1}{λ_{1}}\), \(\frac{1}{λ_{2}}\), ..., \(\frac{1}{λ_{n}}\), then \(Y = \min\{Y_{1}, Y_{2}, ..., Y_{n}\}\) has an exponential distribution with mean \(\frac{1}{λ_{1}+λ_{2}+...+λ_{n}}\)
Normal Distribution
Normal distribution \(X \sim N(μ,σ^2)\) has a mean of \(μ\) and variance of \(σ^2\)
\(f(x) = \frac{1}{σ\cdot \sqrt{2π}}\cdot e^{-\frac{(x-μ)^2}{2σ^2}}\)
\(E[X] =μ\)
\(Var[X] = σ^2\)
Standardizing normal random variables
For normal random variable \(X \sim N(μ,σ^2)\), find \(P[r < X < s]\) by standardizing with \(Z = \frac{X - μ}{σ}\). Then
\(P[r < X < s] = P[\frac{r-μ}{σ} < \frac{X-μ}{σ} < \frac{s-μ}{σ}] = \Phi(\frac{s-μ}{σ}) - \Phi(\frac{r-μ}{σ})\)
Combination of independent normal random variables \(W = X_{1} + X_{2}\), where \(X_{i} \sim N(μ_{i},σ_{i}^2)\)
\(W\) is also a normal random variable, with mean \(μ_{1} + μ_{2}\) and variance \(σ_{1}^2 + σ_{2}^2\).
Exponential distribution with mean \(\frac{1}{λ} > 0\)
Typically used to model the amount of time until a specific event occurs.
\(f(x) = λe^{-λx}\) for \(x > 0\)
\(F(x) = 1 - e^{-λx}\)
\(S(x) = e^{-λx}\)
\(E[X] = \frac{1}{λ}\)
\(Var[X] = \frac{1}{λ^2}\)
Lack of memory property: \(P[X > x + y|X > x] = P[X > y]\)
Uniform distribution on interval (a,b)
\(f(x) = \frac{1}{b-a}\)
\(F(x) = \int_{a}^{x} f(t) dt = \frac{x-a}{b-a}\) for \(a ≤ x ≤ b\)
\(E[X] = \frac{a+b}{2}\)
\(Var[X] = \frac{(b-a)^2}{12}\)
Finding 95th percentile of normal variable \(X\)~\(N(1,4)\)
\(P[X ≤ c] = .95\)
\(P[\frac{X - 1}{\sqrt{4}} ≤ \frac{c - 1}{\sqrt{4}}] = \Phi(\frac{c - 1}{\sqrt{4}}) = .95\)
\(\frac{c - 1}{\sqrt{4}} = 1.645\)
\(c = 4.29\)
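The same computation via an inverse normal CDF (sketch, assuming scipy; \(N(1,4)\) means standard deviation \(\sqrt{4} = 2\)):

    from scipy.stats import norm

    print(norm.ppf(0.95, loc=1, scale=2))  # ≈ 4.29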
Standard Normal Distribution
Standard normal distribution \(Z \sim N(0,1)\) has a mean of 0 and variance of 1. A table of probabilities is provided on the exam.
\(f(x) = \frac{1}{\sqrt{2π}}\cdot e^{-\frac{x^2}{2}}\)
\(E[Z] = 0\)
\(Var[Z] = 1\)
Gamma distribution with parameters \(n > 0\) and \(β > 0\)
\(f(x) = \frac{β^n \cdot x^{n-1} \cdot e^{-βx}}{\Gamma(n)}\) for \(x > 0\), where \(\Gamma(n) = (n-1)!\) when \(n\) is a positive integer
\(E[X] = \frac{n}{β}\)
\(Var[X] = \frac{n}{β^2}\)
Joint, Marginal, and Conditional Distributions
(11 cards)
Independence of random variables \(X\) and \(Y\)
\(X\) and \(Y\) are independent if the region of non-zero probability is rectangular (endpoints can be infinite) and
\(f(x,y) = f_{X}(x)\cdot f_{Y}(y)\)
Which is equivalent to
\(F(x,y) = F_{X}(x)\cdot F_{Y}(y)\) for all \(x,y\)
\(E[E[X|Y]] =\)
\(=E[X]\)
Marginal distribution of \(X\) found from a joint distribution of \(X\) and \(Y\)
If \(X\) and \(Y\) have a joint distribution with density function \(f(x,y)\), the marginal distribution of \(X\) has a density function \(f_{X}(x)\) which is equal to
\(f_{X}(x) = \sum_{y} f(x,y)\) in the discrete case and
\(f_{X}(x) = \int f(x,y)dy\) in the continuous case.
Note that \(F_{X}(x) = \lim_{y \to \infty} F(x,y)\)
Covariance between random variables \(X\) and \(Y\)
\(Cov[X,Y] = E[XY] - E[X]E[Y]\)
Also, note
\(Var[aX+bY+c] = a^2 Var[X] + b^2 Var[Y] + 2abCov[X,Y]\)
Expectation of a function of jointly distributed random variables
Discrete case:
\(E[h(X,Y)] = \sum_{x}\sum_{y} h(x,y)\cdot f(x,y)\)
Continuous case:
\(E[h(X,Y)] = \int \int h(x,y)\cdot f(x,y)dydx\)
Joint distribution of variables \(X\) and \(Y\)
The probability \(P[(X = x)\cap (Y = y)]\) for each pair \((x,y)\) of possible outcomes.
For discrete random variables:
(i) \(0 ≤ f(x,y) ≤1\) and
(ii) \(\sum_{x}\sum_{y} f(x,y) = 1\).
For continuous random variables:
(i) \(f(x,y) ≥ 0\) and
(ii) \(\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)dydx = 1\)
\(Var[X]\) using conditional distributions
\(Var[X] = E[Var[X|Y]] + Var[E[X|Y]]\)
Coefficient of correlation between random variables \(X\) and \(Y\)
\(\frac{Cov[X,Y]}{σ_{X}σ_{Y}}\)
Moment generating function of a joint distribution
\(M_{X,Y}(t_{1}, t_{2}) = E[e^{t_{1}X + t_{2}Y}]\)
\(E[X^n Y^m] = \frac{\partial^{n+m}}{\partial t_{1}^{n}\,\partial t_{2}^{m}} M_{X,Y}(t_{1}, t_{2})\) evaluated at \(t_{1} = t_{2} = 0\)
Pdf of a uniform joint distribution of \(X\) and \(Y\) on region \(R\)
\(f(x,y) = \frac{1}{\text{Area of } R}\)
Probability of event \(A\): \(P[A] = \frac{\text{Area of } A}{\text{Area of } R}\)
\(f_{Y|X}(y|X=x)\) has a uniform distribution on the line segment defined by the intersection of the region \(R\) with the line \(X=x\)
Conditional distribution of \(Y\) given \(X = x\)
\(f_{Y|X}(y|X = x) = \frac{f(x,y)}{f_{X}(x)}\)
\(E[Y|X = x] = \int y\cdot f_{Y|X}(y|X = x)dy\)
If \(X\) and \(Y\) are independent, then
\(f_{Y|X}(y|X = x) = f_{Y}(y)\)
Also note that
\(f(x,y) = f_{Y|X}(y|X = x)\cdot f_{X}(x)\)
Important Formulas to Memorize
(5 cards)
Infinite sum of a geometric progression \(a + ar + ar^2 + ...\)
\(a + ar + ar^2 + ... = \frac{a}{1-r}\)
\(\int xe^{ax} dx =\)
\(\int xe^{ax} dx = \frac{xe^{ax}}{a} - \frac{e^{ax}}{a^2}\)
\(\int_{0}^{\infty} x^ne^{-cx} dx\)
\(\int_{0}^{\infty} x^ne^{-cx} dx = \frac{n!}{c^{n+1}}\)
Sum of the first \(n\) terms of an arithmetic progression \(a + (a + d) + (a + 2d) + (a + 3d) + ... + (a + (n-1)d)\)
\(= na + d\cdot \frac{n(n-1)}{2}\)
Sum of the first \(n\) terms of a geometric progression \(a + ar + ar^2 + ... + ar^{n-1}\)
\(= a\cdot \frac{1-r^n}{1-r}\)
Functions and Transformations of Random Variables
(11 cards)
Distribution of the sum of \(k\) independent Normal variables
\(Y \sim N(\sum μ_i,\sum σ_i^2)\)
Pdf of \(Y = u(X)\)
Can be found in two ways:
(i) \(f_Y(y) = f_X(v(y))\cdot|v'(y)|\)
(ii) \(F_Y(y) = F_X(v(y)), f_Y(y) = F'_Y(y)\) for strictly increasing functions
Central Limit Theorem
If \(X_1, X_2, \ldots, X_n\) are independent random variables each with mean μ and standard deviation σ, and \(Y = X_1 + X_2 + \cdots + X_n\), then \(E[Y] = nμ\) and \(Var[Y] = nσ^2\)
As \(n\) increases, the distribution of \(Y\) approaches a normal distribution \(N(nμ,nσ^2)\)
If an exam question asks for probability involving a sum of a large number of independent random variables, it is usually asking for the normal approximation to be applied.
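A simulation sketch of the approximation (assumed exponential summands, not part of the card): sums of \(n = 50\) exponential variables with μ = σ = 1 should have mean and variance near 50 and be approximately \(N(50, 50)\):

    import numpy as np

    rng = np.random.default_rng(1)
    sums = rng.exponential(scale=1.0, size=(100_000, 50)).sum(axis=1)  # 100,000 draws of Y
    print(sums.mean(), sums.var())  # both ≈ 50, as the CLT predicts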
Distribution of the sum of \(k\) independent Poisson variables
\(Y \sim P(\sum λ_i)\)
Distribution of the sum of independent discrete random variables
\(P[X_1+X_2 = k] = \sum_{x_1=0}^k f_1(x_1)\cdot f_2(k-x_1)\)
Distribution of the sum of continuous random variables \(Y = X_1 + X_2\)
\(f_Y(y) = \int_{-\infty}^{\infty} f(x_1, y-x_1)dx_1\)
For independent continuous variables,
\(f_Y(y) = \int_{-\infty}^{\infty} f_1(x_1)\cdot f_2(y-x_1)dx_1\)
If \(X_1>0\) and \(X_2>0\), then
\(f_Y(y) = \int_{0}^{y} f(x_1, y-x_1)dx_1\)
Distribution of the sum of discrete random variables
\(P[X_1+X_2 = k] = \sum_{x_1=0}^k f(x_1, k-x_1)\)
Transformation of \(X\)
\(Y = u(X)\)
\(u(x)\) is a one-to-one function, strictly increasing or decreasing, with inverse function \(v\) so that \(v(u(x)) = x\)
\(Y = u(X)\) is referred to as a transformation of X.
Moment generating function of a sum of independent random variables \(Y = X_{1} + X_{2} + \cdots + X_{n}\)
\(M_Y(t) = M_{X_1}(t)\cdot M_{X_2}(t)\cdots M_{X_n}(t)\)
Distribution of the maximum or minimum of a collection of random variables
\(X_1\) and \(X_2\) are independent random variables.
\(U = \max\{X_1,X_2\}\) and \(V = \min\{X_1,X_2\}\)
\(F_U(u) = P[(X_1≤u)\cap (X_2≤u)] = P[X_1≤u]\cdot P[X_2≤u] = F_1(u)\cdot F_2(u)\)
similarly
\(F_V(v) = 1 - [1 - F_1(v)]\cdot [1 - F_2(v)]\)
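A simulation check with assumed independent uniform variables on \((0,1)\), where \(F_U(u) = u^2\) and \(F_V(v) = 1 - (1-v)^2\):

    import numpy as np

    rng = np.random.default_rng(2)
    x1, x2 = rng.random(100_000), rng.random(100_000)
    u, v = 0.7, 0.3
    print(np.mean(np.maximum(x1, x2) <= u), u**2)            # both ≈ 0.49
    print(np.mean(np.minimum(x1, x2) <= v), 1 - (1 - v)**2)  # both ≈ 0.51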
Distribution of the sum of \(k\) independent binomial variables
\(Y\)~\(B(\sum n_i ,p)\)
Risk Management Concepts
(2 cards)
Policy Limit
A policy limit of amount \(u\) indicates that the insurer will pay a maximum amount of \(u\) when a loss occurs. The amount paid by the insurer is \(X\) if \(X≤u\) and \(u\) if \(X>u\).
The expected payment is \(\int_0^u [1-F_X(x)]dx\)
Deductible Insurance
If a loss of amount \(X\) occurs, the insurer pays nothing if the loss is less than \(d\) and pays the amount of the loss in excess of \(d\) if the loss is greater than \(d\).
\(Y =\)
\(0\) if \(X≤d\)
\(X-d\) if \(X>d\)
\(= \max(X-d,0)\)
The expected payment is \(\int_d^{\infty} (x-d)f_X(x)dx = \int_d^{\infty} [1-F_X(x)]dx\)
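A numeric sketch under an assumed loss model (exponential with mean 1000, deductible \(d = 500\)); both integral forms should agree at \(1000 e^{-1/2} ≈ 606.5\):

    import numpy as np
    from scipy.integrate import quad

    lam, d = 1 / 1000, 500
    f = lambda x: lam * np.exp(-lam * x)  # density of the loss X
    S = lambda x: np.exp(-lam * x)        # survival function 1 - F_X(x)
    e1, _ = quad(lambda x: (x - d) * f(x), d, np.inf)
    e2, _ = quad(S, d, np.inf)
    print(e1, e2)                         # both ≈ 606.53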