Actuarial Exam P

For cramming

Pelham Delaney (lvl 10)

Cards (137)

Algebra & Calculus Review

(42 cards)

\(\ln e^{x} =\)

Front

\(=x\)

Back

\(e^{x\ln b} =\)

Front

\( =b^x\)

Back

Double Integral

Given a continuous function \(f(x,y)\) on the rectangular region bounded by \(x=a\), \(x=b\), \(y=c\), and \(y=d\), what is the definite integral of  \(f\)?

Front

Can be expressed in two ways:

\(\int_{a}^{b} \int_{c}^{d} f(x,y) \,dy\,dx \)

\(\int_{c}^{d} \int_{a}^{b} f(x,y) \,dx\,dy \)

Back

\(\int a^x \,dx\)

Front

\(\frac{a^x}{\ln a}\)

Back

Derivative of \(a^x\)

Front

\(a^x \cdot \ln a\)

Back

Derivative of \(e^{g(x)}\)

Front

\(g'(x)\cdot e^{g(x)}\)

Back

Increasing geometric series sum

\(1 + 2r + 3r^2 + \cdots = \)

Front

\(\frac{1}{(1-r)^2}\)

Back

Derivative of \(\cos x\)

Front

\(-\sin x\)

Back

\(\sum_{x=0}^{\infty} \frac{a^x}{x!} =\)

Front

\(= e^a\)

Back

Derivative of \(e^x\)

Front

\(e^x\)

Back

\(b^{\log _{b} y} =\)

Front

\(=y\)

Back

Arithmetic Progression

Sum of the first \(n\) terms of the series \(a + (a + d) + (a + 2d) +\cdots\)

Front

\(na + d \cdot \frac{n(n-1)}{2}\)

Back

Integration by Parts

A technique of integration based on the product rule to find \(\int f(x) \cdot g'(x) \,dx\)

Front

\(\int f(x) \cdot g'(x) \,dx = f(x) \cdot g(x) - \int f'(x) \cdot g(x) \,dx\)

 

Useful if \(f'(x) \cdot g(x)\) has an easier antiderivative to find than \(f(x) \cdot g'(x)\). May be necessary to apply integration by parts more than once to simplify an integral.

Back
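As a quick numerical sanity check, the Python sketch below compares both sides of the formula for the illustrative choice \(f(x) = x\), \(g(x) = e^x\) on \([0,1]\); the integrand, interval, and trapezoid step are assumptions made for this example, not part of the card.

```python
import math

def trapezoid(func, a, b, n=10_000):
    """Approximate the definite integral of func on [a, b] with the trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (func(a) + func(b)) + sum(func(a + i * h) for i in range(1, n))
    return total * h

# f(x) = x, g'(x) = e^x on [0, 1]:
# int_0^1 x*e^x dx  should equal  [x*e^x]_0^1 - int_0^1 e^x dx.
a, b = 0.0, 1.0
lhs = trapezoid(lambda x: x * math.exp(x), a, b)
rhs = (b * math.exp(b) - a * math.exp(a)) - trapezoid(math.exp, a, b)
print(lhs, rhs)  # both approximately 1.0
```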

One-to-one function

Front

A function where each element of the range is paired with exactly one element of the domain (only one x value for each y value)

Back

Partial Differentiation

Given function \(f(x,y)\), what is the partial derivative of \(f\) with respect to \(x\) at the point \((x_{0},y_{0})\)?

Front

\(\frac{\partial f}{\partial x}\) is found by differentiating \(f\) with respect to \(x\) and regarding \(y\) as constant, then substituting the values \(x = x_{0}\) and \(y = y_{0}\)

Back

if \(G(x) = \int_{a}^{h(x)} f(u) \,du\)

\(G'(x) =\)

Front

\(= f[h(x)]\cdot h'(x)\)

Back

Product rule

(derivative of \(g(x)\cdot h(x)\))

Front

\(g'(x)\cdot h(x) + g(x)\cdot h'(x)\)

Back

\(\int \sin x \,dx\)

Front

\(-\cos x\)

Back

Inverse of exponential function \(f(x) = b^x\)

Front

\(x = \log_{b} y\)

Back

Integration of \(f\) on \([a,b]\) when \(f\) is not defined at \(a\) or \(b\) or when \(a\) or \(b\) is \(\pm \infty\)

Front

Integration over an infinite interval is defined by taking limits \( \int_{a}^{\infty} f(x) \,dx = \lim\limits_{b \to \infty} \int_{a}^{b} f(x) \,dx\)

If \(f\) is not defined or is discontinuous at \(x = a\) then \( \int_{a}^{b} f(x) \,dx = \lim\limits_{c \to a+} \int_{c}^{b} f(x) \,dx\)

If \(f\) is undefined or discontinuous at point \(x = c\) then \( \int_{a}^{b} f(x) \,dx = \int_{a}^{c} f(x) \,dx + \int_{c}^{b} f(x) \,dx\)

 

Back
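A minimal numeric illustration of the limit definitions, using two integrands chosen only for this example (\(e^{-x}\) on \([0,\infty)\), and \(x^{-1/2}\), which is undefined at \(x = 0\)):

```python
import math

# Infinite upper limit: int_0^inf e^{-x} dx = lim_{b->inf} (1 - e^{-b}) = 1
for b in [1, 5, 10, 20]:
    print("b =", b, "->", 1 - math.exp(-b))

# Integrand undefined at the lower endpoint: int_0^1 x^(-1/2) dx = lim_{c->0+} (2 - 2*sqrt(c)) = 2
for c in [1e-1, 1e-3, 1e-6]:
    print("c =", c, "->", 2 - 2 * math.sqrt(c))
```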

Quotient Rule

(derivative of \(\frac{g(x)}{h(x)}\))

Front

\(\frac{h(x)g'(x)-g(x)h'(x)}{[h(x)]^2}\)

Back

\(\int \cos x \,dx\)

Front

\(\sin x\)

Back

\( \int_{0}^{\infty} x^{n}e^{-cx} \,dx =\)

Front

 \(= \frac{n!}{c^{n+1}}\)

Back

L'Hospital's First Rule

For limits of the form \(\lim\limits_{x \to c} \frac{f(x)}{g(x)}\)

IF

(i) \(\lim\limits_{x \to c} f(x) = \lim\limits_{x \to c} g(x) = 0\) and

(ii) \(f'(c)\) exists and

(iii) \(g'(c)\) exists and \(\neq\) 0

(note that 0 can be replaced with \(\pm \infty\))

THEN

Front

\(\lim\limits_{x \to c} \frac{f(x)}{g(x)} = \frac{f'(c)}{g'(c)}\)

Back

Derivative of \(\ln x\)

Front

\(\frac{1}{x}\)

Back

\(e^{\ln y}=\)

Front

\(=y\)

Back

Derivative of \(ln(g(x))\)

Front

\(\frac{g'(x)}{g(x)}\)

Back

if \(G(x) = \int_{g(x)}^{h(x)} f(u) \,du\)

\(G'(x) =\)

Front

\(G'(x) = f[h(x)]\cdot h'(x) - f[g(x)]\cdot g'(x)\)

Back

Integration by Substitution

To find \(\int f(x) \,dx\), we make the substitution \(u = g(x)\) for an appropriate function \(g(x)\), chosen so that the resulting integral in \(u\) has an antiderivative that is easier to find than the original.

Front

The differential \(du\) is defined as \(du = g'(x)dx\) and we try to write \(\int f(x) \,dx\) as an integral with respect to the variable \(u\).

For example, to find \(\int(x^3-1)^{4/3}x^2 \,dx\), we let \(u = x^3 - 1\) so that \(du = 3x^2dx\) or equivalently, \(\frac{1}{3}\cdot du = x^2dx\) and the integral can be rewritten as \(\int u^{\frac{4}{3}}\cdot \frac{1}{3} \,du\)

Back
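The card's example can be checked numerically on a definite interval; the sketch below uses \([1,2]\), where the antiderivative \(\frac{u^{7/3}}{7} = \frac{(x^3-1)^{7/3}}{7}\) gives the exact value \(7^{4/3}\) (the interval and step size are illustrative assumptions).

```python
import math

def trapezoid(func, a, b, n=20_000):
    """Approximate the definite integral of func on [a, b] with the trapezoid rule."""
    h = (b - a) / n
    total = 0.5 * (func(a) + func(b)) + sum(func(a + i * h) for i in range(1, n))
    return total * h

# int_1^2 (x^3 - 1)^(4/3) * x^2 dx, via the substitution u = x^3 - 1, equals (2^3 - 1)^(7/3)/7 = 7^(4/3).
direct = trapezoid(lambda x: (x**3 - 1) ** (4 / 3) * x**2, 1.0, 2.0)
via_substitution = 7 ** (4 / 3)
print(direct, via_substitution)  # both approximately 13.39
```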

if \(G(x) = \int_{g(x)}^{b} f(u) \,du\)

\(G'(x) =\)

Front

\(G'(x) = -f[g(x)]\cdot g'(x)\)

Back

Quadratic Formula

Front

\(x = \frac{-b\pm \sqrt{b^2-4ac}}{2a}\)

Back

Chain Rule

(derivative of \(g(h(x))\))

Front

\(g'(h(x)) \cdot h'(x)\)

Back

\(\int xe^{ax} \,dx\)

Front

\(\frac{xe^{ax}}{a} - \frac{e^{ax}}{a^2}\)

Back

\(\log _{b} \frac{y}{z} =\)

Front

\(= \log _{b} y - \log _{b} z\)

Back

Derivative of \(\log _{b} x\)

Front

\(\frac{1}{x\ln b}\)

Back

Geometric Progression

Sum of the first \(n\) terms \(a+ar+ar^2+\cdots + ar^{n-1}\)

Infinite series sum \(a+ar+ar^2+\cdots\)

Front

Sum of the first \(n\) terms \(= a\cdot \frac{r^n-1}{r-1}\)

Infinite series sum \(= \frac{a}{1-r}\)

Back

L'Hospital's Second Rule

For limits of the form \(\lim\limits_{x \to c} \frac{f(x)}{g(x)}\)

IF

(i) \(\lim\limits_{x \to c} f(x) = \lim\limits_{x \to c} g(x) = 0\) and

(ii) \(f\) and \(g\) are differentiable near \(c\) and

(iii) \(\lim\limits_{x \to c} \frac{f'(x)}{g'(x)}\) exists

(note that 0 can be replaced with \(\pm \infty\))

THEN

Front

\(\lim\limits_{x \to c} \frac{f(x)}{g(x)} = \lim\limits_{x \to c} \frac{f'(x)}{g'(x)}\)

Back
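A tiny numeric illustration with the assumed example \(\lim_{x \to 0} \frac{\sin x}{x}\) (a 0/0 form), where the rule gives \(\lim_{x \to 0} \frac{\cos x}{1} = 1\):

```python
import math

# sin(x)/x and the derivative ratio cos(x)/1 both approach 1 as x -> 0.
for x in [0.1, 0.01, 0.001]:
    print(x, math.sin(x) / x, math.cos(x) / 1)
```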

\(\log_{b} y =\)

Front

\(= \frac{\ln y}{\ln b}\)

Back

Power rule

(derivative of \(cx^n\))

Front

\(cnx^{n-1}\)

Back

\(\int \frac{1}{x} \,dx\)

Front

\(\ln x\)

Back

Derivative of \(\sin x\)

Front

\(\cos x\)

Back

\(\log _{b} yz =\)

Front

\(\log _{b} y + \log _{b} z\)

Back

Basic Probability

(9 cards)

\((A\cup B)' =\)

Front

\((A\cup B)' = A'\cap B'\)

Back

Mutually exclusive events

Front

Cannot occur simultaneously. No sample points in common. Also referred to as disjoint or as having empty intersection.

Back

Event

Front

A collection of sample points, a subset of the probability space. We say "event A has occurred" if the experimental outcome was one of the sample points in A.

Back

Sample Point and Sample Space

Front

A sample point is the simple outcome of a random experiment. The sample space or probability space is the collection of all possible sample points (outcomes) related to a specified experiment.

Back

\(A\cup (B_{1}\cap B_{2}\cap \cdots \cap B_{n}) =\)

Front

\(A\cup (B_{1}\cap B_{2}\cap \cdots \cap B_{n}) = (A\cup B_{1})\cap (A\cup B_{2})\cap \cdots \cap (A\cup B_{n})\)

Back

Complement of event A

Front

All sample points in the probability space that are not in A.

A'

Back

\(A\cap (B_{1}\cup B_{2}\cup \cdots \cup B_{n}) =\)

Front

\(A\cap (B_{1}\cup B_{2}\cup \cdots \cup B_{n}) = (A\cap B_{1})\cup (A\cap B_{2})\cup \cdots \cup (A\cap B_{n})\)

Back

Partition of event A

Front

Events \(C_{1}, C_{2}, \ldots, C_{n}\) form a partition of event A if the \(C_{i}\) are mutually exclusive and their union is A.

Back

\((A\cap B)' =\)

Front

\((A\cap B)' = A'\cup B'\)

Back
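Both De Morgan identities can be spot-checked on a small finite sample space with Python sets; the ten-point space and the particular events below are arbitrary illustrative choices.

```python
# Sample space of ten points and two overlapping events (illustrative only).
space = set(range(1, 11))
A = {1, 2, 3, 4}
B = {3, 4, 5, 6}

def complement(event):
    """Complement relative to the sample space."""
    return space - event

print(complement(A | B) == complement(A) & complement(B))  # (A u B)' = A' n B'  -> True
print(complement(A & B) == complement(A) | complement(B))  # (A n B)' = A' u B'  -> True
```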

Conditional Probability and Independence

(8 cards)

\(P[B|A] =\)

Front

\(\frac{P[B\cap A]}{P[A]}\)

Back

Bayes' rule and Theorem

For events A and B, P[A|B] =

(usually used to turn around conditioning of events A and B)

Front

\(P[A|B] = \frac{P[A\cap B]}{P[B\cap A] + P[B\cap A']} = \frac{P[B|A]\cdot P[A]}{P[B|A]\cdot P[A] + P[B|A']\cdot P[A']}\)

Back
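A short numeric example of turning the conditioning around; the probabilities below (P[A] = 0.01, P[B|A] = 0.95, P[B|A'] = 0.10) are assumed values used only for illustration.

```python
# Assumed inputs for the example.
p_a = 0.01
p_b_given_a = 0.95
p_b_given_not_a = 0.10

# Denominator by the law of total probability, then Bayes' theorem.
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b
print(p_a_given_b)  # approximately 0.0876
```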

Relationship between independent events A and B

Front

\(P[A\cap B] = P[A]\cdot P[B]\)

Back

Law of Total Probability

For events A and B, P[B] =

Front

\(P[B] = P[B|A]\cdot P[A] + P[B|A']\cdot P[A']\)

Back

\(P[A'|B] =\)

Front

\(1 - P[A|B]\)

Back

\(P[A\cup B|C] =\)

Front

\(P[A\cup B|C] = P[A|C] + P[B|C] - P[A\cap B|C]\)

Back

\(P[B\cap A] =\)

Front

\(P[B|A]\cdot P[A]\)

Back

\(P[A_{1} \cap A_{2}\cap \cdots \cap A_{n}] = \)

Front

\(P[A_{1}]\cdot P[A_{2}|A_{1}]\cdot P[A_{3}|A_{1}\cap A_{2}]\cdots P[A_{n}|A_{1}\cap A_{2}\cap \cdots \cap A_{n-1}]\)

Back

Combinatorial Principles, Permutations and Combinations

(6 cards)

Multinomial Theorem

In the power series expansion of \((t_{1} + t_{2} +\cdots + t_{s} )^N\) the coefficient of \(t_{1}^{k_{1}}\cdot t_{2}^{k_{2}}\cdots t_{s}^{k_{s}}\) is

Front

\(\binom{N}{k_{1}, k_{2}, \cdots k_{s}} = \frac{N!}{k_{1}!\cdot k_{2}! \cdots k_{s}!}\)

For example, in the expansion of \((1+x+y)^4\), the coefficient of \(xy^2\) is the coefficient of \(1^{1}x^{1}y^{2}\) which is \(\frac{4!}{1!\cdot 1!\cdot 2!} = 12\)

Back
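The worked example on the back can be verified by brute force: pick one term from each of the four factors of \((1+x+y)^4\) and count the selections that produce \(x^{1}y^{2}\). This is only a sanity-check sketch, not part of the card.

```python
from itertools import product
from math import factorial

# Count selections (one term per factor) that contribute to the x^1 * y^2 term.
count = sum(
    1
    for picks in product(("1", "x", "y"), repeat=4)
    if picks.count("x") == 1 and picks.count("y") == 2
)
print(count, factorial(4) // (factorial(1) * factorial(1) * factorial(2)))  # 12 12
```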

Number of orderings of \(n\) objects with \(n_{1}\) of Type 1, \(n_{2}\) of Type 2, ..., \(n_{t}\) of Type \(t\)

Front

\(\frac{n!}{n_{1}!\cdot n_{2}!\cdots n_{t}!}\)

Back

Binomial Theorem

Power series expansion of \((1+t)^N\)

Front

The coefficient of \(t^k\) is \(\binom{N}{k}\) so that \((1+t)^N = \sum_{k=0}^{\infty}\binom{N}{k}\cdot t^k = 1 + Nt + \frac{N(N-1)}{2}t^2 + \frac{N(N-1)(N-2)}{6}t^3 + \cdots\)

Back

Number of ways of choosing a subset from \(n\) objects (with \(n_{1}\) objects of Type 1, \(n_{2}\) of Type 2, ..., \(n_{t}\) of Type \(t\)) that contains \(k_{1}\) objects of Type 1, \(k_{2}\) objects of Type 2, ..., and \(k_{t}\) objects of Type \(t\)

Front

\(\binom{n_{1}}{k_{1}}\cdot \binom{n_{2}}{k_{2}}\cdots \binom{n_{t}}{k_{t}}\)

Back

Number of permutations of size \(k\) out of \(n\) distinct objects

Front

Denoted \(_{n}P_{k}\)

\(\frac{n!}{(n-k)!}\)

Back

Number of combinations of size \(k\) out of \(n\) distinct objects

Front

Denoted \(\binom{n}{k}\) or \(_{n}C_{k}\)

\(\frac{n!}{k!\cdot (n-k)!}\)

Back
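Both counts are available directly in Python's standard library (math.perm and math.comb); the values n = 10, k = 3 are an assumed example.

```python
from math import comb, factorial, perm

n, k = 10, 3
print(perm(n, k), factorial(n) // factorial(n - k))                   # 720 720
print(comb(n, k), factorial(n) // (factorial(k) * factorial(n - k)))  # 120 120
```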

Random Variables and Probability Distributions

(5 cards)

Probability function of a discrete random variable

Front

Usually denoted \(p(x), f_{X},\) or \(p_{x}\)

The probability that the value \(x\) occurs

(i) \(0 ≤ p(x) ≤1\) for all \(x\), and

(ii) \(\sum_{x} p(x) = 1\)

Back

Survival function \(S(x)\)

Front

The complement of the cumulative distribution function

\(S(x) = 1 - F(x) = P[X > x]\)

Back

Probability density function (pdf)

Front

Usually denoted \(f(x)\) or \(f_{X} (x)\)

Probabilities related to X are found by integrating the density function over an interval.

\(P[X\in (a,b)] = P[a < X < b] = \int_{a}^{b} f(x) dx\)

Must satisfy:

(i) \(f(x) ≥ 0\) for all \(x\), and

(ii) \(\int_{-\infty}^{\infty} f(x) dx = 1\)

Back

Cumulative distribution function \(F(x)\) or \(F_{X} (x)\)

Front

\(F(x) = P[X ≤ x]\)

For a discrete random variable, \(F(x) = \sum_{w ≤ x} p(w)\)

For a continuous random variable, \(F(x) = \int_{-\infty}^{x} f(t)dt\) and \(F(x)\) is a continuous, differentiable, non-decreasing function.

Back

Expectation and Other Distribution Parameters

(23 cards)

Finding \(Var[X]\) using \(M_{X}(t)\) and logarithms

Front

\(Var[X] = \frac{d^2}{dt^2} ln[M_{X}(t)]|_{t=0}\)

Back

\(E[aX+b]\)

Front

\(= aE[X]+b\)

Back

Skewness of a distribution

Front

\(\frac{E[(X-μ)^3]}{σ^3}\)

If the skewness is positive, the distribution is said to be skewed to the right. If it is negative, it is skewed to the left.

Back

Standard Deviation of \(X\)

\(σ_{X}\)

Front

\(= \sqrt{Var[X]}\)

Back

Finding \(E[X]\) using \(M_{X}(t)\) and logarithms

Front

\(E[X] = \frac{d}{dt} ln[M_{X}(t)]|_{t=0}\)

Back

A mixture of distributions - expected values and moment generating functions for density function \(f(x) = a_{1}f_{1}(x) + a_{2}f_{2}(x) + \cdots a_{k}f_{k}(x)\)

Front

\(E[X^{n}] = a_{1}E[X_{1}^{n}] + a_{2}E[X_{2}^{n}] + \cdots a_{k}E[X_{k}^{n}]\)

\(M_{x}(t) = a_{1}M_{X_1}(t) + a_{2}M_{X_2}(t) + \cdots + a_{k}M_{X_k}(t)\)

Back

If X is a random variable defined on the interval \([a,b]\), \(E[X] =\)

Front

\(E[X] = a + \int_{a}^{b}[1 - F(x)]dx\)

Back

\(n\)-th moment of \(X\)

Front

\(E[X^n]\)

Back

Coefficient of variation of X

Front

\(\frac{σ_{x}}{μ_{x}}\)

Back

\(Var[aX+b]\)

Front

\(Var[aX+b] = a^2 Var[X]\)

Back

Median of distribution \(X\)

Front

The point \(M\) for which \(P[X ≤ M] = .5\)

Back

Taylor expansion \(e^y =\)

Front

\(e^y = 1 + y + \frac{y^2}{2!} + \frac{y^3}{3!} + \cdots\)

Back

Taylor series expansion of \(M_{x}(t)\) about point \(t=0\)

Front

\(M_{x}(t) = \sum_{k=0}^{\infty} \frac{t^k}{k!} E[X^k] = 1 + t\cdot E[X] + \frac{t^2}{2!} \cdot E[X^2] + \frac{t^3}{3!} \cdot E[X^3] \cdots\)

For a discrete distribution with probability function \(p_{k}\),

\(M_{x}(t) = e^{tx_1}\cdot p_{1} + e^{tx_2}\cdot p_{2} + e^{tx_3}\cdot p_{3} \cdots\)

Back

Chebyshev's inequality

For random variable \(X\) with mean \(μ_{x}\) and standard deviation \(σ_{x}\)

For any real number \(r > 0\)...

Front

\(P[|X - μ_{x}| > rσ_{x}] ≤ \frac{1}{r^2}\)

Back
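A Monte Carlo spot-check of the bound, assuming an exponential variable with mean 1 and standard deviation 1 and \(r = 2\) (all illustrative choices):

```python
import random

random.seed(0)
mu, sigma, r = 1.0, 1.0, 2.0   # exponential(1) has mean 1 and standard deviation 1
n = 100_000
hits = sum(1 for _ in range(n) if abs(random.expovariate(1.0) - mu) > r * sigma)
print(hits / n, "<=", 1 / r**2)  # roughly 0.05, well below the bound 0.25
```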

Mode of a distribution

Front

Any point \(m\) at which the probability or density function \(f(x)\) is maximized.

Back

\(n\)-th central moment of \(X\) about the mean \(μ\)

Front

\(E[(X-μ)^n]\)

Back

Percentiles of distribution

Front

For \(0 < p < 1\), the 100\(p\)-th percentile of the distribution of \(X\) is the number \(c_{p}\) which satisfies:

\(P[X ≤ c_{p}] ≥ p\)

\(P[X ≥ c_{p}] ≤ 1 - p\)

Back

For constants \(a_{1}, a_{2}\), and \(b\) and functions \(h_{1}\) and \(h_{2}\),

\(E[a_{1}h_{1}(X) + a_{2}h_{2}(X) + b]\)

Front

\(E[a_{1}h_{1}(X) + a_{2}h_{2}(X) + b] = a_{1}E[h_{1}(X)] + a_{2}E[h_{2}(X)] + b\)

Back

Expected value of a random variable X

\(E[X], μ_{x}\)

Front

For a discrete random variable:

\(E[X] = \sum x\cdot p(x) = x_{1}\cdot p(x_{1}) + x_{2}\cdot p(x_{2}) + \cdots\)

For a continuous random variable:

\(E[X] = \int_{-\infty}^{\infty} x\cdot f(x)dx\)

(Interval of integration is the interval of non-zero density for \(X\))

Back

Expected value of function \(h(X)\)

\(E[h(X)]\)

Front

For a discrete random variable:

\(E[h(X)] = \sum_{x} h(x)\cdot p(x)\)

For a continuous random variable with density function \(f(x)\):

\(E[h(X)] = \int_{-\infty}^{\infty} h(x)\cdot f(x)dx\)

Back

Jensen's Inequality

For function \(h\) and random variable \(X\)

Front

If \(h''(x) ≥ 0\) at all points \(x\) with non-zero probability for \(X\), then \(E[h(X)] ≥ h(E[X])\) and if \(h''(x) > 0\) at all points \(x\) with non-zero probability for \(X\), then \(E[h(X)] > h(E[X])\)

The inequality reverses for \(h''(x) ≤ 0\)

Back

Moment generating function of \(X\)

\(M_{X}(t)\)

Front

\(M_{X}(t) = E[e^{tX}]\)

(i) It is always true that \(M_{X}(0) = 1\)

(ii) Moments of X can be found by successive differentiation of \(M_{X}(t)\), e.g. \(M_{X}'(0) = E[X]\), \(M_{X}''(0) = E[X^2]\), etc.

Back
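A numerical illustration of property (ii), assuming a small discrete distribution invented for the example; the derivatives of \(M_{X}(t)\) at 0 are approximated by finite differences and compared with the moments computed directly.

```python
import math

xs = [0, 1, 2]        # assumed support
ps = [0.2, 0.5, 0.3]  # assumed probabilities

def M(t):
    """Moment generating function E[e^(tX)] of the discrete distribution above."""
    return sum(p * math.exp(t * x) for x, p in zip(xs, ps))

h = 1e-4
print(M(0))                              # 1.0, as property (i) requires
print((M(h) - M(-h)) / (2 * h))          # ~ M'(0)  = E[X]   = 1.1
print((M(h) - 2 * M(0) + M(-h)) / h**2)  # ~ M''(0) = E[X^2] = 1.7
print(sum(x * p for x, p in zip(xs, ps)), sum(x**2 * p for x, p in zip(xs, ps)))
```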

\(Var[X]\)

\(σ^2\) or \(σ_{x}^2\)

Front

\(= E[(X-μ_{x})^2] = E[X^2] - (E[X])^2 = E[X^2] - μ_{x}^2\)

Back

Frequently Used Discrete Distributions

(4 cards)

Uniform distribution on \(N\) points

Front

\(p(x) = \frac{1}{N}\) for \(x = 1, 2, ..., N\)

\(E[X] = \frac{N+1}{2}\)

\(Var[X] = \frac{N^2-1}{12}\) 

Back

Poisson distribution with parameter λ

Front

Often used as a model for counting the number of events of a certain type that occur in a certain period of time (suppose \(X\) represents the number of customers arriving for service at a bank in a one hour period, with a mean of λ - the number arriving for service in two hours would have a Poisson distribution with parameter 2λ)

\(p(x) = \frac{e^{-λ}λ^x}{x!}\) for \(x = 0, 1, 2, ...\)

\(E[X] = Var[X] = λ\)

Back

Binomial distribution with parameters \(n\) and \(p\)

Front

A single trial of an experiment results in success with probability \(p\) or failure with probability \(1-p=q\). If \(n\) independent trials of the experiment are performed, and \(X\) is the number of successes that occur, \(X\) is said to have binomial distribution denoted \(X~B(n,p)\).

\(p(x) = \binom{n}{x}p^x (1-p)^{n-x}\) for \(x = 0, 1, 2, ..., n\), the probability of exactly \(x\) successes in \(n\) trials.

\(E[X] = np\)

\(Var[X] = np(1-p)\) 

Back
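A quick check of the formulas for an assumed example with n = 10 and p = 0.3: the probabilities should sum to 1, and the mean and variance computed from \(p(x)\) should equal \(np\) and \(np(1-p)\).

```python
from math import comb

n, p = 10, 0.3  # assumed parameters
pmf = {x: comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)}
mean = sum(x * q for x, q in pmf.items())
var = sum(x**2 * q for x, q in pmf.items()) - mean**2
print(sum(pmf.values()), mean, var)  # 1.0, 3.0 (= np), 2.1 (= np(1-p))
```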

Geometric distribution with parameter \(p\)

Front

\(X\) represents the number of failures until the first success of an experiment with probability \(p\) of success.

\(p(x) = (1-p)^x p\) for \(x = 0, 1, 2, ...\)

\(E[X] = \frac{1-p}{p}\)

\(Var[X] = \frac{1-p}{p^2}\) 

Back

Frequently Used Continuous Distributions

(11 cards)

Link between exponential and Poisson distribution

Front

Let \(X\) represent the time between successive occurrences of some type of event, where \(X\) has an exponential distribution with mean \(\frac{1}{λ}\) and time is measured in some appropriate units (seconds, minutes, hours, days, etc.)

Let \(N\) represent the number of events that have occurred when one unit of time has elapsed. Then \(N\) will be a random variable that has a Poisson distribution with mean \(λ\).

 

Back
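The link can be illustrated by simulation, assuming a rate of λ = 3 events per unit of time: generate exponential gaps with mean 1/λ, count how many events land in one unit of time, and compare the average count with λ.

```python
import random

random.seed(0)
lam = 3.0          # assumed rate: exponential gaps have mean 1/3
trials = 20_000
total = 0
for _ in range(trials):
    t, n = 0.0, 0
    while True:
        t += random.expovariate(lam)  # time to the next event
        if t > 1.0:                   # stop once one unit of time has elapsed
            break
        n += 1
    total += n
print(total / trials)  # approximately lam = 3, the Poisson mean
```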

Integer correction for normal approximation of discrete random variables

Front

If \(X\) is discrete and integer-valued, then integer correction may be applied in the following way:

\(P[n ≤ X ≤ m]\) is approximated using a normal variable \(Y\) with the same mean and variance as \(X\) and finding probability \(P[n -\frac{1}{2} ≤ Y ≤ m +\frac{1}{2}]\)

Back

Minimum of a collection of exponential random variables

Front

If independent random variables \(Y_{1}\), \(Y_{2}\), ..., \(Y_{n}\) have exponential distributions with means \(\frac{1}{λ_{1}}\), \(\frac{1}{λ_{2}}\), ..., \(\frac{1}{λ_{n}}\), \(Y = \min\{Y_{1}, Y_{2}, ..., Y_{n}\}\) has an exponential distribution with mean \(\frac{1}{λ_{1}+λ_{2}+...+λ_{n}}\)

Back

Normal Distribution

Front

Normal distribution \(X\)~\(N(μ,σ^2)\) has a mean of \(μ\) and variance of \(σ^2\)

\(f(x) = \frac{1}{σ\cdot \sqrt{2π}}\cdot e^{-\frac{(x-μ)^2}{2σ^2}}\)

\(E[X] =μ\)

\(Var[X] = σ^2\)

Back

Standardizing normal random variables

Front

For normal random variable \(X\)~ \(N(μ,σ^2)\), find \(P[r < X < s]\) by standardizing \(Z = \frac{X - μ}{σ}\). Then

\(P[r < X < s] = P[\frac{r-μ}{σ} < \frac{X-μ}{σ} < \frac{s-μ}{σ}] = ɸ(\frac{s-μ}{σ}) - ɸ(\frac{r-μ}{σ})\)

Back

Combination of normal random variables \(W = X_{1} + X_{2}\)

Front

If \(X_{1}\) and \(X_{2}\) are independent, \(W\) is also a normal random variable, with mean \(μ_{1} + μ_{2}\) and variance \(σ_{1}^2 + σ_{2}^2\).

Back

Exponential distribution with mean \(\frac{1}{λ} > 0\)

Front

Typically used to model the amount of time until a specific event occurs.

\(f(x) = λe^{-λx}\) for \(x > 0\)

\(F(x) = 1 - e^{-λx}\)

\(S(x) = e^{-λx}\)

\(E[X] = \frac{1}{λ}\)

\(Var[X] = \frac{1}{λ^2}\)

Lack of memory property: \(P[X > x + y|X > x] = P[X > y]\)

Back

Uniform distribution on interval (a,b)

Front

\(f(x) = \frac{1}{b-a}\) for \(a ≤ x ≤ b\)

\(F(x) = \int_{a}^{x} f(t)\,dt = \frac{x-a}{b-a}\) for \(a ≤ x ≤ b\)

\(E[X] = \frac{a+b}{2}\)

\(Var[X] = \frac{(b-a)^2}{12}\)

Back

Finding 95th percentile of normal variable \(X\)~\(N(1,4)\)

Front

\(P[X ≤ c] = .95\)

\(P[\frac{X - 1}{\sqrt{4}} ≤ \frac{c - 1}{\sqrt{4}}] = ɸ(\frac{c - 1}{\sqrt{4}}) = .95\)

\(\frac{c - 1}{\sqrt{4}} = 1.645\)

\(c = 4.29\)

Back
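The same calculation can be reproduced with the standard library's statistics.NormalDist, shown here as an optional cross-check rather than as part of the exam solution:

```python
from statistics import NormalDist

X = NormalDist(mu=1, sigma=2)               # X ~ N(1, 4), so sigma = 2
print(X.inv_cdf(0.95))                      # approximately 4.29
print(1 + 2 * NormalDist().inv_cdf(0.95))   # same value via standardizing with z = 1.645
```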

Standard Normal Distribution

Front

Standard normal distribution \(Z\)~\(N(0,1)\) has a mean of 0 and variance of 1. A table of probabilities is provided on the exam.

\(f(x) = \frac{1}{\sqrt{2π}}\cdot e^{-\frac{x^2}{2}}\)

\(E[X] = 0\)

\(Var[X] = 1\)

Back

Gamma distribution with parameters \(n > 0\) and \(β > 0\)

Front

\(f(x) = \frac{β^n \cdot x^{n-1} \cdot e^{-βx}}{(n-1)!}\)

\(E[X] = \frac{n}{β}\)

\(Var[X] = \frac{n}{β^2}\)

Back

Joint, Marginal, and Conditional Distributions

(11 cards)

Independence of random variables \(X\) and \(Y\)

Front

\(X\) and \(Y\) are independent if the probability space is rectangular (endpoints can be infinite) and

\(f(x,y) = f_{X}(x)\cdot f_{Y}(y)\)

Which is equivalent to

\(F(x,y) = F_{X}(x)\cdot F_{Y}(y)\) for all \(x,y\)

Back

\(E[E[X|Y]] =\)

Front

\(=E[X]\)

Back

Marginal distribution of \(X\) found from a joint distribution of \(X\) and \(Y\)

Front

If \(X\) and \(Y\) have a joint distribution with density function \(f(x,y)\), the marginal distribution of  \(X\) has a density function \(f_{X}(x)\) which is equal to

\(f_{X}(x) = \sum_{y} f(x,y)\) in the discrete case and

\(f_{X}(x) = \int f(x,y)dy\) in the continuous case.

Note that \(F_{X}(x) = \lim_{y \to \infty} F(x,y)\)

Back

Covariance between random variables \(X\) and \(Y\)

Front

\(Cov[X,Y] = E[XY] - E[X]E[Y]\)

Also, note

\(Var[aX+bY+c] = a^2 Var[X] + b^2 Var[Y] + 2abCov[X,Y]\)

Back

Expectation of a function of jointly distributed random variables

Front

Discrete case:

\(E[h(X,Y)] = \sum_{x}\sum_{y} h(x,y)\cdot f(x,y)\)

Continuous case:

\(E[h(X,Y)] = \int \int h(x,y)\cdot f(x,y)dydx\)

Back

Joint distribution of variables \(X\) and \(Y\)

Front

The probability \(P[(X = x)\cap (Y = y)]\) for each pair \((x,y)\) of possible outcomes.

For discrete random variables:

(i) \(0 ≤ f(x,y) ≤1\) and

(ii) \(\sum_{x}\sum_{y} f(x,y) = 1\).

For continuous random variables:

(i) \(f(x,y) ≥ 0\) and

(ii) \(\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x,y)\,dy\,dx = 1\)

Back

\(Var[X]\) using conditional distributions

Front

\(Var[X] = E[Var[X|Y]] + Var[E[X|Y]]\)

Back

Coefficient of correlation between random variables \(X\) and \(Y\)

Front

\(\frac{Cov[X,Y]}{σ_{X}σ_{Y}}\)

Back

Moment generating function of a joint distribution

Front

\(M_{X,Y}(t_{1}, t_{2}) = E[e^{t_{1}X + t_{2}Y}]\)

\(E[X^n Y^m] = \frac{∂^{n+m}}{∂t_{1}^{n}\,∂t_{2}^{m}} M_{X,Y}(t_{1}, t_{2})\) evaluated at \(t_{1} = t_{2} = 0\)

Back

Pdf of a uniform joint distribution of \(X\) and \(Y\) on region \(R\)

Front

\(f(x,y) = \frac{1}{\text{Area of } R}\)

Probability of event A: \(P[A] = \frac{\text{Area of } A}{\text{Area of } R}\)

\(f_{Y|X}(y|X=x)\) has a uniform distribution on the line segment defined by the intersection of the region \(R\) with the line \(X=x\)

Back

Conditional distribution of \(Y\) given \(X = x\)

Front

\(f_{Y|X}(y|X = x) = \frac{f(x,y)}{f_{X}(x)}\)

\(E[Y|X = x] = \int y\cdot f_{Y|X}(y|X = x)dy\)

If \(X\) and \(Y\) are independent, then

\(f_{Y|X}(y|X = x) = f_{Y}(y)\)

Also note that

\(f(x,y) = f_{Y|X}(y|X = x)\cdot f_{X}(x)\)

Back

Important Formulas to Memorize

(5 cards)

Infinite sum of a geometric progression \(a + ar + ar^2 + ...\)

Front

\(a + ar + ar^2 + ... = \frac{a}{1-r}\)

Back

\(\int xe^{ax} dx =\)

Front

\(\int xe^{ax} dx = \frac{xe^{ax}}{a} - \frac{e^{ax}}{a^2}\)

Back

\(\int_{0}^{\infty} x^ne^{-cx} dx\)

Front

\(\int_{0}^{\infty} x^ne^{-cx} dx = \frac{n!}{c^{n+1}}\)

Back

Sum of the first \(n\) terms of an arithmetic progression \(a + (a + d) + (a + 2d) + (a + 3d) + \cdots + (a + (n-1)d)\)

Front

\(a + (a + d) + (a + 2d) + (a + 3d) + \cdots + (a + (n-1)d) = na + d\cdot \frac{n(n-1)}{2}\)

Back

Sum of the first \(n\) terms of a geometric progression \(a + ar + ar^2 + ... + ar^{n-1}\)

Front

\(= a\cdot \frac{1-r^n}{1-r}\)

Back

Functions and Transformations of Random Variables

(11 cards)

Distribution of the sum of \(k\) independent Normal variables

Front

\(Y\)~\(N(\sum μ_i,\sum σ_i^2)\)

Back

Pdf of \(Y = u(X)\)

Front

Can be found in two ways:

(i) \(f_Y(y) = f_X(v(y))\cdot|v'(y)|\)

(ii) \(F_Y(y) = F_X(v(y)), f_Y(y) = F'_Y(y)\) for strictly increasing functions

Back

Central Limit Theorem

Front

If \(X_1, X_2, \cdots, X_n\) are independent random variables, each with mean μ and standard deviation σ, and \(Y = X_1 + X_2 + \cdots + X_n\), then \(E[Y] = nμ\) and \(Var[Y] = nσ^2\)

As \(n\) increases, the distribution of \(Y\) approaches a normal distribution \(N(nμ,nσ^2)\)

If an exam question asks for probability involving a sum of a large number of independent random variables, it is usually asking for the normal approximation to be applied.

Back
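A simulation sketch of the approximation, assuming \(n = 50\) independent Uniform(0,1) summands (so μ = 0.5 and σ² = 1/12 each) and the illustrative probability \(P[Y ≤ 27]\):

```python
import random
from statistics import NormalDist

random.seed(0)
n, mu, var = 50, 0.5, 1 / 12                     # assumed summands: Uniform(0,1)
approx = NormalDist(mu=n * mu, sigma=(n * var) ** 0.5)

trials = 50_000
hits = sum(1 for _ in range(trials) if sum(random.random() for _ in range(n)) <= 27)
print(hits / trials, approx.cdf(27))  # both roughly 0.84
```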

Distribution of the sum of \(k\) independent Poisson variables

Front

\(Y\)~\(P(\sum λ_i)\)

Back

Distribution of the sum of independent discrete random variables

Front

\(P[X_1+X_2 = k] = \sum_{x_1=0}^k f_1(x_1)\cdot f_2(k-x_1)\)

Back
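A small worked instance of the convolution formula, assuming two independent fair six-sided dice for \(X_1\) and \(X_2\):

```python
from fractions import Fraction

f1 = f2 = {x: Fraction(1, 6) for x in range(1, 7)}  # assumed probability functions

def pmf_sum(k):
    """P[X1 + X2 = k] = sum over x1 of f1(x1) * f2(k - x1)."""
    return sum(f1[x1] * f2.get(k - x1, Fraction(0)) for x1 in f1)

print(pmf_sum(7), pmf_sum(2))                 # 1/6 and 1/36
print(sum(pmf_sum(k) for k in range(2, 13)))  # 1
```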

Distribution of the sum of continuous random variables \(Y = X_1 + X_2\)

Front

\(f_Y(y) = \int_{-\infty}^{\infty} f(x_1, y-x_1)dx_1\)

For independent continuous variables,

\(f_Y(y) = \int_{-\infty}^{\infty} f_1(x_1)\cdot f_2(y-x_1)dx_1\)

If \(X_1>0\) and \(X_2>0\), then

\(f_Y(y) = \int_{0}^{y} f(x_1, y-x_1)dx_1\)

Back

Distribution of the sum of discrete random variables

Front

\(P[X_1+X_2 = k] = \sum_{x_1=0}^k f(x_1, k-x_1)\)

Back

Transformation of \(X\)

\(Y = u(X)\)

Front

\(u(x)\) is a one-to-one function, strictly increasing or decreasing, with inverse function \(v\) so that \(v(u(x)) = x\)

\(Y = u(X)\) is referred to as a transformation of X.

Back

Moment generating function of a sum of random variables

Front

For \(Y = X_1 + X_2 + \cdots + X_n\) with the \(X_i\) independent, \(M_Y(t) = M_{X_1}(t)\cdot M_{X_2}(t)\cdots M_{X_n}(t)\)

Back

Distribution of the maximum or minimum of a collection of random variables

Front

\(X_1\) and \(X_2\) are independent random variables.

\(U = \max\{X_1,X_2\}\) and \(V = \min\{X_1,X_2\}\)

\(F_U(u) = P[(X_1≤u)\cap (X_2≤u)] = P[X_1≤u]\cdot P[X_2≤u] = F_1(u)\cdot F_2(u)\)

similarly

\(F_V(v) = 1 - [1 - F_1(v)]\cdot [1 - F_2(v)]\)

Back

Distribution of the sum of \(k\) independent binomial variables

Front

\(Y\)~\(B(\sum n_i ,p)\)

Back

Risk Management Concepts

(2 cards)

Policy Limit

Front

A policy limit of amount u indicates that the insurer will pay a maximum amount of \(u\) when a loss occurs. The amount paid by the insurer is \(X\) if \(X≤u\) and \(u\) if \(X>u\).

The expected payment is \(\int_0^u [1-F_X(x)]dx\)

Back

Deductible Insurance

Front

If a loss of amount \(X\) occurs, the insurer pays nothing if the loss is less than \(d\) and pays the amount of the loss in excess of \(d\) if the loss is greater than \(d\).

\(Y =\)

\(0\) if \(X≤d\)

\(X-d\) if \(X>d\)

\(= \max(X-d, 0)\)

The expected payment is \(\int_d^{\infty} (x-d)f_X(x)dx = \int_d^{\infty} [1-F_X(x)]dx\)

Back
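A sketch comparing the two expressions for the expected payment, assuming an exponential loss with mean 1000 and a deductible of d = 500 (so the closed form is \(1000\cdot e^{-d/1000}\)); the loss model and numbers are assumptions made for illustration.

```python
import math
import random

random.seed(0)
mean_loss, d = 1000.0, 500.0       # assumed loss model and deductible
lam = 1 / mean_loss

closed_form = math.exp(-lam * d) / lam  # int_d^inf [1 - F_X(x)] dx for the exponential loss
trials = 200_000
simulated = sum(max(random.expovariate(lam) - d, 0.0) for _ in range(trials)) / trials
print(closed_form, simulated)  # both approximately 606.5
```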