### Distributions

There are two types of distributions:

**Discrete** - When you have a discrete distribution we talk about probabilities

**Continuous** - When we have a continuous distribution we talk about probability densities
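The distinction can be sketched in Python: a discrete variable carries actual probabilities, while a continuous one carries densities that only yield probabilities once integrated over an interval. The fair die and standard normal below are illustrative choices, not taken from the text.

```python
import math

# Discrete: a fair six-sided die has a probability MASS function --
# each individual outcome carries an actual probability.
die_pmf = {face: 1 / 6 for face in range(1, 7)}

# Continuous: a standard normal variable has a probability DENSITY
# function -- single points have zero probability; only intervals do.
def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution at x."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))

# The discrete probabilities sum to 1 ...
total_mass = sum(die_pmf.values())

# ... while the density must be integrated (crudely here, by the
# trapezium rule) to recover the probability of an interval.
def integrate(f, a, b, n=10_000):
    h = (b - a) / n
    return h * (f(a) / 2 + sum(f(a + i * h) for i in range(1, n)) + f(b) / 2)

p_within_one_sigma = integrate(normal_pdf, -1, 1)  # roughly 0.6827
```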

**Describing a Probability Distribution**

If you wanted to describe a probability distribution, how would you do it ?

There are a number of measures that could be used:

**Measure of Central Tendency** - Mean, Mode, Median

**Measure of Dispersion** - Range, Variance, Standard Deviation

**Skewness** - A measure of asymmetry (the third moment)

**Kurtosis** - A measure of the outliers, or extreme values (the fourth moment)
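As a sketch, these measures can be computed for a small sample with Python's standard library. The data set is made up for the example, and the skewness and kurtosis are the plain standardised third and fourth moments, without sample-size corrections.

```python
import statistics

data = [2, 3, 3, 4, 5, 5, 5, 6, 7, 10]  # illustrative sample

# Central tendency
mean = statistics.mean(data)
median = statistics.median(data)
mode = statistics.mode(data)

# Dispersion
rng = max(data) - min(data)
var = statistics.pvariance(data)  # second central moment
std = statistics.pstdev(data)

n = len(data)

# Skewness: standardised third central moment (0 for a symmetric distribution)
skew = sum((x - mean) ** 3 for x in data) / n / std ** 3

# Kurtosis: standardised fourth central moment (3 for a normal distribution)
kurt = sum((x - mean) ** 4 for x in data) / n / std ** 4
```

The single outlier at 10 drags the mean toward the right tail and shows up as positive skewness.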

**What is Probability Theory ?**

Most introductions to probability theory treat discrete probability distributions and continuous probability distributions separately.

This is the foundation of every aspect of quantitative finance.

The term stochastic is used to mean "based on the theory of probability".

**Joint Probability Density Function**

**What are Random Variables ?**

The outcome of an experiment need not be a number, for example, the outcome when a coin is tossed can be 'heads' or 'tails'.

A random variable is a function that associates a unique numerical value with every outcome of an experiment.

The value of the random variable will vary from trial to trial as the experiment is repeated.

A random variable is a variable for which a probability can be assigned to each possible outcome.

There are two types of random variable:

**Discrete Random Variable** - Every outcome has a specific probability

**Continuous Random Variable** - There are an infinite number of outcomes

**Probability Histogram**

The probability distribution of a random variable can be easily represented by a histogram.

The potential values of the random variable are plotted on the x-axis, while their associated probabilities are plotted on the y-axis


Rectangles of equal width are centered on each discrete value, and their heights are equal to the probabilities that the random variable can assume those values.

A histogram whose total area is 1 represents a probability mass function.
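As an illustration (the two-dice example is an assumption, not from the text), the probability mass function for the sum of two fair dice can be tabulated and drawn as a text histogram whose total mass is 1:

```python
from collections import Counter
from fractions import Fraction

# Probability mass function for the sum of two fair dice.
counts = Counter(a + b for a in range(1, 7) for b in range(1, 7))
pmf = {total: Fraction(c, 36) for total, c in counts.items()}

# The total probability -- the "area" of the probability histogram -- is 1.
total = sum(pmf.values())

# Text rendering of the probability histogram: one '#' per 1/36 of mass.
for value in sorted(pmf):
    print(f"{value:2d} | {'#' * counts[value]}")
```

Using `Fraction` keeps the probabilities exact, so the total comes out to exactly 1 rather than a float close to it.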

Probability distributions provide a way of calculating the probability of a value in a distribution.

Histograms can have bars which have different widths.

You can use a different scale (other than frequency density) on your histogram but it must be proportional to the frequency.

The area under the frequency polygon will always be the same as that of the histogram.

A variate is a generalization of the concept of a random variable that is defined without reference to a particular type of probabilistic experiment. It is defined as the set of all random variables that obey a given probabilistic law.

Examples

A coin is tossed ten times. The random variable X is the number of tails that are noted. X can only take the values 0, 1, ..., 10, so X is a discrete random variable.

A light bulb is burned until it burns out. The random variable Y is its lifetime in hours. Y can take any positive real value, so Y is a continuous random variable.
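The coin example above can be made concrete with the binomial formula P(X = k) = C(n, k) p^k (1 - p)^(n - k); a minimal sketch:

```python
from math import comb

# Binomial model for the example above: X = number of tails in
# n = 10 tosses of a fair coin (p = 0.5 per toss).
n, p = 10, 0.5

def binomial_pmf(k):
    """P(X = k) = C(n, k) * p^k * (1 - p)^(n - k)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

p_five_tails = binomial_pmf(5)                       # 252 / 1024
total = sum(binomial_pmf(k) for k in range(n + 1))   # masses sum to 1
```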

**Random Walk**

A random variable is a type of measurement taken on the outcome of a random experiment.

A random variable is a function that maps each outcome (ω) in the sample space (Ω) to a real number.

Examples are:

throwing a die

coin toss

card example

N points on a circle

**What is a Probability Distribution ?**

This is a tool for telling us about the expected values of some random variable.

There are 3 main uses for probability distributions:

1) They are used in statistical analysis

2) They can be used to estimate the probability of particular outcomes

3) They can be used in Monte Carlo simulations
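As a sketch of use (3), a Monte Carlo simulation estimates a probability by repeated random sampling; the two-dice event below is an illustrative choice, not from the text:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

# Monte Carlo estimate of P(sum of two fair dice >= 10).
# The exact value is 6/36, so the estimate should land nearby.
trials = 100_000
hits = sum(
    random.randint(1, 6) + random.randint(1, 6) >= 10
    for _ in range(trials)
)
estimate = hits / trials
```

The estimate converges on the exact value as the number of trials grows, with an error that shrinks roughly like one over the square root of the trial count.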

**Transition Probability Density Function**

General stochastic differential equation

dy = A(y,t) dt + B(y,t) dX

The transition probability density function gives the probability that the random variable y' lies between a and b at time t' in the future, given that it started out with value y at time t.

Think of y and t as current values

Think of y' and t' as future values

This can be used to answer the question "what is the probability of the variable y' lying in a certain range at time t', given that it started out with value y at time t?"

This function satisfies two equations:

1) The Forward Equation

2) The Backward Equation

(both of these are parabolic partial differential equations)
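A minimal sketch of how the transition probability can be estimated by simulating the stochastic differential equation with the Euler-Maruyama scheme. The constant drift and diffusion coefficients A and B below are assumptions chosen for illustration, as are the starting value, times, and interval (a, b).

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Illustrative (assumed) coefficients for dy = A(y,t) dt + B(y,t) dX.
def A(y, t):
    return 0.05   # constant drift

def B(y, t):
    return 0.2    # constant diffusion coefficient

def simulate(y0, t0, t1, steps=100):
    """Euler-Maruyama: step y forward from (y0, t0) to time t1."""
    dt, y, t = (t1 - t0) / steps, y0, t0
    for _ in range(steps):
        dX = random.gauss(0.0, math.sqrt(dt))  # Brownian increment
        y += A(y, t) * dt + B(y, t) * dX
        t += dt
    return y

# Estimate the transition probability that y' lies in (a, b) at
# time t', given that y started at y0 at time t0 (values assumed).
y0, t0, t1, a, b = 1.0, 0.0, 1.0, 0.9, 1.3
paths = 5_000
prob = sum(a < simulate(y0, t0, t1) < b for _ in range(paths)) / paths
```

Solving the forward equation numerically would give the same probability directly from the density; the Monte Carlo route trades accuracy for simplicity.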

**Important**

When referring to a probability distribution function you should always be explicit about the type of function you are referring to.

It is very common for people to use the term probability density function to refer to both discrete and continuous probability functions.

Cumulative distribution functions are also used to specify the distribution of multivariate random variables.

Every random variable has either an associated probability mass function (discrete random variable) or probability density function (continuous random variable).

In a symmetrical distribution the mean, median and mode all have the same value (or extremely close)
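A quick check of this on a small, perfectly symmetric sample (the data are illustrative):

```python
import statistics

# A perfectly symmetric discrete sample: mean, median and mode coincide.
sample = [1, 2, 2, 3, 3, 3, 4, 4, 5]
mean = statistics.mean(sample)
median = statistics.median(sample)
mode = statistics.mode(sample)
```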

© 2018 Better Solutions Limited. All Rights Reserved.