Distributions

There are two types of distributions:
Discrete - with a discrete distribution we talk about probabilities
Continuous - with a continuous distribution we talk about probability densities (see the sketch below)
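As a short sketch of the difference, assuming Python with the scipy.stats library (not part of the original text): a discrete distribution is described by a probability mass function, a continuous one by a probability density function.

    from scipy.stats import binom, norm

    # Discrete: P(X = 3) for a Binomial(10, 0.5) is a genuine probability
    p_discrete = binom.pmf(3, n=10, p=0.5)

    # Continuous: norm.pdf(0) is a density, not a probability;
    # probabilities come from integrating the density, e.g. P(-1 < X < 1)
    density_at_zero = norm.pdf(0)
    p_continuous = norm.cdf(1) - norm.cdf(-1)

    print(p_discrete, density_at_zero, p_continuous)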



Describing a Distribution

If you wanted to describe a probability distribution, how would you do it?
There are a number of measures that could be used:


Measure of Central Tendency - Mean, Mode, Median
Measure of Dispersion - Range, Variance, Standard Deviation
Skewness - a measure of asymmetry (the third moment)
Kurtosis - a measure of outliers or extreme values (the fourth moment)
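Here is a minimal sketch of computing these measures for a small sample, assuming Python with numpy and a reasonably recent scipy (the data values are made up for illustration).

    import numpy as np
    from scipy import stats

    data = np.array([2.0, 3.5, 3.5, 4.0, 5.5, 7.0, 9.0])

    # Measures of central tendency
    mean = np.mean(data)
    median = np.median(data)
    mode = stats.mode(data, keepdims=False).mode   # most frequent value

    # Measures of dispersion
    value_range = np.ptp(data)                     # max - min
    variance = np.var(data)
    std_dev = np.std(data)

    # Shape of the distribution
    skewness = stats.skew(data)                    # third moment - asymmetry
    kurt = stats.kurtosis(data)                    # fourth moment - outliers (excess kurtosis)

    print(mean, median, mode, value_range, variance, std_dev, skewness, kurt)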



Probability Histogram

The probability distribution of a random variable can be easily represented by a histogram.
The potential values of the random variable are plotted on the x-axis, while their associated probabilities are plotted on the y-axis
Rectangles of equal width are centered on each discrete value, and their heights are equal to the probabilities that the random variable can assume those values.
When the total area of the rectangles in the histogram is 1, the heights define the probability mass function of the random variable.
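A small sketch of such a probability histogram, assuming Python with matplotlib; the fair six-sided die is just an illustrative choice.

    import matplotlib.pyplot as plt

    values = [1, 2, 3, 4, 5, 6]       # potential values on the x-axis
    probabilities = [1/6] * 6         # associated probabilities on the y-axis

    # Rectangles of width 1 centred on each value; the heights are the
    # probabilities, so the total area of the bars is 1
    plt.bar(values, probabilities, width=1.0, edgecolor="black")
    plt.xlabel("Value of the random variable")
    plt.ylabel("Probability")
    plt.show()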


Probability distributions provide a way of calculating the probability of a random variable taking a particular value or falling within a particular range of values.


Histograms can have bars of different widths.
You can use a different vertical scale (other than frequency density) on your histogram, but the area of each bar must remain proportional to the frequency (see the sketch below).
The area under the frequency polygon will always be the same as that of the histogram.
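A short sketch of the frequency density calculation for bars of different widths, assuming plain Python (the class boundaries and frequencies are made-up values): the area of each bar, width times density, equals the frequency.

    bin_edges = [0, 5, 10, 20, 40]    # unequal class widths: 5, 5, 10, 20
    frequencies = [10, 15, 20, 5]

    for left, right, freq in zip(bin_edges, bin_edges[1:], frequencies):
        width = right - left
        density = freq / width        # frequency density = frequency / class width
        print(f"{left}-{right}: width={width}, frequency={freq}, density={density}")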


A variate is a generalization of the concept of a random variable that is defined without reference to a particular type of probabilistic experiment. It is defined as the set of all random variables that obey a given probabilistic law.



What is a Probability Distribution?

This is a tool that tells us how likely each of the possible values of a random variable is.


There are 3 main uses for probability distributions:
1) They are used in statistical analysis
2) They can be used to estimate the probability of a value falling within a given range
3) They can be used in Monte Carlo simulations
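A hedged sketch of uses (2) and (3), assuming Python with numpy and scipy: the distribution gives the probability analytically, and a Monte Carlo simulation estimates the same probability by sampling.

    import numpy as np
    from scipy.stats import norm

    # Use (2): estimate P(X > 1.5) directly from the distribution
    exact = 1 - norm.cdf(1.5)

    # Use (3): estimate the same probability with a Monte Carlo simulation
    rng = np.random.default_rng(seed=0)
    samples = rng.standard_normal(1_000_000)
    estimate = np.mean(samples > 1.5)

    print(exact, estimate)            # the two values should agree closely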




Transition Probability Density Function

General stochastic differential equation
dy = A(y,t) dt + B(y,t) dX


The transition probability density function gives the probability that the random variable lies between a and b at time t' in the future, given that it started out with value y at time t.
Think of y and t as current values
Think of y' and t' as future values


This can be used to answer the question "what is the probability of the variable y being within a certain range at time t', given that it started out with value y at time t?"
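Here is a minimal sketch of answering that question numerically, assuming Python with numpy; the drift A(y, t), diffusion B(y, t), interval [a, b] and step sizes are illustrative choices, not from the original text. The paths are simulated with an Euler-Maruyama scheme and the transition probability is estimated as the fraction of paths ending in [a, b].

    import numpy as np

    def A(y, t):                      # illustrative drift
        return -0.5 * y

    def B(y, t):                      # illustrative diffusion
        return 0.2

    y0, t0, t1 = 1.0, 0.0, 1.0        # current value, current time, future time t'
    a, b = 0.4, 0.8                   # interval of interest at time t'
    n_paths, n_steps = 100_000, 200
    dt = (t1 - t0) / n_steps

    rng = np.random.default_rng(seed=0)
    y = np.full(n_paths, y0)
    t = t0
    for _ in range(n_steps):
        dX = rng.normal(0.0, np.sqrt(dt), size=n_paths)   # Brownian increments
        y = y + A(y, t) * dt + B(y, t) * dX               # Euler-Maruyama step
        t += dt

    prob = np.mean((y > a) & (y < b))
    print(f"Estimated transition probability: {prob:.4f}")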


This function satisfies two equations:
1) The Forward Equation
2) The Backward Equation
(both of these are parabolic partial differential equations)



Important

When referring to a probability distribution function you should always be explicit about the type of function you are referring to.
It is very common for people to talk about probability density functions to mean both discrete and continuous probability functions.
Cumulative distribution functions are also used to specify the distribution of multivariate random variables.
Every random variable has either an associated probability mass function (discrete random variable) or probability density function (continuous random variable).
In a symmetrical distribution the mean, median and mode all have the same value (or extremely close in sample data), as illustrated in the sketch below.
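As a quick sketch of that last point, assuming Python with numpy and scipy (the normal distribution with mean 5 is an illustrative choice).

    import numpy as np
    from scipy.stats import norm

    dist = norm(loc=5, scale=2)       # a symmetrical distribution
    mean = dist.mean()
    median = dist.median()
    xs = np.linspace(-5, 15, 10_001)  # locate the mode as the peak of the density
    mode = xs[np.argmax(dist.pdf(xs))]

    print(mean, median, mode)         # all three are (essentially) 5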


© 2024 Better Solutions Limited. All Rights Reserved.