Probability Distribution

A probability distribution assigns a probability to each possible outcome of a random experiment. In other words, it tells us how likely each of the possible outcomes is. Here you will learn more about probability distributions.

Probability distribution is one of the most important concepts in statistics. It has huge applications in trade, engineering, medicine, and other major sectors. It is mainly used for future predictions based on a sample for a random experiment.

Step by step guide to the probability distribution

A probability distribution gives the probability of each possible result of a random experiment. It is defined on the underlying sample space, the set of all possible outcomes of the experiment. This set can be a set of real numbers, a set of vectors, or a set of any other entities. Probability distributions are a core part of probability and statistics.

A random experiment is an experiment whose outcome cannot be predicted in advance. For example, if we toss a coin, we cannot predict whether it will come up as a Head or as a Tail. A possible result of a random experiment is called an outcome, and the set of all outcomes is called the sample space. From such experiments, we can always build a probability table that pairs each value of a variable with its probability.

Probability distribution of random variables

A random variable has a probability distribution that specifies the probabilities of its possible values. A random variable can be discrete, continuous, or a mixture of both. A discrete random variable takes any of a designated finite or countable list of values, each assigned a probability by the probability mass function of its distribution; a continuous random variable can take any numerical value in an interval or set of intervals.

Two random variables with the same probability distribution can still differ in their relationships with other random variables, for example in whether they are independent of them. Sampling a random variable means drawing outcomes at random according to the variable's probability distribution function.
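Sampling can be illustrated with a short sketch using Python's standard `random` module; the biased coin and its probabilities here are hypothetical values chosen for the example:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# A hypothetical discrete random variable: a biased coin with P(Head) = 0.7.
values = ["Head", "Tail"]
probs = [0.7, 0.3]

# Sampling the random variable: drawing outcomes according to its distribution.
draws = random.choices(values, weights=probs, k=10_000)

# The empirical frequency of "Head" approaches its probability 0.7.
print(round(draws.count("Head") / 10_000, 2))
```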

Probability distribution formulas

Types of the probability distribution

There are two main types of probability distributions, used for different purposes and for different kinds of data-generating processes.

  1. Continuous probability distribution (for example, the normal distribution)
  2. Discrete probability distribution (for example, the binomial distribution)

Continuous probability distribution:

In a continuous probability distribution, the set of possible outcomes can take any value in a continuous range, such as an interval of real numbers. The best-known continuous distribution is the normal distribution.

The formula for the normal distribution is:

\(\color{blue}{P\left(x\right)=\frac{1}{\sigma \sqrt{2\pi }}e^{-\frac{1}{2}\left(\frac{x-\mu }{\sigma }\right)^2}}\)

Where,

  • \(\mu =\) Mean value
  • \(\sigma =\) Standard deviation
  • \(x =\) Normal random variable

If the mean \((\mu) = 0\) and the standard deviation \((\sigma) = 1\), then this distribution is known as the standard normal distribution.
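As a quick sketch, the density formula above can be evaluated directly with Python's standard `math` module (a minimal illustration, not a statistics library):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Normal density with mean mu and standard deviation sigma."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# For the standard normal (mu = 0, sigma = 1), the density peaks at x = 0
# with value 1 / sqrt(2 * pi) ≈ 0.3989.
print(round(normal_pdf(0.0), 4))  # → 0.3989
```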

Discrete probability distribution:

A distribution is called a discrete probability distribution when its set of outcomes is discrete. For example, if a die is rolled, all the possible outcomes are discrete, and each outcome carries a probability mass. The function assigning these probabilities is known as the probability mass function.

The binomial distribution describes \(n\) repeated independent trials, where each trial either succeeds or fails. The formula for the binomial distribution is:

\(\color{blue}{P\left(x\right)=\frac{n!}{r!\left(n-r\right)!}.p^r\left(1-p\right)^{n-r}}\)

or, equivalently,

\(\color{blue}{P\left(x\right)=C\:\left(n,\:r\right).p^r\left(1-p\right)^{n-r}}\)

Where,

  • \(n =\) Total number of trials
  • \(r =\) Total number of successful trials
  • \(p =\) Probability of success on a single trial
  • \(C\left(n,\:r\right)={^n}C_r=\frac{n!}{r!\left(n-r\right)!}\)
  • \(1 – p =\) Probability of failure
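The binomial formula above translates directly into Python; `math.comb` computes \(C(n,\:r)\):

```python
import math

def binomial_pmf(r, n, p):
    """P(X = r): probability of exactly r successes in n independent trials,
    each succeeding with probability p."""
    return math.comb(n, r) * p**r * (1 - p)**(n - r)

# Example: exactly 2 heads in 4 fair coin tosses, C(4, 2) * (1/2)^4 = 6/16.
print(binomial_pmf(2, 4, 0.5))  # → 0.375
```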

What is negative binomial distribution?

In probability theory and statistics, the negative binomial distribution is the discrete probability distribution of the number of successes in a series of independent and identically distributed Bernoulli trials before a specified number of failures occurs. Here the number of failures is denoted by \(r\).

For example, suppose we roll a die and count the occurrence of \(1\) as a failure and every non-\(1\) as a success. If we roll the die repeatedly until \(1\) appears for the third time, i.e. \(r = 3\) failures, then the probability distribution of the number of non-\(1\)s rolled is a negative binomial distribution.
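The dice example can be sketched with the negative binomial probability mass function \(P(X=k)=C(k+r-1,\:k)\,p^k(1-p)^r\), the probability of \(k\) successes before the \(r\)-th failure (here, success means rolling a non-\(1\)):

```python
import math

def neg_binomial_pmf(k, r, p):
    """P(X = k): probability of k successes before the r-th failure,
    where each trial succeeds with probability p."""
    return math.comb(k + r - 1, k) * p**k * (1 - p)**r

# Dice example: success = rolling a non-1 (p = 5/6), stop at the 3rd roll of 1.
# k = 0 means rolling three 1s in a row: (1/6)^3.
print(round(neg_binomial_pmf(0, 3, 5/6), 6))  # → 0.00463

# Sanity check: the probabilities over k = 0, 1, 2, ... sum to 1.
print(round(sum(neg_binomial_pmf(k, 3, 5/6) for k in range(500)), 6))  # → 1.0
```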

What is a Poisson probability distribution?

The Poisson probability distribution is a discrete probability distribution that gives the probability that a certain number of events occur in a fixed interval of time or space, if these events occur at a known constant rate and independently of the time since the last event.

The Poisson distribution can also be applied to the number of events occurring over other kinds of intervals, such as distance, area, or volume.
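The Poisson probability mass function is \(P(X=k)=\frac{\lambda^k e^{-\lambda}}{k!}\), where \(\lambda\) is the average rate. A minimal sketch (the call-center numbers are a made-up example):

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson distribution with average rate lam."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Hypothetical example: calls arrive at a steady rate of 3 per hour.
# Probability of exactly 2 calls in a given hour:
print(round(poisson_pmf(2, 3.0), 4))  # → 0.224
```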

Probability distribution function

The function used to define a probability distribution is called a probability distribution function. Its exact form depends on the type of distribution; for continuous distributions, it is expressed in terms of the probability density function of the random variable.

In the case of a continuous distribution, the cumulative distribution function of a real-valued random variable \(X\) is the function given by:

\(F_X\left(x\right)=P\left(X\le x\right)\)

Where \(P\) denotes the probability that the random variable \(X\) takes a value less than or equal to \(x\).

For an interval \((a,\:b]\), the cumulative probability function gives:

\(\color{blue}{P\left(a<X\:\le \:b\right)=F_X\left(b\right)-F_X\left(a\right)}\)

If we express the cumulative probability function as an integral of the probability density function \(f_X\), then:

\(\color{blue}{F_X\left(x\right)=\int _{-\infty }^x\:f_X\left(t\right)dt}\)

For a single value \(X=b\), the probability can be written in terms of the cumulative probability function as:

\(\color{blue}{P\left(X=b\right)=F_X\left(b\right)-\lim _{x\to b^- }F_X\left(x\right)}\)
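As a quick numerical check of the relationship between the density and the cumulative function, the sketch below approximates the standard normal CDF by integrating its density with the trapezoidal rule (a minimal illustration, not a production routine):

```python
import math

def normal_pdf(x):
    """Standard normal density (mu = 0, sigma = 1)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def normal_cdf(x, lo=-10.0, steps=100_000):
    """Approximate F_X(x) = integral of the pdf from -infinity to x,
    truncating the lower limit at lo, where the density is negligible."""
    h = (x - lo) / steps
    total = 0.5 * (normal_pdf(lo) + normal_pdf(x))
    total += sum(normal_pdf(lo + i * h) for i in range(1, steps))
    return h * total

# By symmetry, F_X(0) = 0.5 for the standard normal.
print(round(normal_cdf(0.0), 4))  # → 0.5

# P(a < X <= b) = F_X(b) - F_X(a), e.g. within one standard deviation of 0:
print(round(normal_cdf(1.0) - normal_cdf(-1.0), 3))  # → 0.683
```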

In the case of a binomial distribution, since the random variable is discrete, the function gives the probability that the variable takes exactly some value. This function is called the probability mass function.

The probability mass function is defined for scalar or multivariate random variables whose domain is discrete. Its formula is as follows:

Suppose a random variable \(X\) on a sample space \(S\) is defined as:

\(X: S → A\)

where \(A\) is a countable subset of the real numbers, so that \(X\) is a discrete random variable.

Then the probability mass function \(f_X: A→\left[0,1\right]\) for \(X\) can be defined as:

\(\color{blue}{f_X\left(x\right)\:=\:Pr\left(X=x\right)=P\left(\left\{s\in S:\:X\left(s\right)=x\right\}\right)}\)

Probability distribution table

The table can be created from a random variable and its possible outcomes. For example, a random variable \(X\) is a real-valued function whose domain is the sample space of a random experiment. The probability distribution \(P(X)\) of the random variable \(X\) is the system of numbers:

\(X\): \(x_1\), \(x_2\), \(x_3\), ……, \(x_n\)

\(P(X)\): \(P_1\), \(P_2\), \(P_3\), ……, \(P_n\)

where \(P_i > 0\), \(i=1\) to \(n\), and \(P_1+P_2+P_3+ …….. +P_n =1\)
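A minimal sketch of such a table for a fair six-sided die, checking that every \(P_i > 0\) and that the probabilities sum to \(1\):

```python
from fractions import Fraction

# Probability distribution table for X = outcome of a fair six-sided die.
table = {x: Fraction(1, 6) for x in range(1, 7)}

assert all(p > 0 for p in table.values())  # every P_i > 0
print(sum(table.values()))  # → 1
```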

What is the Prior probability?

In Bayesian statistical inference, the prior probability distribution of an uncertain quantity, often simply called the prior, is the probability distribution that expresses one's beliefs about this quantity before any evidence is taken into account. For example, a prior probability distribution might represent the relative proportions of voters who will vote for particular politicians in the next election. The uncertain quantity may be a parameter of the model or a latent variable rather than an observable variable.

What is Posterior probability?

The posterior probability is the probability of an event after all data or background information has been taken into account. It is closely related to the prior probability, which is the probability of the event before any new data or evidence is considered: the posterior is an update of the prior. It is calculated with Bayes' theorem:

\(\color{blue}{P\left(A\mid B\right)=\frac{P\left(B\mid A\right)\:P\left(A\right)}{P\left(B\right)}}\)

where \(P(A)\) is the prior probability of \(A\) and \(P(A\mid B)\) is its posterior probability given the evidence \(B\).

Posterior probabilities are commonly used in Bayesian hypothesis testing. For example, suppose old data show that around \(60\%\) of students who begin college graduate within \(4\) years; this is the prior probability. If we suspect the figure is actually lower, we collect new data, and the data indicate that the true figure is closer to \(50\%\); this is the posterior probability.
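A small sketch of a Bayesian update via Bayes' theorem; the likelihood and evidence values below are hypothetical numbers chosen so that a prior of \(0.6\) is revised down to \(0.5\):

```python
def posterior(prior, likelihood, evidence):
    """Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Hypothetical values: prior P(graduate) = 0.6, likelihood P(data | graduate) = 0.5,
# and overall P(data) = 0.6.
print(posterior(0.6, 0.5, 0.6))  # → 0.5
```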
