Uniform Distribution (continuous)
Definition and Overview
In probability theory and statistics, the uniform distribution (continuous) is a type of probability distribution in which all outcomes are equally likely within a given interval. This distribution is also known as the rectangular distribution due to its constant probability density function (PDF) over the interval. The uniform distribution is defined by two parameters, \(a\) and \(b\), which represent the minimum and maximum values of the interval, respectively.
Probability Density Function (PDF)
The probability density function of a continuous uniform distribution is given by:
\[ f(x) = \begin{cases} \frac{1}{b-a} & \text{for } a \leq x \leq b \\ 0 & \text{otherwise} \end{cases} \]
Here, \(a\) and \(b\) are the lower and upper bounds of the distribution, respectively. The PDF is constant over the interval \([a, b]\), indicating that each value within this range is equally probable.
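The piecewise definition above translates directly into code. A minimal sketch in Python (the function name `uniform_pdf` is illustrative, not from any particular library):

```python
def uniform_pdf(x: float, a: float, b: float) -> float:
    """Density of the continuous uniform distribution on [a, b]."""
    if a <= x <= b:
        return 1.0 / (b - a)  # constant height of the "rectangle"
    return 0.0  # zero density outside the interval
```

For example, on the interval \([0, 2]\) the density is \(0.5\) at every point inside the interval and \(0\) outside it.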
Cumulative Distribution Function (CDF)
The cumulative distribution function (CDF) of the uniform distribution is given by:
\[ F(x) = \begin{cases} 0 & \text{for } x < a \\ \frac{x-a}{b-a} & \text{for } a \leq x \leq b \\ 1 & \text{for } x > b \end{cases} \]
The CDF represents the probability that a random variable \(X\) drawn from the uniform distribution is less than or equal to \(x\).
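The CDF also has a simple closed form, and interval probabilities follow by differencing it. A sketch (the helper name `uniform_cdf` is illustrative):

```python
def uniform_cdf(x: float, a: float, b: float) -> float:
    """P(X <= x) for X ~ Uniform(a, b)."""
    if x < a:
        return 0.0
    if x > b:
        return 1.0
    return (x - a) / (b - a)  # linear ramp from 0 to 1 across [a, b]

# Interval probabilities by differencing: P(c <= X <= d) = F(d) - F(c).
# For X ~ Uniform(0, 4): P(1 <= X <= 3) = F(3) - F(1) = 0.75 - 0.25 = 0.5
```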
Properties
Mean and Variance
The mean (expected value) and variance of a continuous uniform distribution are given by:
\[ \text{Mean} = \frac{a + b}{2} \]
\[ \text{Variance} = \frac{(b - a)^2}{12} \]
These properties indicate that the mean is the midpoint of the interval \([a, b]\), and the variance depends on the square of the length of the interval.
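These closed forms are easy to check against a simulation. The sketch below uses Python's standard `random` and `statistics` modules; the seed and sample size are arbitrary choices:

```python
import random
import statistics

def uniform_mean(a: float, b: float) -> float:
    """Mean (a + b) / 2 of Uniform(a, b)."""
    return (a + b) / 2

def uniform_variance(a: float, b: float) -> float:
    """Variance (b - a)^2 / 12 of Uniform(a, b)."""
    return (b - a) ** 2 / 12

# Empirical check: draw samples and compare against the formulas.
rng = random.Random(0)
sample = [rng.uniform(2.0, 8.0) for _ in range(100_000)]
# statistics.fmean(sample) is close to uniform_mean(2, 8) == 5.0
# statistics.pvariance(sample) is close to uniform_variance(2, 8) == 3.0
```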
Moments
The \(n\)-th moment of the uniform distribution is given by:
\[ E[X^n] = \frac{b^{n+1} - a^{n+1}}{(n+1)(b-a)} = \frac{1}{n+1} \sum_{k=0}^{n} a^{k} b^{n-k} \]
This formula can be used to derive higher-order moments of the distribution.
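The moment formula follows from integrating \(x^n / (b-a)\) over \([a, b]\), and a one-line implementation suffices (the function name is illustrative):

```python
def uniform_moment(n: int, a: float, b: float) -> float:
    """E[X^n] for X ~ Uniform(a, b), via the closed form
    (b^(n+1) - a^(n+1)) / ((n + 1) * (b - a))."""
    return (b ** (n + 1) - a ** (n + 1)) / ((n + 1) * (b - a))
```

As a sanity check, \(n = 1\) reproduces the mean, and \(E[X^2] - E[X]^2\) reproduces the variance.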
Applications
The continuous uniform distribution has various applications in fields such as statistics, engineering, and computer science. Some common applications include:
- **Random Number Generation**: Uniform distributions are often used in random number generation algorithms to produce random samples within a specified range.
- **Simulation**: In Monte Carlo simulation, uniform distributions are used to generate random inputs for simulations.
- **Quality Control**: In manufacturing and quality control, uniform distributions can model the distribution of measurements within specified tolerances.
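As a small illustration of the Monte Carlo use case, the sketch below estimates \(\int_0^1 x^2\,dx = 1/3\) by averaging the integrand at uniformly drawn points; the seed and sample count are arbitrary choices:

```python
import random

def mc_integrate(f, a: float, b: float, n: int = 100_000, seed: int = 42) -> float:
    """Monte Carlo estimate of the integral of f over [a, b],
    averaging f at uniform random evaluation points."""
    rng = random.Random(seed)
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

estimate = mc_integrate(lambda x: x * x, 0.0, 1.0)
# estimate is close to 1/3; the error shrinks like 1/sqrt(n)
```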
Relationship to Other Distributions
The uniform distribution is related to several other probability distributions:
- **Normal Distribution**: By the central limit theorem, the sum of a large number of independent and identically distributed uniform random variables, once centered and scaled, is approximately normally distributed.
- **Exponential Distribution**: If \(U\) is uniform on \((0, 1)\), then \(-\ln(U)\) follows an exponential distribution with rate 1. This inverse-transform relationship is the standard way to generate exponential random variates from uniform ones.
- **Beta Distribution**: The uniform distribution is a special case of the beta distribution with parameters \(\alpha = 1\) and \(\beta = 1\).
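The uniform–exponential link above is the basis of inverse-transform sampling. A sketch, assuming \(U \sim \text{Uniform}(0, 1)\) (seed and sample size are arbitrary):

```python
import math
import random

def exponential_from_uniform(rate: float, rng: random.Random) -> float:
    """Inverse-transform sampling: if U ~ Uniform(0, 1),
    then -ln(1 - U) / rate ~ Exponential(rate)."""
    u = rng.random()
    return -math.log(1.0 - u) / rate

rng = random.Random(1)
draws = [exponential_from_uniform(2.0, rng) for _ in range(100_000)]
mean_draw = sum(draws) / len(draws)
# mean_draw is close to 1 / rate == 0.5
```

Using \(1 - U\) rather than \(U\) inside the logarithm avoids evaluating \(\ln(0)\), since `rng.random()` can return exactly 0 but never 1.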
Parameter Estimation
Estimating the parameters \(a\) and \(b\) of a uniform distribution can be done using the method of moments or maximum likelihood estimation (MLE). For a sample \(X_1, X_2, \ldots, X_n\) drawn from a uniform distribution, the MLE estimates are:
\[ \hat{a} = \min(X_1, X_2, \ldots, X_n) \]
\[ \hat{b} = \max(X_1, X_2, \ldots, X_n) \]
These estimates are intuitive, as they are simply the observed minimum and maximum of the sample. Note, however, that they are biased: the sample minimum overestimates \(a\) and the sample maximum underestimates \(b\), although the bias shrinks as the sample size grows.
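A direct sketch of these estimators (the function name is illustrative):

```python
def uniform_mle(sample):
    """MLE of (a, b) for a uniform sample: the observed extremes."""
    if not sample:
        raise ValueError("sample must be non-empty")
    return min(sample), max(sample)

a_hat, b_hat = uniform_mle([3.2, 0.7, 2.5, 1.9])
# a_hat == 0.7, b_hat == 3.2
```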
Generalizations
The concept of the uniform distribution can be generalized to higher dimensions. In \(d\)-dimensional space, the uniform distribution over a \(d\)-dimensional region (e.g., a hypercube) is defined similarly, with a constant probability density over the region.
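For an axis-aligned hypercube, sampling reduces to drawing each coordinate independently. A sketch (the function name is illustrative):

```python
import random

def sample_hypercube(d: int, low: float, high: float, rng: random.Random):
    """Draw one point uniformly from the hypercube [low, high]^d
    by sampling each coordinate independently."""
    return [rng.uniform(low, high) for _ in range(d)]

rng = random.Random(7)
point = sample_hypercube(3, -1.0, 1.0, rng)
# point is a list of 3 coordinates, each in [-1, 1]
```

Independence of the coordinates is what makes this work for a hypercube; sampling uniformly from a non-rectangular region (e.g. a ball) requires other techniques, such as rejection sampling.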