Norm (mathematics)
Definition and Introduction
In mathematics, a norm is a function that assigns a strictly positive length or size to each vector in a vector space, save for the zero vector, which is assigned a length of zero. Norms are fundamental in various branches of mathematics, including linear algebra, functional analysis, and geometry. They provide a means to quantify the size of elements in a vector space and are essential in defining concepts such as distance, convergence, and continuity.
A norm is typically denoted by double vertical bars, for example, \(\| \mathbf{v} \|\), where \(\mathbf{v}\) is a vector. The formal definition of a norm on a vector space \(V\) over a field \(\mathbb{F}\) (either the real numbers \(\mathbb{R}\) or the complex numbers \(\mathbb{C}\)) is a function \(\| \cdot \|: V \to [0, \infty)\) satisfying the following properties:
1. **Non-negativity (positive definiteness)**: \(\| \mathbf{v} \| \geq 0\) for all \(\mathbf{v} \in V\), and \(\| \mathbf{v} \| = 0\) if and only if \(\mathbf{v} = \mathbf{0}\).
2. **Scalar multiplication (absolute homogeneity)**: \(\| \alpha \mathbf{v} \| = |\alpha| \| \mathbf{v} \|\) for all \(\alpha \in \mathbb{F}\) and \(\mathbf{v} \in V\).
3. **Triangle inequality**: \(\| \mathbf{u} + \mathbf{v} \| \leq \| \mathbf{u} \| + \| \mathbf{v} \|\) for all \(\mathbf{u}, \mathbf{v} \in V\).
These properties ensure that norms behave in a manner analogous to the intuitive notion of length in Euclidean space.
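The following sketch (pure Python, with hypothetical helper names such as `check_norm_axioms`) spot-checks the three axioms numerically for the Euclidean norm on random vectors; it is a sanity check, not a proof.

```python
import math
import random

def euclidean_norm(v):
    """Euclidean (L^2) norm of a vector given as a list of floats."""
    return math.sqrt(sum(x * x for x in v))

def check_norm_axioms(norm, dim=3, trials=1000, tol=1e-9):
    """Spot-check non-negativity, absolute homogeneity, and the
    triangle inequality on randomly drawn real vectors."""
    for _ in range(trials):
        u = [random.uniform(-10, 10) for _ in range(dim)]
        v = [random.uniform(-10, 10) for _ in range(dim)]
        a = random.uniform(-10, 10)
        assert norm(v) >= 0                                             # non-negativity
        assert abs(norm([a * x for x in v]) - abs(a) * norm(v)) < tol   # homogeneity
        assert norm([x + y for x, y in zip(u, v)]) <= norm(u) + norm(v) + tol  # triangle inequality
    assert norm([0.0] * dim) == 0.0                                     # ||0|| = 0

check_norm_axioms(euclidean_norm)
```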
Types of Norms
Norms can be categorized based on the vector spaces they are defined on and their specific properties. Below are some of the most common types of norms:
Euclidean Norm
The Euclidean norm, also known as the \(L^2\) norm or the standard norm, is defined on the vector space \(\mathbb{R}^n\) or \(\mathbb{C}^n\). For a vector \(\mathbf{v} = (v_1, v_2, \ldots, v_n)\), the Euclidean norm is given by:
\[ \| \mathbf{v} \|_2 = \sqrt{|v_1|^2 + |v_2|^2 + \cdots + |v_n|^2} \]
This norm corresponds to the usual notion of distance in Euclidean geometry and is widely used in various applications, including machine learning and data analysis.
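As a quick illustration, the vector \((3, 4)\) has Euclidean length \(5\). A minimal computation, assuming NumPy is available (`numpy.linalg.norm` computes the \(L^2\) norm by default):

```python
import numpy as np

v = np.array([3.0, 4.0])
print(np.linalg.norm(v))                # 5.0, since sqrt(3^2 + 4^2) = 5
print(np.sqrt(np.sum(np.abs(v) ** 2)))  # same value from the explicit formula
```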
Taxicab Norm
The taxicab norm, also known as the \(L^1\) norm or Manhattan norm, is defined as:
\[ \| \mathbf{v} \|_1 = |v_1| + |v_2| + \cdots + |v_n| \]
This norm is named for the distance a taxi travels along a rectangular grid of streets, as in Manhattan, rather than in a straight line. It is often used in optimization problems where sparsity is desired.
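For example, walking from \((0, 0)\) to \((3, 4)\) on a rectangular street grid takes \(3 + 4 = 7\) blocks, even though the straight-line (Euclidean) distance is only \(5\). A minimal sketch in Python (the helper name `taxicab_norm` is illustrative):

```python
def taxicab_norm(v):
    """L^1 norm: sum of the absolute values of the components."""
    return sum(abs(x) for x in v)

print(taxicab_norm([3, 4]))   # 7, versus a Euclidean length of 5
```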
Maximum Norm
The maximum norm, or \(L^\infty\) norm, is defined as:
\[ \| \mathbf{v} \|_\infty = \max \{ |v_1|, |v_2|, \ldots, |v_n| \} \]
This norm is useful in scenarios where the maximum deviation of a vector's components is of interest, such as in uniform convergence.
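For instance, if a vector collects the componentwise errors of an approximation, the maximum norm reports the single worst deviation. A minimal sketch with hypothetical error values:

```python
def max_norm(v):
    """L^infinity norm: largest absolute value among the components."""
    return max(abs(x) for x in v)

errors = [0.02, -0.15, 0.07, 0.01]   # hypothetical componentwise errors
print(max_norm(errors))              # 0.15, the worst-case deviation
```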
p-Norms
More generally, the \(L^p\) norm for \(1 \leq p < \infty\) is defined as:
\[ \| \mathbf{v} \|_p = \left( |v_1|^p + |v_2|^p + \cdots + |v_n|^p \right)^{1/p} \]
The \(L^p\) norms form a family that includes the \(L^1\) and \(L^2\) norms as special cases and yields the \(L^\infty\) norm in the limit \(p \to \infty\). They are extensively used in functional analysis and probability theory.
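The sketch below (hypothetical helper name `p_norm`) evaluates the formula for several values of \(p\) and illustrates numerically that the \(L^p\) norm approaches the maximum norm as \(p\) grows.

```python
def p_norm(v, p):
    """L^p norm for 1 <= p < infinity."""
    return sum(abs(x) ** p for x in v) ** (1.0 / p)

v = [3.0, -4.0, 1.0]
print(p_norm(v, 1))     # 8.0, the taxicab norm
print(p_norm(v, 2))     # ~5.099, the Euclidean norm
print(p_norm(v, 100))   # ~4.0, close to the maximum norm max|v_i|
```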


Norms in Infinite-Dimensional Spaces
In infinite-dimensional vector spaces, norms play a crucial role in the study of Banach spaces and Hilbert spaces. A Banach space is a complete normed vector space, meaning that every Cauchy sequence in the space converges to a limit within the space. The concept of completeness is essential for many areas of analysis, including the study of differential equations and Fourier analysis.
A Hilbert space is a special type of Banach space equipped with an inner product, which induces a norm. The norm \(\| \mathbf{v} \|\) in a Hilbert space is derived from the inner product \(\langle \mathbf{u}, \mathbf{v} \rangle\) as follows:
\[ \| \mathbf{v} \| = \sqrt{\langle \mathbf{v}, \mathbf{v} \rangle} \]
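For the standard inner product on \(\mathbb{C}^n\), \(\langle \mathbf{u}, \mathbf{v} \rangle = \sum_i u_i \overline{v_i}\), this induced norm reduces to the Euclidean norm. A minimal sketch under that convention (helper names are illustrative):

```python
def inner_product(u, v):
    """Standard inner product on C^n: sum of u_i * conjugate(v_i)."""
    return sum(a * b.conjugate() for a, b in zip(u, v))

def induced_norm(v):
    """Norm induced by the inner product: sqrt(<v, v>)."""
    # <v, v> is real and non-negative; abs() strips the complex type.
    return abs(inner_product(v, v)) ** 0.5

v = [1 + 1j, 2 - 1j]
print(induced_norm(v))   # sqrt(|1+1j|^2 + |2-1j|^2) = sqrt(7) ≈ 2.6458
```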
Hilbert spaces are fundamental in quantum mechanics and signal processing.
Applications of Norms
Norms are utilized in numerous mathematical and practical applications. Some notable examples include:
Numerical Analysis
In numerical analysis, norms are used to measure errors and assess the stability and convergence of numerical algorithms. The choice of norm can significantly affect the performance and accuracy of numerical methods.
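As a toy illustration (with a hypothetical error vector), the same error can look quite different in different norms, which is why error bounds should always state the norm being used:

```python
errors = [0.001] * 10_000   # many small componentwise errors

l2_error = sum(e * e for e in errors) ** 0.5   # 0.1
max_error = max(abs(e) for e in errors)        # 0.001
print(l2_error, max_error)   # the L^2 error is 100x the L^infinity error here
```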
Optimization
In optimization, norms are employed to define objective functions and constraints. For instance, the \(L^1\) norm is often used in sparse optimization problems, while the \(L^2\) norm is common in least squares problems.
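A minimal sketch of the two objectives (pure Python, hypothetical function names; in practice a library such as SciPy or scikit-learn would handle the minimization):

```python
def least_squares_objective(A, x, b):
    """Classic least squares: squared L^2 norm of the residual Ax - b."""
    residual = [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) - b_i
                for row, b_i in zip(A, b)]
    return sum(r * r for r in residual)

def lasso_objective(A, x, b, lam):
    """Sparsity-promoting objective: L^2 data fit plus an L^1 penalty."""
    return least_squares_objective(A, x, b) + lam * sum(abs(x_j) for x_j in x)

A = [[1.0, 0.0], [0.0, 2.0]]
b = [1.0, 2.0]
x = [1.0, 1.0]
print(least_squares_objective(A, x, b))    # 0.0, exact fit
print(lasso_objective(A, x, b, lam=0.1))   # 0.2 = 0.0 + 0.1 * (|1| + |1|)
```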
Machine Learning
Norms are integral to machine learning algorithms, particularly in regularization techniques such as ridge regression and lasso regression. These techniques use norms to penalize large coefficients, promoting simpler and more interpretable models.
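The effect can be seen on two hypothetical coefficient vectors with the same Euclidean norm: the \(L^2\) (ridge) penalty does not distinguish them, while the \(L^1\) (lasso) penalty favors the sparse one.

```python
sparse = [1.0, 0.0, 0.0, 0.0]
dense  = [0.5, 0.5, 0.5, 0.5]

l2 = lambda w: sum(x * x for x in w) ** 0.5
l1 = lambda w: sum(abs(x) for x in w)

print(l2(sparse), l2(dense))   # 1.0 1.0  -> ridge penalizes both equally
print(l1(sparse), l1(dense))   # 1.0 2.0  -> lasso prefers the sparse vector
```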
Signal Processing
In signal processing, norms are used to quantify the energy or power of signals. The \(L^2\) norm, in particular, is used to measure the energy of continuous and discrete signals.
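For a discrete signal, the energy is the squared \(L^2\) norm of its samples. A minimal sketch with a hypothetical sampled sine wave:

```python
import math

signal = [math.sin(2 * math.pi * n / 100) for n in range(100)]  # one period

energy = sum(x * x for x in signal)   # squared L^2 norm, ~50.0
rms = (energy / len(signal)) ** 0.5   # root-mean-square amplitude, ~0.7071
print(energy, rms)
```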
Properties of Normed Spaces
Normed spaces exhibit several important properties that are crucial for analysis and applications:
Convexity
The unit ball in a normed space, defined as \(\{ \mathbf{v} \in V : \| \mathbf{v} \| \leq 1 \}\), is always a convex set. This property is essential in convex analysis and optimization.
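Concretely, if \(\| \mathbf{u} \| \leq 1\) and \(\| \mathbf{v} \| \leq 1\), then \(\| t\mathbf{u} + (1 - t)\mathbf{v} \| \leq t\| \mathbf{u} \| + (1 - t)\| \mathbf{v} \| \leq 1\) for every \(t \in [0, 1]\), by homogeneity and the triangle inequality. The sketch below spot-checks this numerically for the Euclidean norm (a sanity check, not a proof):

```python
import random

def euclidean_norm(v):
    return sum(x * x for x in v) ** 0.5

for _ in range(1000):
    # draw two points and rescale them into the unit ball if needed
    u = [random.uniform(-1, 1) for _ in range(3)]
    v = [random.uniform(-1, 1) for _ in range(3)]
    nu, nv = euclidean_norm(u), euclidean_norm(v)
    u = [x / max(1.0, nu) for x in u]
    v = [x / max(1.0, nv) for x in v]
    t = random.random()
    w = [t * a + (1 - t) * b for a, b in zip(u, v)]
    assert euclidean_norm(w) <= 1.0 + 1e-12   # convex combination stays inside
```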
Dual Space
The dual space of a normed space \(V\) is the space of all continuous linear functionals on \(V\). The dual norm is defined on this space, providing a framework for duality theory in optimization and functional analysis.
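As a concrete, standard example (sketched with hypothetical helper names): on \(\mathbb{R}^n\) equipped with the \(L^1\) norm, the dual norm of the functional \(\mathbf{v} \mapsto \sum_i a_i v_i\) is the \(L^\infty\) norm of \(a\).

```python
def functional_value(a, v):
    """Value of the linear functional f_a(v) = sum(a_i * v_i)."""
    return sum(ai * vi for ai, vi in zip(a, v))

def dual_norm_wrt_l1(a):
    """Dual norm of f_a with respect to the L^1 norm: sup of |f_a(v)|
    over ||v||_1 <= 1, which equals max|a_i| (attained at a signed
    standard basis vector)."""
    return max(abs(ai) for ai in a)

a = [2.0, -5.0, 1.0]
print(dual_norm_wrt_l1(a))                         # 5.0
print(abs(functional_value(a, [0.0, -1.0, 0.0])))  # 5.0, attained at -e_2
```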
Completeness
Completeness is a key property of Banach spaces, ensuring that limits of Cauchy sequences exist within the space. This property is vital for the convergence analysis of sequences and series in functional spaces.