Perceptron
Introduction
The perceptron is a type of artificial neural network invented in 1957 by Frank Rosenblatt. It is a binary classifier that makes predictions based on a linear predictor function combining a set of weights with the feature vector. The concept of the perceptron is foundational to the field of machine learning.
History
The perceptron was developed at the Cornell Aeronautical Laboratory in 1957 by Frank Rosenblatt, with funding from the United States Office of Naval Research. The initial perceptron was a single-layer feedforward neural network whose adjustable weights were updated by supervised learning, inspired by the Hebbian learning rule.
Mathematical Formulation
The perceptron algorithm is used for supervised learning of binary classifiers: functions that decide whether an input, represented by a vector of numbers, belongs to one class or another. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector.
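In the standard formulation, the output is a hard threshold applied to a weighted sum of the inputs, where \mathbf{w} is the weight vector, \mathbf{x} the input feature vector, and b the bias:

$$
f(\mathbf{x}) =
\begin{cases}
1 & \text{if } \mathbf{w} \cdot \mathbf{x} + b > 0, \\
0 & \text{otherwise}.
\end{cases}
$$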
Learning Algorithm
The perceptron learning algorithm is a simple and effective method for binary classification. The algorithm updates the weights incrementally: whenever a training example is misclassified, the weight vector is adjusted by adding (or subtracting) a multiple of that example's feature vector, nudging the decision boundary toward classifying it correctly. The perceptron convergence theorem guarantees that if the training data are linearly separable, this procedure finds a separating weight vector after a finite number of updates.
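As an illustration, here is a minimal sketch of the learning rule in Python with NumPy. The function name train_perceptron, the 0/1 label convention, the learning-rate parameter, and the fixed epoch cap are choices made for this example, not part of any standard API:

import numpy as np

def train_perceptron(X, y, lr=1.0, epochs=100):
    # X: (n_samples, n_features) array of inputs; y: array of 0/1 labels.
    n_features = X.shape[1]
    w = np.zeros(n_features)   # weight vector
    b = 0.0                    # bias term
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # Step activation: predict 1 if w.x + b > 0, else 0.
            y_hat = 1 if np.dot(w, xi) + b > 0 else 0
            if y_hat != yi:
                # Perceptron update: move w toward the misclassified example.
                w += lr * (yi - y_hat) * xi
                b += lr * (yi - y_hat)
                errors += 1
        if errors == 0:
            # A full pass with no mistakes: the training data are separated.
            break
    return w, b

For linearly separable data such as the logical AND function, the loop terminates with zero errors: train_perceptron(np.array([[0,0],[0,1],[1,0],[1,1]]), np.array([0,0,0,1])) returns a weight vector and bias that classify all four points correctly.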
Limitations and Criticisms
The single-layer perceptron is the simplest of the artificial neural networks and, as such, it is also the most limited in its capabilities. In particular, it cannot learn to classify data sets that are not linearly separable, the canonical example being the XOR function (illustrated below). This limitation was famously demonstrated by Marvin Minsky and Seymour Papert in their book "Perceptrons", published in 1969.
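The four XOR points cannot be split by any single line, so no weight vector exists for the algorithm to converge to. Using the illustrative train_perceptron sketch from the previous section, training never reaches a zero-error pass:

# XOR: no line separates {(0,1), (1,0)} from {(0,0), (1,1)}.
X_xor = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y_xor = np.array([0, 1, 1, 0])
w, b = train_perceptron(X_xor, y_xor, epochs=1000)
preds = [1 if np.dot(w, xi) + b > 0 else 0 for xi in X_xor]
print(preds == list(y_xor))  # False: at least one point is always misclassified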
Applications
Despite its limitations, the perceptron algorithm has been used successfully in a number of applications. Because each update is cheap and examples can be processed one at a time, it is well suited to large-scale problems with high-dimensional input spaces, such as text classification and image recognition.
References
1. Rosenblatt, F. (1958). The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain. Psychological Review, 65(6), 386-408.
2. Minsky, M., & Papert, S. (1969). Perceptrons: An Introduction to Computational Geometry. MIT Press.