Simulated Annealing
Introduction
Simulated annealing (SA) is a probabilistic technique used for finding an approximate solution to an optimization problem. The method is inspired by the process of annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.
History
The concept of simulated annealing was introduced independently by Scott Kirkpatrick, C. Daniel Gelatt and Mario P. Vecchi in 1983, and by Vlado Černý in 1985. The method was inspired by annealing in metallurgy, a process used to reduce defects in crystalline structures, which was then abstracted and applied to the field of computational optimization.
Basic Concept
Simulated annealing is a metaheuristic for approximating the global optimum of a given function. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For problems where finding an acceptably good solution in a fixed amount of time matters more than finding the best possible solution, simulated annealing may be preferable to exhaustive enumeration.
The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects; both are attributes of the material that depend on its thermodynamic state. Heating and cooling the material influence the rate of diffusion, and thus the rate at which atoms or molecules move through the material, allowing it to settle into configurations of lower internal energy.
Algorithm
The simulated annealing algorithm starts with a random solution to the problem and a high "temperature". At each step it generates a random "neighboring" solution and decides, based on the objective function and the current temperature, whether to move to it or stay with the current solution: moves that improve the objective are always accepted, while worse moves are accepted with a probability that shrinks as the temperature falls. As the algorithm progresses, the temperature is gradually lowered, reducing the chance of accepting worse solutions. The process is repeated until the system cools to a "frozen" state, ideally at or near a good solution.
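Concretely, the loop described above might look like the following Python sketch. It is a minimal illustration rather than a reference implementation: it assumes the caller supplies an initial solution, an objective function to minimize, and a random_neighbor function, and all parameter names and default values here are arbitrary choices.

import math
import random

def simulated_annealing(initial_solution, objective, random_neighbor,
                        t_initial=1.0, t_final=1e-3, cooling=0.95,
                        steps_per_temperature=100):
    """Minimize `objective` starting from `initial_solution`.

    `random_neighbor(x)` must return a random neighboring solution of x.
    The temperature follows a simple geometric cooling schedule.
    """
    current = initial_solution
    current_cost = objective(current)
    best, best_cost = current, current_cost
    temperature = t_initial

    while temperature > t_final:
        for _ in range(steps_per_temperature):
            candidate = random_neighbor(current)
            candidate_cost = objective(candidate)
            delta = candidate_cost - current_cost
            # Always accept improvements; accept worse moves with
            # probability exp(-delta / temperature) (Metropolis criterion).
            if delta <= 0 or random.random() < math.exp(-delta / temperature):
                current, current_cost = candidate, candidate_cost
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        temperature *= cooling  # lower the temperature geometrically
    return best, best_cost

Slower cooling (a factor closer to 1) and more steps per temperature generally trade extra running time for better final solutions.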
Mathematical Description
The simulated annealing algorithm is a random search algorithm with an objective function f(x) to be minimized. The algorithm is initialized with a system at high temperature, where nearly any move is accepted. As the system cools, the algorithm becomes more selective, accepting moves that worsen the objective with ever lower probability.
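The most common acceptance rule, borrowed from the Metropolis algorithm, makes this selectivity precise. If x is the current state, x' is a candidate neighboring state, and T is the current temperature, the candidate is accepted with probability

    P(accept) = 1                               if f(x') <= f(x)
    P(accept) = exp(-(f(x') - f(x)) / T)        if f(x') > f(x)

At high T the acceptance probability for a worse move is close to 1, so the search wanders almost freely; as T approaches 0, only improving moves are accepted and the search behaves like a greedy descent.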
The algorithm's main components are a generator of random initial states, a generator of random modifications to a given state that defines the neighborhood to be searched, an objective function that assigns a quality value to each state, and a cooling schedule that determines how the temperature is lowered over time.
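As a concrete illustration of the last component, two commonly used cooling schedules are sketched below in Python; the function names and the constants 0.95 and 0.01 are arbitrary values chosen for this example, not prescribed ones.

def geometric_cooling(temperature, alpha=0.95):
    """Geometric (exponential) schedule: T_next = alpha * T, with 0 < alpha < 1."""
    return alpha * temperature

def linear_cooling(temperature, step=0.01):
    """Linear schedule: subtract a fixed amount each step, never going below zero."""
    return max(temperature - step, 0.0)

The geometric schedule is the one used in the loop sketched earlier; choosing alpha closer to 1 cools more slowly and generally yields better solutions at the cost of more iterations.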
Applications
Simulated annealing has been used for a wide range of optimization problems. These include numerical optimization, data fitting, problems arising in machine learning, combinatorial problems such as the traveling salesman problem, and many types of scheduling problems, such as school timetabling.
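To make the traveling salesman example concrete, one standard neighborhood move (shown here purely for illustration, with the tour assumed to be a list of city indices) reverses a randomly chosen segment of the tour, a 2-opt style move.

import random

def random_neighbor(tour):
    """Return a new tour with a randomly chosen segment reversed (2-opt style move)."""
    i, j = sorted(random.sample(range(len(tour)), 2))
    # Reverse the segment tour[i..j]; the rest of the tour is unchanged.
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

Paired with a tour-length objective, this move generator can be plugged directly into the simulated_annealing sketch given earlier.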
Advantages and Disadvantages
Like any optimization algorithm, simulated annealing has its advantages and disadvantages. Its main advantages are its simplicity and its ability to escape local minima by occasionally accepting worse solutions. Its main disadvantages are the difficulty of choosing a good cooling schedule and the fact that it can be slow to converge, since many candidate solutions usually have to be evaluated.