Simulated annealing (SA) has been widely used to tackle a variety of combinatorial optimization problems, including academic timetabling problems [12, 13, 2, 48, 28, 8, 14]. The basic algorithm is described in Figure 2. The results obtained depend heavily on the cooling schedule used. We initially used geometric cooling, the most widely known and used schedule, but later tried adaptive cooling, as well as the method of geometric reheating based on cost [3].
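As an illustration of the schedules mentioned above, the following is a minimal sketch of a geometric cooling update together with an optional cost-based reheating step; the parameter names (alpha, beta) and the reheating trigger are illustrative assumptions, not the settings used in our experiments.

```python
# Minimal sketch of cooling-schedule updates (parameter values are illustrative).

def geometric_cooling(T, alpha=0.95):
    """Geometric cooling: T_{k+1} = alpha * T_k, with 0 < alpha < 1."""
    return alpha * T

def geometric_reheating(T, beta=1.05):
    """Geometric reheating: raise the temperature again, e.g. when the search
    appears frozen in a high-cost configuration (assumed trigger)."""
    return beta * T
```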
A comprehensive discussion of the theoretical and practical details of SA
has been given in a number of books and review articles on the subject
[1, 47, 40].
It suffices here to say that the elementary operation is the Metropolis
Monte Carlo update, which involves the generation of some new candidate
configuration (in this case, a schedule),
which is then automatically accepted if it lowers the cost ($C$),
or accepted with probability $e^{-\Delta C / T}$, where $T$ is the
temperature parameter, if it would increase the cost by $\Delta C$.
In Figure 2, $s$ is the current schedule (configuration)
and $s'$ is a neighboring schedule obtained from the current neighborhood
space ($N(s)$) by assigning a class to a timeslot and room or by
swapping two classes in time and/or space.
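As a concrete illustration of such neighborhood moves, the following sketch generates a neighbor of a schedule represented as a mapping from classes to (timeslot, room) pairs; the representation and function names are assumptions made for illustration, not the data structures of our implementation.

```python
import random

def random_neighbor(schedule, timeslots, rooms):
    """Return a neighboring schedule: either reassign one class to a new
    (timeslot, room), or swap the assignments of two classes.
    `schedule` is assumed to be a dict mapping class -> (timeslot, room)."""
    neighbor = dict(schedule)
    classes = list(neighbor)
    if random.random() < 0.5:
        # Move: assign a single class to a new timeslot and room.
        c = random.choice(classes)
        neighbor[c] = (random.choice(timeslots), random.choice(rooms))
    else:
        # Swap: exchange the (timeslot, room) assignments of two classes.
        c1, c2 = random.sample(classes, 2)
        neighbor[c1], neighbor[c2] = neighbor[c2], neighbor[c1]
    return neighbor
```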
The annealing procedure starts at a high temperature, and then slowly
decreases the temperature to zero, following a specified cooling schedule.
Simulated annealing is essentially a generalization of a local optimization
strategy, but it improves upon it because, at non-zero temperatures,
the system has a chance to escape from local minima.
Figure 2: The Simulated Annealing Algorithm
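The following is a minimal sketch of the annealing loop described above, combining the Metropolis acceptance rule with the geometric cooling and neighborhood-move sketches; the stopping criterion and parameter values are illustrative assumptions, not those of Figure 2.

```python
import math
import random

def simulated_annealing(initial_schedule, cost, neighbor, T0=100.0,
                        alpha=0.95, steps_per_temperature=1000, T_min=1e-3):
    """Basic SA loop (illustrative parameters): repeatedly propose a
    neighboring schedule and accept it with the Metropolis criterion,
    cooling geometrically until the temperature is effectively zero."""
    s = initial_schedule
    T = T0
    while T > T_min:
        for _ in range(steps_per_temperature):
            s_new = neighbor(s)
            delta_c = cost(s_new) - cost(s)
            # Accept improvements outright; accept uphill moves with
            # probability exp(-delta_C / T).
            if delta_c <= 0 or random.random() < math.exp(-delta_c / T):
                s = s_new
        T *= alpha  # geometric cooling step
    return s
```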
The SA algorithm has advantages and disadvantages compared to other global optimization techniques. Among its advantages are its relative ease of implementation, its applicability to virtually any combinatorial optimization problem, its ability to provide reasonably good solutions for most problems (depending on the cooling schedule and update moves used), and the ease with which it can be combined with other methods, such as expert systems and branch-and-bound techniques, to form useful hybrid methods for tackling a range of complex problems. Among its drawbacks are that the update moves and the various tunable parameters (such as the cooling rate) must be chosen carefully to obtain good results, that runs often require a great deal of computer time, and that many runs may be needed. Despite these problems, SA is a robust technique and has been applied successfully to a wide variety of problems. Detailed comparisons of SA with other optimization techniques on some well-known problems have shown that SA is competitive with many of the best heuristics [30].