next up previous contents
Next: Genetic Algorithms (GAs) Up: Simulated Annealing Previous: Huang-Romeo-Sangiovanni schedule

Applications

Using SA in a multi-stage approach, we have tackled a large instance of the course scheduling problem. As reported in [42], our results depend strongly on the cooling schedule as well as on the strategy of swaps (or moves) between courses, rooms, and so on. This application and the results reported in [42] show that SA is a viable method for complex problems such as scheduling.
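To make the kind of annealing loop involved concrete, here is a minimal sketch in Python: a toy timetabling instance in which courses are assigned to timeslots, the cost counts conflicting course pairs sharing a slot, and a geometric cooling schedule drives the moves. The instance, move set, and schedule parameters are hypothetical illustrations, not those of [42].

```python
import math
import random

# Toy instance (hypothetical): 6 courses, 3 timeslots. Each pair in
# CONFLICTS shares students and should not be placed in the same slot.
CONFLICTS = {(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (0, 5), (1, 4)}

def cost(slots):
    """Number of conflicting course pairs scheduled in the same timeslot."""
    return sum(1 for (i, j) in CONFLICTS if slots[i] == slots[j])

def propose(slots, rng):
    """Move: reassign a random course to a random different timeslot."""
    new = list(slots)
    c = rng.randrange(len(new))
    new[c] = rng.choice([t for t in range(3) if t != new[c]])
    return tuple(new)

def anneal(t0=5.0, alpha=0.99, steps=3000, seed=1):
    """SA with a geometric cooling schedule t <- alpha * t after each step."""
    rng = random.Random(seed)
    cur = tuple(rng.randrange(3) for _ in range(6))
    cur_c = cost(cur)
    best, best_c = cur, cur_c
    t = t0
    for _ in range(steps):
        cand = propose(cur, rng)
        c = cost(cand)
        # Accept downhill moves always; uphill with Boltzmann probability.
        if c <= cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        t *= alpha
    return best, best_c
```

Swapping in a different cooling schedule or move strategy only means changing `anneal`'s temperature update or `propose`, which is what makes comparisons of the kind reported in [42] straightforward to run.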

In a series of articles [15,16,35], David Johnson and colleagues present an extensive, high-quality experimental survey of SA. In the first article, annealing is compared with the Kernighan-Lin algorithm for graph partitioning on a number of problems varying in size and structure. Briefly, annealing did better on sparse random graphs, especially as size increased, but worse on geometrically structured graphs. Many factors were taken into account: one was the number of runs of a given algorithm needed to find a solution of a given quality; another was the construction of an appropriate penalty cost term, which keeps the two partitions approximately the same size. Many runs were made for every experiment to properly account for the randomized nature of the algorithms. Johnson's seems to be the only study in the literature to use a starting temperature significantly lower than that at which the expected energy begins to decline, validating this technique through observations about typical states at higher temperatures, as well as by direct comparison between schedules with high and low starting temperatures.
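The balance penalty mentioned above can be sketched as follows. The quadratic imbalance term and the weight `alpha` here are our illustrative choices, not necessarily the exact form used in the study:

```python
def partition_cost(edges, side, alpha=0.05):
    """Cost of a two-way partition: number of cut edges plus a quadratic
    penalty that grows as the two sides drift apart in size.
    `side` maps each vertex to 0 or 1; `alpha` is a tunable weight."""
    cut = sum(1 for (u, v) in edges if side[u] != side[v])
    n0 = sum(1 for s in side.values() if s == 0)
    n1 = len(side) - n0
    return cut + alpha * (n0 - n1) ** 2

# Usage on a 4-cycle: a balanced split pays only for its two cut edges,
# while putting every vertex on one side pays the imbalance penalty instead.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
balanced = {0: 0, 1: 0, 2: 1, 3: 1}   # cost 2: two cut edges, no penalty
lopsided = {0: 0, 1: 0, 2: 0, 3: 0}   # cost 0.8: no cut, penalty 0.05 * 16
```

The point of such a soft penalty, rather than a hard size constraint, is that it lets the move set stay simple (move one vertex across) while still steering annealing toward nearly balanced partitions.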

The second article by Johnson et al. explores the behavior of annealing on two problems: graph coloring and number partitioning. These were chosen because both were problems to which annealing was generally believed to be unsuited.

For graph coloring, three different move sets were tried for the annealing, and were compared with three different heuristics, as well as with a modification (``XRLF'') of one of the heuristics. Like annealing, XRLF allows a tradeoff between run time and solution quality. In short, different algorithms were best on different graph sizes and different graph types (for example, as the edge probability of a random graph was varied). None of the annealing move sets dominated the others. In almost all cases, either XRLF or a version of annealing worked best; the more conventional heuristics were generally not competitive.
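For concreteness, one classic coloring neighborhood is the Kempe-chain move: pick a vertex and a target color, find the connected component of vertices carrying those two colors, and swap the two colors along it, which preserves the properness of the coloring. The sketch below is our own illustration of such a move, not code from the study:

```python
def kempe_chain_move(adj, colors, v, new_color):
    """Swap colors[v] and new_color along the Kempe chain containing v.
    The chain is the connected component of v in the subgraph induced
    by vertices colored with either of the two colors involved."""
    old = colors[v]
    if old == new_color:
        return dict(colors)
    # Depth-first search restricted to the two colors.
    chain, stack = {v}, [v]
    while stack:
        u = stack.pop()
        for w in adj[u]:
            if w not in chain and colors[w] in (old, new_color):
                chain.add(w)
                stack.append(w)
    # Exchange the two colors on every vertex of the chain.
    result = dict(colors)
    for u in chain:
        result[u] = new_color if colors[u] == old else old
    return result

# Usage: on the path 0-1-2 colored 0,1,0, recoloring vertex 0 to color 1
# drags the whole chain along, yielding 1,0,1 (still a proper coloring).
adj = {0: [1], 1: [0, 2], 2: [1]}
```

Because the move never creates a bad edge, an annealing run built on this neighborhood can work with proper colorings throughout, trading the number of colors against run time.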

For the number partitioning problem the situation is quite different. A key property of the problem is that, at least with the natural move set used by Johnson, there are many states of quite low energy, separated by tall energy barriers. This is close to a worst case for SA, for two reasons: a low temperature does not allow the barriers to be crossed, and a high temperature does not distinguish between the minima. Accordingly, Johnson's experiments showed this to be one case where annealing was actually no better than repeated descent, that is, annealing at temperature zero. It was also observed that during annealing on this problem, the last state seen was often not the best state ever seen, which is a rather strange outcome.
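The natural move set referred to above flips a single number from one side of the partition to the other, with the energy of a state being the absolute difference of the two subset sums. The sketch below implements the temperature-zero baseline, repeated descent; the instance and parameters are ours, chosen only to illustrate.

```python
import random

def np_energy(nums, side):
    """Energy of a partition: absolute difference of the two subset sums.
    side[i] is 0 or 1, naming the subset that nums[i] belongs to."""
    return abs(sum(x if s == 0 else -x for x, s in zip(nums, side)))

def repeated_descent(nums, restarts=20, steps=500, seed=0):
    """Temperature-zero 'annealing': accept only non-worsening single-number
    flips, restarting from random partitions. A plausible baseline in the
    spirit of the comparison above, not Johnson's exact setup."""
    rng = random.Random(seed)
    best = None
    for _ in range(restarts):
        side = [rng.randrange(2) for _ in nums]
        e = np_energy(nums, side)
        for _ in range(steps):
            i = rng.randrange(len(nums))
            side[i] ^= 1                 # flip one number to the other side
            e2 = np_energy(nums, side)
            if e2 <= e:
                e = e2                   # keep non-worsening moves
            else:
                side[i] ^= 1             # undo worsening moves
        best = e if best is None else min(best, e)
    return best
```

Note how a single flip of a number x changes the energy by up to 2x: even near-optimal states sit close to high-energy neighbors, which is exactly the barrier structure that defeats annealing here.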

After reading Johnson's studies, the impression we were left with was that the question of whether annealing is better or worse than standard (or conventional) heuristics is not a simple one. Rather, the answer changes depending on a wide variety of factors, including details of the heuristic algorithms, the choice of the annealing move set, and the exact nature of the problem instances to be solved.






HPF demo web account
Mon Nov 18 19:45:42 EST 1996