
Other Variants and Models of Neural Networks

One modification of the Hopfield and Tank model for optimization problems is the use of simulated annealing to perform a discretized version of the local optimization, as in the Boltzmann machine approach of Aarts and Korst [11]. A key factor holding these variants back is the integer programming formulation of Figure 10, which shows that a large portion of a neural net's work must be spent merely obtaining a feasible solution.
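
To make the discretized annealing step concrete, the following is a minimal sketch, in Python, of a Boltzmann-machine-style update, assuming the network's energy is E(s) = -(1/2) s^T W s - theta^T s over binary states s in {0,1}^n with a symmetric weight matrix W. The weight matrix, biases, and geometric cooling schedule are illustrative assumptions, not details drawn from Aarts and Korst [11].

    import math
    import random

    def boltzmann_anneal(W, theta, steps=10000, t_start=10.0, t_end=0.01):
        # Illustrative sketch: W is a symmetric n x n list of lists with
        # zero diagonal, theta a length-n list of biases (assumed setup).
        n = len(theta)
        s = [random.randint(0, 1) for _ in range(n)]  # random initial state
        for k in range(steps):
            # geometric cooling from t_start down toward t_end
            t = t_start * (t_end / t_start) ** (k / steps)
            i = random.randrange(n)
            # energy change if neuron i is flipped:
            # delta_E = -(1 - 2*s_i) * (sum_j W_ij s_j + theta_i)
            field = sum(W[i][j] * s[j] for j in range(n)) + theta[i]
            delta_e = -(1 - 2 * s[i]) * field
            # Metropolis rule: always accept downhill moves, accept uphill
            # moves with the Boltzmann probability exp(-delta_E / t)
            if delta_e <= 0 or random.random() < math.exp(-delta_e / t):
                s[i] = 1 - s[i]
        return s

Whether such a state encodes a feasible tour still depends on the penalty terms built into W and theta, which is exactly the weakness the integer programming formulation of Figure 10 exposes.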

There are also two variants of neural nets that can be classified as ``geometric'' neural networks: the elastic net of Durbin and Willshaw [36] and the self-organizing map, which is based on ideas of Kohonen [37]. When applied to the TSP, these networks can be viewed as a string connecting a set of M points on the plane, as shown in sketch A of Figure 13, such that M >= N, where N is the number of cities. The idea is to move these points iteratively toward the cities, thus deforming the geometric figure formed by the connecting string, as shown in sketch B of Figure 13, and to continue this process until the shape of the string looks like a tour, with each city matched with one of the M points. One such iteration is sketched below. See also the work of Durbin, Szeliski and Yuille [38] for more details on these networks.
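
The following is a minimal Python sketch of one iteration in the spirit of the elastic net, assuming cities and string points are given as 2-D coordinates. The Gaussian weighting, the parameters alpha, beta, and K, and their values are illustrative assumptions rather than the exact scheme of Durbin and Willshaw [36].

    import math

    def elastic_net_step(cities, points, K, alpha=0.2, beta=2.0):
        # cities: list of (x, y) city coordinates; points: the M string
        # points forming a closed ring (assumed representation)
        m = len(points)
        moves = [[0.0, 0.0] for _ in range(m)]
        for cx, cy in cities:
            # soft assignment of this city to every string point,
            # weighted by a Gaussian of the city-to-point distance
            w = [math.exp(-((cx - px) ** 2 + (cy - py) ** 2) / (2 * K * K))
                 for px, py in points]
            total = sum(w) or 1.0
            for j, (px, py) in enumerate(points):
                moves[j][0] += alpha * (w[j] / total) * (cx - px)
                moves[j][1] += alpha * (w[j] / total) * (cy - py)
        for j, (px, py) in enumerate(points):
            # elastic tension pulls each point toward its ring neighbours,
            # keeping the string short and smooth
            lx, ly = points[j - 1]
            rx, ry = points[(j + 1) % m]
            moves[j][0] += beta * K * (lx + rx - 2 * px)
            moves[j][1] += beta * K * (ly + ry - 2 * py)
        return [(px + dx, py + dy)
                for (px, py), (dx, dy) in zip(points, moves)]

In the full method this step would be repeated while gradually shrinking K (for example, multiplying it by 0.99 each pass), so that the soft city-to-point assignments sharpen into the one-to-one matching described above.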

Johnson and colleagues experimented with these geometric models on the TSP, and the best tours they obtained were far worse than the ``average'' tours obtained using the 2-Opt and Lin-Kernighan heuristics.

  
Figure 13: A geometric-based neural network. Each empty circle denotes a city. Sketch A shows the start of the execution and sketch B shows an intermediate stage in the execution.


