Results

Lang and Witbrock report results obtained on this problem using a 2-5-5-5-1 network (a two-node input layer, three five-node hidden layers, and a one-node output layer) with short-cut connections: each unit is connected to every unit in all earlier layers, not just to units in the previous layer. For this experiment, we use a #tex2html_wrap_inline3233# network, also with short-cut connections. The network is run three times over the same input data; the first run uses #tex2html_wrap_inline3235#, the second #tex2html_wrap_inline3237#, and the third #tex2html_wrap_inline3239# activation functions. For each run, the network weights are initialized with uniformly distributed random values in the range #tex2html_wrap_inline3241# to #tex2html_wrap_inline3243#. The learning rate and momentum are set to #tex2html_wrap_inline3245# and #tex2html_wrap_inline3247#, respectively, and the error bound is set to #tex2html_wrap_inline3249#. The network is saved to a file periodically, every 100,000 iterations. Table 1 summarizes the testing and learning times. Note that the elapsed compute and system times are measured in seconds, and average performance on the task is measured in connections per second.
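The short-cut topology described above can be sketched as follows. This is a minimal illustration, not the experiment's actual code: the layer sizes [2, 5, 5, 5, 1] come from the Lang and Witbrock description, while the initialization range [-0.1, 0.1] and the tanh activation are placeholder assumptions (the paper's actual settings are given in the text above).

```python
import numpy as np

def init_shortcut_net(layer_sizes, lo=-0.1, hi=0.1, rng=None):
    """Build weights for a feed-forward net with short-cut connections:
    each layer receives input from ALL earlier layers, not just the
    previous one.  The range [lo, hi] is an assumed placeholder."""
    rng = np.random.default_rng(0) if rng is None else rng
    weights, biases = [], []
    fan_in = 0
    for i, n_out in enumerate(layer_sizes[1:]):
        fan_in += layer_sizes[i]          # accumulate all earlier layers
        weights.append(rng.uniform(lo, hi, size=(fan_in, n_out)))
        biases.append(rng.uniform(lo, hi, size=n_out))
    return weights, biases

def forward(x, weights, biases, act=np.tanh):
    """Forward pass: each layer sees the concatenation of the input and
    every earlier layer's activations (tanh is an assumed choice)."""
    acts = [x]
    for W, b in zip(weights, biases):
        z = np.concatenate(acts, axis=-1) @ W + b
        acts.append(act(z))
    return acts[-1]

def n_connections(weights, biases):
    """Count trainable parameters (used by the connections-per-second
    performance metric mentioned in the text)."""
    return sum(W.size for W in weights) + sum(b.size for b in biases)
```

For the 2-5-5-5-1 topology this yields 10+35+60+17 = 122 weights plus 16 biases, i.e. 138 connections, which is what the short-cut wiring buys over a plain layered net of the same sizes.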

Table 1: Performance Figures for Spirals