3.2 A simple network with two layers

The neural network on this page consists of a total of four neurons, arranged in two so-called layers placed one after the other. The data flows from left to right through the neural network.

Since the neural network is to be trained with the same sample data sets as the network with one neuron, there are again two inputs (x1 and x2) and one output o. The first layer contains three neurons in parallel, each of which is connected to the single neuron in the second layer, the output layer.

After training, this neural network generates the desired output for the patterns of the two classes 0 and 1 slightly better than the network with a single neuron. However, this is not always the case, as the success of the training also depends on the randomly chosen starting values for the weights and thresholds.

The calculation of the neural network can be displayed by clicking on one of the four patterns on the left in the interactive figure. The selected pattern is highlighted in green. In the first layer, the activation of each of the three neurons is, as usual, x1⋅w1 + x2⋅w2. The activation of the output neuron, however, is obtained by multiplying the three weights of the second layer by the outputs of the three neurons in the first layer. These outputs of the first-layer neurons are displayed in dashed boxes. The whole calculation can also be checked by hand: the output of each neuron can be recomputed using the small calculator at the bottom of the page. As in the previous unit, the difference (activation minus threshold) must be entered as the x-value.
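The following Python sketch mirrors this calculation for a two-layer network with three hidden neurons and one output neuron. It assumes a logistic (sigmoid) activation function and uses made-up weights and thresholds purely for illustration; the actual values are the ones shown in the interactive figure after training.

```python
import math

def sigmoid(x):
    """Logistic activation function; assumed here -- the unit's
    calculator may use a different function."""
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(inputs, weights, threshold):
    """Weighted sum of the inputs, minus the threshold, fed through the
    activation function. The difference (activation minus threshold) is
    exactly the x-value entered into the calculator."""
    activation = sum(i * w for i, w in zip(inputs, weights))
    return sigmoid(activation - threshold)

def forward(x1, x2, hidden_params, output_params):
    """Forward pass of the two-layer network: three hidden neurons,
    whose outputs feed the single output neuron."""
    hidden_outputs = [
        neuron_output([x1, x2], weights, threshold)
        for weights, threshold in hidden_params
    ]
    out_weights, out_threshold = output_params
    return neuron_output(hidden_outputs, out_weights, out_threshold)

# Illustrative (made-up) weights and thresholds -- after training, the
# interactive figure will show different values.
hidden_params = [([0.8, -0.3], 0.2),   # neuron 1: (w1, w2), threshold
                 ([-0.5, 0.9], -0.1),  # neuron 2
                 ([0.4, 0.4], 0.5)]    # neuron 3
output_params = ([1.2, -0.7, 0.9], 0.3)

print(forward(1.0, 0.0, hidden_params, output_params))
```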

Instructions

  • Use the checkboxes to select which training data should be loaded.
  • Click New to load a data set: the pre-selected values for a simple separation, a new randomly chosen Boolean function, new random numbers, or the circle values.
  • Click Train to start the training.
  • At the end of the training, click Train again to continue training.
  • Click on the numerical values on the left-hand side of the interactive figure. The selected input pattern is highlighted in green, and the calculation of the neural network for this pattern is displayed.
  • Click in the empty space in the figure to hide the calculation again.
  • Use the calculation of the activation function further down on this page to recalculate on your own. Enter the result of the subtraction (activation minus threshold) as the x-value.
  • In the figure on the left, blue stands for negative values and red stands for positive values.

Tasks

  • Load many different examples and train the neural network with the given patterns. Try to understand how the output is calculated and how the neural network manages to calculate the correct class (0 or 1) for a pattern.
  • Recalculate at least one example manually. Note that all values shown in the figure are rounded!
  • For a few different Boolean functions, think about suitable values for the weights and the threshold so that the neural network would work as well as possible. Note them down on paper, then run the training and check whether your suggestions were correct (a small sketch of such a check follows this list).
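As a hedged illustration of the last task: the sketch below checks one possible hand-chosen guess for the Boolean AND function with a single neuron (w1 = w2 = 1, threshold 1.5). These values are an assumption for illustration, not the values the training will find, and the sigmoid activation is likewise assumed.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hand-chosen guess for Boolean AND: only when both inputs are 1 does
# the weighted sum (x1*w1 + x2*w2 = 2) exceed the threshold.
w1, w2, threshold = 1.0, 1.0, 1.5

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    out = sigmoid(x1 * w1 + x2 * w2 - threshold)
    predicted_class = 1 if out > 0.5 else 0
    print(f"x1={x1}, x2={x2}  ->  output={out:.2f}, class={predicted_class}")
```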

The example from the last data set (circle) is particularly well suited to show why the neural network with two layers and more neurons is better at separating the 0-patterns from the 1-patterns: unlike a single neuron, which can only generate a single separating line, several neurons together can generate more complex separations. This is especially helpful for classifying real-world data, which is usually not as simply structured as the introductory examples on this page.
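The following sketch illustrates this idea under simplified assumptions: three hidden neurons, each implementing one separating line with a hard threshold (instead of the smooth activation function used in the unit), together enclose a region around the origin, and the output neuron fires only when a point lies on the inner side of all three lines. The geometry and all values are made up for illustration.

```python
import math

def step(x):
    """Hard threshold, used here instead of a smooth activation
    to make the geometry obvious."""
    return 1 if x > 0 else 0

# Three hidden neurons, each implementing one separating line:
# a neuron fires when the point lies on the inner side of its line.
# The three lines together enclose a triangular region around the
# 1-patterns in the middle (made-up geometry).
directions = [(math.cos(a), math.sin(a))
              for a in (0.0, 2 * math.pi / 3, 4 * math.pi / 3)]
line_offset = 1.0  # distance of each line from the origin

def classify(x1, x2):
    hidden = [step(line_offset - (dx * x1 + dy * x2)) for dx, dy in directions]
    # Output neuron: fires only if all three hidden neurons fire,
    # i.e. the weighted sum 1 + 1 + 1 = 3 exceeds the threshold 2.5.
    return step(sum(hidden) - 2.5)

print(classify(0.0, 0.0))   # inside the enclosed region -> 1
print(classify(2.0, 0.0))   # far outside -> 0
```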

Calculation of the activation function:
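The interactive calculator itself is not reproduced in this text. Assuming the activation function is the logistic (sigmoid) function, its result can be recomputed with the short sketch below, where x is the difference (activation minus threshold); if the unit uses a different activation function, substitute it accordingly.

```python
import math

def activation_function(x):
    """Logistic (sigmoid) function -- an assumption for this sketch."""
    return 1.0 / (1.0 + math.exp(-x))

# x is the difference (activation minus threshold), e.g. 0.5:
print(activation_function(0.5))   # approximately 0.62
```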
