$./neural xor xortest
(this will open the SAVED NETWORK from the file "neuralnet.xor", and use the data file "training.xortest", to TEST a feedforward neural network previously trained with backpropagation)
In order to understand how the parameters of the feedforward neural network should be set, analyze the file "config.howto", detailed below:
#-----------------
2 - Number of Inputs of Input Pattern (for example 2 inputs for XOR)
1 - Number of Outputs of Output Pattern (in this case, one -binary- output)
2 - Number of Layers of the Neural Net (2 layers, one hidden and one output layer)
51 - Number of neurons of 1st hidden layer
4 - Number of neurons of 2nd hidden layer, etc. (the number of these lines depends on the third line, Number of Layers)
0.5 - Parameter momentum
0.3 - Parameter learning-rate
0.0001 - Parameter max_err (the training finishes when the maximum error for each pattern is smaller than this parameter)
#-----------------
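For concreteness, here is a plausible "config.xor" for the XOR example. This is an assumption: it supposes the actual config file follows exactly the line layout of "config.howto", and the parameter values (5 hidden neurons, 1 output neuron, momentum 0.9, learning rate 0.2, precision 0.01) are taken from the test run shown further below:
#-----------------
2 - Number of Inputs
1 - Number of Outputs
2 - Number of Layers
5 - Neurons in the hidden layer
1 - Neurons in the output layer
0.9 - Momentum
0.2 - Learning rate
0.01 - max_err
#-----------------
The momentum and learning-rate parameters play their usual backpropagation roles: each weight update is the learning rate times the error gradient, plus the momentum times the previous update.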
In order to understand how to prepare the data for training, analyze the file "training.howto", detailed below:
#-----------------
4 // First line: a single integer identifying the total number of training patterns, 4 in this case
2 // From the second line to the end of the file, each pattern is listed in turn. Each pattern starts
0.0 // with its number of inputs, TWO(2) here, followed by one input value per line, then its number
0.0 // of outputs, ONE(1) here, followed by one output value per line. So the first "XOR" pattern is
1 // (0.0,0.0)->(0.0): 2 inputs, 0.0 and 0.0, and one output, also 0.0.
0.0 // The FIRST training pattern ends here. The next line starts the second, then the third and fourth.
2
0.0
1.0
1
1.0
2
1.0
0.0
1
1.0
2
1.0
1.0
1
0.0
#-----------------
Please note that these files are commented to explain how to build your own config and training files.
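To make the format concrete, below is a minimal C sketch of a loader for such a training file. It only mirrors the layout described above; the struct and variable names are invented for this illustration and are not the project's actual code.
#-----------------
/* Minimal sketch of a loader for the training-file format described above.
   Illustrative only: names and structure are assumptions, not the project's code. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    int n_in, n_out;
    double *in, *out;
} Pattern;

int main(int argc, char **argv)
{
    FILE *f = fopen(argc > 1 ? argv[1] : "training.xor", "r");
    if (!f) { perror("fopen"); return 1; }

    int n_patterns;
    if (fscanf(f, "%d", &n_patterns) != 1 || n_patterns <= 0) {
        fprintf(stderr, "bad pattern count\n"); return 1;
    }
    Pattern *p = malloc(n_patterns * sizeof *p);

    for (int i = 0; i < n_patterns; i++) {
        fscanf(f, "%d", &p[i].n_in);            /* number of inputs of this pattern */
        p[i].in = malloc(p[i].n_in * sizeof(double));
        for (int j = 0; j < p[i].n_in; j++)
            fscanf(f, "%lf", &p[i].in[j]);      /* one input value per line */
        fscanf(f, "%d", &p[i].n_out);           /* number of outputs of this pattern */
        p[i].out = malloc(p[i].n_out * sizeof(double));
        for (int j = 0; j < p[i].n_out; j++)
            fscanf(f, "%lf", &p[i].out[j]);     /* one output value per line */
    }
    printf("read %d training patterns\n", n_patterns);
    fclose(f);
    return 0;
}
#-----------------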
After training, a "neuralnet" file is created, with the same "extension" used in the config and training files.
So, after executing "./neural xor", and if the training is successful, a "neuralnet.xor" file will be created, with all the trained weights, as follows:
#-----------------
2
5
1
2
2.552809
2.478980
2
-2.468708
0.932158
2
0.486090
-1.731878
2
-0.111488
-1.068038
2
-0.271861
0.937927
5
4.615818
2.032804
1.405843
1.562611
-0.485905
#-----------------
In this case, the feedforward neural network has 2 layers (first line), 5 neurons on the first hidden layer (second line), and 1 neuron on the last (output) layer (third line). Each following block then describes one neuron: the number of its incoming weights, followed by the weight values themselves.
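Assuming that layout (layer count, neurons per layer, then one block per neuron giving its weight count followed by the weights), a reader for the file might look like the C sketch below; again, the names are illustrative, not the project's actual loader.
#-----------------
/* Sketch of a reader for the "neuralnet.<ext>" weight file as laid out above.
   Assumed format, for illustration only. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *f = fopen("neuralnet.xor", "r");
    if (!f) { perror("fopen"); return 1; }

    int n_layers;
    fscanf(f, "%d", &n_layers);                 /* first line: number of layers */
    int *neurons = malloc(n_layers * sizeof(int));
    for (int l = 0; l < n_layers; l++)
        fscanf(f, "%d", &neurons[l]);           /* neurons in each layer */

    for (int l = 0; l < n_layers; l++)
        for (int n = 0; n < neurons[l]; n++) {
            int n_w;
            fscanf(f, "%d", &n_w);              /* weight count for this neuron */
            for (int w = 0; w < n_w; w++) {
                double weight;
                fscanf(f, "%lf", &weight);
                printf("layer %d, neuron %d, weight %d: %f\n", l, n, w, weight);
            }
        }
    free(neurons);
    fclose(f);
    return 0;
}
#-----------------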
By executing "./neural xor xortest2" it is possible to see that the trained network generalizes really badly: for the input (0.1, 0.1) the expected output is 0, but the network answers 0.894627.
#----------------
$ ./neural xor xortest2
Argc= 3
Configuration file: config.xor
Number of Inputs : 2
Number of Outputs : 1
Number of Layers of the Net: 2
Number of Neurons on Layer 0 : 5
Number of Neurons on Layer 1 : 1
Momentum: 0.900000
Learning Rate: 0.200000
Precision: 0.010000
Number of Neurons for level 0 are 5
Number of Neurons for level 1 are 1
Training Pattern
In Part
In = 0.1
In = 0.1
In = 1
Out Part
Out = 0
Remembered Pattern
In Part
In = 0.1
In = 0.1
In = 1
Out Part
Out = 0.894627
#----------------