Package libai.nn.supervised

Class MultiLayerPerceptron

java.lang.Object
  libai.nn.NeuralNetwork
    libai.nn.supervised.SupervisedLearning
      libai.nn.supervised.MultiLayerPerceptron

All Implemented Interfaces:
  java.io.Serializable

public class MultiLayerPerceptron
extends SupervisedLearning
Multi Layer Perceptron, or MLP. The MLP was the first algorithm proposed to train networks of multiple layers of neurons, using the generalized delta rule. This implementation supports multiple backpropagation variants through the Backpropagation interface; check the package libai.nn.supervised.backpropagation for more details about the supported implementations.

See Also:
  Serialized Form
Field Summary

Fields inherited from class libai.nn.NeuralNetwork
  plotter, progress, random
Constructor Summary

Constructors:
  MultiLayerPerceptron(int[] nperlayer, Function[] funcs)
    Constructor.
  MultiLayerPerceptron(int[] nperlayer, Function[] funcs, java.util.Random rand)
    Constructor.
  MultiLayerPerceptron(int[] nperlayer, Function[] funcs, Backpropagation trainer)
    Constructor.
  MultiLayerPerceptron(int[] nperlayer, Function[] funcs, Backpropagation trainer, java.util.Random rand)
    Constructor.
Method Summary

  Matrix[] getWeights()

  Column simulate(Column pattern)
    Calculates the output for the pattern.

  void simulate(Column pattern, Column result)
    Calculates the output for the pattern and leaves the result in result.

  void train(Column[] patterns, Column[] answers, double alpha, int epochs, int offset, int length, double minerror)
    Trains the network using the standard backpropagation algorithm.
Methods inherited from class libai.nn.NeuralNetwork
  error, error, euclideanDistance2, euclideanDistance2, gaussian, getDefaultRandomGenerator, getPlotter, getProgressBar, initializeProgressBar, open, open, open, save, setPlotter, setProgressBar, train, train

Methods inherited from class libai.nn.supervised.SupervisedLearning
  validatePreconditions

Methods inherited from class java.lang.Object
  clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail

MultiLayerPerceptron

public MultiLayerPerceptron(int[] nperlayer, Function[] funcs)

Constructor. Creates a MultiLayerPerceptron with nperlayer.length layers. The number of neurons per layer is given by nperlayer, where nperlayer[0] is the size of the input layer. The neurons of each layer i apply the output function funcs[i]; these functions must be differentiable. The training algorithm is standard backpropagation.

Parameters:
  nperlayer - Number of neurons per layer, including the input layer.
  funcs - Function to apply per layer; funcs[0] may be null.
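As an illustration of how nperlayer defines the network, the following sketch derives the weight-matrix dimensions implied by a layer-size array. This is not libai's code; it assumes the common convention that the matrix feeding layer i has one row per neuron of layer i and one column per neuron of layer i-1.

```java
// Sketch (not libai internals): weight-matrix shapes implied by nperlayer,
// assuming W[i] maps layer i-1 to layer i (rows x columns).
public class LayerShapes {
    public static int[][] shapes(int[] nperlayer) {
        // One weight matrix per non-input layer.
        int[][] dims = new int[nperlayer.length - 1][2];
        for (int i = 1; i < nperlayer.length; i++) {
            dims[i - 1][0] = nperlayer[i];     // neurons in layer i
            dims[i - 1][1] = nperlayer[i - 1]; // inputs coming from layer i-1
        }
        return dims;
    }

    public static void main(String[] args) {
        // A 2-3-1 network: two inputs, a hidden layer of three, one output.
        for (int[] d : shapes(new int[]{2, 3, 1}))
            System.out.println(d[0] + "x" + d[1]);
    }
}
```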
MultiLayerPerceptron

public MultiLayerPerceptron(int[] nperlayer, Function[] funcs, Backpropagation trainer)

Constructor. Creates a MultiLayerPerceptron with nperlayer.length layers. The number of neurons per layer is given by nperlayer, where nperlayer[0] is the size of the input layer. The neurons of each layer i apply the output function funcs[i]; these functions must be differentiable. A different implementation of the backpropagation algorithm can be provided through the trainer object.

Parameters:
  nperlayer - Number of neurons per layer, including the input layer.
  funcs - Function to apply per layer; funcs[0] may be null.
  trainer - The backpropagation implementation to be used during training.
MultiLayerPerceptron

public MultiLayerPerceptron(int[] nperlayer, Function[] funcs, java.util.Random rand)

Constructor. Creates a MultiLayerPerceptron with nperlayer.length layers. The number of neurons per layer is given by nperlayer, where nperlayer[0] is the size of the input layer. The neurons of each layer i apply the output function funcs[i]; these functions must be differentiable. The training algorithm is standard backpropagation. A Random generator can be passed to initialize the matrices.

Parameters:
  nperlayer - Number of neurons per layer, including the input layer.
  funcs - Function to apply per layer; funcs[0] may be null.
  rand - Random generator used for creating matrices.
MultiLayerPerceptron

public MultiLayerPerceptron(int[] nperlayer, Function[] funcs, Backpropagation trainer, java.util.Random rand)

Constructor. Creates a MultiLayerPerceptron with nperlayer.length layers. The number of neurons per layer is given by nperlayer, where nperlayer[0] is the size of the input layer. The neurons of each layer i apply the output function funcs[i]; these functions must be differentiable. A different backpropagation implementation can be provided along with a random generator to initialize the matrices.

Parameters:
  nperlayer - Number of neurons per layer, including the input layer.
  funcs - Function to apply per layer; funcs[0] may be null.
  trainer - The backpropagation implementation to be used during training.
  rand - Random generator used for creating matrices.
Method Detail

train

public void train(Column[] patterns, Column[] answers, double alpha, int epochs, int offset, int length, double minerror)

Trains the network using the standard backpropagation algorithm. Each pattern is propagated from the input layer to the final (output) layer, and the error of the final layer is computed. That error is then propagated backwards to the first hidden layer, calculating the differentials between the actual and the expected outputs (backpropagation). Finally, the weights and biases are updated using the delta rule with momentum:

  W[i] = W[i] + beta*(W[i] - Wprev[i]) - (1 - beta)*alpha*d[i]*Y[i-1]^t
  B[i] = B[i] + beta*(B[i] - Bprev[i]) - (1 - beta)*alpha*d[i]

Specified by:
  train in class NeuralNetwork

Parameters:
  patterns - The patterns to be learned.
  answers - The expected answers.
  alpha - The learning rate.
  epochs - The maximum number of iterations.
  offset - The position of the first pattern to use.
  length - How many patterns will be used.
  minerror - The minimal error expected.
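The weight update above can be sketched with plain arrays for a single layer. This is a minimal illustration of the formula, not libai's actual implementation: w stands for W[i], wPrev for Wprev[i], d for the already-computed error term d[i], and y for the previous layer's output Y[i-1].

```java
// Sketch of the momentum delta rule for one layer's weight matrix,
// using plain arrays instead of libai's Matrix type.
public class DeltaRule {
    public static double[][] update(double[][] w, double[][] wPrev,
                                    double[] d, double[] y,
                                    double alpha, double beta) {
        double[][] next = new double[w.length][w[0].length];
        for (int r = 0; r < w.length; r++)
            for (int c = 0; c < w[0].length; c++)
                // W[i] = W[i] + beta*(W[i]-Wprev[i]) - (1-beta)*alpha*d[i]*Y[i-1]^t
                next[r][c] = w[r][c] + beta * (w[r][c] - wPrev[r][c])
                           - (1 - beta) * alpha * d[r] * y[c];
        return next;
    }

    public static void main(String[] args) {
        double[][] w = {{0.5, -0.5}}, wPrev = {{0.4, -0.4}};
        double[] d = {0.1}, y = {1.0, 2.0};
        double[][] next = update(w, wPrev, d, y, 0.5, 0.5);
        System.out.println(next[0][0] + " " + next[0][1]);
    }
}
```

With beta = 0 the rule degenerates to plain gradient descent scaled by alpha; the beta*(W[i] - Wprev[i]) term is what carries momentum from the previous step.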
simulate

public Column simulate(Column pattern)

Description copied from class: NeuralNetwork
Calculates the output for the pattern.

Specified by:
  simulate in class NeuralNetwork

Parameters:
  pattern - Pattern to use as input.
Returns:
  The output of the neural network.
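What simulate computes is the usual MLP forward pass: for each non-input layer i, y = funcs[i](W[i]*y + B[i]). The sketch below illustrates this with plain arrays and a hard-coded sigmoid standing in for libai's Column, Matrix, and Function types; it is an assumption-laden illustration, not the library's code.

```java
import java.util.function.DoubleUnaryOperator;

// Sketch of an MLP forward pass: for each layer i, y = f(W*y + b).
// Plain arrays stand in for libai's Column/Matrix types; a sigmoid
// stands in for the Function objects passed to the constructor.
public class ForwardPass {
    static final DoubleUnaryOperator SIGMOID = x -> 1.0 / (1.0 + Math.exp(-x));

    public static double[] simulate(double[][][] w, double[][] b,
                                    double[] pattern, DoubleUnaryOperator f) {
        double[] y = pattern;
        for (int i = 0; i < w.length; i++) {           // one step per non-input layer
            double[] next = new double[w[i].length];
            for (int r = 0; r < w[i].length; r++) {
                double s = b[i][r];                    // bias of neuron r
                for (int c = 0; c < y.length; c++)
                    s += w[i][r][c] * y[c];            // weighted sum of inputs
                next[r] = f.applyAsDouble(s);          // layer's output function
            }
            y = next;
        }
        return y;
    }

    public static void main(String[] args) {
        // A 2-2-1 network with all-zero weights and biases: every neuron
        // outputs sigmoid(0) = 0.5 regardless of the input pattern.
        double[][][] w = {{{0, 0}, {0, 0}}, {{0, 0}}};
        double[][] b = {{0, 0}, {0}};
        System.out.println(simulate(w, b, new double[]{1, -1}, SIGMOID)[0]);
    }
}
```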
simulate

public void simulate(Column pattern, Column result)

Description copied from class: NeuralNetwork
Calculates the output for the pattern and leaves the result in result.

Specified by:
  simulate in class NeuralNetwork

Parameters:
  pattern - Pattern to use as input.
  result - The output for the input.
getWeights

public Matrix[] getWeights()