Package smile.base.mlp
Class OutputLayer
java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.OutputLayer
All Implemented Interfaces:
Serializable

public class OutputLayer extends Layer implements Serializable

The output layer in the neural network.
Field Summary
Fields inherited from class smile.base.mlp.Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
OutputLayer(int n, int p, OutputFunction activation, Cost cost)
Constructor.
Method Summary
void backpropagate(double[] lowerLayerGradient)
Propagates the errors back to a lower layer.
void computeOutputGradient(double[] target, double weight)
Computes the network output gradient.
Cost cost()
Returns the cost function of the neural network.
String toString()
void transform(double[] x)
The activation or output function.
Methods inherited from class smile.base.mlp.Layer
backpopagateDropout, builder, computeGradient, computeGradientUpdate, getInputSize, getOutputSize, gradient, input, input, leaky, leaky, leaky, linear, linear, mle, mse, of, output, propagate, propagateDropout, rectifier, rectifier, sigmoid, sigmoid, tanh, tanh, update
Constructor Details
OutputLayer
public OutputLayer(int n, int p, OutputFunction activation, Cost cost)
Constructor.
Parameters:
n - the number of neurons.
p - the number of input variables (not including the bias value).
activation - the output activation function.
cost - the cost function.
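For example, a softmax output layer for 3-class classification over 20 inputs might be constructed as in the sketch below. This is a minimal illustration; the enum constants OutputFunction.SOFTMAX and Cost.LIKELIHOOD are assumed to be available in smile.base.mlp and are not documented on this page.

import smile.base.mlp.Cost;
import smile.base.mlp.OutputFunction;
import smile.base.mlp.OutputLayer;

// A softmax output layer: 3 neurons fed by 20 input variables,
// trained against the likelihood cost (assumed enum constants).
OutputLayer output = new OutputLayer(3, 20, OutputFunction.SOFTMAX, Cost.LIKELIHOOD);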
Method Details
toString
public String toString()
Overrides:
toString in class Object
cost
public Cost cost()
Returns the cost function of the neural network.
Returns:
the cost function.
transform
public void transform(double[] x)
Description copied from class: Layer
The activation or output function.
backpropagate
public void backpropagate(double[] lowerLayerGradient)
Description copied from class: Layer
Propagates the errors back to a lower layer.
Specified by:
backpropagate in class Layer
Parameters:
lowerLayerGradient - the gradient vector of the lower layer.
computeOutputGradient
public void computeOutputGradient(double[] target, double weight)
Computes the network output gradient.
Parameters:
target - the desired output.
weight - a positive weight value associated with the training instance.
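Taken together with the inherited propagate and gradient methods listed in the method summary above, one forward/backward pass through this layer might look like the sketch below. This is an assumption-laden illustration, not the library's training loop; in particular, it assumes that backpropagate writes the errors for the lower layer into the array it is given.

// Sketch of one forward/backward pass through the output layer.
// 'output' is the OutputLayer built in the constructor example above;
// x holds the activations of the lower layer and target is a
// one-hot encoded class label.
double[] x = new double[20];
double[] target = {0.0, 1.0, 0.0};
double[] lowerLayerGradient = new double[20];

output.propagate(x);                        // forward pass (inherited from Layer)
output.computeOutputGradient(target, 1.0);  // gradient of the cost at the output
output.backpropagate(lowerLayerGradient);   // errors pushed back to the lower layer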