Package smile.base.mlp
Class HiddenLayer
java.lang.Object
smile.base.mlp.Layer
smile.base.mlp.HiddenLayer
- All Implemented Interfaces:
Serializable
A hidden layer in the neural network.
Field Summary
Fields inherited from class smile.base.mlp.Layer
bias, biasGradient, biasGradientMoment1, biasGradientMoment2, biasUpdate, dropout, mask, n, output, outputGradient, p, weight, weightGradient, weightGradientMoment1, weightGradientMoment2, weightUpdate
Constructor Summary
Constructors
Constructor    Description
HiddenLayer(int n, int p, double dropout, ActivationFunction activation)    Constructor.
Method Summary
Modifier and Type    Method    Description
void    backpropagate(double[] lowerLayerGradient)    Propagates the errors back to a lower layer.
String    toString()
void    transform(double[] x)    The activation or output function.
Methods inherited from class smile.base.mlp.Layer
backpopagateDropout, builder, computeGradient, computeGradientUpdate, getInputSize, getOutputSize, gradient, input, input, leaky, leaky, leaky, linear, linear, mle, mse, of, output, propagate, propagateDropout, rectifier, rectifier, sigmoid, sigmoid, tanh, tanh, update
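In practice, hidden layers are usually described with the static factory methods inherited from Layer rather than by instantiating HiddenLayer directly. The sketch below is not taken from this page: it assumes these factories (e.g. rectifier, tanh) return smile.base.mlp.LayerBuilder objects and that the overloads with a second argument take the dropout rate; check both assumptions against your Smile version.

// Sketch only: assumes Layer.rectifier(int) and Layer.tanh(int, double) return
// LayerBuilder descriptions of hidden layers (the double argument is assumed to
// be the dropout rate).
import smile.base.mlp.Layer;
import smile.base.mlp.LayerBuilder;

public class HiddenLayerBuilders {
    public static void main(String[] args) {
        LayerBuilder relu = Layer.rectifier(64);     // 64 ReLU neurons
        LayerBuilder tanh = Layer.tanh(32, 0.5);     // 32 tanh neurons, assumed 50% dropout
        System.out.println(relu + ", " + tanh);
    }
}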
Constructor Details
HiddenLayer
public HiddenLayer(int n, int p, double dropout, ActivationFunction activation)
Constructor.
Parameters:
n - the number of neurons.
p - the number of input variables (not including bias value).
dropout - the dropout rate.
activation - the activation function.
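As a concrete illustration of the constructor, the sketch below builds a hidden layer of 100 ReLU neurons over 784 inputs with 20% dropout. It assumes ActivationFunction exposes a static rectifier() factory; any other ActivationFunction instance would work the same way.

// Minimal sketch of direct construction. ActivationFunction.rectifier() is an
// assumption about the factory name; substitute any ActivationFunction instance.
import smile.base.mlp.ActivationFunction;
import smile.base.mlp.HiddenLayer;

public class HiddenLayerExample {
    public static void main(String[] args) {
        // 100 neurons, 784 input variables (bias excluded), 20% dropout, ReLU activation.
        HiddenLayer layer = new HiddenLayer(100, 784, 0.2, ActivationFunction.rectifier());
        System.out.println(layer);
    }
}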
Method Details
toString
public String toString()
transform
public void transform(double[] x)
Description copied from class: Layer
The activation or output function.
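The sketch below illustrates transform under the assumption that it applies the layer's activation function to the given vector in place; with a ReLU layer, negative entries would become zero.

// Sketch only: assumes transform(double[]) applies the activation in place and
// that ActivationFunction.rectifier() exists in this Smile version.
import java.util.Arrays;
import smile.base.mlp.ActivationFunction;
import smile.base.mlp.HiddenLayer;

public class TransformExample {
    public static void main(String[] args) {
        HiddenLayer layer = new HiddenLayer(3, 3, 0.0, ActivationFunction.rectifier());
        double[] preactivation = {-1.0, 0.5, 2.0};
        layer.transform(preactivation);
        // Under the assumptions above, prints [0.0, 0.5, 2.0].
        System.out.println(Arrays.toString(preactivation));
    }
}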
backpropagate
public void backpropagate(double[] lowerLayerGradient)
Description copied from class: Layer
Propagates the errors back to a lower layer.
Specified by:
backpropagate in class Layer
Parameters:
lowerLayerGradient - the gradient vector of the lower layer.