Class Relu
java.lang.Object
com.jml.neural_network.activations.Relu
- All Implemented Interfaces:
ActivationFunction
The ReLU (Rectified Linear Unit) activation function, defined element-wise as f(x) = max(0, x).
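A minimal sketch of what forward and back compute, using plain double[] arrays rather than the library's linalg.Matrix type (which is not reproduced here); the class and method names below are illustrative, not part of this API.

```java
import java.util.Arrays;

public class ReluSketch {

    // Forward pass: f(x) = max(0, x), applied element-wise.
    static double[] forward(double[] data) {
        double[] out = new double[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = Math.max(0.0, data[i]);
        }
        return out;
    }

    // Derivative: f'(x) = 1 for x > 0, else 0. The subgradient at
    // x = 0 is taken to be 0 here; implementations vary on this point.
    static double[] back(double[] data) {
        double[] out = new double[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = data[i] > 0 ? 1.0 : 0.0;
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {-2.0, 0.0, 3.0};
        System.out.println(Arrays.toString(forward(x))); // prints [0.0, 0.0, 3.0]
        System.out.println(Arrays.toString(back(x)));    // prints [0.0, 0.0, 1.0]
    }
}
```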
Field Summary

Constructor Summary

Method Summary
Modifier and Type | Method | Description
linalg.Matrix | back(linalg.Matrix data) | Applies the derivative of the ReLU activation function to a matrix element-wise.
linalg.Matrix | forward(linalg.Matrix data) | Applies the ReLU activation function to a matrix element-wise.
java.lang.String | getName() | Gets the name of the activation function.
Field Details

NAME
- See Also:
- Constant Field Values
Constructor Details

Relu
public Relu()
Method Details
forward
public linalg.Matrix forward(linalg.Matrix data)
Applies the ReLU activation function to a matrix element-wise.
- Specified by:
forward in interface ActivationFunction
- Parameters:
data - Matrix to apply the ReLU activation function to.
- Returns:
The result of the ReLU activation function applied to the data matrix.
back
public linalg.Matrix back(linalg.Matrix data)
Applies the derivative of the ReLU activation function to a matrix element-wise.
- Specified by:
back in interface ActivationFunction
- Parameters:
data - Matrix to apply the derivative of the ReLU activation function to.
- Returns:
The result of the derivative of the ReLU activation function applied to the data matrix.
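In a backpropagation step, back is typically combined with an upstream gradient via an element-wise product. The sketch below illustrates that chain-rule use with plain double[] arrays; the names layerInput and upstreamGrad are hypothetical and not part of this library's API.

```java
import java.util.Arrays;

public class ReluChainRule {

    // ReLU derivative: 1 where the input was positive, 0 elsewhere.
    static double[] reluDerivative(double[] data) {
        double[] out = new double[data.length];
        for (int i = 0; i < data.length; i++) {
            out[i] = data[i] > 0 ? 1.0 : 0.0;
        }
        return out;
    }

    // Chain rule: dL/dx = f'(x) * dL/dy, element-wise (Hadamard product).
    static double[] backprop(double[] layerInput, double[] upstreamGrad) {
        double[] d = reluDerivative(layerInput);
        double[] out = new double[d.length];
        for (int i = 0; i < d.length; i++) {
            out[i] = d[i] * upstreamGrad[i];
        }
        return out;
    }

    public static void main(String[] args) {
        double[] x = {-1.0, 2.0};   // inputs seen in the forward pass
        double[] g = {0.5, 0.5};    // gradient arriving from the next layer
        System.out.println(Arrays.toString(backprop(x, g))); // prints [0.0, 0.5]
    }
}
```

The negative input's gradient is zeroed out, which is the source of the "dying ReLU" behavior sometimes discussed for this activation.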
getName
public java.lang.String getName()
Gets the name of the activation function.
- Specified by:
getName in interface ActivationFunction
- Returns:
The name of the activation function as a String.