Class Relu
java.lang.Object
com.jml.neural_network.activations.Relu
- All Implemented Interfaces:
ActivationFunction
The ReLU (Rectified Linear Unit) activation function: f(x) = max(0, x), applied element-wise.
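As a quick illustration of that element-wise rule, the sketch below applies max(0, x) to each entry of a small array in plain Java, independent of the linalg.Matrix type this class operates on; the class name ReluRuleDemo is only a placeholder for the example.

public class ReluRuleDemo {
    public static void main(String[] args) {
        double[] x = {-2.0, -0.5, 0.0, 0.5, 3.0};
        double[] fx = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            fx[i] = Math.max(0.0, x[i]); // negative entries map to 0, non-negative entries pass through unchanged
        }
        System.out.println(java.util.Arrays.toString(fx)); // prints [0.0, 0.0, 0.0, 0.5, 3.0]
    }
}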
Field Summary
Fields
Constructor Summary
Constructors
Method Summary
Modifier and Type    Method    Description
linalg.Matrix    back(linalg.Matrix data)    Applies the derivative of the ReLU activation function to a matrix element-wise.
linalg.Matrix    forward(linalg.Matrix data)    Applies the ReLU activation function to a matrix element-wise.
java.lang.String    getName()    Gets the name of the activation function.
Field Details
NAME
- See Also:
- Constant Field Values
Constructor Details
Relu
public Relu()
Method Details
forward
public linalg.Matrix forward(linalg.Matrix data)
Applies the ReLU activation function to a matrix element-wise.
- Specified by:
forward in interface ActivationFunction
- Parameters:
data - Matrix to apply the ReLU activation function to.
- Returns:
The result of the ReLU activation function applied to the data matrix.
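A hypothetical usage sketch for forward follows. The Relu constructor and the forward call match this page; constructing a linalg.Matrix from a double[][] is an assumption about the linalg library (not documented here) and may need adjusting to its actual API, and ForwardSketch is a placeholder class name.

import com.jml.neural_network.activations.Relu;

public class ForwardSketch {
    public static void main(String[] args) {
        // Assumption: linalg.Matrix can be built from a double[][]; verify against the linalg documentation.
        linalg.Matrix data = new linalg.Matrix(new double[][]{
                {-1.0,  2.0},
                { 0.5, -3.0}
        });

        Relu relu = new Relu();
        linalg.Matrix activated = relu.forward(data); // element-wise max(0, x): negative entries become 0
        System.out.println(activated);
    }
}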
back
public linalg.Matrix back(linalg.Matrix data)
Applies the derivative of the ReLU activation function to a matrix element-wise.
- Specified by:
back in interface ActivationFunction
- Parameters:
data - Matrix to apply the derivative of the ReLU activation function to.
- Returns:
The result of the derivative of the ReLU activation function applied to the data matrix.
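Element-wise, the derivative that back applies is 1 for positive entries and 0 for negative entries; the value used at exactly 0 is not stated on this page, so the plain-Java sketch below assumes the common convention of 0 there (ReluDerivativeDemo is a placeholder name, and the sketch does not depend on linalg.Matrix).

public class ReluDerivativeDemo {
    public static void main(String[] args) {
        double[] x = {-2.0, -0.5, 0.0, 0.5, 3.0};
        double[] dfdx = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            // Slope of max(0, x): 1 where x > 0, 0 where x < 0; 0 assumed at x == 0.
            dfdx[i] = x[i] > 0 ? 1.0 : 0.0;
        }
        System.out.println(java.util.Arrays.toString(dfdx)); // prints [0.0, 0.0, 0.0, 1.0, 1.0]
    }
}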
getName
public java.lang.String getName()
Gets the name of the activation function.
- Specified by:
getName in interface ActivationFunction
- Returns:
The name of the activation function as a String.
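A minimal sketch of calling getName; the exact string returned is not documented on this page, only that it is a String, and NameSketch is a placeholder class name.

import com.jml.neural_network.activations.Relu;

public class NameSketch {
    public static void main(String[] args) {
        Relu relu = new Relu();
        String name = relu.getName(); // the activation function's name as a String
        System.out.println(name);
    }
}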