java.lang.Object
com.jml.neural_network.activations.Relu
All Implemented Interfaces:
ActivationFunction

public class Relu extends Object implements ActivationFunction
The ReLU (Rectified Linear Unit) activation function. That is, f(x) = max(0, x).
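As a quick illustration of what forward(...) computes, here is a minimal plain-Java sketch of element-wise ReLU. It is not part of this class; it uses a `double[][]` array instead of the library's `linalg.Matrix` type, and the class name `ReluSketch` is invented for the example:

```java
public class ReluSketch {
    // Element-wise ReLU: each entry x becomes max(0, x).
    static double[][] relu(double[][] data) {
        double[][] out = new double[data.length][];
        for (int i = 0; i < data.length; i++) {
            out[i] = new double[data[i].length];
            for (int j = 0; j < data[i].length; j++) {
                out[i][j] = Math.max(0.0, data[i][j]);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] r = relu(new double[][]{{-2.0, 3.0}, {0.0, -0.5}});
        // Negative entries are clamped to 0; positive entries pass through.
        System.out.println(r[0][0] + " " + r[0][1] + " " + r[1][0] + " " + r[1][1]);
        // prints: 0.0 3.0 0.0 0.0
    }
}
```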
  • Field Summary

    Fields
    Modifier and Type   Field   Description
    static String
  • Constructor Summary

    Constructors
    Constructor   Description
    Relu()
  • Method Summary

    Modifier and Type   Method                        Description
    linalg.Matrix       back(linalg.Matrix data)      Applies the derivative of the ReLU activation function to a matrix element-wise.
    linalg.Matrix       forward(linalg.Matrix data)   Applies the ReLU activation function to a matrix element-wise.
    String              getName()                     Gets the name of the activation function.

    Methods inherited from class java.lang.Object

    equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
  • Field Details

  • Constructor Details

    • Relu

      public Relu()
  • Method Details

    • forward

      public linalg.Matrix forward(linalg.Matrix data)
      Applies the ReLU activation function to a matrix element-wise.
      Specified by:
      forward in interface ActivationFunction
      Parameters:
      data - Matrix to apply ReLU activation function to.
      Returns:
      The result of the ReLU activation function applied to the data matrix.
    • back

      public linalg.Matrix back(linalg.Matrix data)
      Applies the derivative of the ReLU activation function to a matrix element-wise.
      Specified by:
      back in interface ActivationFunction
      Parameters:
      data - Matrix to apply the derivative of the ReLU activation function to.
      Returns:
      The result of the derivative of the ReLU activation function applied to the data matrix.
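The derivative applied by back(...) is 1 where an entry is positive and 0 where it is negative; the value used at exactly 0 is a library convention not stated on this page. A plain-array sketch (class name `ReluBackSketch` and the choice of 0 at x == 0 are assumptions of the example, not taken from this documentation):

```java
public class ReluBackSketch {
    // Element-wise ReLU derivative: 1 for x > 0, else 0.
    // Using 0 at x == 0 is an assumption; the library's convention is not stated here.
    static double[][] reluDerivative(double[][] data) {
        double[][] out = new double[data.length][];
        for (int i = 0; i < data.length; i++) {
            out[i] = new double[data[i].length];
            for (int j = 0; j < data[i].length; j++) {
                out[i][j] = data[i][j] > 0.0 ? 1.0 : 0.0;
            }
        }
        return out;
    }

    public static void main(String[] args) {
        double[][] d = reluDerivative(new double[][]{{-2.0, 3.0}, {0.0, 0.5}});
        // Gradient is 1 only where the input was strictly positive.
        System.out.println(d[0][0] + " " + d[0][1] + " " + d[1][0] + " " + d[1][1]);
        // prints: 0.0 1.0 0.0 1.0
    }
}
```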
    • getName

      public String getName()
      Gets the name of the activation function.
      Specified by:
      getName in interface ActivationFunction
      Returns:
      The name of the activation function as a String.