Class Activations

java.lang.Object
com.jml.neural_network.activations.Activations

public abstract class Activations extends Object
A class which contains methods for getting various pre-made activation functions for use in neural network layers.
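
A brief usage sketch, based only on the members documented below: each activation can be referenced through its pre-defined static field or freshly created through the matching static factory method. The import of ActivationFunction assumes it lives in the same package, as its unqualified use in this class suggests.

    import com.jml.neural_network.activations.ActivationFunction;
    import com.jml.neural_network.activations.Activations;

    public class ActivationsDemo {
        public static void main(String[] args) {
            // Pre-defined shared instance (static field).
            ActivationFunction shared = Activations.relu;

            // Fresh instance from the static factory method.
            ActivationFunction fresh = Activations.relu();
        }
    }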
  • Field Details

    • sigmoid

      public static final ActivationFunction sigmoid
      The sigmoid activation function. f(x)=1/(1+exp(-x))
    • relu

      public static final ActivationFunction relu
      A pre-defined instance of the Relu (Rectified Linear Unit) activation function. f(x)=max(0, x)
    • linear

      public static final ActivationFunction linear
      A pre-defined instance of the linear activation function. f(x)=x.
    • tanh

      public static final ActivationFunction tanh
      A pre-defined instance of the hyperbolic tangent activation function. f(x) = tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
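
The formulas above can be restated directly in plain Java. The sketch below is illustrative only and is not part of this class; it simply evaluates each documented formula with java.lang.Math.

    // Minimal sketch (not part of com.jml): the documented formulas in plain Java.
    public final class ActivationFormulas {

        // sigmoid: f(x) = 1/(1+exp(-x))
        static double sigmoid(double x) {
            return 1.0 / (1.0 + Math.exp(-x));
        }

        // relu: f(x) = max(0, x)
        static double relu(double x) {
            return Math.max(0.0, x);
        }

        // linear: f(x) = x
        static double linear(double x) {
            return x;
        }

        // tanh: f(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x)), which is
        // exactly what java.lang.Math.tanh computes.
        static double tanh(double x) {
            return Math.tanh(x);
        }
    }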
  • Constructor Details

    • Activations

      public Activations()
  • Method Details

    • sigmoid

      public static ActivationFunction sigmoid()
      Creates and returns a new instance of the sigmoid activation function.
      Returns:
      The sigmoid activation function. f(x)=1/(1+exp(-x))
    • relu

      public static ActivationFunction relu()
      Creates and returns a new relu activation function.
      Returns:
      An instance of the Relu (Rectified Linear Unit) activation function. f(x)=max(0, x)
    • linear

      public static ActivationFunction linear()
      Creates and returns a new instance of the linear activation function.
      Returns:
      An instance of the linear activation function. f(x)=x.
    • tanh

      public static ActivationFunction tanh()
      Creates and returns a new instance of the hyperbolic tangent activation function.
      Returns:
      An instance of the hyperbolic tangent activation function.
      f(x) = tanh(x) = (exp(x) - exp(-x)) / (exp(x) + exp(-x))
    • softmax

      public static ActivationFunction softmax()
      Creates and returns a new instance of the softmax activation function.
      Returns:
      The softmax activation function. f(x)_i = exp(x_i) / sum_{j=1}^{m}( exp(x_j) ), where x is a vector of length m and sum_{j=1}^{m}( exp(x_j) ) = exp(x_1) + exp(x_2) + ... + exp(x_m).
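
As a companion to the formula above, here is a minimal plain-Java sketch of softmax over a double[] of length m. It is illustrative only and does not use the library's ActivationFunction type; subtracting the maximum entry is a standard numerical-stability step that is not part of the formula above, and it does not change the result since the shift cancels in the ratio.

    // Minimal sketch (not part of com.jml):
    // f(x)_i = exp(x_i) / ( exp(x_1) + exp(x_2) + ... + exp(x_m) )
    static double[] softmax(double[] x) {
        // Shift by the max entry for numerical stability; the shift
        // cancels in the ratio, so the result is unchanged.
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) {
            max = Math.max(max, v);
        }

        double sum = 0.0;
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < x.length; i++) {
            out[i] /= sum;
        }
        return out;
    }

Within the library itself, such an instance would instead be obtained via Activations.softmax().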