Package com.jml.optimizers

This package contains optimizers useful for estimating the minimum of a function.
  • Class Summary
    • The Adam (Adaptive Moment Estimation) optimizer.
    • The vanilla gradient descent optimizer.
    • The momentum-based gradient descent optimizer.
    • An interface for optimizers that minimize a function by utilizing the gradients of that function.
    • A learning rate scheduler for optimizers.
    • StepLearningRate: a Scheduler which "steps" the learning rate at regular intervals during optimization.
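To illustrate the ideas these classes package together, here is a minimal sketch of vanilla gradient descent combined with a step learning-rate schedule, minimizing f(x) = x². All names in the sketch are illustrative; they are not the actual API of com.jml.optimizers.

```java
// Sketch only: vanilla gradient descent with a "stepped" learning rate.
// Minimizes f(x) = x^2, whose gradient is f'(x) = 2x. Hypothetical names,
// not the com.jml.optimizers API.
public class GradientDescentSketch {

    /** Runs gradient descent from x = 10 and returns the final estimate. */
    public static double run() {
        double x = 10.0;   // initial guess
        double lr = 0.1;   // initial learning rate

        for (int i = 1; i <= 100; i++) {
            double grad = 2.0 * x;   // gradient of f(x) = x^2
            x -= lr * grad;          // vanilla gradient descent update

            // Step schedule: halve the learning rate every 25 iterations.
            if (i % 25 == 0) {
                lr *= 0.5;
            }
        }
        return x;
    }

    public static void main(String[] args) {
        // The estimate converges toward the true minimizer x = 0.
        System.out.println(GradientDescentSketch.run());
    }
}
```

Momentum and Adam refine the same loop: momentum accumulates an exponentially decaying average of past gradients, and Adam additionally tracks a running average of squared gradients to scale each update.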