Package com.jml.optimizers
Class Optimizer
java.lang.Object
com.jml.optimizers.Optimizer
Direct Known Subclasses:
Adam, GradientDescent, Momentum
Abstract base class defining the interface for an optimizer, which can be used to minimize a function by utilizing the gradients of that function.
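For illustration, a minimal sketch of how a concrete optimizer might extend this class, using only the members documented on this page. The class name, the rate field, and the no-op update logic are placeholders for this sketch, not the library's actual GradientDescent, Momentum, or Adam implementations.

import com.jml.optimizers.Optimizer;

// Hypothetical subclass, shown only to illustrate the abstract contract:
// a concrete optimizer implements both step(...) overloads and getDetails().
public class NoOpOptimizer extends Optimizer {

    // Placeholder state; real subclasses manage their own hyperparameters.
    private final double rate;

    public NoOpOptimizer(double rate) {
        this.rate = rate;
    }

    @Override
    public linalg.Matrix[] step(linalg.Matrix... params) {
        // A real optimizer would apply its update rule to each parameter matrix
        // here and return the updated matrices; this sketch returns them unchanged.
        return params;
    }

    @Override
    public linalg.Matrix[] step(boolean flag, linalg.Matrix... params) {
        // The meaning of 'flag' is optimizer-specific and not documented on this page.
        return step(params);
    }

    @Override
    public String getDetails() {
        return "NoOpOptimizer(rate=" + rate + ")";
    }
}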

Field Summary
Fields: schedule, name

Constructor Summary
Constructors: Optimizer()

Method Summary

abstract String getDetails()
    Gets the details of this optimizer.

double getLearningRate()
    Gets the learning rate for this optimizer.

void setScheduler(Scheduler schedule)
    Sets the learning rate scheduler for this optimizer.

abstract linalg.Matrix[] step(boolean flag, linalg.Matrix... params)

abstract linalg.Matrix[] step(linalg.Matrix... params)
    Steps the optimizer a single iteration by applying the update rule of the optimizer to the matrix w.

Field Details

schedule

name

Constructor Details

Optimizer
public Optimizer()

Method Details

step
public abstract linalg.Matrix[] step(linalg.Matrix... params)
Steps the optimizer a single iteration by applying the update rule of the optimizer to the matrix w.
Parameters:
params - An array of Matrices strictly containing the relevant parameters for that optimizer.
Returns:
The result of applying the update rule of the optimizer to the matrix w.
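As a hedged usage sketch: which matrices belong in params, and in what order, is defined by each concrete optimizer, so the weights-then-gradient ordering and the GradientDescent constructor below are assumptions for illustration only.

import com.jml.optimizers.GradientDescent;
import com.jml.optimizers.Optimizer;

public class StepSketch {

    // w and grad are passed in so the sketch does not have to assume any
    // particular linalg.Matrix constructor.
    static linalg.Matrix applyOneStep(linalg.Matrix w, linalg.Matrix grad) {
        // Assumed constructor taking a learning rate; check the GradientDescent
        // documentation for its real constructors.
        Optimizer optimizer = new GradientDescent(0.002);

        // Pass the parameter matrix and its gradient; the expected contents and
        // ordering of params depend on the concrete optimizer.
        linalg.Matrix[] updated = optimizer.step(w, grad);

        // The first returned matrix is taken to be the updated parameters here.
        return updated[0];
    }
}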

step
public abstract linalg.Matrix[] step(boolean flag, linalg.Matrix... params)

getDetails
public abstract String getDetails()
Gets the details of this optimizer.
Returns:
Important details of this optimizer as a string.

setScheduler
public void setScheduler(Scheduler schedule)
Sets the learning rate scheduler for this optimizer.
Parameters:
schedule - Learning rate scheduler.
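A brief sketch of wiring a scheduler to an optimizer. The Scheduler instance is taken as an argument because its concrete implementations and constructors are not documented on this page, and its package is assumed here to be com.jml.optimizers.

import com.jml.optimizers.Optimizer;
import com.jml.optimizers.Scheduler;

public class SchedulerSketch {

    // Attaches a learning rate scheduler and reports the optimizer's state.
    static void attach(Optimizer optimizer, Scheduler schedule) {
        optimizer.setScheduler(schedule);

        // getDetails() and getLearningRate() are part of this class's own API.
        System.out.println(optimizer.getDetails());
        System.out.println("learning rate: " + optimizer.getLearningRate());
    }
}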

getLearningRate
public double getLearningRate()
Gets the learning rate for this optimizer.
Returns:
The learning rate for this optimizer.