Package com.jml.optimizers
Class GradientDescent
java.lang.Object
  com.jml.optimizers.Optimizer
    com.jml.optimizers.GradientDescent
Vanilla gradient descent optimizer. Applies the update rule
w_{i+1} = w_i - a * grad(w_i),
where a is the learning rate.
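The update rule above can be sketched in plain Java. This is an illustrative stand-in, not the library's implementation: double arrays are used in place of the `linalg.Matrix` type, and the names here are hypothetical.

```java
// Sketch of the vanilla gradient descent update rule,
// w_{i+1} = w_i - a * grad(w_i), using plain double arrays
// as a stand-in for the library's linalg.Matrix type.
public class VanillaGradientDescentSketch {

    // Applies one update step: returns w - a * wGrad, element-wise.
    static double[] step(double[] w, double[] wGrad, double a) {
        double[] next = new double[w.length];
        for (int i = 0; i < w.length; i++) {
            next[i] = w[i] - a * wGrad[i];
        }
        return next;
    }

    public static void main(String[] args) {
        double[] w = {1.0, -2.0};
        double[] wGrad = {0.5, -0.5};
        double[] updated = step(w, wGrad, 0.1);
        // 1.0 - 0.1*0.5 = 0.95, -2.0 - 0.1*(-0.5) = -1.95
        System.out.println(updated[0] + " " + updated[1]);
    }
}
```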
Field Summary
Constructor Summary

GradientDescent(double learningRate)
Creates a vanilla gradient descent optimizer with the specified learning rate.
Method Summary

getDetails()
Gets the details of this optimizer.

linalg.Matrix[] step(boolean flag, linalg.Matrix... params)
Steps the optimizer a single iteration by applying the update rule of the optimizer to the matrix w.

linalg.Matrix[] step(linalg.Matrix... params)
Steps the optimizer a single iteration by applying the update rule of the optimizer to the matrix w.

Methods inherited from class com.jml.optimizers.Optimizer:
getLearningRate, setScheduler
Field Details

OPTIM_NAME
See Also:
Constant Field Values
Constructor Details

GradientDescent
public GradientDescent(double learningRate)
Creates a vanilla gradient descent optimizer with the specified learning rate.
Parameters:
learningRate - Learning rate to be used in the gradient descent algorithm.
Method Details

step
public linalg.Matrix[] step(linalg.Matrix... params)
Steps the optimizer a single iteration by applying the update rule of the optimizer to the matrix w.
Specified by:
step in class Optimizer
Parameters:
params - An array of matrices strictly containing {w, wGrad}, where w is a matrix containing the weights to apply the update to and wGrad is the gradient of the objective function with respect to w.
Returns:
The result of applying the update rule of the optimizer to the matrix w.
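Repeatedly applying this step drives the weights toward a minimum of the objective. The loop below is a sketch under stated assumptions: it minimizes f(w) = w^2 (whose gradient is 2w) with plain doubles standing in for the library's `linalg.Matrix` type, and the names are hypothetical rather than part of the library's API.

```java
// Sketch: iterating the gradient descent step to minimize f(w) = w^2.
// The gradient of f is grad(w) = 2w, so each step computes
// w <- w - learningRate * 2w.
public class StepLoopSketch {

    static double minimize(double w, double learningRate, int iterations) {
        for (int i = 0; i < iterations; i++) {
            double wGrad = 2.0 * w;          // gradient of f(w) = w^2 at w
            w = w - learningRate * wGrad;    // one optimizer step
        }
        return w;
    }

    public static void main(String[] args) {
        // Starting from w = 5.0 with learning rate 0.1, the iterates
        // shrink geometrically toward 0, the minimizer of w^2.
        double w = minimize(5.0, 0.1, 100);
        System.out.println(w);
    }
}
```

With a learning rate of 0.1, each step multiplies w by 0.8, so convergence is geometric; too large a learning rate (here, above 1.0) would instead cause the iterates to diverge.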
step
public linalg.Matrix[] step(boolean flag, linalg.Matrix... params)
Steps the optimizer a single iteration by applying the update rule of the optimizer to the matrix w.
Specified by:
step in class Optimizer
Parameters:
flag - Has no effect for the gradient descent optimizer.
params - An array of matrices strictly containing {w, wGrad}, where w is a matrix containing the weights to apply the update to and wGrad is the gradient of the objective function with respect to w.
Returns:
The result of applying the update rule of the optimizer to the matrix w.
getDetails
Gets the details of this optimizer.
Specified by:
getDetails in class Optimizer
Returns:
Important details of this optimizer as a string.