Adam
Adam: A Method For Stochastic Optimization
Introduced by Kingma & Ba (2015) as an alternative to AdaGrad and RMSProp, it maintains exponentially weighted running estimates of the first and second moments of the gradient and uses them, after bias correction, to compute a per-parameter adaptive step size.
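A minimal NumPy sketch of one Adam update may make the mechanism concrete. The function name adam_step and the toy quadratic objective are illustrative assumptions; the hyperparameter defaults (beta1=0.9, beta2=0.999, eps=1e-8) follow the paper.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update. m, v are running moment estimates; t is the 1-based step count."""
    m = beta1 * m + (1 - beta1) * grad       # exp-weighted 1st moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad**2    # exp-weighted 2nd moment (uncentered variance)
    m_hat = m / (1 - beta1**t)               # bias correction: both estimates start at zero
    v_hat = v / (1 - beta2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)  # per-parameter adaptive step
    return theta, m, v

# illustrative use: minimize f(theta) = theta^2 starting from theta = 5
theta = np.array([5.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta                         # gradient of theta^2
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.1)
print(theta)                                 # converges near 0
```

Note the effective step size is bounded by roughly lr regardless of gradient magnitude, which is why the toy run above uses a larger lr than the 0.001 default.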