Adam vs SGD Optimization

[Interactive demo: minimizes the same objective with plain SGD, SGD with momentum, and Adam. Each optimizer has its own learning rate (default 0.1), optionally locked together via "Same LR". Run/Step/Reset controls advance the optimization up to 50 iterations while the current loss of each method is displayed.]
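The demo's underlying update rules can be sketched in plain Python. This is a minimal, self-contained comparison, not the demo's actual code: the objective (a simple quadratic bowl), the start point, and the momentum/Adam hyperparameters (beta = 0.9, b1 = 0.9, b2 = 0.999) are assumptions chosen to mirror the demo's setup of 50 iterations at learning rate 0.1 for all three methods.

```python
import math

START = (3.0, 2.0)  # arbitrary start point for this sketch

def loss(x):
    # Simple quadratic bowl: f(x, y) = 2x^2 + 4y^2
    return 2 * x[0] ** 2 + 4 * x[1] ** 2

def grad(x):
    # Gradient of f: (4x, 8y)
    return [4 * x[0], 8 * x[1]]

def run(update, steps=50, lr=0.1):
    # Apply one optimizer for `steps` iterations and return the final loss.
    x, state = list(START), {}
    for t in range(1, steps + 1):
        update(x, grad(x), state, lr, t)
    return loss(x)

def sgd(x, g, state, lr, t):
    # Plain gradient descent: x <- x - lr * g
    for i in range(len(x)):
        x[i] -= lr * g[i]

def momentum(x, g, state, lr, t, beta=0.9):
    # Heavy-ball momentum: v <- beta * v + g; x <- x - lr * v
    v = state.setdefault("v", [0.0] * len(x))
    for i in range(len(x)):
        v[i] = beta * v[i] + g[i]
        x[i] -= lr * v[i]

def adam(x, g, state, lr, t, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: exponential moving averages of the gradient (m) and its
    # square (v), with bias correction, then a per-coordinate scaled step.
    m = state.setdefault("m", [0.0] * len(x))
    v = state.setdefault("v", [0.0] * len(x))
    for i in range(len(x)):
        m[i] = b1 * m[i] + (1 - b1) * g[i]
        v[i] = b2 * v[i] + (1 - b2) * g[i] ** 2
        m_hat = m[i] / (1 - b1 ** t)
        v_hat = v[i] / (1 - b2 ** t)
        x[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)

for name, update in [("SGD", sgd), ("Momentum", momentum), ("Adam", adam)]:
    print(f"{name} loss after 50 steps: {run(update):.6g}")
```

On this toy problem all three methods reduce the loss from the same start; the interesting differences (momentum's oscillation, Adam's roughly constant per-coordinate step size) show up in how the loss evolves step by step, which is what the demo visualizes.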