Tuesday, July 16, 2019

Adam optimizer returns only one class every time

I am currently building a web application for image classification using Tensorflow.js. I've noticed that when I fit/train my model, the SGD optimizer predicts well on the training set. The model trained with the Adam optimizer, on the other hand, collapses to a single class/category even though there are 7 different classes in the one-hot encoded labels. The odd part is that the loss keeps going down even though its predictions on the training set stay the same. Why does only the SGD optimizer work in this case?

I am sticking with sgd (stochastic gradient descent) for now. The only difference in the code that produced the two results below was the optimizer; everything else was identical.

Actual Labels:
[[1, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 0, 0, 1],
[0, 0, 0, 1, 0, 0, 0],
[0, 0, 0, 1, 0, 0, 0]]

SGD Prediction (good accuracy):
[[0.7766488, 0.0011008, 0   , 0.1785417, 0.0128197, 0.0004938, 0.0303952],
 [0.0000667, 0.0000794, 0   , 0.0006806, 0.9991421, 0.0000079, 0.0000238],
 [0.0014632, 0.5784221, 0   , 0.1196582, 0.2930434, 0.0010978, 0.0063155],
 [0.0000067, 0.000024 , 0   , 0.0015023, 0.9984038, 0.0000043, 0.0000589],
 [0.0000328, 0.0000816, 0   , 0.0027291, 0.9970328, 0.0000105, 0.0001136],
 [0.0000657, 0.0000679, 0   , 0.0176743, 0.002068 , 0.000019 , 0.9801047],
 [0.0055083, 0.0023474, 1e-7, 0.7483685, 0.215641 , 0.0009718, 0.027163 ],
 [0.0045981, 0.0015817, 0   , 0.922437 , 0.026959 , 0.0002184, 0.044206 ]]

Adam Prediction (identical at every step, from step 1 on):
[[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0],
[0, 0, 0, 0, 1, 0, 0]]



