Artificial Neural Networks and Deep Learning 2021 - Homework 2 Forum


> Increasing the size of the window leads to constant prediction

Hi everyone, I have a strange issue. If I increase the size of the window used during training, my models predict a constant value as output, like some sort of average.
The issue seems strange to me because it happens both with the models seen during the exercise sessions and with some custom sequence-to-sequence models I built. Changing the stride to different values does not improve the prediction, and the optimal window seems to be around 200 samples. If the window is increased to values greater than 1000, the models start predicting a constant value.
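For anyone unsure what is meant by window and stride here: a minimal sketch of how training windows are typically cut from a 1-D series (the function name and the `telescope` forecast-horizon parameter are my own naming, not from the course code):

```python
import numpy as np

def build_windows(series, window=200, telescope=1, stride=1):
    """Slice a 1-D series into (input, target) pairs.

    Each input is `window` consecutive samples; the target is the next
    `telescope` samples; `stride` controls how far consecutive windows
    are shifted. A larger window means fewer, longer training sequences.
    """
    X, y = [], []
    for start in range(0, len(series) - window - telescope + 1, stride):
        X.append(series[start:start + window])
        y.append(series[start + window:start + window + telescope])
    return np.array(X), np.array(y)

# Example: 1000 samples, window of 200, stride of 10 -> 80 training pairs
series = np.arange(1000, dtype=float)
X, y = build_windows(series, window=200, telescope=1, stride=10)
```

Note how increasing `window` (say to 1000 on the same series) sharply reduces the number of training pairs, which is one reason larger windows can need longer training.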

The approach with smaller windows led to a quite good result (about 4.6 RMSE), but I think the models could benefit from looking at a larger window when predicting.

Have any of you faced the same issue? Any hints on how to solve it?

Posted by: lolepls @ Dec. 30, 2021, 3:28 p.m.

Is it possible that the model is not powerful enough to take advantage of a larger input? My models perform better with large windows, but they also have many more parameters; I don't know if this could be one of the reasons.

Posted by: Gianluca @ Dec. 30, 2021, 6:43 p.m.

UPDATE: I found the problem, and I'm posting the solution here in case someone else runs into it.

My models were predicting a constant value because the early stopping callback I had was too restrictive, with too few epochs of patience. The larger the window, the longer the model needs to train before it starts improving, and the harsh early stopping killed the training before the model could actually learn something.

My advice is: if you are having the same problem, check whether you are letting your model train for a sufficient amount of time; look at the number of epochs and the callback configuration. Try training your network for a hundred or so epochs without any early stopping and see if you start getting better results.
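To illustrate the failure mode (this is a toy sketch in plain Python, not the actual Keras callback): with a loss that plateaus for a while before it starts dropping, as tends to happen with large windows, a small patience stops training on the plateau, while a larger patience survives it and reaches the improvement phase.

```python
def train_with_patience(losses, patience):
    """Simulate patience-based early stopping over a sequence of epoch losses.

    Stops once the loss has not improved for `patience` consecutive epochs;
    returns (last epoch reached, best loss seen).
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(losses):
        if loss < best - 1e-9:
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:
                return epoch, best  # training killed here
    return len(losses) - 1, best

# Loss plateaus for 15 epochs, then finally starts improving
losses = [1.0] * 15 + [1.0 - 0.05 * i for i in range(1, 11)]
stop_small, best_small = train_with_patience(losses, patience=5)
stop_large, best_large = train_with_patience(losses, patience=50)
```

With `patience=5` the run dies at epoch 5 with the loss still at 1.0; with a generous patience it reaches the end of the schedule with a much lower loss. In Keras the same idea applies to the `patience` argument of `tf.keras.callbacks.EarlyStopping`.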

Bye guys!

Posted by: lolepls @ Jan. 3, 2022, 9:26 a.m.