Artificial Neural Networks and Deep Learning 2021 - Homework 1 Forum


> Hints to reduce computation time

We would like to train several models with different numbers of layers and neurons at once (with a loop that varies the number of layers and other hyperparameters), while also using cross-validation. This results in many training runs and many hours in front of CodaLab hoping that everything goes well. What can be done to decrease the training time of each model?
I can think of these solutions:
- reduce the number of epochs;
- make heavy use of pooling layers;
- be careful with the number of neurons in the last Dense layers.
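A related trick for the first point is early stopping: instead of guessing a fixed low epoch count, stop as soon as the validation score stops improving (in Keras this is the `tf.keras.callbacks.EarlyStopping` callback). Below is a minimal runnable sketch of the same idea using scikit-learn's `MLPClassifier`, with toy data standing in for the real dataset; all names and numbers are illustrative assumptions, not the competition setup.

```python
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

# Toy data standing in for the competition data (assumption: any
# (X, y) arrays would work the same way).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# early_stopping=True holds out validation_fraction of the training
# data and stops once the validation score has not improved for
# n_iter_no_change epochs, instead of always running max_iter epochs.
model = MLPClassifier(
    hidden_layer_sizes=(32,),
    max_iter=500,
    early_stopping=True,
    n_iter_no_change=5,
    validation_fraction=0.1,
    random_state=0,
)
model.fit(X, y)
print(model.n_iter_)  # number of epochs actually run
```

The training budget then adapts per model: simple models that converge quickly stop early, so you do not have to pick one epoch count that fits every architecture in the sweep.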

Is there anything else?
PS: while writing this, I thought that maybe it would be better to first automate the training of the different models, and then run cross-validation only on the best one. Would that be more efficient? Am I missing something?
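That two-stage idea (cheap hold-out screening of all candidate architectures, then spending the expensive k-fold cross-validation only on the winner) can be sketched as follows. This is a hedged illustration with scikit-learn and synthetic data; the candidate configurations and split sizes are made-up assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split, cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=600, n_features=20, random_state=0)

# Stage 1: cheap screening with a single hold-out split (one fit per
# candidate, instead of k fits per candidate).
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2,
                                            random_state=0)

candidates = [(16,), (32,), (32, 16)]  # hypothetical layer configurations
scores = {}
for layers in candidates:
    m = MLPClassifier(hidden_layer_sizes=layers, max_iter=300,
                      random_state=0)
    m.fit(X_tr, y_tr)
    scores[layers] = m.score(X_val, y_val)

best = max(scores, key=scores.get)

# Stage 2: run the expensive k-fold CV only on the winning config.
cv_scores = cross_val_score(
    MLPClassifier(hidden_layer_sizes=best, max_iter=300, random_state=0),
    X, y, cv=5)
print(best, cv_scores.mean())
```

One caveat: a single hold-out split is noisier than cross-validation, so the screening stage can occasionally pick a slightly suboptimal model; a middle ground is to cross-validate the top two or three candidates rather than only the best one.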

Posted by: corlac @ Nov. 16, 2021, 11:54 a.m.