Artificial Neural Networks and Deep Learning 2021 - Homework 2 Forum


> TimeDistributed and Bidirectional wrappers

I think my intuitions about how the TimeDistributed and Bidirectional wrappers can really help with our problem are incomplete.
I fully understand the principle behind the Bidirectional wrapper, which simply carries information both forward and backward across time, and behind attention (applied on top of LSTM outputs), which weights the individual timesteps according to their importance before combining them into the output units. However, I cannot completely understand the principle behind the TimeDistributed wrapper.
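To make the Bidirectional behavior concrete, here is a minimal sketch (assuming TensorFlow/Keras and made-up toy shapes): two LSTMs are run over the sequence, one forward and one backward, and with the default `merge_mode="concat"` their outputs are stacked along the feature axis.

```python
import numpy as np
from tensorflow.keras import layers

# Toy batch: 2 sequences, 5 timesteps, 4 features (hypothetical shapes).
x = np.random.rand(2, 5, 4).astype("float32")

# Bidirectional runs the wrapped LSTM forward and a copy of it backward,
# then concatenates the two outputs feature-wise (merge_mode="concat").
bi = layers.Bidirectional(layers.LSTM(8, return_sequences=True))
y = bi(x)

print(y.shape)  # (2, 5, 16): 8 forward units + 8 backward units
```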
I read https://stackoverflow.com/questions/53107126/what-are-the-uses-of-timedistributed-wrapper-for-lstm-or-any-other-layers
and https://stackoverflow.com/questions/47305618/what-is-the-role-of-timedistributed-layer-in-keras
but I think our problem is a bit different: when we use Conv1D, for example, we do not need to wrap it in the TimeDistributed wrapper, because doing so would not change the output of the layer.
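A quick way to check this kind of claim is to compare a layer with and without the wrapper. The sketch below (assuming TensorFlow/Keras and toy shapes) uses Dense, which already acts per-timestep on a 3D input, so wrapping the same layer in TimeDistributed produces an identical output:

```python
import numpy as np
from tensorflow.keras import layers

# Toy batch: 4 sequences, 10 timesteps, 8 features (hypothetical shapes).
x = np.random.rand(4, 10, 8).astype("float32")

dense = layers.Dense(3)
td_dense = layers.TimeDistributed(dense)  # wraps the *same* Dense, sharing its weights

# Dense on a 3D tensor already applies to the last axis at every timestep,
# so TimeDistributed(Dense) gives the same result here.
y_plain = dense(x)
y_td = td_dense(x)

print(np.allclose(y_plain.numpy(), y_td.numpy()))  # True
print(y_td.shape)                                  # (4, 10, 3)
```

TimeDistributed only starts to matter when the wrapped layer should be applied independently to each slice of an extra leading time axis, e.g. a Conv2D applied frame-by-frame to a video of shape (batch, frames, height, width, channels).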

I am not sure about what I just said, so I would like to ask for suggestions and clarification.

Posted by: EugenioBertolini @ Jan. 8, 2022, 10:52 a.m.