Artificial Neural Networks and Deep Learning 2021 - Homework 2 Forum


> Seq2Seq: <START> and <END> with time series

Hello everyone, I would like to discuss a topic with you.

My model is of the encoder/decoder type, so I need the <START> and <END> tokens for training.
Since we're working with time series and not with words, I thought I'd do something like this:
<START> = an array containing all 0s
<END> = an array containing all 1s
To do this I modified the build_sequences function.
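Roughly, the change amounts to something like the sketch below (the helper name add_start_end and the shapes are only illustrative of my setup, not the exact course code):

```python
import numpy as np

def add_start_end(decoder_targets, start_value=0.0, end_value=1.0):
    """Prepend a <START> row and append an <END> row to each target sequence.

    decoder_targets: array of shape (n_samples, timesteps, n_features).
    """
    n_samples, _, n_features = decoder_targets.shape
    start = np.full((n_samples, 1, n_features), start_value)  # <START> token
    end = np.full((n_samples, 1, n_features), end_value)      # <END> token
    # Decoder input starts with <START>; decoder target ends with <END>
    decoder_inputs = np.concatenate([start, decoder_targets], axis=1)
    decoder_outputs = np.concatenate([decoder_targets, end], axis=1)
    return decoder_inputs, decoder_outputs
```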

I also tried with
<START> = an array containing all -1s
<END> = an array containing all -2s
since 0 and 1 are values that appear in the time series, but I only got worse results...

My problem, specifically, is that in the final part of the code I build the predictions by passing to the decoder both its state from the previous step and its prediction from the previous step (as we saw for real-time translation in the last exercise session). However, since the first prediction is completely wrong because of the <START> token I pass as input (all 0s), all the following ones are wrong as well.
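For context, my inference loop looks roughly like this (a simplified sketch assuming the usual separate Keras encoder/decoder inference models with LSTM states; encoder_model, decoder_model, pred_len and n_features are placeholders, not my exact code):

```python
import numpy as np

def predict_sequence(encoder_model, decoder_model, encoder_input, pred_len, n_features):
    # Encode the input window to get the initial decoder state
    states = encoder_model.predict(encoder_input)
    # The first decoder input is the <START> token (all 0s in my case)
    target = np.zeros((1, 1, n_features))
    predictions = []
    for _ in range(pred_len):
        # Feed the previous prediction and the previous state back into the decoder
        out, h, c = decoder_model.predict([target] + states)
        predictions.append(out[0, 0, :])
        target = out       # next input = current prediction
        states = [h, c]    # next state = current decoder state
    return np.array(predictions)
```

The very first call uses the all-zero <START> as the target, and that is exactly where the error starts and then propagates.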

Do you have any suggestions to solve the problem?

Posted by: AntonioGuadagno @ Dec. 29, 2021, 4:51 p.m.