
[P] Subsequence to sequence prediction LSTM / stacked LSTM

Hi :),

I’m currently working on a project where I’m trying to predict the next value of a sequence.

The data looks as follows:

y: the value to predict, captured once a day.

x: an input sequence of around 2000 timesteps for every day.

I would like to predict the next day’s value of y, i.e. y_{t+1}. However, y_{t+1} is assumed to depend not only on the values of x but also on the history of y, i.e. y_t, y_{t-1}, ..., y_{t-n}. I’m wondering how I could implement this idea in an LSTM structure.

My idea is a network that looks like this:

https://i.redd.it/b9x7k87nw3631.png

Does that make sense or am I on the wrong track there?

How would you implement such a model in Keras? My idea was to make a network that looks like: x -> TimeDistributed(LSTM1) -> LSTM2 -> y
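Here is a minimal sketch of that x -> TimeDistributed(LSTM1) -> LSTM2 -> y idea in Keras, not the poster's actual model: the inner LSTM is wrapped in TimeDistributed so it encodes each day's ~2000-step subsequence into a vector, the known past y values are concatenated to those per-day encodings, and an outer LSTM models the day-to-day dynamics before a single regression output for y_{t+1}. The window length `n_days` and all layer sizes are hypothetical.

```python
# Minimal sketch only; layer sizes and the n_days window are assumptions.
from tensorflow.keras.layers import Input, LSTM, TimeDistributed, Dense, Concatenate
from tensorflow.keras.models import Model

n_days, steps_per_day, n_features = 7, 2000, 1  # hypothetical sizes

# x: one subsequence of ~2000 timesteps per day in the window
x_in = Input(shape=(n_days, steps_per_day, n_features), name="x_subsequences")
# y history: the observed y of each past day (y_t, y_{t-1}, ..., y_{t-n})
y_in = Input(shape=(n_days, 1), name="y_history")

# LSTM1, applied independently to every day via TimeDistributed,
# compresses each intra-day subsequence into a fixed-size vector.
daily = TimeDistributed(LSTM(32))(x_in)        # -> (batch, n_days, 32)

# Append each day's known y so the outer LSTM sees both x and y history.
daily = Concatenate(axis=-1)([daily, y_in])    # -> (batch, n_days, 33)

# LSTM2 models the day-to-day dynamics; Dense(1) regresses y_{t+1}.
out = Dense(1)(LSTM(16)(daily))

model = Model([x_in, y_in], out)
model.compile(optimizer="adam", loss="mse")
model.summary()
```

If the y history should not enter this way, an alternative under the same assumptions is to append y as an extra feature inside each day's x subsequence instead of concatenating it after the inner LSTM.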

submitted by /u/cptn_iglo