
[D] do you agree “The research on how to deal with time-series data is almost finished”?

Here is my question on Quora:

https://www.quora.com/Why-is-RNN-less-progress-research-than-CNN-especially-the-time-series

The question is:

Why is there less research progress on RNNs than on CNNs (especially for time series)?

And the answer is:

Any problem concerning images is considerably harder than one concerning a time series. This is why research on CNNs and their derivatives (U-Net, GANs) is still continuing.

The research on how to deal with time-series data is almost finished. It looks like researchers are trying to come up with better and better techniques, but what is actually happening is that people are trying to predict dependencies that are not actually present in the data, or are using insufficient data!

A good example is stock price prediction. The simple truth is that stock prices depend on many more variables than are present in the typical input time series used with an RNN or LSTM.
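
To make that criticism concrete, here is a minimal sketch (my own illustration, not from the Quora answer) of the "typical" univariate setup it refers to: an LSTM that sees only a window of past prices and nothing else. The synthetic random-walk data, window size, and hidden size are assumptions for illustration; the point is that no external variables (earnings, news, macro indicators) enter the model at all.

```python
# Minimal sketch of a univariate next-price LSTM forecaster (PyTorch).
# All hyperparameters and the synthetic data are illustrative assumptions.
import torch
import torch.nn as nn

class PriceLSTM(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                 # x: (batch, window, 1) past prices
        out, _ = self.lstm(x)             # out: (batch, window, hidden)
        return self.head(out[:, -1, :])   # predict the next price from the last step

# Synthetic "prices": a random walk, whose future the past window cannot
# predict beyond its last value -- stand-in for the missing external drivers.
torch.manual_seed(0)
prices = torch.cumsum(torch.randn(1000), dim=0)

window = 30
X = torch.stack([prices[i:i + window] for i in range(len(prices) - window)]).unsqueeze(-1)
y = prices[window:].unsqueeze(-1)

model = PriceLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    opt.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    opt.step()
    print(f"epoch {epoch}: mse={loss.item():.4f}")
```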

My personal opinion is that LSTM is currently SOTA, but I think another SOTA network will be created.

What do you think about this topic?

submitted by /u/GoBacksIn