Hi All —
I’m working on scientific research code for a neuroscience experiment where we need to do density estimation in real time, or close to it. The data comes from multiple time series; it isn’t particularly wide, but it is long (~7 million time points, possibly more at some point), and there is the potential for 1,000+ time series.
I was wondering whether the r/MachineLearning community had suggestions on how to scale this analysis. More specifically, are there software frameworks that would make this easier? Are there alternatives to Gaussian Mixture Models and Kernel Density Estimation (e.g. some sort of neural network density estimation) that would be easier to throw on a GPU because of an existing software package?
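For context, the kind of computation I have in mind looks roughly like the sketch below: a plain Gaussian KDE evaluated in batches on the GPU with PyTorch. The shapes, bandwidth, and batch size are placeholders, not our actual data or settings.

```python
# Minimal sketch: batched Gaussian KDE log-density on GPU with PyTorch.
# All numbers below (dimensions, bandwidth, batch size) are illustrative.
import math
import torch

def gaussian_kde_log_density(train, query, bandwidth, batch_size=4096):
    """Log-density of a Gaussian KDE fit on `train`, evaluated at `query`.

    train:     (n, d) tensor of reference samples
    query:     (m, d) tensor of points to score
    bandwidth: scalar kernel bandwidth, shared across dimensions
    """
    n, d = train.shape
    # Normalizing constant of an isotropic Gaussian kernel, plus the 1/n mixture weight.
    log_norm = -0.5 * d * math.log(2 * math.pi * bandwidth ** 2) - math.log(n)
    out = []
    for start in range(0, query.shape[0], batch_size):
        q = query[start:start + batch_size]            # (b, d)
        sq_dist = torch.cdist(q, train) ** 2           # (b, n) pairwise squared distances
        log_kernels = -0.5 * sq_dist / bandwidth ** 2  # unnormalized log kernel values
        out.append(torch.logsumexp(log_kernels, dim=1) + log_norm)
    return torch.cat(out)

device = "cuda" if torch.cuda.is_available() else "cpu"
train = torch.randn(100_000, 8, device=device)  # placeholder reference samples
query = torch.randn(10_000, 8, device=device)   # placeholder query points
log_p = gaussian_kde_log_density(train, query, bandwidth=0.5)
```

This is just to show the shape of the problem; the question is whether an existing package already does something like this (or a GMM / neural density estimator) at scale, so I don’t have to maintain it myself.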
I realize the answer might be no and I’m going to have to write a bunch of custom code, but I thought I might as well check before I go too far down one rabbit hole. Thanks!
submitted by /u/dingfuus