Learn About Our Meetup

5000+ Members



Join our meetup, learn, connect, share, and get to know your Toronto AI community. 



Browse through the latest deep learning, AI, and machine learning job postings from Indeed for the GTA.



If you are looking to sponsor space, be a speaker, or volunteer, feel free to give us a shout.

[D] Is meta-learning the holy grail?

I have been reading up on meta-learning for the past few days, and I came across a study (sorry, I lost the link) in which the authors were able to decrease the computational resources needed to build machine learning models by 90-99% in some cases. I believe they used a simple genetic algorithm to design different ML models.

It seems that this could be the holy grail. If you can figure out how to implement meta-learning effectively, then you could dramatically improve a model's results.

I'm currently experimenting with a genetic algorithm for model design in Keras on the Fashion-MNIST dataset, but I have yet to come up with a more effective model than the one given on the TensorFlow website for this task (although I haven't finished my experiment yet). I can find sparse information on the topic outside of academic literature. But it seems like it should almost be the primary focus if you are designing model architectures from scratch, no? Why aren't more people doing this?
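For readers unfamiliar with the approach, here is a minimal sketch of what a genetic algorithm over model architectures might look like. Everything here is an assumption for illustration: the genome encoding (a list of hidden-layer widths), the layer-width choices, and the fitness function are all made up. In a real Keras experiment, `fitness` would build the model from the genome, train it briefly on Fashion-MNIST, and return the validation accuracy.

```python
import random

# A "genome" encodes a candidate architecture as a list of hidden-layer widths,
# e.g. [128, 64] -> two dense layers. This encoding is a simplifying assumption.
LAYER_CHOICES = [32, 64, 128, 256]

def random_genome(max_layers=3):
    """Sample a random architecture with 1..max_layers hidden layers."""
    return [random.choice(LAYER_CHOICES)
            for _ in range(random.randint(1, max_layers))]

def fitness(genome):
    """Placeholder score. In a real run, decode the genome into a Keras
    model, train for a few epochs on Fashion-MNIST, and return the
    validation accuracy instead of this stand-in formula."""
    return sum(genome) / (1 + 50 * len(genome))

def crossover(a, b):
    """Splice two parent genomes at a random cut point."""
    cut = random.randint(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.2):
    """Randomly resample each layer width with probability `rate`."""
    return [random.choice(LAYER_CHOICES) if random.random() < rate else w
            for w in genome]

def evolve(pop_size=20, generations=10, elite=4):
    """Keep the top `elite` genomes each generation and breed the rest."""
    pop = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:elite]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - elite)]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("best architecture (hidden-layer widths):", best)
```

The expensive part in practice is the fitness evaluation, since each candidate requires a (partial) training run; that is also where most of the compute savings claimed by meta-learning papers would have to come from.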

Here is a video on the topic if you are unfamiliar with it.

submitted by /u/cryptonewsguy

Toronto AI is a social and collaborative hub to unite AI innovators of Toronto and surrounding areas. We explore AI technologies in digital art and music, healthcare, marketing, fintech, VR, robotics, and more. Toronto AI was founded by Dave MacDonald and Patrick O'Mara.