
[D] is meta-learning the holy grail?

I have been reading up on meta-learning for the past few days, and I came across a study (sorry, I lost the link) in which the authors were able to decrease the computational resources needed to build machine learning models by 90-99% in some cases. I believe they used a simple genetic algorithm to design different ML models.

It seems that this could be the holy grail: if you could figure out how to implement meta-learning effectively, you could dramatically improve a model's results for the same effort.

I’m currently experimenting with a genetic algorithm for model design in Keras on the Fashion-MNIST dataset, but I have yet to come up with a more effective model than the one given on the TensorFlow website for this task (although I haven’t finished my experiment yet). I can find only sparse information on the topic outside of academic literature. But it seems like it should almost be the primary focus if you are designing model architectures from scratch, no? Why aren’t more people doing this?
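For anyone curious what that kind of experiment looks like, here is a minimal sketch of the evolve-evaluate-select loop. All names here are hypothetical, and `evaluate` is a placeholder stand-in: in the real experiment it would build a `keras.Sequential` model from the genome, train it briefly on Fashion-MNIST, and return validation accuracy. Here it just rewards a moderate total capacity so the sketch runs without TensorFlow:

```python
import random

# Each "genome" encodes a small dense network as a list of hidden-layer widths.

WIDTHS = (32, 64, 128, 256)

def random_genome(max_layers=3):
    return [random.choice(WIDTHS) for _ in range(random.randint(1, max_layers))]

def evaluate(genome):
    # Placeholder fitness. In a real run: build the Keras model described by
    # `genome`, train a few epochs on Fashion-MNIST, return val accuracy.
    return 1.0 / (1.0 + abs(sum(genome) - 300))

def crossover(a, b):
    # Single-point crossover on the two layer lists.
    child = a[: random.randint(0, len(a))] + b[random.randint(0, len(b)):]
    return child or [random.choice(WIDTHS)]  # never return an empty genome

def mutate(genome, rate=0.2):
    return [random.choice(WIDTHS) if random.random() < rate else w for w in genome]

def evolve(pop_size=20, generations=10):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=evaluate, reverse=True)
        parents = population[: pop_size // 2]  # truncation selection
        children = [
            mutate(crossover(random.choice(parents), random.choice(parents)))
            for _ in range(pop_size - len(parents))
        ]
        population = parents + children
    return max(population, key=evaluate)

best = evolve()
print(best, evaluate(best))
```

The expensive part is `evaluate`: every fitness call is a full (if short) training run, which is why most published work on this spends its effort on cheap proxies for accuracy rather than on the evolutionary loop itself.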

Here is a video on the topic if you are unfamiliar with it: https://www.youtube.com/watch?v=2z0ofe2lpz4

submitted by /u/cryptonewsguy