I’m an undergrad about to graduate, and I’m debating whether to go to graduate school for some form of AI. What I fear is that every bit of progress made in the past decade was either hype, or due to faster computation, or due to using ReLU rather than sigmoid.
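To make the ReLU-vs-sigmoid point concrete, here is a minimal NumPy sketch (not part of the original post) of the usual argument for why that switch mattered: sigmoid gradients saturate toward zero for large inputs, while the ReLU gradient stays at 1 for positive inputs, which makes deep networks easier to train.

# Minimal sketch: compare sigmoid and ReLU gradients at a few input values.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # peaks at 0.25, vanishes for large |x|

def relu_grad(x):
    return (x > 0).astype(float)  # stays at 1 for positive inputs

xs = np.array([-10.0, -2.0, 0.0, 2.0, 10.0])
print("x:           ", xs)
print("sigmoid grad:", np.round(sigmoid_grad(xs), 5))  # ~0 at the extremes
print("relu grad:   ", relu_grad(xs))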
IMO AI can be broken down into 3 parts: the algorithm for inference, knowledge representation, and computational speed. Of these 3, have we really made any progress in representation and algorithms in the past decade? Besides getting better GPUs, everything else was already discovered decades ago. Can anyone change my mind? Are we going to have AI winter 2.0 soon?
submitted by /u/uoftsuxalot