Hello again! I just realized after a few weeks of sitting on this that I didn’t tell you guys about something really cool I’ve been working on! Maybe I should do that.
A few months ago I posted about DeOldify, my pet project for colorizing and restoring old photos. Well, it now handles videos too! Here’s a demo I showed at Facebook’s F8 conference:
And here’s the talk at F8: https://www.facebook.com/FacebookforDevelopers/videos/340167420019712/
And here’s the article I wrote with Jeremy Howard of FastAI and Uri Manor of Salk Institute: https://www.fast.ai/2019/05/03/decrappify/
Anyway, the gist is that we (FastAI and I) developed this one weird trick called NoGAN to achieve this. The basic recipe looks like this (slide from F8):
The pretraining uses a basic perceptual loss (or “feature loss”) for the generator, followed by only a brief window of actual GAN training. This gets you the benefits of GANs without the usual problems. Hence, smooth video!
The training progression looks like this (the sweet spot is at 1.4%; past that it goes too far and gets weird, with orange skin):
Anyway, that’s the gist. You can read more in the links above (the README for the GitHub project also has good details on NoGAN). Oh, by the way: NoGAN also works on super-resolution, and I suspect it works for most image-to-image tasks, and perhaps even non-image tasks.
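To make the schedule concrete, here’s a minimal, hypothetical NumPy sketch of the NoGAN idea on a toy 1-D problem (learn y = 2x). This is my own illustration of the phases described above — pretrain the generator with a feature loss, pretrain the critic, then fine-tune adversarially only briefly — not DeOldify’s actual fastai implementation, and the “feature extractor” here is just a fixed scaling standing in for a pretrained network:

```python
# Toy NoGAN schedule sketch (assumption: mirrors the idea, not DeOldify's code).
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=200)
Y = 2.0 * X                      # ground-truth targets ("color" stand-in)

a = 1.5                          # fixed "feature extractor": f(y) = a * y

# --- Phase 1: pretrain generator g(x) = w*x with perceptual (feature) loss ---
w, lr = 0.0, 0.01
for _ in range(300):
    # loss = mean((a*w*x - a*y)^2); analytic gradient w.r.t. w:
    grad = np.mean(2 * a * X * (a * w * X - a * Y))
    w -= lr * grad

# --- Phase 2: pretrain critic d(y) = sigmoid(u*y + b) on frozen-generator fakes ---
u, b = 0.0, 0.0
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
for _ in range(100):
    fake, real = w * X, Y
    # binary cross-entropy gradients (real label 1, fake label 0)
    du = np.mean((sigmoid(u * real + b) - 1) * real) + np.mean(sigmoid(u * fake + b) * fake)
    db = np.mean(sigmoid(u * real + b) - 1) + np.mean(sigmoid(u * fake + b))
    u -= 0.05 * du
    b -= 0.05 * db

# --- Phase 3: very brief adversarial fine-tune (stop early: the "sweet spot") ---
for _ in range(5):
    fake = w * X
    # gradient of -log d(fake) w.r.t. w
    gw = np.mean(-(1 - sigmoid(u * fake + b)) * u * X)
    w -= 0.001 * gw

print(round(w, 2))
```

The point of the structure is that almost all of the learning happens in the cheap, stable feature-loss phase, so the GAN phase only needs a tiny number of steps, which is why the video output stays temporally consistent.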
Looks like AutoML has by far the biggest carbon footprint of the training processes studied (394,863 CO2e).
Are you interested in deep learning for NLP but also concerned about the CO2 footprint of training? You should be! Excited to share our work “Energy and Policy Considerations for Deep Learning in NLP” at @ACL2019_Italy! With @ananya__g and @andrewmccallum. Preprint coming soon.
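For a rough sense of how such figures are produced, here’s a hedged back-of-the-envelope sketch of converting training energy to CO2e. The 0.954 lbs-CO2e-per-kWh factor is the U.S. EPA average power-mix figure I believe the paper uses, and the 1.58 PUE is an industry-average datacenter overhead; treat both constants, and the helper function itself, as my assumptions rather than the paper’s exact accounting:

```python
# Hypothetical CO2e estimate for a training run (constants are assumptions).
LBS_CO2E_PER_KWH = 0.954   # assumed U.S. average grid emission factor
DEFAULT_PUE = 1.58         # assumed datacenter power-usage effectiveness

def co2e_lbs(gpu_watts: float, n_gpus: int, hours: float, pue: float = DEFAULT_PUE) -> float:
    """Estimated pounds of CO2e: GPU draw * count * time * PUE, converted to kWh."""
    kwh = gpu_watts * n_gpus * hours * pue / 1000.0
    return kwh * LBS_CO2E_PER_KWH

# Hypothetical run: 8 GPUs at 250 W each for 240 hours
print(round(co2e_lbs(250, 8, 240), 1))
```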
CMATERdb is the pattern recognition database repository created at the ‘Center for Microprocessor Applications for Training Education and Research’ (CMATER) research laboratory, Jadavpur University, Kolkata 700032, INDIA. This database is free for all non-commercial uses.
Please acknowledge CMATER explicitly whenever you use this database for academic or research purposes.