[R] MGBPv2: Scaling Up Multi-Grid Back-Projection Networks (Winner of AIM ICCV19 Extreme-SR, Perceptual track)

(16x upscaling example from paper)

Authors: Pablo Navarrete Michelini, Wenbin Chen, Hanwen Liu, Dan Zhu

Abstract: We describe our solution for the AIM-2019 Extreme Super-Resolution Challenge, where we won 1st place in perceptual quality (MOS closest to the ground truth) and placed 5th in fidelity (PSNR). To tackle this challenge, we introduce the second generation of Multi-Grid Back-Projection networks (MGBPv2), whose major modifications make the system scalable and more general than its predecessor. It combines the scalability of the multigrid algorithm with the performance of iterative back-projections. In its original form, MGBP is limited to a small number of parameters because of its strongly recursive structure. In MGBPv2 we make full use of the multigrid recursion from the beginning of the network, allow different parameters in every module, simplify the main modules, and adjust the number of network features to the scale of operation. For inference, we introduce an overlapping-patch approach that allows processing of very large images (e.g. 8K). Our training strategy uses a multiscale loss that combines distortion and/or perception losses on the output as well as on downscaled versions of the output. The final system can balance between high quality and high performance.
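
The overlapping-patch inference mentioned in the abstract can be illustrated with a short sketch: split a very large input into overlapping tiles, super-resolve each tile, and cross-fade the overlapping outputs so no seams appear. This is not the authors' implementation; the tile size, overlap, blending window, and the `model` callable (assumed to map an H×W×C patch to an (H·scale)×(W·scale)×C array) are all assumptions made for illustration.

```python
# Minimal sketch of overlapping-patch super-resolution inference (assumed
# details, not the MGBPv2 code). Assumes the image is at least patch x patch.
import numpy as np

def upscale_by_patches(image, model, scale=16, patch=64, overlap=16):
    """Super-resolve `image` (H x W x C, float32) by running `model` on
    overlapping patches and blending the results with a linear ramp."""
    h, w, c = image.shape
    out = np.zeros((h * scale, w * scale, c), dtype=np.float32)
    weight = np.zeros((h * scale, w * scale, 1), dtype=np.float32)

    step = patch - overlap
    # Blending window: 1 in the centre, ramping to ~0 at the patch border,
    # so overlapping outputs are cross-faded instead of hard-stitched.
    ramp = np.minimum(np.arange(1, patch * scale + 1),
                      np.arange(patch * scale, 0, -1)).astype(np.float32)
    ramp = np.minimum(ramp / (overlap * scale), 1.0)
    window = np.outer(ramp, ramp)[:, :, None]

    for y in range(0, max(h - overlap, 1), step):
        for x in range(0, max(w - overlap, 1), step):
            y0, x0 = min(y, h - patch), min(x, w - patch)
            tile = image[y0:y0 + patch, x0:x0 + patch]
            sr = model(tile)  # assumed: (patch*scale, patch*scale, C) output
            oy, ox = y0 * scale, x0 * scale
            out[oy:oy + patch * scale, ox:ox + patch * scale] += sr * window
            weight[oy:oy + patch * scale, ox:ox + patch * scale] += window

    # Normalise by accumulated weights so every output pixel is a proper
    # weighted average of the patches that covered it.
    return out / np.maximum(weight, 1e-8)
```

Blending with a window that decays toward the patch border hides the tile boundaries, at the cost of some redundant computation in the overlap regions; this is what makes patch-wise processing of 8K-scale outputs practical on limited GPU memory.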

PDF Link | Landing Page | GitHub

submitted by /u/pnavarre