
[P] Anyfig – configuration manager. Argparse alternative. Great if you have complex configurations

Hi everyone!

In my own ML projects, I’ve been using Python classes to define my settings such as batch size and learning rate. In many ways, it has been more powerful than Argparse, so I thought I’d make it into a package. Without any more fuss, introducing Anyfig!! (GitHub repo)

Anyfig is a Python library for creating configurations (settings) at runtime. Anyfig utilizes Python classes, which lets the developer put anything in the config, from strings to custom objects. Hence the name Any(con)fig.

The basics

  1. Decorate a class with ‘@anyfig.config_class’.
  2. Add config-parameters as attributes in the class.
  3. Call the ‘setup_config’ function to instantiate the config object.

    import anyfig
    import random

    @anyfig.config_class
    class FooConfig():
        def __init__(self):
            # Config-parameters go as attributes
            self.experiment_note = 'Changed some stuff'
            self.seed = random.randint(0, 80085)

    config = anyfig.setup_config(default_config='FooConfig')
    print(config)
    print(config.seed)
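
Since the config is just a Python class, attributes aren’t limited to strings and numbers. As a small illustrative sketch (the class and attribute names below are made up for this example, not taken from the Anyfig docs), you can store a pathlib.Path or even a callable directly:

    import pathlib
    import anyfig

    @anyfig.config_class
    class BarConfig():  # illustrative name, not from the Anyfig docs
        def __init__(self):
            # Any Python object can be a config-parameter
            self.data_dir = pathlib.Path('data/raw')
            self.normalize = lambda x: (x - 0.5) / 0.25

    config = anyfig.setup_config(default_config='BarConfig')
    print(config.data_dir / 'train.csv')  # data/raw/train.csv
    print(config.normalize(0.75))         # 1.0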

At the moment, Anyfig supports inheritance, command-line inputs, and saving & loading, with more ideas in the pipeline. Check the GitHub repo for more info or reach out to me.

The benefits might not be obvious at first glance, so let me try to sell this.

  1. Defining config-parameters at runtime offers dynamic settings and lets you leverage the full power of Python. It also cleans up the code. For comparison, here’s the kind of boilerplate a .json config typically leads to:

{"loss_fn": "l1loss"} # Lives inside a .json file config = ~parse_jsonfile~ if config.loss_fn = 'l1loss': loss_fn = torch.nn.L1Loss() elif config.loss_fn = 'mse' loss_fn = torch.nn.MSELoss() 

Skip the if/else statements. You can do those kinds of operations directly in the config:

    @anyfig.config_class
    class Train():
        def __init__(self):
            self.loss_fn = torch.nn.L1Loss()

  2. Class inheritance avoids duplicated code. Good for big projects. How often have you started training with the wrong settings? Anyfig can help mitigate that problem.

A typical use case for Anyfig would be to have one config-class for training, one for laptop-debugging and one for prediction. Since many of the config-parameters are the same, you can choose one class as the parent class and have the other two inherit from the parent and overwrite selected config-parameters.

    import anyfig
    import random

    @anyfig.config_class
    class Train():
        def __init__(self):
            self.batch_size = 32
            self.seed = random.randint(0, 80085)

    @anyfig.config_class
    class DebugTrain(Train):
        def __init__(self):
            super().__init__()
            self.seed = 0

    @anyfig.config_class
    class Predict(Train):
        def __init__(self):
            super().__init__()
            self.batch_size = 1
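
To actually run with one of these variants, you can pass its class name to setup_config, the same call shown in the basics example; the sketch below assumes any decorated class name works as the default (the post also mentions command-line selection, but the exact flag syntax isn’t shown, so check the GitHub repo for that):

    # Instantiate the debug variant instead of the base Train config.
    # Assumes setup_config accepts any decorated class name as default_config,
    # as in the basics example above.
    config = anyfig.setup_config(default_config='DebugTrain')
    print(config.batch_size)  # 32, inherited from Train
    print(config.seed)        # 0, overwritten in DebugTrain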

I’m currently looking for people who are willing to try this out / give feedback 🙂

submitted by /u/machinemask