
Update to HyperParameters

Hello.

I want to merge my branch into lightnet's develop branch. Should I just send the merge request, or do I need to explain something about it here first? This is my first time sending a merge request and I am a bit confused by the process.

Basically, what I added is the ability to load multiple config files and to pass arguments to them. I like HyperParameters.from_file, since it is much more flexible than a plain JSON/YAML file for setting parameters. However, it is not modular, because it can only load a single config file.

My update allows multiple config files to set the parameters and networks, and lets us pass arguments or flags while loading the configs. With this we can control the configuration flow, e.g. load pre-trained weights into the network or start from scratch.

This update does add a constraint: each config must define a function that takes params plus keyword arguments and returns a HyperParameters object.

I also made some minor updates to the HyperParameters object itself, such as making it iterable.
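
To illustrate the iterable part, here is a minimal usage sketch. I assume here that iterating yields (name, value) pairs; the exact behavior is in the diff:

# Hypothetical usage of the new iterable behavior;
# assuming iteration yields (name, value) pairs.
params = HyperParameters(lr=0.001, momentum=0.9)
for name, value in params:
    print(name, value)
# lr 0.001
# momentum 0.9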

Here is an example of the multi-config loading:

# In train.py
params = HyperParameters.from_files([
    (
        "configs/init_cfg.py",
        {"backup": backup},
    ),
    (
        args.network,  # path to a particular network config
        {"weight": args.weight},
    ),
    (
        "configs/other_hyperparameters.py",
        {"lr": args.lr},
    ),
])
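
For reference, here is roughly the idea behind from_files (a simplified, standalone sketch, not the exact code in my branch; the actual method lives on HyperParameters):

import runpy

def from_files(configs):
    # `configs` is a list of (path, kwargs) tuples.
    params = None
    for path, kwargs in configs:
        namespace = runpy.run_path(path)              # execute the config file
        params = namespace["main"](params, **kwargs)  # chain the HyperParameters
    return params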


# In configs/init_cfg.py
def main(params, **kwargs):
    # Can also be defined as `main(params, backup)` if we know
    # the names of the keyword arguments that will be passed.
    # `params` is `None` if this is the first config to be loaded;
    # otherwise it is the HyperParameters object returned by the
    # previously loaded config file.

    if kwargs.get("backup"):
        pass  # do something with the backup here

    # rest of the code handling params
    return params
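
For completeness, a network config could look like this. This one is illustrative only: the `weight` keyword matches the kwargs dict passed from train.py, and I assume the network was already stored on the params object:

# In a network config, e.g. the file passed as args.network.
# Illustrative sketch; `weight` matches the kwargs in train.py and
# `params.network` is assumed to be set by an earlier config.
def main(params, weight=None):
    if weight is not None:
        params.network.load(weight)  # load pre-trained weights
    return params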

I will also write some tests to demonstrate this update.