Interfaces common to all `Modules` and `Models` in Gale.

class Configurable[source]

Configurable() :: ABC

Helper class to instantiate an object from a config.

This class provides a common interface for modules so that they can be easily loaded from a Hydra config file. It also supports instantiation via hydra.

Configurable.from_config_dict[source]

Configurable.from_config_dict(config:DictConfig, **kwargs)

Instantiates the object from a DictConfig-based configuration. You can optionally pass in extra kwargs.

Configurable.to_config_dict[source]

Configurable.to_config_dict()

Returns the object's configuration as a config dictionary.
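
For reference, here is a minimal sketch of the round trip. The `MyBackbone` class, its import path and the config fields are hypothetical placeholders; only `from_config_dict` and `to_config_dict` come from the interface above.

```python
# Hypothetical example: MyBackbone and its import path are placeholders.
from omegaconf import OmegaConf

from my_project.backbones import MyBackbone  # assumed Configurable subclass

cfg = OmegaConf.create({"in_channels": 3, "depth": 50})  # made-up fields

# Instantiate from a DictConfig; extra kwargs are optional.
backbone = MyBackbone.from_config_dict(cfg, pretrained=False)

# Recover the configuration the object was built from.
print(OmegaConf.to_yaml(backbone.to_config_dict()))
```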

class BasicModule[source]

BasicModule() :: Module

Abstract class offering the interface that should be implemented by all Backbones, Heads and Meta Archs in Gale.

Any Module that is registered in Gale should inherit from this class or one of its subclasses.

BasicModule.forward[source]

BasicModule.forward()

The main logic for the model lives here. It can return features, logits or a loss.

BasicModule.build_param_dicts[source]

BasicModule.build_param_dicts()

Should return the iterable of parameters to optimize or dicts defining parameter groups for the Module.
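
A minimal sketch of a concrete `BasicModule`, assuming it can be imported from gale's core package (the exact import path may differ); the `TinyHead` class and its hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn

from gale.core import BasicModule  # assumed import path

class TinyHead(BasicModule):  # hypothetical example module
    def __init__(self, in_features: int = 512, num_classes: int = 10):
        super().__init__()
        self.fc = nn.Linear(in_features, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Return logits; forward may also return features or a loss.
        return self.fc(x)

    def build_param_dicts(self):
        # A single parameter group with a module-specific learning rate.
        return [{"params": self.fc.parameters(), "lr": 1e-3}]
```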

BasicModule.param_lists[source]

Returns the list of parameters in the module.

BasicModule.all_params[source]

BasicModule.all_params(n=slice(None, None, None), with_grad=False)

List of param_groups up to n.

BasicModule.freeze[source]

BasicModule.freeze()

Freeze all parameters for inference and set the model to eval mode.

BasicModule.freeze_to[source]

BasicModule.freeze_to(n:int)

Freeze parameter groups up to n

BasicModule.unfreeze[source]

BasicModule.unfreeze()

Unfreeze all parameters for training.

BasicModule.as_frozen[source]

BasicModule.as_frozen()

Context manager which temporarily freezes a module, yields control and finally unfreezes the module.
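
The freezing helpers can be combined as in the sketch below, which reuses the hypothetical `TinyHead` module from the earlier example.

```python
import torch

module = TinyHead()  # hypothetical module from the sketch above

module.freeze()           # freeze everything and switch to eval mode
module.unfreeze()         # make all parameters trainable again
module.freeze_to(1)       # freeze parameter groups up to index 1

with module.as_frozen():  # temporarily frozen inside this block
    preds = module(torch.randn(2, 512))
# the module is unfrozen again once the context exits
```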

Utility functions

get_callable_name[source]

get_callable_name(fn_or_class:object)

get_callable_dict[source]

get_callable_dict(fn:Union[Callable, Mapping, Sequence])

setup_metrics[source]

setup_metrics(metrics:Union[Metric, Mapping, Sequence, NoneType])

class DefaultTask[source]

DefaultTask(cfg:DictConfig, trainer:Optional[Trainer]=None, metrics:Union[Metric, Mapping, Sequence, NoneType]=None) :: LightningModule

Interface for PyTorch Lightning based Gale modules.
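
A hedged construction sketch; the config contents, the import path and the use of a bare `DefaultTask` (rather than a concrete subclass) are assumptions for illustration only.

```python
from omegaconf import OmegaConf
from pytorch_lightning import Trainer
from torchmetrics import Accuracy

from gale.core import DefaultTask  # assumed import path

cfg = OmegaConf.create({"optimization": {"name": "adam", "lr": 3e-4}})  # made-up schema
trainer = Trainer(max_epochs=10)

# The Trainer must be passed for training; metrics may be a single Metric,
# a mapping of name -> Metric, or a sequence of Metrics.
task = DefaultTask(
    cfg=cfg,
    trainer=trainer,
    metrics=Accuracy(task="multiclass", num_classes=10),  # constructor depends on your torchmetrics version
)
```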

Properties

DefaultTask.metrics[source]

Property that returns the metrics for the current Lightning Task

DefaultTask.param_dicts[source]

Property that returns the param dicts for optimization. Override for custom training behaviour. Currently returns all the trainable parameters.

DefaultTask._is_model_being_restored[source]

Whether the model is being used for inference or training. For training it is mandatory to pass in the Trainer when initializing the class.

Training loop

DefaultTask.forward[source]

DefaultTask.forward(x:Tensor)

The forward method of the LightningModule; users should override this method.

DefaultTask.shared_step[source]

DefaultTask.shared_step(batch:Any, batch_idx:int, stage:str)

The common training/validation/test step; override it for custom behavior. This step is shared between the training, validation and test steps, where stage is train, val or test respectively. Your training logic should go here rather than in the training/validation/test step methods directly. This step must return a dictionary containing the loss to optimize and the values to log.
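
For example, a classification task might override shared_step as in the hypothetical sketch below; only the return contract (a dict containing the loss and values to log) follows the description above.

```python
import torch.nn.functional as F

from gale.core import DefaultTask  # assumed import path

class ClassificationTask(DefaultTask):  # hypothetical subclass
    def shared_step(self, batch, batch_idx, stage):
        x, y = batch
        logits = self(x)
        loss = F.cross_entropy(logits, y)
        # stage is "train", "val" or "test" depending on the calling step.
        return {"loss": loss, f"{stage}/loss": loss.detach()}
```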

DefaultTask.training_step[source]

DefaultTask.training_step(batch:Any, batch_idx:int)

The training step of the LightningModule. For common use cases you do not need to override this method; see DefaultTask.shared_step().

DefaultTask.validation_step[source]

DefaultTask.validation_step(batch:Any, batch_idx:int)

The validation step of the LightningModule. For common use cases you do not need to override this method; see DefaultTask.shared_step().

DefaultTask.test_step[source]

DefaultTask.test_step(batch:Any, batch_idx:int)

The test step of the LightningModule. For common use cases you do not need to override this method; see DefaultTask.shared_step().

Model Optimization

DefaultTask.num_training_steps[source]

DefaultTask.num_training_steps()

Total training steps inferred from train dataloader and devices.

DefaultTask.configure_optimizers[source]

DefaultTask.configure_optimizers()

Choose what optimizers and learning-rate schedulers to use in your optimization. See https://pytorch-lightning.readthedocs.io/en/latest/common/optimizers.html

DefaultTask.process_optim_config[source]

DefaultTask.process_optim_config(opt_conf:DictConfig)

Preprocesses the optimization config and adds some inferred values such as max_steps and max_epochs. This method also fills in the values for max_iters, epochs and steps_per_epoch if they are set to -1.

DefaultTask.setup_optimization[source]

DefaultTask.setup_optimization(conf:DictConfig=None)

Prepares an optimizer from a string name and its optional config parameters. You can also manually call this method with a valid optimization config to set up the optimizers and lr_schedulers.

DefaultTask.build_optimizer[source]

DefaultTask.build_optimizer(opt_conf:DictConfig, params:Any)

Builds a single optimizer from opt_conf. params is the parameter dict containing the weights for the optimizer to optimize.

DefaultTask.build_lr_scheduler[source]

DefaultTask.build_lr_scheduler(opt_conf:DictConfig, optimizer:Optimizer)

Builds the learning-rate scheduler for the current task and optimizer.
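
A hedged sketch of driving the optimization setup manually; the config keys shown (name, lr, weight_decay, ...) are assumptions and may not match gale's real optimization schema.

```python
from omegaconf import OmegaConf

# `task` is a DefaultTask subclass as in the earlier sketches.
opt_conf = OmegaConf.create({
    "name": "adamw",          # hypothetical optimizer-name key
    "lr": 3e-4,
    "weight_decay": 1e-2,
    "max_epochs": -1,         # -1 values are filled in by process_optim_config
    "steps_per_epoch": -1,
})

task.setup_optimization(opt_conf)
optim_output = task.configure_optimizers()  # format follows the Lightning contract
```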

DataLoader

DefaultTask.train_dataloader[source]

DefaultTask.train_dataloader()

Returns the DataLoader used for training.

DefaultTask.val_dataloader[source]

DefaultTask.val_dataloader()

Returns the DataLoader (or list of DataLoaders) used for validation.

DefaultTask.test_dataloader[source]

DefaultTask.test_dataloader()

Returns the DataLoader (or list of DataLoaders) used for testing.

DefaultTask.setup_training_data[source]

DefaultTask.setup_training_data(*args, **kwargs)

Sets up the data loader to be used in training.

DefaultTask.setup_validation_data[source]

DefaultTask.setup_validation_data(*args, **kwargs)

Sets up the data loader(s) to be used in validation.

DefaultTask.setup_test_data[source]

DefaultTask.setup_test_data(*args, **kwargs)

(Optionally) sets up the data loader to be used in testing.
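
The data hooks could be wired up roughly as below; the dataset, the internal attribute name and the override of train_dataloader are placeholders, since the base class's storage convention is not documented here.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

class ClassificationTask(DefaultTask):  # continuing the hypothetical subclass
    def setup_training_data(self, *args, **kwargs):
        # Stand-in dataset; replace with your real training data.
        ds = TensorDataset(torch.randn(64, 512), torch.randint(0, 10, (64,)))
        self._train_dl = DataLoader(ds, batch_size=8, shuffle=True)  # placeholder attribute

    def train_dataloader(self):
        return self._train_dl
```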