
t9k.em.containers.Params

Params(upload: Callable, init_hparams: Dict[str, Union[str, int, float, bool, None, List[~T], Tuple[], Dict[~KT, ~VT]]] = None)

Container class that holds the hyperparameters of a Run.

It is recommended to set all hyperparameters with a single call to the update method before building the model. Nevertheless, you are free to manipulate hyperparameters like items of a dict or attributes of an object.

Examples

Recommended method of setting hyperparameters:

run.params.update({
    'batch_size': 32,
    'epochs': 10,
})

Assign a parameter like an item of a dict or an attribute of an object:

run.params['batch_size'] = 32
run.params.epochs = 10

Args

  • upload (Callable)

Function called to upload the hyperparameters every time they are updated.

  • init_hparams (Dict[str, Union[str, int, float, bool, None, List[~T], Tuple[], Dict[~KT, ~VT]]])

    Initial hyperparameters.
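To make the constructor's contract concrete, here is a minimal, hypothetical stand-in that mirrors the interface described above: a MutableMapping that stores the hyperparameters, supports both item-style and attribute-style access, and calls the upload function on every change. The class name ParamsSketch and its internals are assumptions for illustration, not the library's implementation.

```python
from collections.abc import MutableMapping


class ParamsSketch(MutableMapping):
    """Hypothetical stand-in mirroring the Params interface above."""

    def __init__(self, upload, init_hparams=None):
        # Bypass our own __setattr__ while wiring up internal state.
        object.__setattr__(self, '_store', dict(init_hparams or {}))
        object.__setattr__(self, '_upload', upload)

    def __getitem__(self, key):
        return self._store[key]

    def __setitem__(self, key, value):
        self._store[key] = value
        self._upload(self._store)  # upload on every update, as documented

    def __delitem__(self, key):
        del self._store[key]
        self._upload(self._store)

    def __iter__(self):
        return iter(self._store)

    def __len__(self):
        return len(self._store)

    def __getattr__(self, key):
        # Attribute-style read access: p.epochs
        try:
            return self._store[key]
        except KeyError:
            raise AttributeError(key)

    def __setattr__(self, key, value):
        # Attribute-style write access: p.epochs = 10
        self[key] = value


uploads = []
p = ParamsSketch(upload=uploads.append, init_hparams={'lr': 0.01})
p['batch_size'] = 32
p.epochs = 10
```

Because the class subclasses collections.abc.MutableMapping, the mixin methods (items, keys, values, update) documented below come for free once the five abstract methods are defined.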

Ancestors

  • collections.abc.MutableMapping

Methods

as_dict

as_dict(self)

Returns the hyperparameters as a plain dict.

items

items(self)

D.items() -> a set-like object providing a view on D's items

keys

keys(self)

D.keys() -> a set-like object providing a view on D's keys

parse

parse(self, dist_tf_strategy=None, dist_torch_model=None, dist_hvd=None)

Parses hyperparameters from objects of various frameworks.

Args

  • dist_tf_strategy

    TensorFlow distribution strategy instance if tf.distribute is used for distributed training.

  • dist_torch_model

PyTorch model wrapped with DP (DataParallel) or DDP (DistributedDataParallel) if torch.distributed is used for distributed training.

  • dist_hvd

The Horovod module in use, such as horovod.keras or horovod.torch, if Horovod is used for distributed training.
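The documentation does not spell out what parse extracts from these framework objects. As one hedged illustration of the idea, the sketch below derives a hyperparameter from a TensorFlow distribution strategy; num_replicas_in_sync is a real tf.distribute.Strategy attribute, while FakeStrategy and parse_dist_hparams are stand-ins so the example runs without TensorFlow installed.

```python
class FakeStrategy:
    """Stand-in for a tf.distribute.Strategy instance."""
    num_replicas_in_sync = 4


def parse_dist_hparams(dist_tf_strategy=None):
    """Hypothetical helper: derive hyperparameters from a strategy object."""
    hparams = {}
    if dist_tf_strategy is not None:
        # Record how many replicas participate in synchronous training.
        hparams['num_replicas'] = dist_tf_strategy.num_replicas_in_sync
    return hparams


parse_dist_hparams(FakeStrategy())  # {'num_replicas': 4}
```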

update

update(self, new_params: Dict[str, Any], override: bool = True)

Updates the current params with new params.

Args

  • new_params (Dict[str, Any])

New params to update with.

  • override (bool)

    Whether to override current params.
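The exact semantics of override are not spelled out above. A plausible reading, sketched here purely as an assumption, is that with override=True a new value replaces an existing one for the same key, while with override=False existing keys are kept and only new keys are added:

```python
def update_params(current: dict, new_params: dict, override: bool = True) -> dict:
    """Hypothetical merge illustrating one possible meaning of `override`."""
    merged = dict(current)
    for key, value in new_params.items():
        # With override=True, new values win; otherwise existing keys are kept.
        if override or key not in merged:
            merged[key] = value
    return merged


params = {'batch_size': 32, 'epochs': 10}
update_params(params, {'epochs': 20, 'lr': 0.001})                  # epochs replaced
update_params(params, {'epochs': 20, 'lr': 0.001}, override=False)  # epochs kept
```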

values

values(self)

D.values() -> an object providing a view on D's values