Callbacks
HyperparameterScheduler
```python
larq.callbacks.HyperparameterScheduler(
    schedule,
    hyperparameter,
    optimizer=None,
    update_freq="epoch",
    verbose=0,
    log_name=None,
)
```
Generic hyperparameter scheduler.
Example
```python
import larq as lq
import tensorflow as tf
from larq.callbacks import HyperparameterScheduler

# Route binary variables to Bop and all other variables to Adam.
bop = lq.optimizers.Bop(threshold=1e-6, gamma=1e-3)
adam = tf.keras.optimizers.Adam(0.01)
optimizer = lq.optimizers.CaseOptimizer(
    (lq.optimizers.Bop.is_binary_variable, bop), default_optimizer=adam
)

# Decay Bop's `gamma` by a factor of 10 every 30 epochs.
callbacks = [
    HyperparameterScheduler(lambda epoch: 0.001 * (0.1 ** (epoch // 30)), "gamma", bop)
]
```
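The resulting `callbacks` list is then passed to `model.fit`. A minimal sketch of a full training call, assuming a hypothetical toy model with one binarized `QuantDense` layer and random placeholder data; the `optimizer` and `callbacks` variables come from the example above:

```python
import numpy as np

# Hypothetical toy model: one binarized hidden layer, float output layer.
model = tf.keras.models.Sequential([
    lq.layers.QuantDense(
        32,
        input_quantizer="ste_sign",
        kernel_quantizer="ste_sign",
        kernel_constraint="weight_clip",
        input_shape=(10,),
    ),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=optimizer, loss="mse")

# Random placeholder data; the scheduler updates Bop's `gamma` once per epoch.
x, y = np.random.randn(100, 10), np.random.randn(100, 1)
model.fit(x, y, epochs=5, callbacks=callbacks)
```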
Arguments
- schedule (`Callable`): a function that takes an epoch index as input (integer, indexed from 0) and returns a new hyperparameter value as output.
- hyperparameter (`str`): the name of the hyperparameter to be scheduled.
- optimizer (`keras.optimizers.optimizer_v2.optimizer_v2.OptimizerV2 | None`): the optimizer that contains the hyperparameter to be scheduled. Defaults to `self.model.optimizer` if `optimizer` is `None`.
- update_freq (`str`, optional): how often to update the hyperparameter; either `"epoch"` (default) or `"step"`.
- verbose (`int`): verbosity mode; 0: quiet, 1: update messages.
- log_name (`str | None`, optional): the name under which to log this hyperparameter to TensorBoard. If `None`, defaults to `hyperparameter`. Use this if you have several schedules for the same hyperparameter on different optimizers, as in the sketch after this list.
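Two schedules that target the same hyperparameter name on different optimizers need distinct `log_name` values to keep their TensorBoard curves apart. A minimal sketch, assuming two hypothetical `SGD` optimizers from the same training setup; the decay constants are illustrative, and it is assumed here that with `update_freq="step"` the schedule receives the current step index rather than the epoch index:

```python
sgd_a = tf.keras.optimizers.SGD(0.1)
sgd_b = tf.keras.optimizers.SGD(0.01)

callbacks = [
    # Epoch-based schedule, logged to TensorBoard as "learning_rate_a".
    HyperparameterScheduler(
        lambda epoch: 0.1 * (0.5 ** (epoch // 10)),
        "learning_rate",
        optimizer=sgd_a,
        verbose=1,
        log_name="learning_rate_a",
    ),
    # Step-based schedule, logged as "learning_rate_b".
    HyperparameterScheduler(
        lambda step: 0.01 * (0.999 ** step),  # assumed to receive the step index
        "learning_rate",
        optimizer=sgd_b,
        update_freq="step",
        log_name="learning_rate_b",
    ),
]
```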