changed intervals from uniform to log uniform, made learning rate range larger
simplymathematics committed Dec 3, 2023
1 parent 1712c3b commit d428e48
Showing 1 changed file with 3 additions and 3 deletions.
6 changes: 3 additions & 3 deletions examples/power/conf/torch_cifar100.yaml
@@ -43,9 +43,9 @@ hydra:
     n_jobs: 1
     params:
       ++data.sample.random_state: int(range(0, 1))
-      ++model.art.initialize.optimizer.lr: tag(log, interval(0.000001, 1))
-      ++model.trainer.nb_epoch: int(interval(1, 100))
-      ++model.trainer.batch_size: int(interval(1, 10000))
+      ++model.art.initialize.optimizer.lr: tag(log, interval(0.000001, 100))
+      ++model.trainer.nb_epoch: tag(log, int(interval(1, 100)))
+      ++model.trainer.batch_size: tag(log, int(interval(1, 10000)))
       ++attack.init.eps : interval(0.01, 1.0)
     _target_: hydra_plugins.hydra_optuna_sweeper.optuna_sweeper.OptunaSweeper
     direction: ${direction}
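The switch from uniform to log-uniform (`tag(log, ...)`) matters when a search range spans several orders of magnitude, as the widened learning-rate interval now does: a uniform draw over (1e-6, 100) would almost never produce a value below 1. A minimal sketch of log-uniform sampling in plain Python (the helper name `log_uniform` and the sample count are illustrative, not part of this repository; the Hydra Optuna sweeper maps these tags to Optuna's `suggest_float`/`suggest_int` with `log=True`):

```python
import math
import random

def log_uniform(low, high, rng):
    # Draw uniformly in log space, then exponentiate: every order of
    # magnitude in [low, high] receives equal probability mass.
    return math.exp(rng.uniform(math.log(low), math.log(high)))

rng = random.Random(0)
# Same bounds as the learning-rate line in the sweep config above.
samples = [log_uniform(1e-6, 100, rng) for _ in range(1000)]

# Roughly half the draws fall below the geometric midpoint
# sqrt(1e-6 * 100) = 0.01, whereas a uniform draw over (1e-6, 100)
# would put almost all samples above 1.
below = sum(s < 0.01 for s in samples) / len(samples)
```

The same reasoning applies to `nb_epoch` and `batch_size`: wrapping them in `tag(log, int(interval(...)))` biases the sweep toward the small end of each range instead of spending most trials on large values.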
