diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json
index 99c4f1b..7596b23 100644
--- a/dev/.documenter-siteinfo.json
+++ b/dev/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.4","generation_timestamp":"2024-08-02T06:47:55","documenter_version":"1.5.0"}}
\ No newline at end of file
+{"documenter":{"julia_version":"1.10.4","generation_timestamp":"2024-08-02T06:59:08","documenter_version":"1.5.0"}}
\ No newline at end of file
diff --git a/dev/examples/custom_loss_layer/index.html b/dev/examples/custom_loss_layer/index.html
index 58d0840..b0f0a04 100644
--- a/dev/examples/custom_loss_layer/index.html
+++ b/dev/examples/custom_loss_layer/index.html
@@ -60,4 +60,4 @@
model_loss = SimpleChains.add_loss(model, BinaryLogitCrossEntropyLoss(Y));
SimpleChains.valgrad!(gradients, model_loss, X, parameters)
Alternatively, if you want to train the parameters on the full dataset for a set number of epochs:
epochs = 100
SimpleChains.train_unbatched!(gradients, parameters, model_loss, X, SimpleChains.ADAM(), epochs);
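For context, the following is a minimal, self-contained sketch of the same workflow. The small model, the random X/Y data, and the use of SimpleChains' built-in SquaredLoss (standing in for the custom BinaryLogitCrossEntropyLoss defined earlier on the diffed page) are illustrative assumptions, not part of the diff above.

using SimpleChains

# Toy model and data (assumptions for illustration only).
model = SimpleChain(
  static(4),                 # 4 input features
  TurboDense(tanh, 16),
  TurboDense(identity, 1)    # single output
)
X = rand(Float32, 4, 128)    # 128 fake samples
Y = rand(Float32, 1, 128)    # matching fake targets

parameters = SimpleChains.init_params(model)
model_loss = SimpleChains.add_loss(model, SquaredLoss(Y))  # built-in loss as a stand-in
gradients  = SimpleChains.alloc_threaded_grad(model_loss)

# A single loss-plus-gradient evaluation:
SimpleChains.valgrad!(gradients, model_loss, X, parameters)

# Or train on the full data set for a fixed number of epochs:
epochs = 100
SimpleChains.train_unbatched!(gradients, parameters, model_loss, X, SimpleChains.ADAM(), epochs);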