Not sure if this issue would be more cozy in dials (where the parameter function dials::activation() is defined) or here (where the parameter function is referenced), but allowing tune to sample values of activation when tuning mlp(engine = "keras") results in errors:
library(tidymodels)
model <-
  mlp(activation = tune()) %>%
  set_engine("keras") %>%
  set_mode("classification")

set.seed(1)

res <- tune_grid(
  workflow(class ~ ., model),
  vfold_cv(sim_classification(1000)),
  grid = 3
)
#> → A | error: `activation` must be one of "linear", "softmax", "relu", or "elu", not "tanh".
#> There were issues with some computations   A: x10
#>
Created on 2025-01-10 with reprex v2.1.1
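In the meantime, a possible workaround (my own sketch, not from the reprex above) is to cap the parameter range yourself: extract the workflow's parameter set, update activation so that dials only samples the values the keras engine accepts (the set listed in the error message), and pass that set to tune_grid() via param_info:

library(tidymodels)

model <-
  mlp(activation = tune()) %>%
  set_engine("keras") %>%
  set_mode("classification")

wf <- workflow(class ~ ., model)

# Restrict the sampled activation values to those the keras engine allows
# (taken from the error message above).
params <-
  extract_parameter_set_dials(wf) %>%
  update(activation = activation(values = c("linear", "softmax", "relu", "elu")))

set.seed(1)
res <- tune_grid(
  wf,
  vfold_cv(sim_classification(1000)),
  grid = 3,
  param_info = params
)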
That error is from parsnip itself: parsnip/R/mlp.R, lines 195 to 196 (at 27df158).
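For context, the error message has the shape of rlang's arg_match() validation. A minimal sketch of that pattern, as an assumption on my part (the actual code at those lines may differ):

# Assumed shape of the check; not the literal code from mlp.R.
library(rlang)

check_activation <- function(activation) {
  arg_match0(
    activation,
    values = c("linear", "softmax", "relu", "elu"),
    arg_nm = "activation"
  )
}

check_activation("tanh")
#> Error: `activation` must be one of "linear", "softmax", "relu", or "elu",
#> not "tanh".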
Possibly a follow-up to #1019.