-
As the title.
Answered by frjnn · Dec 21, 2021
Replies: 3 comments · 5 replies
-
I guess you mean in a generic context, so that …
-
Any more context? What did you try, and what do you want to do? I assume arr0 implements this the same way other dimensionalities do; see the sketch below.
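For what it's worth, array-with-scalar arithmetic behaves the same for zero-dimensional arrays as for any other dimensionality. A minimal sketch, using only ndarray's `arr0` and `arr2` constructors:

```rust
use ndarray::{arr0, arr2};

fn main() {
    let zero_dim = arr0(3.0_f32);
    let matrix = arr2(&[[1.0_f32, 2.0], [3.0, 4.0]]);

    // `array op scalar` works uniformly across dimensionalities:
    let a = &zero_dim + 1.0; // Array0<f32> holding 4.0
    let b = &matrix + 1.0;   // Array2<f32> with every element incremented

    println!("{}\n{}", a, b);
}
```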
-
Something like this:

```rust
fn main() {
    let mut scalar: f32 = 2.0;
    let zero_dim = ndarray::arr0(3.0);
    scalar += &zero_dim;
    println!("{}", scalar);
}
```
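(The `+=` above does not compile, since `f32` has no `AddAssign` impl that accepts an ndarray array.) A working version today goes through `into_scalar()`, the same method the training loop below falls back on. A minimal sketch:

```rust
fn main() {
    let mut scalar: f32 = 2.0;
    let zero_dim = ndarray::arr0(3.0_f32);
    // Extract the single element from the zero-dimensional array first.
    scalar += zero_dim.into_scalar();
    println!("{}", scalar); // 5
}
```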
This code would look a bit less clunky:

```rust
// Trains the model.
for epoch in 0..5 {
    let batched_data = dataset.shuffle().batch(2).drop_last();
    let mut total_loss: f32 = 0.0;

    for (input_array, target_array) in batched_data {
        let input = neuronika::from_ndarray(input_array.to_owned());
        let target = neuronika::from_ndarray(target_array.to_owned());

        let result = model.forward(input);
        let loss = loss::mse_loss(result.clone(), target.clone(), loss::Reduction::Mean);

        loss.forward();
        loss.backward(1.0);
        optimizer.step();

        // Would rather be `total_loss += loss.data()`,
        // or even `total_loss += loss`.
        total_loss += loss.data().clone().into_scalar();
    }

    println!("Loss for epoch {} : {}", epoch, total_loss);
}
```
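For reference, the sugar wished for in the comments above is expressible in plain Rust. Below is a hypothetical sketch; the `Loss` wrapper type and its `AddAssign` impl are illustrative assumptions, not neuronika's actual API:

```rust
use ndarray::{arr0, Array0};
use std::ops::AddAssign;

// Hypothetical wrapper around a zero-dimensional loss value.
struct Loss(Array0<f32>);

// Lets callers write `total_loss += &loss`.
impl AddAssign<&Loss> for f32 {
    fn add_assign(&mut self, rhs: &Loss) {
        // A zero-dimensional array is indexed with the empty tuple `()`.
        *self += rhs.0[()];
    }
}

fn main() {
    let loss = Loss(arr0(0.25_f32));
    let mut total_loss = 0.0_f32;
    total_loss += &loss;
    println!("{}", total_loss); // 0.25
}
```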
Answer selected by frjnn