Compute third stage time-step for RK3 in a way that reduces the accumulation of error #3617
Conversation
```julia
corrected_third_stage_Δt = tⁿ⁺¹ - model.clock.time
tick!(model.clock, corrected_third_stage_Δt)
update_state!(model, callbacks; compute_tendencies)
step_lagrangian_particles!(model, third_stage_Δt)
```
Here we `tick!` with `corrected_third_stage_Δt`, but the other third-stage steps (namely `calculate_pressure_correction!`, `pressure_correct_velocities!`, and `step_lagrangian_particles!`) use the original `third_stage_Δt`. I would assume all of them need to use the corrected version, no?
hmm we can change that, though it shouldn't matter because they are indistinguishable except by machine epsilon?
Ah right, I did consider this. We can change the pressure correction and Lagrangian particles. However, other tracers effectively use the original third-stage Δt via the RK3 substep. So I think it's actually simpler to use the same `third_stage_Δt` for everything (tracers, predictor velocity, pressure correction, Lagrangian particles), but to simply shave off the error accumulated in the model clock during the substeps.
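To illustrate the point about shaving off accumulated clock error, here is a minimal standalone sketch (not Oceananigans code; the coefficient values are the Le & Moin RK3 coefficients that `:RungeKutta3` uses, but `clock_time`, `tⁿ`, and the stage variables are hypothetical names for illustration):

```julia
# The three RK3 substep increments sum to Δt exactly in real arithmetic,
# but ticking the clock three times in floating point can leave the clock
# slightly off tⁿ⁺¹. Computing the third-stage tick as the remainder
# tⁿ⁺¹ - clock_time makes the clock land exactly on tⁿ⁺¹.

# Le & Moin (1991) RK3 coefficients, as in Oceananigans' :RungeKutta3 scheme
γ¹, γ², γ³ = 8/15, 5/12, 3/4
ζ², ζ³     = -17/60, -5/12

Δt   = 25.3      # an arbitrary (hypothetical) time step
tⁿ   = 1e4       # current model time
tⁿ⁺¹ = tⁿ + Δt   # the time the clock should land on

first_stage_Δt  = γ¹ * Δt
second_stage_Δt = (γ² + ζ²) * Δt
third_stage_Δt  = (γ³ + ζ³) * Δt

# Tick the clock through the first two substeps, as the time-stepper does:
clock_time = tⁿ
clock_time += first_stage_Δt
clock_time += second_stage_Δt

# Third stage: tick by the remainder instead of by third_stage_Δt, so no
# round-off from the first two ticks survives in the clock:
corrected_third_stage_Δt = tⁿ⁺¹ - clock_time
clock_time += corrected_third_stage_Δt

# corrected_third_stage_Δt differs from third_stage_Δt only at the level of
# round-off, but the clock now agrees with tⁿ⁺¹ (here exactly: since
# clock_time is within a factor of two of tⁿ⁺¹, the subtraction is exact).
```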
> hmm we can change that, though it shouldn't matter because they are indistinguishable except by machine epsilon?

Yeah, it's more a matter of consistency than anything else, imo.
Right, so my point is that it's more consistent to use the non-corrected time-step everywhere, including in `rk3substep!` (where it is used implicitly), as well as in the pressure correction and the Lagrangian particles step. Do you agree?
Ah, I see what you're saying. Yes I agree.
As per the example below, this PR seems to resolve #3593:

```julia
using Oceananigans

grid_base = RectilinearGrid(topology = (Bounded, Periodic, Bounded),
                            size = (16, 20, 4),
                            extent = (800, 1000, 100))

@inline east_wall(x, y, z) = x > 400
grid = ImmersedBoundaryGrid(grid_base, GridFittedBoundary(east_wall))

model = NonhydrostaticModel(grid = grid,
                            timestepper = :RungeKutta3,
                            buoyancy = BuoyancyTracer(),
                            tracers = :b)

N² = 6e-6
b∞(x, y, z) = N² * z
set!(model, b=b∞)

simulation = Simulation(model, Δt=25, stop_time=1e4)

using Statistics: std
using Printf

progress_message(sim) = @printf("Iteration: %04d, time: %s, iteration×Δt: %s, std(pNHS) = %.2e\n",
                                iteration(sim), sim.model.clock.time, iteration(sim) * sim.Δt,
                                std(model.pressures.pNHS))

add_callback!(simulation, progress_message, IterationInterval(1))

run!(simulation)
```

printing, at the last few time-steps:
I haven't been able to test it yet with more complex simulations, though.
Interesting! Thanks for this, @tomchor.
Does anyone approve?
PS: I think we still may want something like #3606. This just makes small time-steps less likely, but doesn't change the fact that small time-steps are problematic for the pressure correction. (Though I'm wondering if they are only problematic when we have a divergent velocity field, which may be a different issue to fix...)
Sorry, I was fully ready to approve the other day when I had a look, but then I thought I'd wait for the discussion between you and @tomchor to be resolved. Merged now! All good! Moving on!
Yes, I'll keep that open. I think we'll have a better idea after we start using this fix and figure out how many of the small-step cases it prevents.
Should help address some outstanding issues about error accumulation in `model.clock.time`, like #3606 and #3056.