Investigate "microbatch" incremental strategy #477

Open
ian-r-rose opened this issue Nov 13, 2024 · 1 comment
Comments

@ian-r-rose
Member

dbt 1.9 introduces a new incremental strategy called "microbatch". It seems like it could be a great fit for our current large incremental time-series models. It enables behaviors like the following:

  1. Targeted backfills. Previously we had two options: run the most recent data, or do a full table refresh. Microbatch lets us do more specific backfills, like "refresh the last week of data".
  2. Less manual configuration of lookbacks. We might be able to remove or significantly simplify make_model_incremental() (see the sketch after this list).
  3. Breaking large table builds into multiple steps. This probably doesn't matter much for Snowflake (though I would be interested to be proven wrong!), but on some databases it could help with resource usage.
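For concreteness, here is a minimal sketch of what a microbatch model config could look like, based on the dbt 1.9 documentation. The model name, column name, and parameter values below are hypothetical placeholders, not our actual models:

```sql
-- Hypothetical model file: models/sessions_daily.sql
-- event_time: timestamp column dbt uses to slice the data into batches
-- begin:      earliest date to build when doing a full refresh
-- batch_size: each batch covers one day of data
-- lookback:   reprocess the trailing 3 batches on every regular run
{{
    config(
        materialized='incremental',
        incremental_strategy='microbatch',
        event_time='session_start',
        begin='2020-01-01',
        batch_size='day',
        lookback=3
    )
}}

select *
from {{ ref('stg_sessions') }}
```

A targeted backfill (item 1) would then be a command-line concern rather than a config one, something like `dbt run --select sessions_daily --event-time-start "2024-11-01" --event-time-end "2024-11-08"` (assuming the 1.9 flags behave as documented).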

I think an experimental branch testing this new strategy would be a great idea. Since it relies on dbt-core 1.9, it would probably need to wait for the version upgrade @summer-mothwood is doing in #352.

@jkarpen

jkarpen commented Nov 22, 2024

Next step is to have a team discussion on this topic.
