
Failed to create PK constraint if a table already has a PK constraint on another column or columns #68

Open
kokorin opened this issue Mar 29, 2024 · 2 comments

kokorin commented Mar 29, 2024

Some of our models use incremental materialization. One of these models contains history we want to keep, so we disabled full_refresh for it.

We wanted to change the PK column after adding a synthetic (generated) column.

dbt_constraints doesn't check whether a PK constraint already exists and should be dropped.
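
For context, a rough sketch of what the setup looks like (model and column names below are hypothetical): the model is incremental with full_refresh disabled, and the dbt_constraints.primary_key test is attached to the new synthetic column.

# models/schema.yml (model and column names are hypothetical)
version: 2

models:
  - name: my_history_model
    config:
      materialized: incremental
      full_refresh: false
    columns:
      - name: synthetic_id   # new generated column that should now carry the PK
        tests:
          - dbt_constraints.primary_key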

sfc-gh-dflippo (Collaborator) commented

Even with all the new features just released in version 1.0.0, this is still true. There isn't any logic to drop any existing constraints if they have changed. I have added this as a limitation in the README.md.

kokorin (Author) commented Jul 26, 2024

We ended up with a dedicated dbt pre-hook to drop the constraint in Snowflake:

{% set relation = load_relation( this ) %}
{# prefilled lookup_cache is required by dbt_constraints #}
{%- set lookup_cache = {
    'table_columns': { },
    'table_privileges': { },
    'unique_keys': { },
    'not_null_col': { },
    'foreign_keys': { } } -%}
{% if relation is not none and dbt_constraints.unique_constraint_exists(relation, ['previous_unique_column'], lookup_cache) %}
  alter table {{ this }} drop primary key;
{% else %}
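  {# no-op branch so the hook still emits a valid SQL statement #}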
  select 1
{% endif %}
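
A minimal sketch of how this can be wired in, assuming the snippet above is saved as a macro named drop_stale_primary_key under macros/ (the macro name is hypothetical):

-- in the model file; the pre-hook calls the hypothetical macro above
{{ config(
    materialized='incremental',
    full_refresh=false,
    pre_hook="{{ drop_stale_primary_key() }}"
) }}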
