
Secrets not being updated after patch #240

Open
ErikTMA opened this issue Dec 9, 2022 · 13 comments


ErikTMA commented Dec 9, 2022

Describe the bug
We use ArgoCD to deploy applications that contain the destination secrets. If a secret is patched through Argo, it is not updated by the replicator. For example:
We install a new application - the secret is correctly updated by the replicator.
We make an update to the app, causing ArgoCD to reconcile, which patches the secret. The secret is not updated by the replicator afterwards.

To Reproduce
Update any destination secret in a pull configuration, e.g. kubectl apply -f secret.yaml.
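
For illustration, a minimal destination secret for such a pull setup might look like this (all names and namespaces here are placeholders):

apiVersion: v1
kind: Secret
metadata:
  name: my-tls-secret        # placeholder
  namespace: destination-ns  # placeholder
  annotations:
    # pull the data from the source secret in the source-ns namespace
    replicator.v1.mittwald.de/replicate-from: source-ns/my-tls-secret
type: kubernetes.io/tls
data:
  # empty keys; the replicator fills these in from the source
  tls.crt: ""
  tls.key: ""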

Expected behavior
I expect the secret to be updated whenever it is patched.

Environment:

  • Kubernetes version: 1.22.15-gke.100
  • kubernetes-replicator version: 2.7.3
ErikTMA added the bug label Dec 9, 2022
@eugenberend

@ErikTMA hi, what is your secret type?
I can confirm this issue with kubernetes.io/tls secret type.

ErikTMA (Author) commented Feb 7, 2023

> @ErikTMA hi, what is your secret type? I can confirm this issue with kubernetes.io/tls secret type.

It's a tls secret as well.

@yevgeniyo-ps

Seeing the same problem. We are using external secrets for accessing ECR (the ECR token TTL is about 12 hours), which means we have to regenerate the token every few hours. We wanted to copy the secret with kubernetes-replicator, but we are now seeing that when the original secret is updated, the cloned ones are not.
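
As a sketch of that setup (names and the namespace pattern are placeholders; the annotations are the replicator's documented pull-permission ones, so double-check them against the README):

apiVersion: v1
kind: Secret
metadata:
  name: ecr-token      # placeholder; refreshed by external-secrets every few hours
  namespace: tooling   # placeholder
  annotations:
    # allow destination secrets in matching namespaces to pull from this one
    replicator.v1.mittwald.de/replication-allowed: "true"
    replicator.v1.mittwald.de/replication-allowed-namespaces: "team-.*"  # placeholder pattern
type: kubernetes.io/dockerconfigjson
data:
  .dockerconfigjson: e30K  # placeholder payload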


strowi commented Jul 19, 2024

Same here, deploying a secret:

apiVersion: v1
data:
  .dockerconfigjson:  e30K
kind: Secret
metadata:
  annotations:
    replicator.v1.mittwald.de/replicate-from: app/registry.gitlab.com
  name: registry.gitlab.com
  namespace: frontend-develop
type: kubernetes.io/dockerconfigjson

The first time this works, and the secret gets updated to:

apiVersion: v1
data:
  .dockerconfigjson:  ......
kind: Secret
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"v1","data":{".dockerconfigjson":"e30K"},"kind":"Secret","metadata":{"annotations":{"replicator.v1.mittwald.de/replicate-from":"app/registry.gitlab.com"},"name":"registry.gitlab.com","namespace":"frontend-develop"},"typ
e":"kubernetes.io/dockerconfigjson"}
    replicator.v1.mittwald.de/replicate-from: app/registry.gitlab.com
    replicator.v1.mittwald.de/replicated-at: "2024-07-19T14:34:27Z"
    replicator.v1.mittwald.de/replicated-from-version: "3717"
    replicator.v1.mittwald.de/replicated-keys: .dockerconfigjson
  creationTimestamp: "2024-07-18T09:54:35Z"
  name: registry.gitlab.com
  namespace: frontend-develop
  resourceVersion: "1108217"
  uid: 841a62f5-ad03-4472-8dae-63b34cb630d1
type: kubernetes.io/dockerconfigjson

Then, re-applying it yields the first version again, with the replicator annotations still present:

apiVersion: v1
data:
  .dockerconfigjson: e30K
kind: Secret
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"v1","data":{".dockerconfigjson":"e30K"},"kind":"Secret","metadata":{"annotations":{"replicator.v1.mittwald.de/replicate-from":"app/registry.gitlab.com"},"name":"registry.gitlab.com","namespace":"frontend-develop"},"type":"kubernetes.io/dockerconfigjson"}
    replicator.v1.mittwald.de/replicate-from: app/registry.gitlab.com
    replicator.v1.mittwald.de/replicated-at: "2024-07-19T14:34:27Z"
    replicator.v1.mittwald.de/replicated-from-version: "3717"
    replicator.v1.mittwald.de/replicated-keys: .dockerconfigjson
  creationTimestamp: "2024-07-18T09:54:35Z"
  name: registry.gitlab.com
  namespace: frontend-develop
  resourceVersion: "1108756"
  uid: 841a62f5-ad03-4472-8dae-63b34cb630d1
type: kubernetes.io/dockerconfigjson

After removing the version annotation, it gets updated correctly again.

I'm not sure how exactly kubernetes-replicator works, but this might be related to kubectl apply not touching fields that are missing from the applied manifest.
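
A quick way to observe this (hypothetical check, using the names from above): kubectl apply performs a three-way merge between the last-applied configuration, the new manifest, and the live object, so annotations that only exist on the live object - such as the replicator's version marker - survive the apply:

kubectl get secret registry.gitlab.com -n frontend-develop \
  -o jsonpath='{.metadata.annotations.replicator\.v1\.mittwald\.de/replicated-from-version}'
# -> still prints "3717" after re-applying, so the replicator considers the
#    copy current and skips it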

PS: I might have found a workaround: adding the annotation replicator.v1.mittwald.de/replicated-from-version: '' to the secret on the pull end makes the secret update each time.
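
In manifest form, that workaround on the pull side looks like this (same secret as above):

apiVersion: v1
kind: Secret
metadata:
  name: registry.gitlab.com
  namespace: frontend-develop
  annotations:
    replicator.v1.mittwald.de/replicate-from: app/registry.gitlab.com
    # blank version marker: after every apply it no longer matches the
    # source's resourceVersion, so the replicator re-syncs the data
    replicator.v1.mittwald.de/replicated-from-version: ""
type: kubernetes.io/dockerconfigjson
data:
  .dockerconfigjson: e30K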

stippi2 (Contributor) commented Nov 6, 2024

This issue is fixed (for secrets and config maps at least) when you use the new --sync-by-content toggle. More details about the root cause are in the linked issue.


bygui86 commented Nov 13, 2024

@stippi2 so with --sync-by-content enabled, this issue should be solved, right?
Thanks a lot for the support!


stippi commented Nov 13, 2024

It should be, but there is no new release, yet.


bygui86 commented Nov 13, 2024

@stippi2 / @stippi ah, ok. In fact, I upgraded to v2.10.2 and the flag is not recognised :( When do you plan to release it?


stippi commented Nov 13, 2024

I am not a maintainer here, sorry. I was hoping @martin-helmich would trigger a new release. :-)

martin-helmich (Member) commented Nov 13, 2024

Yep, hang on... New release is building now. 🙂 ⏳

EDIT:
Here you go: https://github.com/mittwald/kubernetes-replicator/releases/tag/v2.11.0
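
For reference, a sketch of enabling the new flag, assuming the replicator runs as a plain Deployment (container name and image reference are illustrative; adjust to however you installed it, e.g. the Helm chart):

spec:
  template:
    spec:
      containers:
        - name: kubernetes-replicator
          image: quay.io/mittwald/kubernetes-replicator:v2.11.0  # assumed image reference
          args:
            - --sync-by-content  # sync based on content instead of the version annotation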


stippi commented Nov 13, 2024

@bygui86 Once you had a chance to test the new release, please let us know if the issue is fixed for you! :-D


bygui86 commented Nov 13, 2024

@stippi thanks a lot!


bygui86 commented Nov 13, 2024

@stippi startup is fine, the --sync-by-content flag is accepted, and a Secret got synced, so I think it works for now :) Thanks!
