
[IKS] Missing IAM auth policy from kube->kms #5007

Closed
Ak-sky opened this issue Dec 22, 2023 · 1 comment
Labels
service/Kubernetes Service (Issues related to Kubernetes Service Issues)

Comments


Ak-sky commented Dec 22, 2023

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions; they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

Terraform CLI and Terraform IBM Provider Version

Terraform Version v1.5.7
Terraform IBM Provider Version v1.60.0

Affected Resource(s)

  • ibm_container_vpc_cluster

Terraform Configuration Files

While deploying a ROKS cluster in a fresh enterprise sub-account with all the necessary permissions, I am getting the error below:

Error: Request failed with status code: 401, ServerErrorResponse: {"incidentID":"2a173666-bfa3-4a34-8c33-e66c9b717917","code":"E09c0","description":"Not authorized to access the Key Management Service. Create an IBM Cloud IAM authorization policy to give the source Kubernetes Service delegate access to the target Key Management Service, and try again.","type":"Authentication"}

It looks like an auth policy is missing / not getting created for kube->kms, which I think can also be seen in the TF trace logs:

maybeTainted:module.roks_landing_zone.module.landing_zone.ibm_container_vpc_cluster.cluster["policy-management-cluster"] encountered an error during creation, so it is now marked as tainted


And as per the docs, the auth policy should be created automatically if it is not present, which does happen successfully when the cluster is deployed via the UI.
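A possible workaround, until the policy is created automatically on this path, is to declare the authorization policy explicitly in Terraform so it exists before the cluster is created. This is only a sketch, assuming the KMS is Key Protect (target service name kms; Hyper Protect Crypto Services would use hs-crypto) and that the instance is managed in the same configuration under a hypothetical ibm_resource_instance.kms_instance:

# Hedged sketch: give IBM Cloud Kubernetes Service Reader access to the KMS
# instance explicitly, instead of relying on the policy being auto-created.
# "kms_instance" is a placeholder reference to the Key Protect instance.
resource "ibm_iam_authorization_policy" "kube_to_kms" {
  source_service_name         = "containers-kubernetes"
  target_service_name         = "kms"
  target_resource_instance_id = ibm_resource_instance.kms_instance.guid
  roles                       = ["Reader"]
}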

Please include all Terraform configurations required to reproduce the bug. Bug reports without a functional reproduction may be closed without investigation.

# Copy-paste your Terraform configurations here - for large Terraform configs,
# please share a link to the ZIP file.

Debug Output

TF apply stdout logs:
stdout_landing-zone-roks_TFA_15.12.2023-20.09.36.log

Also added TF trace log snippets (the full file is more than 69 MB, so I am unable to upload it) for each failure incident ID: trace_59a47c55-2f5b-40d1-ac45-6114e9c58af5.log and trace_caf9535c-0310-4dc0-918b-18164643a306.log
trace_caf9535c-0310-4dc0-918b-18164643a306.log
trace_59a47c55-2f5b-40d1-ac45-6114e9c58af5.log

Panic Output

Expected Behavior

  • The cluster should be deployed without any issues
  • An additional Reader auth policy between IBM Cloud Kubernetes Service and Key Protect should be created automatically for the cluster (see the sketch after this list)
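For context, a minimal sketch of how the cluster side consumes the key once that policy exists; the kms_config block and the depends_on ordering are the relevant parts, and every identifier below is a placeholder, not taken from the failing configuration:

# Hedged sketch only: placeholder names/IDs, not the actual failing config.
resource "ibm_container_vpc_cluster" "cluster" {
  name              = "policy-management-cluster"
  vpc_id            = ibm_is_vpc.vpc.id
  kube_version      = "4.13_openshift"              # ROKS cluster
  cos_instance_crn  = ibm_resource_instance.cos.crn # required for OpenShift clusters
  flavor            = "bx2.4x16"
  worker_count      = 3
  resource_group_id = data.ibm_resource_group.group.id

  zones {
    subnet_id = ibm_is_subnet.subnet.id
    name      = "us-south-1"
  }

  # Envelope encryption of cluster secrets with the Key Protect root key
  kms_config {
    instance_id      = ibm_resource_instance.kms_instance.guid
    crk_id           = ibm_kms_key.key.key_id
    private_endpoint = false
  }

  # Ensure the kube->kms authorization policy exists before the cluster
  # tries to register the KMS, so the 401 above cannot occur mid-apply.
  depends_on = [ibm_iam_authorization_policy.kube_to_kms]
}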

Actual Behavior

  • The cluster is unable to access the KMS instance.

Steps to Reproduce

  1. terraform apply

Important Factoids

References

  • #0000
github-actions bot added the service/Kubernetes Service label Dec 22, 2023

Ak-sky commented Feb 26, 2024

Issue resolved, closing this.

Ak-sky closed this as completed Feb 26, 2024