feat: add SASL security scheme types for use by Kafka specs #502
Conversation
Welcome to AsyncAPI. Thanks a lot for creating your first pull request. Please check out our contributors guide and the instructions about a basic recommended setup useful for opening a pull request.
Keep in mind there are also other channels you can use to interact with the AsyncAPI community. For more details, check out this issue.
Originally posted by @dalelane in asyncapi/bindings#56 (comment)
* - a new security scheme type to be added

This is all working on the assumption that … If it's not, then there will need to be more to achieve the table above.
versions/2.0.0/asyncapi.md (Outdated)
@@ -1961,7 +1961,7 @@ Defines a security scheme that can be used by the operations. Supported schemes
##### Fixed Fields
Field Name | Type | Applies To | Description
---|:---:|---|---
<a name="securitySchemeObjectType"></a>type | `string` | Any | **REQUIRED**. The type of the security scheme. Valid values are `"userPassword"`, `"apiKey"`, `"X509"`, `"symmetricEncryption"`, `"asymmetricEncryption"`, `"httpApiKey"`, `"http"`, `oauth2`, and `openIdConnect`.
<a name="securitySchemeObjectType"></a>type | `string` | Any | **REQUIRED**. The type of the security scheme. Valid values are `"userPassword"`, `"apiKey"`, `"X509"`, `"symmetricEncryption"`, `"asymmetricEncryption"`, `"httpApiKey"`, `"http"`, `"oauth2"`, and `"openIdConnect"`, `"plain"`, `"scramSha256"`, `"scramSha512"`, `"gssapi"`.
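For context, a minimal sketch of how a Security Scheme Object using one of the proposed types might look, following the YAML style of the existing spec examples (the description text is illustrative, not taken from this PR):

```yaml
# Hypothetical Security Scheme Object using one of the proposed SASL types
type: scramSha256
description: Provide your username and password for SASL/SCRAM-SHA-256 authentication
```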
Wouldn't `plain` be covered already by the `userPassword`?
Would it make sense to reuse the `userPassword` schema, adding an optional field for the encryption? Sorry for the back and forth 😅
I think that depends on whether we want `userPassword` and `X509` to be reusable across protocols, i.e. it's inferred (or we state) that for `kafka`, `userPassword` = SASL plain, `X509` = mutual TLS, `scramShaX` = SASL scram, etc.
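To illustrate the reuse-existing-types reading being discussed here (a sketch only; the scheme names and descriptions are illustrative), a Kafka-oriented document could keep the current types and rely on the protocol to imply the mechanism:

```yaml
# Sketch: reusing existing scheme types, with the meaning inferred from the kafka protocol
components:
  securitySchemes:
    saslCreds:
      type: userPassword   # for kafka, read as SASL/PLAIN
      description: Username and password credentials
    mutualTls:
      type: X509           # for kafka, read as mutual TLS
      description: Client certificate authentication
```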
I guess it becomes a challenge if kafka (or other protocols) adds support for clashing ideas (e.g. SASL X509 + mutual TLS).
> adding an optional field for the encryption

Can we assume that protocol `kafka-secure` means encrypted and `kafka` is plain?
> i.e. it's inferred (or we state) that for `kafka`, `userPassword` = SASL plain, `X509` = mutual TLS, `scramShaX` = SASL scram, etc.

Perhaps it makes sense to add the concept of aliases that, in the end, just reuse the same JSON Schema types? What I would like to avoid is redefining the same concept with a different name and maintaining both at the same time.

> Can we assume that protocol kafka-secure means encrypted and kafka is plain?

Yeah, that's a good point 👍. What should the `-secure` suffix mean here? I guess the question is: are we talking about encryption or the protocol?
> What should the `-secure` suffix mean here? I guess the question is: are we talking about encryption or the protocol?

If we compare `http`/`https`, the `s` represents a secure protocol (via SSL encryption) - at least, I believe that's the case. If so, then that maps accordingly to `kafka`/`kafka-secure`: the `secure` means it's a secure version of the protocol?
> If we compare `http`/`https`, the `s` represents a secure protocol (via SSL encryption)

I agree with this. I think being consistent with the distinction between http and https would be best - so in the same way that using https means encryption and does not require authentication, I think kafka-secure should mean encryption but not require auth.
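A small sketch of how that http/https-style distinction might surface in the servers section (host names and ports are placeholders):

```yaml
servers:
  dev:
    url: dev-broker.example.com:9092
    protocol: kafka          # plaintext listener, no encryption
  prod:
    url: broker.example.com:9093
    protocol: kafka-secure   # TLS-encrypted listener, analogous to https
```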
> Would it make sense to reuse the `userPassword` schema, adding an optional field for the encryption?

I don't really mind.

The root requirement is that there are at least three SASL mechanisms typically used for Kafka that could all be described as username/password-based (SASL/PLAIN, SASL/SCRAM-SHA-256, SASL/SCRAM-SHA-512), so we either need to describe them as different security scheme types, or make this SASL mechanism information a new additional attribute of the security scheme.

Either is totally fine with me, as long as we have a way to articulate the required config.

@fmvilas Any strong preferences?
Not really. Either way is fine for me. Like you said, as long as we have a way to articulate the config, everything is fine. Let's just make sure it's not too verbose and that it's maintainable in the long term.
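For comparison, the alternative weighed above (keeping `userPassword` and adding an attribute that names the SASL mechanism) might have looked roughly like this; the `saslMechanism` field is hypothetical and is not what this PR adds:

```yaml
# Hypothetical alternative (not adopted): a mechanism attribute on userPassword
components:
  securitySchemes:
    saslScramCreds:
      type: userPassword
      saslMechanism: SCRAM-SHA-512   # hypothetical field, shown for comparison only
      description: Username and password credentials for SASL/SCRAM
```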
Kudos, SonarCloud Quality Gate passed!
Updated the PR to address merge conflicts. I think all the comments above are addressed and this is ready for re-review now, please.
I think it's pretty much done on the spec side. The only missing part for me is to update the JSON Schema definition and the JS Parser.

Regarding the JSON Schema definition, create a new document called … Regarding the JS Parser, similar thing: create a PR and make the …

If you meet these two criteria, we'll move this to the next stage (Draft RFC 2). If you're not familiar with JS, we're happy to help on that front.
As discussed in asyncapi/bindings#56, this adds additional security scheme types. The motivation for adding them is to enable description of secured Kafka clusters; however, the security protocols and mechanisms being added are not unique to Kafka, so this commit adds them as generic security schemes so they can be used by other protocols as well.

Contributes to: #466

Signed-off-by: Dale Lane <[email protected]>
Moved the "and" to the end of the list.

Signed-off-by: Dale Lane <[email protected]>
Moving this to the next stage 🎉 It's now a Draft, pending testing it on the JS parser.
In my opinion, we should also add examples for the new schemes, like we have for the others in the list at https://github.com/asyncapi/spec/blob/master/spec/asyncapi.md#security-scheme-object-example. It is also not clear to me what structure of information one has to provide when using …

Thoughts?
Sure, I can add that.

For now, there isn't yet any extra information - the crucial info is the type itself. Knowing that the SASL mechanism is SASL/SCRAM-SHA-512 is what you need to know to connect to the cluster (other than the actual credentials themselves).

Good idea, I can add that.
Contributes to: #466 Signed-off-by: Dale Lane <[email protected]>
Kudos, SonarCloud Quality Gate passed!
What do you think about extending the streetlight example?

@dalelane can you have a look at my last comment?
@fmvilas I checked the streetlights example and I actually revert my request to @dalelane to extend it with SASL, as SASL is for Kafka and the example is for MQTT. The problem is that this example, even though it is for MQTT, has Kafka bindings. So the example is a bit messy, and I would not like to block this PR because of it. I had a quick chat with @dalelane and he, as our Kafka ❤️ AsyncAPI expert, will work on another PR to fix the existing example and provide a separate Kafka example.

@fmvilas I think we are ready to go with these PRs related to SASL. The way to merge should be:
I think this is it 🤔

Yeah, agree. And since we all agree, I'm marking it as 🏁 RFC Stage 3 (Accepted) 🎉 Congrats, @dalelane! It's the first accepted RFC after v2.0.0.

🎉 This PR is included in version 2.1.0-2021-06-release.2 🎉 The release is available on GitHub release. Your semantic-release bot 📦🚀
The main motivation for this was to provide a place to demonstrate the new Kafka-specific security schemes introduced in AsyncAPI 2.1.0. It was also a chance to remove the Kafka bindings from the otherwise-MQTT sample. The need for this was discussed in #502.

Contributes to: #466

Signed-off-by: Dale Lane <[email protected]>
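A rough sketch of what such a separate Kafka example might tie together: a `kafka-secure` server whose security requirement points at one of the new scheme types (names, hosts, and descriptions are illustrative, not taken from the actual example):

```yaml
servers:
  production:
    url: broker.example.com:9093
    protocol: kafka-secure
    security:
      - saslScram: []
components:
  securitySchemes:
    saslScram:
      type: scramSha512
      description: Provide your username and password for SASL/SCRAM authentication
```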
🎉 This PR is included in version 2.2.0-2021-06-release.1 🎉 The release is available on GitHub release. Your semantic-release bot 📦🚀
As discussed in asyncapi/bindings#56, this adds additional security scheme types. The motivation for adding them is to enable description of secured Kafka clusters; however, the security protocols and mechanisms being added are not unique to Kafka, so this commit adds them as generic security schemes so they can be used by other protocols as well.

Fixes #466

Signed-off-by: Dale Lane [email protected]