Is generating the API spec from the FastAPI code intentional?
Further Information
TL;DR
Based on my experience, having a spec first and then generating the data structures for a fairly complex protocol is a good way to go (see the sketch after this list), because:
In a multi-language environment (or a multi-service environment, for that matter) you don't have to worry about API compatibility when deploying things.
You can have different implementations of the client and the server.
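As a rough sketch of what I mean (the app, the `/health` route, and the `openapi.json` file name are just placeholders, not anything from SyftBox): FastAPI already derives an OpenAPI document from the code, so that document could be exported and committed, and clients in other languages could be generated from the file instead of being written against a live server.

```python
# Minimal sketch: export the OpenAPI document that FastAPI derives from the
# code, so it can be committed to the repo and used to generate clients in
# other languages. The app and route below are placeholders, not the actual
# SyftBox server.
import json

from fastapi import FastAPI

app = FastAPI(title="example-server", version="0.1.0")


@app.get("/health")
def health() -> dict:
    return {"status": "ok"}


if __name__ == "__main__":
    with open("openapi.json", "w") as f:
        json.dump(app.openapi(), f, indent=2, sort_keys=True)
```

In a fully spec-first flow it would be the other way around (the committed spec is the source of truth and the server models are generated from it), but even just committing the exported spec makes breaking changes show up in diffs.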
Why do I think those might be applicable
At some point of scaling you will need the backend to be multiple services handling different workflows, possibly written in different languages.
At some point you will most probably have multiple server instances, be it the ones you host vs. the ones that want to be part of your network but are deployed by another entity, and those might be written in different languages.
I'm talking about something far into the future. Why on Earth would I suggest it right now?
Based on my experience, the more mature the system is, the harder it is to flip decisions like this. There are a lot of unknown unknowns out there, the requirements you have right now will most probably change in the future, and the programming language is one of those things :)
The use case I had in mind (the reason I thought about it in the first place)
Really nice idea with the APIs and using the file system as an abstraction over the whole concept. Let's say I have an idea of integrating this into a project which would gather some data, have some fixed set of commands to operate over that data, and send it to an aggregating server. I do have a couple of problems:
I want this to be my private server (think a private segment of the Internet; SyftBox is, in a nutshell, a toolkit which allows me to build "federated networks", not sure if that's the right term, but basically I mean a set of compute nodes with storage on which I can execute some commands).
I want this client to be part of the system, not an arbitrary binary with Python running there (the interpreter plus all the libraries would be a costly thing, plus wrapping it all in a Docker image...).
I know that in theory I can build a Rust client with a fixed set of those APIs in under a MB or two. I have no problem looking at the server's OpenAPI and implementing a client following that; I know that with my forked server it'll continue working, but it might become painful if I want to connect to a public net and/or sync my fork, because I'll have to update those things manually (see the sketch below).
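To make that pain point concrete, here is a rough sketch of the kind of check a hand-written client ends up carrying on its own (the URL, the file name, and the comparison itself are all assumptions for illustration, not anything SyftBox provides). FastAPI serves the generated document at `/openapi.json` by default, so the client can at least detect drift against the spec it was written for:

```python
# Rough sketch: a hand-written client detecting that the server's spec has
# drifted from the pinned copy it was implemented against. The URL and file
# name are placeholders, not real SyftBox endpoints.
import json
import urllib.request

PINNED_SPEC = "openapi.json"  # the spec this client was implemented against
SERVER_SPEC_URL = "http://localhost:8000/openapi.json"  # FastAPI's default location


def load_pinned() -> dict:
    with open(PINNED_SPEC) as f:
        return json.load(f)


def fetch_live() -> dict:
    with urllib.request.urlopen(SERVER_SPEC_URL) as resp:
        return json.load(resp)


def main() -> None:
    pinned, live = load_pinned(), fetch_live()
    # A very coarse check: compare only the sets of paths. A real check would
    # also diff parameters and schemas.
    missing = set(pinned.get("paths", {})) - set(live.get("paths", {}))
    if missing:
        raise SystemExit(f"server no longer exposes: {sorted(missing)}")
    print("all paths the client relies on are still present")


if __name__ == "__main__":
    main()
```

With a versioned spec and generated client code, this kind of check becomes a regeneration step at build time rather than something each fork has to maintain by hand.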
That's why I wanted to ask this question before getting into the implementation myself. After all, I think the main value of a piece of system software is its spec or API; there can be multiple implementations, but what matters is the capabilities and how to use them.
I hope that I'm not way off from the whole concept and would love to hear your thoughts on this :)
We're still at a pretty early phase and we'll probably have many API changes that are not backwards compatible over the coming months. I think this is a great idea to add once we are in a less experimental stage.