Client RPC migration #509
Conversation
Rebased on staging after #506 merged. Staging is now the base branch for this PR.
Only two handlers have been converted to server streams so far.
Should go into the task list so it's clearer.
While migrating tests I noticed that the vaults clone, pull and scan handlers don't have tests. I'll need to create some for them.
Those 2 are particularly tricky due to the GRPC hacks we had to do. With the new RPC system, they have to simulate a stream for the git operations to run over. Did you manage to simplify the whole architecture? That is, a duplex stream that "switches" over to binary data and then passes Git's HTTP protocol transparently over it? There were also random hacks for getting exceptions out of isogit so that they would turn into GRPC exceptions; this should also be considered.
BTW, the above is probably more related to the agent service, and not so much the client service. But the performance of vault cloning/sync was so bad previously. Please look into benchmarking this so we can have fast cloning and sync; it should be much faster now, without so many layers of abstraction, once QUIC and our new RPC are fully integrated.
Are all the fixmes going to be addressed in this PR?
Hey @tegefaulkes, remember to address my comments above too. If there needs to be an issue for the agent migration and git tunneling, please remember to create it too.
As for the
So I have two options: I make this data optional so I don't have to fill it in, or I add some way to provide this information. We should have the information available, but it doesn't feel right that the
Just a note: when generating test data that needs to be stringified and parsed as JSON, any values that are `-0` become `0` and anything `undefined` will be removed. This has the effect that a round trip of stringifying and parsing a JSON object will sometimes not be equal. This can result in random test failures.
I know that fast-check generators should have something dedicated to JSON. Not all JS objects are valid JSON.
I think I'll leave removing the GRPC code for the agent migration; it's much easier to remove it all at once then. Otherwise this should be good to merge after some clean-up.
- created agent handlers and tests
- adding gestalts handlers
- migrating gestalts handlers
- migrating identities handlers
- adding identities handlers
- adding identities handlers
- adding keys handlers
- migrating nodes handlers
- migrating notifications handlers
- migrating vaults handlers
- migrating vaults handlers [ci skip]
- migrating keys tests
- migrating nodes tests
- migrating notifications tests
- migrating notifications tests
- migrating vaults tests [ci skip]
- removing client GRPC and using agnostic RPC
- updating agent bin code
- updating identities bin code
- updating keys bin code
- updating nodes bin code
- migrating secrets bin code
- migrating vaults bin code [ci skip]
[ci skip]
Cleaned up commit history.
This should be good to merge now.
I think #510 is the next thing for me to work on, assuming @CMCDragonkai is still working on the QUIC stuff; otherwise I can start on that. I can make a start on #512, but I can't test that it works without QUIC working. #511 is low priority; things still work without it, we just miss out on some expected behaviour with key changes.
I won't be getting to QUIC anytime soon, so you can start on #510.
Cleaning up history now. Almost ready for merge.
… injected into the `PolykeyAgent`. [ci skip]
Previously the `clientManifest` depended on the whole code base due to the handlers. This has now been separated out, so it only depends on types. [ci skip]
…d one for now [ci skip]
[ci skip]
… better for debugging
… `detail` property, made `RPCServer` extend `EventTarget`
…etPort()` and other fixes [ci skip]
…il reconstruction via transform stream, and reverse pair propagation of cancellation event
73b162f
to
043066d
Compare
This is ready to merge. All that is left to do is housekeeping.
What housekeeping?
I just had to update or create issues before I could resolve the last review comment. It's done now. I checked and there are no docs changes committed. I've re-enabled CI, so this should be good to merge any time now.
Description
This PR deals with migrating the client RPC handlers and code to using the new agnostic RPC system. It consists of taking the existing handler code within `client/services` and transplanting it into the new `RPC` system from #249.

Pagination
Any handlers that stream data need to support the pagination parameters and propagate them. The required parameters are `seek: id`, `number: number`, and `order: 'asc'|'desc'`.
Most code that streams an output with a generator already supports pagination, but in some cases where it is not already supported it needs to be updated. As for the handler parameters, the message should include the following.

Binary data
Currently there are two cases where sending binary data is applicable. The first is IDs such as `NodeID` or `VaultId`. These are converted to an encoded string form just for readability of the message, using the `NodeIdEncoded` and `VaultIdEncoded` forms of the IDs.

The 2nd case is file contents. Currently these are converted to a binary string when provided within a message, using `Buffer.toString('binary')`.
This is pretty limiting to small amounts of data, so it is subject to change. If we need to send larger files, we need to stream them using a raw handler implemented as a raw binary stream. This has complications, such as not supporting any middleware, so we would need to implement authentication logic within the raw handler.

Standard message structure
To standardise the message structure and types across all messages, we are leveraging TypeScript's type system. Since it's pretty powerful, we can compose messages and message types together effectively. For example, we have basic message types that can be composed to make more complex messages.
Issues Fixed
Tasks
Final checklist