Long-running RPC handlers block other RPC requests #48
Hi, thanks for reaching out :) Reading the docs, or the code (e.g. the definition of `.other`, the definition of …), you'd probably have better results using …
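For context, `.other` is the proxy each side uses to invoke RPC methods on its peer. A minimal usage sketch, assuming a server at the given URI that exposes a `concat` method (both the URI and the method are illustrative):

```python
import asyncio

from fastapi_websocket_rpc import WebSocketRpcClient, RpcMethodsBase

async def run():
    # `.other` forwards attribute access as RPC calls to the peer
    async with WebSocketRpcClient("ws://localhost:9000/ws", RpcMethodsBase()) as client:
        response = await client.other.concat(a="hello", b=" world")
        print(response.result)  # -> "hello world"

asyncio.run(run())
```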
Thanks for the pointers. Looking more closely, I'm not actually sure I can achieve the behaviour I want as it stands, given what `fastapi_websocket_rpc/fastapi_websocket_rpc/websocket_rpc_endpoint.py` lines 90 to 92 (at `307c149`) do.
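The embedded snippet did not survive extraction, but the pattern being pointed at can be illustrated like this (a paraphrase for explanation, not the library's verbatim code): the reader loop awaits each handler inline, so nothing else is read until it finishes.

```python
# Illustrative paraphrase, not verbatim library code:
async def reader(self):
    while True:
        raw_message = await self.socket.recv()
        # the handler is awaited inline, so a slow handler stalls
        # delivery of every subsequent incoming message
        await self.on_message(raw_message)
```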
I suppose I could use a callback style, like `fastapi_websocket_rpc/fastapi_websocket_rpc/rpc_methods.py` lines 82 to 90 (at `307c149`), but that leaks the result of … I will think about ways to improve this.
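For reference, a fire-and-forget callback pattern along those lines might look like the sketch below. Everything here is illustrative rather than the library's actual API: the method names, the assumption that the methods object can reach its channel via `self.channel`, and the `on_result` callback the caller would need to expose.

```python
import asyncio
import uuid

from fastapi_websocket_rpc import RpcMethodsBase

class CallbackStyleMethods(RpcMethodsBase):
    async def start_slow_work(self, text: str = "") -> str:
        # return immediately with a correlation id; the real work runs
        # in a separate task, so the reader loop is not blocked
        call_id = uuid.uuid4().hex
        asyncio.create_task(self._report_result(call_id, text))
        return call_id

    async def _report_result(self, call_id: str, text: str):
        await asyncio.sleep(1)  # stand-in for the slow work
        # push the result back to the caller as a separate RPC call;
        # `on_result` is a hypothetical method the caller exposes
        await self.channel.other.on_result(call_id=call_id, result=text)
```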
If the rpc method being called is async (meaning it spins off a task to do its work), then that's a non-issue.
That's just a testing utility method.
Say you have the following handler:
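The handler snippet did not survive extraction; given the later description (a one-second `asyncio.sleep`), a plausible reconstruction might look like this (the names are illustrative):

```python
import asyncio

from fastapi_websocket_rpc import RpcMethodsBase

class SlowMethods(RpcMethodsBase):
    async def slow_echo(self, text: str = "") -> str:
        # simulate a long-running handler that yields to the event loop
        await asyncio.sleep(1)
        return text
```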
I adapted the `test_recursive_rpc_calls` test to see what would happen if I called this handler several times simultaneously, using `asyncio.create_task` to schedule all the calls for execution at once rather than waiting for them to complete in order. I expected that while one handler is `asyncio.sleep`ing, the other handlers would be free to run, so regardless of how many times I call the handler, the batch shouldn't take much more than a second to complete (that's the behaviour I would see if I were calling it as a regular async function rather than via the RPC protocol). A sketch of the adapted test is below.
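A hedged sketch of the adapted test (it assumes the `slow_echo` handler sketched above and an already-connected `client`; the real test in the repo differs in its details):

```python
import asyncio
import time

async def call_many(client, n=5):
    start = time.monotonic()
    # schedule every call up front instead of awaiting them in order
    tasks = [asyncio.create_task(client.other.slow_echo(text="hi"))
             for _ in range(n)]
    await asyncio.gather(*tasks)
    elapsed = time.monotonic() - start
    # if the handlers overlap, the batch takes ~1s no matter how large n is
    assert elapsed < 2, f"handlers appear to run serially: took {elapsed:.1f}s"
```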
However, this test fails: rather than running simultaneously, the handlers run one after the other, and the test takes five full seconds to complete...
Is it expected behaviour that only one RPC handler can be executing at any given time?