I did nothing other than set up the quickstart to run it locally:
python -m llama_deploy.apiserver
INFO: Started server process [98674]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:4501 (Press CTRL+C to quit)
INFO: 127.0.0.1:55026 - "POST /deployments/create HTTP/1.1" 200 OK
INFO:llama_deploy.message_queues.simple - Launching message queue server at 127.0.0.1:8001
INFO: Started server process [98674]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
INFO:llama_deploy.message_queues.simple - Consumer ControlPlaneServer-d5a9b353-e3e1-4a09-ace2-46960fd4724f: control_plane has been registered.
INFO: 127.0.0.1:55027 - "POST /register_consumer HTTP/1.1" 200 OK
INFO:llama_deploy.control_plane.server - Launching control plane server at 127.0.0.1:8000
INFO: Started server process [98674]
INFO: Waiting for application startup.
INFO:llama_deploy.services.workflow - Launching echo_workflow server at 0.0.0.0:8002
INFO: Started server process [98674]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO:llama_deploy.services.workflow - Processing initiated.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: Uvicorn running on http://0.0.0.0:8002 (Press CTRL+C to quit)
INFO:llama_deploy.message_queues.simple - Consumer WorkflowService-413b09e4-a3ee-462d-98e0-8c0052b98b92: echo_workflow has been registered.
INFO: 127.0.0.1:55028 - "POST /register_consumer HTTP/1.1" 200 OK
INFO: 127.0.0.1:55029 - "POST /services/register HTTP/1.1" 200 OK
^CINFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [98674]
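For context, the echo_workflow service in the log above is just the quickstart's echo workflow. From memory it looks roughly like this (a sketch, so class and step names may differ slightly from the exact example I copied):

from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step

class EchoWorkflow(Workflow):
    """A trivial workflow with a single step that echoes the input back."""

    @step
    async def echo(self, ev: StartEvent) -> StopEvent:
        # The quickstart passes the input as a "message" argument
        message = str(ev.get("message", ""))
        return StopEvent(result=f"Message received: {message}")

# Instance referenced by the deployment config (e.g. workflow:echo_workflow)
echo_workflow = EchoWorkflow()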
llamactl deploy deployment.yml
Deployment successful: QuickStart
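The deployment.yml is essentially the quickstart one. Reproduced here from memory as a rough sketch, so field names and paths may not match my file exactly:

name: QuickStart

control-plane:
  port: 8000

default-service: echo_workflow

services:
  echo_workflow:
    name: Echo Workflow
    source:
      # Load the workflow code from the local source folder
      type: local
      name: ./src
    path: workflow:echo_workflow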
However, when I try to send a message to it:
llamactl run --deployment QuickStart --arg message 'Hello from my shell!'
Error: 'ModelWrapper' object has no attribute 'tasks'
Any idea? These are the relevant package versions:
pydantic==2.9.2
python-dotenv==1.0.1
llama-index==0.12.1
llama-index-core==0.12.1
llama-deploy==0.3.4
llama-index-llms-azure-openai==0.3.0