Celery: Offload tasks to a queue
Copy/pasted from Aymeric's documentation (Google Docs).
Source: https://medium.com/swlh/python-developers-celery-is-a-must-learn-technology-heres-how-to-get-started-578f5d63fab3 (detailed explanations + video)
“To recap: [Flask] creates a task (Python function) and tells Celery to add it to the queue. Celery puts that task into Redis (freeing [Flask] to continue working on other things). On a separate server, Celery runs workers that can pick up tasks. Those workers listen to Redis. When the new task arrives, one worker picks it up and processes it, logging the result back to Celery.”
Install Redis (see https://medium.com/swlh/python-developers-celery-is-a-must-learn-technology-heres-how-to-get-started-578f5d63fab3), then pip install 'redis' and 'celery' in the virtual environment of your app. In the __init__.py file, you need to import Celery (from celery import Celery), specify the broker and result backend addresses, and create the Celery app. This step is normally already done, but you might have to change the addresses. My current understanding is that the 'CELERY_BROKER_URL' address is where the tasks are sent by Flask, and that 'result_backend' is where the results are stored.
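A minimal sketch of what that __init__.py setup might look like, assuming a Redis server on the default localhost:6379 and a Celery app named celery_app (the exact names and URLs in NLP4All's __init__.py may differ, so check the real file):

```python
# nlp4all/__init__.py (sketch -- names and URLs are assumptions)
from flask import Flask
from celery import Celery

# Both the broker (where Flask sends tasks) and the result backend
# (where workers store results) point at the local Redis server here.
CELERY_BROKER_URL = "redis://localhost:6379/0"
CELERY_RESULT_BACKEND = "redis://localhost:6379/0"

app = Flask(__name__)

# Create the Celery app; functions decorated with @celery_app.task
# will be sent to the broker and picked up by the workers.
celery_app = Celery(
    app.import_name,
    broker=CELERY_BROKER_URL,
    backend=CELERY_RESULT_BACKEND,
)
```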
You need to have a Redis server up on your machine. To do so, just run 'redis-server'. The default port is 6379, but you can always check it by launching the server (the port is written in your terminal when it starts). It must match what is written in the __init__.py file! Then, in another terminal, launch a Celery worker from the virtual environment of your app and at the root of the app directory. The command is: celery -A <flask_app name>.<celery_app name> worker --loglevel=info. In the case of NLP4All, the app is named 'nlp4all' and the Celery app is named 'celery_app' (this information is also in the __init__.py file), so the command is: celery -A nlp4all.celery_app worker --loglevel=info
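If you want to check that Redis is actually reachable on the port configured in __init__.py, a quick sketch using the redis package installed above (the host, port, and database number here are assumptions matching the defaults; adjust them to your config):

```python
import redis

# Assumes the default redis-server settings; change host/port/db if
# __init__.py points somewhere else.
r = redis.Redis(host="localhost", port=6379, db=0)
print(r.ping())  # prints True if the server is up and reachable
```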
In the end, you should have at least three servers/terminals up: the Flask app as usual, the Redis server, and a Celery worker. You can then add more Celery workers if you need more processing power.
When this is done, the Python functions decorated with @<celery_app_name>.task will be executed in the background.
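For example, a hypothetical task might look like the sketch below (the function name is made up for illustration; only the @celery_app.task decorator and the .delay() call pattern are the point):

```python
from nlp4all import celery_app  # assumes the celery_app created in __init__.py

@celery_app.task
def add(x, y):
    # Runs on a Celery worker process, not inside the Flask request.
    return x + y

# From Flask (or anywhere in the app), queue the task instead of calling it directly:
result = add.delay(2, 3)       # returns immediately with an AsyncResult
print(result.get(timeout=10))  # blocks until a worker has stored 5 in the result backend
```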
How to deploy this on the real (production) site?
https://arches.readthedocs.io/en/stable/setting-up-supervisord-for-celery/ Honestly, I have no idea how this works.