Simple and reliable Redis queuing utilities.
$ pip install daemonless-queuing
import redis
from daemonless_queuing import setup, shutdown
instance = redis.Redis(
host='localhost',
port=6379
)
enqueue = setup(instance, {
'queues': [
'TESTCHAN_1',
'TESTCHAN_2'
]
})
# the third positional parameter is an integer (or None) representing the timeout in seconds;
# once the timeout is reached, the function's subprocess gets killed
enqueue('TESTCHAN_1', 'my_package.my_module.func_name', 0)
enqueue('TESTCHAN_2', 'my_package.my_module.func_name', 0, 'positional argument', 42, named_arg='Hey!')
# ...
# do some blocking stuff
shutdown()
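For reference, the enqueue target is just a plain function importable under the dotted path you pass in. A minimal sketch of what my_package/my_module.py could look like (the module and parameter names below are only an illustration, not part of the library):

# my_package/my_module.py -- hypothetical module matching the import path used above
def func_name(message='no arguments given', count=0, named_arg=None):
    # Runs in a separate subprocess; keep the arguments JSON-serializable.
    print(f'message={message!r} count={count} named_arg={named_arg!r}')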
import redis
from daemonless_queuing import make_lock
instance = redis.Redis(
host='localhost',
port=6379
)
lock = make_lock(instance, 'SOMETHING_ONGOING')
with lock():
# will raise if you try to run this scope again before the lock gets released
...
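A natural way to combine the two pieces is to have the enqueued job take the lock itself, so overlapping runs fail fast instead of doing the work twice. This is just a sketch: the function and lock names are made up, localhost Redis is assumed, and the exact exception raised when the lock is already held isn't documented here. Keep the race-condition caveat below in mind.

# my_package/jobs.py -- hypothetical job that guards its own body with the lock
import redis
from daemonless_queuing import make_lock

def refresh_cache():
    instance = redis.Redis(host='localhost', port=6379)
    lock = make_lock(instance, 'CACHE_REFRESH_ONGOING')
    try:
        with lock():
            ...  # do the actual work while the lock is held
    except Exception:
        # Assumed behaviour: the raise means another run already holds the lock.
        # (This broad except would also swallow errors from the work itself.)
        pass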
I couldn't find a better-suited solution for what I needed (enqueuing jobs in multiple queues, then running them in parallel). Existing solutions (rq, Celery) are harder to set up and require a daemon/broker to work. This is of course a much simpler take on those libraries, but some projects simply don't need more than that.
- Functions must be provided as Python import paths
- Function parameters must be JSON-serializable
- Return values can't be accessed
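Since return values can't be accessed, a common workaround (not a library feature, just a sketch; the module, function, and key names are hypothetical) is to have the job persist its result somewhere the caller can read, for example back into Redis:

# my_package/reports.py -- hypothetical job that stores its result instead of returning it
import json
import redis

def build_report(report_id, rows):
    result = {'report_id': report_id, 'total': sum(rows)}
    # Write the outcome under a well-known key so the enqueuing side can poll for it.
    redis.Redis(host='localhost', port=6379).set(f'report:{report_id}', json.dumps(result))

# Enqueued like any other job, e.g.:
# enqueue('TESTCHAN_1', 'my_package.reports.build_report', 60, 'weekly', [1, 2, 3])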
The function returned by make_lock is currently susceptible to a race condition.
Make sure your asynchronous function won't be called twice at the same time.
This library is MIT licensed.