Multiple schedulers and multiple queues #280

Open
AlinaYablokova opened this issue Nov 15, 2022 · 0 comments

I'm wondering whether it is possible to run 2 schedulers simultaneously, each dedicated to its own queue. As mentioned here, multiple schedulers can be run against a single queue for higher reliability and stability, but I would like to run 2 schedulers for completely different sets of tasks while sharing a single Redis instance.
Here is where I run into a problem.

```python
# scheduler 1
from datetime import datetime, timezone

from redis import Redis
from rq import Queue
# Assumption: RQScheduler is rq_scheduler's Scheduler class
from rq_scheduler import Scheduler as RQScheduler


class Scheduler1(RQScheduler):
    def __init__(self, config: Config):
        self._config = config
        self._redis_connection = Redis(host=config.REDIS_HOST, port=config.REDIS_PORT)
        self._queue = Queue("QUEUE 1", connection=self._redis_connection)
        super().__init__(
            queue=self._queue,
            connection=self._redis_connection,
            interval=5,
        )

    def run(self):
        self.schedule(
            scheduled_time=datetime.now(timezone.utc),
            func=task_1,
            args=[...],
            interval=1800,
            repeat=None,
            timeout=900,
            id="***t",
            result_ttl=7 * 24 * 60 * 60,
        )
        ...
```

```python
# scheduler 2
from datetime import datetime, timezone

from redis import Redis
from rq import Queue
# Assumption: RQScheduler is rq_scheduler's Scheduler class
from rq_scheduler import Scheduler as RQScheduler


class Scheduler2(RQScheduler):
    def __init__(self, config: Config):
        self._config = config
        self._redis_connection = Redis(host=config.REDIS_HOST, port=config.REDIS_PORT)
        self._queue = Queue("QUEUE 2", connection=self._redis_connection)
        super().__init__(
            queue=self._queue,
            connection=self._redis_connection,
            interval=5,
        )

    def run(self):
        self.schedule(
            scheduled_time=datetime.now(timezone.utc),
            func=task_2,
            args=[...],
            interval=300,
            repeat=None,
            timeout=180,
            id="***t",
            result_ttl=7 * 24 * 60 * 60,
        )
        ...
```

```python
# worker 1 (separate process)
from redis import Redis
from rq import Connection, Worker

if __name__ == "__main__":
    redis_connection = Redis(config.REDIS_HOST, config.REDIS_PORT)

    try:
        with Connection(connection=redis_connection):
            worker = Worker(["QUEUE 1"])
            worker.work()
    finally:
        redis_connection.close()
```

```python
# worker 2 (separate process)
from redis import Redis
from rq import Connection, Worker

if __name__ == "__main__":
    redis_connection = Redis(config.REDIS_HOST, config.REDIS_PORT)

    try:
        with Connection(connection=redis_connection):
            worker = Worker(["QUEUE 2"])
            worker.work()
    finally:
        redis_connection.close()
```

As far as I can see, jobs from "QUEUE 2" (Scheduler2) are sometimes neither executed nor re-scheduled (at least from the second repeat onwards).
Is that expected?
In some cases I also get the warning: `Lock already taken - skipping run`.
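For intuition on that warning: rq-scheduler guards each run with an advisory lock in Redis (a `SET ... NX`-style acquire), and a run that fails to take the lock is skipped. If both scheduler classes end up using the same lock key, only one of them gets to plan jobs per cycle — which would match the symptoms above. A self-contained sketch of the pattern, with a plain dict standing in for Redis and all key names hypothetical:

```python
# Simulated advisory lock: SETNX-style acquire, skip the run when held.
store = {}


def acquire_lock(key: str, owner: str) -> bool:
    # Like Redis `SET key owner NX`: succeeds only if the key is absent.
    if key in store:
        return False
    store[key] = owner
    return True


# Two schedulers sharing ONE lock key: the second run is skipped.
assert acquire_lock("rq:scheduler:lock", "scheduler-1") is True
assert acquire_lock("rq:scheduler:lock", "scheduler-2") is False  # "Lock already taken - skipping run"

# Distinct lock keys per scheduler let both run each cycle.
assert acquire_lock("rq:scheduler:queue1:lock", "scheduler-1") is True
assert acquire_lock("rq:scheduler:queue2:lock", "scheduler-2") is True
```

Whether the real lock key is shared between the two subclasses depends on the installed rq-scheduler version, so this is worth checking in its source before relying on it.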

P.S. Yes, I could organize the pipeline with a single scheduler, but having multiple queues and schedulers would give more flexibility in tuning the worker count (for example, I need 5 workers for "QUEUE 1" and 3 for "QUEUE 2").
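If the two schedulers really are fighting over one lock, an alternative that still keeps the 5/3 worker split is a single scheduler process feeding both queues: rq-scheduler's `schedule()` accepts a `queue_name` per job (worth verifying against the installed version), while the workers stay dedicated per queue. A minimal sketch — the job table and dotted function paths are hypothetical names, not part of the issue:

```python
# Single scheduler feeding two queues; workers remain split per queue.
from datetime import datetime, timezone

# Hypothetical job table: which function goes on which queue, how often.
JOBS = [
    {"queue_name": "QUEUE 1", "func": "tasks.task_1", "interval": 1800, "timeout": 900},
    {"queue_name": "QUEUE 2", "func": "tasks.task_2", "interval": 300, "timeout": 180},
]


def plan(scheduler):
    """Register every job with the single scheduler instance."""
    for spec in JOBS:
        scheduler.schedule(
            scheduled_time=datetime.now(timezone.utc),
            repeat=None,
            result_ttl=7 * 24 * 60 * 60,
            **spec,
        )


if __name__ == "__main__":
    from redis import Redis
    from rq_scheduler import Scheduler

    # One scheduler, one lock; worker counts are tuned per queue
    # exactly as before (e.g. 5 on "QUEUE 1", 3 on "QUEUE 2").
    plan(Scheduler(connection=Redis("localhost", 6379), interval=5))
```

This trades per-queue scheduler processes for a single point of planning, which sidesteps the lock contention entirely at the cost of the schedulers no longer being independently restartable.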
