Cron Tasks on load balanced web servers


You can use this small library, which uses Redis to create a temporary timed lock:

https://github.com/AlexDisler/MutexLock

The servers should be identical and have the same cron configuration. The first server to create the lock executes the task; the other servers see the lock and exit without doing anything.

For example, in the php file that executes the scheduled task:

MutexLock\Lock::init([
  'host' => $redisHost,
  'port' => $redisPort
]);

// check if a lock was already created;
// if it was, another server is already executing this task
if (!MutexLock\Lock::set($lockKeyName, $lockTimeInSeconds)) {
  return;
}

// if no lock existed, execute the scheduled task
scheduledTaskThatRunsOnlyOnce();

To run the tasks in a decentralized way and spread the load, take a look at https://github.com/chrisboulton/php-resque. It's a PHP port of the Ruby version of Resque and stores its data in exactly the same format, so you can use https://github.com/resque/resque-web or http://resqueboard.kamisama.me/ to monitor the workers and see reports.
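
A minimal sketch of what queueing a job with php-resque looks like, assuming Composer autoloading, Redis on localhost:6379, and a hypothetical SendReportJob class; check the project's README for the full API:

<?php
// Sketch: push a job onto a queue that a worker on any server can pick up.
// Queue name, job class and arguments here are made-up examples.
require 'vendor/autoload.php';

Resque::setBackend('localhost:6379');
Resque::enqueue('reports', 'SendReportJob', ['report_id' => 42]);

// A worker instantiates this class and calls perform() with $this->args set.
class SendReportJob
{
    public function perform()
    {
        generateReport($this->args['report_id']); // hypothetical task function
    }
}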


Assuming you have a database available that is not hosted on one of those three servers:

Write a "wrapper" script that goes in cron, and takes the program you're running as its argument. The very first thing it does is connect to the remote database, and check when the last time an entry was inserted into a table (created for this wrapper). If the last insertion time is greater than when it was supposed to run, then insert a new record into the table with the current time, and execute the wrapper's argument (your cron job).

Cron up the wrapper on each server, with each one scheduled a few minutes behind the previous one (server A runs at the top of the hour, server B at 5 minutes past, C at 10 minutes past, and so on).
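
For example, the staggered crontab entries might look like this (the wrapper path refers to the hypothetical script sketched above):

# server A
0 * * * * php /path/to/cron-wrapper.php /path/to/real-job.php
# server B
5 * * * * php /path/to/cron-wrapper.php /path/to/real-job.php
# server C
10 * * * * php /path/to/cron-wrapper.php /path/to/real-job.php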

The first server will always execute the job first, so the other two servers never will. If the first server goes down, the second will see that the job hasn't run and will run it.

If you also record in the table which server executed the job, you'll have a log of when and where the script ran.
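
In the wrapper sketch above, that could be as simple as adding a server column (again an assumption about the table) and storing the host name:

$insert = $pdo->prepare('INSERT INTO cron_log (job, executed_at, server) VALUES (?, NOW(), ?)');
$insert->execute([$job, gethostname()]);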


Wouldn't this be an ideal situation for using a message / task queue?