Tags: php, laravel, supervisord, laravel-4.2, laravel-queue

How to run multi-process queues in Laravel 4.2 (with supervisord configured)?


I have a website where users can export reports in CSV format. When they click the "Export" button, I add a new row to my database table indicating that there is a new request for report generation.

I want two queue worker processes handling these requests because one is not enough. My current supervisord configuration is the following:

    [program:csv_export]
    command=php /var/www/mywebsite.com/artisan queue:listen --tries=1 --timeout=3000 --queue=csv_export
    numprocs=2
    process_name=csv_export_%(process_num)02d
    directory=/var/www/mywebsite.com/
    stdout_logfile=/var/log/mywebsite/csv_export.log
    autostart=true
    autorestart=true
    stopsignal=KILL
    loglevel=debug
    redirect_stderr=true
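
For context, the export job is presumably pushed without any identifying payload, roughly like the sketch below (the ExportCSV job name and the controller code are assumptions; the question only shows the worker side):

    // Hypothetical sketch of the "Export" action: a request row is created,
    // but the queued job carries no reference to it, so each worker scans
    // for pending CSVNEW rows on its own
    $csv = new CSVReport;
    $csv->status = CSVReport::CSVNEW;
    $csv->save();

    Queue::push('ExportCSV'); // no csvID in the payload, so a job cannot tell which row is "its" report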

My queue fire() code looks like this:

public function fire($job, $data){
    // Take the next pending report (status CSVNEW); with two workers, both can grab the same row
    $reports = \CSVReport::where('status', '=', CSVReport::CSVNEW)->take(1)->get();

    foreach ($reports as $key => $value){
        // prepare data and file
        $value->status = 'done';
        $value->save();
    }

    $job->delete();
}

I'm using Laravel 4.2; an upgrade is not possible yet. I want to avoid a situation where two processes access the same table row in the database. How can I avoid that in my queue class?


Solution

  • Suppose you push your job onto the queue like this:

    $csv = new CSVReport;
    // $csv->... = ...;
    $csv->save();
    Queue::push('ExportCSV', array('csvID' => $csv->id));
    

    The job's fire() method receives the $data parameter, which you use to pass specific values so that each job does one specific thing. Now, in the ExportCSV job's fire() method:

    public function fire($job, $data){
        $report = \CSVReport::find($data['csvID']);
    
        // prepare data and file
        $report->status = 'done';
        $report->save();
    
        $job->delete();
    }
    

    This way each job will have its CSVReport ID serialized in its $data, and when supervisord executes it, the job will only process its own specific CSVReport row in the database.

    You can still check that the status is CSVNEW, in case something outside the queues could have processed the report already:

    public function fire($job, $data){
        $report = \CSVReport::where('status', CSVReport::CSVNEW)->where('id', $data['csvID'])->first();
    
        // prepare data and file
        if($report) {
            $report->status = 'done';
            $report->save();
        }
    
        $job->delete();
    }
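
    If you want to go a step further, you can claim the row atomically with a single conditional UPDATE, so even two jobs carrying the same ID cannot both process it. This is a minimal sketch, not the answer's original code; the 'processing' status value is a hypothetical intermediate state that does not exist in the question:

    public function fire($job, $data){
        // Atomically claim the row: only the first worker's UPDATE matches,
        // because the WHERE clause requires the status to still be CSVNEW
        $claimed = \CSVReport::where('id', $data['csvID'])
            ->where('status', CSVReport::CSVNEW)
            ->update(array('status' => 'processing')); // 'processing' is a hypothetical intermediate status

        if ($claimed) {
            $report = \CSVReport::find($data['csvID']);

            // prepare data and file
            $report->status = 'done';
            $report->save();
        }

        $job->delete();
    }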