Tags: perl, inotify, log4perl

Can a single Perl script running in the background hold multiple instances of Log::Log4perl?


I have a script, "server.pl", which runs in the background and logs itself using Log::Log4perl.

This script continuously watches a directory and detects new files created in it using the Linux::Inotify2 module.

Each detected file is a Storable object representing a pipeline to be run, and each pipeline must write to its own logfile. My problem is that when I call Log::Log4perl->init to initialize the logger for a pipeline, server.pl stops logging itself, because the new initialization overwrites the previous one. So the question is: how can I make the 'server.pl' script hold multiple instances (an unknown number in advance) of Log::Log4perl?
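The overwrite is easy to reproduce in isolation. Here is a minimal sketch (using two throwaway in-memory configs rather than the actual log4perl.conf) showing that the second init() discards the first configuration:

```perl
use strict;
use warnings;
use Log::Log4perl;

# First configuration: a "server" category logging to the screen.
my $conf_a = q(
log4perl.logger.server = INFO, ScreenA
log4perl.appender.ScreenA = Log::Log4perl::Appender::Screen
log4perl.appender.ScreenA.layout = Log::Log4perl::Layout::SimpleLayout
);
Log::Log4perl->init(\$conf_a);
my $server_logger = Log::Log4perl->get_logger("server");
$server_logger->info("this message appears");

# Second configuration: only a "pipeline" category.
my $conf_b = q(
log4perl.logger.pipeline = INFO, ScreenB
log4perl.appender.ScreenB = Log::Log4perl::Appender::Screen
log4perl.appender.ScreenB.layout = Log::Log4perl::Layout::SimpleLayout
);
Log::Log4perl->init(\$conf_b);   # wipes the previous configuration

# The "server" category is no longer configured, so this is dropped.
$server_logger->info("this message is silently lost");
```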

Here follows a truncated version of 'server.pl' (without the boring stuff):

#!/usr/bin/perl
use warnings;
use strict;
use Carp;
use Log::Log4perl;
use Linux::Inotify2;
use Storable;

Log::Log4perl->init('/.../log4perl.conf');
my $server_logger = Log::Log4perl->get_logger("server");

my $tracker = PipelineBatchTracker->new( _logger => $server_logger);

my $inotify = Linux::Inotify2->new()
      or $server_logger->logcroak("Unable to create new inotify object: $!");

# Sets non-blocking inotify
$inotify->blocking(0);

# define watcher
$inotify->watch
    (
        "/.../serial/",
        IN_CREATE,
        sub {
            my $e = shift;
            my $serial = $e->{name};
            my $full = $e->{w}{name} . $serial;

            $server_logger->info("$serial was created in " . $e->{w}{name}) if $e->IN_CREATE;

            eval {
                my $pipe = retrieve($full);
                $server_logger->logcroak("Unable to retrieve storable object") unless defined $pipe;

                $server_logger->info("$serial loaded into a Synegie::Pipeline object");
                # This method calls Log::Log4perl->init,
                # and that is bad because the server and previously
                # running pipelines stop logging !!!
                $pipe->setLogger();

                $tracker->addPipeline($pipe);
            };
            if ($@) {
                $server_logger->error("server : Failed to add pipeline : $@");
            }
        }
    );


while (1) {
    $server_logger->trace("--------------------- AND AGAIN -------------------------");
    $inotify->poll;
    sleep 2;

    eval {
        $tracker->poke();
    };
    if ($@) {
        $server_logger->error("server : $@");
    }

    sleep 30;
}

EDIT: Globally, this means I would need a logger tied to each instance rather than a single logger defined at the script level. Any ideas? Thanks.


Solution

  • I will not say 'NO', but since Log::Log4perl follows the singleton pattern, going against the tide isn't a very good idea!

    If you are really interested, you may (or may not!) get some mileage out of threads and their unshared data, but creating and destroying Perl threads will increase the amount of used memory little by little, with no way back, which is a problem for long-living applications!
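    A singleton-friendly alternative is to keep the single init() in server.pl and never call it again: Log::Log4perl's documented API lets you attach appenders at runtime, so each pipeline can get its own category and its own file appender. The sketch below is an illustration, not the poster's actual setLogger(); the logfile path and helper name are hypothetical:

    ```perl
    use strict;
    use warnings;
    use Log::Log4perl qw(:levels);
    use Log::Log4perl::Appender;
    use Log::Log4perl::Layout::PatternLayout;

    # Hypothetical helper: build a dedicated logger for one pipeline,
    # identified by its serial, without touching the global config.
    sub make_pipeline_logger {
        my ($serial) = @_;

        # A distinct category per pipeline keeps it separate from "server".
        my $logger = Log::Log4perl->get_logger("pipeline.$serial");

        my $appender = Log::Log4perl::Appender->new(
            "Log::Log4perl::Appender::File",
            name     => "pipeline_$serial",
            filename => "/tmp/pipeline_$serial.log",   # hypothetical location
            mode     => "append",
        );
        $appender->layout(
            Log::Log4perl::Layout::PatternLayout->new("%d %p %c - %m%n")
        );

        $logger->add_appender($appender);
        $logger->level($INFO);
        $logger->additivity(0);   # keep pipeline messages out of the server's appenders
        return $logger;
    }
    ```

    With something like this, $pipe->setLogger() could store the returned logger on the pipeline object instead of re-running init(), and the server's own logger keeps working untouched.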