
log4perl: a Log4j implementation for Perl

How can I get %x in Log4perl to not show "[undef]"?

When I haven't pushed anything to the Log::Log4perl::NDC stack, %x returns [undef]. I would like it to return an empty string when the stack is empty.

For example, take this code:

use strict;
use Log::Log4perl qw(:easy);
Log::Log4perl->easy_init({ level => $INFO, layout => "%x %m%n" });
Log::Log4perl->get_logger()->info("first message");
Log::Log4perl::NDC->push("prefix");
Log::Log4perl->get_logger()->info("second message");

This prints:

[undef] first message
prefix second message

But I want it to print:

first message
prefix second message

How can I do this?
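
One way to get this behaviour (a sketch of my own, not from the original thread): register a custom cspec, say %N, that queries the NDC itself and hides the "[undef]" placeholder:

use strict;
use warnings;
use Log::Log4perl qw(:easy);
use Log::Log4perl::Layout::PatternLayout;

# Custom cspec %N: like %x, but returns nothing while the NDC stack is empty.
# NDC->get() returns the literal string "[undef]" in that case.
Log::Log4perl::Layout::PatternLayout::add_global_cspec(
    'N',
    sub {
        my $ndc = Log::Log4perl::NDC->get();
        return $ndc eq '[undef]' ? '' : "$ndc ";
    }
);

Log::Log4perl->easy_init({ level => $INFO, layout => "%N%m%n" });
Log::Log4perl->get_logger()->info("first message");
Log::Log4perl::NDC->push("prefix");
Log::Log4perl->get_logger()->info("second message");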


Source: (StackOverflow)

How can I mock Log::Log4perl::INFO?

I'm writing new unit tests for an existing module that uses Log::Log4perl like:

use Log::Log4perl qw(:easy);

The module calls INFO( "important message" );. I'd like to mock this to verify from my test code that INFO is called in certain circumstances.

When I run the test, the mock doesn't capture the calls to INFO made by the module. What's the right way to mock these calls to INFO?

Here's a complete example:

Mut.pm

#!/usr/bin/perl -w
# Mut : Module under test

use strict;
use warnings;

package Mut;

use Log::Log4perl qw(:easy);

sub new {
   my $class = shift;
   my $self = {};
   bless $self, $class;

   INFO( "Mut::new" );

   return $self;
}

1;

Mut.t

#!/usr/bin/perl -w

use strict;
use warnings;

package Mut_Test;

use Test::More tests => 1;
use Test::MockModule;
use Test::MockObject;

my @mock_info_output = ();

my $log4perl = Test::MockModule->new('Log::Log4perl');
$log4perl->mock(
   'INFO' => sub {
      print STDERR $_[0];
      push @mock_info_output, @_;
      return;
   }
    );

BEGIN {
  use_ok('Mut');
}

{
   my $mut = Mut->new;
   ## Do something here to verify INFO...
}
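
One way to verify the logging without mocking INFO directly (a sketch under the assumption that checking the logged output is acceptable): route everything to a Log::Log4perl::Appender::TestBuffer and inspect it after the call:

#!/usr/bin/perl
use strict;
use warnings;

use Test::More tests => 2;
use Log::Log4perl;
use Log::Log4perl::Appender::TestBuffer;

# Send all log output to an in-memory TestBuffer appender named "Buffer".
Log::Log4perl->init( \ q{
    log4perl.rootLogger             = INFO, Buffer
    log4perl.appender.Buffer        = Log::Log4perl::Appender::TestBuffer
    log4perl.appender.Buffer.layout = Log::Log4perl::Layout::SimpleLayout
} );

use_ok('Mut');
my $mut = Mut->new;

# The buffer now holds whatever Mut logged via INFO().
my $buffer = Log::Log4perl::Appender::TestBuffer->by_name("Buffer");
like( $buffer->buffer, qr/Mut::new/, 'INFO was called from Mut::new' );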

Source: (StackOverflow)

How to create separate log files?

I'm using Log::Log4perl to create log files, but it is appending to a single log file; instead, I want to create a separate log file for each execution of my script.

How can I create separate log files?

Here is my code:

fetch.pl

#Opening Log configuration file
Log::Log4perl::init('./logs/log4perl.conf');
my $logger = Log::Log4perl->get_logger('./logs_$$.logs');

logs.conf

log4perl.logger = TRACE,  FileAppndr1
log4perl.logger.logs = DEBUG, FileAppndr1
log4perl.appender.FileAppndr1 = Log::Log4perl::Appender::File
log4perl.appender.FileAppndr1.filename = logs.log 
log4perl.appender.FileAppndr1.layout = Log::Log4perl::Layout::SimpleLayout
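
One possible approach (a sketch of mine, not from the original post): build the appender filename at runtime, for example from the PID and a timestamp, and hand the resulting configuration string to init():

#!/usr/bin/perl
use strict;
use warnings;
use Log::Log4perl;

# One log file per execution: embed the PID and the start time in the name.
my $logfile = sprintf "./logs/fetch_%d_%d.log", $$, time();

my $conf = qq{
    log4perl.logger.logs                    = DEBUG, FileAppndr1
    log4perl.appender.FileAppndr1           = Log::Log4perl::Appender::File
    log4perl.appender.FileAppndr1.filename  = $logfile
    log4perl.appender.FileAppndr1.layout    = Log::Log4perl::Layout::SimpleLayout
};

Log::Log4perl::init(\$conf);
my $logger = Log::Log4perl->get_logger('logs');
$logger->info("started run $$");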

Source: (StackOverflow)

Log4Perl: How do I change the logger file used from running code? (After a fork)

I have an ETL process set up in perl to process a number of files, and load them to a database.

Recently, for performance reasons, I set the code up to run multiple processes, through use of a fork() call and a call to system("perl someOtherPerlProcess.pl $arg1 $arg2").

I end up with about 12 instances of someOtherPerlProcess.pl running with different arguments, and these processes each work through one directory's worth of files (corresponding to a single table in our database).

The application's main functions work, but I am having issues figuring out how to configure my logging.

Ideally, I would like to have all the someOtherPerlProcess.pl share the same $log_config value to initialize their loggers, but have each of those create a log file in the directory they are working on.

I haven't been able to figure out how to do that. I also noticed that the directory I call these Perl scripts from contains several files (named ARRAY(0x260eec), ARRAY(0x313f8), etc.) that contain all my logging messages!

Is there a simple way to change the log4perl.appender.A1.filename value from running code? Or to otherwise dynamically configure the file name we use, but use all other values from a config file?
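
One possible answer (a sketch, and LOGFILE is a name I made up): Log::Log4perl's config parser supports ${...} variable substitution, including environment variables, so each child process can point the shared configuration at its own file before calling init():

# someOtherPerlProcess.pl (sketch)
use strict;
use warnings;
use Log::Log4perl;

my ($work_dir) = @ARGV;

# The shared config refers to ${LOGFILE}; each child sets it before init.
$ENV{LOGFILE} = "$work_dir/etl.log";

my $log_config = q{
    log4perl.rootLogger           = DEBUG, A1
    log4perl.appender.A1          = Log::Log4perl::Appender::File
    log4perl.appender.A1.filename = ${LOGFILE}
    log4perl.appender.A1.layout   = Log::Log4perl::Layout::SimpleLayout
};

Log::Log4perl::init(\$log_config);
Log::Log4perl->get_logger()->info("processing $work_dir");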


Source: (StackOverflow)

Why does USR1 seem to kill Perl, instead of recreating the logfile?

I have a Perl script that I've added logging to, courtesy of Log4perl.

The script itself is long-running, and we also need to do log-rotation/archiving on a daily basis.

I've opted to use the inbuilt Solaris logadm, rather than using Log::Dispatch::FileRotate, because

  1. we're trying to reduce the number of Perl dependencies we need, and
  2. I get the impression that doing it at the OS level, outside your app is the preferred/most robust approach.

As part of rotation, I also need to get the Perl script to refresh its file handle. According to the Log4perl FAQ, you can configure it to listen for the USR1 signal, and recreate the file handles on that:

log4perl.rootLogger                                     = DEBUG, INFOLOG, DEBUGLOG

log4perl.appender.INFOLOG                               = Log::Log4perl::Appender::File
log4perl.appender.INFOLOG.filename                      = myprogram.info.log
log4perl.appender.INFOLOG.mode                          = append
log4perl.appender.INFOLOG.recreate                      = 1
log4perl.appender.INFOLOG.recreate_check_signal         = USR1
log4perl.appender.INFOLOG.layout                        = Log::Log4perl::Layout::PatternLayout
log4perl.appender.INFOLOG.layout.ConversionPattern      = %d [%p] (%F line %L) %m%n
log4perl.appender.INFOLOG.Threshold                     = INFO

log4perl.appender.DEBUGLOG                              = Log::Log4perl::Appender::File
log4perl.appender.DEBUGLOG.filename                     = myprogram.debug.log
log4perl.appender.DEBUGLOG.mode                         = append
log4perl.appender.INFOLOG.recreate                      = 1
log4perl.appender.INFOLOG.recreate_check_signal         = USR1
log4perl.appender.DEBUGLOG.layout                       = Log::Log4perl::Layout::PatternLayout
log4perl.appender.DEBUGLOG.layout.ConversionPattern     = %d [%p] (%F line %L) %m%n

However, for some reason, whenever I send the USR1 signal to the Perl process, my Perl script simply exits.

I'm sending it with:

kill -s USR1 <pid>

As soon as I do that, the Perl process seems to die. This happens whether I've configured Log4perl to capture USR1 or not.

I also tried using USR2, same effect.

Is there something obvious I'm missing here, either in Log4perl, or in Perl or Solaris?
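
A minimal, self-contained test (a sketch) may help narrow it down: if this script also dies on USR1, the appender's signal handler was never installed (for example because init() ran in a different process than the one being signalled, or because something later resets %SIG); if it survives, the problem lies elsewhere in the larger program:

use strict;
use warnings;
use Log::Log4perl;

my $conf = q{
    log4perl.rootLogger                          = DEBUG, LOG
    log4perl.appender.LOG                        = Log::Log4perl::Appender::File
    log4perl.appender.LOG.filename               = test_usr1.log
    log4perl.appender.LOG.mode                   = append
    log4perl.appender.LOG.recreate               = 1
    log4perl.appender.LOG.recreate_check_signal  = USR1
    log4perl.appender.LOG.layout                 = Log::Log4perl::Layout::SimpleLayout
};
Log::Log4perl::init(\$conf);

my $logger = Log::Log4perl->get_logger();
$logger->info("pid $$ running; try: kill -USR1 $$");

# Keep logging so the recreated file handle is easy to observe.
while (1) {
    sleep 5;
    $logger->info("still alive");
}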


Source: (StackOverflow)

Is it possible to register a function to preprocess log messages with Log::Log4perl?

In this example:

$logger->debug({
    filter => \&Data::Dumper::Dumper,
    value  => $ref
});

I can pretty-print my references instead of getting ARRAY(0x...). But it's tedious to type that much code every time. I just want:

$logger->preprocessor({
    filter => \&Data::Dumper::Dumper,
    value  => $ref
});

$logger->debug( $ref, $ref2 );
$logger->info( $array );

And $ref, $ref2, and $array will be dumped by Data::Dumper.

Is there a way to do this?

UPD

With the help of your answers I made a patch. Now you can just do:

log4perl.appender.A1.layout=FallbackLayout
log4perl.appender.A1.layout.chain=PatternLayout
log4perl.appender.A1.layout.chain.ConversionPattern=%m%n
log4perl.appender.A1.warp_message = sub { $#_ = 2 if @_ > 3; \
                                       return @_; }
# OR
log4perl.appender.A1.warp_message = main::warp_my_message

sub warp_my_message {
    my( @chunks ) =  @_;

    use Data::Dump qw/ pp /;
    for my $msg ( @chunks ) {
        $msg =  pp $msg   if ref $msg;
    }

    return @chunks;
}

UPD2

Or you can use this small module:

log4perl.appender.SomeAPP.warp_message  = Preprocess::Messages::msg_filter
log4perl.appender.SomeAPP.layout        = Preprocess::Messages

package Preprocess::Messages;

use strict;
use warnings;
use Data::Dump qw( pp );
use Log::Log4perl;
use Log::Log4perl::Layout::PatternLayout;

sub msg_filter {
    my @chunks =  @_;

    for my $msg ( @chunks ) {
        $msg =  pp $msg   if ref $msg;
    }

    return @chunks;
};

sub render {
    my $self =  shift;

    my $layout =  Log::Log4perl::Layout::PatternLayout->new(
        '%d %P %p> %c %F:%L %M%n  %m{indent=2}%n%n'
    );

    $_[-1] += 1; # increase level of the caller
    return $layout->render( join $Log::Log4perl::JOIN_MSG_ARRAY_CHAR, @{ shift() }, @_ );
}


sub new {
    my $class = shift;
    $class = ref ($class) || $class;

    return bless {}, $class;
}

1;

Yes, of course you can set 'warp_message = 0' and combine msg_filter and render together.

log4perl.appender.SomeAPP.warp_message  = 0
log4perl.appender.SomeAPP.layout        = Preprocess::Messages

sub render {
    my($self, $message, $category, $priority, $caller_level) = @_;

    my $layout =  Log::Log4perl::Layout::PatternLayout->new(
        '%d %P %p> %c %F:%L %M%n  %m{indent=2}%n%n'
    );

    for my $item ( @{ $message } ) {
        $item =  pp $item   if ref $item;
    }

    $message =  join $Log::Log4perl::JOIN_MSG_ARRAY_CHAR, @$message;
    return $layout->render( $message, $category, $priority, $caller_level+1 );
}

Source: (StackOverflow)

How can log4perl write to STDERR and a file at the same time?

I tried to set up two appenders, but it seems to only write to STDERR:

my $header = "######$scriptname $version";
use Log::Log4perl qw(:easy);
Log::Log4perl->easy_init($DEBUG);
my $logger = get_logger();
my $layout = Log::Log4perl::Layout::PatternLayout->new(
"%d %p> %F{1}:%L %M - %m%n");
my $appender = Log::Log4perl::Appender->new(
        "Log::Dispatch::File",
        filename=>$scriptname.".log",
        mode => "append"
);
$appender->layout($layout);
my $stderr = Log::Log4perl::Appender::Screen->new(
        stderr =>0,
        );

$stderr->layout($layout);
$logger->info($header);
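
For comparison, a minimal sketch (my own, with a placeholder $scriptname) that does write to both targets: easy_init() accepts several appender definitions at once, one hash per appender:

use strict;
use warnings;
use Log::Log4perl qw(:easy);

my $scriptname = "myscript";   # placeholder

# Two appenders in one call: the screen (STDERR) and an appending file.
Log::Log4perl->easy_init(
    { level  => $DEBUG,
      file   => "STDERR",
      layout => "%d %p> %F{1}:%L %M - %m%n" },
    { level  => $DEBUG,
      file   => ">>$scriptname.log",
      layout => "%d %p> %F{1}:%L %M - %m%n" },
);

get_logger()->info("this goes to STDERR and to $scriptname.log");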

Source: (StackOverflow)

Logging within utility classes

I want to adopt logging within several utility classes, e.g. DBI. What is the best practice for doing this with Log::Log4perl?

I think it is OK to subclass DBI (say, MyDBI) and override some methods there to make them do the logging. But there's a problem with categories. If you create a logger with

Log::Log4perl->get_logger(ref $self || $self)

then all log entries belong to MyDBI, and it would be hard to filter them. So it seems better to me to pass a logger to MyDBI from the calling module (say, MyModule), so that the category would be semantically right. The first question: is this OK in general? I mean, are there any hidden pitfalls with such an approach?

The second question: how do I pass the logger to MyDBI? One idea is to declare a global variable, e.g. $MyDBI::logger, and set it in the calling method:

local $MyDBI::logger = Log::Log4perl->get_logger(ref $self || $self);

There's a traditional dislike for global variables. Can you think of a better way?

EDIT: Of course, the best code is no code. caller would suffice, if it took inheritance into account.

The third question: is it possible to log to both categories, MyDBI and MyModule, with Log::Log4perl, if they are hierarchically unrelated?
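
On the third question, a rough sketch (my own; caller_category is a made-up field holding whatever category the caller passed in): nothing stops a method from fetching two loggers and sending the same message to both categories, related or not:

package MyDBI;
use strict;
use warnings;
use Log::Log4perl;

# Log one message under both the MyDBI category and the caller's category.
sub _log_info {
    my ($self, $msg) = @_;
    my @categories = ('MyDBI');
    push @categories, $self->{caller_category} if $self->{caller_category};
    Log::Log4perl->get_logger($_)->info($msg) for @categories;
}

1;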


Source: (StackOverflow)

Read console output of a shell script in Perl

Let's say I've got a shell script called print_error.sh looking like this:

#!/usr/bin/bash

echo "ERROR: Bla bla, yada yada."
exit 1

Now I'm in a Perl script, calling this shell script with

system("print_error.sh")

I now want to read the console output of print_error.sh and write it to a Log4perl logger.

How can I achieve this?
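
One way to do it (a sketch): replace system() with a pipe open so the script's output comes back line by line, then hand each line to the logger:

use strict;
use warnings;
use Log::Log4perl qw(:easy);

Log::Log4perl->easy_init($INFO);
my $logger = get_logger();

# '-|' runs the command and gives us a read handle on its output;
# 2>&1 folds STDERR into STDOUT so error lines are captured too.
open my $fh, '-|', './print_error.sh 2>&1'
    or die "Cannot run print_error.sh: $!";

while ( my $line = <$fh> ) {
    chomp $line;
    $logger->info($line);
}
close $fh;

# $? still carries the child's exit status after the pipe is closed.
$logger->error( "print_error.sh exited with status " . ( $? >> 8 ) ) if $?;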


Source: (StackOverflow)

How can I report the line number with Log4perl and Moose?

Is it possible to get Log4perl to correctly display the line number and package/class of the log event, instead of always showing Method::Delegation at line 99, when used with Moose?

In my case, I have created an attribute with isa => 'Log::Log4perl::Logger' and delegated the various logging methods (log, warn, error, ...) to my class. Doing this also shows Delegation.pm as the file.

Thanks!
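
A sketch of the usual workaround (my own guess at the setup; the exact depth increment may need tuning, since the delegation method and the method modifier each add stack frames): bump $Log::Log4perl::caller_depth around the delegated calls so %F/%L/%M resolve to the real call site:

package MyClass;
use Moose;
use Log::Log4perl;

has logger => (
    is      => 'ro',
    isa     => 'Log::Log4perl::Logger',
    default => sub { Log::Log4perl->get_logger(__PACKAGE__) },
    handles => [qw( debug info warn error )],
);

# Tell Log4perl to look further up the stack than the delegation wrapper.
around [qw( debug info warn error )] => sub {
    my ( $orig, $self, @args ) = @_;
    local $Log::Log4perl::caller_depth = $Log::Log4perl::caller_depth + 1;
    return $self->$orig(@args);
};

__PACKAGE__->meta->make_immutable;
1;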


Source: (StackOverflow)

How to use Log4perl to rotate log files in multithread perl application

Below is sample code in which I try to rotate a log file in a multithreaded application using Log4perl. It works fine unless the application is multithreaded: then the logs are not rotated and the log file grows in size. Can anyone tell me where I am going wrong?

use strict;
use warnings;
use Log::Log4perl;
use POSIX;
use threads;
use threads::shared;

my @InputFiles;

my $InputDirectory=$ARGV[0];
my $LogName=$ARGV[1];
opendir(DIR,$InputDirectory) or die "could not open the input directory";
@InputFiles=readdir(DIR);
closedir(DIR);
my $file;

    #logger_configuration
my $log_conf ="
   log4perl.rootLogger              = DEBUG, LOG1

   log4perl.appender.LOG1           = Log::Dispatch::FileRotate
   log4perl.appender.LOG1.filename  = $LogName
   log4perl.appender.LOG1.mode      = append
   log4perl.appender.LOG1.autoflush = 1
   log4perl.appender.LOG1.size      = 10000
   log4perl.appender.LOG1.max       = 20
   log4perl.appender.LOG1.layout    = Log::Log4perl::Layout::PatternLayout
   log4perl.appender.LOG1.layout.ConversionPattern = \%d{yyyy-MM-dd HH:mm:ss}|\%P|\%m|\%n
";

#loading the configuration file
Log::Log4perl::init(\$log_conf);

#creating logger instance
my $logger = Log::Log4perl->get_logger();

my $thread_count=5;
my $file_total= scalar @InputFiles;
#print STDERR "$file_total\n";

#dividing total files among the no of given threads
my $div = $file_total/$thread_count;
$div = ceil($div);
my $start = 0;
my $end = $div;
my @threads;
for (my $count = 1; $count <=$thread_count ; $count++) 
{
    my $thread = threads->new(\&process,$start,$end);
    push(@threads,$thread);        
    $start = $end;
    $end = $end + $div;
    if ($end > $file_total)
    {
        $end = $file_total;
    }
}

foreach (@threads) 
{
   $_->join;
}

sub process
{
    my $lstart = shift;
    my $lend = shift;
    my $id = threads->tid();
    for (my $index = $lstart; $index < $lend; ++$index) 
    {   
      $logger->info($InputFiles[$index]);
    }
}
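
One way around it (a sketch, not a verified fix): have exactly one thread talk to the rotating appender, and make the workers hand their messages over a Thread::Queue, so rotation is never attempted from several threads at once:

use strict;
use warnings;
use threads;
use Thread::Queue;
use Log::Log4perl;

my $log_conf = q{
    log4perl.rootLogger              = DEBUG, LOG1
    log4perl.appender.LOG1           = Log::Dispatch::FileRotate
    log4perl.appender.LOG1.filename  = rotate_test.log
    log4perl.appender.LOG1.mode      = append
    log4perl.appender.LOG1.size      = 10000
    log4perl.appender.LOG1.max       = 20
    log4perl.appender.LOG1.layout    = Log::Log4perl::Layout::SimpleLayout
};
Log::Log4perl::init(\$log_conf);
my $logger = Log::Log4perl->get_logger();

my $queue = Thread::Queue->new;

# The only thread that ever writes (and therefore rotates) the log file.
my $writer = threads->create(sub {
    while ( defined( my $msg = $queue->dequeue ) ) {
        $logger->info($msg);
    }
});

# Worker threads enqueue messages instead of calling the logger themselves.
my @workers = map {
    my $id = $_;
    threads->create(sub {
        $queue->enqueue("worker $id processed item $_") for 1 .. 100;
    });
} 1 .. 5;

$_->join for @workers;
$queue->enqueue(undef);   # tell the writer it can stop
$writer->join;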

Source: (StackOverflow)

Log4Perl bundling logging from several programs into one log

Is there any logger on CPAN that allows me to bundle logs from several programs into one file, synchronising parallel logging when two programs run at the same time and call Log4perl in parallel?
Background: I use a custom appender which writes emails, and I would like to bundle all the emails in a single file as a backup in case the mail server has problems.
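
One thing worth looking at (a sketch, not a tested answer): Log::Log4perl ships a Log::Log4perl::Appender::Synchronized composite appender that serializes writes from several processes through a semaphore, so you could wrap your file appender in it:

log4perl.rootLogger                  = INFO, Syncer

# The real file appender; note that it is NOT attached to any logger directly.
log4perl.appender.Mailfile           = Log::Log4perl::Appender::File
log4perl.appender.Mailfile.filename  = mail_backup.log
log4perl.appender.Mailfile.mode      = append
log4perl.appender.Mailfile.layout    = Log::Log4perl::Layout::SimpleLayout

# The synchronizing wrapper the programs actually log to.
log4perl.appender.Syncer             = Log::Log4perl::Appender::Synchronized
log4perl.appender.Syncer.appender    = Mailfile
log4perl.appender.Syncer.key         = mail_backup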


Source: (StackOverflow)

Which script initialized module?

In Perl, is there any way to tell which .pl script has initialized this instance of a module?

Specifically, I'd like to get the name of the script calling a module which has a Log4perl object in it. That way, I'll know which .log file I want to write to within the module.

Am I doing this wrong? If I define the $logger in my .pl script, will any $logger calls within the module write to the same .log file as the calling script?

I don't have any sample code yet, but I have been reading up on Log4perl. Basically, if I set an appender to a file, caller.log, which is the file appender for my calling script, caller.pl, I'd want any logging defined in the custom imported module to also write to caller.log (implicitly, if possible -- obviously I could just pass the log name when I initialize the module instance).

Is this possible without passing arguments specifying which File Appender the module should write to? Doesn't Log4perl use just one $logger instance?

Also, let me know if I'm way off, and if there's a different approach I should be considering.

Thank you

EDIT: Sorry, after I posted this, I looked at the Related Links, and I guess my search wording just wasn't correct. It looks like this is a pretty good solution: Self logging Perl modules (without Moose)

If anyone has any other ideas, though, please let me know.

EDIT 2: Finally tested, and got it to work as I had wanted -- it was a lot easier than I was making it out to be, too!

This is my setup, pretty much:

Module.pm

package Module;

use Log::Log4perl qw(get_logger :levels);
use Data::Dumper;

my $logger = get_logger("Module");

sub new {
    my ($class, $name) = @_;

    my @caller = caller(0);
    $logger->debug("Creating new Module. Called by " . Dumper(\@caller));

    my $object = { 'name' => $name };

    return bless($object, $class);  
}

caller.pl

use Module;
use Log::Log4perl qw(get_logger :levels);
use Data::Dumper;

my $PATH = "$ENV{'APPS'}/$ENV{'OUTDIR'}";
my $SCRIPT = "caller";

my $logger = get_logger("Module");
$logger->level($DEBUG);

my $file_appender = Log::Log4perl::Appender->new("Log::Dispatch::File", 
                        filename=> "$PATH/$SCRIPT.log", 
                        mode => "append",);
$logger->add_appender($file_appender);

my $layout = Log::Log4perl::Layout::PatternLayout->new("%d %p> %F{1}:%L %M - %m%n");
$file_appender->layout($layout);

my $lib = Module->new('Chris');

$logger->info(Dumper($lib));

Source: (StackOverflow)

Perl sigdie handler and eval

I am overriding the __DIE__ signal handler as shown below inside my Logger module.

# Catch die messages and log them with logdie
$SIG{__DIE__} = \&logdie;

Now the program below runs as expected and post-processing is called:

use strict;
use warnings;
use File::Path;
# use MyLogger;

my $dir="/random";
eval {
  # local $SIG{__DIE__};
  File::Path::make_path($dir);
};
if($@) {
 warn("Cannot create $dir :$@ \n");
}
print "Post processing \n";

However, if I include my logger module by adding use MyLogger, the code fails inside the eval statement with the error below and post-processing is not called.

[ERROR] 2015/04/27 22:19:07 Carp.pm:166> mkdir /random: Permission denied at ./test.pl line 11.

One option to fix this is to add a local __DIE__ handler (as shown in the commented code).

However, my logger module is used by many scripts.

Is there a way to modify my Logger module so that it suppresses the ERROR message when die is called from inside an eval block?
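
One possible tweak inside MyLogger (a sketch; logdie is the module's existing handler from above): consult $^S before logging, since it is true while a die is being trapped by an eval and undef during compilation:

# Inside MyLogger: only log fatal errors for dies that nobody is catching.
$SIG{__DIE__} = sub {
    return if !defined($^S) || $^S;   # compiling, or inside an eval: stay quiet
    logdie(@_);                       # otherwise log and die as before
};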


Source: (StackOverflow)

How can I use Log4Perl across modules in Perl?

I'm planning to use Log4Perl in my modules for logging.

My code structure goes like this

I have Start.pl, which validates some parameters, and several module (.pm) files which are interlinked (used across these .pl and .pm files).

I have a Logger.pm in which I have a method InitiateLogger() that creates the log object:

 $log    = Log::Log4perl->get_logger("MyLog");

I call this method as Logger::InitiateLogger(); in Start.pl.

Here are my questions:

  1. How can I use the same $log across the modules (.pm files)?
  2. Do I need to use the same package name for this?

It would be nice if someone could clarify these points for me.
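
A sketch of the usual pattern (self-contained, with made-up module names): Log::Log4perl keeps one logger object per category, so once Start.pl has called InitiateLogger(), every module can simply ask for get_logger("MyLog") and receives the same instance; no shared variable and no common package name are needed:

#!/usr/bin/perl
use strict;
use warnings;
use Log::Log4perl;

package Logger;
sub InitiateLogger {
    # Stand-in for the real configuration; a config file works the same way.
    Log::Log4perl::init( \ q{
        log4perl.logger.MyLog           = INFO, Screen
        log4perl.appender.Screen        = Log::Log4perl::Appender::Screen
        log4perl.appender.Screen.layout = Log::Log4perl::Layout::SimpleLayout
    } );
}

package OtherModule;
sub work {
    # Same category as everywhere else, therefore the same logger object.
    my $log = Log::Log4perl->get_logger("MyLog");
    $log->info("logging from OtherModule");
}

package main;
Logger::InitiateLogger();
OtherModule::work();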


Source: (StackOverflow)