I have just written the following Perl script, which uses the File::Slurp module to do some text substitution. It works fine on a single file given as an argument on the command line (Bash).
#!/opt/local/bin/perl -w
use File::Slurp qw( edit_file_lines );

foreach my $argnum (0 .. $#ARGV) {
    edit_file_lines {
        s/foo/bar/g;
        print $_;
    } $ARGV[$argnum];
}
I would like to alter it so that it also copes with pipes (i.e. STDIN) and can sit in the middle of a series of piped operations, for example:
command blah|....|my-perl-script|sort|uniq|wc....
What is the best way to change the Perl script to allow this, whilst retaining the existing ability to work with single files on the command line?
To have your script work in a pipeline, you could check whether STDIN is connected to a tty:
use strict;
use warnings;
use File::Slurp qw( edit_file_lines );

# Substitution applied to each line (in $_)
sub my_edit_func { s/foo/bar/g; print $_ }

if ( !(-t STDIN) ) {
    # STDIN is not a terminal, so input is coming from a pipe:
    # filter it line by line and write the result to STDOUT
    while (<>) { my_edit_func() }
}
else {
    # STDIN is a terminal: edit the files named on the command line, as before
    foreach my $argnum (0 .. $#ARGV) {
        edit_file_lines { my_edit_func() } $ARGV[$argnum];
    }
}
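With that change the same script should handle both cases. Assuming it is saved as my-perl-script (a placeholder name here) and made executable, something like the following would exercise each branch:

    command blah | ./my-perl-script | sort | uniq   # pipe mode: filters STDIN to STDOUT
    ./my-perl-script somefile.txt                    # file mode: edits somefile.txt in place
                                                     # (and echoes it, because of the print)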
See perldoc -X for more information on the -t file test operator.
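As a quick sanity check of the -t test itself, a throwaway one-liner (purely illustrative) prints "tty" when run interactively and "pipe" when its input comes from a pipe:

    perl -e 'print( (-t STDIN) ? "tty\n" : "pipe\n" )'
    echo hello | perl -e 'print( (-t STDIN) ? "tty\n" : "pipe\n" )'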