Subject: setting 'eol' affects global input record separator
The following demonstrates an issue where setting the 'eol' value to
CRLF causes a subsequent, unrelated file read to slurp its entire
contents at once. The expected behavior can be restored by setting $/
back to "\n".
Note that this is running on a Linux system, where a CRLF 'eol' is a
non-default value.
This script outputs:
100
1
100
While the expected output is:
100
100
100
Perhaps this is a deeper issue in IO::Handle or PerlIO, but I wanted to
raise it where I first observed the problem. Thanks!
- danboo
use strict;
use warnings;
use Text::CSV_XS;
## prints 100 (lines) as expected
slurp_check();
my $csv_data = "a,b,c" . "\015\012" . "1,2,3" . "\015\012";
open my $csv_fh, '<', \$csv_data or die $!;
my $csv = Text::CSV_XS->new( { eol => "\015\012" } );
my $csv_parse = $csv->getline($csv_fh);
## now prints 1
slurp_check();
## restore $/ to get 100 again
{ local $/ = "\n"; slurp_check() }
sub slurp_check
{
    my $data = join "\n", 1 .. 100;
    open my $fh, '<', \$data or die $!;
    print scalar @{[ <$fh> ]};
    close $fh;
    print "\n";
}
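Until the underlying fix lands, callers can defend themselves by localizing $/ around the offending call so any change it makes is confined to that scope. A minimal sketch of the idea, where clobbers_irs() is a stand-in for the getline() call that exhibits the bug:

```perl
use strict;
use warnings;

## stand-in for the call that sets $/ globally
sub clobbers_irs { $/ = "\015\012" }

## 'local $/ = $/' saves the current value and restores it on scope exit,
## so the clobbering is confined to this block
{
    local $/ = $/;
    clobbers_irs();
}

print $/ eq "\n" ? "preserved\n" : "clobbered\n";  ## prints "preserved"
```

The same wrapper around $csv->getline($csv_fh) should keep the later slurp_check() calls printing 100.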