Why does Git.pm on Cygwin complain about 'Out of memory during "large" request'?
- by Charles Ma
Hi,
I'm getting this error while doing a git svn rebase in Cygwin:
Out of memory during "large" request for 268439552 bytes, total sbrk() is 140652544 bytes at /usr/lib/perl5/site_perl/Git.pm line 898, <GEN1> line 3.
268439552 bytes is 256MB (plus one 4KB page). Cygwin's maximum memory size is set to 1024MB, so I'm guessing that Perl has a different maximum memory size of its own?
How can I increase the maximum amount of memory that Perl programs can use?
UPDATE:
This is where the error occurs (in Git.pm):
while (1) {
    my $bytesLeft = $size - $bytesRead;
    last unless $bytesLeft;
    my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
    my $read = read($in, $blob, $bytesToRead, $bytesRead); # line 898
    unless (defined($read)) {
        $self->_close_cat_blob();
        throw Error::Simple("in pipe went bad");
    }
    $bytesRead += $read;
}
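Each pass of that loop appends into the same scalar via the four-argument read's OFFSET argument, so $blob ends up holding the entire blob in memory at once. For comparison, here is a sketch of the streaming alternative (my own code, not Git.pm's; $in and $out are assumed open filehandles): read into a small throwaway buffer and write each chunk out immediately, so memory use stays at the chunk size.

use strict;
use warnings;

# Sketch: stream $size bytes from $in to $out in 1KB chunks, never
# holding more than one chunk in memory at a time.
sub copy_bytes {
    my ($in, $out, $size) = @_;
    my $bytesRead = 0;
    while ($bytesRead < $size) {
        my $bytesLeft   = $size - $bytesRead;
        my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
        my $read = read($in, my $chunk, $bytesToRead);  # fresh small buffer, no offset
        die "read error: $!" unless defined $read;
        die "unexpected EOF" unless $read;
        print {$out} $chunk or die "write error: $!";   # write now, don't accumulate
        $bytesRead += $read;
    }
    return $bytesRead;
}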
I've added a print before line 898 to show $bytesToRead and $bytesRead; the result was 1024 for $bytesToRead and 134220800 for $bytesRead, so it's reading 1024 bytes at a time and has already accumulated 128MB in $blob. The failed request (256MB) is exactly double that, so Perl's read must be running out of room in the scalar's buffer and trying to grow it to double its current size... is there a way to control how much memory it requests, or is that implementation-dependent?
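The doubling conjecture can be checked outside Git.pm with a standalone sketch (my own test, not Git.pm code): append to a scalar with the four-argument read and dump its internals with the core Devel::Peek module, where CUR is the bytes in use and LEN is the allocated buffer. If the allocation grows geometrically, LEN should jump ahead of CUR in large steps rather than tracking it byte for byte.

use strict;
use warnings;
use Devel::Peek;

# Append 1MB at a time and watch the scalar's allocated size (LEN)
# grow ahead of its used length (CUR) in the dumps on STDERR.
open(my $in, '<', '/dev/zero') or die "can't open /dev/zero: $!";
my ($buf, $offset) = ('', 0);
for (1 .. 8) {
    my $read = read($in, $buf, 1 << 20, $offset);
    die "read failed: $!" unless defined $read;
    $offset += $read;
    Dump($buf);    # compare the CUR and LEN lines between iterations
}
close($in);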
UPDATE 2:
While testing memory allocation in Cygwin, I found that this C program's output was 1536MB:
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    unsigned int bit = 0x40000000, sum = 0;
    char *x;
    /* Halve the request size from 1GB down; keep every successful
       allocation so sum reflects the total the process can hold. */
    while (bit > 4096) {
        x = malloc(bit);
        if (x)
            sum += bit;
        bit >>= 1;
    }
    printf("%08x bytes (%.1fMB)\n", sum, sum / 1024.0 / 1024.0);
    return 0;
}
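Perl can't be probed quite the same way, since its "Out of memory" error aborts the interpreter rather than returning a failure you can test for. A rough counterpart (my own sketch) is to attempt each allocation in a forked child and judge it by the child's exit status:

use strict;
use warnings;

# Probe the largest single string Perl can allocate, halving from 1GB.
# Perl's "Out of memory" aborts the process, so each attempt runs in a
# forked child and the parent checks the exit status.
my $bit = 0x40000000;
while ($bit > 4096) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        my $s = "\0" x $bit;    # allocate $bit bytes in the child
        exit 0;
    }
    waitpid($pid, 0);
    printf "%10d bytes (%.1fMB): %s\n", $bit, $bit / 1024 / 1024,
        $? == 0 ? "ok" : "failed";
    $bit >>= 1;
}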
This Perl program, by contrast, crashed when the file was larger than 384MB (but succeeded when it was smaller):
open(F, '<', '400') or die "can't read: $!";  # "400" is the test file
$size = -s "400";                             # file size in bytes
$read = read(F, $s, $size);                   # slurp it in one request
The error is similar:
Out of memory during "large" request for 536875008 bytes, total sbrk() is 217088 bytes at mem.pl line 6.
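Whatever the per-process ceiling turns out to be, the failure here comes from asking for one buffer the size of the whole file. If the goal is to examine the file rather than hold it, a streaming interface keeps memory flat regardless of file size. A sketch using the core Digest::MD5 module, whose addfile reads in small internal chunks (the checksum just stands in for whatever processing is actually needed):

use strict;
use warnings;
use Digest::MD5;

# Process the big file without slurping it: addfile streams the
# contents through in small chunks, so memory use stays constant.
open(my $fh, '<', '400') or die "can't read: $!";
binmode($fh);
my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
close($fh);
print "md5: $digest\n";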