Jack · 10.6 years ago
I'm trying to run Pindel on some whole-exome tumor/normal pairs. The BAMs are about 40 GB each. I got a std::bad_alloc error on my first attempt. Below is the command line; I set the window size to 1 and disabled the search for breakpoints and long insertions, as suggested in the FAQ:
pindel -f ucsc.hg19.fasta -i config.txt -o test -w 1 -l false -k false
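For completeness, config.txt uses the standard Pindel layout of one BAM per line with three whitespace-separated columns: BAM path, expected insert size, and sample label. The paths and insert sizes below are placeholders, not my actual files:

/data/exomes/tumor_01.bam 250 TUMOR_01
/data/exomes/normal_01.bam 250 NORMAL_01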
Here's the output:
Initializing parameters...
Pindel version 0.2.5a3, Oct 24 2013.
Loading reference genome ...
terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
Aborted (core dumped)
I'm using 32-bit Ubuntu 12.04 with 8 GB of RAM, about 6 GB free.
Any suggestions?
I hate to do a "me too" post, but... me too. I haven't been able to find any response to this error anywhere online. I don't usually have trouble running a ~40 GB BAM, but I have to give it about 32 GB of RAM, which seems excessive for an algorithm that's supposed to be relatively efficient.

My problem right now is that I have whole-genome BAMs between 200 and 500 GB, and they're dying with bad_alloc. It happens even when I run a single small chromosome with -w 1 and -[rtlks] false, and give it 96 GB of RAM. Same result whether I use multiple threads or not.

I'm running some tests on our big-memory machines right now to see whether a ludicrous amount of memory (say 256 GB) helps, but something seems wrong if we have to reserve memory on the order of the entire BAM just to get Pindel to run. My current test BAM is 290 GB, with an average insert size of 450.
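For reference, the kind of per-chromosome invocation I've been testing looks roughly like this. The reference path, config file name, chromosome, and thread count are placeholders; the flags spelled out here are the ones abbreviated as -[rtlks] above:

pindel -f ucsc.hg19.fasta -i config_wgs.txt -o test_chr21 -c chr21 -w 1 -r false -t false -l false -k false -s false -T 4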