rnaSPAdes running out of memory: oom error
13 months ago
ha2606

I am trying to assemble a metatranscriptome sample (~83M reads) using rnaSPAdes (v 3.13.0), and I am running out of memory on these runs.

After running the following code:

rnaspades.py -m 1500 -t 8 -o rnaSPAdes_output_37 --pe1-1 SM037_R1.fastq --pe1-2 SM037_R2.fastq

I get the following SLURM error:

slurmstepd: error: Detected 1 oom-kill event(s) in StepId=19919309.batch cgroup. Some of your processes may have been killed by the cgroup out-of-memory handler

And the rnaSPAdes error:

== Error ==  finished abnormally, err code: -9

I have used these parameters to successfully assemble a metatranscriptome sample of similar size (~75M reads), and the memory limit set here should be adequate. I am worried there may be an issue with my input files: they were trimmed with Trimmomatic and quality-checked with FastQC, and I have been unable to find anything unusual about them.
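One quick sanity check on the input files (assuming uncompressed FASTQ; file names taken from the command above) is to confirm that the two mate files still contain the same number of reads after trimming:

```shell
# FASTQ stores one read per 4 lines, so divide line counts by 4
r1=$(( $(wc -l < SM037_R1.fastq) / 4 ))
r2=$(( $(wc -l < SM037_R2.fastq) / 4 ))
echo "R1: $r1 reads, R2: $r2 reads"
# The two counts should match exactly; a mismatch means the pairing
# was broken during pre-processing and the assembler may misbehave
```

This does not diagnose memory problems directly, but it rules out a truncated or unpaired input as the cause.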

Does anyone have experience with an error of this nature?

Thank you in advance!

metatranscriptome rnaSPADES SPADES memory

and the memory limits set here should be adequate

Every dataset is different, and it is possible that you need more RAM for this particular one. What does -m 1500 mean here: 15 GB or 1.5 GB? Allocate more RAM and keep trying until the oom-kill error goes away and the process completes (or you get some other error).
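Before allocating more blindly, it can help to see how much memory the failed job actually used. Assuming SLURM accounting is enabled on the cluster (job ID taken from the slurmstepd message above), sacct can report the peak resident memory:

```shell
# MaxRSS shows peak memory of each job step; ReqMem shows what was requested
sacct -j 19919309 --format=JobID,State,MaxRSS,ReqMem
```

If MaxRSS is close to ReqMem at the time of the OOM kill, the job simply hit the allocation ceiling rather than failing for another reason.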


Thanks for your response! The memory limit in SPAdes is set in GB, so it should be 1500 GB.


The memory limit in SPAdes is set in GB, so it should be 1500 GB.

Since you appear to be using SLURM: are you requesting an identical amount in your sbatch command or SLURM script? Probably not, and that is likely what is causing the process to get killed. You will need to use --mem=NNg in your SLURM directives (along with an appropriate core request).
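A minimal sketch of what such a submission script could look like, built from the rnaSPAdes command in the question (the job name is a placeholder; partition, time limit, and other site-specific directives are omitted):

```bash
#!/bin/bash
#SBATCH --job-name=rnaspades_37   # placeholder name
#SBATCH --cpus-per-task=8         # match the -t 8 given to SPAdes
#SBATCH --mem=1500g               # match (or exceed) the -m value given to SPAdes

rnaspades.py -m 1500 -t 8 -o rnaSPAdes_output_37 \
    --pe1-1 SM037_R1.fastq --pe1-2 SM037_R2.fastq
```

The key point is that the --mem request to SLURM must be at least as large as the -m limit passed to SPAdes; otherwise the cgroup OOM killer fires before SPAdes ever reaches its own limit.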


Yes, I was requesting a lower amount of RAM in my SLURM script (1000 GB) than in the SPAdes command. I had assumed the assembly would not hit this limit, but it's definitely possible that this was the problem. Thank you for your assistance!

