java.io.FileNotFoundException (Too Many Open Files): Picard Tools
2
1
12.1 years ago
hellbio ▴ 520

Hi,

I am using the following command,

java -jar /csc/lohi/dog_tools/picard-tools-1.70/AddOrReplaceReadGroups.jar \
  I=bull2.bam \
  O=bull2_rg_sorted.bam \
  SORT_ORDER=coordinate \
  RGID=bull2 \
  RGLB=bull2 \
  RGPL=Illumina \
  RGPU=bull2 \
  RGSM=bull2 \
  VALIDATION_STRINGENCY=SILENT \
  CREATE_INDEX=True \
  TMP_DIR=`pwd`/tmp

and it says

Exception in thread "main" net.sf.samtools.util.RuntimeIOException: java.io.IOException: (Too many open files)

Any suggestions to fix this?

picard • 14k views
ADD COMMENT
5
12.1 years ago

Add the parameter MAX_FILE_HANDLES_FOR_READ_ENDS_MAP. I set it to 1000 and it helped :-) with the explanation here
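For reference, a MarkDuplicates invocation using that parameter might look like the sketch below. The jar path, BAM names, and metrics file are placeholders, not from this thread; only MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=1000 comes from the answer above. The command is built as a string and echoed so the sketch stands on its own without Picard installed:

```shell
# Hypothetical MarkDuplicates command (placeholder paths/file names);
# MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=1000 is the setting suggested above.
md_cmd="java -jar picard-tools-1.70/MarkDuplicates.jar \
  INPUT=input.bam \
  OUTPUT=dedup.bam \
  METRICS_FILE=dedup_metrics.txt \
  MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=1000"

# Print the assembled command; remove the quotes and run it directly
# once the paths point at real files.
echo "$md_cmd"
```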

ADD COMMENT
0

MAX_FILE_HANDLES_FOR_READ_ENDS_MAP is a command-line parameter for the MarkDuplicates tool, whereas my task uses AddOrReplaceReadGroups.jar, which doesn't have that parameter.

Any help?

ADD REPLY
1

There is another parameter for that. Looking at the link posted by Noolean, you'll find:

There are several things you can do:

1) On Unix systems, the allowed number of open files can be increased. ulimit -n will show you how many open files a process is allowed to have. You can ask your system administrator to increase that number.

2) Many Picard programs use temporary files to sort. You can increase the value of the MAX_RECORDS_IN_RAM command-line parameter to tell Picard to store more records in fewer files, reducing the number of open files. This will, however, increase memory consumption.

3) MarkDuplicates has the command-line parameter MAX_FILE_HANDLES_FOR_READ_ENDS_MAP. By reducing this number, you reduce the number of concurrently open files (trading off execution speed).
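As a quick check of remedy 1), you can inspect the per-process limit and, within the hard limit, raise the soft limit from the shell before launching Picard (a sketch only; raising the hard limit itself usually requires administrator privileges):

```shell
# Show the current soft limit on open file descriptors for this shell.
ulimit -n

# Try to raise the soft limit up to the hard limit for this session;
# ignore failure on systems that disallow it (e.g. an "unlimited" hard limit).
ulimit -Sn "$(ulimit -Hn)" 2>/dev/null || true

# Confirm the (possibly raised) limit that a Picard run would inherit.
ulimit -n
```

Note that the change only applies to the current shell session and its children, so run Picard from the same shell.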

ADD REPLY
0

Thank you for this information. I was struggling with Picard's AddOrReplaceReadGroups to sort and add read-group information while converting my .sam to a .bam file. ulimit -n (500,000) didn't help, but then I used MAX_RECORDS_IN_RAM=1000000, which finally enabled my command to run successfully, and I got the desired output .bam file.
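Applied to the original command in the question, that fix amounts to adding one parameter. The sketch below builds the command as a string and echoes it (so it stands on its own without Picard installed); the paths and read-group values are the poster's own:

```shell
# The question's AddOrReplaceReadGroups command plus MAX_RECORDS_IN_RAM=1000000,
# the setting this reply reports as the fix.
arg_cmd="java -jar /csc/lohi/dog_tools/picard-tools-1.70/AddOrReplaceReadGroups.jar \
  I=bull2.bam O=bull2_rg_sorted.bam SORT_ORDER=coordinate \
  RGID=bull2 RGLB=bull2 RGPL=Illumina RGPU=bull2 RGSM=bull2 \
  VALIDATION_STRINGENCY=SILENT CREATE_INDEX=True \
  MAX_RECORDS_IN_RAM=1000000 TMP_DIR=$(pwd)/tmp"

# Print the assembled command; run it directly once the input BAM is in place.
echo "$arg_cmd"
```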

ADD REPLY
0

I was having a problem with MarkDuplicates, and setting MAX_FILE_HANDLES_FOR_READ_ENDS_MAP to 1000 as pointed out above did the trick!

ADD REPLY
0

Great, I'd appreciate an upvote on my answer if it helped you.

ADD REPLY
2
ADD COMMENT
0

Hi Albert,

ulimit is set to unlimited and MAX_RECORDS_IN_RAM=1000000, but I still have the same issue. Any suggestions would help.

ADD REPLY
