daffodil · 9 months ago
Hi, I have run MarkDuplicates from Picard but I got the following error. It would be appreciated if you could help me.
Feb 08, 2024 8:51:48 AM com.intel.gkl.NativeLibraryLoader load
INFO: Loading libgkl_compression.so from jar:file:/sw/bioinfo/picard/3.1.1/rackham/picard.jar!/com/intel/gkl/native/libgkl_compression.so
[Thu Feb 08 08:51:48 CET 2024] MarkDuplicates INPUT=[SPG_rep3_uniquely_mapped.bam] OUTPUT=SPG_rep3_marked_duplicates.bam METRICS_FILE=SPG_rep3_metrics.txt REMOVE_DUPLICATES=true MAX_SEQUENCES_FOR_DISK_READ_ENDS_MAP=50000 MAX_FILE_HANDLES_FOR_READ_ENDS_MAP=8000 SORTING_COLLECTION_SIZE_RATIO=0.25 TAG_DUPLICATE_SET_MEMBERS=false REMOVE_SEQUENCING_DUPLICATES=false TAGGING_POLICY=DontTag CLEAR_DT=true DUPLEX_UMI=false FLOW_MODE=false FLOW_QUALITY_SUM_STRATEGY=false USE_END_IN_UNPAIRED_READS=false USE_UNPAIRED_CLIPPED_END=false UNPAIRED_END_UNCERTAINTY=0 FLOW_SKIP_FIRST_N_FLOWS=0 FLOW_Q_IS_KNOWN_END=false FLOW_EFFECTIVE_QUALITY_THRESHOLD=15 ADD_PG_TAG_TO_READS=true ASSUME_SORTED=false DUPLICATE_SCORING_STRATEGY=SUM_OF_BASE_QUALITIES PROGRAM_RECORD_ID=MarkDuplicates PROGRAM_GROUP_NAME=MarkDuplicates READ_NAME_REGEX=<optimized capture of last three ':' separated fields as numeric values> OPTICAL_DUPLICATE_PIXEL_DISTANCE=100 MAX_OPTICAL_DUPLICATE_SET_SIZE=300000 VERBOSITY=INFO QUIET=false VALIDATION_STRINGENCY=STRICT COMPRESSION_LEVEL=5 MAX_RECORDS_IN_RAM=500000 CREATE_INDEX=false CREATE_MD5_FILE=false USE_JDK_DEFLATER=false USE_JDK_INFLATER=false
[Thu Feb 08 08:51:48 CET 2024] Executing as ozatalab@rackham3.uppmax.uu.se on Linux 3.10.0-1160.108.1.el7.x86_64 amd64; OpenJDK 64-Bit Server VM 17+35-2724; Deflater: Intel; Inflater: Intel; Provider GCS is available; Picard version: 3.1.1
INFO 2024-02-08 08:51:48 MarkDuplicates Start of doWork freeMemory: 1005193264; totalMemory: 1019805696; maxMemory: 16307191808
INFO 2024-02-08 08:51:48 MarkDuplicates Reading input file and constructing read end information.
INFO 2024-02-08 08:51:48 MarkDuplicates Will retain up to 59084028 data points before spilling to disk.
INFO 2024-02-08 08:51:54 MarkDuplicates Read 1,000,000 records. Elapsed time: 00:00:05s. Time for last 1,000,000: 5s. Last read position: chr10:82,178,530
INFO 2024-02-08 08:51:54 MarkDuplicates Tracking 328 as yet unmatched pairs. 227 records in RAM.
[Thu Feb 08 08:51:56 CET 2024] picard.sam.markduplicates.MarkDuplicates done. Elapsed time: 0.14 minutes.
Runtime.totalMemory()=1019805696
To get help, see http://broadinstitute.github.io/picard/index.html#GettingHelp
Exception in thread "main" htsjdk.samtools.SAMException: Value was put into PairInfoMap more than once. 1: RGNB551428:57:HT2V3BGXV:1:22111:15131:18178
at htsjdk.samtools.CoordinateSortedPairInfoMap.ensureSequenceLoaded(CoordinateSortedPairInfoMap.java:133)
at htsjdk.samtools.CoordinateSortedPairInfoMap.remove(CoordinateSortedPairInfoMap.java:86)
at picard.sam.markduplicates.util.DiskBasedReadEndsForMarkDuplicatesMap.remove(DiskBasedReadEndsForMarkDuplicatesMap.java:61)
at picard.sam.markduplicates.MarkDuplicates.buildSortedReadEndLists(MarkDuplicates.java:560)
at picard.sam.markduplicates.MarkDuplicates.doWork(MarkDuplicates.java:270)
at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:280)
at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:105)
at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:115)
See also: "MarkDuplicates: Value was put into PairInfoMap more than once"; "Picard ValidateSamFile error: Value was put into PairInfoMap more than once"; "Picard MarkDuplicates error: Value was put into PairInfoMap more than once"; etc.
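This exception generally means the input BAM contains a read name that occurs more than twice, so MarkDuplicates encounters the "same" pair a second time — a common cause is keeping secondary or supplementary alignments (or accidentally merging a file with itself) when producing the "uniquely mapped" BAM. A minimal diagnostic sketch: the read-name column is simulated here with `printf`; on a real file you would feed `samtools view SPG_rep3_uniquely_mapped.bam | cut -f1` into the same pipeline.

```shell
# Sketch: flag read names that appear more than twice in the name column.
# In a proper paired-end BAM each name should appear at most twice
# (once per mate); names seen more often trigger the PairInfoMap error.
# The printf line stands in for: samtools view input.bam | cut -f1
printf 'readA\nreadA\nreadB\nreadB\nreadB\n' \
  | sort \
  | uniq -c \
  | awk '$1 > 2 {print $2}'   # prints: readB
```

If offending names do turn up, a common remedy is to drop secondary and supplementary alignments before MarkDuplicates, e.g. `samtools view -b -F 0x900 in.bam > primary.bam` (flag 0x900 combines 0x100, secondary, with 0x800, supplementary).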