Hi,
I have been running my DNA analysis files in parallel, but about 15 of them are now stuck at one step. Specifically, I noticed something is wrong because those files come out as 0 bytes (empty) when they go from BAM to sorted BAM. I have compared all the files at the previous steps (FASTQ, SAM, and BAM), and the stuck ones are a comparable size to the rest, right up until they get sorted.
This is the code I am using to sort them:
#!/bin/bash
#SBATCH -J bamSORT_all
#SBATCH -A cluster_username
#SBATCH -N 1 --ntasks-per-node=24
#SBATCH --mem-per-cpu=8G
#SBATCH -t 36:00:00
#SBATCH -o Report-%j.out
cd $SLURM_SUBMIT_DIR
ml picard/3.0.0
# Strip the .bam extension to build matching input/output names
for f in *.bam; do
    base="${f%.bam}"
    java -jar /usr/local/pace-apps/manual/packages/picard/3.0.0/build/libs/picard.jar SortSam \
        -I /scratch/bam_try2/"${base}".bam \
        -O /scratch/bam_try2/"${base}".sorted.bam \
        --SORT_ORDER coordinate
done
I have tried repeating the BAM-to-sorted-BAM step, the SAM-to-BAM conversion, and the bwa mem alignment to SAM. Every time I get back to the sorting step, those files come out empty again. What could be going on?
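For reference, one way to confirm which sorted outputs are actually zero bytes is a `find` with `-size 0`. This is just a sketch using a throwaway demo directory; in your case you would point it at /scratch/bam_try2:

```shell
# Demo directory standing in for /scratch/bam_try2
dir=$(mktemp -d)
touch "$dir/stuckSample.sorted.bam"         # simulates a 0-byte output
printf 'data' > "$dir/okSample.sorted.bam"  # simulates a non-empty output

# Print only the sorted BAMs that came out empty
find "$dir" -name '*.sorted.bam' -size 0
```

Running that against the real scratch directory would tell you exactly which samples failed, which makes it easier to rerun just those.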
Check the logs; stdout and stderr should have some information, in my opinion. Since your script only sets #SBATCH -o and no -e, both streams end up in the Report-<jobid>.out files in your submit directory.
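For example, a quick keyword scan of the report files often surfaces the failure. This is a sketch against a demo file (the Report-*.out naming comes from the #SBATCH -o line above, and the error text here is made up for illustration):

```shell
# Demo report file standing in for a real Report-<jobid>.out
dir=$(mktemp -d)
printf 'INFO SortSam started\nException in thread "main" java.lang.OutOfMemoryError\n' \
    > "$dir/Report-123.out"

# Print files and lines containing likely failure keywords
grep -i -E 'error|exception|killed|cancelled' "$dir"/Report-*.out
```

If Picard threw an exception partway through a file, it will usually show up in a scan like this.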