Hello all,
I'm attempting to use STAR to map some RNA-seq data, but keep hitting a fatal error on some of the files. Here is the command I used:
for i in `ls *_clean.fastq | sed 's/_clean.fastq//'`; do
    STAR --runThreadN 20 \
         --quantMode GeneCounts \
         --outFileNamePrefix aligned_haploid/$i \
         --outSAMtype BAM SortedByCoordinate \
         --outSAMunmapped Within \
         --genomeDir /media/smasher/4TB_data/pairwise_brb_8-2023/haploid_genome \
         --readFilesIn ${i}_clean.fastq
done
Here's the error I get:
EXITING because of FATAL ERROR: failed reading from temporary file: aligned_haploid/A01_STARtmp//BAMsort/2/44
It doesn't happen to every fastq file: I mapped 95 files and 5 of them gave this error. I thought it might be something to do with the fastqs themselves, so I mapped them individually rather than in a loop, and still got the same error. However, when I map them to a different but very similar genome, I get the error for a different set of files, so it doesn't seem to be tied to specific fastqs either.
Anyone know what's going on here and have an idea of how to fix it? I'm running STAR 2.7.10a installed via Mamba; could that be the problem?
Thanks!
Looks like this fixed it, thank you! After reading your response I started a run without sorting, and every single file has mapped successfully so far, including a couple that were giving the error before. This will be great to know going forward, thanks again!!!
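For anyone hitting the same thing later, the change is just swapping SortedByCoordinate for Unsorted. If you still need coordinate-sorted BAMs downstream, something like a samtools sort afterwards should cover it; the sketch below uses the same paths as my original command, and the samtools step is optional:

for i in `ls *_clean.fastq | sed 's/_clean.fastq//'`; do
    # write an unsorted BAM so STAR skips its internal BAMsort step
    STAR --runThreadN 20 \
         --quantMode GeneCounts \
         --outFileNamePrefix aligned_haploid/$i \
         --outSAMtype BAM Unsorted \
         --outSAMunmapped Within \
         --genomeDir /media/smasher/4TB_data/pairwise_brb_8-2023/haploid_genome \
         --readFilesIn ${i}_clean.fastq
    # optional: sort afterwards with samtools if downstream tools need coordinate-sorted BAMs
    samtools sort -@ 8 -o aligned_haploid/${i}_sorted.bam aligned_haploid/${i}Aligned.out.bam
done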
It's an internal drive; it's just mounted under /media.
Thanks, for me it was indeed the number of cores. I had always used 8 before; this time it kept giving the error with 24 cores.
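In case the exact change is useful: the only thing I touched was the thread count. The paths and sample name below are placeholders rather than my actual run:

# dropping from 24 back to 8 threads was enough to avoid the BAMsort temp-file error on my side
STAR --runThreadN 8 \
     --genomeDir /path/to/genome_index \
     --readFilesIn sample_clean.fastq \
     --outSAMtype BAM SortedByCoordinate \
     --outSAMunmapped Within \
     --outFileNamePrefix aligned/sample_
# --outBAMsortingThreadN can also be set explicitly if you want to keep alignment threads high
# while limiting the number of threads used for BAM sorting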