STAR alignment killed
6.3 years ago • jsl ▴ 50

Hi, I am trying to run STAR alignment using the command below:

STAR --runThreadN 10 --runMode genomeGenerate --genomeDir /home/user/scratch60/beargenomes –genomeFastaFile /home/user/scratch60/hg38.fa --readFilesIn /home/user/scratch60/SRR7059137.fastq --sjdbGTFfile /home/user/scratch60/gencode.v28.annotation.gtf  --sjdbOverhang 99 --outFileNamePrefix /home/user/scratch60/SRR7059137 --outSAMtype BAM

and this came up:

Aug 09 11:40:01 ..... started STAR run
Aug 09 11:40:01 ... starting to generate Genome files
Killed

I would like to understand why... if anyone can shed some light on this, that'd be much appreciated. Thanks!

RNA-Seq alignment

I'm having the same problem. Can you elaborate on the Slurm job you submitted?


Please post a new question with the necessary details and the code you used.

Please be sure to read and follow: How To Ask A Good Question

6.3 years ago • GenoMax 147k

How much memory is being assigned? STAR requires 30+ GB of RAM for human genome generation and alignment.
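On a Slurm cluster, that memory has to be requested explicitly; otherwise the kernel's out-of-memory killer terminates the process with exactly this kind of bare "Killed" message. A minimal sketch of a submission script (memory value and time limit are illustrative, not from this thread) — note that genomeGenerate expects --genomeFastaFiles (plural, with a double dash; the command in the question has an en-dash and the singular form), and that --readFilesIn belongs to a separate alignment run, not to index generation:

```bash
#!/bin/bash
#SBATCH --job-name=star-index
#SBATCH --cpus-per-task=10
#SBATCH --mem=40G            # STAR needs 30+ GB for the human genome
#SBATCH --time=02:00:00

# Index generation only; align the FASTQ in a second STAR run.
STAR --runThreadN 10 \
     --runMode genomeGenerate \
     --genomeDir /home/user/scratch60/beargenomes \
     --genomeFastaFiles /home/user/scratch60/hg38.fa \
     --sjdbGTFfile /home/user/scratch60/gencode.v28.annotation.gtf \
     --sjdbOverhang 99
```

Submitted with sbatch, a job that exceeds the requested --mem is then killed in a way that shows up in sacct, rather than dying silently.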


And maybe --runThreadN 10 is multiplying the requirements. Try one thread and monitor memory usage, then try two threads and monitor again.


I tried 1 thread but it still didn't work :(


I'm running on a cluster in an interactive bash session... how can I determine the memory?

I ran /usr/bin/time in front of the STAR command: 11.76user 4.19system 0:16.34elapsed 97%CPU (0avgtext+0avgdata 5243688maxresident)

does this help?
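For what it's worth, GNU time reports maxresident in KiB, so the 5243688 above corresponds to a peak of roughly 5 GB before the kill. On Linux, the node's memory can also be read straight from /proc/meminfo; a minimal sketch (the helper name meminfo_gb is mine):

```bash
#!/bin/sh
# Summarize total and available memory in GB from a meminfo-style file.
# Defaults to /proc/meminfo; the file argument makes it testable anywhere.
meminfo_gb() {
    awk '/^MemTotal:|^MemAvailable:/ {printf "%s %.1f GB\n", $1, $2 / 1024 / 1024}' "${1:-/proc/meminfo}"
}
```

Calling meminfo_gb with no argument on the cluster prints two lines, e.g. "MemTotal: 31.3 GB".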


Can you run top and show us the 4-5 lines at the top of the window?


I can't copy it, so here's an image of the 4-5 lines at the top of the window! https://s22.postimg.cc/mxqg8764x/top.png


Find the log file generated from the run and see why it got killed. Alternatively, upload it somewhere (like pastebin) for us to see.


Hi Marks,

I've uploaded the log.out file onto pastebin like you suggested:

https://pastebin.com/WUhxYpix


Nothing there unfortunately. Can you paste Log.progress.out and Log.final.out just to make sure?


Do you know how much RAM the cluster has available? This is probably an issue with STAR reaching the RAM limit. You could try setting --genomeChrBinNbits to 12 to limit the RAM usage.
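Beyond --genomeChrBinNbits, STAR documents a couple of other knobs for trimming the index's memory footprint; a hedged sketch of an indexing command combining them (the specific values are illustrative — check the STAR manual for your version):

```bash
# --genomeSAsparseD 2: sparser suffix array, less RAM, slower mapping.
# --limitGenomeGenerateRAM: cap, in bytes, on RAM used during index sorting.
STAR --runMode genomeGenerate \
     --genomeDir /home/user/scratch60/beargenomes \
     --genomeFastaFiles /home/user/scratch60/hg38.fa \
     --genomeChrBinNbits 12 \
     --genomeSAsparseD 2 \
     --limitGenomeGenerateRAM 16000000000
```

If STAR still cannot get the memory it asks for, the allocation failure surfaces as a C++ std::bad_alloc rather than a kernel-level kill.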


Thanks Marks. There isn't a Log.progress.out or Log.final.out associated with that run. But I did try --genomeChrBinNbits, and the run was aborted with the following:

Aug 10 07:16:06 ..... started STAR run
Aug 10 07:16:06 ... starting to generate Genome files
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
Aborted

junsionglow: I would like to see something right at the top that shows the amount of memory available on your node, e.g.:

%Cpu(s):  0.5 us,  0.4 sy,  0.0 ni, 99.1 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem : 32781404 total,   483144 free, 17888072 used, 14410188 buff/cache
KiB Swap:  2097148 total,  2080764 free,    16384 used. 14402508 avail Mem

Ah, this is it:

Tasks: 699 total,   1 running, 696 sleeping,   2 stopped,   0 zombie
%Cpu(s):  0.0 us,  0.4 sy,  0.3 ni, 99.2 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem : 13131726+total,   352784 free,  6140812 used, 12482367+buff/cache
KiB Swap: 16777212 total, 16484712 free,   292500 used. 12422304+avail Mem

If I am reading it right, it looks like you only have 16GB of RAM on this machine. That is not going to be enough to run STAR.


How much is recommended for STAR?


A minimum of 30 GB free for the human genome.


Thanks genomax and Marks. I will see what workaround I can find and get back to you guys.


Dear Marks and Genomax,

Finally got it to work. There was a two-part problem: one was my indexing, the other was memory (I believe). When I submitted a Slurm job requesting memory, it worked! Thanks guys!
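For completeness, the same memory request works for interactive sessions, which is where the original runs were dying; a sketch assuming a Slurm setup where srun supports --pty (resource values are illustrative):

```bash
# Ask Slurm for an interactive shell with enough memory for STAR,
# then run the indexing and alignment commands inside it.
srun --cpus-per-task=10 --mem=40G --time=02:00:00 --pty bash
```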


Nice, glad we could help. Good luck and have fun.

cat /proc/meminfo

[or trying desperately to copy-paste the output from top (scratching my head a little), then giving up :D]


top -d 100 should give you ample time to copy.
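Batch mode sidesteps the copy-paste problem entirely; a small sketch:

```bash
# -b: plain-text batch output instead of the interactive screen,
# -n 1: a single snapshot; head keeps just the summary lines.
top -b -n 1 | head -n 5
```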

MemTotal:       131484208 kB
MemFree:         5941884 kB
MemAvailable:   124402808 kB
Buffers:            1712 kB
Cached:         117349124 kB
SwapCached:       609500 kB
Active:         18888120 kB
Inactive:       102006036 kB
Active(anon):    2442888 kB
Inactive(anon):  2040928 kB
Active(file):   16445232 kB
Inactive(file): 99965108 kB
Unevictable:           0 kB
Mlocked:               0 kB
SwapTotal:      134217724 kB
SwapFree:       128618648 kB
Dirty:               364 kB
Writeback:             0 kB
AnonPages:       3081108 kB
Mapped:           194920 kB
Shmem:            939852 kB
Slab:            2960212 kB
SReclaimable:    2672528 kB
SUnreclaim:       287684 kB
KernelStack:       34400 kB
PageTables:        62552 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:    199959828 kB
Committed_AS:   12846268 kB
VmallocTotal:   34359738367 kB
VmallocUsed:      657620 kB
VmallocChunk:   34291795964 kB
HardwareCorrupted:     0 kB
AnonHugePages:    475136 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:      759796 kB
DirectMap2M:    33515520 kB
DirectMap1G:    101711872 kB
