The memory in sort is allocated per thread, so this is effectively 16 threads * 32 GB = 512 GB of requested RAM -- that's not going to happen, and it's also not necessary for sorting. Do something like -@ 8 -m 1G; that is by far enough, and the speed improvement from going higher is so minimal that it does not merit the additional resources.
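For illustration, a sort invocation with those settings might look like this (a sketch; the filenames are placeholders for your own data):

```shell
# -@ 8: 8 sorting/compression threads
# -m 1G: 1 GB of sort buffer PER THREAD, so ~8 GB total -- not 8 * 32 GB
samtools sort -@ 8 -m 1G -o sample.sorted.bam sample.bam
```

Note that `-m` multiplies by the thread count, which is exactly why 16 threads at 32 GB each balloons to 512 GB.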
I stepped away to use the restroom, so my computer was inactive for a few minutes and lost its connection to the server -- all the work from hours of running is gone 🥲. Maybe I need to use batch submission.
Not sure who you are talking to. Of course jobs on a server should be submitted through the scheduler and run independently of an active SSH connection. It sounds like you should talk to the admin of this server and ask for an introduction to the basics. I'm not saying this to taunt you, but these really are basics you have to know. Also be sure you are using the job nodes, if they exist, rather than the head node...
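Submitting through the scheduler might look something like this (assuming a SLURM cluster -- partition names, resource limits, and filenames here are hypothetical, so check with your admin):

```shell
#!/bin/bash
# Hypothetical SLURM batch script -- adapt to your cluster's conventions.
#SBATCH --job-name=bam_sort
#SBATCH --cpus-per-task=8
#SBATCH --mem=16G          # total memory for the job, comfortably above 8 * 1G sort buffers
#SBATCH --time=04:00:00

# samtools sort allocates -m per thread, so 8 threads * 1G stays well under --mem
samtools sort -@ 8 -m 1G -o sample.sorted.bam sample.bam
```

You would submit it with `sbatch sort.sh`; the job then runs on a compute node and survives you closing your laptop or losing the SSH connection.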
Thanks ATpoint! The first BAM file is 66 GB, so the first command, without specifying threads or memory, has been running for hours:
I used batch submission and it didn't take long. Thank you!
If you're allowed to run tasks on your server's login node, you can use tmux to keep a command running regardless of your connection.
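A minimal tmux workflow for this might be (the session name and command are placeholders):

```shell
# Start a new named session on the server
tmux new -s sort

# ...inside the session, launch the long-running command:
samtools sort -@ 8 -m 1G -o sample.sorted.bam sample.bam

# Detach with Ctrl-b then d; the command keeps running on the server.
# Later, after reconnecting via ssh, reattach to the same session:
tmux attach -t sort
```

The key point is that the process belongs to the tmux server, not to your SSH session, so a dropped connection no longer kills it.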
My job on the server finished. Thank you!