Hello,
I am trying to download some data onto my cluster using sbatch (I'm submitting from Bash on a Mac M1 running Big Sur). My script is:
#!/bin/sh
#SBATCH --job-name=sample10
#SBATCH --nodes=2
#SBATCH --ntasks=48
#SBATCH --time=02:00:00
#SBATCH --error=sample10.err
wget ftp://ftp.sra.ebi.ac.uk/vol1/run/SRR136/SRR13644609/possorted_genome_bam10.bam.1
However, when I run sbatch sample10.sh, my error file contains multiple repeats of the following error:
--2022-06-25 10:53:49-- ftp://ftp.sra.ebi.ac.uk/vol1/run/SRR136/SRR13644609/possorted_genome_bam10.bam.1
(try: 2) => ‘possorted_genome_bam10.bam.1’
Connecting to ftp.sra.ebi.ac.uk (ftp.sra.ebi.ac.uk)|... connected.
Error in server response. Closing.
Retrying.
I thought this might mean something is wrong with the wget command itself. However, if I run the same wget ftp://ftp.sra.ebi.ac.uk/vol1/run/SRR136/SRR13644609/possorted_genome_bam10.bam.1
command by itself in Terminal, everything works properly.
But if something were wrong with my Slurm directives, the job wouldn't proceed to the wget line at all, right?
Has anyone encountered anything like this?
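In case it helps, here is a minimal test job I was considering, to check whether the compute nodes can reach the FTP server at all. The filename nettest.sh and the extra wget flags are just my guesses at diagnostics, not something I've confirmed fixes anything:

```shell
#!/bin/sh
#SBATCH --job-name=nettest
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --time=00:05:00
#SBATCH --error=nettest.err

# Resolve the host first, to separate DNS problems from FTP problems.
nslookup ftp.sra.ebi.ac.uk

# Retry a few times, with a timeout, and force passive-mode FTP,
# which some cluster firewalls require for outbound connections.
wget --tries=3 --timeout=60 --passive-ftp \
    ftp://ftp.sra.ebi.ac.uk/vol1/run/SRR136/SRR13644609/possorted_genome_bam10.bam.1
```

If nslookup succeeds but wget still fails from inside the job, that would suggest the compute nodes are firewalled differently from the login node.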