I want to download the following FASTQ files, for use with Salmon, all at once:
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/014/SRR10611214/SRR10611214_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/014/SRR10611214/SRR10611214_2.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/015/SRR10611215/SRR10611215_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/015/SRR10611215/SRR10611215_2.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/016/SRR10611216/SRR10611216_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/016/SRR10611216/SRR10611216_2.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/017/SRR10611217/SRR10611217_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/017/SRR10611217/SRR10611217_2.fastq.gz
Is there a way to do this using a bash for loop or fasterq-dump?
Put the links in a file, one per line. Then use

wget -i url.txt

(if your file is called url.txt). If you want to use curl instead, then use

xargs -n 1 curl -O < url.txt
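
As a concrete example with the links from the question (a sketch; the file name url.txt is just a placeholder):

# Save the FTP links, one per line, into url.txt
cat > url.txt <<'EOF'
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/014/SRR10611214/SRR10611214_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/014/SRR10611214/SRR10611214_2.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/015/SRR10611215/SRR10611215_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/015/SRR10611215/SRR10611215_2.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/016/SRR10611216/SRR10611216_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/016/SRR10611216/SRR10611216_2.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/017/SRR10611217/SRR10611217_1.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR106/017/SRR10611217/SRR10611217_2.fastq.gz
EOF

# Download every URL listed in the file
wget -i url.txt

# Or, with curl instead of wget
xargs -n 1 curl -O < url.txt
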
Is
wget -i your_list
what you are looking for?

After making an input_files.txt file containing only the SRR accession names, and putting it in a folder such as input_file_folder, you can use a script along these lines.
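
A minimal sketch, assuming fasterq-dump from sra-tools is installed and the accession list sits at input_file_folder/input_files.txt (both paths are placeholders):

#!/bin/bash
# Loop over SRR accessions, one per line, and fetch each run with fasterq-dump.
# --split-files writes SRRxxxxxxxx_1.fastq and SRRxxxxxxxx_2.fastq for
# paired-end runs; -O sets the output directory.
mkdir -p fastq_files
while read -r srr; do
    fasterq-dump --split-files -O fastq_files "$srr"
done < input_file_folder/input_files.txt

Note that fasterq-dump writes uncompressed FASTQ; run gzip fastq_files/*.fastq afterwards if you want .fastq.gz files like the FTP versions.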