Download multiple fastq files

Put the links in a file (one per line). Then use wget -i url.txt (if your file is called url.txt). If you want to use curl instead, use xargs -n 1 curl -O < url.txt.
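A concrete sketch of that workflow (the accession URLs below are placeholders, and the download commands themselves are shown as comments):

```shell
# url.txt: one download link per line (placeholder EBI-style URLs)
cat > url.txt <<'EOF'
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR000/SRR000001/SRR000001.fastq.gz
ftp://ftp.sra.ebi.ac.uk/vol1/fastq/SRR000/SRR000002/SRR000002.fastq.gz
EOF

# wget reads the whole list itself:
#   wget -i url.txt
# curl takes one URL per invocation, so xargs feeds them one at a time:
#   xargs -n 1 curl -O < url.txt
```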

Is wget -i your_list what you are looking for?

After making an input_files.txt file with the SRR names in the first column, put it in an input_file_folder. You can then use this script:

cd input_file_folder
NUMB_LINES=$(wc -l < input_files.txt)
for i in $(seq 1 "$NUMB_LINES"); do
    FILE_NAME=$(head -n "$i" input_files.txt | tail -n 1 | cut -f 1)
    VAR1=$(head -n "$i" input_files.txt | tail -n 1 | cut -f 2)
    VAR2=$(head -n "$i" input_files.txt | tail -n 1 | cut -f 3)
    INPUT=ftp:link    # build the actual FTP URL here from the fields above
    wget "$INPUT" -O "$FILE_NAME"
done
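The same per-line extraction can be done in a single pass with a while read loop, instead of re-reading the file with head | tail on every iteration. A minimal sketch, with a hypothetical two-line input_files.txt and an echo standing in for the real wget call:

```shell
# Hypothetical tab-separated input: accession in column 1, two extra fields
printf 'SRR000001\tA\tB\nSRR000002\tC\tD\n' > input_files.txt

# read splits each line on whitespace (tabs included) into the three variables
while read -r FILE_NAME VAR1 VAR2; do
    # replace this echo with the real download, e.g. wget "$URL" -O "$FILE_NAME"
    echo "would fetch $FILE_NAME (extra fields: $VAR1, $VAR2)"
done < input_files.txt
```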
4.4 years ago

Save the links in a text file and use GNU parallel with wget to download the files simultaneously.

parallel --verbose "wget {}" ::: $(cat download.txt)

If you want to use multiple connections per download, use aria2.

parallel --verbose "aria2c -x <number of connections> {}" ::: $(cat download.txt)

Assuming the OP likely has a single Ethernet port on their machine, this sounds like an invitation for trouble. Multiple threads will just compete with each other and may ultimately take more time to complete the download.
