Large file download
2.2 years ago
MattiaF • 0

Hi everyone,

I need to download a large file (~40 GB) from this online repository (https://data.goettingen-research-online.de/dataverse/gro), which unfortunately has some sort of timeout that interrupts the connection after 15 minutes.

I have already tried using wget to restart the download, but it is not working.

Do you have any suggestions on how to proceed in situations like this?

Thank you!

Mattia

bash • 448 views
2.2 years ago

wget has a --continue (-c) flag, which should work unless the download URL changes on each retry.
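A minimal sketch of that approach (the URL below is a placeholder for the actual download link, not a real path on the site):

```shell
# -c / --continue resumes from an existing partial file;
# --tries=0 keeps retrying indefinitely after a dropped connection;
# --retry-connrefused also retries if the server refuses the connection;
# --timeout=60 detects a stalled connection sooner.
# Placeholder URL: substitute the real file download link.
wget -c --tries=0 --retry-connrefused --timeout=60 \
    "https://data.goettingen-research-online.de/<file-download-link>"
```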

Alternatively, you could use curl with -C - so it resumes from where the partial file left off.
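Since curl exits when the connection drops, you can wrap it in a retry loop so each restart resumes the partial file. A minimal sketch (the `retry_until_ok` helper name and the URL are illustrative, not from the original answer):

```shell
#!/bin/sh
# Run a command repeatedly until it exits successfully,
# waiting RETRY_DELAY seconds (default 5) between attempts.
retry_until_ok() {
    until "$@"; do
        echo "transfer interrupted, retrying in ${RETRY_DELAY:-5}s..." >&2
        sleep "${RETRY_DELAY:-5}"
    done
}

# Usage (placeholder URL): -L follows redirects, -O keeps the remote
# file name, and -C - resumes from the existing partial file.
# retry_until_ok curl -L -O -C - \
#     "https://data.goettingen-research-online.de/<file-download-link>"
```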
