Tags: scripting, curl, wget

Is there a curl/wget option that prevents saving files in case of http errors?


I want to download a lot of URLs in a script, but I do not want to save the ones that lead to HTTP errors.

As far as I can tell from the man pages, neither curl nor wget provides such functionality. Does anyone know of another downloader that does?


Solution

  • One-liner I just set up for this very purpose:

    (works only for a single file; might be useful for others)

    A=$$; ( wget -q "http://example.com/pipo.txt" -O "$A.d" && mv "$A.d" pipo.txt ) || { rm -f "$A.d"; echo "Download failed; removing temp file"; }
    

    This will attempt to download the file from the remote host into a temporary file named after the shell's PID. If wget exits with an error (e.g. an HTTP error), the temporary file is removed; otherwise it is renamed to the final filename.
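
    As a sketch, the same download-to-temp-then-rename pattern can be wrapped in a helper and applied to many URLs. This version uses curl's -f/--fail flag, which makes curl exit non-zero on HTTP errors (status >= 400); since some curl versions may still create an empty output file in that case, the temp-file guard is kept. The helper name fetch_clean and the example URLs are hypothetical, not from the original post:

    ```shell
    # Download $1 to $2, keeping the output file only on success.
    fetch_clean() {
      url=$1
      out=$2
      tmp="$out.$$.part"                 # temp file named after the shell PID
      if curl -fsS -o "$tmp" "$url"; then
        mv "$tmp" "$out"                 # success: move into place atomically
      else
        rm -f "$tmp"                     # failure: discard any partial file
        echo "skipped $url" >&2
        return 1
      fi
    }

    # Hypothetical usage over a list of URLs:
    # for u in "http://example.com/a.txt" "http://example.com/b.txt"; do
    #   fetch_clean "$u" "$(basename "$u")"
    # done
    ```

    Using `mv` as the last step also means other processes never see a half-downloaded file under the final name.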