I need to make a copy of my website. I'm using the instructions provided on the [official website][1] to exclude some URLs.
Here is the wget command:
/usr/bin/wget --no-host-directories -U Mozilla --recursive \
    --page-requisites --restrict-file-names=windows --convert-links \
    --html-extension --mirror -P /path/to/folder --no-parent -e \
    --exclude-domains http://www.external-domain.com/ \
    robots=off http://my-domain.com
I also tried this command, with an `=` sign after `--exclude-domains`:
/usr/bin/wget --no-host-directories -U Mozilla --recursive \
    --page-requisites --restrict-file-names=windows --convert-links \
    --html-extension --mirror -P /path/to/folder --no-parent -e \
    --exclude-domains=http://www.external-domain.com/ \
    robots=off http://my-domain.com
Either way, I get the following error: `wget: Invalid --execute command '--exclude-domains'`. So, how can I exclude the domains I don't want copied?
I'm sorry, I wasn't aware that the order of the options matters. I've changed the command like this:
/usr/bin/wget --no-host-directories -U Mozilla --recursive \
    --page-requisites --restrict-file-names=windows --convert-links \
    --exclude-domains www.external-domain.com --html-extension --mirror \
    -P /path/to/my/folder --no-parent -e robots=off www.my-domain.com

and it started working.
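In case it helps anyone else, a minimal sketch of the working pattern (all domain names and paths below are placeholders). Two details seem to matter: `-e` executes its very next argument as a `.wgetrc` command, so `robots=off` must immediately follow it, and `--exclude-domains` expects bare host names, not `http://` URLs:

```shell
# Sketch only; substitute your own domains and download folder.
# Keep "robots=off" directly after -e, and pass a plain host name
# (no scheme, no trailing slash) to --exclude-domains.
wget --mirror --page-requisites --convert-links --html-extension \
     --no-host-directories --no-parent --restrict-file-names=windows \
     -U Mozilla -P /path/to/my/folder \
     --exclude-domains www.external-domain.com \
     -e robots=off www.my-domain.com
```

The earlier error happened because `--exclude-domains` sat between `-e` and `robots=off`, so wget tried to execute `--exclude-domains` itself as a `.wgetrc` command.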