I'm trying to verify that all of my page links are valid, and also something similar: that every page has a specified link, like a contact link. I use Python unit testing and Selenium IDE to record the actions that need to be tested.

So my question is: can I verify the links in a loop, or do I need to try every link on my own?

I tried to do this with __iter__, but it didn't get anywhere close. The reason may be that I'm poor at OOP, but I still think there must be another way of testing links than clicking and recording them one by one.
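
To make the question concrete, something like the following loop is what I'm hoping for, using Selenium WebDriver together with requests (the URL and the "contact" check are placeholders, and I'm not sure this is the right approach):

    import requests
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    driver.get("http://yourdomain.com")  # placeholder URL

    # Collect every href on the page instead of clicking links one by one.
    links = [a.get_attribute("href")
             for a in driver.find_elements(By.TAG_NAME, "a")
             if a.get_attribute("href")]
    driver.quit()

    # Does this page have a contact link? ("contact" is just a guess at the href text.)
    assert any("contact" in link.lower() for link in links), "no contact link found"

    # Does every link respond without an HTTP error?
    for link in links:
        response = requests.head(link, allow_redirects=True)
        assert response.status_code < 400, "broken link: %s" % link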
I would just use standard shell commands for this:

    grep --files-without-match 'contact' *.html

to find the files that don't have a contact link (adjust the pattern to match your actual markup). If you're on Windows, you can install Cygwin or the win32 ports of these tools.
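
If you'd rather keep everything in Python, a rough equivalent of grep --files-without-match might look like this (the *.html glob and the 'contact' pattern are assumptions about how your pages are stored):

    from pathlib import Path

    # Report every saved page that lacks a contact link,
    # like grep --files-without-match 'contact' *.html
    for page in Path(".").glob("*.html"):
        if "contact" not in page.read_text(errors="ignore").lower():
            print(page)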
Use wget to detect broken links.

From the link above: whenever we release a public site, it's always a good idea to run a spider on it; this way we can check for broken pages and bad URLs. wget has a recursive download command, and combined with the --spider option it will just crawl the site.
1) Download wget.
Mac: http://www.statusq.org/archives/2008/07/30/1954/ (or use MacPorts and download wget)
Windows: http://gnuwin32.sourceforge.net/packages/wget.htm
Linux: comes built in

2) In your console / terminal, run (without the $):

    $ wget --spider -r -o log.txt http://yourdomain.com

3) After that, just locate your log.txt file; at the very bottom of the file will be a list of broken links, how many links there are, etc.
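
Since you mentioned Python unit testing, one possibility is to shell out to wget from a test, roughly like this sketch (it assumes wget is on your PATH and that the log summary mentions the phrase "broken link", which you should verify against your wget version):

    import subprocess
    import unittest

    class BrokenLinkTest(unittest.TestCase):
        def test_no_broken_links(self):
            # Crawl the site without downloading anything; write results to log.txt.
            subprocess.run(
                ["wget", "--spider", "-r", "-o", "log.txt", "http://yourdomain.com"],
                check=False,  # wget exits non-zero when any request fails
            )
            with open("log.txt") as log:
                # Assumes the log reports failures as "broken link" -- check your wget version.
                self.assertNotIn("broken link", log.read().lower())

    if __name__ == "__main__":
        unittest.main()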