I have a list of several million domain names and I want to see if they are available or not.
I tried pywhois first but am getting rate limited. As I don't need an authoritative answer, I thought I would just use nslookup. I am having trouble scripting this though.
Basically, what I want to do is: if the domain is registered, echo it. What I'm getting is grep: find”: No such file or directory . I think it's something easy and I've just been looking at this for too long...
#!/bin/bash
START_TIME=$SECONDS
for DOMAIN in `cat ./domains.txt`;
do
if ! nslookup $DOMAIN | grep -v “can’t find”; then
echo $DOMAIN
fi
done
echo ELAPSED_TIME=$(($SECONDS - $START_TIME))
If you have millions to check, you may like to use GNU Parallel to get the job done faster, for example by running 32 lookups at a time:
parallel -j 32 nslookup < domains.txt | grep "^Name"
If you want to fiddle with the output of nslookup, the easiest way is probably to declare a little function called lkup(), tell GNU Parallel about it, and then use that, like this:
#!/bin/bash
lkup() {
    # Echo the domain only if nslookup's output does NOT contain
    # "can't find", i.e. the name resolved. Note the use of grep -q
    # (quiet, exit status only) rather than grep -v: with -v, the
    # Server:/Address: lines that nslookup always prints would match,
    # so grep would succeed for every domain and nothing would echo.
    if ! nslookup "$1" | grep -q "can't find"; then
        echo "$1"
    fi
}
# Make lkup() function visible to GNU parallel
export -f lkup
# Check the domains in parallel
parallel -j 32 lkup < domains.txt
If the order of the lookups is important to you, you can add the -k flag to parallel to keep the output in the same order as the input.
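If GNU Parallel isn't available, a similar fan-out can be approximated with xargs -P from findutils. This is just a sketch under the same assumptions as above (a domains.txt file, and the "can't find" heuristic for unregistered names); note that, unlike parallel -k, xargs offers no way to preserve input order:

```shell
#!/bin/bash
# Same idea as the GNU Parallel version, but using xargs -P.
lkup() {
    # Echo the domain only if nslookup's output lacks "can't find"
    if ! nslookup "$1" | grep -q "can't find"; then
        echo "$1"
    fi
}
export -f lkup

# xargs cannot invoke shell functions directly, so wrap the call
# in "bash -c"; the exported function is visible in that subshell.
xargs -I{} -P 32 bash -c 'lkup "$1"' _ {} < domains.txt
```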