
crawler, disallow it access to nothing. This might seem like backward logic, but the
following robots.txt file indicates that all crawlers are to be sent away except for
the crawler named Palookaville:
#Bring on Palookaville
User-Agent: *
Disallow: /

User-Agent: Palookaville
Disallow:
Notice that there is no slash after Palookaville’s disallow. (Norman Cook fans
will be delighted to notice the absence of both slashes and dots from anywhere
near Palookaville.) Saying that there’s no disallow is like saying that the user
agent is allowed: sloppy and confusing, but that’s the way it is.
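If you'd like to sanity-check a rule set like this one, Python's standard urllib.robotparser module will happily chew on it. Here's a quick sketch; the second crawler name and the example URL are just placeholders, not anything taken from the file above:

from urllib.robotparser import RobotFileParser

# The robots.txt from above: everyone is turned away except
# the crawler that identifies itself as Palookaville.
robots_txt = """\
#Bring on Palookaville
User-Agent: *
Disallow: /

User-Agent: Palookaville
Disallow:
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# An empty Disallow means "allow everything," so Palookaville gets in.
print(parser.can_fetch("Palookaville", "http://example.com/page.html"))   # True

# Any other crawler matches only the catch-all record and is shut out.
print(parser.can_fetch("SomeOtherBot", "http://example.com/page.html"))   # False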
Google allows for extensions to the robots.txt standard. A disallow pattern ...