[Buildbot-devel] [patch] Add support for robots.txt

Brian Warner warner-buildbot at lothar.com
Wed Oct 26 21:10:52 UTC 2005

> After looking at my logs for a long time, I'm sick of watching bots 
> (especially yahoo's slurp spider) wandering across the bot. The attached 
> patch (plus file) provides support for the robots.txt file.

Is it worth sticking into a separate file? We could have the Waterfall stuff
serve the contents directly, with a static.Data() resource. That would have
the advantage of letting the buildmaster admin specify the contents from
within the master.cfg config file, probably with syntax like:

robots_txt = """
User-agent: *
Disallow: /
"""
c['status'].append(html.Waterfall(http_port=8071, robots_txt=robots_txt))

(plus it would mean fewer non-.py files installed with that nasty setup.py
hack, which I would love to get rid of some day).
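To illustrate the idea of serving the robots.txt contents straight from memory rather than installing a separate file, here is a minimal sketch using the standard library's http.server as a stand-in (the real thing would use twisted.web's static.Data attached to the Waterfall resource tree; the handler and variable names below are purely illustrative):

```python
# Sketch: serve a robots.txt string directly from memory, no file on disk.
# http.server stands in for twisted.web's static.Data here.
from http.server import BaseHTTPRequestHandler

ROBOTS_TXT = b"User-agent: *\nDisallow: /\n"

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            # Return the whole body every time; for a two-line file this is
            # cheaper than doing conditional-request (304) bookkeeping.
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(ROBOTS_TXT)))
            self.end_headers()
            self.wfile.write(ROBOTS_TXT)
        else:
            self.send_error(404)

    def log_message(self, fmt, *args):
        # Keep the sketch quiet; the real server would log normally.
        pass
```

The point is the same as with static.Data: the admin supplies a string (from master.cfg or from open("robots.txt").read()), and the web server hands it back without any extra file shipped by setup.py.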

The disadvantage is that the buildmaster admin would have to specify the
contents inside the master.cfg file, which would be kind of gross if
robots.txt got larger than a couple of lines. On the other hand, in that case
you could put it elsewhere in the buildmaster's working directory and just do
robots_txt=open("robots.txt","r").read() to get the contents.

I think using static.Data might also affect caching behavior (no timestamp to
work with), but for a two line file it would probably be faster to return the
whole thing than to return a 304: You Already Have It response.
