Robots.txt handler


Serve a robots.txt file from Trac. Mostly useful for tracd users, but it works with any deployment. Just put the content you want served in the wiki page RobotsTxt.
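The idea can be sketched as a tiny self-contained handler: a request for /robots.txt is answered with the text stored under a "RobotsTxt" page. This is not the plugin's actual code (the real plugin is a Trac component reading the Trac wiki store); the `PAGES` dict and `robots_app` function here are invented stand-ins.

```python
# Hypothetical sketch of what RobotsTxtPlugin does, as a plain WSGI app.
# PAGES stands in for the Trac wiki store; the plugin reads the page
# named "RobotsTxt" and serves its content at /robots.txt.

PAGES = {
    "RobotsTxt": "User-agent: *\nDisallow: /browser\n",
}

def robots_app(environ, start_response):
    """Answer /robots.txt with the stored page text; 404 everything else."""
    if environ.get("PATH_INFO") == "/robots.txt":
        body = PAGES.get("RobotsTxt", "").encode("utf-8")
        start_response("200 OK", [
            ("Content-Type", "text/plain; charset=utf-8"),
            ("Content-Length", str(len(body))),
        ])
        return [body]
    body = b"Not Found"
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [body]
```

Because the content comes from a wiki page, updating your crawl rules is just a wiki edit, with Trac's usual page history.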

Bugs/Feature Requests

Existing bugs and feature requests for RobotsTxtPlugin are here.

If you have any issues, create a new ticket.


Download

Download the zipped source from here.


Source

You can check out RobotsTxtPlugin from here using Subversion, or browse the source with Trac.


Example

To enable the plugin, add the following to the [components] section of your trac.ini:

[components]
robotstxt.* = enabled

A typical RobotsTxt wiki page looks like this:

User-agent: *
Disallow: /browser
Disallow: /log
Disallow: /changeset
Disallow: /report
Disallow: /newticket
Disallow: /search
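As a quick sanity check, rules like the ones above can be parsed with Python's standard-library urllib.robotparser, independently of Trac or this plugin:

```python
import urllib.robotparser

# The example rules from the RobotsTxt page above.
RULES = """\
User-agent: *
Disallow: /browser
Disallow: /log
Disallow: /changeset
Disallow: /report
Disallow: /newticket
Disallow: /search
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

# Crawlers honoring these rules skip the source browser, reports, etc.,
# but may still index wiki pages and individual tickets.
print(rp.can_fetch("*", "/browser/trunk"))   # False
print(rp.can_fetch("*", "/wiki/WikiStart"))  # True
```

This is a handy way to verify an edited RobotsTxt page before relying on it, since a typo in a Disallow line silently changes what crawlers may fetch.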

Recent Changes

[7204] by coderanger on 2009-11-30 09:08:25
0.11 version of RobotsTxt.
[7203] by coderanger on 2009-11-30 08:51:15
Branch for 0.11.
[3416] by coderanger on 2008-03-25 07:39:08
Change my email to avoid Yahoo, which decided to break my scraper script recently.


Author: coderanger
Maintainer: none

Last modified on Mar 26, 2013 9:47:09 PM