[[PageOutline(2-5,Contents,pullout)]]

= Robots.txt handler

== Description

This plugin allows you to serve the [wikipedia:Robots_exclusion_standard robots.txt] file from Trac. The contents of the wiki page named `RobotsTxt` are served as the robots.txt file. It is primarily useful to tracd users.

== Bugs/Feature Requests

Existing bugs and feature requests for RobotsTxtPlugin are [report:9?COMPONENT=RobotsTxtPlugin here]. If you have any issues, create a [/newticket?component=RobotsTxtPlugin new ticket].

[[TicketQuery(component=RobotsTxtPlugin&group=type,format=progress)]]

== Download

Download the zipped source from [export:robotstxtplugin here]. The plugin is also available on [pypi:TracRobotsTxt PyPI].

== Source

You can check out RobotsTxtPlugin from [/svn/robotstxtplugin here] using Subversion, or [source:robotstxtplugin browse the source] with Trac.

== Installation

General instructions on installing Trac plugins can be found on the [TracPlugins#InstallingaTracplugin TracPlugins] page.

To enable the plugin, add the following lines to your `trac.ini` file:

{{{#!ini
[components]
robotstxt.* = enabled
}}}

A typical `RobotsTxt` wiki page looks as follows:

{{{
User-agent: *
Disallow: /browser
Disallow: /log
Disallow: /changeset
Disallow: /report
Disallow: /newticket
Disallow: /search
}}}

== Recent Changes

[[ChangeLog(robotstxtplugin, 3)]]

== Author/Contributors

'''Author:''' [wiki:coderanger] [[BR]]
'''Maintainer:''' [[Maintainer]] [[BR]]
'''Contributors:'''
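To check how crawlers that honor the exclusion standard will interpret the rules in a `RobotsTxt` page, Python's standard-library `urllib.robotparser` can be used. A minimal sketch, using the example rules shown above (the paths are illustrative defaults for a Trac site, not mandated by the plugin):

{{{#!python
from urllib import robotparser

# Example rules, matching the sample RobotsTxt wiki page above.
rules = """\
User-agent: *
Disallow: /browser
Disallow: /log
Disallow: /changeset
Disallow: /report
Disallow: /newticket
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths under a Disallow prefix are blocked; everything else is allowed.
print(rp.can_fetch("*", "/browser/trunk"))   # False
print(rp.can_fetch("*", "/wiki/WikiStart"))  # True
}}}

This keeps well-behaved crawlers out of the expensive repository-browsing and search views while still letting them index the wiki.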