<?xml version='1.0' encoding='UTF-8'?>
<rss version='2.0'>
	<channel>
		<title>cotonti.com : [IDEA] Robots.txt Autogeneration</title>
		<link>https://www.cotonti.com</link>
		<description>Last topic posts</description>
		<generator>Cotonti</generator>
		<language>en</language>
		<pubDate>Sun, 12 Apr 2026 23:51:30 -0000</pubDate>

		<item>
			<title>Kilandor</title>
			<description><![CDATA[I would have to agree, this shouldn't really be in core.]]></description>
			<pubDate>Mon, 18 Aug 2008 20:52:25 -0000</pubDate>
			<link><![CDATA[https://www.cotonti.com/forums?m=posts&q=22&d=0#post549]]></link>
		</item>
		<item>
			<title>KillerSneak</title>
			<description><![CDATA[This is quite easily done by hand, or you can make a Google account that will generate and test this for your site/domain.<br />
<br />
I don't think this should be added as a default, as it will bloat the package.]]></description>
			<pubDate>Mon, 18 Aug 2008 18:18:01 -0000</pubDate>
			<link><![CDATA[https://www.cotonti.com/forums?m=posts&q=22&d=0#post547]]></link>
		</item>
		<item>
			<title>Asmo</title>
			<description><![CDATA[Not quite on topic, but also about robots:<br />
<br />
At the moment you perhaps can't take full advantage of robots.txt,<br />
because all the links look like ID | param | param | ...<br />
It would be great to change the links to the form param | param | ID (when there are string parameters).<br />
Then robots.txt could be used to cut out duplicate pages and parameterized lists (if required).]]></description>
			<pubDate>Tue, 05 Aug 2008 17:25:45 -0000</pubDate>
			<link><![CDATA[https://www.cotonti.com/forums?m=posts&q=22&d=0#post112]]></link>
		</item>
		<item>
			<title>diablo</title>
			<description><![CDATA[Google robots can now access sections which require login. I think it should be a tool more than a plugin.]]></description>
			<pubDate>Tue, 05 Aug 2008 16:56:39 -0000</pubDate>
			<link><![CDATA[https://www.cotonti.com/forums?m=posts&q=22&d=0#post111]]></link>
		</item>
		<item>
			<title>Lombi</title>
			<description><![CDATA[The idea is to have a plugin that would auto-generate robots.txt based on what the admin wants.<br />
<br />
For example, you could block most of the PHP files that require logging in, as well as all PHP files that could be accessed via the old URL structure, which would result in duplicate content...<br />
<br />
Thoughts?]]></description>
			<pubDate>Mon, 04 Aug 2008 19:15:28 -0000</pubDate>
			<link><![CDATA[https://www.cotonti.com/forums?m=posts&q=22&d=0#post104]]></link>
		</item>
	</channel>
</rss>