Forums / Cotonti / Extensions / [IDEA] Robots.txt Autogeneration

Lombi
#1 04.08.2008 19:15
The idea is to have a plugin that would auto-generate robots.txt based on what the admin wants to have.

For example, you could block most of the PHP files that require logging in, as well as all PHP files that could be accessed via the old URL structure, which would result in duplicate content...
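As a rough sketch of what such a plugin might output: given an admin-chosen blocklist, it would simply emit Disallow rules into robots.txt. The file names below are hypothetical placeholders, not actual Cotonti script names.

```python
# Sketch: generate robots.txt from an admin-chosen blocklist.
# In the plugin idea above, the admin would tick which scripts to
# exclude; here the list is hard-coded. Paths are hypothetical.

BLOCKED = [
    "/users.php",    # requires login
    "/pm.php",       # private messages, requires login
    "/ratings.php",  # reachable via old URL structure -> duplicate content
]

def build_robots_txt(blocked, sitemap=None):
    """Return the text of a robots.txt blocking the given paths for all bots."""
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in blocked]
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(BLOCKED))
```

The admin panel would only need to manage the list and write the result to the site root.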

Thoughts?
<a href="http://www.domenlo.com">Surreal Art</a>
diablo
#2 05.08.2008 16:56
Google robots can now access sections which require login... I think it should be a tool more than a plugin.
"Always code as if the guy who ends up maintaining your code will be a violent psychopath who knows where you live."
Asmo
#3 05.08.2008 17:25
Not quite on topic, but also about robots.

At the moment we perhaps can't take full advantage of robots.txt, because all the links look like ID | param | param | ...
It would be great to reorder links as param | param | ID (when there are string parameters).
Then duplicate pages and parameterized lists could be excluded in robots.txt (if required).
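The point above can be illustrated: robots.txt rules match URL prefixes, so if the string parameters came first, one rule could cut off a whole family of parameterized duplicates. The URL shape below is a hypothetical example, not Cotonti's actual scheme.

```
User-agent: *
# With param-first URLs such as /list.php?sort=date&page=2&id=123,
# a single prefix rule blocks every sorted/paged variant at once:
Disallow: /list.php?sort=
```

With ID-first URLs, the variable ID sits in the prefix, so no single static rule can catch all the parameterized variants.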
KillerSneak
#4 18.08.2008 18:18
This is quite easily done by hand, or you can make a Google account that will generate/test this for your site/domain.

I don't think this should be added as a default, as it would bloat the package?
Kilandor
#5 18.08.2008 20:52
I would have to agree, this shouldn't really be in core.