It would be great to have the robots.txt file managed by vBSEO, for example via a dedicated section in vbseocp where we could select which URLs to allow or deny, set the crawl delay, choose which bots each rule applies to, etc. In addition, a textarea for custom robots.txt entries would be useful.
Some of the settings in the CP could be:
- Disallow showpost.php : Yes / No
- Disallow /archive/ : Yes / No
- Disallow member lists : Yes / No
- Crawl Delay : 10
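For illustration, with those options enabled, the generated robots.txt might look something like this (the member list path is assumed to be the standard vBulletin memberlist.php; actual paths would depend on the install and the vBSEO URL rewrites):

```text
User-agent: *
Disallow: /showpost.php
Disallow: /archive/
Disallow: /memberlist.php
Crawl-delay: 10

# Any custom entries from the textarea would be appended below
```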
And of course there should be an option to disable this feature entirely, so users who prefer to manage robots.txt on their own can keep doing it as they do now, via the filesystem (avoiding a trip to the CP every time they want to make a change).
Inspired by: Understanding: robots.txt