The robots.txt module in All in One SEO Pack allows you to set up a robots.txt file for your site that will override the default robots.txt file that WordPress creates. By creating a robots.txt file with All in One SEO Pack, you have greater control over the instructions you give web crawlers about your site.
Just like WordPress, All in One SEO Pack generates a dynamic file so there is no static file to be found on your server. The content of the robots.txt file is stored in your WordPress database.
The default settings that show in the Create a Robots.txt File box (shown in the screenshot above) ask robots not to crawl your core WordPress files. It's unnecessary for search engines to access these files directly because they don't contain any relevant site content.
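For reference, the default rules look something like the following (the exact defaults may vary by version). They keep crawlers out of the WordPress admin area while still allowing the admin-ajax.php endpoint that some front-end features rely on:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```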
The rule builder is used to add your own custom rules for specific paths on your site. For example, to block all robots from a temp directory, you can use the rule builder to add that rule as shown below.
To add a rule:
- Enter the User Agent. Using * will apply the rule to all user agents
- Select the rule type, either Allow or Block
- Enter the directory path, for example /wp-content/plugins/
- Click the Add Rule button
- The rule will appear in the table and in the box that displays your robots.txt content
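To see how a well-behaved crawler would interpret a rule like the temp directory block above, here is a minimal sketch using Python's standard `urllib.robotparser` module. The robots.txt content and URLs are hypothetical examples, not output from the plugin itself:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, as produced by a rule that blocks
# all user agents (*) from a /temp/ directory.
robots_txt = """\
User-agent: *
Disallow: /temp/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Anything under /temp/ is blocked for all crawlers...
print(parser.can_fetch("*", "https://example.com/temp/file.html"))  # False
# ...while the rest of the site remains crawlable.
print(parser.can_fetch("*", "https://example.com/blog/post.html"))  # True
```

This is the same logic search engine crawlers apply when deciding whether to request a URL, so it can be a handy way to sanity-check your rules before publishing them.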
There is also a Robots.txt Editor for Multisite Networks. Details can be found here.
NOTE: Although the robots.txt generated by All in One SEO Pack is a dynamically generated page rather than a static text file on your server, take care when creating a large robots.txt, for two reasons: 1) a large robots.txt indicates a potentially complex set of rules which could be hard to maintain, and 2) Google has proposed a maximum file size of 512KB to alleviate strain on servers from long connection times.