Robots.txt Module

Create a Robots.txt File

The robots.txt module in All in One SEO Pack allows you to set up a robots.txt file for your site that overrides the default robots.txt file WordPress creates. By creating a robots.txt file with All in One SEO Pack, you have greater control over the instructions you give web crawlers about your site.

Just like WordPress, All in One SEO Pack generates the robots.txt file dynamically, so there is no static file to be found on your server. The content of your robots.txt file is stored in your WordPress database.

The default settings shown in the Create a Robots.txt File box (see the screenshot above) ask robots not to crawl your core WordPress files. Search engines don't need to access these files directly because they contain no relevant site content.
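For reference, these defaults are similar to what WordPress itself generates. The exact rules may vary between versions, but they typically look something like this:

  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php

The Allow line exists because some themes and plugins legitimately load admin-ajax.php from the front end of your site.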

The rule builder is used to add your own custom rules for specific paths on your site. For example, if you would like to block all robots from a temp directory, you can use the rule builder to add that rule, as shown below.

To add a rule:

  1. Enter the User Agent. Using * will apply the rule to all user agents.
  2. Select the rule type, either Allow or Block.
  3. Enter the directory path, for example /wp-content/plugins/.
  4. Click the Add Rule button.
  5. The rule will appear in the rule table and in the box that displays the content of your robots.txt.

Adding a rule to your robots.txt
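Following the temp directory example above, a Block rule for all user agents on the path /temp/ (the path here is just an illustration) would produce an entry like the following in your generated robots.txt, assuming the standard mapping of Block to a Disallow directive and Allow to an Allow directive:

  User-agent: *
  Disallow: /temp/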


There is also a Robots.txt Editor for Multisite Networks. Details can be found here.
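Once your rules are in place, you can sanity-check how a compliant crawler would read them. The sketch below uses the robots.txt parser from Python's standard library; it is not part of All in One SEO Pack, and example.com and the /temp/ path are placeholders for your own domain and rules:

  from urllib.robotparser import RobotFileParser

  # Load the dynamically generated robots.txt from your site
  # (example.com is a placeholder for your own domain).
  rp = RobotFileParser()
  rp.set_url("https://example.com/robots.txt")
  rp.read()

  # Ask whether a given user agent may fetch a given URL.
  print(rp.can_fetch("*", "https://example.com/temp/index.html"))    # False if /temp/ is blocked
  print(rp.can_fetch("*", "https://example.com/blog/hello-world/"))  # True unless blocked by another rule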

Useful Links

  1. Explanation of the robots.txt protocol