Robots.txt Editor for Multisite Networks


The Robots.txt module now includes an editor for Multisite Networks. This editor can be found on the Network Admin screen, as shown below:

Robots.txt Editor in Network Admin

The Robots.txt Editor works in exactly the same way as the Robots.txt module. However, any rules you enter here take precedence over rules added at the site level. This means that Network Administrators can use the editor to set global rules that apply to ALL sites in the multisite network, and that Site Administrators cannot override these network-level rules.

To add a rule:

  1. Enter the User Agent. Using * will apply the rule to all user agents.
  2. Select the rule type: Allow or Block a robot.
  3. Enter the directory path, for example /wp-content/plugins/.
  4. Click the Add Rule button.
  5. The rule will appear in the table and in the box that displays your robots.txt.

Adding a rule to your robots.txt
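
For example, following the steps above with a User Agent of * and a Block rule for /wp-content/plugins/ would produce an entry like this in the generated robots.txt:

```
User-agent: *
Disallow: /wp-content/plugins/
```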

NOTE: Whilst the robots.txt generated by All in One SEO Pack is a dynamically generated page and not a static text file on your server, care should still be taken when creating a large robots.txt, for two reasons: 1) a large robots.txt indicates a potentially complex set of rules that can be hard to maintain; and 2) Google has proposed a maximum file size of 512KB to alleviate strain on servers from long connection times.
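If you want a quick sanity check against that proposed limit, a minimal sketch might look like this (this helper is not part of the plugin; the function name and limit constant are illustrative):

```python
# Sketch: check a robots.txt payload against the 512KB maximum
# that Google has proposed for robots.txt files.
MAX_ROBOTS_BYTES = 512 * 1024  # proposed 512KB cap


def robots_size_ok(robots_text: str) -> bool:
    """Return True if the robots.txt body fits within the proposed limit."""
    return len(robots_text.encode("utf-8")) <= MAX_ROBOTS_BYTES


# A short rule set is comfortably within the limit.
print(robots_size_ok("User-agent: *\nDisallow: /wp-content/plugins/\n"))  # True
```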
