Prior to All in One SEO Pack v2.7, the robots.txt file was output as a physical file on your web server. However, WordPress creates a dynamic virtual robots.txt file, which is preferable to a physical file — if a physical robots.txt exists in your web root, the server serves it directly and WordPress never gets a chance to handle the request. The overhaul of the Robots.txt module meant that we could remove support for physical files entirely and instead use the dynamic virtual file that WordPress creates. This has two benefits:
- Rules that are added to the robots.txt file are stored in the database.
- The default rules created by WordPress are used by All in One SEO Pack.
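For reference, the virtual robots.txt that a typical WordPress install serves at yoursite.com/robots.txt looks something like this (the exact defaults vary by WordPress version and site visibility settings):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Because All in One SEO Pack now builds on this virtual file, your custom rules are appended alongside these defaults rather than replacing them.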
In addition to switching from a physical file to a virtual file, we also overhauled the user interface. We kept the easy-to-use rule builder and added a table that displays your custom rules.
The new Robots.txt module now also supports WordPress Multisite Networks. There’s a new Robots.txt Editor in the Network Admin panel that enables Network Administrators to set global rules. These global rules apply to every site in the network and cannot be overridden at the site level.
The new Robots.txt module is a big improvement for site administrators and for network administrators of large multisite installations alike. It continues our commitment to delivering the best possible SEO tools so you can effectively manage your own SEO.
If you’d like to learn more, we highly recommend checking out our documentation on the new Robots.txt module and the Network Admin Robots.txt Editor: