The robots.txt module in All in One SEO Pack allows you to set up a static robots.txt file for your site that overrides the dynamically generated robots.txt file that WordPress serves by default. By creating a static robots.txt file with All in One SEO Pack, you gain greater control over the instructions you give web crawlers about your site.
The default settings shown in the Create a Robots.txt File box (see the top screenshot) ask robots not to crawl your core WordPress files. Search engines don't need to access these files directly because they don't contain any relevant site content.
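For reference, the default rules WordPress itself serves are along these lines (the exact entries in the box may vary by plugin and WordPress version):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

The `Allow` line is there because some themes and plugins legitimately make front-end requests to `admin-ajax.php`, even though the rest of the admin area is blocked.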
The rule builder is used to set rules for specific paths on your site. For example, if you would like to keep the Disallow: /wp-admin/ rule for all user agents but allow Google to access that directory, you can set that specific rule by filling in the information as seen below.
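A rule like the one described above would produce robots.txt output along these lines (a sketch; the rule builder generates the exact entries for you):

```
# Block the admin area for all crawlers...
User-agent: *
Disallow: /wp-admin/

# ...but grant Google's crawler access to it
User-agent: Googlebot
Allow: /wp-admin/
```

Rules in a `User-agent` group apply only to that agent, and a more specific group (here, `Googlebot`) takes precedence over the wildcard group for crawlers that match it.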
The robots.txt module also allows you to check your robots.txt file for syntax problems. The Optimize your Robots.txt File box will show you whether your current file has any issues and provide recommended changes, which you can accept or disregard.
The Status indicator displays three statuses:
- Green means that the line in your robots.txt file is valid and complies with the standards of the robots.txt protocol
- Yellow means that the line in your robots.txt file is valid but non-standard; not all crawlers may recognize it, or they may interpret it differently
- Red means that the line in your robots.txt file has invalid syntax and does not meet the standards of the robots.txt protocol
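As a hypothetical illustration of the three statuses (the exact lines the checker flags may differ):

```
Disallow: /wp-admin/    # Green: standard directive, valid syntax
Crawl-delay: 10         # Yellow: non-standard; some crawlers ignore this directive
Disallow /private/      # Red: invalid syntax (missing colon after the directive)
```

`Crawl-delay` is a common example of a yellow entry: it is widely used but is not part of the robots.txt standard, and major crawlers such as Googlebot do not honor it.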