Modify the robots.txt file
Note
Environments using a convenience domain (e.g., a subdomain of go-vip.net or go-vip.co) have a hard-coded /robots.txt output that returns "Disallow" for all user agents. This prevents search engines from indexing content hosted on non-production sites or on unlaunched production sites. To modify the output of /robots.txt, an environment (production or non-production) must have a custom mapped domain.
To modify the /robots.txt file, hook into the do_robotstxt action, or modify the generated output with the robots_txt filter.
Example: Disallow crawling of a directory

function my_disallow_directory() {
	echo "User-agent: *" . PHP_EOL;
	echo "Disallow: /path/to/your/directory/" . PHP_EOL;
}
add_action( 'do_robotstxt', 'my_disallow_directory' );
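As an alternative to the action above, the same rule can be appended through the robots_txt filter, which in WordPress core receives the generated robots.txt content and the value of the "blog_public" option. A minimal sketch (the directory path is a placeholder, and my_filter_robots is a hypothetical function name):

function my_filter_robots( $output, $public ) {
	// Append a Disallow rule to the content WordPress has already generated.
	$output .= "Disallow: /path/to/your/directory/" . PHP_EOL;
	return $output;
}
add_filter( 'robots_txt', 'my_filter_robots', 10, 2 );

The filter approach preserves any rules added earlier in the output, whereas echoing from the do_robotstxt action prepends raw text before the generated content.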
Caching
The /robots.txt file is cached for long periods of time. To force the cache to clear after making changes to the file, go to Settings > Reading in WP-Admin and toggle the Search engine visibility setting, saving the changes each time the setting is changed.
The page cache for the /robots.txt file can also be flushed with the wp vip cache purge-url WP-CLI command, which is available on VIP Go environments.
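For example, assuming example.com is the environment's mapped domain, an invocation might look like:

wp vip cache purge-url https://example.com/robots.txt

The exact URL to purge depends on the environment's domain mapping.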