Resolving WP-Optimize Robots.txt Issues in Google Search Console

Many users of the 'WP-Optimize – Cache, Compress images, Minify & Clean database to boost page speed & performance' plugin have encountered an issue where the plugin adds an entry to their site's robots.txt file. This entry can trigger errors in Google Search Console (GSC), such as "No user-agent specified", as well as crawling issues related to sitemaps. This article explains the problem and provides the most common solutions.

Understanding the Problem

Starting with version 3.3.0, WP-Optimize began adding a directive to the robots.txt file to block search engines from crawling a file it generates. The entry it adds is:

User-agent: *
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json

While the intention is to prevent indexing of a technical file, the implementation can sometimes cause two main issues:

  1. Formatting Errors: On some sites, the directive is inserted in a way that Google's parser treats as a rule outside any user-agent group, flagging "No user-agent specified" (see the example after this list).
  2. Pre-existing robots.txt Issues: The plugin appends its entry to an existing robots.txt file. If that file already has formatting problems, the new addition can exacerbate them.
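
To illustrate the first issue: the "No user-agent specified" warning typically appears when a rule precedes any User-agent line. A hypothetical broken file might look like this:

Disallow: /wp-content/uploads/wpo-plugins-tables-list.json

User-agent: *
Disallow: /wp-admin/

Here the first Disallow rule belongs to no user-agent group. A correctly formed file keeps every rule under a User-agent line:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/uploads/wpo-plugins-tables-list.json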

How to Fix the Robots.txt Issue

There are two primary methods to resolve this. The first method is generally recommended as it is cleaner and prevents the file from being generated at all.

Solution 1: Prevent the File from Being Generated (Recommended)

This solution stops WP-Optimize from creating the `wpo-plugins-tables-list.json` file, which means there is no need to block it in robots.txt. Add the following code to your theme's `functions.php` file or use a code snippets plugin.

// Stop WP-Optimize from generating the wpo-plugins-tables-list.json file.
add_filter( 'wpo_update_plugin_json', '__return_false' );

Important Note for Multisite: If you are running a WordPress Multisite network, you may need to add this code to the `functions.php` file of the active theme on each site where the issue occurs, or use a network-wide approach such as the must-use plugin sketch below.
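
If you would rather not edit theme files at all (a theme update can overwrite `functions.php` unless you use a child theme), a must-use plugin is a common alternative. Below is a minimal sketch, assuming the default `wp-content/mu-plugins` directory; the file name `disable-wpo-json.php` is arbitrary.

<?php
/**
 * Plugin Name: Disable WP-Optimize plugins-tables JSON
 * Description: Stops WP-Optimize from generating wpo-plugins-tables-list.json.
 */

// __return_false is a core WordPress helper that always returns false,
// so this filter callback disables generation of the JSON file.
add_filter( 'wpo_update_plugin_json', '__return_false' );

Must-use plugins load before regular plugins and run on every site in a Multisite network, so this also covers the Multisite case without touching each site's theme.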

Solution 2: Remove the Robots.txt Entry

If you prefer to keep the JSON file but remove the directive from your robots.txt, you can use this code snippet. Add it to your active theme's `functions.php` file.

add_action( 'after_setup_theme', 'remove_wpo_robots_txt_entry' );
function remove_wpo_robots_txt_entry() {
    // WP-Optimize registers its robots_txt callback at priority 99,
    // so the same priority must be passed to remove_filter().
    if ( function_exists( 'WP_Optimize' ) ) {
        remove_filter( 'robots_txt', array( WP_Optimize(), 'robots_txt' ), 99 );
    }
}
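
The `after_setup_theme` hook fires after all plugins have loaded, so WP-Optimize has already registered its filter by the time this code removes it. If you want to confirm the removal worked, a hypothetical debug check using WordPress's `has_filter()` might look like this (remove it once verified):

// Hypothetical debug check: logs a line once the WP-Optimize callback
// is no longer attached to the robots_txt filter.
add_action( 'init', function () {
    if ( function_exists( 'WP_Optimize' )
        && false === has_filter( 'robots_txt', array( WP_Optimize(), 'robots_txt' ) ) ) {
        error_log( 'WP-Optimize robots.txt entry removed.' );
    }
} );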

Warning: Never edit the `functions.php` file located in the `wp-includes/` directory. This will break your site. Only edit the `functions.php` file within your active theme or child theme directory.

Clearing Cache and Re-checking

After implementing either solution:

  1. Clear any caching mechanisms you have active on your site, including the cache in WP-Optimize.
  2. Visit yoursite.com/robots.txt to confirm the entry has been removed.
  3. In Google Search Console, open the robots.txt report (under Settings) to verify that Google can fetch the updated file, and request a recrawl there if needed. (The standalone "robots.txt Tester" tool has been retired.)

By following these steps, you should be able to resolve the robots.txt errors caused by WP-Optimize and ensure your site is being crawled properly by search engines.