Fixing Twitter Card Errors Caused by Robots.txt in All in One SEO
If you're using All in One SEO and Twitter's card validator is showing errors like "No card found" or "Fetching the page failed because it’s denied by robots.txt," you're not alone. This is a common point of confusion that often stems from how the plugin manages your site's robots.txt file.
This guide will explain why this happens and walk you through the most effective solutions to get your Twitter cards working properly.
Why This Error Occurs
Twitter uses a specific crawler, called "Twitterbot," to read the meta tags on your pages and generate card previews. For this to work, Twitterbot must be allowed to access your site. Many robots.txt files block all bots from certain directories under a blanket User-agent: * group, and because Twitterbot obeys those blanket rules whenever it has no group of its own, it can be blocked inadvertently.
While the All in One SEO plugin provides a built-in robots.txt editor to create custom rules, issues like caching, incorrect rule syntax, or conflicts with other plugins can prevent the correct rules from taking effect. This leads to Twitter's validator being blocked, resulting in the errors you see.
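To see this behavior concretely, here is a minimal sketch using Python's standard-library robots.txt parser. The rules and URL are illustrative assumptions, not taken from any particular site; the point is that a blanket Disallow applies to Twitterbot whenever the file has no dedicated Twitterbot group.

from urllib.robotparser import RobotFileParser

# Hypothetical rules with no dedicated Twitterbot group.
blocking_rules = """
User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(blocking_rules)

# Twitterbot falls under the wildcard group, so the fetch is denied.
print(parser.can_fetch("Twitterbot", "https://yourdomain.com/sample-post/"))  # False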
How to Fix Twitter Card Robots.txt Errors
Follow these steps to resolve the issue.
Step 1: Check Your Current Robots.txt File
First, verify what the live version of your robots.txt file looks like. Simply go to yourdomain.com/robots.txt in your browser. Do not rely solely on the preview within the All in One SEO settings, as it may not reflect the live file due to caching.
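If you prefer to check outside the browser, a short standard-library Python sketch like the one below fetches whatever is currently being served. "yourdomain.com" is a placeholder for your real domain.

from urllib.request import urlopen

# Fetch and print the live robots.txt exactly as a crawler would see it.
with urlopen("https://yourdomain.com/robots.txt", timeout=10) as response:
    print(response.read().decode("utf-8", errors="replace"))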
Step 2: Add the Correct Twitterbot Rule in AIOSEO
You need to explicitly allow the Twitterbot crawler. Here’s how to add the rule using the All in One SEO plugin:
- In your WordPress admin, go to All in One SEO > Tools > Robots.txt.
- Ensure the "Enable Custom Robots.txt" toggle is switched on.
- In the text editor, you need to add a rule specifically for Twitterbot. A common and effective rule is:
User-agent: Twitterbot
Allow: /
This tells the Twitterbot crawler that it is allowed to access all areas of your site.
- Your final robots.txt content should look something like this:
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
User-agent: Twitterbot
Allow: /
Sitemap: https://yourdomain.com/sitemap.xml
- Click Save Changes. (A quick way to sanity-check these rules is sketched below.)
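If you want to sanity-check the rules above before relying on the validator, the following sketch parses them with Python's standard-library robots.txt parser. The page URL and the use of Googlebot as a stand-in for "any other crawler" are assumptions for illustration.

from urllib.robotparser import RobotFileParser

final_rules = """
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/

User-agent: Twitterbot
Allow: /

Sitemap: https://yourdomain.com/sitemap.xml
""".splitlines()

parser = RobotFileParser()
parser.parse(final_rules)

# Twitterbot may fetch any public page.
print(parser.can_fetch("Twitterbot", "https://yourdomain.com/sample-post/"))  # True
# Other crawlers are still kept out of /wp-admin/ ...
print(parser.can_fetch("Googlebot", "https://yourdomain.com/wp-admin/"))  # False
# ... but can still reach admin-ajax.php.
print(parser.can_fetch("Googlebot", "https://yourdomain.com/wp-admin/admin-ajax.php"))  # True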
Step 3: Clear All Caches
This is a critical step that is often overlooked. Caching can prevent your updated robots.txt file from being visible to the outside world.
- Clear your site's cache: If you use a caching plugin (such as WP Rocket or W3 Total Cache) or your site sits behind a server-level or CDN cache (Cloudflare, your host's built-in caching on Bluehost, and the like), clear them all. One way to check what is actually being served is sketched after this list.
- Clear your browser cache: Hard refresh your browser (Ctrl+F5 on Windows, Cmd+Shift+R on Mac) when viewing your robots.txt file to ensure you're seeing the latest version.
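One way to spot a stale CDN or server cache is to look at the response headers on robots.txt. The sketch below uses only the standard library; header names such as cf-cache-status (Cloudflare), x-cache, and age are common but not guaranteed to be present, so treat them as assumptions about your particular stack.

from urllib.request import urlopen

with urlopen("https://yourdomain.com/robots.txt", timeout=10) as response:
    # Print any cache-related headers the server or CDN returns.
    for name in ("age", "cache-control", "cf-cache-status", "x-cache"):
        value = response.headers.get(name)
        if value is not None:
            print(f"{name}: {value}")

A large age value or a cache "HIT" status is a hint that a purge has not taken effect yet.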
Step 4: Validate the Fix
After saving your changes and clearing caches, take the following steps to confirm the fix is working:
- Revisit yourdomain.com/robots.txt in your browser to confirm the new rules for Twitterbot are present.
- Test your URL again in the Twitter Card Validator. The "denied by robots.txt" error should now be gone. A rough programmatic check is sketched below.
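For a rough programmatic check of what the validator should see, the sketch below requests a page while identifying as Twitterbot and looks for a twitter:card meta tag. It is only an approximation of the Twitter Card Validator, and the page URL and user-agent string are assumptions for illustration.

from urllib.request import Request, urlopen

url = "https://yourdomain.com/sample-post/"  # hypothetical page URL
request = Request(url, headers={"User-Agent": "Twitterbot/1.0"})

with urlopen(request, timeout=10) as response:
    html = response.read().decode("utf-8", errors="replace")

# If All in One SEO's social meta tags are enabled, a twitter:card tag
# should appear in the page <head>.
print("twitter:card tag found" if "twitter:card" in html else "twitter:card tag missing")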
Troubleshooting Persistent Issues
If the problem continues, consider these possibilities:
- Plugin Conflict: Another plugin might be modifying the robots.txt file. In one of the related support threads below, a user found that a separate Twitter card plugin was adding blocking rules. Temporarily deactivate other SEO or security plugins to test whether the issue resolves.
- Syntax Error: Double-check the syntax of your rules in the AIOSEO editor. Ensure there are no typos and that the rules are correctly formatted.
- File Permissions: In rare cases, strict file permissions on your server can prevent the plugin from writing the robots.txt file successfully; one of the related threads below ("Saving custom robots.txt file fails") suggests this can happen. If you see a generic saving error, checking your browser's console for JavaScript errors can provide clues, and the sketch below covers a couple of quick server-side checks.
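If you suspect a permissions problem, a quick server-side check like the one below can help. The WordPress root path is a hypothetical example; adjust it to your install. It also checks for a physical robots.txt file, since a static file in the site root is typically served directly by the web server and can hide the rules the plugin generates.

import os

site_root = "/var/www/html"  # hypothetical WordPress root; adjust for your server
robots_path = os.path.join(site_root, "robots.txt")

# A physical robots.txt is served directly by the web server, so plugin
# changes may never show up while it exists.
print("physical robots.txt present:", os.path.isfile(robots_path))

# If the plugin needs to write files here, the directory must be writable
# by the web server user running this check.
print("site root writable:", os.access(site_root, os.W_OK))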
By methodically working through these steps, you can eliminate the robots.txt blockage and ensure Twitterbot can properly read your site's meta tags to generate beautiful card previews.
Related Support Threads
- ERROR: No card found (Card error): https://wordpress.org/support/topic/error-no-card-found-card-error/
- Robots.txt blocking twitter cards: https://wordpress.org/support/topic/robots-txt-blocking-twitter-cards/
- robots.txt editor won’t save changes: https://wordpress.org/support/topic/robots-txt-editor-wont-save-changes/
- Saving custom robots.txt file fails: https://wordpress.org/support/topic/saving-custom-robots-txt-file-fails/
- Issues with Twitter card: https://wordpress.org/support/topic/issues-with-twitter-card/
- robots.txt isn’t updating with AIOSEO: https://wordpress.org/support/topic/robots-txt-isnt-updating-with-aioseo-2/