Why Are My Pretty Links Being Indexed by Google and How to Stop It

If you use the Pretty Links plugin, you may have encountered a frustrating issue: your cloaked affiliate or marketing links are appearing in Google's search index. This is a common concern for users who rely on the plugin's 'nofollow' and 'noindex' settings to prevent search engines from crawling and indexing these special URLs.

This guide will explain why this indexing happens and provide practical, effective solutions to protect your links and your site's SEO.

Why Does This Happen?

Based on community discussions and official responses, the indexing of Pretty Links occurs for a few key reasons:

  1. Search Engine Interpretation: Historically, the 'nofollow' attribute was a strong signal for search engines to ignore a link. As noted in the support threads, this is no longer guaranteed: search engines like Google now treat 'nofollow' as a hint rather than a strict command, and they may crawl and index a link despite a 'nofollow' or 'noindex' directive in its HTTP headers (see the example response after this list).
  2. Direct URL Discovery: Even if your Pretty Links are not placed on any public page, search engine bots can discover them through other means, such as XML sitemaps (if not configured properly) or by following links from other websites.
  3. Redirect Type: Some users have reported different behaviors between 301 (permanent) and 307 (temporary) redirects, though the consensus is that the primary issue lies with how search engines interpret the robots meta directives.
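
For reference, when the plugin's robots options are working, a Pretty Link redirect carries its directives in the HTTP response itself. The response below is an illustrative sketch only (the status code, domain, and target URL are placeholders, not output captured from the plugin):

    HTTP/1.1 301 Moved Permanently
    Location: https://affiliate-network.example.com/offer
    X-Robots-Tag: noindex, nofollow

It is this header-level 'noindex, nofollow' signal that search engines may now treat as a hint rather than a binding command.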

How to Prevent Pretty Links from Being Indexed

Here are the most effective strategies, gathered from community solutions, to stop search engines from indexing your shortened links.

Solution 1: Use a URL Prefix and robots.txt

This is one of the most reliable methods. By grouping all your Pretty Links under a common directory, you can easily block all bots from accessing them with a single line in your robots.txt file.

  1. Add a Prefix to Your Links: Manually create all your new Pretty Links with a prefix like /go/, /out/, or /recommends/. For example, instead of yoursite.com/product-name, use yoursite.com/go/product-name.
  2. Update Your robots.txt File: Add the following directives to your site's robots.txt file (typically found at yoursite.com/robots.txt). This tells all compliant search engine crawlers to avoid any URL that starts with your chosen prefix. You can verify the rules with the sketch after this list.
    User-agent: *
    Disallow: /go/
    Disallow: /out/

    Note: Replace /go/ and /out/ with the actual prefixes you use.
  3. Be Patient: After updating robots.txt, it can take some time for search engines to recrawl your site and drop the blocked URLs. You can use Google Search Console's "Removals" tool to request temporary removal of specific URLs while the block takes effect.
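
Before waiting on a recrawl, you can confirm that the new rules read the way a compliant crawler would interpret them. The sketch below is a minimal example using Python's standard urllib.robotparser; yoursite.com and the test URLs are placeholders for your own site and prefix:

    import urllib.robotparser

    # Parse the live robots.txt the same way a compliant crawler would.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://yoursite.com/robots.txt")
    rp.read()

    # A prefixed Pretty Link should be blocked; a normal page should not.
    for url in ("https://yoursite.com/go/product-name",
                "https://yoursite.com/some-blog-post/"):
        verdict = "allowed" if rp.can_fetch("*", url) else "blocked"
        print(url, "->", verdict)

Keep in mind that a "blocked" result only means compliant bots will stop crawling the URL; links that are already indexed still need time, or a removal request, to drop out.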

Solution 2: Verify and Strengthen HTTP Headers

Ensure that the 'nofollow' and 'noindex' signals are being sent correctly by your Pretty Links.

  1. In your WordPress admin, go to Pretty Links -> Options -> Links.
  2. Confirm that the "Enable No Follow" option is checked. This instructs the plugin to add X-Robots-Tag: noindex, nofollow HTTP headers to your redirects.
  3. You can use an online HTTP header checker tool, or the short script after this list, to verify that a specific Pretty Link URL is returning the correct X-Robots-Tag header.
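
If you prefer to check locally, the sketch below requests a link's headers without following the redirect (following it would show you the destination page's headers instead of the Pretty Link's). It is a minimal example assuming an HTTPS site; the URL is a placeholder:

    import http.client
    from urllib.parse import urlparse

    def show_robots_headers(url):
        """Fetch a URL once, without following redirects, and print the
        status plus any robots directives found in the response headers."""
        parsed = urlparse(url)
        conn = http.client.HTTPSConnection(parsed.netloc)
        conn.request("HEAD", parsed.path or "/")
        resp = conn.getresponse()
        print(resp.status, resp.reason)
        print("Location:", resp.getheader("Location"))
        print("X-Robots-Tag:", resp.getheader("X-Robots-Tag"))
        conn.close()

    show_robots_headers("https://yoursite.com/go/product-name")

A healthy result is a 301, 302, or 307 status with an X-Robots-Tag line; if the header prints as None, the option is not taking effect, and a caching or CDN layer stripping headers is worth ruling out.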

Solution 3: Manage Your Sitemap

If you are using an SEO plugin like Yoast or Rank Math, ensure that your Pretty Links are not being accidentally included in your XML sitemap. These plugins typically allow you to exclude specific post types or taxonomies from the sitemap. Since Pretty Links can create category pages, you should check your SEO plugin's settings to exclude any Pretty Links-related taxonomies from being submitted to search engines. A quick way to audit your sitemap for leaked links is shown below.
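
The sketch below scans a sitemap for URLs that carry your Pretty Links prefix. It assumes a single sitemap at /sitemap.xml and the /go/ and /out/ prefixes from Solution 1; note that Yoast and Rank Math generate a sitemap index, so each child sitemap it lists would need the same check:

    import urllib.request
    import xml.etree.ElementTree as ET

    SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    def leaked_links(sitemap_url, prefixes=("/go/", "/out/")):
        """Return any sitemap URLs that look like prefixed Pretty Links."""
        with urllib.request.urlopen(sitemap_url) as resp:
            tree = ET.parse(resp)
        return [loc.text for loc in tree.iter(SITEMAP_NS + "loc")
                if loc.text and any(p in loc.text for p in prefixes)]

    for url in leaked_links("https://yoursite.com/sitemap.xml"):
        print("Remove from sitemap:", url)

An empty result means the sitemap is clean; any hits should be traced back to the SEO plugin's post type and taxonomy settings.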

Conclusion

The indexing of Pretty Links is a known challenge, primarily due to the evolving nature of how search engines handle directives like 'nofollow'. While the plugin's built-in options provide a good first layer of defense, the most robust solution is to use a dedicated prefix for all links and block them systematically via your robots.txt file. This approach gives you the greatest control and aligns with standard practices for managing which parts of your site search engines can access.

If you have a large number of links already indexed, remember that deindexing is a process that requires patience. Consistently using the methods above will help clean up your current index and prevent future indexing issues.
