
“Blocked by robots.txt” in Google Search Console: What It Means and How to Fix It

Udjin, CEO

4 min read

Where you’ll see it in GSC

  • Indexing → Pages: Look for the reason “Blocked by robots.txt” and open the sample URLs.
  • URL Inspection: Inspect a specific URL, then Test live URL to confirm whether access is blocked by robots.txt.

What “Blocked by robots.txt” actually means

Your /robots.txt file tells crawlers which paths they may not crawl. If a URL is disallowed, Google won’t fetch its content. Important nuances:

  • Robots.txt controls crawling, not indexing. A disallowed URL can still be indexed if Google discovers it via links, but the listing will usually lack full content (e.g., “No information is available for this page”).
  • To remove a URL from Google’s index, allow crawling and return a noindex directive (meta robots or HTTP header; both forms are shown below), or block with authentication.
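
For reference, the two noindex forms look like this; the HTTP header variant is especially useful for non-HTML files such as PDFs:

<meta name="robots" content="noindex">

X-Robots-Tag: noindex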

First decide: should the page be crawlable?

  1. Pages that should be public and rank (e.g., product, collection, blog): should not be disallowed in robots.txt.
  2. Utility or private pages (e.g., admin, cart, checkout, internal search results): it’s normal to keep them disallowed.

Once you know the intent, fix accordingly.

How robots.txt rules work (quick refresher)

User-agent: *
Disallow: /private/
Allow: /private/press-kit.pdf
Disallow: /*?session=
Disallow: /*.pdf$
  • User-agent: which crawler the rules apply to (use * for all).
  • Disallow: path patterns the bot should not crawl.
  • Allow: explicitly whitelists paths inside a disallowed folder.
  • * matches any sequence; $ anchors the end of the URL.
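
If you want to sanity-check rules outside of GSC, Python's standard library ships a basic robots.txt parser. A minimal sketch (yourdomain.com is a placeholder; note that urllib.robotparser follows the original spec and may not honor Google's * and $ wildcard extensions, so treat GSC's live test as the source of truth):

from urllib.robotparser import RobotFileParser

# Load and parse the live robots.txt file.
rp = RobotFileParser()
rp.set_url("https://yourdomain.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a given URL.
print(rp.can_fetch("Googlebot", "https://yourdomain.com/private/press-kit.pdf"))
print(rp.can_fetch("*", "https://yourdomain.com/products/widget"))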

Fix 1: Pages should be crawlable and indexable

Goal: Remove or narrow the rule that blocks important content.

  1. Open https://yourdomain.com/robots.txt and locate the rule covering your URL(s).
  2. Remove the over-broad disallow, or replace with a more specific rule.
  3. If you must keep a folder disallowed, use Allow: for specific files/paths that should be crawlable.
  4. Save and deploy the updated robots.txt, then retest the affected URL in GSC (URL Inspection → Test live URL).
  5. Make sure the URL is internally linked and included in your XML sitemap for faster discovery.

Examples

Problem: All product pages live under /products/, but you disallowed the entire folder:

# Too broad: blocks all product pages
User-agent: *
Disallow: /products/

Fix: Remove the folder-level block or selectively allow what matters:

User-agent: *
# Disallow only product JSON endpoints, allow HTML
Disallow: /products/*?view=json
Allow: /products/

Problem: Blocking every URL with parameters also blocks essential pages:

Disallow: /*?

Fix: Target only the noisy parameters you truly want to exclude:

Disallow: /*?utm_
Disallow: /*&ref=
# Keep canonical, crawlable versions accessible

Fix 2: Pages are intentionally blocked

If a page should not be crawled (e.g., cart, checkout, internal search), it’s okay to keep it disallowed (a typical pattern is sketched after this list). For clean reporting:

  • Keep these URLs out of your XML sitemap.
  • Ensure key public pages are not accidentally grouped under the same disallow pattern.
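
As an illustration, an intentional block for utility pages might look like the following (the exact paths depend on your platform; /cart, /checkout, and /search are placeholders):

User-agent: *
Disallow: /cart
Disallow: /checkout
Disallow: /search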

Fix 3: You want a page not indexed (and it’s currently disallowed)

Because robots.txt prevents crawling, Google can’t see a noindex tag on the page. To reliably remove it from the index:

  1. Temporarily allow crawling for that URL/path.
  2. Add <meta name="robots" content="noindex"> (or an X-Robots-Tag header; see the sketch after this list).
  3. Once deindexed, you may reapply a disallow rule if needed.
  4. For urgent situations, use GSC Removals as a temporary measure while you implement the permanent fix.
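
As a reference for step 2, once crawling is allowed the raw response for the URL should carry the directive. A minimal sketch of the relevant headers:

HTTP/1.1 200 OK
Content-Type: text/html
X-Robots-Tag: noindex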

Editing robots.txt on common platforms

If your platform allows editing robots.txt (for example, via a theme file or settings), make changes there and redeploy. Many platforms provide a safe default; adjust carefully and avoid broad patterns that block important content (e.g., entire product or blog paths).

Validate and monitor

  1. Re-test affected URLs in GSC (URL Inspection → Test live URL) to confirm Google can crawl.
  2. Revalidate issue groups under Indexing → Pages.
  3. Watch the status over the next few days/weeks as Google recrawls.

Common pitfalls to avoid

  • Homepage blocked by Disallow: / or Disallow: /? (double-check your patterns).
  • Everything blocked due to a stray slash or copy/paste error (see the sketch after this list).
  • Over-broad parameter blocks that unintentionally catch canonical URLs.
  • Disallow + desire to deindex — remember Google can’t see your noindex if crawling is blocked.
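
For example, this one-character rule disallows the entire site for every crawler, while Disallow: with an empty value blocks nothing:

User-agent: *
Disallow: /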

Quick “Blocked by robots.txt” fix checklist

  1. Confirm whether the page should be crawlable.
  2. Locate the blocking pattern in /robots.txt.
  3. Remove or narrow the rule; use Allow: for exceptions.
  4. Retest with GSC URL Inspection (live test).
  5. Keep private/utility pages disallowed and out of sitemaps.
  6. For deindexing, allow crawl → send noindex → (optional) reapply disallow.

Bottom line: Treat “Blocked by robots.txt” as a status to verify against your intent, not an automatic error. Fix overly broad rules for pages that should rank, and keep truly private paths disallowed.
