§ Dispatch № 237

URL Inspection Tool: The Search Console Feature Founders Underuse

Learn how URL Inspection diagnoses indexing problems in 30 seconds. The on-demand audit founders forget exists. Step-by-step guide for technical founders.

Filed
May 3, 2026
Read
20 min
Author
The Seoable Team

You're Shipping Code. Google's Not Indexing It.

You launched. Traffic didn't follow. You assume SEO takes months. It does—if you're chasing rankings before pages are even indexed.

Most founders never check. They publish, wait, hope. Meanwhile, Google crawled their site, hit a redirect chain, and moved on. Or hit a noindex tag someone left in staging. Or couldn't render the JavaScript that powers your entire page.

You don't know because you never asked.

Google Search Console has a feature that answers this in 30 seconds. It's called URL Inspection. It's free. It's built into the tool you should already be using. And almost nobody uses it.

This guide shows you why you need it, how to use it, and what to do when it finds problems.

Prerequisites: What You Need Before You Start

Before you inspect a single URL, you need three things in place.

Google Search Console access. If you don't have it set up, stop here. Add your property to Google Search Console first. Verify ownership via DNS record, HTML file upload, or Google Analytics. This takes 10 minutes. Do it now.

A live website. URL Inspection checks what Google actually sees when it crawls your site. You need a live domain, not localhost. If you're still in development, deploy a staging version with a robots.txt rule that blocks crawling (robots.txt controls crawling, not indexing—password protection is the stronger lock). We'll cover that.
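If you go the staging route, the bluntest robots.txt is one that turns every crawler away. A sketch—and keep in mind robots.txt only blocks crawling; a robots-blocked page can still get indexed from external links, which is why HTTP authentication is the safer lock for staging:

```
User-agent: *
Disallow: /
```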

URLs you want to check. You'll be pasting specific URLs into the inspection tool. Have a list ready: your homepage, key landing pages, blog posts you published, product pages, anything you expect to rank. Start with your top 10.

If you're running SEO Triage for Busy Founders: The 80/20 You Can't Skip, URL Inspection is part of the domain audit phase. It's the fastest way to diagnose why indexed pages aren't ranking and why some pages aren't indexed at all.

Understanding What URL Inspection Actually Does

URL Inspection is a diagnostic tool. Think of it as a window into Google's view of your page.

When you submit a URL to the tool, it does four things:

It shows you the indexed version. The page as Google last crawled and stored it. Not always current. Sometimes weeks old. But it's what Google actually has on file.

It tests indexability. Can Google crawl this page right now? The live test simulates a fresh crawl from Google's perspective: hits your site, follows redirects, renders JavaScript, reads robots.txt, checks for noindex tags.

It surfaces Core Web Vitals. Page speed, responsiveness, visual stability. Real field data from actual Chrome users, not synthetic lab tests. If your page is slow, you'll see it here.

It surfaces crawl issues. Redirect chains. Blocked resources. Noindex tags. Robots.txt blocks. Canonicals pointing elsewhere. All in one place.

This is not a ranking tool. It doesn't tell you why you're not ranking. It tells you if Google can even see your page. Ranking comes later. Indexing comes first.

As explained in The Difference Between Indexing and Ranking — And Why It Matters, most founders optimize for rankings before pages are indexed. URL Inspection fixes this. It forces you to check indexing first.

Step 1: Navigate to Search Console and Find the Tool

Open Google Search Console and select your property.

On the left sidebar, you'll see a menu. Look for "Inspect URL" or "URL Inspection." It's usually near the top. Click it.

You'll see a search bar at the top of the page. This is where you paste URLs.

One critical detail: the URL must belong to your property. If you're inspecting https://example.com, you must have verified ownership of example.com in Search Console. You can't inspect random URLs from other sites.

Copy the full URL from your browser address bar. Paste it into the Search Console inspection box. Press Enter.

Wait a few seconds. Google returns what its index already has for that URL. (It doesn't crawl the live page yet—that's the live test in Step 6.)

Step 2: Read the Indexation Status Card

The first thing you see is a status card. It tells you one of three things:

"URL is on Google." The page is indexed. Google has crawled it, processed it, and stored it in the index. This is good. This means the page can appear in search results.

"URL is not on Google." The page is not indexed. Google either hasn't crawled it yet, or it crawled it and decided not to index it. This is bad. No index, no rankings.

"Partial match." Google found a similar URL, but not the exact one you searched. Usually means you have a redirect or a canonical pointing elsewhere. Check the details.

If the status is "URL is on Google," you're past the first hurdle. Skip to Step 4 to check for issues.

If the status is "URL is not on Google," continue to Step 3. This is where the diagnosis happens.

Step 3: Diagnose Why a Page Isn't Indexed

When a page isn't indexed, Search Console shows you why. Look for the section labeled "Crawl and indexing."

Common reasons a page won't index:

"Noindex tag detected." You have a noindex meta tag on the page. This explicitly tells Google not to index it. Check your page template. Search for <meta name="robots" content="noindex">. Remove it from production pages. Noindex is for staging, drafts, and duplicate content. Not for pages you want to rank.

"Blocked by robots.txt." Your robots.txt file is telling Google not to crawl this page. Open your robots.txt (at https://yoursite.com/robots.txt) and check the rules. If you see Disallow: / or Disallow: /your-page-path, remove it. Robots.txt should allow crawling of pages you want indexed.

"Blocked by authentication." The page requires a login. Google can't access it. If it's a public page, remove the authentication requirement or create a public version.

"Redirect error." The page redirects to another URL, which redirects again, which redirects again. Redirect chains. Google gives up. Check your redirects. Use a redirect checker tool to trace the chain. Simplify it to a single redirect or eliminate it.

"Soft 404." The page returns a 200 status code but looks like an error page. Google thinks it's broken. Check your page content. Make sure it's real content, not a 404 template.

"Crawl anomaly." Google tried to crawl the page and hit an error. Could be timeout, server error, or network issue. Try again in 24 hours. If it persists, check your server logs.

"JavaScript issues." Google rendered the page but couldn't execute JavaScript properly. If your content is loaded by JavaScript, this is a problem. Use a rendering test tool to see what Google actually sees. You might need to pre-render content or use server-side rendering.

Each of these has a fix. The point is: URL Inspection tells you exactly what's wrong. No guessing. No waiting for Google to reindex. Diagnose, fix, retest.

Step 4: Check for Core Web Vitals Issues

Scroll down on the inspection results. You'll see a "Core Web Vitals" section.

Google measures three metrics:

Largest Contentful Paint (LCP). How fast the main content loads. Should be under 2.5 seconds. If it's red, your page is slow. Optimize images, defer non-critical JavaScript, use a CDN.

Interaction to Next Paint (INP), which replaced First Input Delay (FID) in March 2024. How responsive the page is to user input. Should be under 200ms (FID's old threshold was 100ms). If it's red, JavaScript is blocking the main thread. Reduce JavaScript, break up long tasks, optimize event handlers.

Cumulative Layout Shift (CLS). How much the page layout shifts while loading. Should be under 0.1. If it's red, elements are moving around. Fix by setting explicit dimensions on images, avoiding unsized ads, using font-display: swap.
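The three thresholds above can be captured in a few lines, using Google's published good/needs-improvement cutoffs (FID included for older reports, though INP has replaced it):

```python
# Google's published Core Web Vitals thresholds:
# (upper bound of "good", upper bound of "needs improvement").
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "FID": (100, 300),    # milliseconds (superseded by INP)
    "CLS": (0.10, 0.25),  # unitless score
}

def rate(metric: str, value: float) -> str:
    """Classify a Core Web Vitals measurement the way Google does."""
    good, needs_improvement = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= needs_improvement:
        return "needs improvement"
    return "poor"

print(rate("LCP", 1.9))   # good
print(rate("INP", 350))   # needs improvement
print(rate("CLS", 0.31))  # poor
```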

Google uses Core Web Vitals as a ranking signal—more tiebreaker than dominant factor, but a fast page beats a slow one, all else equal. Search Console shows you real user data. Not synthetic. Not lab tests. Actual performance from people visiting your site.

If your Core Web Vitals are poor, fix them before worrying about keywords. Speed is table stakes.

Step 5: Review the Cached Version

Scroll further. You'll see a section showing the indexed version of your page.

This is what Google stored. It might be old. If you updated the page yesterday, the cache might show last week's version. That's normal. Google re-crawls periodically.

But sometimes the cached version looks wrong. Content is missing. Images aren't loading. Text is garbled. This usually means Google had trouble rendering your page.

If the cached version looks correct, you're fine. Google is seeing your content.

If it looks broken, you have a rendering issue. Check for:

  • JavaScript that loads content after page load. Google might not wait long enough.
  • Images hosted on a different domain that's blocked in robots.txt.
  • CSS that's blocked, so styling isn't applied.
  • Lazy-loaded content that never loads.

Fix rendering issues by pre-rendering content server-side or using a rendering optimization strategy.

Step 6: Test Indexability in Real Time

At the bottom of the inspection results, you'll see a "Test live URL" button.

Click it. This runs a fresh crawl right now. Not using cached data. A real-time test.

It takes 30-60 seconds. Google crawls your live page, renders it, checks for issues.

Results come back with a verdict:

"URL is available to Google." No blocking issues. Google can reach it, crawl it, render it. Good sign. Expand the details anyway—non-fatal warnings like blocked resources or JavaScript errors still show up there.

"URL is not available to Google." Google can't access or index it. Noindex tag, robots.txt block, server error, timeout, or authentication required. Fix before expecting indexing.

If the live test shows issues, you get specific details. Blocked resources, JavaScript errors, redirect chains. Fix each one.

Then run the test again. Confirm the fix worked.

This feedback loop is fast. Diagnose, fix, verify. All in Search Console. No waiting for Google to recrawl on its own schedule.

Step 7: Request Indexing (If Needed)

If the page is crawlable but not indexed, you can request indexing.

Look for a button labeled "Request indexing" or "Request re-indexing." Click it.

Google adds the URL to a priority crawl queue. Crawling typically follows within a day or two, though there's no guaranteed timing.

Don't spam this button. Use it for new pages or critical updates. Not for every page change.

Also: requesting indexing doesn't guarantee indexing. Google still decides. But it prioritizes the URL for crawling.

Note: This only works if the page passes the crawlability test. If Google can't crawl it, requesting indexing won't help. Fix the crawl issues first.

Running URL Inspection at Scale: Batch Testing

You can only inspect one URL at a time in the Search Console UI. For 10-20 pages, that's fine. For 100+ pages, it's tedious.

Two options:

Option 1: Use the Search Console API. Google provides a URL Inspection API for programmatic access. You can batch test URLs. Requires coding, but powerful for large sites.

Option 2: Use an SEO tool that integrates with Search Console. Tools like Semrush and Conductor have built-in URL Inspection features. They batch test, prioritize issues, and track changes over time.

For founders, batch testing matters when you're publishing at scale. If you're using AI blog generation to create 100 posts in one go, you want to inspect all 100 URLs at once, not one at a time.

The Search Console API documentation shows how to do this programmatically. If you're technical, it's straightforward. If not, use an SEO tool.
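A minimal sketch of a programmatic inspection, using only Python's standard library. The endpoint and response fields follow the public URL Inspection API docs, but getting an OAuth token with Search Console scope is elided here, and siteUrl must be a property you've verified:

```python
import json
import urllib.request

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def inspect_url(url: str, site_url: str, token: str) -> dict:
    """Call the URL Inspection API for one URL. Requires an OAuth token
    with Search Console scope, for a property you own."""
    body = json.dumps({"inspectionUrl": url, "siteUrl": site_url}).encode()
    request = urllib.request.Request(
        ENDPOINT,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

def summarize(result: dict) -> str:
    """Reduce an inspection response to a one-line status."""
    index_status = result["inspectionResult"]["indexStatusResult"]
    return f'{index_status["verdict"]}: {index_status["coverageState"]}'

# Abridged response shape, so the parsing is testable offline:
sample = {"inspectionResult": {"indexStatusResult": {
    "verdict": "PASS", "coverageState": "Submitted and indexed"}}}
print(summarize(sample))  # PASS: Submitted and indexed
```

To batch test, loop inspect_url over your URL list and pace the calls—the API enforces a daily per-property quota, so don't fire hundreds of requests in one burst.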

Pro Tip: Use URL Inspection Before Publishing

Here's the move most founders miss:

Publish to staging. Run URL Inspection on staging. Fix issues. Then push to production.

This requires verifying staging as its own Search Console property and letting Google crawl it—then locking it back down before it gets indexed. Or use a crawler-simulation tool that doesn't involve Search Console at all.

Why? Because you catch issues before they go live. Noindex tags. Redirect chains. Rendering problems. Fix them in staging. Launch clean.

Then run URL Inspection on production immediately after launch. Confirm everything indexed as expected.

This is part of Karl's Pre-Launch Checklist: SEO Moves That Paid Off Day One. Pre-launch URL inspection. Post-launch verification. Two inspections. Two minutes. Saves weeks of troubleshooting.

Common Issues URL Inspection Reveals (And How to Fix Them)

Redirect Chains

Problem: URL A redirects to URL B, which redirects to URL C. Google has to follow three hops to reach the final page.

Why it matters: Each redirect adds latency. Google crawls slower. Users wait longer. Ranking signal degrades.

Fix: Redirect A directly to C. One hop. Use 301 redirects (permanent) for old URLs. 302 (temporary) for short-term changes.

How to check: URL Inspection shows redirect chains. Click "Test live URL" and watch the redirect path.
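If your redirects live in a map (framework config, generated nginx rewrites, etc.), collapsing every chain to a single hop is a small pure function. A sketch with hypothetical paths:

```python
def collapse_redirects(redirect_map: dict) -> dict:
    """Rewrite every redirect to point at its final destination,
    turning chains like A -> B -> C into A -> C and B -> C."""
    def final_target(url, seen=()):
        if url in seen:                # loop guard: A -> B -> A
            raise ValueError(f"redirect loop through {url}")
        if url not in redirect_map:
            return url                 # not redirected: this is the destination
        return final_target(redirect_map[url], seen + (url,))

    return {src: final_target(dst) for src, dst in redirect_map.items()}

# Hypothetical legacy redirects containing a two-hop chain:
redirects = {"/old": "/newer", "/newer": "/newest"}
print(collapse_redirects(redirects))  # {'/old': '/newest', '/newer': '/newest'}
```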

Noindex on Production

Problem: You have <meta name="robots" content="noindex"> on a page you want to rank.

Why it matters: Noindex explicitly tells Google not to index. Your page will never rank.

Fix: Remove the noindex tag. Search your codebase for "noindex" and remove it from production templates. Keep it only on staging.

How to check: URL Inspection shows "Noindex tag detected" if present.

Robots.txt Blocking

Problem: Your robots.txt has Disallow: /blog but you want blog posts to rank.

Why it matters: Google won't crawl blocked paths. No crawl, no index, no rankings.

Fix: Update robots.txt to allow crawling. Example:

User-agent: *
Disallow: /admin
Disallow: /staging
Allow: /blog

Allow the paths you want indexed. Block only admin, staging, or duplicate content.

How to check: URL Inspection shows "Blocked by robots.txt" if present.

Slow Page Load

Problem: Core Web Vitals show LCP > 4 seconds. Page is slow.

Why it matters: Google penalizes slow pages. Users bounce. Rankings drop.

Fix: Optimize images (compress, lazy-load), defer JavaScript, use a CDN, remove render-blocking resources.

How to check: URL Inspection shows Core Web Vitals. Red = poor. Green = good.

JavaScript Rendering Issues

Problem: Content is loaded by JavaScript. Google's crawler doesn't wait for it to load.

Why it matters: Google sees a blank page. No content to index. No rankings.

Fix: Use server-side rendering (SSR) or static site generation (SSG). Or pre-render JavaScript content at build time.

How to check: the stored version of the page shows no content, and the live test's details list JavaScript console errors.

Canonical Pointing Wrong Direction

Problem: You have a canonical tag pointing to a different URL. Google indexes the canonical, not your page.

Why it matters: You want to rank for your URL, not the canonical. Wrong canonical wastes your SEO effort.

Fix: Canonical should point to itself (self-referential) or to a truly duplicate page. Not to a different page.

How to check: URL Inspection shows the canonical URL. If it's not your page, investigate.
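For a quick offline check of which canonical a page declares, Python's standard-library HTML parser is enough (the sample markup is hypothetical):

```python
from html.parser import HTMLParser

class CanonicalParser(HTMLParser):
    """Grabs the href of <link rel="canonical"> if the page declares one."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if attrs.get("rel", "").lower() == "canonical":
                self.canonical = attrs.get("href")

def find_canonical(html: str):
    """Return the declared canonical URL, or None if there isn't one."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical

page = '<head><link rel="canonical" href="https://example.com/pricing"></head>'
print(find_canonical(page))  # https://example.com/pricing
```

If the returned URL isn't the page's own URL, that's the mismatch to investigate.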

When to Run URL Inspection: Timing and Frequency

After launch: Day 1. Inspect your homepage and top 10 landing pages. Confirm Google can crawl them.

After publishing content: Inspect new blog posts or product pages within 24 hours of publishing. Confirm they're crawlable.

After site changes: Updated navigation? Changed domain structure? Migrated to HTTPS? Inspect affected URLs. Confirm no new issues.

Weekly during first 100 days: As part of Day 1 to Day 100: The Founder's SEO Onboarding, run URL Inspection on 5-10 new pages each week. Catch issues early.

Monthly after that: Run The 10-Minute SEO Review Every Founder Should Run Monthly. Inspect a sample of pages. Spot-check for new issues.

Don't obsess over it. URL Inspection is a diagnostic tool, not a ranking tool. Use it to fix problems, not to chase metrics.

Integration with Your Broader SEO Strategy

URL Inspection is one piece of a larger SEO system. It fits here:

Phase 1: Audit. Run a domain audit. Check crawlability, indexation, Core Web Vitals. URL Inspection is your primary tool here.

Phase 2: Keywords. Build a keyword roadmap. Target keywords with search volume and low competition.

Phase 3: Content. Create content for those keywords. Publish at scale if you're using AI.

Phase 4: Monitor. Track rankings, traffic, indexation. URL Inspection catches indexation drops.

As covered in The 5 Pillars of Modern SEO Every Founder Should Master, crawlability is the first pillar. You can't rank what you can't crawl. URL Inspection is how you verify crawlability.

This is why Seoable includes URL Inspection diagnostics in the domain audit phase. It's the foundation. Get it right, and the rest of SEO becomes predictable.

URL Inspection for Different Scenarios

Scenario 1: Pre-Launch Verification

You're shipping next week. You need to confirm your site is SEO-ready.

Action: Inspect your homepage, top 5 landing pages, and a sample blog post. Check for:

  • "URL is on Google" status (or "Page is crawlable" if not yet indexed)
  • No noindex tags
  • No robots.txt blocks
  • Core Web Vitals all green
  • Cached version shows correct content

If all checks pass, you're good to launch. If issues exist, fix them before going live.

Scenario 2: Post-Launch Troubleshooting

You launched two weeks ago. No traffic. You're wondering why.

Action: Inspect 10 URLs (homepage + top landing pages + recent blog posts). Check indexation status.

  • If "URL is not on Google": diagnose why. Fix crawl issues. Request indexing.
  • If "URL is on Google" but no traffic: ranking issue, not indexation. Different problem. Check keyword targeting and content quality.

URL Inspection won't solve ranking issues. But it will confirm indexation isn't the problem.

Scenario 3: Site Migration

You're moving from HTTP to HTTPS, or old-domain.com to new-domain.com, or www to non-www.

Action: Inspect old URLs and new URLs.

  • Old URLs should redirect to new URLs.
  • New URLs should be crawlable and indexable.
  • Check that redirects are 301 (permanent), not 302 (temporary).
  • Monitor Search Console for crawl errors during migration.

URL Inspection confirms redirects are working and new URLs are being indexed.
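The hop-by-hop verification can be sketched in a few lines. Here the fetcher is injected so the example runs offline; in real use you'd back it with an HTTP client that has redirects disabled and read each Location header (the URLs below are hypothetical):

```python
def trace_redirects(fetch, url, max_hops=10):
    """Walk a redirect chain hop by hop.

    `fetch` returns (status_code, location_header_or_None) for a URL,
    so the chain logic is testable without touching the network.
    """
    hops = []
    for _ in range(max_hops):
        status, location = fetch(url)
        hops.append((status, url))
        if status in (301, 302, 307, 308) and location:
            url = location
        else:
            return hops
    return hops  # gave up: chain longer than max_hops

# Simulated migration chain A -> B -> C:
chain = {
    "https://example.com/a": (301, "https://example.com/b"),
    "https://example.com/b": (302, "https://example.com/c"),
    "https://example.com/c": (200, None),
}
hops = trace_redirects(lambda u: chain[u], "https://example.com/a")
print(hops)
# [(301, 'https://example.com/a'), (302, 'https://example.com/b'), (200, 'https://example.com/c')]
```

More than one redirect hop, or any 302 where you meant a permanent move, is a finding to fix.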

Scenario 4: Content Decay

You published 50 blog posts. Some are ranking. Some aren't. You want to know why.

Action: Batch inspect all 50 URLs using the Search Console API or an SEO tool.

  • Identify which posts are indexed and which aren't.
  • For non-indexed posts, diagnose why (noindex, robots.txt, rendering issues).
  • For indexed posts that aren't ranking, the issue is content quality or keyword targeting, not indexation.

This is part of Week 4 of SEO: The Inflection Point Most Founders Miss. By week 4, you have enough content to diagnose patterns. URL Inspection reveals which content is discoverable.

Pro Tips and Warnings

Pro Tip 1: Check the Cached Version Before Assuming Rendering Issues

The cached version is what Google stored. If it looks wrong, Google had trouble rendering or parsing your page.

Before you rebuild your site with SSR, check:

  • Are images loading? (Check image URLs in cached version.)
  • Is text visible? (Check if content is present.)
  • Are styles applied? (Check if layout looks right.)

If images aren't loading, your image hosting might be blocked. If text isn't visible, your JavaScript might be broken. If styles aren't applied, your CSS might be blocked.

Small fixes often solve apparent rendering issues.

Pro Tip 2: Use URL Inspection to Validate Fixes

You fixed a noindex tag. You updated robots.txt. You optimized page speed.

Don't wait for Google to recrawl on its own. Use URL Inspection to test the fix immediately.

Click "Request live test." Google crawls the live URL right now. Confirms your fix worked.

This feedback loop is fast. Diagnose, fix, verify. All within minutes.

Pro Tip 3: Benchmark Competitors' Pages (With Other Tools)

You can't inspect competitors' URLs—URL Inspection and its API only work for properties you've verified in Search Console. But you can approximate it: a site:competitor.com/page search shows whether a page is indexed, and PageSpeed Insights or the public Chrome UX Report shows any site's Core Web Vitals.

This is useful for benchmarking. If a competitor's page is slow but still ranking, maybe speed isn't the issue. If their page has a noindex tag, maybe they're not trying to rank it.

Warning 1: Don't Confuse "Not Indexed" with "Not Ranking"

URL Inspection tells you if a page is indexed. It doesn't tell you if it's ranking.

A page can be indexed but not ranking for your target keyword. That's a content or authority issue, not an indexation issue.

If URL Inspection shows "URL is on Google," the indexation part is solved. If you're not getting traffic, look at keyword targeting, content quality, and backlinks. Not indexation.

Warning 2: Requesting Indexing Too Often Is Pointless

The "Request indexing" button is useful for new pages or critical updates. Don't use it for every page change.

Google will crawl and index your site on its own schedule. Requesting indexing prioritizes the URL, but doesn't guarantee immediate indexing.

Spamming the button burns your daily request quota without moving you up the queue.

Warning 3: Core Web Vitals Data Is Delayed

Core Web Vitals in URL Inspection come from real user data. If your page is new or gets little traffic, there might not be enough data.

The field data comes from the Chrome UX Report, which only covers URLs with enough real-user traffic—Google doesn't publish the exact threshold. New pages often show "Not enough data."

Wait a few days for traffic to accumulate. Then check again.

In the meantime, use PageSpeed Insights for synthetic lab tests.
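PageSpeed Insights also has a public API, so you can pull lab scores for any URL while you wait for field data. A stdlib sketch—the endpoint is the documented v5 runPagespeed route, and the sample response is abridged so the parsing runs offline:

```python
import json
import urllib.parse
import urllib.request

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def run_pagespeed(url: str) -> dict:
    """Fetch a synthetic Lighthouse run from the PageSpeed Insights API.
    (Works without an API key for light use; a key raises the quota.)"""
    query = urllib.parse.urlencode({"url": url, "category": "performance"})
    with urllib.request.urlopen(f"{PSI}?{query}") as response:
        return json.load(response)

def performance_score(report: dict) -> int:
    """Lighthouse performance score, rescaled to 0-100."""
    score = report["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)

# Abridged response shape, so the parsing is testable offline:
sample = {"lighthouseResult": {"categories": {"performance": {"score": 0.87}}}}
print(performance_score(sample))  # 87
```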

Integrating URL Inspection Into Your Weekly Workflow

As a busy founder, you don't have time for daily SEO tasks. But you can spare 10 minutes a week.

Here's how URL Inspection fits into a weekly SEO review:

Monday: Review last week's content. Identify 5 new pages to inspect.

Tuesday: Batch inspect those 5 pages. Note any issues.

Wednesday: Fix critical issues (noindex tags, robots.txt blocks, slow pages).

Thursday: Request indexing for new pages. Verify fixes with live tests.

Friday: Log issues in a spreadsheet. Prioritize fixes for next week.
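The Friday log can be a plain CSV you append to each week. A sketch—the filename, URLs, and findings are examples:

```python
import csv
from datetime import date

# This week's inspection findings: (url, indexation status, issue if any).
findings = [
    ("https://example.com/blog/launch", "URL is not on Google", "noindex tag detected"),
    ("https://example.com/pricing", "URL is on Google", ""),
]

# Append rows to a running log; open in append mode so history accumulates.
with open("inspection-log.csv", "a", newline="") as log:
    writer = csv.writer(log)
    for url, status, issue in findings:
        writer.writerow([date.today().isoformat(), url, status, issue])
```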

This is part of The 10-Minute SEO Review Every Founder Should Run Monthly. URL Inspection is the first step. Identify problems. Fix them. Move on.

Don't get bogged down in SEO details. Use URL Inspection to diagnose quickly, then focus on what matters: content and keywords.

Key Takeaways: What You Need to Remember

URL Inspection is a diagnostic tool, not a ranking tool. It tells you if Google can crawl and index your page. It doesn't tell you why you're not ranking. Use it to fix indexation issues, not to chase rankings.

30 seconds to diagnose. Paste a URL. Get results in seconds. You see indexation status, crawl issues, Core Web Vitals, and cached content. Fast feedback loop.

Indexation comes before ranking. Most founders optimize for rankings before pages are indexed. URL Inspection forces you to check indexation first. Right order of operations.

Common issues are easy to fix. Noindex tags, robots.txt blocks, redirect chains, slow pages. URL Inspection shows you exactly what's wrong. Fixes are straightforward.

Use it early and often. Pre-launch verification. Post-launch troubleshooting. Weekly reviews. URL Inspection catches issues fast, before they compound.

Integrate with your broader SEO system. URL Inspection is part of the audit phase. It's the foundation. Get crawlability right, then focus on keywords and content.

As you scale, use SEO Triage for Busy Founders: The 80/20 You Can't Skip to prioritize what matters. URL Inspection is part of that triage. Diagnose fast. Fix critical issues. Ship.

Final Word

You shipped code. Google didn't index it. You assumed SEO takes months. It does—if you're guessing.

URL Inspection removes the guessing. It shows you exactly what Google sees. In 30 seconds, you know if indexation is the problem. If it is, you fix it. If it's not, you move on.

Most founders never use this tool. That's why they're invisible. You now have an unfair advantage. Use it.

Start with your homepage. Inspect it today. See what Google sees. Fix anything broken. Request indexing if needed.

Then inspect your top 5 landing pages. Same process.

Then integrate URL Inspection into your weekly workflow. 10 minutes a week. Catches issues before they become problems.

This is how you build organic visibility without agencies. Diagnose fast. Fix ruthlessly. Ship.

For a complete SEO system, check out Your First 100 Days of SEO: A Day-by-Day Founder Playbook. URL Inspection is part of the foundation. The rest is keywords, content, and time.

You've got this.

§ The Dispatch

Get the next dispatch on Monday.

One email per week with the most important SEO and AEO moves for founders. Unsubscribe in one click.

Free · Weekly · Unsubscribe anytime