Guide · #399

JavaScript SEO: What Founders on Next.js Need to Know

Next.js JavaScript SEO guide for founders. Fix rendering issues, boost crawlability, and get organic visibility in 2026. Technical playbook inside.

Filed: March 19, 2026
Read: 16 min
Author: The Seoable Team

The JavaScript SEO Problem That's Costing You Traffic

You shipped a Next.js app. It's fast. It works. Users love it. But Google doesn't know you exist.

This is the JavaScript SEO trap. Your site renders beautifully in the browser, but search engines see an empty shell—or worse, partially rendered content that looks broken to the crawler. You're invisible not because your product is bad, but because the way you built it breaks how Google indexes the web.

The brutal truth: most founders on Next.js don't know their site has a JavaScript SEO problem until traffic plateaus. By then, you've lost months of organic visibility.

This guide fixes that. We'll walk through the specific JavaScript rendering issues that kill SEO on Next.js, show you exactly how to diagnose them, and give you step-by-step fixes you can ship today. No agency needed. No guessing.

Prerequisites: What You Need Before Starting

Before you dive into JavaScript SEO fixes, make sure you have these in place:

Technical setup:

  • A Next.js project (App Router or Pages Router—both covered here)
  • Access to your site's source code and deployment environment
  • A staging environment where you can test changes before shipping to production
  • Node.js and npm/yarn installed locally

Tools and access:

  • Google Search Console, verified for your domain
  • PageSpeed Insights and Chrome DevTools (both free; you'll use them in Step 1)
  • Access to your hosting/deployment dashboard (Vercel, AWS, etc.)

Knowledge:

  • Basic understanding of how Next.js renders pages (SSR vs. SSG)
  • Familiarity with your site's URL structure and page hierarchy

If you're missing any of these, pause here and set them up. The fixes that follow won't work without them.

Why JavaScript Rendering Breaks SEO on Next.js

Google's crawlers have gotten better at JavaScript. They're not perfect, but they're competent. So why does your Next.js site still have JavaScript SEO problems?

The answer: it's not that Google can't render JavaScript. It's that your site was built in a way that makes crawling inefficient, expensive, or impossible.

Here's what happens:

The rendering pipeline problem. When Google crawls your Next.js page, it doesn't just fetch the HTML and index it like it did in 2010. It has to:

  1. Request the HTML
  2. Parse the HTML
  3. Download all JavaScript bundles
  4. Execute the JavaScript (which takes CPU and memory)
  5. Wait for API calls to complete
  6. Re-render the DOM
  7. Wait for client-side state to settle
  8. Finally, index the rendered content

Steps 3–7 are expensive. Google has a crawl budget, and expensive pages get crawled less frequently: a page that takes 10 seconds to render gets revisited far less often than one that renders in under a second.

The hydration mismatch problem. Next.js renders on the server, then "hydrates" on the client (JavaScript takes over). If the server-rendered HTML doesn't match what the client renders, React throws hydration errors, content can flicker or break, and Google may end up indexing something different from what your users see.

The API dependency problem. Many Next.js apps fetch data from APIs on the client side. Google's crawler might not wait for these calls to complete. So it indexes your page before the content loads. You rank for nothing.

The dynamic content problem. If your page content is generated by JavaScript after the page loads, Google might miss it entirely. This is especially common in single-page app (SPA) patterns that some teams incorrectly use with Next.js.

These aren't theoretical problems. They're the reason our robots, sitemaps, and canonicals guide matters—misconfigured robots.txt can prevent Google from even attempting to crawl your JavaScript in the first place.

Step 1: Diagnose Your JavaScript SEO Issues

Before you fix anything, you need to know what's broken. This step takes 20 minutes and tells you exactly what Google sees.

Run a URL Inspection in Google Search Console

Go to Google Search Console (if you haven't set it up, do it now).

  1. Navigate to your property
  2. Click "URL Inspection" in the left sidebar
  3. Paste in your homepage URL
  4. Wait for Google's report to load
  5. Click "View Tested Page" to see what Google actually sees
  6. Compare it to what your browser shows

If they're different, you have a JavaScript rendering problem. Write down the differences.

Check PageSpeed Insights for Rendering Issues

Go to PageSpeed Insights.

  1. Enter your homepage URL
  2. Run the test (takes 30–60 seconds)
  3. Scroll to the "Diagnostics" section
  4. Look for these red flags:
    • "Largest Contentful Paint (LCP) is too high" (should be under 2.5 seconds)
    • "Cumulative Layout Shift (CLS) is too high" (should be under 0.1)
    • "First Input Delay (FID) is too high" (should be under 100ms)
    • "Unused JavaScript" (indicates bloated bundles)
    • "Render-blocking resources" (CSS or JS that delays page rendering)

These metrics directly impact crawlability. A high LCP means Google's renderer may give up before your main content ever appears.
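If you'd rather script this check than click through the web UI, the same Lighthouse data is available through the public PageSpeed Insights API (v5). A minimal Node 18+ sketch; the page URL is a placeholder, and heavy use may require an API key:

// check-psi.mjs: lab values for LCP, CLS, and unused JavaScript

const pageUrl = 'https://example.com'; // swap in your page
const apiUrl =
  'https://www.googleapis.com/pagespeedonline/v5/runPagespeed' +
  `?url=${encodeURIComponent(pageUrl)}&strategy=mobile`;

const data = await fetch(apiUrl).then((r) => r.json());
const audits = data.lighthouseResult.audits;

console.log('LCP:', audits['largest-contentful-paint'].displayValue);
console.log('CLS:', audits['cumulative-layout-shift'].displayValue);
console.log('Unused JS:', audits['unused-javascript']?.displayValue ?? 'none flagged');

Run it with node check-psi.mjs, and re-run it after each fix to confirm the numbers actually move.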

Inspect the Network Timeline in Chrome DevTools

  1. Open your site in Chrome
  2. Press F12 to open DevTools
  3. Go to the "Network" tab
  4. Reload the page
  5. Look at the waterfall chart
  6. Identify which resources load last and take the longest
  7. Check the "Initiator" column to see if JavaScript is causing late-loading resources

If you see a bunch of requests firing after the page loads, that's client-side data fetching. Google might not wait for it.
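A quick way to confirm this is to look at the HTML your server actually returns, before any JavaScript runs. The sketch below fetches a page and checks whether a phrase that should be visible is already in that initial HTML (the URL and phrase are placeholders):

// raw-html-check.mjs: is the content in the server response, or only after JS runs?

const pageUrl = 'https://example.com/blog/some-post'; // swap in a real page
const phrase = 'Your headline here';                  // text that should be on the page

const html = await fetch(pageUrl).then((r) => r.text());

console.log(
  html.includes(phrase)
    ? 'Found in the initial HTML; crawlers can see it without running JavaScript'
    : 'Missing from the initial HTML; it only appears after client-side JavaScript runs'
);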

Check for Hydration Mismatches

  1. Open DevTools on your site
  2. Go to the "Console" tab
  3. Look for errors like "Hydration mismatch" or "Text content did not match"
  4. If you see these, your server and client are rendering different HTML

Write down everything you find. This is your diagnosis.

Step 2: Fix Server-Side Rendering (SSR) and Static Generation (SSG)

This is the most important fix. It addresses the root cause: Google's crawler waiting forever for content to render.

Use Static Generation (SSG) for Content That Doesn't Change Often

If your page content doesn't change every second, generate it at build time. Google will index the static HTML instantly.

In Next.js App Router, use generateStaticParams and revalidate:

// app/blog/[slug]/page.js

export async function generateStaticParams() {
  const posts = await fetch('https://api.example.com/posts').then(r => r.json());
  return posts.map(post => ({ slug: post.slug }));
}

export const revalidate = 3600; // Revalidate every hour

export default async function BlogPost({ params }) {
  const post = await fetch(`https://api.example.com/posts/${params.slug}`)
    .then(r => r.json());

  return (
    <article>
      <h1>{post.title}</h1>
      {/* post content */}
    </article>
  );
}

This tells Next.js to:

  1. Generate static HTML for every blog post at build time
  2. Serve that static HTML to Google (and users) instantly
  3. Re-generate it every hour in the background

Static HTML gets indexed as soon as Google crawls it; content that only appears after JavaScript runs has to wait in Google's render queue.
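If you're still on the Pages Router, the equivalent is getStaticPaths plus getStaticProps with revalidate. A sketch against the same hypothetical API as above:

// pages/blog/[slug].js

export async function getStaticPaths() {
  const posts = await fetch('https://api.example.com/posts').then(r => r.json());
  return {
    paths: posts.map(post => ({ params: { slug: post.slug } })),
    fallback: 'blocking', // render unknown slugs on first request, then cache them
  };
}

export async function getStaticProps({ params }) {
  const post = await fetch(`https://api.example.com/posts/${params.slug}`)
    .then(r => r.json());
  return {
    props: { post },
    revalidate: 3600, // re-generate at most once an hour
  };
}

export default function BlogPost({ post }) {
  return <h1>{post.title}</h1>;
}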

Use Server-Side Rendering (SSR) for Content That Changes Frequently

If your page content changes often (e.g., real-time data, user-specific content), use SSR. Don't use client-side rendering.

In the App Router, components render on the server by default, so you can fetch data directly in the page; cache: 'no-store' makes it re-render on every request:

// app/dashboard/page.js

export default async function Dashboard() {
  const data = await fetch('https://api.example.com/user-data', {
    cache: 'no-store', // Disable caching for real-time data
  }).then(r => r.json());

  return (
    <div>
      <h1>{data.title}</h1>
      <p>{data.content}</p>
    </div>
  );
}

This renders on the server before sending HTML to the browser. Google gets the full, rendered HTML instantly. No waiting for JavaScript.
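On the Pages Router, the equivalent is getServerSideProps. Again a sketch, reusing the hypothetical API from above:

// pages/dashboard.js

export async function getServerSideProps() {
  const data = await fetch('https://api.example.com/user-data').then(r => r.json());
  return { props: { data } }; // rendered on the server for every request
}

export default function Dashboard({ data }) {
  return (
    <div>
      <h1>{data.title}</h1>
      <p>{data.content}</p>
    </div>
  );
}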

Avoid Client-Side Rendering (CSR) for SEO-Critical Content

Client-side rendering is the biggest JavaScript SEO mistake. It looks like this:

// DON'T DO THIS for SEO-critical pages
import { useEffect, useState } from 'react';

export default function Page() {
  const [data, setData] = useState(null);

  useEffect(() => {
    fetch('https://api.example.com/data')
      .then(r => r.json())
      .then(d => setData(d));
  }, []);

  return <div>{data?.title}</div>;
}

Here's what Google sees:

  1. Empty <div> (no title)
  2. Waits for JavaScript to run
  3. Waits for the API call
  4. Finally sees the title

But Google might not wait. It indexes the empty page. You rank for nothing.

The fix: Move that fetch to the server.

// DO THIS instead
export default async function Page() {
  const data = await fetch('https://api.example.com/data').then(r => r.json());
  return <div>{data.title}</div>;
}

Now Google gets the full HTML immediately.

Step 3: Implement Proper Metadata and Open Graph Tags

Google needs to understand what your page is about. Use the Metadata API to tell it.

In Next.js App Router, add metadata to each page:

// app/blog/[slug]/page.js

export async function generateMetadata({ params }) {
  const post = await fetch(`https://api.example.com/posts/${params.slug}`)
    .then(r => r.json());

  return {
    title: post.title,
    description: post.excerpt,
    openGraph: {
      title: post.title,
      description: post.excerpt,
      url: `https://example.com/blog/${params.slug}`,
      type: 'article',
      publishedTime: post.publishedAt,
      authors: [post.author],
    },
    alternates: {
      canonical: `https://example.com/blog/${params.slug}`,
    },
  };
}

This tells Google (and AI engines like ChatGPT) what your page is about. It also fixes social sharing—when someone shares your link on Twitter, it shows the right title and image.
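For pages whose metadata doesn't depend on fetched data (homepage, pricing, about), you can export a static metadata object instead. A minimal sketch; every title, description, and URL below is a placeholder:

// app/page.js (or app/layout.js for site-wide defaults)

export const metadata = {
  metadataBase: new URL('https://example.com'),
  title: 'Acme: one-line description of what you do',
  description: 'The sentence you want Google to show under your homepage result.',
  alternates: { canonical: '/' },
  openGraph: {
    title: 'Acme: one-line description of what you do',
    description: 'The sentence you want Google to show under your homepage result.',
    url: 'https://example.com',
    type: 'website',
  },
};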

For your homepage and key pages, add structured data (JSON-LD). Here's our guide to Organization schema, which adds trust signals Google uses to understand your brand.

Step 4: Optimize JavaScript Bundle Size

Large JavaScript bundles slow down rendering. Slow rendering means Google crawls less frequently. Smaller bundles = faster crawling = more frequent indexing.

Use Code Splitting

Next.js does this automatically, but you can optimize it further:

// Lazy-load components that aren't critical
import dynamic from 'next/dynamic';

const HeavyComponent = dynamic(() => import('./heavy-component'), {
  loading: () => <div>Loading...</div>,
});

export default function Page() {
  return (
    <div>
      <h1>Fast content that loads first</h1>
      <HeavyComponent /> {/* Loads later */}
    </div>
  );
}

Google crawls the "Fast content" immediately. The heavy component loads later, after crawling is done.

Remove Unused JavaScript

Run PageSpeed Insights again. It shows you unused JavaScript. Remove it:

  1. Identify which npm packages you're not actually using
  2. Remove them: npm uninstall package-name
  3. Check your imports—remove unused ones
  4. Use tree-shaking in your build config to eliminate dead code

Every kilobyte you remove makes your site faster and more crawlable.
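To see exactly which packages are inflating those bundles, @next/bundle-analyzer draws a visual treemap of every chunk at build time. A minimal setup sketch (install it first with npm install @next/bundle-analyzer):

// next.config.js

const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true', // only runs when you opt in
});

module.exports = withBundleAnalyzer({
  // ...your existing Next.js config
});

Then build with ANALYZE=true npm run build and open the report it generates to find the heaviest dependencies.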

Minify and Compress

Next.js does this by default in production, but verify it:

  1. Deploy to production
  2. Open DevTools Network tab
  3. Check the size of JavaScript bundles
  4. If they're large (>500KB for a homepage), something's wrong

Step 5: Fix Hydration Mismatches

If your server and client render different HTML, Google gets confused. Fix it.

Identify Hydration Mismatches

Open DevTools Console on your site. Look for errors:

Warning: Hydration failed because the initial UI does not match what was rendered on the server.

If you see this, your server and client are out of sync.

Common Causes and Fixes

Problem: Using new Date() or Math.random() in rendering

These produce different values on server and client:

// WRONG
export default function Page() {
  return <div>{new Date().toISOString()}</div>; // Different on server vs. client
}

Fix: Use useEffect to run only on client

'use client';
import { useEffect, useState } from 'react';

export default function Page() {
  const [date, setDate] = useState('');

  useEffect(() => {
    setDate(new Date().toISOString());
  }, []);

  return <div>{date}</div>;
}

Problem: Rendering different content based on browser detection

// WRONG
const isMobile = typeof window !== 'undefined' && window.innerWidth < 768;

export default function Page() {
  return <div>{isMobile ? 'Mobile' : 'Desktop'}</div>; // Server doesn't know window size
}

Fix: Use CSS media queries instead

export default function Page() {
  return (
    <div className="responsive">
      Content
    </div>
  );
}

// In your CSS
@media (max-width: 768px) {
  .responsive { /* mobile styles */ }
}

Problem: Conditional rendering based on typeof window

// WRONG
if (typeof window !== 'undefined') {
  // This code runs on client but not server
  // Server renders different HTML
}

Fix: Use suppressHydrationWarning as a temporary band-aid, but move the logic to useEffect

'use client';
import { useEffect, useState } from 'react';

export default function Page() {
  const [isClient, setIsClient] = useState(false);

  useEffect(() => {
    setIsClient(true);
  }, []);

  return <div>{isClient ? 'Client-only content' : 'Server content'}</div>;
}

Step 6: Set Up Proper Sitemaps and Robots.txt

Google needs a roadmap to crawl your site. This is foundational.

Generate a Dynamic Sitemap

In Next.js, create a dynamic sitemap:

// app/sitemap.js

export default async function sitemap() {
  const baseUrl = 'https://example.com';
  const posts = await fetch('https://api.example.com/posts')
    .then(r => r.json());

  const postUrls = posts.map(post => ({
    url: `${baseUrl}/blog/${post.slug}`,
    lastModified: new Date(post.updatedAt),
    changeFrequency: 'weekly',
    priority: 0.8,
  }));

  return [
    {
      url: baseUrl,
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 1,
    },
    ...postUrls,
  ];
}

Deploy this. Next.js automatically creates /sitemap.xml. Here's our complete guide to generating sitemaps for every stack.

Configure Robots.txt

Create public/robots.txt:

User-agent: *
Allow: /
Disallow: /admin
Disallow: /api
Disallow: /private

Sitemap: https://example.com/sitemap.xml

This tells Google what to crawl and where to find your sitemap. Here's our deep dive on robots.txt, sitemaps, and canonicals—most founders get these wrong.
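If you'd rather keep this in code next to your sitemap, Next.js can also generate robots.txt from an app/robots.js file. A sketch mirroring the rules above:

// app/robots.js

export default function robots() {
  return {
    rules: {
      userAgent: '*',
      allow: '/',
      disallow: ['/admin', '/api', '/private'],
    },
    sitemap: 'https://example.com/sitemap.xml',
  };
}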

Step 7: Use Structured Data (JSON-LD)

Structured data helps Google understand your content and AI engines (like ChatGPT and Perplexity) cite you. This is critical for 2026.

Add Article Schema to Blog Posts

// app/blog/[slug]/page.js

export default async function BlogPost({ params }) {
  const post = await fetch(`https://api.example.com/posts/${params.slug}`)
    .then(r => r.json());

  const schema = {
    '@context': 'https://schema.org',
    '@type': 'BlogPosting',
    headline: post.title,
    description: post.excerpt,
    image: post.image,
    datePublished: post.publishedAt,
    author: {
      '@type': 'Person',
      name: post.author,
    },
  };

  return (
    <>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(schema) }}
      />
      <article>
        <h1>{post.title}</h1>
        <p>{post.excerpt}</p>
        {/* content */}
      </article>
    </>
  );
}

For more on structured data, check out our guide to setting up schema markup with Google's Rich Results Test.

Add Organization Schema to Homepage

This tells Google who you are. Our Organization schema guide walks you through it in 5 minutes.
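A minimal sketch of what that looks like in your root layout; the name, logo path, and profile URLs are placeholders to replace with your own:

// app/layout.js

const orgSchema = {
  '@context': 'https://schema.org',
  '@type': 'Organization',
  name: 'Acme',
  url: 'https://example.com',
  logo: 'https://example.com/logo.png',
  sameAs: [
    'https://twitter.com/acme',
    'https://www.linkedin.com/company/acme',
  ],
};

export default function RootLayout({ children }) {
  return (
    <html lang="en">
      <body>
        <script
          type="application/ld+json"
          dangerouslySetInnerHTML={{ __html: JSON.stringify(orgSchema) }}
        />
        {children}
      </body>
    </html>
  );
}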

Step 8: Monitor and Iterate

You've fixed the issues. Now measure the impact.

Set Up Core Web Vitals Monitoring

Go to Google Search Console and check the "Core Web Vitals" report every week. You're looking for:

  • LCP (Largest Contentful Paint): Should be under 2.5 seconds
  • INP (Interaction to Next Paint): Should be under 200ms (INP replaced FID in Google's Core Web Vitals report)
  • CLS (Cumulative Layout Shift): Should be under 0.1

If these improve after your fixes, you're on the right track.
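If you also want per-user field data alongside Search Console's aggregate report, Next.js exposes a useReportWebVitals hook. A sketch that just logs each metric; the /api/vitals endpoint is a placeholder for wherever you collect analytics:

// app/web-vitals.js
'use client';

import { useReportWebVitals } from 'next/web-vitals';

export function WebVitals() {
  useReportWebVitals((metric) => {
    // metric.name is 'LCP', 'CLS', 'INP', etc.; metric.value is the measurement
    console.log(metric.name, metric.value);

    // Placeholder endpoint: swap in your analytics collector
    navigator.sendBeacon?.('/api/vitals', JSON.stringify(metric));
  });

  return null; // renders nothing; mount <WebVitals /> once in your root layout
}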

Track Rankings and Organic Traffic

  1. Set up Google Analytics 4 (if you haven't: here's our free SEO tool stack guide)
  2. Check Google Search Console Performance reports weekly
  3. Track:
    • Organic traffic (should go up)
    • Average ranking position (should go down—lower is better)
    • Click-through rate (CTR) from search results
    • Crawl errors (should go down)

Re-run Diagnostics Monthly

Every month:

  1. Run PageSpeed Insights on your homepage and 2–3 key pages
  2. Run a URL inspection in Google Search Console
  3. Check for new console errors in DevTools
  4. Compare to your baseline

If metrics are improving, keep going. If they plateau, dig deeper.

Pro Tips and Warnings

⚠️ Warning: Don't Use Next.js as an SPA

Don't do this:

// WRONG - This is a single-page app, not a Next.js app
'use client';
import { useState } from 'react';

export default function App() {
  const [page, setPage] = useState('home');

  return (
    <div>
      {page === 'home' && <Home />}
      {page === 'about' && <About />}
    </div>
  );
}

Next.js gives every page its own URL and server-rendered HTML. Use its file-based routing instead of swapping components with client state. If you genuinely need a pure client-side SPA, use a dedicated client-side router, but expect SEO challenges.

✅ Pro Tip: Use Vercel Analytics to Monitor Real-World Performance

Vercel Analytics shows you real-world Core Web Vitals from actual users. It's more accurate than lab tests. Check it weekly.

✅ Pro Tip: Prerender Heavy Pages

If you have a page that's expensive to render and identical for every visitor (e.g., a big public reports or stats page), prerender it at build time with route segment config (the file path and API below are placeholders):

// app/reports/page.js: a hypothetical heavy, but public, page

// Force static generation and refresh the page in the background
export const dynamic = 'force-static';
export const revalidate = 600; // re-generate at most every 10 minutes

export default async function ReportsPage() {
  const data = await fetch('https://api.example.com/reports').then(r => r.json());

  return (
    <div>
      <h1>{data.title}</h1>
      {/* render the heavy report */}
    </div>
  );
}

This serves cached, prerendered HTML, so Google (and users) get the page instantly.

✅ Pro Tip: Use revalidatePath for Incremental Static Regeneration

If you update content, tell Next.js to re-render the page:

// In your API route or server action
import { revalidatePath } from 'next/cache';

export async function updatePost(id, data) {
  await db.posts.update(id, data);
  revalidatePath(`/blog/${id}`); // Re-render this page
}

Now when you update a blog post, the page re-renders automatically. Google sees the new version on the next crawl.

Troubleshooting Common Issues

"Google says my page isn't indexed"

Check these in order:

  1. Robots.txt is blocking it: Go to Google Search Console > URL Inspection. Does it say "Blocked by robots.txt"? If yes, fix your public/robots.txt.
  2. No sitemap: Submit your sitemap in Google Search Console > Sitemaps.
  3. Noindex tag: Check your metadata. Are you accidentally adding noindex? (See the sketch after this list.)
  4. Rendering timeout: Run PageSpeed Insights. If LCP is well over 5 seconds, Google's renderer may give up before your content appears.
  5. Duplicate content: Check our canonical domain guide. Are you serving the same content on multiple URLs?
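For point 3, an accidental noindex in the App Router usually looks like the sketch below; make sure nothing like it ships on pages you want indexed (the file path is a placeholder):

// app/some-page/page.js

export const metadata = {
  robots: {
    index: false,  // this tells Google NOT to index the page
    follow: false, // remove the whole robots block on public pages
  },
};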

"My content ranks, but traffic is low"

You're ranking but not getting clicks. This usually means:

  1. Bad title or meta description: Make them compelling. Google shows them in search results. If they're boring, people don't click.
  2. Wrong keywords: You're ranking for keywords nobody searches for. Here's our guide to search intent.
  3. Low CTR: Check Google Search Console Performance reports. If your CTR is below 2%, your titles/descriptions need work.

"My site is fast, but still not ranking"

Speed matters, but it's not everything. Check these:

  1. Content quality: Is your content better than the competition? Our search intent guide helps you match what users actually want.
  2. Backlinks: Do sites link to you? If not, Google sees you as less authoritative. This is hard to fix without PR.
  3. E-E-A-T: Google favors content from experts with experience. Is your author credible? Add author bylines and bios.
  4. Search volume: Are you targeting keywords people actually search for? Use Google Trends or Ahrefs to check.

Next Steps: Build on Your Foundation

You've fixed JavaScript SEO on Next.js. Now scale it.

Short term (next 2 weeks):

  • Deploy the fixes in this guide
  • Monitor Google Search Console for indexing improvements
  • Check PageSpeed Insights weekly

Medium term (next 2 months):

Long term (next 6 months):

Key Takeaways

JavaScript SEO on Next.js isn't hard—it just requires the right approach.

  1. Use static generation (SSG) for content that doesn't change. Google crawls it instantly. This is the biggest win.
  2. Use server-side rendering (SSR) for dynamic content. Never use client-side rendering for SEO-critical pages.
  3. Implement proper metadata and structured data. Tell Google (and AI engines) what your pages are about.
  4. Optimize bundle size and Core Web Vitals. Faster sites get crawled more frequently.
  5. Fix hydration mismatches. Server and client must render identical HTML.
  6. Set up sitemaps and robots.txt correctly. These are foundational. Most founders get them wrong.
  7. Monitor weekly. Check Google Search Console, PageSpeed Insights, and Google Analytics. Data drives decisions.

You don't need an agency. You don't need to hire an SEO specialist. You need the right tools and a playbook. You have both now.

Ship these fixes today. Measure the impact next week. Scale from there.

The founders who win in 2026 aren't the ones with the biggest budgets—they're the ones who ship SEO foundations fast and iterate on what works. You're now equipped to do exactly that.
