Why Your Next.js Pages Aren't Showing Up in Google (And How to Fix It)
Understanding the Problem
When Google's crawler (Googlebot) visits your site, it needs clear signals about:
Which pages exist and should be indexed
How to find those pages
Whether those pages are important enough to index
If any of these signals are missing or conflicting, Google might simply skip your pages—even if you've manually requested indexing.
The Top 5 Reasons Your Pages Aren't Indexing
1. robots.txt is Blocking Your Pages
This is the #1 culprit. Your robots.txt file tells search engines which parts of your site they can crawl. A misconfigured robots.txt can block everything except your homepage.
Check your robots.txt: Visit https://yoursite.com/robots.txt
❌ Bad (blocks everything):

```txt
User-agent: *
Disallow: /
```

✅ Good (allows crawling):

```txt
User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```
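If you want a quick programmatic sanity check, you can scan the file for a blanket block. This is a minimal sketch, not a full robots.txt parser — `isFullyBlocked` is a hypothetical helper that only inspects the wildcard (`*`) group:

```typescript
// Returns true if a robots.txt body blocks all crawlers from the entire site.
// Minimal sketch: only looks at "Disallow: /" rules inside the wildcard group.
function isFullyBlocked(robotsTxt: string): boolean {
  let inWildcardGroup = false;
  for (const raw of robotsTxt.split("\n")) {
    const line = raw.trim().toLowerCase();
    if (line.startsWith("user-agent:")) {
      inWildcardGroup = line.slice("user-agent:".length).trim() === "*";
    } else if (inWildcardGroup && line === "disallow: /") {
      return true; // blanket block applies to every crawler
    }
  }
  return false;
}

console.log(isFullyBlocked("User-agent: *\nDisallow: /")); // true
console.log(isFullyBlocked("User-agent: *\nAllow: /"));    // false
```

Fetch `https://yoursite.com/robots.txt` and run its body through a check like this before assuming crawling is allowed.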
Fix it in Next.js App Router:
Create `app/robots.ts`:

```typescript
import { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: "*",
        allow: "/",
        disallow: ["/admin/", "/api/"], // Only block private sections
      },
    ],
    sitemap: "https://yoursite.com/sitemap.xml",
  };
}
```
2. Your Sitemap is Missing Pages
A sitemap is like a roadmap for Google—it lists all the pages you want indexed. If a page isn't in your sitemap, Google might never find it.
Check your sitemap: Visit https://yoursite.com/sitemap.xml
Make sure it includes:
Your homepage
Static pages (About, Services, Contact)
Blog index page
All individual blog posts
Category pages
Create a dynamic sitemap in Next.js:
```typescript
// app/sitemap.ts
import { MetadataRoute } from 'next'
import { prisma } from '@/lib/prisma'

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const baseUrl = 'https://yoursite.com'

  // Fetch all published blog posts
  const posts = await prisma.blogPost.findMany({
    where: { published: true },
    select: {
      slug: true,
      updatedAt: true,
    },
  })

  return [
    // Static pages
    {
      url: baseUrl,
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 1,
    },
    {
      url: `${baseUrl}/about`,
      lastModified: new Date(),
      changeFrequency: 'monthly',
      priority: 0.8,
    },
    {
      url: `${baseUrl}/blog`,
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 0.9,
    },
    // Dynamic blog posts
    ...posts.map((post) => ({
      url: `${baseUrl}/blog/${post.slug}`,
      lastModified: post.updatedAt,
      changeFrequency: 'weekly' as const,
      priority: 0.7,
    })),
  ]
}
```

3. Missing or Incorrect Metadata
Every page needs proper metadata to help Google understand what it's about. Missing titles, descriptions, or canonical URLs can hurt your indexing chances.
Add comprehensive metadata to every page:
```typescript
// app/about/page.tsx
import { Metadata } from 'next'

export const metadata: Metadata = {
  title: 'About Us | Your Company Name',
  description: 'Learn about our team, mission, and the services we provide.',
  alternates: {
    canonical: 'https://yoursite.com/about',
  },
  robots: {
    index: true,
    follow: true,
  },
  openGraph: {
    title: 'About Us',
    description: 'Learn about our team...',
    url: 'https://yoursite.com/about',
    type: 'website',
  },
}

export default function About() {
  return <div>About content...</div>
}
```

4. No Internal Links to Your Pages
Google discovers new pages by following links. If your About page or Blog aren't linked from your homepage or navigation menu, Google might never find them—even if they're in your sitemap.
Add prominent internal links:
```typescript
// components/Header.tsx
import Link from 'next/link'

export default function Header() {
  return (
    <nav>
      <Link href="/">Home</Link>
      <Link href="/about">About</Link>
      <Link href="/blog">Blog</Link>
      <Link href="/services">Services</Link>
    </nav>
  )
}
```
Link between blog posts:
```typescript
// At the end of each blog post
<div className="mt-8">
  <h3>Related Articles</h3>
  <Link href="/blog/post-1">Read: How to Build...</Link>
  <Link href="/blog/post-2">Read: Best Practices...</Link>
</div>
```

5. Pages Have noindex Tags
Sometimes metadata accidentally includes a noindex directive that explicitly tells Google NOT to index a page.
Check your page source: View the HTML source and search for:
```html
<meta name="robots" content="noindex">
```
If you find this, remove it immediately.
Ensure indexing is explicitly enabled:
```typescript
export const metadata = {
  robots: {
    index: true, // ✅ Explicitly allow indexing
    follow: true,
  },
}
```

The Step-by-Step Fix
Here's your action plan to get your pages indexed:
Step 1: Verify Your Setup
Check robots.txt: Make sure it's not blocking your pages
Verify sitemap.xml: Confirm all pages are listed
View page source: Look for noindex tags
Check metadata: Ensure every page has proper titles and descriptions
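These checks can be partially automated. Below is a minimal sketch that flags common indexing blockers in a page's HTML; `auditHtml` is a hypothetical helper using rough regex checks rather than a real HTML parser, so treat its results as hints only:

```typescript
// Rough audit of a page's HTML for common indexing blockers.
// Regex-based sketch, not a real HTML parser; false negatives are possible.
function auditHtml(html: string): string[] {
  const issues: string[] = [];
  if (/<meta[^>]+name=["']robots["'][^>]*noindex/i.test(html)) {
    issues.push('noindex meta tag present');
  }
  if (!/<title>[^<]+<\/title>/i.test(html)) {
    issues.push('missing <title>');
  }
  if (!/<meta[^>]+name=["']description["']/i.test(html)) {
    issues.push('missing meta description');
  }
  if (!/<link[^>]+rel=["']canonical["']/i.test(html)) {
    issues.push('missing canonical link');
  }
  return issues;
}

// Example: on Node 18+ you could fetch a live page first, e.g.
// fetch('https://yoursite.com/about').then(r => r.text()).then(h => console.log(auditHtml(h)));
console.log(auditHtml('<meta name="robots" content="noindex">'));
```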
Step 2: Submit to Google Search Console
Go to Google Search Console
Use the URL Inspection tool
Enter each URL (About, Blog, individual posts)
Click Request Indexing
Step 3: Add Internal Links
Make sure your homepage and navigation link to:
About page
Blog index
Services/Products pages
Add related post links within blog articles.
Step 4: Monitor and Wait
Coverage Report: Check for indexing errors
Sitemaps: Submit your sitemap if you haven't
Be Patient: Google can take 3-7 days (sometimes up to 4 weeks)
Advanced Optimization Tips
Use Structured Data
Help Google understand your content better with schema markup:
```typescript
// In blog post pages
export default function BlogPost({ post }) {
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'BlogPosting',
    headline: post.title,
    image: post.featuredImage,
    author: {
      '@type': 'Person',
      name: post.author,
    },
    datePublished: post.publishedAt,
    dateModified: post.updatedAt,
  }

  return (
    <>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
      <article>{/* Post content */}</article>
    </>
  )
}
```

Implement Breadcrumbs
Help Google understand your site structure:
```typescript
export default function BlogPost() {
  const breadcrumbJsonLd = {
    '@context': 'https://schema.org',
    '@type': 'BreadcrumbList',
    itemListElement: [
      {
        '@type': 'ListItem',
        position: 1,
        name: 'Home',
        item: 'https://yoursite.com',
      },
      {
        '@type': 'ListItem',
        position: 2,
        name: 'Blog',
        item: 'https://yoursite.com/blog',
      },
      {
        '@type': 'ListItem',
        position: 3,
        name: 'Article Title',
        item: 'https://yoursite.com/blog/article',
      },
    ],
  }

  return (
    <>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(breadcrumbJsonLd) }}
      />
      {/* Content */}
    </>
  )
}
```

Optimize Images
Images help pages rank better and can appear in image search:
```typescript
import Image from 'next/image'

<Image
  src="/blog-featured-image.jpg"
  alt="Descriptive alt text with keywords"
  width={1200}
  height={630}
  priority // For above-the-fold images
/>
```

Common Mistakes to Avoid
Don't block CSS/JS in robots.txt: Google needs to render your JavaScript
Avoid duplicate content: Use canonical URLs consistently
Don't spam keywords: Write natural, helpful content
Don't forget mobile: Ensure pages are mobile-friendly
Avoid thin content: Pages with <300 words may not index well
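The thin-content heuristic is easy to spot-check yourself. Here is a minimal sketch of a word counter over rendered HTML; `countWords` is a hypothetical helper, and the 300-word threshold is a rule of thumb rather than a documented Google cutoff:

```typescript
// Rough word count for rendered HTML: strip scripts, styles, and tags,
// then count whitespace-separated tokens.
function countWords(html: string): number {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ');
  return text.split(/\s+/).filter(Boolean).length;
}

const words = countWords('<p>Short page with very little content.</p>');
console.log(words, words < 300 ? 'likely too thin' : 'ok');
```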
Monitoring Your Progress
Google Search Console Metrics to Watch:
Coverage Report: Shows indexing status of all pages
Pages Report: Lists indexed vs. non-indexed pages
Sitemaps Report: Shows if sitemap was processed successfully
Performance Report: Shows which pages are getting impressions/clicks
Use These Queries in Google:
`site:yoursite.com` (all indexed pages)
`site:yoursite.com/blog` (all indexed blog pages)
`site:yoursite.com/about` (checks whether a specific page is indexed)
Timeline Expectations
Typical indexing times:
Manual submission: 1-7 days
Natural crawling: 2-4 weeks
After sitemap submission: 3-10 days
What affects speed:
Site authority (new sites take longer)
Content quality
Number of internal links
Update frequency
Technical SEO health
Conclusion
Getting your Next.js pages indexed in Google comes down to three key principles:
Make pages discoverable: Use sitemaps and internal links
Make indexing easy: Proper metadata, no blocking directives
Make content valuable: Quality content that deserves to rank
Most indexing issues stem from one of the five problems we covered. Check your robots.txt, verify your sitemap, add proper metadata, create internal links, and check for noindex tags. Then give Google time to crawl and index your pages.
Remember: SEO is a marathon, not a sprint. Focus on creating great content and following best practices, and the traffic will follow.
Have you faced indexing issues with your Next.js site? What worked for you? Share your experience in the comments below!
Last updated: February 2026

