Optimizing JavaScript Rendering for Search Engines (JavaScript SEO)
Googlebot and other search engines handle JavaScript differently. Google executes JS through its Web Rendering Service (WRS), based on an evergreen headless Chromium, but with a delay: the source HTML is processed first, while JS rendering goes into a separate queue. For SEO this means content that exists only in the DOM after JS execution may be indexed days or even weeks later.
How Google Handles JavaScript
Stage 1: Crawling — Googlebot downloads the source HTML.
Stage 2: Rendering queue — the page waits for JS rendering. Delay: from minutes or hours up to weeks, depending on crawl budget.
Stage 3: Indexing after render — content after JS execution enters the index.
Yandex, Bing, and DuckDuckGo render JS far less reliably, or not at all.
Problem Diagnosis
Google Search Console → URL Inspection → View Crawled Page — shows final DOM after Googlebot rendering. If important content is there but isn't indexed — rendering isn't the problem.
Screaming Frog with JS rendering enabled — compare the HTML source against the rendered DOM.
# Compare source HTML with rendered DOM (curl vs Puppeteer)
curl -s https://site.com/page | grep -c "product-title"
# vs (single quotes around the script prevent the shell from expanding $$)
node -e '
const puppeteer = require("puppeteer");
(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto("https://site.com/page", { waitUntil: "networkidle0" });
  const count = await page.$$eval(".product-title", els => els.length);
  console.log(count);
  await browser.close();
})();
'
If curl reports 0 matches but Puppeteer finds several, the content exists only after JS execution and is at risk for non-rendering crawlers.
Solution 1: SSR (Server-Side Rendering)
SSR renders pages on the server, so the bot receives complete HTML without having to execute JS.
Next.js:
// getServerSideProps — render on each request
export async function getServerSideProps({ params }) {
  const product = await fetchProduct(params.id)
  return { props: { product } }
}

// getStaticProps — render at build time (faster)
export async function getStaticProps({ params }) {
  const product = await fetchProduct(params.id)
  return {
    props: { product },
    revalidate: 3600 // ISR: regenerate at most once per hour
  }
}
Nuxt.js:
// nuxt.config.ts
export default defineNuxtConfig({
  ssr: true, // SSR is enabled by default; set explicitly for clarity
})

// in a component: useFetch runs on the server during SSR
const { data } = await useFetch(`/api/products/${id}`)
Solution 2: SSG (Static Site Generation)
For content with infrequent updates, SSG provides better performance:
// Next.js ISR (Incremental Static Regeneration)
export async function getStaticPaths() {
const products = await fetchAllProducts()
return {
paths: products.map(p => ({ params: { id: p.id } })),
fallback: 'blocking' // new pages render on first request
}
}
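Conceptually, ISR behaves like a stale-while-revalidate cache: serve the cached page, and regenerate it in the background once the revalidate window has passed. A minimal sketch of that logic in plain JavaScript (all names here are illustrative, not Next.js internals):

```javascript
// Stale-while-revalidate cache illustrating ISR semantics.
// render(path) is any async function producing HTML for a path.
function createIsrCache(render, revalidateMs) {
  const cache = new Map(); // path -> { html, renderedAt }
  return async function get(path) {
    const entry = cache.get(path);
    const now = Date.now();
    if (!entry) {
      // First request blocks on rendering (like fallback: 'blocking')
      const html = await render(path);
      cache.set(path, { html, renderedAt: now });
      return html;
    }
    if (now - entry.renderedAt > revalidateMs) {
      // Stale: serve the cached HTML now, re-render in the background
      render(path).then(html =>
        cache.set(path, { html, renderedAt: Date.now() }));
    }
    return entry.html;
  };
}
```

Later requests are served from cache instantly, which is why bots always receive complete HTML even while a regeneration is in flight.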
Solution 3: Prerendering for SPA without SSR
If SSR can't be implemented, prerendering serves static HTML snapshots to bots. Google documents this as dynamic rendering and treats it as a workaround rather than a long-term solution:
# nginx: serve prerendered HTML to bots, SPA to humans
map $http_user_agent $is_bot {
    ~*(googlebot|bingbot|yandex|baiduspider|facebookexternalhit) 1;
    default 0;
}

server {
    location / {
        if ($is_bot = 1) {
            proxy_pass http://prerender-service:3000;
            break;
        }
        try_files $uri /index.html;
    }
}
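The same user-agent check can live in a Node middleware instead of nginx. A sketch (the regex mirrors the nginx map above; the actual proxying to the prerender service is left out, and the `prerenderRouter` name is illustrative):

```javascript
// Detect crawler user agents — same list as the nginx map above
const BOT_RE = /googlebot|bingbot|yandex|baiduspider|facebookexternalhit/i;

function isBot(userAgent) {
  return BOT_RE.test(userAgent || '');
}

// Express-style middleware: flag bot requests so a downstream
// handler can proxy them to the prerender service
function prerenderRouter(req, res, next) {
  if (isBot(req.headers['user-agent'])) {
    req.prerender = true;
  }
  next(); // humans fall through to the SPA
}
```

Keep the bot list in one place if both layers exist, otherwise nginx and the app will inevitably drift apart.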
Critical JS SEO Rules
Content in source HTML > content in JS
<!-- Bad: title generated only by JS -->
<title>Loading...</title>
<script>document.title = fetchProductTitle()</script>
<!-- Good: title in source HTML -->
<title>iPhone 15 Pro - buy in Moscow | Shop.ru</title>
Links in <a href>, not just onClick:
// Bad: no href, bot won't find link
<span onClick={() => navigate('/product/42')}>Product</span>
// Good
<a href="/product/42">Product</a>
Lazy loading and SEO:
// Googlebot renders with a tall viewport and doesn't scroll,
// so content revealed only by scroll events may never be indexed.
// Use loading="lazy" for images only; never lazy-load text content.
<img src="product.jpg" loading="lazy" alt="Product">
JSON-LD instead of microdata in attributes:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "Product",
"name": "iPhone 15 Pro",
"offers": {
"@type": "Offer",
"price": "99999",
"priceCurrency": "RUB"
}
}
</script>
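Before reaching for external tools, JSON-LD can be sanity-checked locally: extract every ld+json block and make sure it parses. A sketch (the regex assumes the exact attribute order shown above; a real parser is more robust):

```javascript
// Pull all <script type="application/ld+json"> blocks out of an HTML
// string and parse them; throws a SyntaxError on malformed JSON.
function extractJsonLd(html) {
  const re = /<script type="application\/ld\+json">([\s\S]*?)<\/script>/g;
  const blocks = [];
  for (const m of html.matchAll(re)) {
    blocks.push(JSON.parse(m[1]));
  }
  return blocks;
}
```

This catches syntax errors only; whether the markup actually qualifies for rich results still has to be checked against Google's structured data documentation.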
Verification via Google Rich Results Test
The Rich Results Test has no public API; check URLs in the web UI at https://search.google.com/test/rich-results. For a quick syntax check, the Schema.org validator at https://validator.schema.org accepts pasted markup as well.
Core Web Vitals and JavaScript
Long TBT (Total Blocking Time) from heavy JS hurts the Page Experience signal:
// Long tasks are only delivered via PerformanceObserver (Long Tasks API);
// performance.getEntriesByType('longtask') returns nothing.
// Every longtask entry is >= 50ms by definition.
new PerformanceObserver((list) => {
  list.getEntries().forEach(task => {
    console.warn(`Long task: ${task.duration}ms at ${task.startTime}`)
  })
}).observe({ type: 'longtask', buffered: true })
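A common mitigation is to split long synchronous work into batches and yield to the event loop between them, so no single task exceeds the 50 ms threshold. A sketch (the batch size is illustrative and workload-dependent):

```javascript
// Process items in small batches, yielding between batches so the
// main thread is never blocked by one long task.
async function processInChunks(items, handle, batchSize = 100) {
  for (let i = 0; i < items.length; i += batchSize) {
    items.slice(i, i + batchSize).forEach(handle);
    // Yield before the next batch; in browsers, scheduler.yield()
    // or requestIdleCallback are alternatives to setTimeout(0)
    await new Promise(resolve => setTimeout(resolve, 0));
  }
}
```

This trades a little total latency for responsiveness, which is usually the right trade for TBT and INP.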
Timeline
JS SEO audit + SSR/ISR implementation for Next.js — 3–7 business days depending on application size.