JavaScript rendering optimization for search crawlers


Optimizing JavaScript Rendering for Search Engines (JavaScript SEO)

Googlebot and other search engines handle JavaScript differently. Google executes JS through headless Chrome (WRS — Web Rendering Service), but with a delay: the source HTML is processed immediately, while JS rendering goes into a separate queue. For SEO this means that content which exists only in the DOM after JS execution may be indexed days or weeks later.

How Google Handles JavaScript

Stage 1: Crawling — Googlebot downloads the source HTML.

Stage 2: Rendering queue — page is queued for JS rendering. Delay: from hours to weeks.

Stage 3: Indexing after render — content after JS execution enters the index.

Yandex, Bing, DuckDuckGo — handle JS with significant limitations or don't render it at all, so critical content should not depend on client-side rendering.

Problem Diagnosis

Google Search Console → URL Inspection → View Crawled Page — shows the final DOM after Googlebot's rendering. If the important content appears there but still isn't indexed, rendering is not the problem.

Screaming Frog with JS rendering enabled — compare the HTML source against the rendered DOM.

# Compare source HTML with the rendered DOM (curl vs Puppeteer)
curl -s https://site.com/page | grep -c "product-title"

# vs — save the script below as rendered-count.js and run: node rendered-count.js
# (a separate file avoids the shell expanding $$ inside node -e "double quotes")
const puppeteer = require('puppeteer');
(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://site.com/page', { waitUntil: 'networkidle0' });
  const count = await page.$$eval('.product-title', els => els.length);
  console.log(count);
  await browser.close();
})();

Solution 1: SSR (Server-Side Rendering)

SSR renders pages on the server, so the bot receives ready-made HTML and doesn't need to execute JS.

Next.js:

// getServerSideProps — render on each request
export async function getServerSideProps({ params }) {
  const product = await fetchProduct(params.id)
  return { props: { product } }
}

// getStaticProps — build at deploy (faster)
export async function getStaticProps({ params }) {
  const product = await fetchProduct(params.id)
  return {
    props: { product },
    revalidate: 3600  // ISR: update every hour
  }
}

Nuxt.js:

// nuxt.config.ts
export default defineNuxtConfig({
  ssr: true,  // enable SSR
})

// composable
const { data } = await useFetch(`/api/products/${id}`)

Solution 2: SSG (Static Site Generation)

For content with infrequent updates, SSG provides better performance:

// Next.js ISR (Incremental Static Regeneration)
export async function getStaticPaths() {
  const products = await fetchAllProducts()
  return {
    paths: products.map(p => ({ params: { id: String(p.id) } })),  // route params must be strings
    fallback: 'blocking'  // new pages render on first request
  }
}

Solution 3: Prerendering for SPA without SSR

If SSR can't be implemented, prerendering generates static HTML snapshots:

# nginx: serve prerendered HTML to bots, SPA to humans
map $http_user_agent $is_bot {
    ~*(googlebot|bingbot|yandex|baiduspider|facebookexternalhit) 1;
    default 0;
}

server {
    location / {
        if ($is_bot = 1) {
            proxy_pass http://prerender-service:3000;
            break;
        }
        try_files $uri /index.html;
    }
}
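
The same user-agent check can also live in application code instead of nginx. A minimal sketch mirroring the map above, using only the Node standard library (the prerender host and routing function are illustrative, not part of any framework):

```javascript
// Mirror of the nginx map: user-agents treated as search/social bots
const BOT_RE = /googlebot|bingbot|yandex|baiduspider|facebookexternalhit/i;

function isBot(userAgent) {
  return BOT_RE.test(userAgent || '');
}

// Decide where a request should go (prerender host is hypothetical)
function routeRequest(userAgent, path) {
  return isBot(userAgent)
    ? `http://prerender-service:3000${path}` // prerendered HTML for bots
    : '/index.html';                         // SPA shell for humans
}
```

In an Express or plain `http` server this would sit in front of the static handler, forwarding bot requests to the prerender service and serving the SPA shell to everyone else.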

Critical JS SEO Rules

Content in source HTML > content in JS

<!-- Bad: title generated only by JS -->
<title>Loading...</title>
<script>document.title = fetchProductTitle()</script>

<!-- Good: title in source HTML -->
<title>iPhone 15 Pro - buy in Moscow | Shop.ru</title>

Links in <a href>, not just onClick:

// Bad: no href, bot won't find link
<span onClick={() => navigate('/product/42')}>Product</span>

// Good
<a href="/product/42">Product</a>

Lazy loading and SEO:

// Intersection Observer + lazy loading: content below the fold
// Googlebot doesn't scroll, so content injected on scroll may never be rendered or indexed

// Solution: use loading="lazy" for images only,
// not for text content
<img src="product.jpg" loading="lazy" alt="Product">
// Text below the fold — don't lazy-load it

JSON-LD instead of microdata in attributes:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "iPhone 15 Pro",
  "offers": {
    "@type": "Offer",
    "price": "99999",
    "priceCurrency": "RUB"
  }
}
</script>
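
Generating the JSON-LD server-side from the same product record that renders the page keeps markup and visible content in sync. A minimal sketch (the field names on `product` are assumptions about your data model):

```javascript
// Build a schema.org Product JSON-LD string from a product record
function productJsonLd(product) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    offers: {
      '@type': 'Offer',
      price: String(product.price),  // schema.org expects price as a string
      priceCurrency: product.currency,
    },
  });
}

// Embed in the page template:
// <script type="application/ld+json">${productJsonLd(product)}</script>
```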

Verification via Google Rich Results Test

Check structured data interactively at https://search.google.com/test/rich-results. The tool has no documented public API, so for automated checks validate the JSON-LD structure in CI (e.g. with the Schema.org validator) and use the web tool for spot checks.

Core Web Vitals and JavaScript

A long TBT (Total Blocking Time) caused by heavy JS hurts the Page Experience signal:

// Find long tasks (>50 ms) via PerformanceObserver
// (longtask entries are delivered to observers, not via getEntriesByType)
new PerformanceObserver((list) => {
  for (const task of list.getEntries()) {
    console.warn(`Long task: ${task.duration}ms at ${task.startTime}`);
  }
}).observe({ type: 'longtask', buffered: true });
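
Once long tasks are identified, a common mitigation is to split heavy synchronous loops into batches so no single task blocks the main thread for more than ~50 ms. A minimal sketch under that assumption (the function and parameter names are illustrative):

```javascript
// Process items in small batches, yielding to the event loop between batches
// so rendering and input handling aren't blocked by one long task.
async function processInChunks(items, chunkSize, handle) {
  const results = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    for (const item of items.slice(i, i + chunkSize)) {
      results.push(handle(item));
    }
    // Yield: lets the browser paint and handle input between chunks
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return results;
}
```

In browsers that support it, `scheduler.yield()` is a more precise way to yield than `setTimeout(0)`, but the chunking idea is the same.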

Timeline

JS SEO audit + SSR/ISR implementation for Next.js — 3–7 business days depending on application size.