JavaScript SEO

JavaScript SEO is the discipline of ensuring search engines can crawl, render, and index content delivered via JavaScript. The rise of single-page applications, client-side frameworks, and JavaScript-heavy CMS templates has made this one of the most common failure points in modern technical SEO.

How Googlebot handles JavaScript

Googlebot uses a two-pass indexing system:

  1. First pass. The HTML is fetched, and any content already present in the initial server response is processed for indexing straight away.
  2. Second pass (rendering). The page is queued for rendering by the Web Rendering Service, which runs the JavaScript and produces the final DOM. The rendered output is then processed for indexing.

The rendering queue is not instant. The delay between the first pass and the second pass is typically minutes to days, depending on the site’s authority and crawl budget. For time-sensitive content, JavaScript-only rendering means longer delays before content is fully indexed.

Other crawlers (Bingbot, Yandex, AI crawlers) have varying levels of JavaScript support. AI crawlers in particular often do not execute JavaScript at all, meaning client-side-rendered content may be effectively invisible to them.

Rendering models

Static rendering. Pages are generated at build time and served as fully-formed HTML. Best for SEO; Googlebot sees everything in the first pass; AI crawlers can extract content without rendering. Examples: Astro, Next.js Static Generation, Hugo, Jekyll.

Server-side rendering (SSR). Pages are generated on each request and served as fully-formed HTML. Equivalent to static rendering for SEO purposes. Examples: Next.js SSR, Nuxt SSR, Remix.
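
To make the contrast concrete, here is a minimal sketch using Next.js’s pages-router data-fetching APIs; the CMS endpoint and page are illustrative assumptions. With getStaticProps the fetch runs once at build time (static rendering); renaming it to getServerSideProps runs the same fetch on every request (SSR). Either way, the content is present in the HTML of the first response a crawler receives.

    // pages/pricing.tsx -- illustrative sketch using Next.js pages-router APIs.
    // getStaticProps runs at build time; swapping it for getServerSideProps
    // would run the same fetch on every request. Both put the content into
    // the initial HTML.
    import type { GetStaticProps } from "next";

    type PricingData = { headline: string; plans: string[] };

    // Placeholder for whatever CMS or database call the site actually uses.
    async function fetchPricing(): Promise<PricingData> {
      const res = await fetch("https://cms.example.com/pricing");
      return res.json();
    }

    export const getStaticProps: GetStaticProps<{ data: PricingData }> = async () => {
      return { props: { data: await fetchPricing() } };
    };

    export default function PricingPage({ data }: { data: PricingData }) {
      // Serialised to HTML on the server; no client-side JavaScript is needed to see it.
      return (
        <main>
          <h1>{data.headline}</h1>
          <ul>
            {data.plans.map((plan) => (
              <li key={plan}>{plan}</li>
            ))}
          </ul>
        </main>
      );
    }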

Hybrid / Incremental Static Regeneration. A mix of static and SSR. Most pages are pre-built; some are generated on demand. Equivalent to static for SEO when configured correctly.

Client-side rendering (CSR). Pages are served as a minimal HTML shell with JavaScript that fetches data and renders the content in the browser. Worst for SEO; the initial HTML contains no content; rendering depends on JavaScript execution by the crawler. Examples: traditional Create React App, Vue with no SSR.
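
For contrast, a client-side-rendered version of the same page (sketched below against the same hypothetical endpoint) ships only an app shell; the content exists nowhere in the server response and appears only after the fetch completes in the browser.

    // A client-side-rendered version of the same page. The initial HTML is just
    // an empty app shell (e.g. <div id="root"></div>); the plans only exist in
    // the DOM after the fetch in useEffect resolves, so a crawler that does not
    // execute JavaScript indexes the loading state instead of the content.
    import { useEffect, useState } from "react";

    type PricingData = { headline: string; plans: string[] };

    export default function PricingPage() {
      const [data, setData] = useState<PricingData | null>(null);

      useEffect(() => {
        fetch("https://cms.example.com/pricing")
          .then((res) => res.json())
          .then(setData);
      }, []);

      if (!data) return <main>Loading…</main>; // all a non-rendering crawler sees

      return (
        <main>
          <h1>{data.headline}</h1>
          <ul>
            {data.plans.map((plan) => (
              <li key={plan}>{plan}</li>
            ))}
          </ul>
        </main>
      );
    }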

Dynamic rendering. The server detects bot user agents and serves a pre-rendered HTML version, while users get the client-side version. Google once recommended this approach but now treats it as a workaround rather than a long-term solution. Use it only as a transitional measure.
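
The pattern looks roughly like the Express middleware sketch below; the bot list and the prerender endpoint are illustrative assumptions rather than a canonical configuration.

    // Illustrative dynamic-rendering middleware: known bots receive a
    // pre-rendered snapshot, everyone else gets the normal client-side app.
    import express from "express";

    const BOT_PATTERN = /googlebot|bingbot|yandex|duckduckbot|gptbot|perplexitybot/i;
    const app = express();

    app.use(async (req, res, next) => {
      const userAgent = req.headers["user-agent"] ?? "";
      if (!BOT_PATTERN.test(userAgent)) return next(); // humans get the SPA

      try {
        // Fetch a pre-rendered snapshot of the requested URL from a prerender service.
        const target = `https://www.example.com${req.originalUrl}`;
        const snapshot = await fetch(
          `https://prerender.internal/render?url=${encodeURIComponent(target)}`
        );
        res.status(snapshot.status).type("html").send(await snapshot.text());
      } catch (err) {
        next(err);
      }
    });

    app.use(express.static("dist")); // the client-side app for regular visitors

    app.listen(3000);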

For new builds, default to static or SSR. Client-side rendering is acceptable for authenticated dashboards and tools that don’t need to be indexed; it is not appropriate for content pages.

Common JavaScript SEO problems

Content rendered after user interaction. Googlebot does not click, hover, or otherwise interact with a page, so content that only appears after a click, scroll, or hover is never rendered. Tabbed content, “load more” buttons, accordions, and infinite scroll all need to be implemented so that the underlying content is present in the initial render.
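
One dependable pattern is to render every panel into the initial HTML and use JavaScript only to toggle visibility, rather than fetching a panel’s content when it is clicked. A sketch in React terms:

    // All tab panels are rendered into the HTML; JavaScript only toggles which
    // one is visible. A crawler reading the raw HTML still sees every panel.
    import { useState } from "react";

    type Tab = { id: string; label: string; content: string };

    export function Tabs({ tabs }: { tabs: Tab[] }) {
      const [activeId, setActiveId] = useState(tabs[0]?.id);

      return (
        <div>
          <div role="tablist">
            {tabs.map((tab) => (
              <button
                key={tab.id}
                role="tab"
                aria-selected={tab.id === activeId}
                onClick={() => setActiveId(tab.id)}
              >
                {tab.label}
              </button>
            ))}
          </div>
          {tabs.map((tab) => (
            // Hidden panels are still present in the server-rendered HTML.
            <section key={tab.id} role="tabpanel" hidden={tab.id !== activeId}>
              {tab.content}
            </section>
          ))}
        </div>
      );
    }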

Async content fetched from APIs. When content is fetched client-side from an API, the initial HTML doesn’t include it. Googlebot’s second pass may eventually capture it, but with delay and reduced reliability. Server-render or pre-render this content.

JavaScript-generated meta tags. Title tags, meta descriptions, and canonical tags injected by JavaScript can be missed or mis-prioritised. Render these server-side.
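
In frameworks that support it, these tags can be emitted in the server response directly. For example, Next.js’s App Router lets a route export its metadata, which is rendered into the <head> of the initial HTML; a minimal sketch:

    // app/pricing/page.tsx -- the exported metadata is rendered into the <head>
    // of the server response, so crawlers see the title, description, and
    // canonical without executing any JavaScript.
    import type { Metadata } from "next";

    export const metadata: Metadata = {
      title: "Pricing – Example Co",
      description: "Plans and pricing for Example Co.",
      alternates: { canonical: "https://www.example.com/pricing" },
    };

    export default function PricingPage() {
      return <h1>Pricing</h1>;
    }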

Hash-based URLs (/#/page). Fragment URLs are not separate pages from a crawler perspective. They aren’t indexed independently. Use the History API to produce real URLs.
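
In client-side code the difference looks roughly like the sketch below; renderProductPage stands in for the app’s own view logic, and the server is assumed to be able to respond to the real path directly.

    // Hash routing: only the fragment changes, so every "page" shares one URL
    // (/#/...) from a crawler's point of view.
    location.hash = "#/products/widget";

    // History API routing: a real path that can be linked, crawled, and
    // server-rendered. renderProductPage is an assumed application function.
    declare function renderProductPage(path: string): void;

    function navigateTo(path: string) {
      history.pushState({}, "", path); // updates the address bar without a reload
      renderProductPage(path);         // client-side render for in-app navigation
    }

    window.addEventListener("popstate", () => renderProductPage(location.pathname));

    navigateTo("/products/widget");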

Render-blocking JavaScript. Heavy synchronous JavaScript in the page head delays rendering, hurting both Core Web Vitals and Googlebot’s ability to process the page within reasonable time bounds.

Infinite scroll without pagination. Content loaded only through scroll triggers is hard for crawlers to access. Implement true pagination URLs underneath the infinite scroll behaviour.
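
One approach is progressive enhancement over real paginated URLs: a “next page” link exists in the HTML for crawlers, and a script intercepts it to append results and keep the address bar in sync. A sketch, with the link selector and markup as assumptions:

    // Progressive enhancement over real pagination. The <a class="next-page">
    // link (e.g. to /blog/page/2) is in the HTML for crawlers; this script
    // loads the next page when the link scrolls into view.
    const nextLink = document.querySelector<HTMLAnchorElement>("a.next-page");

    if (nextLink) {
      const observer = new IntersectionObserver(async ([entry]) => {
        if (!entry.isIntersecting) return;

        const nextUrl = nextLink.href;
        const html = await (await fetch(nextUrl)).text();
        const doc = new DOMParser().parseFromString(html, "text/html");

        // Append the next page's items and keep the URL in sync with what is shown.
        document.querySelector(".post-list")
          ?.append(...doc.querySelectorAll(".post-list > *"));
        history.replaceState({}, "", nextUrl);

        // Point the link at the following page, or stop at the last one.
        const following = doc.querySelector<HTMLAnchorElement>("a.next-page");
        if (following) nextLink.href = following.href;
        else observer.disconnect();
      });
      observer.observe(nextLink);
    }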

Debugging JavaScript rendering

Search Console URL Inspection. Shows the rendered HTML and screenshot for any URL, as Googlebot saw it. Compare to what users see; differences indicate rendering problems.

Rich Results Test. Renders the page with Googlebot’s rendering infrastructure and reports what it sees, making it useful for spot-checking individual URLs. (The standalone Mobile-Friendly Test, which offered the same rendering check, was retired by Google in late 2023.)

Crawl with JavaScript rendering enabled. Screaming Frog and Sitebulb both support JavaScript-rendered crawling. Compare HTML-only crawls to JavaScript-rendered crawls; large differences indicate content that is invisible to non-rendering crawlers.

Disable JavaScript in your browser. Open the page with JavaScript disabled. What’s visible is approximately what AI crawlers and basic indexers see. If critical content disappears, that content is not reaching them.
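
The same check can be scripted: fetch the raw HTML the way a non-rendering crawler would and look for a phrase that ought to be indexable. A small sketch for Node 18+ run as an ES module; the URL and phrase are placeholders.

    // Fetch the raw HTML (no JavaScript execution) and check whether a phrase
    // that should be indexable is present in the server response.
    const url = "https://www.example.com/pricing";
    const phrase = "Plans and pricing";

    const response = await fetch(url, {
      headers: { "user-agent": "raw-html-check/1.0" },
    });
    const html = await response.text();

    console.log(
      html.includes(phrase)
        ? "Phrase found in the initial HTML – visible without rendering."
        : "Phrase missing from the initial HTML – only a rendering crawler will see it."
    );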

Hydration and selective rendering

Modern frameworks increasingly support partial or selective hydration: rendering most of a page server-side and only hydrating specific interactive components on the client. Astro Islands, React Server Components, and similar approaches make it possible to deliver mostly-static HTML with interactive enhancements where needed.

For SEO, the goal is to maximise the content visible in the initial server response. Interactive components (forms, menus, dynamic widgets) can be client-rendered without affecting indexing as long as the underlying content they enhance is server-rendered.
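
In React Server Components terms the pattern looks roughly like the sketch below: the article body is a server component delivered as plain HTML, and only a small widget is marked as a client component and hydrated. The file layout and Next.js-style setup are assumptions.

    // app/article/page.tsx -- server component: rendered to HTML, never hydrated.
    import { ShareButton } from "./ShareButton";

    export default function ArticlePage() {
      return (
        <article>
          <h1>How hydration affects SEO</h1>
          <p>The body text here is plain server-rendered HTML that every crawler can read.</p>
          {/* Only this island ships JavaScript to the browser. */}
          <ShareButton url="https://www.example.com/article" />
        </article>
      );
    }

    // app/article/ShareButton.tsx -- the one interactive island.
    "use client";

    export function ShareButton({ url }: { url: string }) {
      return (
        <button onClick={() => navigator.clipboard.writeText(url)}>
          Copy link
        </button>
      );
    }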

AI crawlers handle JavaScript inconsistently. GPTBot historically did not render JavaScript; OAI-SearchBot has improved capability but is not a full browser. ClaudeBot, PerplexityBot, and most others fall on the same spectrum.

The practical implication: content that depends on JavaScript rendering is at risk of being invisible to AI search citation, even if Googlebot eventually renders it. For sites prioritising AI-search visibility, server-rendering is more important than ever.

Frequently asked questions

Can Googlebot render React? Yes. Googlebot uses a recent version of Chromium and can execute modern JavaScript including React, Vue, and Angular. The constraints are timing (rendering happens in a separate pass, not in real time) and reliability (rendering occasionally fails for individual pages).

Should I server-render dynamic dashboards? Generally no. Dashboards behind authentication don’t need to be indexed; client-side rendering is appropriate. Server-render content pages and let the dashboard remain client-side.

Does using noscript tags help? Marginally. <noscript> content is processed by Google but should be a fallback, not a primary content delivery mechanism. The right answer is to render the actual content server-side.