
Apr 14, 2026
SEO for progressive web apps: what actually breaks and how to fix it
Most teams build a progressive web app (PWA), a website that behaves like a native app, installable and capable of working offline, and assume the SEO will sort itself out. It won’t. The same JavaScript-heavy architecture that makes a PWA feel fast and app-like is exactly what can make it invisible to search engines if you don’t handle rendering deliberately. This article covers the specific decisions that determine whether your PWA ranks or disappears.
A PWA is not inherently bad for SEO. The rendering strategy behind it usually is.
Why PWAs create SEO problems most teams don’t expect
PWAs cause SEO problems not because of the technology itself, but because the default build path prioritises the user experience in the browser over what a crawler sees on first contact. Googlebot can execute JavaScript, but it does so in a second wave, often delayed by seconds or longer, and any content that only exists after that execution may be indexed late, partially, or not at all.
The web.dev PWA documentation is clear that PWAs are compatible with search indexing, but compatibility is not the same as being optimised for it. The gap between those two things is where rankings go quiet.
Service workers, the background scripts that enable offline functionality and push notifications in a PWA, can intercept network requests in ways that confuse crawlers. If a service worker serves a cached shell to Googlebot instead of the full page content, the crawler indexes an empty frame. That is not a hypothetical. It happens on real projects, and it tends to go unnoticed until a ranking drop surfaces weeks later.
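One defensive pattern is to route navigation requests network-first, so crawlers and first-time visitors always receive fresh, full HTML, and to reserve cache-first behaviour for static assets. A minimal sketch follows; it assumes a generic hand-written service worker, and the helper name `cacheStrategyFor` is illustrative, not part of any framework API.

```javascript
// Hypothetical service worker sketch (no framework assumed):
// navigations go network-first so crawlers never receive a stale app shell;
// only static assets are served cache-first.
const STATIC_ASSET = /\.(js|css|png|svg|woff2)$/;

// Pure helper, so the routing decision can be tested outside a worker.
function cacheStrategyFor(request) {
  if (request.mode === 'navigate') return 'network-first'; // full HTML pages
  if (STATIC_ASSET.test(new URL(request.url).pathname)) return 'cache-first';
  return 'network-only'; // APIs and everything else: never serve from cache
}

// Register the fetch handler only inside a real service worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('fetch', (event) => {
    if (cacheStrategyFor(event.request) === 'network-first') {
      event.respondWith(
        fetch(event.request).catch(() => caches.match(event.request))
      );
    }
  });
}
```

The key property is that a navigation request never hits the cache while the network is available, so Googlebot gets the same HTML the server would send to a browser with no service worker installed.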
The other common surprise is the app shell model. In this pattern, a minimal HTML shell loads first and JavaScript fills in the content dynamically. For users, this feels instant. For Googlebot on a first crawl, it can look like a page with no content.
Rendering strategy: the decision that controls everything else
The single most important SEO decision for a PWA is how pages are rendered: on the server before delivery, in the browser after load, or as a pre-built static file. Get this wrong and no amount of meta tag work will save you.
SSR, server-side rendering, means the server sends a fully formed HTML page to the browser and to Googlebot. The crawler sees real content immediately, no JavaScript execution required. This is the safest option for SEO, and it is what most content-heavy or e-commerce PWAs should default to.
CSR, client-side rendering, means the browser receives a near-empty HTML file and JavaScript builds the page in the browser. Googlebot can eventually render this, but the delay is real and the risk of incomplete indexing is real. In one e-commerce PWA project, switching from CSR to SSR for product pages recovered roughly 40 percent of previously unindexed URLs within six weeks of redeployment. That pattern is common enough to treat as a baseline expectation.
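The difference is easy to see side by side. The sketch below is illustrative only, with hand-rolled markup and hypothetical helper names; it contrasts what a crawler receives on first contact under each strategy.

```javascript
// Illustrative only: what the server sends for the same product page
// under SSR vs CSR. No framework assumed.

// SSR: the response already contains the content a crawler needs.
function renderProductPageSSR(product) {
  return `<!doctype html>
<html>
<head>
  <title>${product.name} | Example Shop</title>
  <meta name="description" content="${product.summary}">
</head>
<body>
  <h1>${product.name}</h1>
  <p>${product.summary}</p>
</body>
</html>`;
}

// CSR: every route returns the same near-empty shell; the content
// only exists after client-side JavaScript fetches and renders it.
function renderShellCSR() {
  return `<!doctype html>
<html>
<head><title>My App</title></head>
<body><div id="root"></div><script src="/bundle.js"></script></body>
</html>`;
}
```

Fetch both with JavaScript disabled and the difference is exactly what Googlebot sees on the first, pre-render crawl.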
Prerendering is a middle option: pages are rendered at build time and served as static HTML. It works well for content that doesn’t change often, like marketing pages or blog posts. It breaks down when content is personalised or updated frequently, because the static snapshot goes stale.
Dynamic rendering, where a server detects whether the visitor is a bot and serves a pre-rendered version specifically to crawlers, is a workaround Google has historically tolerated but does not officially recommend. Use it only if SSR or prerendering is genuinely not feasible.
Decision box
- Best if: your PWA serves content that needs to rank, you have control over the rendering layer, and your team can implement SSR or static generation
- Not ideal if: your entire app is behind a login wall and organic search traffic is not a goal
- Likely overkill when: you are building an internal tool or a fully authenticated SaaS product where no pages need to be publicly indexed

Crawlability, sitemaps, and what Googlebot actually sees
Googlebot can crawl a PWA, but it needs a clear path to follow. Without a proper sitemap and a robots.txt file that doesn’t accidentally block JavaScript or CSS assets, the crawler is guessing. Guessing is not a strategy.
Your sitemap should list every URL that needs to rank. In a PWA using client-side routing, routes are often defined in JavaScript and never exist as real server paths. If those routes don’t resolve to actual server responses, they won’t be crawled. This is one of the most common gaps in PWA SEO audits: the sitemap lists URLs that return a 200 status code on the surface but serve the same empty shell regardless of which route is requested.
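One way to close that gap is to generate the sitemap from the same route definitions the client router uses, so the two can never drift apart. A minimal sketch, where the route list, the `indexable` flag, and the helper name are hypothetical:

```javascript
// Hypothetical: build sitemap XML from the client router's route table,
// so every client-side route is also declared to crawlers.
function buildSitemap(baseUrl, routes) {
  const urls = routes
    .filter((r) => r.indexable) // skip login, account pages, etc.
    .map((r) => `  <url><loc>${baseUrl}${r.path}</loc></url>`)
    .join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
${urls}
</urlset>`;
}
```

Each listed URL still has to return real, route-specific HTML from the server; the sitemap only tells the crawler where to look, it cannot fix an empty shell.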
Test this by fetching your PWA URLs directly with a tool like Google Search Console’s URL Inspection tool. Compare what the rendered page shows against what the raw HTML source contains. If those two things look very different, you have a rendering problem, not a sitemap problem.
One practical check that saves time: disable JavaScript in your browser and load each key page. What you see is roughly what a crawler sees on first contact. If the page is blank or shows only a loading spinner, that content is not reliably indexed.
For teams working on more complex SEO setups, our SEO services cover technical audits that include PWA-specific rendering checks as a standard part of the process.
Meta tags, structured data, and the app shell problem
Meta tags in a PWA need to be present in the server-rendered HTML, not injected by JavaScript after the page loads. This sounds obvious. Yet a surprising number of live PWAs get it wrong.
The app shell model, where a static HTML frame is served and JavaScript populates the content, often means the shell contains generic or empty meta tags. The title tag says “My App” on every page. The meta description is blank. The canonical tag points to the root domain regardless of which product or article is being viewed. Googlebot indexes the shell, not the content.
The fix is to ensure that every page’s unique meta tags (title, description, canonical URL, and Open Graph tags) are either server-rendered or handled by a library like React Helmet or Vue Meta that writes them into the document head before the crawler finishes rendering. In practice, SSR makes this straightforward. With CSR, it requires deliberate effort and testing.
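Whichever library handles the injection, the underlying requirement is a per-page mapping from content to head tags. A hedged sketch of that mapping, where the field names and helper are assumptions rather than any library’s API:

```javascript
// Hypothetical helper: build the unique head tags for one page.
// In an SSR setup this string is interpolated into the server response;
// with React Helmet or Vue Meta, the same per-page data feeds the library.
function buildHeadTags(page) {
  return [
    `<title>${page.title}</title>`,
    `<meta name="description" content="${page.description}">`,
    `<link rel="canonical" href="${page.canonicalUrl}">`,
    `<meta property="og:title" content="${page.title}">`,
    `<meta property="og:url" content="${page.canonicalUrl}">`,
  ].join('\n');
}
```

The point of centralising this is that no page can ship with the shell’s generic “My App” title: every route must supply its own title, description, and canonical URL or fail loudly.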
Structured data, the JSON-LD markup that tells Google what type of content a page contains, follows the same rule. Place it in the server-rendered HTML. If it only appears after JavaScript runs, it may be picked up inconsistently. For product pages, articles, or local business pages inside a PWA, this inconsistency can mean losing rich results that competitors with simpler sites are getting by default.
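For a product page, the server-rendered JSON-LD can be as small as the sketch below. Schema.org’s `Product` and `Offer` types are real; the helper name and data shape are illustrative.

```javascript
// Hypothetical helper: emit a schema.org Product JSON-LD script tag
// as part of the server-rendered HTML, not injected after load.
function productJsonLd(product) {
  const data = {
    '@context': 'https://schema.org',
    '@type': 'Product',
    name: product.name,
    description: product.summary,
    offers: {
      '@type': 'Offer',
      price: product.price,
      priceCurrency: product.currency,
    },
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}
```

Because the markup is plain JSON inside the initial HTML response, it is visible to the crawler before any client-side JavaScript runs.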
Core Web Vitals and performance: where PWAs have a real edge
Core Web Vitals are Google’s set of performance metrics that directly affect search rankings: Largest Contentful Paint (how fast the main content loads), Interaction to Next Paint (how quickly the page responds to user input), and Cumulative Layout Shift (how much the layout jumps around during load). PWAs, when built well, can genuinely outperform traditional websites on these metrics.
The service worker caching that causes crawlability headaches for SEO also makes repeat visits extremely fast for real users. A PWA that caches assets aggressively can serve returning visitors with near-instant load times, which improves real-user Core Web Vitals data and, over time, rankings.
The catch is that first-visit performance still depends on how much JavaScript the browser has to download and execute before anything useful appears. A React-based PWA with a large bundle and no SSR can have a terrible Largest Contentful Paint score on first load, even if subsequent visits feel instant. This is the trade-off that most PWA performance conversations skip.
Preloading critical resources, code splitting to reduce initial bundle size, and using SSR to deliver meaningful HTML on first load are the three levers that actually move Core Web Vitals scores in a PWA context. Lighthouse, Google’s open-source auditing tool, will surface the specific bottlenecks for your build.
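As a rough illustration of the first two levers (all file names below are placeholders), the server-rendered head can declare the resources that gate first paint up front, while route-specific code stays split out of the initial bundle:

```html
<!-- Hypothetical head fragment: preload what blocks first paint. -->
<link rel="preload" href="/fonts/inter.woff2" as="font" type="font/woff2" crossorigin>
<link rel="preload" href="/css/critical.css" as="style">
<link rel="modulepreload" href="/js/app-shell.js">
<!-- Route chunks (e.g. /js/products.js) load via dynamic import() only
     when the user navigates, keeping the initial bundle small. -->
```

This is a sketch, not a recipe: which assets are genuinely critical is exactly what a Lighthouse run against your own build will tell you.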

Maintaining SEO in a PWA over time
PWA SEO is not a one-time setup. The architecture introduces ongoing risks that traditional CMS-based sites don’t face in the same way. Every framework update, every new route added via client-side routing, and every change to the service worker caching strategy is a potential SEO event.
The most common long-term failure pattern is this: the team gets the initial rendering setup right, ships the PWA, and then six months later a developer updates the service worker to cache more aggressively. The new caching rule accidentally serves stale or shell-only responses to Googlebot. Nobody notices until rankings drop. By then, the cause is hard to trace.
Automated monitoring of rendered page content, not just HTTP status codes, is the practical answer. Tools like Screaming Frog with JavaScript rendering enabled, or a custom script that compares raw HTML against rendered output, catch this class of problem before Google does.
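The comparison itself can be simple once you have both versions of a page, for example the raw HTML via curl and the rendered HTML via a headless browser. A hedged sketch of the check, where the helper names and the 50 percent threshold are arbitrary starting points, not established values:

```javascript
// Hypothetical monitor: given the raw server HTML and the JS-rendered
// HTML for the same URL, flag pages whose content depends on rendering.
function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop inline scripts
    .replace(/<[^>]+>/g, ' ')                   // strip remaining tags
    .replace(/\s+/g, ' ')
    .trim();
}

function renderingGap(rawHtml, renderedHtml) {
  const raw = visibleText(rawHtml).length;
  const rendered = visibleText(renderedHtml).length;
  // Arbitrary threshold: alert when rendering supplies most of the content.
  return { raw, rendered, suspicious: raw < rendered * 0.5 };
}
```

Run against every sitemap URL after each deployment, a check like this surfaces a shell-only response the day it ships, rather than weeks later via a rankings report.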
What to monitor monthly
- Google Search Console coverage report: watch for new “Discovered, currently not indexed” or “Crawled, currently not indexed” entries after any deployment
- URL Inspection spot checks: test 5-10 key URLs after every major release to confirm rendered content matches expected output
- Core Web Vitals field data: check the CrUX (Chrome User Experience Report) data in Search Console for real-user performance regressions
- Service worker change log: review any service worker updates for caching rules that could affect what crawlers receive
- Structured data validation: run key page types through Google’s Rich Results Test after template changes

SEO for progressive web apps is a solvable problem, but it requires deliberate choices at the architecture level, not just at the content level. According to Google Search Central (2024), Googlebot processes JavaScript in a deferred second wave, which means any content dependent on client-side rendering carries indexing risk by default. The rendering strategy, service worker configuration, and meta tag delivery method each carry real ranking consequences. Studio Ubique works with teams building PWAs to audit and resolve these technical SEO gaps before they become ranking problems.
FAQs
Can Google index a progressive web app?
Yes, Google can index a PWA, but the quality of that indexing depends heavily on how the app renders its content. If pages are server-rendered or pre-rendered, Googlebot sees full HTML immediately. If content only appears after client-side JavaScript runs, indexing may be delayed, partial, or inconsistent depending on how complex the rendering is.
Do service workers hurt SEO?
Service workers don’t hurt SEO by default, but they can if configured carelessly. A service worker that intercepts requests and serves a cached app shell to all visitors, including Googlebot, will cause the crawler to index an empty or near-empty page. The fix is to ensure service workers serve full content to crawlers, or to use SSR so the server response is already complete before the service worker is involved.
What is the best rendering strategy for PWA SEO?
Server-side rendering is the most reliable option for SEO in a PWA. It delivers complete HTML to Googlebot on first contact, without requiring JavaScript execution. Static generation (prerendering at build time) is a strong alternative for content that doesn’t change frequently. Client-side rendering alone carries the most indexing risk and should be avoided for pages that need to rank in search results.
How do I check what Googlebot sees on my PWA?
Use the URL Inspection tool in Google Search Console and click “Test live URL” to see the rendered version of any page. Compare the rendered output against the raw HTML source. If the rendered version contains significantly more content than the raw HTML, your pages depend on client-side JavaScript for their content, which is a risk factor for indexing. You can also disable JavaScript in your browser as a quick first check.
Does a PWA manifest file affect SEO?
The manifest.json file, which defines how a PWA appears when installed on a device, has no direct effect on search rankings. Google does not use it as a ranking signal. However, a well-configured manifest contributes to a better user experience for visitors who install the app, and user experience signals do feed into Core Web Vitals data over time. Keep the manifest accurate, but don’t treat it as an SEO lever.
Let's talk
If your PWA is live but you’re not sure what Googlebot is actually seeing, that uncertainty is worth resolving before it becomes a ranking problem.
Schedule a free 30-minute discovery call: Book a call