Lighthouse SEO audits check whether search engines can find, crawl, and understand your pages. Unlike Core Web Vitals, which measure user experience, SEO audits are binary: each check either passes or fails.
A page can score 100 on Performance yet be invisible to Google because of a single misconfigured robots directive.
Lighthouse SEO focuses on technical SEO fundamentals—the stuff that makes or breaks whether your content even enters the index. It doesn't evaluate content quality, keyword optimization, or backlinks. Those matter, but they're meaningless if crawlers can't access your pages.
SEO Score Weight: every audit in the SEO category is weighted equally, so with 8 pass/fail audits a single failure drops your score by roughly 12 points.
Google can't rank what it can't see. A noindex directive, broken canonical, or malformed robots.txt removes your page from search entirely. No amount of great content fixes that.
Research from Ahrefs shows 96.55% of pages get zero traffic from Google. Technical SEO issues are often the first barrier—fix those before worrying about content strategy.
The frustrating part: you often won't know anything is wrong. Your site loads fine for users. Analytics looks normal (for the traffic you do get). But Google's crawler sees a brick wall.
Lighthouse catches these invisible issues before they cost you months of lost rankings.
One bad canonical implementation affects every page using that template. One misconfigured hreflang breaks international SEO across your entire site. Technical SEO issues scale with your content.
Lighthouse runs 8 SEO audits. Each is pass/fail—no partial scores.
| Audit | What It Checks | Why It Matters |
|---|---|---|
| Page isn't blocked from indexing | noindex directives, robots.txt blocking | Blocked pages can't appear in search results |
| Document has a valid rel=canonical | Canonical URL validity, conflicts | Invalid canonicals confuse Google about the "real" URL |
| Document has a meta description | Meta description presence | Missing descriptions hurt click-through rates |
| Document has a valid hreflang | Language/region alternate links | Invalid hreflang breaks international targeting |
| Page has successful HTTP status code | HTTP 2xx response | 4xx/5xx errors prevent indexing |
| Links are crawlable | Anchor href validity | Uncrawlable links block PageRank flow |
| Links have descriptive text | Link anchor text quality | Generic text ("click here") hurts SEO signals |
| robots.txt is valid | robots.txt syntax | Malformed robots.txt confuses all crawlers |
These audits check whether your page can be indexed at all:
- **Page isn't blocked from indexing** — checks for noindex in meta tags, X-Robots-Tag headers, or robots.txt disallow rules. Failure means Google won't index the page (see the snippets after this list).
- **Page has successful HTTP status code** — checks for 4xx or 5xx responses. Pages returning errors aren't indexed.
- **robots.txt is valid** — parses your robots.txt for syntax errors. A malformed file might accidentally block your entire site.
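For illustration, here is what a blocking directive looks like in practice; the URL and values are hypothetical, and any one of these is enough to fail the indexability audits:

```html
<!-- A noindex in the <head> tells Google to drop this page from its index -->
<meta name="robots" content="noindex">

<!-- The same directive can arrive as an HTTP response header instead:
     X-Robots-Tag: noindex -->
```

And a robots.txt with a single overly broad rule hides the whole site from crawlers:

```
# robots.txt — one stray disallow blocks every URL for every crawler
User-agent: *
Disallow: /
```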
These affect how well your page ranks and appears in results:
- **Document has a valid rel=canonical** — checks for conflicting canonicals, invalid URLs, or relative paths. The wrong canonical can consolidate ranking signals onto the wrong page (markup sketch after this list).
- **Document has a meta description** — checks presence and content. Missing descriptions mean Google auto-generates snippets, often poorly.
- **Links have descriptive text** — flags generic anchor text like "click here" or "read more." Descriptive text helps Google understand link context.
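As a sketch of what passing markup looks like (the URLs and copy are placeholders): a single absolute canonical, a meta description, and descriptive anchor text.

```html
<head>
  <!-- One canonical per page, as an absolute URL; relative or conflicting values fail the audit -->
  <link rel="canonical" href="https://example.com/blog/technical-seo-checklist">

  <!-- A unique, human-written description; without it Google auto-generates the snippet -->
  <meta name="description" content="A checklist of the technical SEO issues Lighthouse can catch before they cost you rankings.">
</head>

<!-- In the body: descriptive anchor text instead of "click here" -->
<a href="/pricing">Compare pricing plans</a>
```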
These matter for specific use cases:
- **Document has a valid hreflang** — only relevant for multi-language sites. Checks for valid language codes and fully qualified URLs (example below).
- **Links are crawlable** — checks for javascript:void(0), empty hrefs, or invalid URLs. Uncrawlable links impact PageRank flow and content discovery.
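A minimal hreflang setup for a hypothetical two-language site, plus a crawlable versus uncrawlable link for contrast:

```html
<!-- Valid language/region codes and fully qualified URLs; each page lists all of its alternates -->
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/">
<link rel="alternate" hreflang="de" href="https://example.com/de/">
<link rel="alternate" hreflang="x-default" href="https://example.com/">

<!-- Crawlable: a real href Google can follow -->
<a href="/docs/getting-started">Read the getting started guide</a>
<!-- Not crawlable: no destination for the crawler to discover -->
<a href="javascript:void(0)">Open menu</a>
```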
Run Lighthouse in Chrome DevTools or via CLI:
```bash
lighthouse https://example.com --only-categories=seo
```
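If you want results you can diff between runs, Lighthouse's standard output flags apply here as well (the file path below is just an example):

```bash
# Save the SEO-only report as JSON for later comparison
lighthouse https://example.com --only-categories=seo --output=json --output-path=./seo-report.json
```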
Check the SEO section for failing audits. Click each for specific elements causing failures.
Enter your URL at PageSpeed Insights. The SEO section reports the same lab-based audits as a local Lighthouse run (SEO checks have no field-data equivalent).
For production sites, Google Search Console is the authority: its Page indexing report shows which URLs Google actually indexed (and why others were excluded), and the URL Inspection tool shows how Googlebot rendered a specific page.
Test how Googlebot sees your page:
```bash
curl -A "Googlebot" https://example.com/
```
Check response headers for X-Robots-Tag:
```bash
curl -I https://example.com/ | grep -i x-robots
```
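A stray noindex can also hide in the served HTML rather than the headers, so it's worth grepping the body too (URL is a placeholder):

```bash
# Look for a meta robots directive in the HTML Googlebot receives
curl -s -A "Googlebot" https://example.com/ | grep -i 'name="robots"'
```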
Different page types have different failure patterns:
| Template | Common Issues | Quick Check |
|---|---|---|
| Homepage | Missing meta description, incorrect canonical | Usually fine |
| Blog posts | No canonical, generic link text | Check CMS output |
| Product pages | Blocked by robots.txt, 404s for variants | Test URL patterns |
| Category pages | Paginated canonical issues, duplicate content | Check pagination |
| Search results | Should be noindexed, but often aren't | Verify intentional |
| User profiles | Accidental noindex, thin content | Often should be blocked |
Most SEO failures have straightforward fixes:
→ See all SEO issues and fixes
SEO issues love to hide. Your homepage passes while 500 blog posts have broken canonicals. Your main pages are fine, but a template change silently added noindex to your product catalog.
Testing one URL at a time catches one problem at a time.
Unlighthouse crawls your entire site and runs SEO audits on every page. You'll see exactly which pages fail each audit, which templates are misconfigured, and whether your fixes actually work at scale.
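A minimal run looks like this, assuming Node is installed and swapping in your own domain:

```bash
# Crawl the site and run Lighthouse (SEO audits included) on every page it discovers
npx unlighthouse --site https://example.com
```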
The CLI is free and runs locally. Cloud adds scheduled monitoring to catch regressions—like when a deploy accidentally breaks your robots.txt.