SEO Audits Guide

Master Lighthouse SEO audits. Understand what each audit checks, why it matters for search visibility, and how to pass every SEO test.
Harlan Wilton · 6 min read

Lighthouse SEO audits check whether search engines can find, crawl, and understand your pages. Unlike Core Web Vitals, which measure user experience, SEO audits are binary: you either pass or fail.

A page can score 100 on Performance yet be invisible to Google because of a single misconfigured robots directive.

What Lighthouse SEO Measures

Lighthouse SEO focuses on technical SEO fundamentals—the stuff that determines whether your content even enters the index. It doesn't evaluate content quality, keyword optimization, or backlinks. Those matter, but they're meaningless if crawlers can't access your pages.

SEO Score Weight: Every audit in the SEO category carries equal weight, so a single failure drops your score significantly. With 8 audits, one failure costs roughly 12 points.

Why SEO Audits Matter

Crawlability = Visibility

Google can't rank what it can't see. A noindex directive, broken canonical, or malformed robots.txt can remove your page from search entirely. No amount of great content fixes that.

Research from Ahrefs shows 96.55% of pages get zero traffic from Google. Technical SEO issues are often the first barrier—fix those before worrying about content strategy.

Indexability Problems Are Silent

The frustrating part: you won't know something's wrong. Your site loads fine for users. Analytics looks normal (for the traffic you do get). But Google's crawler sees a brick wall.

Lighthouse catches these invisible issues before they cost you months of lost rankings.

Compound Effects

One bad canonical implementation affects every page using that template. One misconfigured hreflang breaks international SEO across your entire site. Technical SEO issues scale with your content.

All SEO Audits

Lighthouse runs 8 SEO audits. Each is pass/fail—no partial scores.

Audit | What It Checks | Why It Matters
--- | --- | ---
Page isn't blocked from indexing | noindex directives, robots.txt blocking | Blocked pages can't appear in search results
Document has a valid rel=canonical | Canonical URL validity, conflicts | Invalid canonicals confuse Google about the "real" URL
Document has a meta description | Meta description presence | Missing descriptions hurt click-through rates
Document has a valid hreflang | Language/region alternate links | Invalid hreflang breaks international targeting
Page has successful HTTP status code | HTTP 2xx response | 4xx/5xx errors prevent indexing
Links are crawlable | Anchor href validity | Uncrawlable links block PageRank flow
Links have descriptive text | Link anchor text quality | Generic text ("click here") hurts SEO signals
robots.txt is valid | robots.txt syntax | Malformed robots.txt confuses all crawlers

High Priority: Blocking Issues

These audits check whether your page can be indexed at all:

Page isn't blocked from indexing — Checks for noindex in meta tags, X-Robots-Tag headers, or robots.txt disallow rules. Failure means Google won't index the page.

Page has successful HTTP status code — Checks for 4xx or 5xx responses. Pages returning errors aren't indexed.

robots.txt is valid — Parses your robots.txt for syntax errors. A malformed file might accidentally block your entire site.
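A rough command-line spot check for all three blocking conditions, as a sketch: it assumes curl and grep are available, uses example.com as a placeholder, and only sees the server response, not what Google renders.

URL="https://example.com/"

# 1. HTTP status: anything outside 2xx keeps the page out of the index.
curl -s -o /dev/null -w "status: %{http_code}\n" "$URL"

# 2. noindex in the served HTML (the X-Robots-Tag header check appears under Manual Crawler Checks below).
curl -s "$URL" | grep -io '<meta[^>]*name="robots"[^>]*>'

# 3. robots.txt is reachable and its directives look sane.
curl -s "https://example.com/robots.txt" | grep -iE '^(user-agent|disallow|allow|sitemap)'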

Medium Priority: Ranking Signals

These affect how well your page ranks and appears in results:

Document has a valid rel=canonical — Checks for conflicting canonicals, invalid URLs, or relative paths. Wrong canonicals can consolidate ranking to the wrong page.

Document has a meta description — Checks presence and content. Missing descriptions mean Google auto-generates snippets—often poorly.

Links have descriptive text — Flags generic anchor text like "click here" or "read more." Descriptive text helps Google understand link context.
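The canonical and meta description checks are easy to spot-check with grep against the served HTML. This is only a sketch: tags injected by client-side JavaScript won't show up, and link text quality still needs a human read.

HTML=$(curl -s "https://example.com/")

# Canonical: expect exactly one, pointing at an absolute URL.
echo "$HTML" | grep -io '<link[^>]*rel="canonical"[^>]*>'

# Meta description: no output here means Google writes the snippet for you.
echo "$HTML" | grep -io '<meta[^>]*name="description"[^>]*>'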

Lower Priority: Edge Cases

These matter for specific use cases:

Document has a valid hreflang — Only relevant for multi-language sites. Checks for valid language codes and fully-qualified URLs.

Links are crawlable — Checks for javascript:void(0), empty hrefs, or invalid URLs. Impacts PageRank flow and content discovery.
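A similar grep pass surfaces the raw material for these two audits. Again a sketch against static HTML only, with example.com as a placeholder.

HTML=$(curl -s "https://example.com/")

# hreflang alternates: each should use a valid language code and an absolute URL.
echo "$HTML" | grep -io '<link[^>]*hreflang="[^"]*"[^>]*>'

# Link hrefs Lighthouse treats as uncrawlable: javascript: URLs and empty values.
echo "$HTML" | grep -ioE 'href="(javascript:[^"]*)?"'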

How to Measure SEO

Lighthouse (Per-Page)

Run Lighthouse in Chrome DevTools or via CLI:

lighthouse https://example.com --only-categories=seo

Check the SEO section for failing audits. Click each for specific elements causing failures.
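For scripted checks, one option is JSON output filtered with jq to list the failing audit IDs (a sketch; the report structure can shift between Lighthouse versions):

lighthouse https://example.com --only-categories=seo --output=json --output-path=./seo.json --chrome-flags="--headless"

# Audit IDs that scored 0, i.e. failed.
jq -r '.audits | to_entries[] | select(.value.score == 0) | .key' seo.json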

PageSpeed Insights

Enter your URL at PageSpeed Insights. It runs the same Lighthouse SEO audits on Google's infrastructure, so results aren't skewed by your local machine. (Field data from real users only covers performance; the SEO category is lab-only.)
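If you'd rather script it, the same engine is exposed through the PageSpeed Insights API. A sketch, assuming jq is installed; heavy use needs an API key.

curl -s "https://www.googleapis.com/pagespeedonline/v5/runPagespeed?url=https://example.com&category=SEO" | jq '.lighthouseResult.categories.seo.score'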

Google Search Console

For production sites, Search Console is the authority:

  • Coverage report shows indexing status
  • Core Web Vitals shows real user data
  • URL Inspection tests specific pages

Manual Crawler Checks

Test how Googlebot sees your page:

curl -A "Googlebot" https://example.com/

Check response headers for X-Robots-Tag:

curl -sI https://example.com/ | grep -i x-robots

Common SEO Issues by Template Type

Different page types have different failure patterns:

Template | Common Issues | Quick Check
--- | --- | ---
Homepage | Missing meta description, incorrect canonical | Usually fine
Blog posts | No canonical, generic link text | Check CMS output
Product pages | Blocked by robots.txt, 404s for variants | Test URL patterns
Category pages | Paginated canonical issues, duplicate content | Check pagination
Search results | Should be noindexed, but often aren't | Verify intentional
User profiles | Accidental noindex, thin content | Often should be blocked

Fixing SEO Issues

Most SEO failures have straightforward fixes:

  1. Identify the scope — Is it one page or a template affecting thousands? (See the scoping sketch after this list.)
  2. Check the source — Is the issue in HTML, headers, or robots.txt?
  3. Fix at the right level — Template fix vs. per-page override
  4. Verify the fix — Re-run Lighthouse, check Search Console

See all SEO issues and fixes

Test Your Entire Site

SEO issues love to hide. Your homepage passes while 500 blog posts have broken canonicals. Your main pages are fine, but a template change silently added noindex to your product catalog.

Testing one URL at a time catches one problem at a time.

Unlighthouse crawls your entire site and runs SEO audits on every page. You'll see exactly which pages fail each audit, which templates are misconfigured, and whether your fixes actually work at scale.

The CLI is free and runs locally. Cloud adds scheduled monitoring to catch regressions—like when a deploy accidentally breaks your robots.txt.
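A minimal sketch of the CLI usage, assuming Node.js is available (see the Unlighthouse docs for the full set of options):

npx unlighthouse --site https://example.com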