SEO Audits Guide
Lighthouse SEO audits check whether search engines can find, crawl, and understand your pages. Unlike Core Web Vitals, which measure user experience on a continuous scale, SEO audits are binary: each one either passes or fails.
Important distinction: Google doesn't use Lighthouse scores for rankings. John Mueller has confirmed that Google uses Core Web Vitals data from real users (via CrUX), not lab scores. Lighthouse SEO audits are diagnostic tools—they help you find problems, but the score itself isn't a ranking factor.
A page can score 100 on Performance yet be invisible to Google because of a single misconfigured robots directive.
What Lighthouse SEO Measures
Lighthouse SEO focuses on technical SEO fundamentals—the stuff that makes or breaks whether your content even enters the index. It doesn't evaluate content quality, keyword optimization, or backlinks. Those matter, but they're meaningless if crawlers can't access your pages.
SEO Score Weight: Every audit in the SEO category is weighted equally, so a single failure drops your score significantly.
Why SEO Audits Matter
Crawlability = Visibility
Google can't rank what it can't see. A noindex directive, broken canonical, or malformed robots.txt removes your page from search entirely. No amount of great content fixes that.
Research from Ahrefs shows 96.55% of pages get zero traffic from Google. Technical SEO issues are often the first barrier—fix those before worrying about content strategy.
Since July 5, 2024, mobile-first indexing is 100% universal—Google only uses the smartphone crawler. If your mobile version has less content or different robots directives than desktop, that's what gets indexed.
Indexability Problems are Silent
The frustrating part: you won't know something's wrong. Your site loads fine for users. Analytics looks normal (for the traffic you do get). But Google's crawler sees a brick wall.
Lighthouse catches these invisible issues before they cost you months of lost rankings.
Compound Effects
One bad canonical implementation affects every page using that template. One misconfigured hreflang breaks international SEO across your entire site. Technical SEO issues scale with your content.
All SEO Audits
Lighthouse runs 8 SEO audits. Each is pass/fail—no partial scores.
| Audit | What It Checks | Why It Matters |
|---|---|---|
| Page isn't blocked from indexing | noindex directives, robots.txt blocking | Blocked pages can't appear in search results |
| Document has a valid rel=canonical | Canonical URL validity, conflicts | Invalid canonicals confuse Google about the "real" URL |
| Document has a meta description | Meta description presence | Missing descriptions hurt click-through rates |
| Document has a valid hreflang | Language/region alternate links | Invalid hreflang breaks international targeting |
| Page has successful HTTP status code | HTTP 2xx response | 4xx/5xx errors prevent indexing |
| Links are crawlable | Anchor href validity | Uncrawlable links block PageRank flow |
| Links have descriptive text | Link anchor text quality | Generic text ("click here") hurts SEO signals |
| robots.txt is valid | robots.txt syntax | Malformed robots.txt confuses all crawlers |
High Priority: Blocking Issues
These audits check whether your page can be indexed at all:
Page isn't blocked from indexing — Checks for noindex in meta tags, X-Robots-Tag headers, or robots.txt disallow rules. Failure means Google won't index the page. Note: noindex still wastes crawl budget—Google crawls the page before seeing the tag.
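The meta-tag form of this directive is easy to check offline. A minimal sketch, using a hypothetical HTML snippet, that flags a robots meta tag containing noindex:

```shell
# Sample <head> markup as a crawler might receive it (hypothetical page).
html='<head><meta name="robots" content="noindex, nofollow"></head>'

# Flag the page if a robots meta tag carries a noindex directive.
# Note: the directive can also arrive as an X-Robots-Tag response header,
# which this HTML-only check won't see.
if printf '%s' "$html" | grep -qi '<meta name="robots"[^>]*noindex'; then
  echo "blocked: noindex meta tag found"
else
  echo "indexable"
fi
```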
Page has successful HTTP status code — Checks for 4xx or 5xx responses. Pages returning errors aren't indexed.
robots.txt is valid — Parses your robots.txt for syntax errors. A malformed file might accidentally block your entire site.
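For reference, a minimal well-formed file plus a crude line-shape check. The file contents here are hypothetical, and Lighthouse's parser is far more thorough—this only catches lines that aren't in `Field: value` form:

```shell
# A minimal, valid robots.txt (hypothetical contents).
robots='User-agent: *
Disallow: /admin/
Sitemap: https://example.com/sitemap.xml'

# Crude sanity check: every non-blank, non-comment line should look like "Field: value".
printf '%s\n' "$robots" | grep -vE '^(#|$)' | grep -vE '^[A-Za-z-]+:' \
  && echo "possible syntax errors" || echo "looks well-formed"
```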
Medium Priority: Ranking Signals
These affect how well your page ranks and appears in results:
Document has a valid rel=canonical — Checks for conflicting canonicals, invalid URLs, or relative paths. Wrong canonicals can consolidate ranking to the wrong page.
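You can spot-check a canonical the same way the audit does: extract the href and confirm it's an absolute URL (relative paths fail the audit). A sketch against a hypothetical snippet:

```shell
# Hypothetical page markup containing a canonical link.
html='<link rel="canonical" href="https://example.com/guide/">'

# Pull out the canonical target; the second grep only accepts absolute URLs,
# so a relative path would produce empty output here.
canonical=$(printf '%s' "$html" \
  | grep -o 'rel="canonical" href="[^"]*"' \
  | grep -o 'https\?://[^"]*')
echo "$canonical"
```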
Document has a meta description — Checks presence and content. Missing descriptions mean Google auto-generates snippets—often poorly.
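Presence is easy to verify offline, and checking length at the same time is useful, since Google typically truncates snippets at roughly 155–160 characters. The audit itself only checks presence; the length report here is extra. A sketch against a hypothetical tag:

```shell
# Hypothetical meta description tag.
html='<meta name="description" content="Learn how Lighthouse SEO audits work and how to fix failures.">'

# Extract the content attribute and report its length.
desc=$(printf '%s' "$html" \
  | grep -o 'name="description" content="[^"]*"' \
  | sed 's/.*content="//;s/"$//')
echo "${#desc} chars: $desc"
```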
Links have descriptive text — Flags generic anchor text like "click here" or "read more." Descriptive text helps Google understand link context.
Lower Priority: Edge Cases
These matter for specific use cases:
Document has a valid hreflang — Only relevant for multi-language sites. Checks for valid language codes and fully-qualified URLs.
Links are crawlable — Checks for javascript:void(0), empty hrefs, or invalid URLs. Impacts PageRank flow and content discovery.
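A quick way to surface uncrawlable anchors is to extract every href and filter for the patterns the audit flags: `javascript:` pseudo-URLs, empty hrefs, and bare fragments. A sketch with hypothetical markup:

```shell
# Hypothetical page markup: one uncrawlable anchor, one crawlable one.
html='<a href="javascript:void(0)">Open menu</a> <a href="/docs/">Docs</a>'

# List hrefs a crawler can't follow; "|| true" keeps a clean run from failing.
printf '%s\n' "$html" \
  | grep -o 'href="[^"]*"' \
  | grep -E 'href="(javascript:[^"]*|#?)"' || true
```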
How to Measure SEO
Lighthouse (Per-Page)
Run Lighthouse in Chrome DevTools or via the CLI:

```shell
lighthouse https://example.com --only-categories=seo
```
Check the SEO section for failing audits. Click each for specific elements causing failures.
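If you add `--output=json --output-path=report.json` to the CLI invocation, the report can be queried with jq. A sketch using a trimmed, hypothetical report with the same shape as real Lighthouse JSON output:

```shell
# Trimmed stand-in for a real Lighthouse JSON report (same structure, fewer fields).
cat > report.json <<'EOF'
{"categories":{"seo":{"score":0.85}},
 "audits":{"is-crawlable":{"score":0,"title":"Page is blocked from indexing"}}}
EOF

# Overall SEO score (0-1 scale).
jq -r '.categories.seo.score' report.json

# Titles of failing audits (score 0).
jq -r '.audits | to_entries[] | select(.value.score == 0) | .value.title' report.json
```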
PageSpeed Insights
Enter your URL at PageSpeed Insights. The SEO section shows the same Lighthouse lab audits; field data applies to Core Web Vitals, not SEO.
Google Search Console
For production sites, Search Console is the authority:
- The Page indexing report (formerly Coverage) shows indexing status
- Core Web Vitals shows real user data
- URL Inspection tests specific pages
Manual Crawler Checks
Test how Googlebot sees your page:
```shell
curl -A "Googlebot" https://example.com/
```
Check response headers for X-Robots-Tag:
```shell
curl -I https://example.com/ | grep -i x-robots
```
Common SEO Issues by Template Type
Different page types have different failure patterns:
| Template | Common Issues | Quick Check |
|---|---|---|
| Homepage | Missing meta description, incorrect canonical | Usually fine |
| Blog posts | No canonical, generic link text | Check CMS output |
| Product pages | Blocked by robots.txt, 404s for variants | Test URL patterns |
| Category pages | Paginated canonical issues, duplicate content | Check pagination |
| Search results | Should be noindexed, but often aren't | Verify intentional |
| User profiles | Accidental noindex, thin content | Often should be blocked |
Fixing SEO Issues
Most SEO failures have straightforward fixes:
- Identify the scope — Is it one page or a template affecting thousands?
- Check the source — Is the issue in HTML, headers, or robots.txt?
- Fix at the right level — Template fix vs. per-page override
- Verify the fix — Re-run Lighthouse, check Search Console
→ See all SEO issues and fixes
Test Your Entire Site
SEO issues love to hide. Your homepage passes while 500 blog posts have broken canonicals. Your main pages are fine but a template change silently added noindex to your product catalog.
Testing one URL at a time catches one problem at a time.
Unlighthouse crawls your entire site and runs SEO audits on every page. You'll see exactly which pages fail each audit, which templates are misconfigured, and whether your fixes actually work at scale.
The CLI is free and runs locally. Cloud adds scheduled monitoring to catch regressions—like when a deploy accidentally breaks your robots.txt.
Related
Troubleshooting
Fix common issues when running Lighthouse with Playwright: port conflicts, authentication problems, flaky scores, and Chrome version mismatches.
Canonical URL
Resolve canonical tag errors that cause duplicate content issues and diluted page authority. Learn to implement rel=canonical correctly.