
SEO audit for SMBs: the complete 2026 checklist

An SEO audit is the full diagnosis of your website before any positioning action. Checklist of the 7 essential blocks for SMBs in 2026.

serpixel

Key points

An SEO audit is diagnosis, not execution: The audit identifies what is wrong and why. The work of fixing it comes after, with an action plan prioritized by impact and effort.
The 7 essential blocks cover the entire ranking cycle: Crawling and indexing, on-page optimization, content quality, external authority, Schema.org structured data, Core Web Vitals, and measurement setup. Skip one and the diagnosis is incomplete.
Without Google Search Console there is no real audit: External tools show what their crawlers see. Search Console shows what Google sees. An audit that does not use Search Console works with second-hand data.
The first error to fix is usually technical, not content-related: Sites that are not crawlable, broken redirects, forgotten noindex tags, or misconfigured sitemaps: without that base, no content or linkbuilding action has measurable effect.
A serious audit delivers an action plan, not just a list of problems: Each detected problem must come with a concrete recommendation, an impact estimate, and a priority level. An audit without a plan is a report that stays in the drawer.
The site must be ready for AI search engines, not just Google: ChatGPT, Perplexity, and Gemini cite content based on structure, structured data, and factual density. A 2026 audit must check citability for generative search engines, not just Google ranking.

You have a website. You don’t know whether it is properly configured for Google, you don’t know why you are not appearing for the searches that matter, and before investing in a content or linkbuilding strategy you need to understand where you are right now. That is what an SEO audit does: diagnosis before treatment.

This checklist covers the 7 blocks any serious consultant reviews before proposing actions. Use it to run the audit on your own, or to evaluate whether the audit delivered by an external provider is complete or superficial.

Block 1: Crawling and indexing

Without crawling, there is no ranking. If Google cannot read your pages, any subsequent action is pointless. These are the points to verify:

  • robots.txt present and without accidental blocks of important sections. Check at yourdomain.com/robots.txt and cross-reference with the crawler.
  • XML sitemap generated and submitted to Search Console. It must include all canonical URLs and exclude pages with noindex.
  • noindex tag reviewed page by page. A forgotten noindex on the main landing page is the most expensive error you can make.
  • 404 errors identified in Search Console (the “Pages” indexing report, formerly “Coverage”). Pages that return 404 lose the external links and authority pointing at them.
  • Redirects clean: no chain longer than 2 hops, no redirect loops, 301 (permanent) for definitive changes and 302 (temporary) only for specific cases.
  • HTTPS correctly configured without mixed content (images or scripts served via HTTP inside HTTPS pages).
  • Canonical versions defined with <link rel="canonical"> to avoid duplicate content between versions with and without www, with and without trailing slash, and between HTTP and HTTPS.

A single problem in this block can keep a site out of Google for months without the owner noticing. It is the first place to look when rankings are not moving.
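Two of these checks can be automated with nothing but the standard library. A minimal Python sketch (the robots.txt content, URLs, and redirect data are invented for illustration; a real audit would run this against the live site):

```python
from urllib import robotparser

def check_robots(robots_txt: str, urls: list[str]) -> list[str]:
    """Return the URLs that Googlebot is NOT allowed to crawl."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch("Googlebot", u)]

def long_redirect_chains(chains: dict[str, list[str]], max_hops: int = 2) -> list[str]:
    """Flag start URLs whose redirect chain exceeds max_hops (2, per the checklist)."""
    return [start for start, hops in chains.items() if len(hops) > max_hops]

# Sample robots.txt with an intentional block, for illustration only
robots = "User-agent: *\nDisallow: /private/\n"
blocked = check_robots(robots, [
    "https://example.com/services",
    "https://example.com/private/notes",
])
print(blocked)  # only the /private/ URL should appear here
```

Running checks like these against every URL in the sitemap surfaces accidental blocks before they cost months of invisibility.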

Block 2: On-page optimization

Once Google can read the site, you have to optimize what it reads. On-page is the set of elements within each page that tell Google what it is about and why it should rank.

  • Page title unique for each page, within 50-60 characters, with the target keyword at the start.
  • Meta description of 120-160 characters, descriptive, written to be clicked (not to repeat keywords).
  • Heading structure: a single H1 per page, logical hierarchy of H2 and H3, no unjustified jumps.
  • Readable URLs: words separated by hyphens, no unnecessary parameters, no uppercase.
  • Alt text descriptive on every image, especially those that carry meaning (not decorative).
  • Internal links from each page to 3-5 thematically related pages, with natural anchor text.
  • External links to authority sources when content justifies it (cited legislation, referenced studies, official bodies).

This block can be reviewed with a crawler (Screaming Frog does it very well) which will list every page and flag anomalies. Manual review is needed to validate that titles and meta descriptions are well written, not just present.
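The length and uniqueness rules above are mechanical enough to script. A sketch of what a crawler does under the hood, with made-up page records (a tool like Screaming Frog applies the same logic at scale):

```python
def onpage_issues(pages: list[dict]) -> list[str]:
    """Flag title/meta-description length problems and duplicate titles."""
    issues = []
    titles_seen: dict[str, str] = {}
    for p in pages:
        url, title, meta = p["url"], p.get("title", ""), p.get("meta", "")
        if not 50 <= len(title) <= 60:
            issues.append(f"{url}: title length {len(title)} (target 50-60)")
        if title in titles_seen:
            issues.append(f"{url}: duplicate title (also on {titles_seen[title]})")
        titles_seen.setdefault(title, url)
        if not 120 <= len(meta) <= 160:
            issues.append(f"{url}: meta description length {len(meta)} (target 120-160)")
    return issues
```

What no script can validate is whether the title reads well and earns the click; that part stays manual.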

Block 3: Content quality and freshness

Content quality has returned to the center of Google’s ranking. Sites with outdated, thin, or duplicate content have lost positions consistently since the 2024 core updates.

  • Factual density: each page must answer concrete questions with specific data (numbers, dates, names, locations), not generic platitudes.
  • Freshness: no important page should go more than 18 months without review. Dates on the page (publishedAt, updatedAt) must be visible and real.
  • Thin content: pages under 300 words without a concrete commercial reason (category, tag, contact) are candidates to be deleted, merged, or expanded.
  • Topic coverage: for each central topic, the site should have a pillar page (2,000+ words, well structured) and 4-8 satellite pages covering subtopics.
  • Internal duplication: two different articles targeting the same primary keyword create cannibalization and reduce the visibility of both. Consolidate.

For multilingual sites, each language needs adapted content (not literal translation): keywords, FAQs, and cultural references shift by market.
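The thin-content and freshness rules above (300 words, 18 months) translate directly into a triage script. A sketch with invented page records; `word_count` and `updated_at` are assumed to come from your CMS export or crawler:

```python
from datetime import date, timedelta

def content_flags(pages: list[dict], today: date) -> list[str]:
    """Flag pages that are thin (<300 words) or stale (>~18 months unreviewed)."""
    flags = []
    stale_cutoff = today - timedelta(days=18 * 30)  # roughly 18 months
    for p in pages:
        if p["word_count"] < 300:
            flags.append(f"{p['url']}: thin content ({p['word_count']} words)")
        if p["updated_at"] < stale_cutoff:
            flags.append(f"{p['url']}: not reviewed since {p['updated_at']}")
    return flags
```

The script only produces candidates; deciding whether a thin page gets deleted, merged, or expanded is a judgment call about commercial intent.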

Block 4: External authority

External links to your site remain one of the most predictive ranking factors. The audit must review:

  • Quantity and quality of linking domains, not just the absolute number of links. 10 links from 10 thematically relevant domains beat 100 links from a single blog.
  • Anchor text distribution: most links should carry the brand name or natural URLs, not the exact target keyword repeated (an unnatural pattern that can trigger penalties).
  • Toxic links: spam domains, link farms, autogenerated content. Detectable with tools like Moz or Ahrefs (free versions are limited but useful for spotting the worst cases).
  • Mentions without a link: if the business is mentioned in media or blogs without a link to the site, it is an easy authority recovery opportunity through a request email.
  • Local citations: for local businesses, consistent presence in directories (Google Business Profile, Bing Places, sectoral directories) with the same NAP (name, address, phone).

The authority audit is the hardest one to run on your own without paid tools. If the site is new or has low visibility, this block can be deferred to a second phase.
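Even without paid tools, two of the numbers above (unique linking domains and exact-match anchor share) are simple aggregations once you have a link export. A back-of-the-envelope sketch; the link records are illustrative:

```python
from collections import Counter
from urllib.parse import urlparse

def link_profile(links: list[dict], money_keyword: str) -> dict:
    """Summarize a backlink export: unique linking domains and exact-match anchor share."""
    domains = {urlparse(l["source"]).netloc for l in links}
    anchors = Counter(l["anchor"].lower() for l in links)
    exact = anchors[money_keyword.lower()]
    return {
        "linking_domains": len(domains),
        "exact_match_share": exact / len(links) if links else 0.0,
    }
```

A high exact-match share is the unnatural pattern the checklist warns about; brand-name and bare-URL anchors should dominate.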

Block 5: Structured data

Schema.org is the language Google uses to understand what each piece of content is. Without structured data, the site ranks on plain text. With structured data, it can appear as a rich result (rich snippet, visible FAQ, ratings, opening hours).

  • Organization or LocalBusiness on the site, with name, url, logo, address, phone, and sameAs to social profiles.
  • BreadcrumbList on every internal page to help Google understand site hierarchy.
  • Article or BlogPosting on every content piece with author, datePublished, dateModified, and image.
  • FAQPage on FAQ blocks. Since 2023 Google shows FAQ rich results mainly for a limited set of authoritative sites, but the markup still helps search and AI crawlers parse each question and answer.
  • Service or Product on service and product pages, with name, description, provider.
  • Validation: every structured data validated with Google’s Rich Results Test. Errors here block the enrichment.

For generative AI search engines (ChatGPT, Perplexity, Gemini), structured data matters even more: it is the most reliable signal to know what each content piece is and whether it is worth citing as a source.
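A minimal LocalBusiness example covering the properties listed above, generated here in Python for validation; every value (name, URL, address, phone) is a placeholder, not real data:

```python
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example SMB",
    "url": "https://www.example.com",
    "logo": "https://www.example.com/logo.png",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "Carrer Major 1",
        "addressLocality": "Girona",
        "addressCountry": "ES",
    },
    "telephone": "+34 900 000 000",
    "sameAs": ["https://www.linkedin.com/company/example"],
}

# Embedded in the page as: <script type="application/ld+json">…</script>
print(json.dumps(local_business, indent=2))
```

Whatever generates this block, the output still has to pass the Rich Results Test; hand-rolled JSON-LD with a typo in `@type` fails silently otherwise.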

Block 6: Core Web Vitals and performance

Since 2021, Google uses Core Web Vitals as a ranking factor for mobile search. In 2024, FID was replaced by INP. In 2026 the thresholds are demanding.

  • LCP (Largest Contentful Paint): under 2.5 seconds is good, over 4 seconds is a serious problem.
  • INP (Interaction to Next Paint): under 200 ms is good, over 500 ms is a serious problem.
  • CLS (Cumulative Layout Shift): under 0.1 is good, over 0.25 is a serious problem.
  • Total page weight: ideally under 500 KB initial load. Sites with 5-10 MB initial load will lose positions systematically.
  • Images: modern format (WebP or AVIF), responsive with srcset, lazy loading below the first fold.
  • Fonts: font-display: swap, self-hosted when possible to avoid Google Fonts dependency.
  • JavaScript: no file over 100 KB, deferred loading for anything not essential to first paint.

PageSpeed Insights gives Google’s measurement for these indicators. If the mobile score is below 80, there is urgent work. If it is below 50, the problem is structural and will not be solved without serious technical intervention.
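The three thresholds quoted above follow Google's published good / needs-improvement / poor bands and can be encoded directly:

```python
THRESHOLDS = {            # metric: (good_max, poor_min)
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless
}

def classify(metric: str, value: float) -> str:
    """Map a measured Core Web Vitals value to Google's rating band."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value > poor_min:
        return "poor"
    return "needs improvement"
```

Feed it field data (CrUX, the source PageSpeed Insights reports) rather than a single lab run; Google rates the 75th percentile of real visits.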

Block 7: Measurement and tracking

An audit without measurement configured delivers an initial diagnosis but does not allow the improvement to be validated. This block must be closed before any action.

  • Google Search Console: property verified, sitemap submitted, alerts for critical errors active.
  • Google Analytics 4: installed with Consent Mode v2 for GDPR, conversion events defined (form submitted, call, download), filters to exclude internal traffic.
  • Conversions: every value action on the site (form, call, booking, purchase) configured as a conversion event in GA4.
  • Search Console linked to GA4 to see search queries inside Analytics.
  • Periodic report: format and frequency defined. Monthly is the standard. The report must include positions for target keywords, evolution, organic conversions, and next steps.

A site without measurement is a site where SEO ROI cannot be demonstrated. Setting this block up properly is the lowest-hour, highest-impact investment of the entire process.
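For conversions that happen off-page (a phone call logged in a CRM, for example), GA4 also accepts server-side events via its Measurement Protocol. A hedged sketch of the payload shape only; the measurement ID, secret, and event name are placeholders, and most sites will simply fire the event client-side with gtag.js instead:

```python
import json

payload = {
    "client_id": "555.1234567890",   # pseudonymous visitor id from the _ga cookie
    "events": [
        {
            "name": "generate_lead",  # mark this event as a key event in GA4
            "params": {"form_id": "contact", "value": 40, "currency": "EUR"},
        }
    ],
}

# Would be POSTed to:
# https://www.google-analytics.com/mp/collect?measurement_id=G-XXXXXXX&api_secret=...
body = json.dumps(payload)
```

Whichever path the event takes, it only counts toward ROI reporting once it is flagged as a conversion (key event) inside the GA4 property.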

What happens after the audit

A serious audit does not end with a list of problems. It ends with an action plan prioritized by two criteria: expected impact and required effort.

“High impact, low effort” actions (forgotten noindex tags, missing sitemap, duplicate titles) are executed in the first week. “High impact, high effort” actions (site restructuring, pillar content creation, toxic backlink cleanup) are scheduled in quarterly cycles. “Low impact” actions are deferred until the rest is done or removed from the plan.
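The prioritization above is a simple quadrant rule, sketched here with an invented action list:

```python
def schedule(actions: list[dict]) -> dict:
    """Bucket audit actions by the impact/effort rule: quick wins first,
    heavy high-impact work in quarterly cycles, low impact deferred."""
    plan = {"first week": [], "quarterly cycles": [], "deferred": []}
    for a in actions:
        if a["impact"] == "low":
            plan["deferred"].append(a["name"])
        elif a["effort"] == "low":
            plan["first week"].append(a["name"])
        else:
            plan["quarterly cycles"].append(a["name"])
    return plan
```

The value is not in the code but in forcing every finding through the same two questions before it reaches the plan.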

Without that prioritization, the audit stays as a report and changes nothing the business measures.

Citability for AI search engines: the new 2026 block

ChatGPT, Perplexity, Gemini, and Google’s AI Overviews cite content as a source when three conditions are met: clear semantic structure, high factual density, and self-contained passages.

The 2026 audit must include:

  • Reviewing whether the site has an /llms.txt file that orients models toward the most useful content.
  • Verifying that important passages are written to be cited without context (each paragraph must make sense in isolation).
  • Checking that FAQs are present as plain HTML text on the page, not rendered only by JavaScript, which AI crawlers do not execute.
  • Reviewing mentions of the business in sources the models train on (Wikipedia, media, topical communities).

This block is what separates a contemporary audit from one that repeats the 2018 playbook.
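llms.txt is an emerging convention rather than a ratified standard, so treat the following as a minimal sketch of the usual shape (an H1, a blockquote summary, then annotated link lists); the business name and URLs are placeholders:

```markdown
# Example SMB

> Plumbing services in Girona. The pages below are the most useful entry
> points for understanding our services, pricing, and coverage area.

## Services

- [Boiler repair](https://www.example.com/boiler-repair): scope, pricing, response times
- [Emergency callouts](https://www.example.com/emergencies): 24/7 coverage area

## Company

- [About](https://www.example.com/about): team, certifications, contact details
```

The file lives at the site root (/llms.txt), parallel to robots.txt, and simply points models at the content worth reading first.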

How serpixel integrates the SEO audit

The SEO audit is part of the Web + SEO product from serpixel (Clever European Business, S.L.), an AI agent implementation agency for SMBs based in Catalonia, Spain. When a new client enters the Web + SEO product, the first month covers the 7 blocks described, delivers a prioritized plan, and implements urgent fixes before the monthly content and tracking cadence begins.

The client’s human team keeps judgment over the business, ambiguous decisions, and customer relationships. The mechanical layer (crawling, Schema.org validation, Core Web Vitals monitoring, report generation) is covered by the combination of tools and agents serpixel implements from day one. The goal is not to replace human work; it is to free it from what is repetitive so the team focuses on what brings real value.

The process starts with a 30-minute discovery session, no commitment. If all you need right now is the audit, without entering the monthly product, we say so in the same session and evaluate whether it makes sense.


Shall we talk? Book the session and we will tell you what to audit first on your site, before spending a single euro on positioning actions.

Tags

seo audit · seo checklist · seo for smbs · Google ranking · Search Console · Core Web Vitals

Frequently asked questions

How long does a complete SEO audit take?

For a website under 50 pages, between 8 and 15 hours of focused work. For larger sites (e-commerce with hundreds of pages, multilingual sites), it can reach 30-60 hours. If someone sells you a complete audit in 2 hours, they are delivering an automated report without human interpretation.

How often should the audit be repeated?

A complete annual audit is the minimum recommended. Light quarterly reviews (Core Web Vitals, Search Console errors, indexation health) prevent problems from accumulating. After a major change (migration, redesign, restructuring), an immediate audit is required.

Can you run the audit yourself?

A basic 5-point audit (unique titles, mobile-friendly, reasonable speed, working forms, fresh content), yes. An audit covering crawling, structured data, Core Web Vitals, and Search Console correlation requires specific training and paid tools.

Which free tools do you need?

Google Search Console (free), Google Analytics 4 (free), PageSpeed Insights (free), a crawler (Screaming Frog, free up to 500 URLs), and a Schema.org validator (Schema.org Validator and Rich Results Test, both free). With these 5 tools you can cover 80% of the audit without paying anything.

How is an SEO audit different from competitor analysis?

An internal audit focuses on your site. Competitor analysis is a complementary task done before defining the strategy: which keywords competitors dominate, which pages have the most links, what content structure works in the sector. Both exercises are needed, but they are different exercises with different methods.

Why does citability for AI search engines matter?

Generative search engines (ChatGPT, Perplexity, Gemini, Google's AI Overviews) select content based on clear semantic structure, clean structured data, and self-contained passages with verifiable data. Content that ranks on Google but is not structured to be cited as a source loses a growing share of the traffic.

What do you do if the site has a Google penalty?

A manual penalty appears in Search Console as a “manual action”. You document the practice that triggered it (artificial linkbuilding, thin content, a badly handled reorganization), fix it, and submit a reconsideration request through Search Console. The process usually takes between 2 and 8 weeks, with no guarantees.

Is the audit sold as a standalone service?

The SEO audit is part of serpixel's Web + SEO product, not a separately invoiced service. When a new client enters the product, the first month includes a full audit, a prioritized action plan, and technical implementation of urgent fixes. If the client only wants the audit without ongoing work, we say so in the discovery session.
