Technical SEO Engineering
Forensic engineering for how search engines crawl, parse, and index a site — schema, sitemaps, Core Web Vitals, canonicals, redirects. Code in your repo, not a PDF.
Technical SEO is engineering, not marketing. The work is auditing how Googlebot reaches a site, measuring what it parses, diagnosing what breaks, and shipping the fix as code in the client repository with tests attached. Content strategy and link building are different practices and explicitly out of scope here.
I audit the crawl log before I touch a template. I measure Core Web Vitals from Chrome User Experience Report data and Lighthouse before and after every change. The deliverable is a 301 map, a JSON-LD schema graph, a sitemap generator, or a Core Web Vitals dashboard — named artifacts that live in the repo, not a slide deck that lives on a shelf.
What I ship
- JSON-LD schema graphs. Organization, Article, BreadcrumbList, FAQ, Service, Product, and LocalBusiness types, validated against the Rich Results Test and monitored for regressions after every deploy.
- Database-driven sitemap engineering. Multi-sitemap indexes, priority and changefreq tuned per content tier, low-value pages filtered at the SQL layer instead of generated and then hidden.
- Core Web Vitals remediation. Largest Contentful Paint, Interaction to Next Paint, and Cumulative Layout Shift diagnosed from Chrome User Experience Report field data, then fixed inside the build pipeline — not patched at the edge.
- Crawl-budget discipline. Server log analysis, robots.txt hardened against faceted-URL explosions, pagination and canonical policy that keeps Googlebot on pages that earn revenue.
- Canonical and redirect architecture. 301 maps for replatforms, a written rel=canonical policy across www and non-www, http and https, and trailing-slash variants, and hreflang for multi-locale sites that need it.
- Search Console integration. Rich-result triage, structured-data regression alerts, and post-deploy verification wired into the release process.
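The schema-graph pattern in the first bullet can be sketched as a single `@graph` whose nodes reference each other by `@id` instead of duplicating data. Everything below — names, URLs, the choice of node types — is an illustrative placeholder, not a client implementation:

```python
import json

def schema_graph(site_url: str, org_name: str,
                 page_url: str, headline: str) -> str:
    """Build one JSON-LD @graph where Organization, WebSite, and
    Article nodes cross-reference by @id rather than repeating data."""
    org_id = f"{site_url}#organization"
    site_id = f"{site_url}#website"
    graph = {
        "@context": "https://schema.org",
        "@graph": [
            {"@type": "Organization", "@id": org_id,
             "name": org_name, "url": site_url},
            {"@type": "WebSite", "@id": site_id,
             "url": site_url, "publisher": {"@id": org_id}},
            {"@type": "Article", "@id": f"{page_url}#article",
             "headline": headline, "mainEntityOfPage": page_url,
             "isPartOf": {"@id": site_id},
             "publisher": {"@id": org_id}},
        ],
    }
    return json.dumps(graph, indent=2)
```

The template wraps the output in a single `<script type="application/ld+json">` tag; one graph per page keeps the `@id` references resolvable without a second parse.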
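The canonical-and-redirect bullet implies one deterministic rule per host, scheme, and trailing-slash variant. A minimal sketch of that normalization, with the preferred host as an assumed parameter rather than any real policy:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url: str, preferred_host: str = "www.example.com") -> str:
    """Collapse http/https, www/non-www, and trailing-slash variants
    onto one canonical form: https, the preferred host, no trailing
    slash except at the root. Query strings pass through untouched;
    faceted-parameter stripping would be a separate, explicit rule."""
    parts = urlsplit(url)
    host = parts.netloc.lower()
    bare = preferred_host.removeprefix("www.")
    if host in (bare, "www." + bare):
        host = preferred_host
    path = parts.path or "/"
    if len(path) > 1 and path.endswith("/"):
        path = path.rstrip("/")
    return urlunsplit(("https", host, path, parts.query, ""))
```

The same function serves both the `rel=canonical` tag and the 301 layer, which is the point: one rule, written down, applied everywhere.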
Where it fits
Replatform traffic drop
A site moved from one CMS to another. Organic traffic fell thirty percent inside sixty days. The cause is almost always mechanical: a broken 301 map, lost canonicals, schema that no longer validates, or a sitemap pointing at URLs the new platform now serves as 404s. I audit the redirect chain, diagnose the schema regressions, fix them in the repo, and measure the recovery in Search Console crawl stats and impressions.
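The redirect-chain audit can be sketched as a pure check over a 301 map (old path to new path, a hypothetical dict): flag chains longer than one hop and loops, since both waste crawl budget and dilute the signal a clean single-hop 301 carries.

```python
def audit_redirect_map(redirects: dict[str, str]) -> dict[str, list]:
    """Given a 301 map {source: target}, report chains (a target that
    is itself a source, forcing multi-hop redirects) and loops (a path
    that eventually redirects back into itself). A loop is reported
    once per entry point into it."""
    chains, loops = [], []
    for src in redirects:
        seen, cur = [src], redirects[src]
        while cur in redirects:
            if cur in seen:
                loops.append(seen + [cur])
                break
            seen.append(cur)
            cur = redirects[cur]
        else:
            if len(seen) > 1:  # more than one hop to a final target
                chains.append(seen + [cur])
    return {"chains": chains, "loops": loops}
```

In practice the map comes out of the replatform spreadsheet or the web server config; the fix is to rewrite every chained source to point straight at the final target.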
Rich results disappeared
The Search Console rich-results report turned red. Google tightened a validator and the JSON-LD that worked last quarter no longer validates. I read the actual validator output, map it back to the current Schema.org spec, fix it at the template layer, and verify clean validation across the next crawl cycle.
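The triage step — read what the page actually ships, then compare it to the spec — can be sketched with the standard library alone. The required-property set below is an illustrative subset, not the full Schema.org or Google documentation requirements:

```python
import json
import re

# Illustrative subset of properties expected on an Article node;
# the real check reads the current documented requirements.
REQUIRED = {"Article": {"headline", "datePublished", "author"}}

def find_schema_gaps(html: str) -> list[str]:
    """Pull every JSON-LD block out of a page and report typed nodes
    that are missing any property listed in REQUIRED."""
    problems = []
    pattern = r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>'
    for block in re.findall(pattern, html, re.DOTALL):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            problems.append("unparseable JSON-LD block")
            continue
        nodes = data.get("@graph", [data]) if isinstance(data, dict) else data
        for node in nodes:
            required = REQUIRED.get(node.get("@type", ""), set())
            missing = required - node.keys()
            if missing:
                problems.append(f"{node['@type']} missing {sorted(missing)}")
    return problems
```

Run against rendered HTML, not the template source, so the check sees what Googlebot sees.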
Page experience flagged as Poor
Search Console shows Largest Contentful Paint and Interaction to Next Paint failing on mobile across a meaningful share of URLs. I pull the Chrome User Experience Report field data, isolate the offending routes, diagnose render-blocking resources or hydration cost, and ship the fix inside the build pipeline so it survives the next release.
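Triage of field data starts from the published Core Web Vitals thresholds — LCP 2.5 s / 4 s, INP 200 ms / 500 ms, CLS 0.1 / 0.25 — applied to the p75 value per route. A minimal classifier over p75 values (the real workflow pulls these from the Chrome User Experience Report; the flat dict shape here is a simplification of that response):

```python
# Published Core Web Vitals thresholds: (good ceiling, poor floor).
# LCP and INP in milliseconds; CLS is unitless.
THRESHOLDS = {
    "largest_contentful_paint": (2500, 4000),
    "interaction_to_next_paint": (200, 500),
    "cumulative_layout_shift": (0.1, 0.25),
}

def classify(metric: str, p75: float) -> str:
    """Bucket one p75 field value into good / needs improvement / poor."""
    good, poor = THRESHOLDS[metric]
    if p75 <= good:
        return "good"
    if p75 <= poor:
        return "needs improvement"
    return "poor"

def triage(p75s: dict[str, float]) -> dict[str, str]:
    """Classify every metric for one route. Any single 'poor' metric
    is enough to flag the route for remediation."""
    return {metric: classify(metric, value) for metric, value in p75s.items()}
```

Running this per route isolates which templates fail and on which metric, which is what decides whether the fix is render-blocking resources (LCP), hydration cost (INP), or late-loading layout (CLS).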
How I work
Every engagement opens with a written audit. Crawl-log sample, Search Console errors, schema output for the top page types, Core Web Vitals on real URLs measured on real hardware, sitemap structure, and the current redirect map. The audit and a prioritized fix list ship before any code changes. The principal carrying the work is described on the about page.
Fixes land as reviewable pull requests with descriptions a non-engineer can follow. Tests run in the pull request, not promised for later. Where the engagement touches programmatic content at scale, I bring patterns that already run in production — sitemap exclusion at the SQL layer, canonical groupings across content tiers, crawl-budget protection on faceted URLs. The patterns on this page are running on the page you are reading, and several are written up in the research notes.
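The SQL-layer sitemap exclusion mentioned above can be sketched with an in-memory SQLite stand-in: low-value pages never reach the XML because the WHERE clause filters them out, rather than being generated and then hidden behind noindex. Table name, columns, and the word-count cutoff are all illustrative assumptions:

```python
import sqlite3
from xml.sax.saxutils import escape

def build_sitemap(conn: sqlite3.Connection, base: str) -> str:
    """Emit <urlset> XML for indexable pages only. Thin or noindexed
    pages are excluded in SQL, so they never enter the sitemap at all."""
    rows = conn.execute(
        """SELECT path, lastmod FROM pages
           WHERE indexable = 1 AND word_count >= 200
           ORDER BY path"""
    ).fetchall()
    entries = "".join(
        f"<url><loc>{escape(base + path)}</loc>"
        f"<lastmod>{lastmod}</lastmod></url>"
        for path, lastmod in rows
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f"{entries}</urlset>")
```

The same query predicate becomes the content tier's contract: change the threshold in one place and the sitemap, not a cleanup script, enforces it on the next generation.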
What I will not promise
I will not promise a ranking, a position, or a percentage improvement in organic visits. Search results are owned by the search engine, not by the engineer who fixes the plumbing. The contract is the artifact — the schema graph, the sitemap generator, the redirect map, the Core Web Vitals dashboard — and the measured before-and-after on the signals the engine actually consumes. The ranking that follows from clean signals is the search engine's call, not mine.
Engagement model
Audit-only engagements run two to three weeks and deliver a written report with a prioritized fix list and effort estimates. Audit-plus-remediation engagements run six to ten weeks, longer for replatforms or multi-locale sites. Retainer arrangements cover monthly Search Console triage, schema regression checks, and post-deploy verification for teams shipping continuously. To scope an audit, get in touch.
Technical SEO is the engineering sibling to the primary AI consulting practice on this site. If a site also needs work on how it is read by AI crawlers and answer engines, that is the separate AI SEO and Generative Search Optimization service — a different pipeline with different signals, scoped on its own.