AI-built pages often ship with plausible markup but weak production details: duplicate titles, missing descriptions, stale sitemap entries, preview domains in canonicals, and Open Graph images that work locally but not after deploy.
Start with canonical truth
Pick one production origin and make every signal agree with it. HTML
canonicals, sitemap URLs, Open Graph URLs, llms.txt links, and internal
navigation should all point at the same host and follow the same
trailing-slash convention.
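One way to sketch this check: collect every URL your pages emit and flag any that disagree with the production origin. The origin and URLs below are hypothetical, and the helper name is an assumption, not a real tool.

```python
from urllib.parse import urlsplit

# Hypothetical production origin; substitute your real one.
PRODUCTION_ORIGIN = "https://example.com"

def origin_mismatches(urls):
    """Return URLs whose scheme or host differs from the production origin."""
    want = urlsplit(PRODUCTION_ORIGIN)
    bad = []
    for url in urls:
        parts = urlsplit(url)
        if (parts.scheme, parts.netloc) != (want.scheme, want.netloc):
            bad.append(url)
    return bad

# Signals gathered from one page: canonical, Open Graph URL, sitemap entry.
signals = [
    "https://example.com/pricing",          # HTML canonical
    "https://preview.example.dev/pricing",  # OG URL left on a preview domain
    "https://example.com/pricing",          # sitemap entry
]
print(origin_mismatches(signals))  # → ['https://preview.example.dev/pricing']
```

A stricter version would also normalize paths and flag mixed trailing-slash usage across the same route.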
Check crawl access
Fetch robots.txt and sitemap.xml from the public origin. The sitemap
should list only URLs you want indexed. Robots rules should protect private
areas without blocking the homepage, public assets, sitemap, or agent-facing
discovery files.
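The robots rules can be verified offline with the standard library's parser, feeding it the fetched file's lines. The rules and host below are illustrative, not a recommended policy.

```python
from urllib.robotparser import RobotFileParser

# A fetched robots.txt, shown inline for the example.
robots_lines = """\
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_lines)

# The homepage, sitemap, and agent-facing discovery files should stay
# fetchable; private areas should not.
for path in ("/", "/sitemap.xml", "/llms.txt", "/admin/settings"):
    print(path, rp.can_fetch("*", "https://example.com" + path))
```

Running the same loop against your real robots.txt catches the common failure mode where a broad Disallow meant for a staging area also blocks the sitemap or llms.txt.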
Check metadata coverage
Titles and descriptions need to be route-specific. If an AI site generator cloned one layout across every page, treat duplicated metadata as a launch blocker. Add Open Graph tags with absolute image URLs before sharing links.
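Duplicated metadata is easy to detect mechanically: extract the title and description from each route and group routes that share both. This sketch uses only the standard library; the sample pages are hypothetical.

```python
from html.parser import HTMLParser

class MetaScraper(HTMLParser):
    """Collect <title> text and the meta description from one HTML document."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def scrape(html):
    p = MetaScraper()
    p.feed(html)
    return p.title.strip(), p.description.strip()

# Hypothetical output of a site generator that cloned one layout everywhere.
pages = {
    "/": "<title>Acme</title><meta name='description' content='Welcome'>",
    "/pricing": "<title>Acme</title><meta name='description' content='Welcome'>",
}
seen = {}
for route, html in pages.items():
    seen.setdefault(scrape(html), []).append(route)
duplicates = {meta: routes for meta, routes in seen.items() if len(routes) > 1}
print(duplicates)  # → {('Acme', 'Welcome'): ['/', '/pricing']}
```

Any group with more than one route is the launch blocker described above.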
Check evidence, not screenshots
Screenshots prove the page looks finished to a human. Technical SEO evidence comes from response headers, HTML source, sitemaps, robots policy, and fetch results. Run the robots and sitemap checker or the full readiness scan to collect those signals.
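As a rough illustration of what "evidence" means here, the checks run over fetched signals, not rendered pixels. The record structure and field names below are assumptions for the sketch, not a real scanner's schema.

```python
# Hypothetical already-fetched signals for one route.
page = {
    "status": 200,
    "headers": {
        "content-type": "text/html; charset=utf-8",
        "x-robots-tag": "noindex",  # header-level block screenshots never show
    },
    "canonical": "https://example.com/pricing",
    "in_sitemap": True,
}

def evidence_issues(page):
    """Flag indexing problems visible only in headers and crawl signals."""
    issues = []
    if page["status"] != 200:
        issues.append(f"non-200 status: {page['status']}")
    if "noindex" in page["headers"].get("x-robots-tag", ""):
        issues.append("X-Robots-Tag header blocks indexing")
    if page["in_sitemap"] and not page["canonical"]:
        issues.append("listed in sitemap but missing a canonical")
    return issues

print(evidence_issues(page))  # → ['X-Robots-Tag header blocks indexing']
```

The example page would pass any visual review while being silently excluded from the index, which is exactly why headers and fetch results, not screenshots, are the evidence that matters.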