Check robots.txt, XML sitemaps, canonical URLs, SSL certificates, and HTTPS redirects — everything search engines need to crawl and index your site.
What you get
Check for crawl directives, sitemap references, wildcard rules, and common pitfalls that block search engines.
Detect your sitemap, validate its format, count URLs, and verify it's referenced in robots.txt.
Verify self-referencing canonicals and detect conflicts between canonical tags and redirects.
Check certificate validity, HSTS headers, mixed content issues, and HTTP-to-HTTPS redirect chains.
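The sitemap check above boils down to fetching the XML and counting `<loc>` entries. A minimal sketch of that idea, using Python's standard library and a hypothetical example.com sitemap (not a real site):

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap content; a real check would fetch this over HTTP.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>"""

# Sitemap elements live in the sitemaps.org namespace, so queries must qualify it.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(len(urls), "URLs found")
```

A malformed sitemap raises `ET.ParseError` here, which is one simple way a validator can flag a broken format.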
FAQ
A technical SEO audit checks the infrastructure of your website to ensure search engines can crawl, index, and render your pages correctly. It covers robots.txt, sitemaps, canonical tags, SSL, site speed, and more.
robots.txt tells search engine crawlers which pages they can and cannot access. A misconfigured robots.txt can accidentally block important pages from being indexed, causing them to disappear from search results.
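To illustrate how those allow/disallow rules behave, here is a small sketch using Python's built-in `urllib.robotparser` against a made-up robots.txt (the paths and user agent are examples, not recommendations):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block /admin/ for all crawlers, allow everything else.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot matches the wildcard group above.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))
print(parser.can_fetch("Googlebot", "https://example.com/admin/panel"))
```

A single stray `Disallow: /` in that file would flip every page to blocked, which is exactly the kind of pitfall an audit looks for.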
A canonical URL tells search engines which version of a page is the 'original' when duplicate or similar content exists. Without proper canonicals, search engines may split ranking signals across multiple URLs.
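In practice a canonical is just a `<link rel="canonical">` tag in the page's `<head>`. A minimal sketch of extracting it with Python's standard-library HTML parser (the page snippet and URL are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical" and self.canonical is None:
            self.canonical = a.get("href")

# Hypothetical page head with a canonical pointing at itself.
html = '<head><link rel="canonical" href="https://example.com/page"></head>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)
```

A self-referencing canonical means this extracted href matches the URL the page was fetched from; a mismatch (or a canonical pointing at a URL that redirects) is what the conflict check flags.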
Yes. Google confirmed HTTPS as a ranking signal in 2014. Sites without SSL certificates show 'Not Secure' warnings in browsers, which hurts trust and click-through rates. HTTPS is now considered a baseline requirement.
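One piece of the HTTPS check is the HSTS header, which tells browsers to always use HTTPS for the site. A sketch of parsing a `Strict-Transport-Security` response header (the header value below is example data, not fetched from a live site):

```python
def parse_hsts(header_value):
    """Return (max_age_seconds, include_subdomains) from an HSTS header value."""
    max_age, include_subdomains = None, False
    for directive in header_value.split(";"):
        directive = directive.strip().lower()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
        elif directive == "includesubdomains":
            include_subdomains = True
    return max_age, include_subdomains

# Hypothetical header: one-year policy covering subdomains.
print(parse_hsts("max-age=31536000; includeSubDomains"))
```

A missing header, or a very short `max-age`, is the kind of finding an audit would surface under the SSL/HTTPS category.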
Run a technical audit at least monthly, and always after major site changes like redesigns, migrations, or CMS updates. Our free tool makes it easy to check anytime.
Learn more
How it works
Paste any URL. We handle normalization, redirects, and protocol detection.
Content, technical SEO, performance, and social tags — scanned and scored in seconds.
Scores, issues, and specific fixes — grouped by category with actionable recommendations.
Paste your URL, get a full SEO report in seconds. No account needed.
Results are shareable via URL. 12 checks across content, technical SEO, performance, and social.