Audit Modules
Understand what each of our 13 modules analyzes
Technical SEO Module
The Technical SEO module examines the foundational elements that search engines rely on to crawl, index, and understand your pages. It checks the essential meta tags: the title tag (present, unique, and within the recommended 50-60 characters), the meta description (present and within the optimal 120-155 characters), and the meta robots tag (confirming you are not accidentally blocking indexing with a noindex directive).
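The meta-tag checks above can be sketched with Python's standard-library HTML parser. This is a minimal illustration, not the tool's actual implementation; the class and function names are hypothetical, and the length thresholds are the recommended ranges stated above.

```python
from html.parser import HTMLParser

TITLE_RANGE = (50, 60)   # recommended title length in characters
DESC_RANGE = (120, 155)  # recommended meta description length

class MetaAudit(HTMLParser):
    """Collect the title, meta description, and robots directives from a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self.robots = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            name = (attrs.get("name") or "").lower()
            if name == "description":
                self.description = attrs.get("content", "")
            elif name == "robots":
                self.robots = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_meta(html):
    """Return a list of human-readable issues for the basic meta-tag checks."""
    p = MetaAudit()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing <title>")
    elif not TITLE_RANGE[0] <= len(p.title) <= TITLE_RANGE[1]:
        issues.append(f"title length {len(p.title)} outside {TITLE_RANGE}")
    if p.description is None:
        issues.append("missing meta description")
    elif not DESC_RANGE[0] <= len(p.description) <= DESC_RANGE[1]:
        issues.append(f"description length {len(p.description)} outside {DESC_RANGE}")
    if p.robots and "noindex" in p.robots.lower():
        issues.append("page blocked from indexing via noindex")
    return issues
```

A real audit would also check title uniqueness across the whole site, which requires crawling more than one page.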
This module also validates your robots.txt file, checking that it is accessible, properly formatted, and not blocking important resources like CSS, JavaScript, or image files that search engines need to render your pages. It examines your canonical tags to ensure every page specifies its preferred URL, preventing duplicate content issues. The module checks for a valid XML sitemap reference in your robots.txt and verifies basic URL structure for SEO friendliness.
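The robots.txt checks can be approximated with `urllib.robotparser` from the standard library. A sketch under assumptions: the function name is hypothetical, and sitemap references are pulled out with simple line matching.

```python
from urllib.robotparser import RobotFileParser

def check_resource_access(robots_txt, resource_urls, agent="Googlebot"):
    """Return (blocked_resources, sitemap_urls) for a robots.txt body.

    blocked_resources lists CSS/JS/image URLs the rules deny to `agent`,
    which search engines need in order to render the page.
    """
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    blocked = [url for url in resource_urls if not rp.can_fetch(agent, url)]
    sitemaps = [line.split(":", 1)[1].strip()
                for line in robots_txt.splitlines()
                if line.lower().startswith("sitemap:")]
    return blocked, sitemaps
```

If the returned sitemap list is empty, the audit would flag the missing XML sitemap reference described above.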
Security headers are another key component. The module verifies HTTPS enforcement, checks for proper HTTP-to-HTTPS redirects, and examines security headers like Content-Security-Policy, X-Frame-Options, and Strict-Transport-Security. Pages served over HTTP or with mixed content warnings are flagged as critical issues, as Google has confirmed HTTPS is a ranking signal and users increasingly distrust non-secure sites.
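A header audit of this kind reduces to comparing response headers against a severity table. The sketch below is illustrative only: the severity assignments and function name are assumptions, not the module's actual rules.

```python
# Hypothetical severity table for the headers named above.
REQUIRED_HEADERS = {
    "strict-transport-security": "critical",
    "content-security-policy": "warning",
    "x-frame-options": "warning",
}

def audit_security_headers(url, headers):
    """Return (severity, message) findings for a fetched page.

    `headers` is the response-header mapping; lookup is case-insensitive
    because HTTP header names are case-insensitive.
    """
    findings = []
    if url.startswith("http://"):
        findings.append(("critical", "page served over HTTP, not HTTPS"))
    present = {name.lower() for name in headers}
    for header, severity in REQUIRED_HEADERS.items():
        if header not in present:
            findings.append((severity, f"missing {header} header"))
    return findings
```

Detecting mixed content (HTTPS pages loading HTTP sub-resources) additionally requires scanning the rendered page, which is out of scope for this sketch.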
Content Quality Module
The Content Quality module evaluates your page content against Google's E-E-A-T (Experience, Expertise, Authoritativeness, Trustworthiness) framework. It analyzes the depth, structure, and readability of your text content to determine whether it meets the quality threshold that modern search engines expect. The module checks word count relative to the content type, heading structure (verifying logical H1 through H6 hierarchy), and paragraph length for readability.
Readability scoring is based on established formulas that estimate how accessible your content is to your target audience. The module flags content that is excessively complex for a general audience or too thin to adequately cover the topic. It also checks for the presence of author information, publication dates, and update timestamps, all of which contribute to E-E-A-T signals that Google's quality raters evaluate.
The module examines internal linking within the content, checking that you link to related pages on your site and that anchor text is descriptive rather than generic ('click here'). It also evaluates whether your content includes elements that demonstrate first-hand experience, such as original images, specific examples, and detailed how-to instructions. Pages with high E-E-A-T scores tend to rank better for competitive queries, especially in YMYL (Your Money or Your Life) categories.
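The generic-anchor-text check described above can be expressed as a simple lookup against a stoplist. The list below is a hypothetical sample, not the module's actual stoplist.

```python
# Hypothetical stoplist of non-descriptive anchor phrases.
GENERIC_ANCHORS = {"click here", "here", "read more", "learn more", "this page"}

def flag_generic_anchors(links):
    """links: iterable of (anchor_text, href) pairs from the page body.

    Returns the hrefs whose anchor text is generic and should be
    rewritten to describe the linked page.
    """
    return [href for text, href in links
            if text.strip().lower() in GENERIC_ANCHORS]
```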
AI Crawler Module
The AI Crawler module is what sets SEO Audit My Website apart from traditional SEO tools. It specifically checks whether your site is accessible and optimized for the new generation of AI crawlers including GPTBot (OpenAI/ChatGPT), ClaudeBot (Anthropic/Claude), and PerplexityBot (Perplexity AI). These crawlers index your content for use in AI-generated answers, conversational search, and retrieval-augmented generation systems.
The module first checks your robots.txt for rules that affect AI crawler user agents. Many sites inadvertently block GPTBot or ClaudeBot through broad disallow rules targeting unknown bots. It then checks for the presence of an llms.txt file at your domain root, an emerging standard that provides AI systems with a machine-readable summary of your site's purpose, key pages, and content structure. The module validates the format and completeness of your llms.txt if one exists.
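For reference, a minimal llms.txt might look like the sketch below, following the emerging format (an H1 title, a blockquoted one-line summary, then sections of markdown link lists). The URLs and descriptions are placeholders.

```markdown
# Example Site

> A one-sentence summary of what the site offers and who it is for.

## Key pages

- [Pricing](https://example.com/pricing): Plans and billing details
- [Docs](https://example.com/docs): Product documentation and guides

## Optional

- [Blog](https://example.com/blog): Company news and updates
```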
Content structure analysis evaluates how well your page content is formatted for AI extraction. The module checks for clear question-and-answer patterns, structured data that AI systems can parse (FAQ, HowTo, and Product schemas), and content organization that facilitates accurate quoting. It also evaluates whether your content includes verifiable facts, named entities, and cited sources that AI systems can cross-reference. Each finding includes a specific recommendation for improvement, from adding missing schema types to restructuring content sections for better AI readability.
Schema Validation Module
The Schema Validation module inspects all JSON-LD structured data on your page and validates it against the Schema.org vocabulary and Google's specific structured data requirements. It identifies every `<script type="application/ld+json">` block on the page, parses the JSON content, and checks for syntax errors, missing required properties, and incorrect value types.
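The extraction-and-parse step can be sketched with the standard-library HTML parser, which delivers `<script>` contents through `handle_data`. The class and function names here are hypothetical.

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collect the raw text of every <script type="application/ld+json"> block."""
    def __init__(self):
        super().__init__()
        self.blocks = []
        self._capture = False

    def handle_starttag(self, tag, attrs):
        if tag == "script" and dict(attrs).get("type") == "application/ld+json":
            self._capture = True
            self.blocks.append("")

    def handle_endtag(self, tag):
        if tag == "script":
            self._capture = False

    def handle_data(self, data):
        if self._capture:
            self.blocks[-1] += data

def parse_jsonld(html):
    """Return (parsed_objects, syntax_errors) for all JSON-LD blocks on a page."""
    extractor = JsonLdExtractor()
    extractor.feed(html)
    parsed, errors = [], []
    for i, raw in enumerate(extractor.blocks):
        try:
            parsed.append(json.loads(raw))
        except json.JSONDecodeError as e:
            errors.append(f"block {i}: {e.msg} at line {e.lineno}")
    return parsed, errors
```

Blocks that fail to parse are reported as syntax errors; only the blocks that parse move on to the property-level checks.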
For each schema type found, the module checks whether all required properties are present and correctly formatted. For example, an Article schema must include headline, author, datePublished, and image properties to be eligible for rich results in Google Search. A Product schema needs name, image, and at least one Offer with price and priceCurrency. The module flags missing recommended properties separately from missing required properties, so you can prioritize fixes that affect rich result eligibility first.
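Separating required from recommended properties amounts to two lookup tables per schema type. The tables below are a small hypothetical sample; Google's actual requirements live in its structured-data documentation and change over time.

```python
# Hypothetical requirement tables for two schema types.
REQUIRED = {
    "Article": ["headline", "author", "datePublished", "image"],
    "Product": ["name", "image", "offers"],
}
RECOMMENDED = {
    "Article": ["dateModified", "publisher"],
    "Product": ["description", "aggregateRating"],
}

def check_schema(node):
    """Split missing properties into required vs recommended for one entity,
    so fixes affecting rich result eligibility can be prioritized first."""
    schema_type = node.get("@type")
    missing_required = [p for p in REQUIRED.get(schema_type, []) if p not in node]
    missing_recommended = [p for p in RECOMMENDED.get(schema_type, []) if p not in node]
    return missing_required, missing_recommended
```

A fuller validator would also check value types (e.g. that `datePublished` is a date, not free text), which the next set of checks covers.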
The module also checks for common implementation mistakes that can prevent your structured data from being processed correctly. These include: multiple conflicting schema types on a single page, nested entities that reference undefined @id values, date formats that do not follow ISO 8601 standards, and URL properties that point to non-existent pages. Results include a severity rating (error, warning, or info) and a direct link to the relevant Google documentation for each finding.
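Two of those mistake classes can be illustrated with short standard-library checks: validating ISO 8601 dates via `datetime.fromisoformat`, and finding `@id` references that no entity in the graph defines. The function names are hypothetical, and the reference check is deliberately shallow (it only looks at top-level properties).

```python
from datetime import datetime

def iso8601_ok(value):
    """True if value parses as an ISO 8601 date or datetime.
    The trailing-'Z' rewrite covers Python versions before 3.11,
    whose fromisoformat does not accept 'Z' directly."""
    try:
        datetime.fromisoformat(value.replace("Z", "+00:00"))
        return True
    except ValueError:
        return False

def dangling_ids(entities):
    """Return @id values referenced by some entity but defined by none."""
    defined = {e["@id"] for e in entities if "@id" in e}
    referenced = set()
    for entity in entities:
        for value in entity.values():
            # A bare {"@id": ...} dict is a reference to another entity.
            if isinstance(value, dict) and set(value) == {"@id"}:
                referenced.add(value["@id"])
    return referenced - defined
```

Checking that URL properties resolve to live pages would additionally require HTTP requests, so it is omitted here.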