Launch your first site audit by entering your domain, configuring crawl settings, and letting NitroShock analyze technical health.
Running a site audit is one of the most valuable actions you can take to understand your website's technical SEO health. With NitroShock, you can launch your first site audit by entering your domain, configuring crawl settings, and letting the platform analyze your site's technical health across 50+ SEO checkpoints. Whether you're auditing a client's site or your own WordPress installation, the process takes just a few clicks to start and delivers actionable insights you can implement immediately.
Site audits in NitroShock examine everything from page speed and Core Web Vitals to meta tags, heading structure, mobile-friendliness, and technical SEO factors that directly impact your search rankings. Unlike subscription-based tools that force you to pay monthly regardless of usage, NitroShock's credit-based system means you only pay for the audits you run, when you need them.
This guide walks you through launching your first audit, understanding the configuration options, managing credit costs, and interpreting your results.
Before running your first audit, you'll need an active project in your NitroShock account. If you haven't created a project yet, navigate to your Account Dashboard and click the Projects tab, then select + New Project to add your website.
Once you have a project set up, follow these steps to launch your first audit:
You'll see the audit configuration panel where you can customize how NitroShock crawls and analyzes your site. The most important decision at this stage is choosing your crawl scope.
NitroShock offers two primary crawl modes for site audits:
Single Page Audit: Analyzes one specific URL in depth. This option is perfect when you want to check a newly published page, verify fixes you've implemented, or get detailed metrics on a high-priority landing page. Single page audits complete quickly and use minimal credits.
Full Site Crawl: Discovers and analyzes all accessible pages on your domain. The crawler starts at your homepage or specified entry point and follows internal links to map your entire site structure. This comprehensive approach reveals site-wide issues, broken internal links, orphaned pages, and patterns in technical problems across your content.
For your first audit, a full site crawl provides the most value. You'll get a complete picture of your site's technical health and establish a baseline for tracking improvements over time.
In the Domain/URL field, enter the URL you want to audit. You can enter your homepage (like https://example.com) or any specific page as the starting point.
Make sure to include the correct protocol (`https://` or `http://`) and use the exact domain variant you want to track. NitroShock treats `www.example.com` and `example.com` as different targets, so be consistent with the version you want to monitor.
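To see why the exact variant matters, here is a minimal sketch of how a crawler that keys audit targets by scheme and host would treat the two forms as distinct (the `audit_target_key` helper is hypothetical, for illustration only; NitroShock's internal keying may differ, but the www/non-www distinction behaves the same way):

```python
from urllib.parse import urlparse

def audit_target_key(url: str) -> tuple[str, str]:
    """Return the (scheme, host) pair a crawler might use to identify a target."""
    parsed = urlparse(url)
    return (parsed.scheme, parsed.netloc)

# The two variants produce different keys, so they are tracked separately:
print(audit_target_key("https://example.com"))      # ('https', 'example.com')
print(audit_target_key("https://www.example.com"))  # ('https', 'www.example.com')
```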
If you've already added tracking targets to your project, these will appear in a dropdown menu for quick selection. This saves time when running regular audits on the same URLs.
For WordPress sites where you've installed the NitroShock plugin, the platform can authenticate and access password-protected staging sites or membership areas. This is particularly useful for auditing sites under development before they go live.
If you're auditing a public site, no additional authentication is needed. The crawler will access pages exactly as search engines would see them.
The audit configuration panel includes several settings that control how NitroShock crawls your site and what data it collects. Understanding these options helps you run more effective audits and manage credit costs.
The Crawl Depth setting determines how many levels deep the crawler will follow internal links from your starting URL. This is measured in "hops" from the entry point:
For most sites, a crawl depth of 3-4 levels is sufficient to discover all important pages. Very large sites with complex navigation might benefit from depth 5 or higher, but this increases both crawl time and credit usage.
Setting an appropriate crawl depth prevents the audit from running indefinitely on very large sites while ensuring you capture all significant pages. If you have 50 pages, depth 3 will likely cover everything. Sites with thousands of pages may need depth 4-5.
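The "hops from the entry point" idea can be sketched as a depth-limited breadth-first crawl. The sketch below models only the depth logic over an in-memory link graph (a real crawler would fetch and parse pages; the example site map is made up):

```python
from collections import deque

def crawl_to_depth(start: str, links: dict[str, list[str]], max_depth: int) -> set[str]:
    """Breadth-first traversal of a link graph, stopping max_depth hops from start."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        url, depth = queue.popleft()
        if depth == max_depth:
            continue  # don't follow links beyond the configured depth
        for target in links.get(url, []):
            if target not in seen:
                seen.add(target)
                queue.append((target, depth + 1))
    return seen

# Homepage -> section pages (depth 1) -> posts (depth 2) -> comments (depth 3)
site = {
    "/": ["/blog", "/about"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/blog/post-1/comments"],
}
# Depth 2 reaches the posts but not the comments page one hop deeper:
print(sorted(crawl_to_depth("/", site, max_depth=2)))
```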
The Page Limit setting caps the total number of pages analyzed during a full site crawl. This serves as a safety mechanism to prevent unexpectedly large audits on sites with more pages than anticipated.
You can set limits like 100, 500, 1000, or unlimited pages. Remember that each page audited uses credits, so setting reasonable limits helps control costs, especially when you're learning the platform.
For your first audit, consider these guidelines:
The User Agent dropdown lets you choose whether the crawler identifies itself as desktop or mobile. This affects how responsive sites serve content and which version of your pages gets analyzed.
Desktop User Agent: Crawls as a desktop browser. Use this for sites that aren't mobile-responsive or when you want to analyze the desktop experience specifically.
Mobile User Agent: Crawls as a mobile device (typically smartphone). This is increasingly important as Google uses mobile-first indexing for most sites. The mobile audit checks mobile-specific elements like viewport configuration, touch target sizing, and mobile performance.
For most modern websites, running a mobile audit provides the most relevant data since search engines primarily evaluate your mobile experience. You can always run a second audit with the desktop user agent if you need to compare both versions.
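Under the hood, the user-agent choice comes down to the `User-Agent` header sent with each request. A minimal sketch using Python's standard library (the UA string below is a generic mobile example, not NitroShock's actual crawler identifier):

```python
import urllib.request

# Example mobile user-agent string (illustrative; real crawler UAs differ).
MOBILE_UA = (
    "Mozilla/5.0 (iPhone; CPU iPhone OS 17_0 like Mac OS X) "
    "AppleWebKit/605.1.15 (KHTML, like Gecko) Mobile/15E148"
)

# Servers inspect this header to decide which version of a page to serve.
req = urllib.request.Request("https://example.com", headers={"User-Agent": MOBILE_UA})
print(req.get_header("User-agent"))
```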
The Respect Robots.txt checkbox determines whether the crawler follows the directives in your site's robots.txt file. This file tells search engine crawlers which parts of your site to access or avoid.
Enabled (recommended for most audits): The crawler honors your robots.txt rules and won't analyze pages blocked from crawlers. This gives you audit results that reflect what search engines actually see.
Disabled: The crawler ignores robots.txt and audits all accessible pages. Use this when you need to audit staging sites or check pages that are temporarily blocked from search engines but still need technical analysis.
For your first audit on a live production site, keep this enabled to get search-engine-accurate results.
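A crawler that respects robots.txt checks each URL against the file's rules before fetching it. Python's standard `urllib.robotparser` demonstrates the check (the robots.txt content here is a made-up example):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /cart",
]

parser = RobotFileParser()
parser.parse(rules)

# Allowed and blocked paths, as a respectful crawler would evaluate them:
print(parser.can_fetch("*", "https://example.com/blog/post"))       # True
print(parser.can_fetch("*", "https://example.com/admin/settings"))  # False
```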
The JavaScript Rendering option controls whether the crawler executes JavaScript before analyzing page content. Many modern sites use JavaScript frameworks like React, Vue, or Angular to generate content dynamically.
Enabled: The crawler waits for JavaScript to execute and renders the page fully before analyzing it. This reveals the actual content users and search engines see, including dynamically loaded elements. However, it increases crawl time and credit cost per page.
Disabled: The crawler analyzes the raw HTML without executing JavaScript. This is faster and cheaper but may miss content that loads after the initial page render.
For JavaScript-heavy sites or sites built with WordPress page builders that rely on JavaScript, enable rendering to get accurate results. For traditional WordPress sites with server-rendered HTML, you can typically leave this disabled.
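The difference between the two modes is easy to demonstrate. A non-rendering crawler sees only the raw HTML, so content inserted by JavaScript is invisible to it. The sketch below parses a tiny made-up page whose visible copy is injected by a script:

```python
from html.parser import HTMLParser

# Raw HTML as served: the <div id="content"> is empty until JavaScript runs.
raw_html = """
<html><body>
  <div id="content"></div>
  <script>
    document.getElementById("content").textContent = "Dynamically loaded copy";
  </script>
</body></html>
"""

class TextCollector(HTMLParser):
    """Collects visible text, skipping <script> bodies -- roughly what a
    non-rendering crawler sees."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.text = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True
    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False
    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.text.append(data.strip())

collector = TextCollector()
collector.feed(raw_html)
print(collector.text)  # [] -- the injected copy is invisible without rendering
```

A rendering crawler would execute the script first and see "Dynamically loaded copy" in the div, which is exactly the gap the JavaScript Rendering option closes.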
Every site audit uses credits based on the number of pages analyzed and the complexity of the crawl settings you've selected. NitroShock shows you the estimated credit cost before you start the audit, so there are never surprise charges.
The credit cost for an audit depends on these factors:
Number of pages: Each page crawled and analyzed uses credits. A single-page audit uses minimal credits, while a 500-page full site crawl uses credits proportional to the number of pages.
JavaScript rendering: Enabling JavaScript rendering increases the cost per page because it requires additional processing power and time to execute scripts and wait for content to load.
Crawl frequency: Running audits more frequently doesn't change the per-page cost, but it accumulates credit usage over time. Most sites don't need daily audits; weekly or monthly is sufficient for tracking improvements.
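The cost model above can be sketched as a simple calculation. The numbers here are illustrative placeholders, not NitroShock's actual pricing, which is always shown in the confirmation modal before you're charged:

```python
def estimate_audit_credits(pages: int, js_rendering: bool,
                           base_cost: float = 1.0,
                           render_multiplier: float = 2.0) -> float:
    """Estimate credit cost: per-page cost rises when JS rendering is enabled.

    base_cost and render_multiplier are assumed values for illustration.
    """
    per_page = base_cost * (render_multiplier if js_rendering else 1.0)
    return pages * per_page

print(estimate_audit_credits(100, js_rendering=False))  # 100.0
print(estimate_audit_credits(100, js_rendering=True))   # 200.0
```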
Once you've configured your audit settings and clicked the Start Audit button, NitroShock displays a confirmation modal showing the estimated credit cost. This estimate is based on your page limit and crawl settings.
The modal shows:
You can review these details and either confirm to proceed or cancel to adjust your settings. This transparency ensures you're never charged credits without explicit confirmation.
If your credit balance is insufficient, the confirmation modal will prompt you to add credits before running the audit. Click Add Credits to purchase more, or reduce your page limit to fit within your available balance.
Here are practical strategies to control audit costs while still getting valuable insights:
Start with targeted audits: For your first audit, consider limiting pages to 100-200 to understand the platform and identify major issues before running a comprehensive crawl.
Disable JavaScript rendering initially: Unless you know your site requires it, start with JavaScript rendering disabled. You can always run a second audit with rendering enabled if the results suggest missing content.
Use single-page audits for verification: After fixing issues identified in a full crawl, run quick single-page audits on specific URLs to verify your fixes without crawling the entire site again.
Schedule strategically: You don't need to audit daily. Run full audits monthly or after major site changes. Use single-page audits for new content or after implementing fixes.
Monitor historical data: Once you've established a baseline with your first audit, subsequent audits can focus on changed pages or specific site sections rather than always crawling everything.
After confirming your audit settings and credit cost, NitroShock begins crawling and analyzing your site. The time required varies significantly based on several factors.
Single-page audits: Usually complete within 30-60 seconds. The system fetches the page, runs all technical checks, and generates the report quickly.
Small site crawls (10-50 pages): Typically finish in 2-5 minutes. The crawler moves through pages rapidly and the analysis processes each page's data efficiently.
Medium site crawls (50-200 pages): Usually take 5-15 minutes depending on page complexity and server response times.
Large site crawls (200-1000+ pages): Can take 15-60 minutes or longer. The platform processes pages in batches and the total time scales with the number of pages and crawl depth.
JavaScript rendering adds significant time to each page, potentially doubling or tripling the total audit duration for large crawls.
While your audit runs, the Site Audit tab displays a progress indicator showing:
You don't need to keep the browser tab open while the audit runs. NitroShock processes audits in the background on its servers, so you can navigate away, close the tab, or even log out. When the audit completes, it will appear in your audit history.
In your Account Dashboard → Settings tab, you can enable email notifications for completed audits. When this is enabled, you'll receive an email when your audit finishes, including a summary of critical issues found and a link to view the full results.
This is particularly useful for large site audits that take 30+ minutes. You can start the audit and continue with other work, then review results when notified.
Understanding what NitroShock does during an audit helps you anticipate results and potential issues:
The crawler respects server resources and includes rate limiting to avoid overwhelming your hosting. If your site is slow to respond, the audit will take longer as the crawler waits for page requests to complete.
Once your audit completes, the results page provides a comprehensive overview of your site's technical health with actionable insights organized for easy prioritization.
The results page opens with a summary dashboard showing:
Overall Score: A numerical score (0-100) representing your site's technical SEO health. This score weighs critical issues more heavily than warnings or informational items.
Issue Breakdown: Visual representation of total issues found, categorized by severity:
Category Scores: Individual scores for Performance, SEO, Accessibility, and Best Practices. This helps you identify which areas need the most attention.
Pages Analyzed: Total count of pages successfully crawled and audited.
This overview gives you an immediate sense of your site's health and which areas need priority attention.
Below the overview, you'll find a detailed list of all identified issues. Use the filtering tools to focus on specific problems:
Filter by Severity: Show only Critical issues, Warnings, or Info items. Start by addressing Critical issues first for maximum SEO impact.
Filter by Category: View only Performance issues, SEO issues, Accessibility problems, or Best Practices recommendations. This helps when you're working with specialists (like having a developer focus on Performance while you handle SEO issues).
Search: Use the search box to find specific issue types, like "meta description" or "404" to quickly locate particular problems.
Sort Options: Order issues by severity (default), number of affected pages, or issue type. Sorting by affected pages helps you prioritize fixes that will improve the most URLs.
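The default ordering described above, most severe first and most widespread within each severity, can be sketched with a simple sort. The issue records below are made-up examples, not actual NitroShock output:

```python
SEVERITY_RANK = {"critical": 0, "warning": 1, "info": 2}

# Hypothetical audit findings for illustration.
issues = [
    {"type": "Missing meta description", "severity": "warning", "affected_pages": 42},
    {"type": "Broken internal link", "severity": "critical", "affected_pages": 3},
    {"type": "Image missing alt text", "severity": "warning", "affected_pages": 118},
]

# Most severe first; within a severity, issues affecting more pages come first.
ordered = sorted(
    issues,
    key=lambda i: (SEVERITY_RANK[i["severity"]], -i["affected_pages"]),
)
print([i["type"] for i in ordered])
```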
Click on any issue to expand its details and see:
Issue Description: Clear explanation of what the problem is and why it matters for SEO or user experience.
Affected Pages: List of all URLs where this issue appears. For widespread problems, this might show dozens or hundreds of pages.
How to Fix: Specific guidance on resolving the issue. NitroShock provides WordPress-specific instructions when applicable, like which plugin settings to adjust or theme files to modify.
SEO Impact: Explanation of how this issue affects your search rankings, traffic, or conversions.
For example, a "Missing Meta Description" issue would show:
Switch to the Pages tab in the results view to see audit data organized by URL instead of by issue type. This view shows:
Click any page to see all issues affecting that specific URL. This view is valuable when optimizing individual pages or troubleshooting why a particular URL has problems.
For pages where performance data is available, NitroShock includes Lighthouse metrics that Google uses for ranking:
Largest Contentful Paint (LCP): How quickly the main content loads. Good LCP is under 2.5 seconds.
First Input Delay (FID): How quickly the page responds to user interactions. Good FID is under 100 milliseconds.
Cumulative Layout Shift (CLS): Measures visual stability (whether elements jump around while loading). Good CLS is under 0.1.
These Core Web Vitals directly impact search rankings and user experience. Pages with poor metrics should be prioritized for optimization.
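The "good" thresholds above can be applied programmatically when triaging pages. This sketch checks only the good/not-good boundary quoted in the text; Google also defines intermediate "needs improvement" and "poor" bands:

```python
def classify_core_web_vitals(lcp_s: float, fid_ms: float, cls: float) -> dict[str, str]:
    """Classify each Core Web Vital against the 'good' thresholds:
    LCP under 2.5 s, FID under 100 ms, CLS under 0.1."""
    return {
        "LCP": "good" if lcp_s < 2.5 else "needs work",
        "FID": "good" if fid_ms < 100 else "needs work",
        "CLS": "good" if cls < 0.1 else "needs work",
    }

print(classify_core_web_vitals(lcp_s=2.1, fid_ms=80, cls=0.05))
# {'LCP': 'good', 'FID': 'good', 'CLS': 'good'}
print(classify_core_web_vitals(lcp_s=4.0, fid_ms=80, cls=0.25))
```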
Click the Export button to download your audit results as:
**CSV