A technical SEO audit analyzes the technical aspects of a website related to search engine optimization. It ensures search engines like Google can crawl, index, and rank pages on your site.
In a technical SEO audit, you'll look at (and fix) things that could:
- Slow down your site
- Make it difficult for search engines to understand your content
- Make it difficult for your pages to appear in search results
- Affect how users interact with your site on different devices
- Impact your site's security
- Create duplicate content issues
- Cause navigation problems for users and search engines
- Prevent important pages from being found
Identifying and fixing such technical issues helps search engines better understand and rank your content. Which can mean improved organic search visibility and traffic over time.
How to Perform a Technical SEO Audit
You’ll need two main tools for a technical site audit:
- Google Search Console
- A crawl-based tool, like Semrush’s Site Audit
If you haven't used Search Console before, check out our beginner's guide. We’ll discuss the tool’s various reports below.
And if you’re new to Site Audit, sign up for a free account to follow along with this guide.
The Site Audit tool scans your website and provides data about each page it crawls. The report it generates shows you a variety of technical SEO issues.
In a dashboard like this:

To set up your first crawl, create a project.

Next, head to the Site Audit tool and select your domain.

The “Site Audit Settings” window will pop up. Here, configure the basics of your first crawl. Follow this detailed setup guide for help.

Finally, click “Start Site Audit.”

After the tool crawls your site, it generates an overview of your site's health.

This metric grades your website health on a scale from 0 to 100. And shows how you compare with other sites in your industry.
Your site issues are ordered by severity through the “Errors,” “Warnings,” and “Notices” categories. Or focus on specific areas of technical SEO with “Thematic Reports.”

Toggle to the “Issues” tab to see a complete list of all site issues. Along with the number of affected pages.

Each issue includes a “Why and how to fix it” link.

The issues you find here will fit into one of two categories, depending on your skill level:
- Issues you can fix on your own
- Issues a developer or system administrator might need to help you fix
Conduct a technical SEO audit on any new site you work with. Then, audit your site at least once per quarter (ideally monthly). Or whenever you see a decline in rankings.
1. Spot and Fix Crawlability and Indexability Issues
Crawlability and indexability are a crucial aspect of SEO. Because Google and other search engines must be able to crawl and index your webpages in order to rank them.
Google's bots crawl your site by following links to find pages. They read your content and code to understand each page.
Google then stores this information in its index—a massive database of web content.
When someone performs a Google search, Google checks its index to return relevant results.

To check if your site has any crawlability or indexability issues, go to the “Issues” tab in Site Audit.
Then, click “Category” and select “Crawlability.”

Repeat this process with the “Indexability” category.
Issues related to crawlability and indexability will often be at the top of the results in the “Errors” section, because they’re often more serious. We'll cover several of these issues.

Now, let’s look at two important website files—robots.txt and sitemap.xml—that have a huge impact on how search engines discover your site.
Spot and Fix Robots.txt Issues
Robots.txt is a website text file that tells search engines which pages they should or shouldn’t crawl. It can usually be found in the root folder of the site: https://domain.com/robots.txt.
A robots.txt file helps you:
- Point search engine bots away from private folders
- Keep bots from overwhelming server resources
- Specify the location of your sitemap
A single line of code in robots.txt can prevent search engines from crawling your entire site. Make sure your robots.txt file doesn't disallow any file or page you want to appear in search results.
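For illustration, a minimal robots.txt might look like this (the paths here are hypothetical):

```txt
# Block a private folder for all crawlers, allow everything else
User-agent: *
Disallow: /admin/

# Point crawlers to the sitemap
Sitemap: https://domain.com/sitemap.xml
```

Note that a single stray rule like `Disallow: /` under `User-agent: *` would block the entire site, which is why this file deserves a careful review during every audit.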
To check your robots.txt file, open Site Audit and scroll down to the “Robots.txt Updates” box at the bottom.

Here, you'll see whether the crawler has detected the robots.txt file on your website.
If the file status is “Available,” review your robots.txt file by clicking the link icon next to it.
Or, focus only on the robots.txt file changes since the last crawl by clicking the “View changes” button.

Further reading: Reviewing and fixing the robots.txt file requires technical knowledge. Always follow Google's robots.txt guidelines. Read our guide to robots.txt to learn about its syntax and best practices.
To find further issues, open the “Issues” tab and search “robots.txt.”

Some issues include:
- Robots.txt file has format errors: Your robots.txt file might have mistakes in its setup. This could accidentally block important pages from search engines or allow access to private content you don't want shown.
- Sitemap.xml not indicated in robots.txt: Your robots.txt file doesn't mention where to find your sitemap. Adding this information helps search engines find and understand your site structure more easily.
- Blocked internal resources in robots.txt: You might be blocking important files (like CSS or JavaScript) that search engines need to properly render and understand your pages. This can hurt your search rankings.
- Blocked external resources in robots.txt: Resources from other websites that your site uses (like CSS, JavaScript, and image files) might be blocked. This can prevent search engines from fully understanding your content.
Click the link highlighting the found issues.

Inspect them in detail to learn how to fix them.

Further reading: Besides the robots.txt file, there are two other ways to provide instructions for search engine crawlers: the robots meta tag and x-robots tag. Site Audit will alert you of issues related to these tags. Learn how to use them in our guide to robots meta tags.
Spot and Fix XML Sitemap Issues
An XML sitemap is a file that lists all the pages you want search engines to index and rank.
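A minimal sitemap.xml, with hypothetical URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://domain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://domain.com/children/girls/footwear</loc>
  </url>
</urlset>
```

Each `<url>` entry contains the page's address in `<loc>`, optionally with metadata like `<lastmod>`.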
Review your XML sitemap during every technical SEO audit to ensure it includes all pages you want to rank.
Also check that the sitemap doesn’t include pages you don’t want in the SERPs. Like login pages, customer account pages, or gated content.
Next, check whether your sitemap works correctly.
The Site Audit tool can detect common sitemap-related issues, such as:
- Format errors: Your sitemap has mistakes in its setup. This could confuse search engines, causing them to ignore your sitemap entirely.
- Incorrect pages found: You've included pages in your sitemap that shouldn't be there, like duplicate content or error pages. This can waste your crawl budget and confuse search engines.
- File is too large: Your sitemap is bigger than search engines prefer. This might lead to incomplete crawling of your site.
- HTTP URLs in sitemap.xml for HTTPS site: Your sitemap lists insecure versions of your pages on a secure site. This mismatch could mislead search engines.
- Orphaned pages: You've included pages in your sitemap that aren't linked from anywhere else on your site. This could waste the crawl budget on potentially outdated or unimportant pages.
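The HTTP-vs-HTTPS check is easy to illustrate with a short Python sketch. The inline sitemap content and URLs below are hypothetical; in practice you would fetch your real sitemap.xml:

```python
# Flag insecure http:// URLs in a sitemap intended for an HTTPS site.
import xml.etree.ElementTree as ET

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://domain.com/</loc></url>
  <url><loc>http://domain.com/old-page</loc></url>
</urlset>"""

def insecure_urls(sitemap_xml: str) -> list[str]:
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    # Collect every <loc> entry that still uses the unencrypted scheme
    return [
        loc.text.strip()
        for loc in root.findall(".//sm:loc", ns)
        if loc.text and loc.text.strip().startswith("http://")
    ]

print(insecure_urls(SITEMAP))  # ['http://domain.com/old-page']
```

Any URL this returns should be updated to its HTTPS version in the sitemap.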
To find and fix these issues, go to the “Issues” tab and type “sitemap” in the search field:

You can also use Google Search Console to identify sitemap issues.
Visit the “Sitemaps” report to submit your sitemap to Google, view your submission history, and review any errors.
Find it by clicking “Sitemaps” under the “Indexing” section.

If you see “Success” listed next to your sitemap, there are no errors. But the other two statuses—“Has errors” and “Couldn’t fetch”—indicate a problem.

If there are issues, the report will flag them individually. Follow Google's troubleshooting guide to fix them.
Further reading: If your site doesn't have a sitemap.xml file, read our guide on how to create an XML sitemap.
2. Audit Your Site Architecture
Site architecture refers to the hierarchy of your webpages and how they are connected through links. Organize your website so it’s logical for users and easy to maintain as your website grows.
Good site architecture is important for two reasons:
- It helps search engines crawl and understand the relationships between your pages
- It helps users navigate your site
Let's consider three key aspects of site architecture. And how to analyze them with the technical SEO audit tool.
Site Hierarchy
Site hierarchy (or site structure) is how your pages are organized into subfolders.
To understand your site's hierarchy, navigate to the “Crawled Pages” tab in Site Audit.

Then, switch the view to “Site Structure.”

You’ll see your website’s subdomains and subfolders. Review them to make sure the hierarchy is organized and logical.
Aim for a flat site architecture, which looks like this:

Ideally, it should only take a user three clicks to find the page they want from your homepage.
When it takes more than three clicks to navigate your site, its hierarchy is too deep. Search engines consider pages deep in the hierarchy to be less important or relevant to a search query.
To ensure all your pages satisfy this requirement, stay within the “Crawled Pages” tab and switch back to the “Pages” view.

Then, click “More filters” and select the following parameters: “Crawl Depth” is “4+ clicks.”

To fix this issue, add internal links to pages that are too deep in the site’s structure.
Navigation
Your site's navigation (like menus, footer links, and breadcrumbs) should make it easier for users to navigate your site.
This is an important pillar of good website architecture.
Your navigation should be:
- Simple. Try to avoid mega menus or non-standard names for menu items (like “Idea Lab” instead of “Blog”)
- Logical. It should reflect the hierarchy of your pages. A great way to achieve this is to use breadcrumbs.
Breadcrumbs are a secondary navigation aid that shows users their current location on your site. Often appearing as a line of links at the top of a page. Like this:

Breadcrumbs help users understand your site structure and easily move between levels. Improving both user experience and SEO.
No tool can help you create user-friendly menus. You need to review your website manually and follow UX best practices for navigation.
URL Structure
Like a website’s hierarchy, a site’s URL structure should be consistent and easy to follow.
Let's say a website visitor follows the menu navigation for girls’ shoes:
Homepage > Children > Girls > Footwear
The URL should mirror the architecture: domain.com/children/girls/footwear
Some sites should also consider using a URL structure that shows a page or website is relevant to a specific country. For example, a website for Canadian users of a product may use either “domain.com/ca” or “domain.ca.”
Lastly, make sure your URL slugs are user-friendly and follow best practices.
Site Audit identifies common issues with URLs, such as:
- Use of underscores in URLs: Using underscores (_) instead of hyphens (-) in your URLs can confuse search engines. They might see words connected by underscores as a single word, potentially affecting your rankings. For example, "blue_shoes" could be read as "blueshoes" instead of "blue shoes".
- Too many parameters in URLs: Parameters are URL elements that come after a question mark, like "?color=blue&size=large". They help with tracking. Having too many can make your URLs long and confusing, both for users and search engines.
- URLs that are too long: Some browsers might have trouble processing URLs that exceed 2,000 characters. Short URLs are also easier for users to remember and share.
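As a rough illustration of these checks, here's a small Python sketch. The cut-off for "too many" parameters is an assumption for demonstration; the 2,000-character limit comes from the browser behavior mentioned above:

```python
# Lint a URL against the three issues described in the text.
from urllib.parse import urlparse, parse_qs

def url_issues(url: str, max_params: int = 4) -> list[str]:
    parsed = urlparse(url)
    issues = []
    if "_" in parsed.path:
        issues.append("underscores in path (prefer hyphens)")
    if len(parse_qs(parsed.query)) > max_params:
        issues.append("too many URL parameters")
    if len(url) > 2000:
        issues.append("URL exceeds 2,000 characters")
    return issues

print(url_issues("https://domain.com/blue_shoes"))
# ['underscores in path (prefer hyphens)']
```

A crawler-based tool applies the same kind of rules at scale across every URL it discovers.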

3. Fix Internal Linking Issues
Internal links point from one page to another within your domain.
Internal links are an essential part of good website architecture. They distribute link equity (also known as “link juice” or “authority”) across your site. Which helps search engines identify important pages.
As you improve your site’s structure, check the health and status of its internal links.
Refer back to the Site Audit report and click “View details” under your “Internal Linking” score.

In this report, you’ll see a breakdown of your site's internal link issues.

Broken internal links—links that point to pages that no longer exist—are a common internal linking mistake. And are fairly easy to fix.
Click the number of issues in the “Broken internal links” error on your “Internal Link Issues” report. And manually update the broken links in the list.

Another easy fix is orphaned pages. These are pages with no links pointing to them. Which means you can’t gain access to them via any other page on the same website.
Check the “Internal Links” bar chart to look for pages with zero links.

Add at least one internal link to each of these pages.
Use the “Internal Link Distribution” chart to see the distribution of your pages according to their Internal LinkRank (ILR).
ILR shows how strong a page is in terms of internal linking. The closer to 100, the stronger a page.

Use this metric to learn which pages could benefit from additional internal links. And which pages you can use to distribute more link equity across your domain.
But don’t just keep fixing issues that could have been avoided. Follow these internal linking best practices to avoid issues in the future:
- Make internal linking part of your content creation strategy
- Every time you create a new page, link to it from existing pages
- Don’t link to URLs that have redirects (link to the redirect destination instead)
- Link to relevant pages and use relevant anchor text
- Use internal links to show search engines which pages are important
- Don't use too many internal links (use common sense here—a standard blog post likely doesn't need 50 internal links)
- Learn about nofollow attributes and use them correctly
4. Spot and Fix Duplicate Content Issues
Duplicate content means multiple webpages contain identical or nearly identical content.
It can lead to several problems, including:
- SERPs displaying an incorrect version of your page
- The most relevant pages not performing well in SERPs
- Indexing problems on your site
- Splitting your page authority between duplicate versions
- Increased difficulty in tracking your content's performance
Site Audit flags pages as duplicate content if their content is at least 85% identical.

Duplicate content can happen for two common reasons:
- There are multiple versions of URLs
- There are pages with different URL parameters
Multiple Versions of URLs
For example, a site may have:
- An HTTP version
- An HTTPS version
- A www version
- A non-www version
For Google, these are different versions of the site. So if your page runs on more than one of these URLs, Google considers it a duplicate.
To fix this issue, select a preferred version of your site and set up a sitewide 301 redirect. This will ensure only one version of each page is accessible.
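A sitewide 301 redirect is typically implemented at the web server level. As one hypothetical sketch, an nginx configuration that consolidates everything onto https://www.domain.com might look like this (server names are placeholders, and your hosting setup may differ):

```nginx
# Redirect all HTTP traffic to the preferred HTTPS host
server {
    listen 80;
    server_name domain.com www.domain.com;
    return 301 https://www.domain.com$request_uri;
}

# Redirect the HTTPS non-www version to the www version
server {
    listen 443 ssl;
    server_name domain.com;
    # ssl_certificate / ssl_certificate_key directives go here
    return 301 https://www.domain.com$request_uri;
}
```

After this, every variant of a URL resolves to a single canonical address with a permanent (301) redirect.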
URL Parameters
URL parameters are extra elements of a URL used to filter or sort website content. They're commonly used for product pages with slight changes (e.g., different color variations of the same product).
You can identify them by the question mark and equals sign.

Because URLs with parameters have almost the same content as their counterparts without parameters, they can often be identified as duplicates.
Google usually groups these pages and tries to pick the best one to use in search results. Google will typically identify the most relevant version of the page and display that in search results—while consolidating ranking signals from the duplicate versions.
Nevertheless, Google recommends these actions to reduce potential problems:
- Reduce unnecessary parameters
- Use canonical tags pointing to the URLs with no parameters
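For example, a product page with a color parameter could declare its parameter-free version as canonical (the URL below is hypothetical):

```html
<!-- On https://domain.com/shoes?color=blue, point search engines
     to the parameter-free version of the page -->
<link rel="canonical" href="https://domain.com/shoes" />
```

This tag goes in the page's <head> and tells search engines which URL to consolidate ranking signals onto.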
Avoid crawling pages with URL parameters when setting up your SEO audit. This ensures the Site Audit tool only crawls pages you want to analyze—not their versions with parameters.
Customize the “Remove URL parameters” section by listing all the parameters you want to ignore:

To access these settings later, click the settings (gear) icon in the top-right corner, then click “Crawl sources: Website” under the Site Audit settings.

5. Audit Your Site Performance
Site speed is a crucial aspect of the overall page experience and has long been a Google ranking factor.
When you audit a site for speed, consider two data points:
- Page speed: How long it takes one webpage to load
- Site speed: The average page speed for a sample set of page views on a site
Improve page speed, and your site speed improves.
This is such an important task that Google has a tool specifically made to address it: PageSpeed Insights.

A handful of metrics influence PageSpeed scores. The three most important ones are called Core Web Vitals.
They include:
- Largest Contentful Paint (LCP): measures how fast the main content of your page loads
- Interaction to Next Paint (INP): measures how quickly your page responds to user interactions
- Cumulative Layout Shift (CLS): measures how visually stable your page is
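Google publishes "good" / "needs improvement" / "poor" cut-offs for each of these metrics (LCP at 2.5s/4s, INP at 200ms/500ms, CLS at 0.1/0.25). A small Python sketch that classifies a measured value against them:

```python
# Classify Core Web Vitals values using Google's published thresholds.
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    return "poor" if value > poor else "needs improvement"

print(rate("LCP", 2.1))  # good
print(rate("INP", 350))  # needs improvement
print(rate("CLS", 0.3))  # poor
```

PageSpeed Insights applies the same bands when it color-codes your field data.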

PageSpeed Insights provides details and opportunities to improve your page in four main areas:
- Performance
- Accessibility
- Best Practices
- SEO

But PageSpeed Insights can only analyze one URL at a time. To get the sitewide view, use Semrush's Site Audit.
Head to the “Issues” tab and select the “Site Performance” category.
Here, you can see all the pages a specific issue affects—like slow load speed.

There are also two detailed reports dedicated to performance—the “Site Performance” report and the “Core Web Vitals” report.
Access both from the Site Audit Overview.

The “Site Performance” report provides an additional “Site Performance Score,” as well as a breakdown of your pages by their load speed and other useful insights.

The Core Web Vitals report will break down your Core Web Vitals metrics based on 10 URLs. Track your performance over time with the “Historical Data” graph.
Or edit your list of analyzed pages so the report covers various types of pages on your site (e.g., a blog post, a landing page, and a product page).
Click “Edit list” in the “Analyzed Pages” section.

Further reading: Site performance is a broad topic and one of the most important aspects of technical SEO. To learn more about the topic, check out our page speed guide, as well as our detailed guide to Core Web Vitals.
6. Discover Mobile-Friendliness Issues
As of January 2024, more than half (60.08%) of web traffic happens on mobile devices.
And Google primarily indexes the mobile version of all websites over the desktop version. (Known as mobile-first indexing.)
So ensure your website works perfectly on mobile devices.
Use Google’s Mobile-Friendly Test to quickly check mobile usability for specific URLs.
And use Semrush to check two important aspects of mobile SEO: the viewport meta tag and AMPs.
Just select the “Mobile SEO” category in the “Issues” tab of the Site Audit tool.

A viewport meta tag is an HTML tag that helps you scale your page to different screen sizes. It automatically adjusts the page size based on the user’s device when you have a responsive design.
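For a responsive site, the tag typically looks like this:

```html
<!-- Placed in <head>: match the layout width to the device
     and start at 100% zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
```

Without it, mobile browsers fall back to rendering the page at a desktop width and shrinking it to fit.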
Another way to improve site performance on mobile devices is to use Accelerated Mobile Pages (AMPs), which are stripped-down versions of your pages.
AMPs load quickly on mobile devices because Google runs them from its cache rather than sending requests to your server.
If you use AMPs, audit them regularly to make sure you’ve implemented them correctly to boost your mobile visibility.
Site Audit will test your AMPs for various issues divided into three categories:
- AMP HTML issues
- AMP style and layout issues
- AMP templating issues
7. Spot and Fix Code Issues
Regardless of what a webpage looks like to human eyes, search engines only see it as a bunch of code.
So, it’s important to use proper syntax. And relevant tags and attributes that help search engines understand your site.
During your technical SEO audit, monitor different parts of your website code and markup. Including HTML (which includes various tags and attributes), JavaScript, and structured data.
Let’s dig into these.
Meta Tag Issues
Meta tags are text snippets that provide search engine bots with additional information about a page’s content. These tags are present in your page’s header as a piece of HTML code.
We've already covered the robots meta tag (related to crawlability and indexability) and the viewport meta tag (related to mobile-friendliness).
You should understand two other types of meta tags:
- Title tag: Indicates the title of a page. Search engines use title tags to form the clickable blue link in the search results. Read our guide to title tags to learn more.
- Meta description: A brief description of a page. Search engines use it to form the snippet of a page in the search results. Although not directly tied to Google’s ranking algorithm, a well-optimized meta description has other potential SEO benefits like improving click-through rates and making your search result stand out from competitors.
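Both tags live in the page's <head>. A hypothetical example (the store name and copy are placeholders):

```html
<head>
  <!-- Shown as the clickable link in search results -->
  <title>Girls' Footwear | Example Store</title>
  <!-- Often used as the snippet under the link -->
  <meta name="description"
        content="Browse our collection of girls' shoes, from sneakers to sandals, with free shipping on orders over $50.">
</head>
```

Keeping the title within roughly 70 characters helps it display without truncation, per the issue list below.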

To see issues related to meta tags in your Site Audit report, select the “Meta tags” category in the “Issues” tab.

Here are some common meta tag issues you might find:
- Missing title tags: A page without a title tag may be seen as low quality by search engines. You're also missing an opportunity to tell users and search engines what your page is about.
- Duplicate title tags: When multiple pages have the same title, it's difficult for search engines to determine which page is most relevant for a search query. This can hurt your rankings.
- Title tags that are too long: If your title exceeds 70 characters, it might get cut off in search results. This looks unappealing and might not convey your full message.
- Title tags that are too short: Titles with 10 characters or less don't provide enough information about your page. This limits your ability to rank for different keywords.
- Missing meta descriptions: Without a meta description, search engines might use random text from your page as the snippet in search results. This could be unappealing to users and reduce click-through rates.
- Duplicate meta descriptions: When multiple pages have the same meta description, you're missing chances to use relevant keywords and differentiate your pages. This can confuse both search engines and users.
- Pages with a meta refresh tag: This outdated technique can cause SEO and usability issues. Use proper redirects instead.
Canonical Tag Issues
Canonical tags are used to point out the “canonical” (or “main”) copy of a page. They tell search engines which page should be indexed in case there are multiple pages with duplicate or similar content.
A canonical URL tag is placed in the <head> section of a page's code and points to the “canonical” version.
It looks like this:
<link rel="canonical" href="https://www.domain.com/the-canonical-version-of-a-page/" />
A common canonicalization issue is that a page has either no canonical tag or multiple canonical tags. Or, of course, a broken canonical tag.
The Site Audit tool can detect all of these issues. To see only the canonicalization issues, go to “Issues” and select the “Canonicalization” category in the top filter.

Common canonical tag issues include:
- AMPs with no canonical tag: If you have both AMP and non-AMP versions of a page, missing canonical tags can lead to duplicate content issues. This confuses search engines about which version to show in the results.
- No redirect or canonical to HTTPS homepage from HTTP version: When you have both HTTP and HTTPS versions of your homepage without proper direction, search engines struggle to know which one to prioritize. This can split your SEO efforts and hurt your rankings.
- Pages with a broken canonical link: If your canonical tag points to a non-existent page, you're wasting the crawl budget and confusing search engines.
- Pages with multiple canonical URLs: Having more than one canonical tag on a page gives conflicting directions. Search engines might ignore all of them or pick the wrong one, potentially hurting your SEO results.
Hreflang Attribute Issues
The hreflang attribute denotes the target region and language of a page. It helps search engines serve the correct variant of a page based on the user’s location and language preferences.
If your site needs to reach audiences in more than one country, use hreflang attributes in <link> tags.
Like this:
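A hypothetical set of annotations for English (US) and French (Canada) versions of the same page:

```html
<!-- Each language version should list all variants, including itself -->
<link rel="alternate" hreflang="en-us" href="https://domain.com/page/" />
<link rel="alternate" hreflang="fr-ca" href="https://domain.com/fr/page/" />
<!-- Fallback for users whose language/region doesn't match any variant -->
<link rel="alternate" hreflang="x-default" href="https://domain.com/page/" />
```
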

To audit your hreflang annotations, go to the “International SEO” thematic report in Site Audit.

You’ll see a comprehensive overview of the hreflang issues on your site:

And a detailed list of pages with missing hreflang attributes based on the total number of language versions your site has.

Common hreflang issues include:
- Pages with no hreflang and lang attributes: Without these, search engines can't determine the language of your content or which version to show users.
- Hreflang conflicts within page source code: Contradictory hreflang information confuses search engines. This can lead to the wrong language version appearing in search results.
- Issues with hreflang values: Incorrect country or language codes in your hreflang attributes prevent search engines from properly identifying the target audience for your content. This can lead to your pages being shown to the wrong users.
- Incorrect hreflang links: Broken or redirecting hreflang links make it difficult for search engines to understand your site's language structure. This can result in inefficient crawling and improper indexing of your multilingual content.
- Pages with hreflang language mismatch: When your hreflang tag doesn't match the actual language of the page, it's like false advertising. Users might land on pages they can't understand.
Fixing these issues helps ensure that your international audience sees the correct content in search results. Which improves user experience and potentially boosts your international SEO ROI.
JavaScript Issues
JavaScript is a programming language used to create interactive elements on a page.
Search engines like Google use JavaScript files to render the page. If Google can’t get the files to render, it won’t index the page properly.
The Site Audit tool detects broken JavaScript files and flags the affected pages.

It can also show other JavaScript-related issues on your website. Including:
- Unminified JavaScript and CSS files: These files contain unnecessary code like comments and extra spaces. Minification removes this excess, reducing file size without changing functionality. Smaller files load faster.
- Uncompressed JavaScript and CSS files: Even after minification, these files can be compressed further. Compression reduces file size, making them quicker to download.
- Large total size of JavaScript and CSS: If your combined JS and CSS files exceed 2 MB after minification and compression, they can still slow down your page. This large size leads to poor UX and potentially lower search rankings.
- Uncached JavaScript and CSS files: Without caching, browsers must download these files each time a user visits your site. This increases load time and data usage for your visitors.
- Too many JavaScript and CSS files: Using more than 100 files increases the number of server requests, slowing down your page load time
- Broken external JavaScript and CSS files: When files hosted on other sites don't work, it can cause errors on your pages. This affects both user experience and search engine indexing.
Addressing these issues can improve your site's performance, user experience, and search engine visibility.
To check how Google renders a page that uses JavaScript, go to Google Search Console and use the “URL Inspection Tool.”
Enter your URL into the top search bar and hit enter.

Then, test the live version of the page by clicking “Test Live URL” in the top-right corner. The test may take a minute or two.
Now, you can see a screenshot of the page exactly how Google renders it. To check whether the search engine is reading the code correctly.
Just click the “View Tested Page” link and then the “Screenshot” tab.

Check for discrepancies and missing content to find out if something is blocked, has an error, or times out.
Our JavaScript SEO guide can help you diagnose and fix JavaScript-specific problems.
Structured Data Issues
Structured data is data organized in a specific code format (markup) that provides search engines with additional information about your content.
One of the most popular shared collections of markup language among web developers is Schema.org.
Schema helps search engines index and categorize pages correctly. And helps you capture SERP features (also known as rich results).
SERP features are special types of search results that stand out from the rest of the results due to their different formats. Examples include the following:
- Featured snippets
- Reviews
- FAQs
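For example, a blog post could declare itself an Article using JSON-LD, the structured data format Google recommends. The values below are hypothetical:

```html
<!-- Hypothetical Schema.org Article markup, placed in <head> or <body> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Perform a Technical SEO Audit",
  "datePublished": "2024-01-15",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```
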

Use Google’s Rich Results Test tool to check whether your page is eligible for rich results.

Enter your URL to see all structured data items detected on your page.
For example, this blog post uses “Articles” and “Breadcrumbs” structured data.

The tool will list any issues next to specific structured data items, along with links on how to address them.
Or use the “Markup” thematic report in the Site Audit tool to identify structured data issues.
Just click “View details” in the “Markup” box in your audit overview.

The report will provide an overview of all the structured data types your site uses. And a list of any invalid items.

Invalid structured data occurs when your markup doesn't follow Google's guidelines. This can prevent your content from appearing in rich results.
Click on any item to see the pages affected.

Once you identify the pages with invalid structured data, use a validation tool like Google's Rich Results Test to fix any errors.
Further reading: Learn more about the “Markup” report and how to generate schema markup for your pages.
8. Check for and Fix HTTPS Issues
Your website should be using the HTTPS protocol (as opposed to HTTP, which is not encrypted).
This means your site runs on a secure server using an SSL certificate from a third-party vendor.
It confirms the site is legitimate and builds trust with users by showing a padlock next to the URL in the web browser:

HTTPS is a confirmed Google ranking signal.
Implementing HTTPS is not difficult. But it can bring about some issues. Here's how to address HTTPS issues during your technical SEO audit:
Open the “HTTPS” report in the Site Audit overview:

Here, you'll find a list of all issues connected to HTTPS. And advice on how to fix them.

Common issues include:
- Expired certificate: Your security certificate needs to be renewed
- Old security protocol version: Your website is running an old SSL or TLS (Transport Layer Security) protocol
- No server name indication: Lets you know if your server supports SNI (Server Name Indication). Which allows you to host multiple certificates at the same IP address to improve security
- Mixed content: Determines if your site contains any unsecure content, which can trigger a “not secure” warning in browsers
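Mixed content happens when an HTTPS page loads resources (images, scripts, iframes) over plain `http://`. A simplified Python sketch of how such resources can be detected in a page's HTML, using the standard library's `html.parser` (real scanners also check stylesheets, fonts, and XHR requests):

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collect resources loaded over insecure http:// from an HTML page.

    Simplified: only `src` attributes are checked (img, script, iframe,
    audio, video). Stylesheet href values also count as mixed content
    but are omitted here for brevity.
    """

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "src" and value and value.startswith("http://"):
                self.insecure.append((tag, value))

# Example HTML with one insecure image and one (harmless) HTTPS link
html = '<img src="http://example.com/logo.png"><a href="https://example.com">ok</a>'
scanner = MixedContentScanner()
scanner.feed(html)
print(scanner.insecure)  # [('img', 'http://example.com/logo.png')]
```

Note that a plain `<a href="http://…">` link is not mixed content, since links are not loaded as part of the page; only fetched resources trigger the browser warning.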
9. Find and Fix Problematic Status Codes
HTTP status codes indicate a website server's response to the browser's request to load a page.
1XX statuses are informational. And 2XX statuses report a successful request. Don't worry about these.
Let's review the other three categories (3XX, 4XX, and 5XX statuses) and how to deal with them.
Open the “Issues” tab in Site Audit and select the “HTTP Status” category in the top filter.

This shows all the HTTP status issues and warnings.
Click a specific issue to see the affected pages.
3XX Status Codes
3XX status codes indicate redirects: instances when users and search engine crawlers land on a page but are redirected to a new page.
Pages with 3XX status codes are not always problematic. However, you should always ensure they are used correctly to avoid any potential problems.
The Site Audit tool will detect all your redirects and flag any related issues.
The two most common redirect issues are as follows:
- Redirect chains: When multiple redirects exist between the original and final URL
- Redirect loops: When the original URL redirects to a second URL that redirects back to the original
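Both issues amount to following a URL through a redirect map until it either resolves or repeats. A short Python sketch of that logic, operating on a precomputed `{source: target}` map (such as one exported from a crawl) rather than live HTTP requests:

```python
def trace_redirects(url, redirects, max_hops=10):
    """Follow a URL through a {source: target} redirect map.

    Returns (path, status) where status is 'ok' (at most one hop),
    'chain' (more than one hop), or 'loop' (a URL repeats).
    """
    path = [url]
    seen = {url}
    while path[-1] in redirects and len(path) <= max_hops:
        nxt = redirects[path[-1]]
        if nxt in seen:
            return path + [nxt], "loop"  # we've been here before
        path.append(nxt)
        seen.add(nxt)
    status = "chain" if len(path) > 2 else "ok"
    return path, status

# Hypothetical redirect map: /old -> /interim -> /new is a chain;
# /a <-> /b is a loop.
redirects = {"/old": "/interim", "/interim": "/new", "/a": "/b", "/b": "/a"}
print(trace_redirects("/old", redirects))  # (['/old', '/interim', '/new'], 'chain')
print(trace_redirects("/a", redirects))    # (['/a', '/b', '/a'], 'loop')
```

The fix for a chain is to point the first URL directly at the final destination; the fix for a loop is to break the cycle by choosing one canonical target.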
Audit your redirects and follow the instructions provided within Site Audit to fix any errors.
4XX Status Codes
4XX errors indicate that a requested page can't be accessed. The most common 4XX error is the 404 error: Page not found.
If Site Audit finds pages with a 4XX status, remove all the internal links pointing to those pages.
First, open the specific issue by clicking on the corresponding number of pages with errors.

You'll see a list of all affected URLs.

Click “View broken links” in each line to see the internal links that point to the 4XX pages listed in the report.
Remove the internal links pointing to the 4XX pages. Or replace the links with relevant alternatives.
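This cleanup is essentially a reverse lookup: for each dead URL, find every page that links to it. A small Python sketch of that lookup over hypothetical crawl data:

```python
def find_broken_internal_links(pages, dead_urls):
    """Map each dead (4XX) URL to the pages that link to it.

    `pages` is {page_url: [linked_urls]} from a crawl; `dead_urls` is the
    set of URLs that returned a 4XX status. A sketch of the lookup that
    a crawler's “broken links” report performs for you.
    """
    broken = {}
    for page, links in pages.items():
        for link in links:
            if link in dead_urls:
                broken.setdefault(link, []).append(page)
    return broken

# Hypothetical crawl data: two pages still link to a retired post
pages = {
    "/blog": ["/pricing", "/retired-post"],
    "/home": ["/retired-post"],
}
print(find_broken_internal_links(pages, {"/retired-post"}))
# {'/retired-post': ['/blog', '/home']}
```

The output tells you exactly which pages to edit: remove the link on each listed page, or repoint it at a live alternative.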
5XX Status Codes
5XX errors happen on the server side. They indicate that the server could not fulfill the request. These errors can occur for many reasons.
Such as:
- The server being temporarily down or unavailable
- Incorrect server configuration
- Server overload
Investigate why these errors occurred and fix them if possible. Check your server logs, review recent changes to your server configuration, and monitor your server's performance metrics.
10. Perform Log File Analysis
Your website’s log file records information about every user and bot that visits your site.
Log file analysis helps you look at your website from a web crawler's point of view. To understand what happens when a search engine crawls your site.
It’s impractical to analyze the log file manually. Instead, use Semrush’s Log File Analyzer.
You’ll need a copy of your access log file to begin your analysis. Access it in your server’s file manager in the control panel or via an FTP (File Transfer Protocol) client.
Then, upload the file to the tool and start the analysis. The tool will analyze Googlebot activity on your site and provide a report. That looks like this:

It can help you answer several questions about your website, including:
- Are errors preventing my website from being crawled fully?
- Which pages are crawled the most?
- Which pages are not being crawled?
- Do structural issues affect the accessibility of any pages?
- How efficiently is my crawl budget being spent?
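To make the idea concrete, here is a rough Python sketch of the kind of counting a log analyzer does: extract the path, status code, and user agent from each access log line and tally Googlebot hits. The regex assumes the common combined log format and is not robust to every server's variant; the log lines below are fabricated examples:

```python
import re
from collections import Counter

# Combined Log Format, simplified: capture the path, status, and user agent
LOG_RE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+)[^"]*" (?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_stats(lines):
    """Count requested paths and status codes for Googlebot hits."""
    paths, statuses = Counter(), Counter()
    for line in lines:
        m = LOG_RE.search(line)
        if m and "Googlebot" in m.group("agent"):
            paths[m.group("path")] += 1
            statuses[m.group("status")] += 1
    return paths, statuses

# Fabricated sample lines: two Googlebot hits (one 404) and one regular visitor
logs = [
    '66.249.66.1 - - [10/Jan/2024:00:01:02 +0000] "GET /blog HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/Jan/2024:00:01:05 +0000] "GET /old-page HTTP/1.1" 404 310 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/Jan/2024:00:01:07 +0000] "GET /blog HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
paths, statuses = googlebot_stats(logs)
print(paths.most_common(), statuses["404"])
```

The most-crawled paths answer "which pages get the most bot attention," and a high share of 404 or 5XX statuses signals crawl budget being wasted on dead or failing pages. (In practice, verify that hits claiming to be Googlebot really come from Google's IP ranges, since the user agent string can be spoofed.)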
These answers fuel your SEO strategy and help you resolve issues with the indexing or crawling of your webpages.
For example, if Log File Analyzer identifies errors that prevent Googlebot from fully crawling your website, you or a developer can work to resolve them.
To learn more about the tool, read our Log File Analyzer guide.
Boost Your Website’s Rankings with a Technical SEO Audit
A thorough technical SEO audit can positively affect your website's organic search ranking.
Now that you know how to conduct a technical SEO audit, all you have to do is get started.
Use our Site Audit tool to identify and fix issues. And watch your performance improve over time.
This post was updated in 2024. Excerpts from the original article by A.J. Ghergich may remain.