Struggling to ensure Googlebot properly crawls and indexes your website? For technical SEOs, rendering issues, particularly on JavaScript-heavy sites, can lead to missed rankings and hidden content.
That’s where using Chrome (or Chrome Canary) to emulate Googlebot comes in. This method uncovers discrepancies between what users and search engines see, ensuring your site performs as expected.
Whether spoofing Googlebot or not, a dedicated testing browser makes technical audits more efficient and accurate.
In this guide, I’ll show you how to set up a Googlebot browser, troubleshoot rendering issues, and improve your SEO audits.
Why should I view a website as Googlebot?
In the past, technical SEO audits were simpler, with websites relying on HTML and CSS, and JavaScript limited to minor enhancements like animations. Today, entire websites are built with JavaScript, shifting the workload from servers to browsers. This means that search bots, including Googlebot, must render pages client-side, a process that’s resource-intensive and prone to delays.
Search bots often struggle with JavaScript. Googlebot, for example, processes the raw HTML first and may not fully render JavaScript content until days or weeks later, depending on the website. Some sites use dynamic rendering to bypass these challenges, serving server-side versions to bots and client-side versions to users.
Mini rant
Generally, this setup overcomplicates websites and creates more technical SEO issues than a server-side rendered or traditional HTML website. Thankfully, dynamically rendered websites are declining in use.
While exceptions exist, I believe client-side rendered websites are a bad idea. Websites should be designed to work on the lowest common denominator of a device, with progressive enhancement (through JavaScript) used to improve the experience for people on devices that can handle extras.
My anecdotal evidence suggests that client-side rendered websites are generally more difficult for people who rely on accessibility solutions such as screen readers. Various studies back this up, though the studies I’ve seen are by companies and charities invested in accessibility (an example where I think some bias is perhaps justified for the good of all). However, there are instances where technical SEO and usability cross over.
The good news
Viewing a website as Googlebot lets you spot discrepancies between what bots and users see. While these views don’t need to be identical, critical elements, like navigation and content, must align. This approach helps identify indexing and ranking issues caused by rendering limitations and other search bot-specific quirks.
Never miss an issue impacting traffic on your site
Find and fix technical SEO issues fast with Moz Pro.
Can we see what Googlebot sees?
No, not entirely.
Googlebot renders webpages with a headless version of the Chrome browser, but even with the techniques in this article, it’s impossible to replicate its behavior perfectly. For example, Googlebot’s handling of JavaScript can be unpredictable.
A notable bug in September 2024 prevented Google from detecting meta noindex tags in client-side rendered code on many React-based websites. Issues like these highlight the limitations of emulating Googlebot, especially for important SEO elements like tags and main content.
The goal, however, is to emulate Googlebot’s mobile-first indexing as closely as possible. For this, I use a combination of tools:
A Googlebot browser for direct emulation.
Screaming Frog SEO Spider to spoof and render as Googlebot.
Google’s tools like the URL Inspection tool in Search Console and the Rich Results Test for screenshots and code analysis.
It’s worth noting that Google’s tools, especially after they switched to the “Google-InspectionTool” user-agent in 2023, aren’t entirely accurate representations of what Googlebot sees. However, when used alongside the Googlebot browser and SEO Spider, they’re valuable for identifying potential issues and troubleshooting.
Why use a separate browser to view websites as Googlebot?
Using a dedicated Googlebot browser simplifies technical SEO audits and improves the accuracy of your results. Here's why:
1. Convenience
A dedicated browser saves time and effort by allowing you to quickly emulate Googlebot without relying on multiple tools. Switching user agents in a standard browser extension can be inefficient, especially when auditing sites with inconsistent server responses or dynamic content.
Additionally, some Googlebot-specific Chrome settings don’t persist across tabs or sessions, and certain settings (e.g., disabling JavaScript) can interfere with other tabs you’re working in. A separate browser lets you bypass these challenges and streamline your audit process.
2. Improved accuracy
Browser extensions can unintentionally alter how websites look or behave. A dedicated Googlebot browser minimizes the number of extensions, reducing interference and ensuring a more accurate emulation of Googlebot’s experience.
3. Avoiding mistakes
It’s easy to forget to switch off Googlebot spoofing in a standard browser, which can cause websites to malfunction or block your access. I’ve even been blocked from websites for spoofing Googlebot and had to email them with my IP to remove the block.
4. Flexibility despite challenges
For many years, my Googlebot browser worked without a hitch. However, with the rise of Cloudflare and its stricter security protocols on e-commerce websites, I’ve often had to ask clients to add specific IPs to an allowlist so I can test their sites while spoofing Googlebot.
When allowlisting isn’t an option, I switch to alternatives like the Bingbot or DuckDuckBot user-agent. It's a less reliable solution than mimicking Googlebot, but it can still uncover valuable insights. Another fallback is checking rendered HTML in Google Search Console, which, despite the limitation of using a different user-agent than Google's crawler, remains a reliable way to emulate Googlebot behavior.
If I’m auditing a site that blocks non-Google Googlebots and I can get my IPs allowed, the Googlebot browser is still my preferred tool. It’s more than just a user-agent switcher and offers the most comprehensive way to understand what Googlebot sees.
Which SEO audits is a Googlebot browser useful for?
The most common use case for a Googlebot browser is auditing websites that rely on client-side or dynamic rendering. It’s a straightforward way to compare what Googlebot sees to what a general visitor sees, highlighting discrepancies that could impact your site’s performance in search results.
Since I recommend limiting the number of browser extensions to an essential few, it’s also a more accurate test than an extension-loaded browser of how real Chrome users experience a website, especially when using Chrome’s inbuilt DevTools and Lighthouse for speed audits, for example.
Even for websites that don’t use dynamic rendering, you never know what you might find by spoofing Googlebot. In over eight years of auditing e-commerce websites, I’m still surprised by the unique problems I encounter.
What should you analyze during a Googlebot audit?
- Navigation differences: Is the main navigation consistent across user and bot views?
- Content visibility: Is Googlebot able to see the content you want indexed?
- JavaScript indexing delays: If the site depends on JavaScript rendering, will new content be indexed quickly enough to matter (e.g., for events or product launches)?
- Server response issues: Are URLs returning correct server responses? For instance, an incorrect URL might show a 200 OK for Googlebot but a 404 Not Found for visitors.
- Page layout variations: I’ve often seen links display as blue text on a white background when spoofing Googlebot. It’s machine-readable but far from user-friendly. If Googlebot can’t render your site properly, it won’t know what to prioritize.
- Geolocation-based redirects: Many websites redirect based on location. Since Googlebot crawls primarily from US IPs, it’s important to verify how your site handles such requests.
How detailed you go depends on the audit, but Chrome offers many built-in tools for technical SEO audits. For example, I often compare Console and Network tab data to identify discrepancies between general visitor views and Googlebot. This process catches files blocked for Googlebot or missing content that could otherwise go unnoticed.
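The server-response check from the list above can also be scripted outside the browser. Below is a minimal sketch that flags URLs whose status code differs between a regular Chrome user-agent and a spoofed Googlebot Smartphone user-agent. The user-agent strings, URL list, and helper names are illustrative assumptions of mine, not part of any tool mentioned in this article.

```python
# Sketch: flag URLs that return different status codes to a browser
# user-agent versus a spoofed Googlebot Smartphone user-agent.
from urllib.request import Request, urlopen
from urllib.error import HTTPError

CHROME_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/131.0.0.0 Safari/537.36")
GOOGLEBOT_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
                "AppleWebKit/537.36 (KHTML, like Gecko) "
                "Chrome/131.0.0.0 Mobile Safari/537.36 "
                "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)")

def status_for(url, user_agent):
    """Return the HTTP status code that a given user-agent receives."""
    try:
        with urlopen(Request(url, headers={"User-Agent": user_agent})) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # 4xx/5xx responses raise, but still carry a code

def find_discrepancies(checked):
    """Given (url, user_status, bot_status) tuples, return the mismatches."""
    return [(url, user, bot) for url, user, bot in checked if user != bot]
```

In practice you would build the tuples with `status_for(url, CHROME_UA)` and `status_for(url, GOOGLEBOT_UA)` for each URL in your crawl list, then investigate anything `find_discrepancies` returns.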
How to set up your Googlebot browser
Setting up a Googlebot browser takes about 30 minutes and makes it much easier to view webpages as Googlebot. Here’s how to get started:
Step 1: Download and install Chrome or Canary
- If Chrome isn’t your default browser, you can use it as your Googlebot browser.
- If Chrome is your default browser, download and install Chrome Canary instead.
Canary is a development version of Chrome where Google tests new features. It runs separately from the default Chrome installation and is easily identified by its yellow icon, a nod to the canaries once used in mines to detect toxic gases.
While Canary is branded “unstable,” I haven’t encountered any issues using it as my Googlebot browser. In fact, it offers beta features that are useful for audits. If these features make it to Chrome, you’ll be ahead of the curve and can impress your non-Canary-using colleagues.
Step 2: Install browser extensions
To optimize your Googlebot browser, I recommend installing five key extensions and a bookmarklet. These tools emulate Googlebot and improve technical SEO audits, with three especially useful for JavaScript-heavy websites. Here’s the breakdown:
Extensions for emulating Googlebot:
- User-Agent Switcher: Switches the browser’s user-agent to mimic Googlebot’s behavior.
- Web Developer: Allows you to turn JavaScript on or off easily, giving insight into how Googlebot might process the site.
- Windscribe (or your preferred VPN): Simulates Googlebot’s location, typically in the US, ensuring location-based discrepancies are accounted for.
Additional favorites:
- Link Redirect Trace: Quickly checks server responses and HTTP headers for technical SEO audits.
- View Rendered Source: Compares raw HTML (what the server delivers) with rendered HTML (what the browser processes).
Bookmarklet:
- NoJS Side-by-Side: Compares a webpage’s appearance with and without JavaScript enabled, making discrepancies easier to spot.
Before we move on to step 3, I’ll break down the extensions I just mentioned.
User-Agent Switcher extension
User-Agent Switcher does what it says on the tin: switches the browser’s user-agent. While Chrome and Canary include a built-in user-agent setting, it only applies to the active tab and resets when you close the browser. Using this extension ensures consistency across sessions.
I take the Googlebot user-agent string from Chrome’s browser settings, which, at the time of writing, reflects the latest version of Chrome (note that below, I’m taking the user-agent from Chrome and not Canary).
Setting up the User-Agent Switcher:
1. Get the Googlebot user-agent string:
- Open Chrome DevTools by pressing F12 or going to More tools > Developer tools.
- Navigate to the Network tab.
- From the top-right Network hamburger menu, select More tools > Network conditions.
- In the Network conditions tab:
- Untick "Use browser default."
- Choose "Googlebot Smartphone" from the list.
- Copy and paste the user-agent from the field beneath the list into the User-Agent Switcher extension list (another screenshot below). Remember to switch Chrome back to its default user-agent if it's your main browser.
- An additional tip for Chrome users:
- While you’re here, if Chrome will be your Googlebot browser, tick "Disable cache" in DevTools for more accurate results during testing.
2. Add the user-agent to the extension:
- Right-click the User-Agent Switcher icon in the browser toolbar and click Options (see screenshot below).
- “Indicator Flag” is the text in the browser toolbar that shows which user-agent you’ve selected. Paste the Googlebot user-agent string into the list and give it a label (e.g., "GS" for Googlebot Smartphone).
- Optionally, add other user-agents like Googlebot Desktop, Bingbot, or DuckDuckBot for broader testing.
Why spoof Googlebot’s user-agent?
Web servers identify browsers through their user-agent strings. For example, the user-agent for a Windows 10 device using Chrome might look like this:
Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36
If you’re curious about the history of user-agent strings and why other browsers appear in Chrome’s user-agent, you might find resources like the History of the user-agent string an interesting read.
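To see why spoofing works (and why some sites can still block you for it), here's a minimal sketch of how a server might recognize a Googlebot user-agent from the header alone. The function name and regex are my own illustrations; real Googlebot verification also involves a reverse-DNS check on the requesting IP, which a spoofed header can't pass.

```python
# Sketch: a server-side check on the User-Agent header. Anything that
# *claims* to be Googlebot matches, including a spoofing browser like
# the one this article sets up; only an IP check can tell them apart.
import re

GOOGLEBOT_PATTERN = re.compile(r"Googlebot/\d+\.\d+")

def claims_to_be_googlebot(user_agent: str) -> bool:
    """True if the User-Agent string claims to be Googlebot."""
    return bool(GOOGLEBOT_PATTERN.search(user_agent))
```

This is exactly the kind of check that leads to the blocks mentioned earlier: a site sees "Googlebot" in the header, fails to verify the IP, and decides you're a fake crawler.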
Web Developer extension
The Web Developer extension is an essential tool for technical SEOs, especially when auditing JavaScript-heavy websites. In my Googlebot browser, I regularly turn JavaScript on and off to mimic how Googlebot processes a webpage.
Why disable JavaScript?
Googlebot doesn’t execute all JavaScript on its first crawl of a URL. To understand what it sees before rendering JavaScript, disable it. This reveals the raw HTML content and helps identify critical issues, such as missing navigation or content that relies on JavaScript to display.
By toggling JavaScript with this extension, you gain insight into how your site performs for search engines during the crucial first crawl.
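The raw-versus-rendered comparison can also be scripted. Here's a hedged sketch that diffs the links found in two HTML snapshots, for example, one copied from "view source" and one from the View Rendered Source extension. The class and function names are my own; links that appear only in the rendered version depend on JavaScript and may be invisible to Googlebot's first-pass crawl.

```python
# Sketch: find links that exist only after JavaScript has run, by
# comparing raw HTML (server response) with rendered HTML (post-JS DOM).
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href values from <a> tags in an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.add(href)

def links_in(html: str) -> set:
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def javascript_only_links(raw_html: str, rendered_html: str) -> set:
    """Links present in the rendered DOM but missing from raw HTML."""
    return links_in(rendered_html) - links_in(raw_html)
```

The same diffing idea extends to headings, meta tags, or body text; links are simply the highest-stakes element for crawling.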
Windscribe (or another VPN)
Windscribe, or any reliable VPN, is invaluable for emulating Googlebot’s typical US-based location. While I use a Windscribe Pro account, their free plan includes up to 2GB of monthly data and offers several US locations.
Tips for using a VPN with your Googlebot browser:
- Location doesn’t matter much: Googlebot mostly crawls from the US, so any US location works. For fun, I imagine Gotham as real (and villain-free).
- Disable unnecessary settings: Windscribe’s browser extension blocks ads by default, which can interfere with how webpages render. Make sure the two icons in the top-right corner show a zero.
- Use a browser extension over an app: A VPN extension ties the location spoofing to your Googlebot browser, ensuring your standard browsing isn’t affected.
These tools, paired with the User-Agent Switcher, enhance your ability to emulate Googlebot, revealing content discrepancies and potential indexing issues.
Why spoof Googlebot’s location?
Googlebot primarily crawls websites from US IPs, and there are several reasons to mimic this behavior when conducting audits:
- Geolocation-based blocking: Some websites block access from US IPs, which means Googlebot can’t crawl or index them. Spoofing a US location ensures that you’re seeing the site as Googlebot would.
- Location-specific redirects: Many websites serve different content based on location. For instance, a business might have separate sites for Asia and the US, with US visitors automatically redirected to the US site. In such cases, Googlebot might never encounter the Asian version, leaving it unindexed.
Other Chrome extensions useful for auditing JavaScript websites
Beyond the essentials like User-Agent Switcher and a VPN, here are a few more tools I rely on for technical audits:
- Link Redirect Trace: Shows server responses and HTTP headers, helping troubleshoot technical issues.
- View Rendered Source: Compares raw HTML (delivered by the server) to rendered HTML (processed by the browser), helping you spot discrepancies in what users and Googlebot see.
- NoJS Side-by-Side bookmarklet: Allows you to compare a webpage with and without JavaScript enabled, displayed side by side in the same browser window.
Alright, back to step 3.
Step 3: Configure browser settings to emulate Googlebot
Next, we’ll configure the Googlebot browser settings to match what Googlebot doesn’t support when crawling a website.
What Googlebot doesn’t support:
- Service workers: Since users clicking through from search results may not have visited the page before, Googlebot doesn’t cache data for later visits.
- Permission requests: Googlebot does not process push notifications, webcam access, geolocation requests, and similar features. Therefore, any content relying on these permissions will not be visible to it.
- Statefulness: Googlebot is stateless, meaning it doesn’t retain data like cookies, session storage, local storage, or IndexedDB. While these mechanisms can temporarily store data, they are cleared before Googlebot crawls the next URL.
These bullet points are summarized from an interview by Eric Enge with Google’s Martin Splitt.
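Statelessness is the point that trips people up most, so here's a toy sketch of the idea: a crawler that resets its entire browser state before every URL. The function and the shape of the state are my own illustration, not Google's actual implementation.

```python
# Sketch: every URL is fetched with a brand-new, empty "browser state",
# so nothing a previous page set (cookies, storage) survives to the next
# fetch. `fetch` is a stand-in for whatever HTTP client or renderer you use.
def crawl_stateless(urls, fetch):
    """Fetch each URL with fresh, empty state, mimicking Googlebot."""
    results = []
    for url in urls:
        state = {"cookies": {}, "local_storage": {}}  # reset per URL
        results.append(fetch(url, state))
    return results
```

The practical consequence for audits: if your site only shows content after a cookie or localStorage value has been set on a previous page, Googlebot will never see that content, which is exactly why the browser settings below block cookies and bypass service workers.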
Step 3a: DevTools settings
You’ll need to adjust some settings in Developer Tools (DevTools) to configure your Googlebot browser for accurate emulation.
How to open DevTools:
- Press F12, or open the hamburger menu in the top-right corner of Chrome or Canary and go to More tools > Developer tools.
- The DevTools window is docked inside the browser by default, but you can change this. Use the second hamburger menu in DevTools to move the dock side or open it in a separate window.
Key configurations in DevTools:
- Disable cache:
- You may have already done this if you’re using Chrome as your Googlebot browser.
- Otherwise, in DevTools, open the hamburger menu, go to More tools > Network conditions, and tick the “Disable cache” option.
- Block service workers:
- Navigate to the Application tab in DevTools.
- Under Service Workers, tick the “Bypass for network” option.
Step 3b: General browser settings
Adjust the general browser settings to reflect Googlebot’s behavior.
- Block all cookies:
- Go to Settings > Privacy and security > Cookies, or enter chrome://settings/cookies in the address bar.
- Select “Block all cookies (not recommended).” Sometimes it’s fun to go against the grain!
- Adjust site permissions:
- In Privacy and Security, navigate to Site settings or enter chrome://settings/content.
- Under Permissions, individually block Location, Camera, Microphone, and Notifications.
- In the Additional Permissions section, disable Background sync.
Step 4: Emulate a mobile device
Since Googlebot primarily uses mobile-first crawling, it’s important to emulate a mobile device in your Googlebot browser.
How to emulate a mobile device:
- Open DevTools and click the device toolbar toggle in the top-left corner.
- Choose a device to emulate from the dropdown menu, or add a custom device for more specific testing.
Key considerations:
- Googlebot doesn’t scroll on web pages. Instead, it renders using a window with a long vertical height.
- While mobile emulation is essential, I also recommend testing in desktop view and, if possible, on real mobile devices to cross-check your results.
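If you'd rather script this emulation than click through DevTools each time, a headless-browser tool such as Playwright accepts context options along these lines: a mobile Googlebot user-agent plus an unusually tall viewport to approximate the no-scroll rendering described above. The viewport numbers are my assumptions for illustration; Google doesn't document an exact render height.

```python
# Sketch: context options approximating Googlebot Smartphone for a
# headless-browser tool. The height is deliberately tall because
# Googlebot renders in a long frame rather than scrolling.
GOOGLEBOT_SMARTPHONE_CONTEXT = {
    "user_agent": (
        "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 "
        "Mobile Safari/537.36 (compatible; Googlebot/2.1; "
        "+http://www.google.com/bot.html)"
    ),
    "viewport": {"width": 412, "height": 12000},  # tall frame, no scrolling
    "is_mobile": True,
    "java_script_enabled": True,  # set False for a first-crawl (raw HTML) view
}

# Usage with Playwright (requires `pip install playwright`):
# from playwright.sync_api import sync_playwright
# with sync_playwright() as p:
#     browser = p.chromium.launch()
#     page = browser.new_context(**GOOGLEBOT_SMARTPHONE_CONTEXT).new_page()
#     page.goto("https://example.com")
```

This doesn't replace the manual Googlebot browser; it just makes the same emulation repeatable when you need to check many URLs.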
How about viewing a website as Bingbot?
To create a Bingbot browser, use a fresh installation of Microsoft Edge and configure it with the Bingbot user-agent.
Why consider Bingbot?
- Bingbot’s behavior is similar to Googlebot’s in what it does and doesn’t support.
- Search engines like Yahoo, DuckDuckGo, and Ecosia are either powered by or based on Bing, making it more influential than many realize.
Summary and closing notes
Now you have your own Googlebot emulator. Setting up a browser to mimic Googlebot is one of the easiest and quickest ways to view webpages as the crawler does. Best of all, it’s free if you already have a desktop device capable of running Chrome or Canary.
While other tools like Google’s Vision API (for images) and Natural Language API offer valuable insights, a Googlebot browser simplifies website technical audits, especially those involving sites that rely on client-side rendering.
For a deeper dive into auditing JavaScript sites and understanding the nuances between standard HTML and JavaScript-rendered websites, I recommend exploring articles and presentations from experts like Jamie Indigo, Joe Hall, and Jess Peck. They offer excellent insights into JavaScript SEO and its challenges.
Feel free to reach out if you have questions or think I’ve missed something. Tweet me @AlexHarfordSEO, connect on Bluesky, or find me on LinkedIn. Thanks for reading.
The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.