Step-by-Step Guide to Check Whether Critical Resources (e.g., HTML, CSS, and JS) Are Accessible to Googlebot

Step 1: Use Google Search Console

1. Log in to Google Search Console: Go to Google Search Console and select your property (website) from the list.

2. Open the URL Inspection Tool: In the left-hand menu, click on “URL Inspection.”

3. Inspect a URL: Enter the URL of the page you want to check in the search bar at the top, then press Enter to inspect it.

4. View the Results: Under “Coverage,” check whether Google reports “URL is available to Google” (this means Googlebot can access the URL). Next, click on “View Tested Page” to see how the page was rendered by Googlebot.

5. Check the Resources Section: After running a live test, open the “More Info” tab on the right side and look for a section called “Page Resources.” If any CSS, JS, or image files are blocked, they will be listed here.
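
If you want to automate this check across many pages, the same verdict is exposed through the Search Console URL Inspection API. Below is a minimal sketch, assuming you already have an OAuth access token with Search Console read access; the site and page URLs are placeholders, and you should verify the exact response fields against Google’s current API reference:

import requests  # third-party HTTP library

# Placeholders, substitute your own values.
ACCESS_TOKEN = "ya29.your-oauth-token"  # token with Search Console scope
SITE_URL = "https://yourwebsite.com/"   # property exactly as registered in Search Console
PAGE_URL = "https://yourwebsite.com/sample-page/"

response = requests.post(
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL},
    timeout=30,
)
response.raise_for_status()

# "PASS" corresponds to "URL is available to Google" in the UI.
verdict = response.json()["inspectionResult"]["indexStatusResult"]["verdict"]
print(verdict)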

Step 2: View Robots.txt Rules

1. Check Your Robots.txt File: Go to https://yourwebsite.com/robots.txt in your browser and look for any lines blocking critical resources, such as:

Disallow: /wp-includes/

Disallow: /wp-content/plugins/

Ensure there are no rules that block:

/wp-content/ (this folder often contains CSS and JS files for WordPress)

Any other paths to your critical files
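
You can also run this check programmatically with Python’s standard-library robots.txt parser. The sketch below uses placeholder URLs; note that Python’s parser does not reproduce every nuance of Google’s own matching rules, so treat it as a first pass rather than a definitive answer:

from urllib.robotparser import RobotFileParser

# Placeholder URLs, substitute your own site and resources.
parser = RobotFileParser()
parser.set_url("https://yourwebsite.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt

resources = [
    "https://yourwebsite.com/wp-content/themes/mytheme/style.css",
    "https://yourwebsite.com/wp-includes/js/jquery/jquery.min.js",
]
for url in resources:
    status = "ALLOWED" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)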

2. Edit Robots.txt if Needed: If you find such rules, edit your robots.txt file to allow access to those folders.

For example:

Allow: /wp-content/

Allow: /wp-includes/
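
Putting it together, a WordPress-style robots.txt that keeps the admin area private while leaving CSS and JS crawlable might look like this (the paths are illustrative, adjust them to your site):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Allow: /wp-content/
Allow: /wp-includes/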

Step 3: Verify Resource Accessibility Using a Live Test (the successor to “Fetch as Google”)

1. Perform a Live Test: In Search Console > URL Inspection, click on “Test Live URL.” This fetches the page directly as Googlebot.

2. Check Page Rendering: Open the “Screenshot” tab to see how Googlebot rendered your page. If the page looks incomplete or broken, critical resources are likely being blocked.
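
As a quick supplementary check outside Search Console, you can request each resource with Googlebot’s published user-agent string and confirm it returns HTTP 200. This is only a rough proxy, since servers may treat a self-identified “Googlebot” differently from the real crawler, and the URLs below are placeholders:

from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

# Googlebot's published user-agent string.
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

# Placeholder resource URLs, substitute the CSS/JS files your page loads.
resources = [
    "https://yourwebsite.com/wp-content/themes/mytheme/style.css",
    "https://yourwebsite.com/wp-includes/js/jquery/jquery.min.js",
]

for url in resources:
    try:
        with urlopen(Request(url, headers={"User-Agent": UA}), timeout=10) as resp:
            print(resp.status, url)  # 200 means the server served the file
    except (HTTPError, URLError) as err:
        print("FAILED", url, err)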
