How To Fix Googlebot Cannot Access CSS & JS Warning

Have you received a warning from the Google Search Console team saying “Googlebot cannot access CSS and JS files” on your site?


If your answer is yes, here is how you can fix the warning. If your answer is no, you are likely to receive such an email soon. Before we get to the fix, let me explain why you and many other webmasters got this warning.

Here is what my warning email from the Google team looks like:

If you haven’t received this warning email yet, instead of waiting for the email, you can take action right away.

I suggest you log in to your Google Search Console account here, go to your site dashboard, and click on Google Index > Blocked Resources to see whether Search Console is showing any blocked resources for your site.

You can click on the domain name under the Host column, which will show exactly which files are blocked from crawling by search engine bots. If the only entries you see are ones you intentionally blocked for better crawling, you don’t have to worry about anything.

But if you see files such as theme or plugin .css & .js files, which are essential for site display, you need to edit your website/blog’s robots.txt file. This applies to almost all WordPress blogs & a few other popular CMSes.

Before I show you how to fix this issue, here is a tip for those who have submitted their site to Search Console only after reading this. (Easy-to-follow tutorial here.)

Chances are you might not see any blocked resources for your site right away, so you can use the Fetch as Google feature instead.


Click on Crawl > Fetch as Google to submit a fetch & render request, which will be completed in a few seconds. After that, click under Status to see how Google sees (renders) your site. In the screenshot below, you can see how Google’s view of epostakur differs from a reader’s view (which I have since fixed & shared the steps for below).

This is where things get interesting, as you can click on the robots.txt tester to see exactly which line of your robots.txt file is blocking the bots from accessing your site’s CSS & JS files. In my case, Disallow: /wp-includes/ was the culprit.
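If you want to sanity-check the same rule matching outside of Search Console, Python’s standard-library robots.txt parser applies the same Disallow logic. This is a sketch only; example.com and the file paths are placeholders for your own domain and assets:

```python
from urllib.robotparser import RobotFileParser

# The offending rule from my robots.txt at the time of the warning.
rules = """User-agent: *
Disallow: /wp-includes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A theme/plugin script under /wp-includes/ is blocked for Googlebot...
print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/js/jquery/jquery.js"))  # False

# ...while a normal content page is still crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/sample-post/"))  # True
```

This mirrors what the robots.txt tester shows: the rule doesn’t block your pages, only the scripts and styles those pages need to render.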

If your result is similar, here is how you can fix it. My guide targets the WordPress platform, but the process is similar for any other platform, including custom CMSes. BlogSpot users can learn how to edit the robots.txt file on their platform here.

How to fix the CSS & JS warning by editing robots.txt

If a term like robots.txt sounds new to you, don’t worry; you are not alone. It’s common lingo in the SEO industry but not so well known among bloggers. Here are two guides that will teach you the essentials, & they are not boring.

I suggest you read them, as they will give you great insight into a big aspect of search engine optimization. For a WordPress blog, you can edit your robots.txt file using an FTP client such as FileZilla, or use the file editor feature of Yoast SEO.

The Yoast SEO plugin lets you edit your robots.txt & .htaccess files from the WordPress dashboard. I have already covered this in this setup guide, & I have also highlighted the step below.

Here is how you can access & edit the robots.txt file from your WordPress dashboard.

Inside your WordPress dashboard, click on SEO > Tools, then click on File Editor.

On this page, you can view & edit your robots.txt file. In the majority of cases, you need to remove the following line:

Disallow: /wp-includes/

Depending on how you have configured your robots.txt file, this will fix most of the warnings.
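For illustration, here is what the edit might look like. The paths are the WordPress defaults; your own file may contain additional rules you’ve added, which you can leave in place:

```
# Before — blocks Googlebot from theme & plugin assets:
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

# After — the /wp-includes/ line removed:
User-agent: *
Disallow: /wp-admin/
```

The only change is deleting the Disallow: /wp-includes/ line; everything else stays as it was.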

As I mentioned above, the Google Search Console tool gives enough data for any non-SEO or basic user to fix all the warnings. Here is how Google renders epostakur after editing & fixing the robots.txt file.

For reference, you can see epostakur’s robots.txt file here.

Why do Google bots need to access theme CSS & JS files?

You might ask this question, as it was not previously essential to give Google access to your theme files. CSS & JS are not typically part of the content, & here is what I found on the official help page in answer to this question:


A web page relies on the availability of my_script.js, which is typically run by web browsers to provide the browsers with the core textual content of the page. If my_script.js is blocked from Google, we won’t be able to get the text content when Googlebot renders the web page.

Here is a video by Matt Cutts where he explains why you should not block JavaScript & CSS:


Usually, I wouldn’t bother much about a blocked plugin JS file that has nothing to do with the site’s display. But after this mass warning email, it’s better to be safe & let Google access your site completely. In the majority of cases, you are fine just blocking your wp-admin area in WordPress, or the equivalent sensitive areas (dashboard, private pages) on any other platform.
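Along those lines, a minimal WordPress robots.txt could look like the sketch below. The Allow line for admin-ajax.php is a common addition (not from the warning email itself) so that front-end features that call WordPress’s AJAX endpoint keep working for crawlers; adjust it to your own setup:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

Everything else — themes, plugins, uploads — stays open, so Googlebot can render your pages exactly as a visitor sees them.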


I hope this detailed guide answered all your questions regarding Google’s mass mail to webmasters about Googlebot being unable to access CSS, JS, or other files.

Do let me know if you also received a similar warning mail. Have you fixed all the warnings? If you have any interesting insights to share about this issue, do share them in the comment section below.