Wednesday, July 29, 2020

How to fix the robots.txt file in Blogger and WordPress



Robots.txt file in Blogger and WordPress | Generate Sitemap in Google Search Console


Hey, hope you are doing well in this quarantine. If you have a website hosted on Blogger or WordPress, then you may well have received a robots.txt error notification for it by email.

If you got this kind of mail, you need to fix the issue first, because it is a critical SEO problem that can negatively impact your ranking and traffic as well.


As you can see in the screenshot, the Coverage section of Google Search Console is where you find errors such as robots.txt issues, crawler issues, 404 errors, and 5xx server errors. Today we will discuss the robots.txt error.


In this post, you will learn what a robots.txt file is, why it is important, why the robots.txt error shows up, why you need the file, and how to fix it in both Blogger and WordPress.

If you want to fix this, read this article to the end, and I will show you how to fix the robots.txt file in Blogger and WordPress step by step with the help of screenshots. Let's start.


What is robots.txt?

Robots.txt is a plain-text file of words and symbols that resides in the root directory of your website and gives search engine crawlers instructions about which pages they may crawl and index.

When crawlers visit your website, the first thing they do is look at this file before checking out the website's content.

They also check your website's labels and read its URLs in order to index the right pages.
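For example, you can open any site's robots.txt straight from the root of its domain, such as https://example.com/robots.txt (example.com is just a placeholder here). A minimal file that allows every crawler to access everything looks like this:

User-agent: *
Disallow:

Leaving the Disallow value empty means nothing is blocked.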

Importance of the robots.txt file

The robots.txt file is important because it helps your website get indexed and crawled, and it also helps the SEO and ranking of your articles.

Robots.txt helps search engines find your public pages so that a particular page can be crawled and added to their index.

If you have a big website, crawling and indexing can be a very resource-intensive process. Crawlers from various search engines will try to crawl and index the whole website, and this can create serious problems.

Why the robots.txt error shows in Google Search Console

This error shows up in the Google Search Console Coverage report when we do not update the robots.txt file on hosted platforms like Blogger and WordPress.

In the case of Blogger, we use labels when we post an article. If we add multiple labels to a single post, you can get this type of problem.

In the case of WordPress, we use tags and categories; using tags the same way can cause the same type of problem.

When we use several labels, tags, and categories on a single article, the crawler indexes them and finds the same article under each of the label pages, and in this situation you get a duplicate-content error. It is very important to fix this because it will impact your SEO. The rule below shows the fix.
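For example, on Blogger every label archive lives under the /search/label/ path, so a minimal rule like this (a sketch of only the relevant lines) keeps crawlers away from those duplicate label pages:

User-agent: *
Disallow: /search
Allow: /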


Two important things you should know about robots.txt
The first thing is that the rules you add to robots.txt are directives only. This means search engines choose to obey and follow the rules in the robots.txt file; the file cannot force them to.

The second thing is that if you block a page in robots.txt, crawlers will stop fetching it, so it will generally be removed from, or never appear in, search results.

How robots.txt works

The robots.txt file has a simple keyword structure, in which you can use combinations of these predefined keywords and values:

Allow, Disallow, Crawl-delay, Sitemap, and User-agent

Allow:- You can use Allow in your robots.txt file to give access to a specific folder on your website even when the parent directory is disallowed.

For example, you can disallow access to a photos directory as a whole but still allow the subfolder where your blog's images are located, as in the sketch below.
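Here is a minimal sketch of that case (the /photos/ and /photos/blog/ paths are hypothetical placeholders):

User-agent: *
Disallow: /photos/
Allow: /photos/blog/

Everything under /photos/ is blocked except the /photos/blog/ subfolder. A fuller example for a blog, which blocks the search, category, and tag archives while allowing everything else, looks like this: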



User-agent: *
Disallow: /search
Disallow: /category/
Disallow: /tag/
Allow: /
User-agent:- This tells a specific crawler to take the directives into account. You can use * instead of specifying a crawler name, as you can see below.

User-agent: * applies to all crawlers.
User-agent: Googlebot applies to the Google bot only.

Disallow:- A directive that instructs the user agent not to crawl a particular URL or path. For example, Disallow: /search blocks everything under /search.


Sitemap:- A directive supported by the major search engines, including Google. It is very helpful for specifying the location of your XML sitemap.

Because of it, search engines will be able to find the XML sitemap. It is added at the bottom of the robots.txt file.

Sitemap: https://example.com/sitemap.xml

How to fix robots.txt

For Blogger:-

First of all, log in to the Blogger account that has the robots.txt error and click on your blog's Settings.

As you can see in the screenshot, follow along step by step.



Settings > Crawlers and indexing

At the bottom of Crawlers and indexing you can see Custom robots.txt. Enable the "Enable custom robots.txt" toggle (arrow 1), then go to Google and generate the custom robots.txt file. Click ≻ Blogger sitemap



Now copy the generated code into the arrow 2 section and save it.
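The generated file usually looks something like this (a sketch; replace example.blogspot.com with your own blog's address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml

The Mediapartners-Google block keeps the AdSense crawler working, and Disallow: /search blocks the duplicate label and search pages discussed above.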

Now enable the Custom robots header tags and activate the buttons exactly as shown in the screenshot:

In Home page tags:
all: enable
noodp: enable

In Archive and search page tags:
noindex: enable
noodp: enable

In Post and page tags:
all: enable
noodp: enable

                                                  


Now click Save and close Blogger.


Now open Google Search Console in your browser and click on Coverage.

Look at the screenshot: I have marked the error with an arrow, and your screen should also show "Indexed, though blocked by robots.txt". Click on this error.



After you click on the error, your screen will look like the bottom screenshot. Now click on Start Validation, and the error will be removed from Coverage after some days. It is usually fixed within 15 days, and after the fix you will get an email from Google Search Console.


For WordPress

In WordPress, we need to install one plugin: Yoast SEO.



After installing it, open Yoast SEO, and your screen will look like this ↡↡↡



Now you can see three options on the screen: Category, Tag, and Format. Make the following changes to these three options, as you see in the pictures:

Category ➡️ No
Tag       ➡️ No
Format    ➡️ Disable

After setting these options, save the settings, and these errors will be fixed after some days.
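If you also want to edit the robots.txt file itself, Yoast SEO has a file editor under Tools. A typical WordPress robots.txt looks something like this (a sketch; example.com is a placeholder, and the sitemap line assumes Yoast's default sitemap_index.xml name):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap_index.xml

The admin area is blocked, but admin-ajax.php stays open because front-end features may rely on it.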

Note:- Make sure you save all of these settings. If you don't, it can impact your SEO.

Thanks for reading. Please drop a comment if you don't understand anything; we will help within 8 hours.

