
How to create a searchable robots.txt on Blogger blogs

"Custom Robots.txt, is an SEO Tool/or Plugin which enables search consoles; such as Google, to crawl easily on all published posts, contents, and pages. Also, to folders/and features in blogs.
    
A robots.txt file also helps Google and all other search engines identify which pages, folders, and content to crawl or to ignore, through the rules it contains and the sitemap it points to.
     
Creating a clean, well-formed robots.txt and sitemap helps search engines index your content and pages without any obstacle. Why do I stress "well-formed"? For instance, suppose your file contains:
       
       User-agent: *
       Disallow: /search
       Allow: /
       Sitemap: https://yourdomainname.com/atom.xml… (and so on till 500)
     
If a Disallow line names a folder after the "/" sign, that folder cannot be crawled or indexed. (Example: Disallow: /search means that any page or content under the "search" folder will be neither crawled nor indexed.) This will become clearer as we run through the steps for creating a searchable robots.txt file.
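To see the effect of that rule, here is a minimal Python sketch using the standard library's urllib.robotparser (the domain and URLs are placeholders for illustration) that checks which pages the example above permits:

    from urllib.robotparser import RobotFileParser

    # The rules from the example above.
    rules = [
        "User-agent: *",
        "Disallow: /search",
        "Allow: /",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # Pages under the /search folder are blocked...
    print(rp.can_fetch("*", "https://yourdomainname.com/search/label/seo"))      # False
    # ...while ordinary posts and pages stay crawlable.
    print(rp.can_fetch("*", "https://yourdomainname.com/2024/01/my-post.html"))  # True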
     
In setting up a custom robots.txt file, there are some key things that need to be put into consideration:
     
☆ Try to ensure that you don't create a robots.txt file that leads to any error whatsoever.
   
☆ Providing a wrong or over-restrictive robots.txt file will hinder Google and other search engines from crawling your site; if they are blocked from crawling the contents of the website, indexing becomes a problem.
     
☆ Providing a correct sitemap, without blocking any useful folder in the robots.txt file, means search engines can crawl all the permitted folders, pages, and content of your blog. Whatever rules you submit in the file, search engines will work with.
     
☆ When we talk about setting up a robots.txt, we refer mostly to sitemaps, because the sitemap is simply the channel through which search engines crawl and locate the pages of a blog or website. Please don't let the two terms confuse you.
      
When you take the above key things into consideration, your content and pages will be freely indexed.
    
A sitemap and robots.txt also help your content and pages rank in search results, and they help bring traffic to your blog.
   
Having spelled out what you should know about a robots.txt file and sitemap, the key points, and the importance of getting both right, I will now take you through setting up a robots.txt that grants search engines access to crawl all your website's pages.
      

Steps to create a searchable, crawlable robots.txt file on Blogger

          
1) Log in to your Blogger dashboard.
          
2) At the top right corner of your Blogger dashboard, click the menu, scroll down to "Settings", and click it.
          
3) Under "Settings", scroll down to "Search preferences" and click it.
          
4) In "search preference", locate/or scroll down to "Custom Robots.txt file", (custom Robots.txt file, is found under "Crawling and Indexing"), Click on "Edit" in front of the "Custom Robots.txt file",  you will see an inscription code-: "Yes" or "No", click on Yes! To enable you edit/or paste/or create your sitemap.
    
If you are having issues generating a sitemap or pasting one, this can help: Blogspot users can try the ctrlq sitemap generator to generate one for their blogs.
          
5) When the "Custom robots.txt" edit box pops up, paste your generated Blogger robots.txt. Before clicking "Save changes", ensure the file has been configured to your taste and desire. What do I mean by taste and desire here? That you have told Google Search Console and any other search engines which pages to crawl and which to block.
If you haven't, and you don't know how to set it up, don't worry: I will teach you every aspect and break down each step for you.
          
6) If you generated your file through ctrlq, it will come in this format:
          
       User-agent: *
       Disallow: /search
       Allow: /
       Sitemap: https://yourdomainname.com/atom.xml… (and so on till 500)
   
As you can see, the "search" folder is blocked, because the generated file contains a command to disallow any URL under the search folder (see Disallow: /search). While this line is present, pages under the /search folder will not be crawled, making it impossible to index any of your pages that sit under that folder. Don't worry, for it can be edited.
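If you want those pages crawled as well, the rule can simply be relaxed. Here is a minimal sketch of such an edit (the domain is a placeholder; note that on Blogger the /search folder covers label and search-result pages, so blocking it is often deliberate):

    User-agent: *
    Disallow:
    Allow: /
    Sitemap: https://yourdomainname.com/atom.xml

An empty Disallow line means nothing is blocked for that user agent.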
     

Explanation of these codes
       

       User-agent: *
       Disallow: /search
       Allow: /
       Sitemap: https://yourdomainname.com/atom.xml… (and so on till 500)

as found in a ctrlq Blogger sitemap generator.

I will explain what each of the above codes is about, as well as its function.

    User-agent: *
     
This simply means that all search engines (user agents) have access to crawl your blog's pages. Search engines and crawlers such as:

a) Googlebot, AdsBot, etc.

b) Bing search engines.

c) Yahoo search engines, etc.

      In a situation where you see:

i) User-agent: Googlebot

ii) User-agent: AdsBot-Google, etc. (Google operates many different crawlers),
   
it clearly implies that only the named agents (Googlebot, AdsBot-Google) can crawl your website's pages. The best thing is to leave the line as User-agent: * (so that all search engines can crawl your pages).
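As an illustration of per-agent rules (a sketch only, not a recommendation), a file could grant one crawler full access while restricting all others:

    # Googlebot may crawl everything.
    User-agent: Googlebot
    Disallow:

    # All other crawlers are kept out of the /search folder.
    User-agent: *
    Disallow: /search

Each User-agent group carries its own rules; a crawler obeys the group that matches it most specifically.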

   Disallow: /
  
In this case, the file blocks crawling of every folder whatsoever on the blog that carries it.
   
But in the ctrlq-generated Blogger file, the line has this format: Disallow: /search, which means that search spiders are blocked from crawling any of the blog's pages that sit under the search folder.
  
I used the format Disallow: /prints to control search indexing on my own blog, www.tutorialswrld.com.
     
Please note: you can block any folder of your choice using the same format.
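For instance, several folders can be blocked at once by stacking Disallow lines (the folder names here are placeholders for illustration):

    User-agent: *
    Disallow: /prints
    Disallow: /drafts
    Disallow: /private
    Allow: /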
   
The Google team has published clear documentation on the robots.txt Disallow: / rule; it will help you understand how to handle that code.
     
   Allow: /
   
This signifies that all search engines are free and allowed to crawl the blog's pages.
Similar to the explanation above (under User-agent), if a folder is attached to Allow: /, search engines are allowed to crawl that specific folder even when a broader Disallow rule applies (example: with Disallow: / in place, Allow: /prints means only pages under the "prints" folder can be crawled).
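A minimal sketch of that carve-out pattern (the folder name is a placeholder):

    User-agent: *
    # Block everything...
    Disallow: /
    # ...except pages under the /prints folder.
    Allow: /prints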
  
    Sitemap: https://yourdomainname.com/atom.xml… (and so on till 500)
  
This is the most important part of the generated file. It is the little line that gives search engines the link and channel through which to locate your pages and content, and it is the primary input for search indexing. Without the atom.xml feed your pages will not be indexed, and likewise, without the blog's sitemap, indexing of the blog's pages may not take place.
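For reference, Blogger sitemap generators commonly emit the Sitemap line with feed-paging parameters, which is likely what "(and so on till 500)" above refers to; the domain is a placeholder and the exact parameters may vary:

    Sitemap: https://yourdomainname.com/atom.xml?redirect=false&start-index=1&max-results=500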
          
7) Click "Save changes", then refresh your Blogger dashboard and confirm that the file was saved.
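Once saved, you can verify the live file. Below is a minimal Python sketch (the domain is a placeholder) that fetches your published robots.txt with the standard library's urllib.robotparser and tests a few URLs against it:

    from urllib.robotparser import RobotFileParser

    # Point at your own blog's robots.txt (placeholder domain).
    rp = RobotFileParser()
    rp.set_url("https://yourdomainname.com/robots.txt")
    rp.read()  # downloads and parses the live file

    # With "Disallow: /search" in place, label/search pages are blocked.
    print(rp.can_fetch("*", "https://yourdomainname.com/search/label/blogging"))
    # Ordinary post URLs remain crawlable.
    print(rp.can_fetch("*", "https://yourdomainname.com/2024/01/sample-post.html"))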
    
Good luck as you apply these methods to your blog.
