What Is a Robots.txt File in SEO (Search Engine Optimization)?

If you are wondering what robots.txt is in SEO, here is the answer. A robots.txt file is a plain-text file that webmasters create to give instructions to robots (mainly search engine crawlers; Google, for example, uses Googlebot to crawl websites). Robots.txt is used to allow or disallow web content from appearing in search engines, telling crawlers such as Googlebot how to crawl the pages of a website. The robots.txt file sits at the root of your site and indicates the parts of the site you don't want accessed by search engine crawlers.
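To make this concrete, here is a minimal robots.txt sketch. The paths and the sitemap URL are placeholders for illustration, not taken from any real site:

```
# Apply these rules to all crawlers
User-agent: *
# Block the admin area from being crawled (hypothetical path)
Disallow: /admin/
# Explicitly allow everything under /blog/ (hypothetical path)
Allow: /blog/

# Rules that apply only to Google's crawler
User-agent: Googlebot
Disallow: /private/

# Optional: point crawlers at the XML sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file must be served at the root of the domain (for example, https://www.example.com/robots.txt) for crawlers to find it.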


What is robots.txt used for?

Robots.txt can prevent image files from appearing in Google search results. (However, it does not prevent other pages or users from linking to your image.) You can also use robots.txt to block resource files such as unimportant image, script, or style files, if you think that pages loaded without these resources will not be significantly affected by their loss. However, if the absence of these resources makes a page harder for Google's crawler to understand, you should not block them, or else Google won't do a good job of analyzing the pages that depend on those resources. You can also check How to Increase Website Traffic for Free: Tips and Tricks.
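As a sketch of the resource-blocking idea above (all directory names here are hypothetical), a robots.txt might disallow low-value resource folders while keeping the CSS and JavaScript the page needs to render:

```
User-agent: Googlebot
# Block low-value resource directories (hypothetical paths)
Disallow: /tmp-images/
Disallow: /old-scripts/
# Do NOT block CSS/JS the page depends on to render,
# or Googlebot may misinterpret the page
Allow: /assets/css/
Allow: /assets/js/
```

The safe rule of thumb from the paragraph above: block a resource only if the page still makes sense without it.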
