robots.txt is a plain-text file placed in the root directory of a website. It tells search engine crawlers which areas of the site are off-limits and which pages they should not index, which makes it a particularly useful and important file for SEO.
It can be used to exclude all your admin pages from indexing (for example, pages in the wp-admin section of a WordPress blog), and also to prevent search engines from seeing duplicate content on your site.
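A minimal robots.txt covering both cases might look like the following. The wp-admin path comes from the WordPress example above; the /print/ path is a hypothetical directory of printer-friendly duplicates, used here only for illustration.

```
# Rules apply to all crawlers
User-agent: *

# Keep admin pages out of the index (WordPress example)
Disallow: /wp-admin/

# Hypothetical printer-friendly pages that duplicate existing content
Disallow: /print/
```

Note that Disallow matches by path prefix, so a single rule covers everything under that directory.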
Robots that comply with the Robots Exclusion Standard read this file on each visit, so pages or areas of a site can be made public or private at any time simply by changing the content of robots.txt, with no need to re-submit the site to the search engines.