How a Robot Communicates with a Search Engine
Search engine spiders crawl your site periodically. How do you limit a spider's access so the search engine crawls only the pages you want to show and ignores the rest? What you need is a file called "robots.txt". This plain-text file tells a search engine spider which folders or pages it should not access, so content you don't want crawled stays out of the index.
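As a minimal sketch, a robots.txt file placed at the site root (e.g. https://example.com/robots.txt) might look like the following; the folder and page names here are hypothetical examples, not paths from any real site:

```
# Rules for all spiders (the * wildcard matches any user agent)
User-agent: *

# Hypothetical folders/pages to keep spiders out of
Disallow: /private/
Disallow: /drafts/old-page.html

# An empty Disallow means everything else may be crawled
Disallow:
```

Well-behaved spiders such as Googlebot fetch this file before crawling and skip any path matching a `Disallow` rule. Note that robots.txt is advisory: it does not protect sensitive content, since a misbehaving crawler can simply ignore it.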