Control Website Crawling with Robots.txt
Website crawling is the process by which search engine bots scour the web to gather information about your site and its content. While crawling is essential for search engine optimization (SEO), sometimes you need to restrict which parts of your website bots can access. This is where the robots.txt file comes in handy.
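As an illustration, here is a minimal robots.txt sketch; the /admin/ and /private/ paths are placeholders, not paths from any particular site:

    # Rules for all crawlers
    User-agent: *
    # Keep bots out of these illustrative directories
    Disallow: /admin/
    Disallow: /private/

    # Optional: point crawlers to your sitemap
    Sitemap: https://www.example.com/sitemap.xml

The file must be served from the root of your domain (for example, https://www.example.com/robots.txt), since crawlers will not look for it anywhere else. Note that robots.txt is advisory: well-behaved bots honor it, but it is not a security mechanism.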