
Robots.txt File

Robert Maxim places a robots.txt file on your website to help search engines crawl and index it efficiently.

The robots.txt file tells cooperating web spiders and other web robots which parts of an otherwise publicly viewable website they should not access. Search engines read it when crawling, categorizing, and archiving websites, which makes it an important part of auto repair SEO. The robots.txt file, which implements the robots exclusion standard, is used in conjunction with an XML Sitemap, the complementary inclusion standard that tells robots which pages of a website to crawl.
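
As a simple illustration, a minimal robots.txt file might look like the lines below. The paths and sitemap URL here are placeholders, not the configuration of any particular site; the Sitemap line is what ties the file to the XML Sitemap described above.

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://www.example.com/sitemap.xml

The User-agent line says the rules apply to all robots, each Disallow line names a path that cooperating crawlers should skip, and the Sitemap line points crawlers to the full list of pages you do want indexed.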