Check Robots

Most modern websites are built on content management systems or, more simply, 'engines'. A key feature of such systems is that pages are generated dynamically from templates and data stored in a database. This approach makes site management very flexible and convenient, but the site also ends up containing various auxiliary files: PHP scripts, configuration files, template files, and so on. These files are invisible to the ordinary visitor (as long as they only follow the links posted on the site), but they can still be indexed by search engines. Indexing them is undesirable, because they contain no useful information (article texts, comments, and so on usually live in the database) and crawling them wastes traffic.

The robots.txt file contains a set of rules that tell crawlers which files and directories on the site must not be indexed. You have to approach the creation of such a file carefully, because it is easy to accidentally hide pages with useful content. First of all, look for a ready-made robots.txt for your engine (for WordPress I found about five different variants). After that, check that access to all directories with service files is denied.
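To make this concrete, here is a minimal sketch of such a file, assuming a typical WordPress-style directory layout; the paths and the example.com domain are illustrative only, so adapt them to your own engine:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Disallow: /wp-content/plugins/
    Disallow: /wp-content/themes/
    Allow: /wp-content/uploads/
    Sitemap: https://example.com/sitemap.xml

The Allow line keeps uploaded images and attachments indexable even though the rest of the service directories are closed to crawlers.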

The next step is to check the file's syntax. This is convenient to do with Yandex's webmaster tools (). Now the main thing: we must verify that all pages of our site are available to robots. The Google Webmaster Tools service () is convenient for this. To use it you have to create an account and confirm ownership of the site (you need to place a file with a special name in the site root). I also advise you to read the articles 'All about the robots.txt file' and 'How do I check the robots.txt file'. That's it. Good luck with your site building.
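P.S. If you want a rough local check before relying on those services, Python's standard urllib.robotparser module can apply your published rules to a list of pages. This is only a sketch under assumed names: example.com and the page list are placeholders for your own site and the pages you care about.

    from urllib import robotparser

    # Download and parse the live robots.txt of the site.
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Pages that must remain open to crawlers (placeholders).
    important_pages = [
        "https://example.com/",
        "https://example.com/category/general/",
    ]

    for url in important_pages:
        # can_fetch() applies the parsed rules for the given user agent.
        status = "allowed" if rp.can_fetch("*", url) else "BLOCKED"
        print(status, url)

If any important page prints BLOCKED, the Disallow rules are too broad and should be relaxed before you worry about anything else.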
