Thursday, May 13, 2010

Learn how to create a good Robots.txt for Google


Although the robots.txt concept is familiar to professional search engine optimization consultants, it is perhaps not so obvious to those who have just started their journey in the world of search engine optimization. Basically, robots.txt tells search engine spiders that come to crawl your site which directories and files you want indexed. Using robots.txt incorrectly can have a huge impact on your site: a simple mistake in robots.txt can make it hard, or even impossible, for search engine robots to scan your site.
Why use robots.txt? Here are the reasons why you should use it:
- To signal search engine spiders not to crawl or index certain sections or pages of your site.
- To prevent indexing completely, excluding certain areas of your site from the index.
- To issue separate indexing rules for specific search engines.
Creating a robots.txt

Robots.txt is a simple file that can be edited in Notepad and then saved in the root directory of your website, the same place where your home page lives.

Each entry in your robots.txt must contain two lines:

User-Agent: [Spider/Bot name]
Disallow: [Directory/File name]
Depending on your reason for using robots.txt, here is how you can put it together:
1. If you want to exclude an individual file from a specific search engine's index:

User-Agent: Googlebot
Disallow: /private/privatefile.htm
2. If you want to exclude just a section of the site from all spiders:
You do not need to list every robot you want to exclude individually; you can simply use the wildcard character '*', which applies the rule to all of them.

User-Agent: *
Disallow: /newsection/
3. If you want search engine spiders to index everything:
Using the same wildcard, '*', signals that all spiders may crawl the entire site. Remember to leave the second line, Disallow, empty; an empty Disallow blocks nothing.

User-Agent: *
Disallow:
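Taken together, rules like these can live in one robots.txt file, since a file may contain several entries, one per spider, separated by blank lines. Here is a sketch using the same hypothetical directory and file names as the examples above:

```
User-Agent: Googlebot
Disallow: /private/privatefile.htm

User-Agent: *
Disallow: /newsection/
```

With this file, Googlebot follows only its own entry (it is kept out of /private/privatefile.htm), while every other spider follows the wildcard entry and stays out of /newsection/.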
Once you have placed your robots.txt on your site, make sure it was put together correctly. The tool to use for this is Google Webmaster Tools. As we mentioned in a previous blog post (Free Search Engine Marketing Strategies through Google Webmaster Tools), one of the features offered by this tool lets you check your robots.txt: Google automatically retrieves the robots.txt from your site in real time, and you can test addresses against it to see whether they would be blocked.
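Besides Google's own checker, you can sanity-check your rules locally. Below is a minimal sketch using Python's standard urllib.robotparser module; the spider names, directory, and file names are the hypothetical ones from the examples above, not real data.

```python
# Check robots.txt rules locally with Python's standard library.
from urllib.robotparser import RobotFileParser

# Hypothetical rules matching the examples in this post.
rules = """\
User-Agent: Googlebot
Disallow: /private/privatefile.htm

User-Agent: *
Disallow: /newsection/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from the private file but may crawl elsewhere.
print(parser.can_fetch("Googlebot", "/private/privatefile.htm"))  # False
print(parser.can_fetch("Googlebot", "/index.htm"))                # True

# Any other spider falls under the '*' entry and is blocked
# only from the /newsection/ directory.
print(parser.can_fetch("Bingbot", "/newsection/page.htm"))        # False
print(parser.can_fetch("Bingbot", "/index.htm"))                  # True
```

This catches the classic mistake the post warns about: if a typo in your file blocks more than you intended, can_fetch will show important pages coming back False before any search engine ever sees the file.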
As search engine optimization consultants, we recommend that you pay very close attention to how you use robots.txt, because of its large impact on your search marketing strategy. It can help your site achieve higher visibility in search engines, but it can also be very destructive if you get it wrong.
