How To Optimize WordPress Robots.txt File For Google Search Engine

Yesterday we talked about HTML sitemaps for WordPress, which can help with your blog's SEO; today, via our colleagues at Perishable Press, we found some interesting news about robots.txt on WordPress blogs. For anyone who is constantly trying to improve their blog's SEO and climb the organic search results, robots.txt is an important file to take into consideration, and it should be used carefully and as correctly as possible. With some of the most common techniques, you can streamline the indexing process and also keep unimportant pages or sections of your blog out of the index.




Better rules for WordPress Robots.txt
WHAT IS ROBOTS.TXT?
The robots.txt file lets you block search engine robots and spiders from parts of your site. It can also be used to explicitly allow access to specific files or directories on your blog. Basically, this file is how you tell Google, Bing, Yahoo, and the other search engines what they may or may not index when they visit your site. A robots.txt file can perform a wide range of operations, and it can be quite powerful when used in the best possible way.
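To illustrate how a crawler interprets these rules, Python's standard-library `urllib.robotparser` can parse a robots.txt and answer whether a given URL may be fetched. This is a minimal sketch; the rules and URLs below are invented for the example:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules, exactly as they would appear in a robots.txt file.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /trackback/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The admin area is blocked for all crawlers...
print(parser.can_fetch("*", "http://example.com/wp-admin/"))       # False
# ...while an ordinary post is still crawlable.
print(parser.can_fetch("*", "http://example.com/my-first-post/"))  # True
```

The same parser is what well-behaved Python crawlers use before requesting a page, so it is a handy way to sanity-check a robots.txt draft before deploying it.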
ROBOTS.TXT AND WORDPRESS
For those who use WordPress, the goal is for your articles and pages to be crawled and indexed. The same is not true of the WordPress core files and the directories created by the installation. You also want to make sure that your RSS feeds and trackbacks are not included in search engine results. It is also good practice to declare your sitemap. Let's see how to start your robots.txt file:
User-agent: *
Disallow: /feed/
Disallow: /trackback/
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /wp-
Sitemap: http://example.com/sitemap.xml




This is just a base for you to build on, since you can allow or block access to virtually everything related to your blog. To use this code on your WordPress blog, copy and paste it into a file named "robots.txt" and place it in the root of your server. It should then be accessible at:
http://www.example.com/robots.txt
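The file's address is always derived from the site's root, never from a subdirectory. As a small sketch using only the standard library, here is a hypothetical helper that computes the robots.txt URL for the site hosting any given page, which you could then fetch to confirm the file is reachable:

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL for the site hosting page_url:
    same scheme and host, with the path fixed to /robots.txt."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("http://www.example.com/2024/my-post/"))
# http://www.example.com/robots.txt
```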
If we look at the robots.txt file of School WordPress, we quickly notice that it contains one extra line. Let's see:
User-agent: *
Disallow: /feed/
Disallow: /trackback/
Disallow: /wp-admin/
Disallow: /wp-content/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Disallow: /wp-
Allow: /wp-content/uploads/
Sitemap: http://www.example.com/sitemap.xml
The extra line, "Allow: /wp-content/uploads/", tells search engines that the content inside the uploads folder should be indexed. The uploads folder is usually where all the images we upload to the blog are stored, and we generally do want those to appear in Google Images, for example. That way we not only serve the community but can also earn some extra traffic through Google Images.
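This works because Google resolves conflicts between Allow and Disallow by the longest matching rule (the behavior standardized in RFC 9309), so the more specific Allow for /wp-content/uploads/ beats the broader Disallow for /wp-content/. A minimal sketch of that longest-match logic (an illustrative helper of my own, handling only prefix rules, not `*` or `$` wildcards):

```python
def is_allowed(path, rules):
    """Resolve Allow/Disallow the longest-match way: the longest
    matching pattern wins; on a tie, Allow wins; no match = allowed."""
    best_len, allowed = -1, True
    for directive, pattern in rules:
        if pattern and path.startswith(pattern):
            is_allow = directive.lower() == "allow"
            if len(pattern) > best_len or (len(pattern) == best_len and is_allow):
                best_len, allowed = len(pattern), is_allow
    return allowed

rules = [
    ("Disallow", "/wp-content/"),
    ("Allow", "/wp-content/uploads/"),
]

print(is_allowed("/wp-content/uploads/photo.jpg", rules))  # True: Allow is more specific
print(is_allowed("/wp-content/themes/style.css", rules))   # False: only Disallow matches
```

Note that not every crawler resolves conflicts this way, which is another reason to keep Allow rules as specific as possible.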
There are many tricks for allowing or blocking the indexing of certain parts of your blog, but starting with a file like this already gets you halfway to smarter indexing of your blog's content.



