Technical Pages: Off Page SEO Basics
This post is a follow-up to last week's post about SEO: On Page SEO: Search Engine Unfriendly Technologies. I invite you to read that post as well. If you would like the whole picture of my WordPress SEO guide, you can start the series from the very first post: Comprehensive introduction to SEO.
The robots.txt file is used only by search engines. It is located in the root folder of your website and has two goals: give the Sitemap URL to the spiders, and tell the search engine robots which parts of your website they shouldn't index.
Here is an example robots.txt:

```
User-agent: *
Disallow: /cgi/
Disallow: /admin/
Disallow: /content/this-article.html
Sitemap: http://www.mysite.com/sitemap.xml
```
Sitemap is an internet protocol which is widely accepted by search engines. The aim of the Sitemap is to feed the search engine robots, in an automated way, the URLs of the new pages to index.
This drastically improves the indexation speed of your newest pages, which is very useful when you launch a new website.
You can find all the useful information about Sitemaps on http://sitemaps.org, which also provides a file validator to check the coding of your sitemap file.
You can generate this file manually, but it is better to have it generated dynamically whenever your website is updated. It is recommended to put the freshest URLs at the top of your Sitemap file.
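As a minimal sketch of dynamic generation, the snippet below builds a Sitemap from a list of (URL, last-modified date) pairs, freshest first, using only Python's standard library. The URLs and dates are illustrative placeholders; in a real site you would pull them from your CMS database on each update.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a sitemaps.org-compliant <urlset> from (loc, lastmod) pairs."""
    # Namespace required by the sitemaps.org protocol
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Freshest URLs first, as recommended above (example data)
sitemap_xml = build_sitemap([
    ("http://www.mysite.com/newest-post.html", "2024-05-01"),
    ("http://www.mysite.com/older-post.html", "2024-04-20"),
])
print(sitemap_xml)
```

On a WordPress site you would normally rely on a plugin to do this rather than writing it yourself; the point is that the file should be regenerated on every publish, not edited by hand.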
A Sitemap shouldn't include more than 50,000 URLs or weigh more than 50 MB. If you have more URLs, you should split them across additional Sitemap files regrouped in a Sitemap Index.
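A Sitemap Index is itself a small XML file that simply lists your individual Sitemap files. A minimal example, using the hypothetical file names sitemap-1.xml and sitemap-2.xml on the mysite.com domain from the robots.txt example above, could look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.mysite.com/sitemap-1.xml</loc>
    <lastmod>2024-05-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.mysite.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```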
The sitemap.xml file should be declared in the robots.txt file, as in the example above.
In addition to the classic Sitemap, you can add specific Sitemaps for certain types of content.
Google News Sitemap
It greatly speeds up the indexation of your articles in Google News. It is limited to 1,000 URLs and must include only news-type articles. It should also be updated in real time, every time you publish a new article.
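A News Sitemap is a regular Sitemap with extra tags in Google's news namespace. A minimal entry might look like the following; the publication name, article URL, and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.mysite.com/content/this-article.html</loc>
    <news:news>
      <news:publication>
        <news:name>My Site</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2024-05-01</news:publication_date>
      <news:title>Example Article Title</news:title>
    </news:news>
  </url>
</urlset>
```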
Video Sitemap
It will let you submit your videos while associating them with meta-data and a thumbnail which will be shown in the SERP.
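Video entries use Google's video namespace to carry the meta-data and thumbnail. A minimal sketch, with placeholder file names on the example domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>http://www.mysite.com/videos/my-video.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.mysite.com/thumbs/my-video.jpg</video:thumbnail_loc>
      <video:title>Example Video Title</video:title>
      <video:description>A short description shown in the SERP.</video:description>
      <video:content_loc>http://www.mysite.com/videos/my-video.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```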
Geo Sitemap
It indexes files with geographic information, like KML and GeoRSS, which will be shown in applications like Google Maps and Google Earth.
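A KML file is itself a small XML document describing one or more geographic points. A minimal example placemark (the name and coordinates are placeholders; KML lists longitude before latitude):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>My Office</name>
    <Point>
      <coordinates>2.2945,48.8584,0</coordinates>
    </Point>
  </Placemark>
</kml>
```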