Ensuring your website is fully indexed and ranking well in search engines can be a long, hard struggle, especially if you don’t get your site off to the right start. Paying particular attention to the key on-page and off-page factors that form the essential basics of a solid SEO campaign can transform your website’s visitor numbers and lead-generation conversions into something worth shouting about.
Knowing the true current state of your website (as seen through the eyes of Google) helps you direct your efforts at the very areas that are letting down your online visibility, so the best place to start is with Google’s webmaster tools.
1 – Google Webmaster Tools
Set up an account if you don’t already have one, and within a few hours you will see data about your site’s performance in Google, detailing old or broken links, the number of pages indexed in Google, duplication of important meta tags and many more key areas.
By taking immediate action on any areas of concern flagged in the webmaster reports, you’ll soon have a cleaner site that complies with Google’s standards and benefits accordingly.
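Before any reports appear, you need to verify that you own the site. One common method is a site-verification meta tag in your home page’s head; the sketch below uses a placeholder token – Webmaster Tools generates the real content value for your account when you choose the meta-tag verification option:

```html
<head>
  <!-- Placeholder token: Webmaster Tools supplies the real value
       for your account; this string is illustrative only -->
  <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
  <title>Your Home Page</title>
</head>
```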
2 – Meta Tag Data
Ensure all your pages contain the vital meta tags:
a) Meta Title
70 characters max – Primary and secondary keywords to be used here.
b) Meta Keywords
7-10 keywords max, related to the page’s theme, primary keywords first.
c) Meta Description
No more than 150 characters. Your main keyword should appear here, along with a short description of the page’s content. This text is visible on search results pages, so it needs to grab the visitor’s attention.
Here’s a great example, from Zappos: “Free shipping BOTH ways on shoes, clothing, and more! 365-day return policy, over 1000 brands, 24/7 friendly customer service.”
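Putting those three tags together, a page’s head section might look like the sketch below. The shop name and keywords are invented for illustration; the description reuses the Zappos example above:

```html
<head>
  <!-- a) Meta title: 70 characters max, primary keywords first -->
  <title>Running Shoes &amp; Trainers | ExampleShoes</title>
  <!-- b) Meta keywords: 7-10 max, related to the page's theme -->
  <meta name="keywords" content="running shoes, trainers, mens running shoes, womens trainers" />
  <!-- c) Meta description: under 150 characters, shown in search results -->
  <meta name="description" content="Free shipping BOTH ways on shoes, clothing, and more! 365-day return policy, over 1000 brands, 24/7 friendly customer service." />
</head>
```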
3 – Get your keywords into the right places.
a) The priority here is the H1 page header – your primary keyword should be the first words in it.
b) Other headers (H2, H3, H4) – use primary and secondary keywords here too.
c) Body copy – aim for a minimum of 250-300 words per page. A good rule of thumb for keyword density across the whole page (ignoring navigation links) is around 4-7%. Don’t overdo it: keyword stuffing is frowned upon by Google, so remember to write your content for people, not search engine robots, or your site may get penalised.
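The heading and body-copy advice above might look like this in practice. All page content here is invented for illustration, assuming “running shoes” is the primary keyword:

```html
<body>
  <!-- a) H1: primary keyword appears first -->
  <h1>Running Shoes for Every Distance</h1>
  <!-- b) H2/H3: primary and secondary keywords -->
  <h2>Lightweight Trainers for Road Running</h2>
  <p>
    Body copy of at least 250-300 words, mentioning running shoes and
    trainers naturally a handful of times rather than stuffing every
    sentence with keywords...
  </p>
</body>
```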
4 – XML Sitemap
Imagine driving around somewhere for the first time without a map – you’d get lost. The same principle applies to websites in the eyes of a search engine: if the search engine can’t easily find your pages or work out what they are and where they are located, your pages won’t get indexed very quickly and Google’s understanding of your site as a whole will be incomplete.
Simply adding an XML sitemap allows search engines, once they are told the location of the file (see step 5), to access each and every page of your website, enabling much faster and more complete indexing. This is especially important if your site has dynamically generated content that keeps growing through a content management system (CMS) – blogs, news, online shop features and so on. A dynamic sitemap ensures that every new page you add through your CMS is automatically added to your XML sitemap, without you having to worry about it.
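As a rough sketch, a minimal XML sitemap follows the sitemaps.org protocol and looks like this (the URLs and dates are placeholders for illustration):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog/latest-post/</loc>
    <lastmod>2012-01-14</lastmod>
  </url>
</urlset>
```

A dynamic sitemap is simply this same file regenerated by your CMS whenever new content is published.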
5 – Robots.txt file
This file is important: it not only informs search engines of the location of your XML sitemap, but also details which pages, folders and files you don’t want them to crawl and index. This matters if you have duplicate content pages – perhaps for testing different versions of page layouts or advertising landing pages – because it lets you disallow search engines from reading and indexing that specific content.
A robots.txt file is one of the first files a search engine spider looks for on your web server, so adding it with the location of your XML sitemap gives Google the most important site data it needs from the start – the location of ALL your website’s content.
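A minimal robots.txt along those lines might read as follows (the domain and folder paths are placeholders for illustration):

```text
# Lives at the root of the site, e.g. http://www.example.com/robots.txt

# Tell crawlers where the XML sitemap lives
Sitemap: http://www.example.com/sitemap.xml

# Keep duplicate/test content out of the crawl
User-agent: *
Disallow: /landing-page-tests/
Disallow: /print-versions/
```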