in the early days of search engine optimization, it was common practice to try and “stuff” the meta keyword field with various keywords that a webmaster wanted associated with a given web page. spam tactics took hold early and often.

the meta description tag suffered the same fate as the meta keyword tag. many people tried spamming the field to influence/trick the search engines into thinking a page had relevance it didn't actually have.

as a result of all this tomfoolery, Google and other engines began to discount the value of the meta tags (keyword and description) and their content.

while i recommend against adding meta keywords to every page of your site (waste of time), i highly recommend creating unique, human-friendly meta description tags for every page of your site.

while Google may no longer assign ranking value to the meta description tag, it still plays a BIG role in conveying a strong marketing message and in overall usability.
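as a quick sketch of what a well-formed description tag looks like, here's a tiny helper that builds one and warns when the text runs long (the ~160-character cap is a common rule of thumb for what engines display in a snippet, not an official limit; the description text below is made up):

```python
# sketch: compose a meta description tag and sanity-check its length.
# the 160-character guideline is a rough snippet-display heuristic,
# not a documented limit.
from html import escape

def meta_description(text, max_len=160):
    """Return a <meta> description tag, warning if the text may get truncated."""
    if len(text) > max_len:
        print(f"warning: description is {len(text)} chars; it may be cut off")
    return f'<meta name="description" content="{escape(text, quote=True)}">'

tag = meta_description(
    "Find the largest selection of dry dog food on the internet. "
    "Free shipping on orders over $75."
)
print(tag)
```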

try the following two examples on for size:

Dry Dog Food from
Find the largest selection of dry dog food on the internet at We offer dry dog foods from brands such as Purina, Gravy Train, Tasty Eats and more. Free shipping on orders over $75.

dog food
HOME | DOG FOOD | CAT FOOD | ABOUT US Terms and Conditions – Privacy Policy – Copyright…

which one would you rather click on?

the first example clearly speaks to the content found on the landing page and lets the end user make a more intelligent decision about which link to follow.

the second example has an un-optimized title tag and no meta description tag at all. when you omit the meta description tag, the search engine will have a tendency to grab the first pieces of text it can find on the page and use that as the description. in many cases, that text will be a top navigation bar, hence the “home”, “about us”, etc.

as you can see, the meta description tag can go a long way toward influencing your conversion rates.

a more qualified lead will almost always convert better than a casual observer who isn’t sure what they’ll find when they click a nondescript link in the search results.

keep all this information in mind when building and/or optimizing your site. your customers will thank you.

what every website owner needs to understand is that the most important aspect of their website is its content. this might sound silly, but it can’t be stated enough.

i think too many people lose sight of this fact.

the more new, relevant, keyword-optimized content you add to your site on a regular basis, the happier the search engines will be with it.

try to make it a habit to add new pages on a regular basis — say, three new content pages per week, every week.

adding content on a regular basis gets noticed by the spiders and gives them reason to revisit your site more often. always a good thing.

and don’t forget to give each page a unique, keyword-optimized title tag and meta description.
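to see why "unique" matters, here's a small sketch that audits a handful of pages for duplicate titles and missing descriptions (the URLs and tag text are made-up placeholders, not a real crawl):

```python
# sketch: flag pages that share a title tag or are missing a meta description.
# the page paths and tag text below are hypothetical examples.
from collections import Counter

pages = {
    "/dog-food": ("Dry Dog Food | Example Store", "Shop top dry dog food brands."),
    "/cat-food": ("Cat Food | Example Store", "Shop top cat food brands."),
    "/about":    ("Dry Dog Food | Example Store", ""),  # duplicate title, no description
}

def audit(pages):
    problems = []
    title_counts = Counter(title for title, _ in pages.values())
    for url, (title, desc) in pages.items():
        if title_counts[title] > 1:
            problems.append(f"{url}: duplicate title")
        if not desc:
            problems.append(f"{url}: missing meta description")
    return problems

for problem in audit(pages):
    print(problem)
```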

also, make sure you provide the spider an easy way to find these new pages with plain text hyperlinks pointing to them. and for that matter, put a keyword or three into the hyperlinks that speak to the content of the new landing page. optimized internal linking goes a long way with the engines.

now if only i had the wherewithal to follow my own advice and add content regularly.

: (

a common problem with large websites that are driven by content management systems (CMS) is over-indexation.

what the heck is that?

a website can end up hurting itself based on how its URL strings are generated. i’ll explain…

let’s say you visit the following site:

then you click on a category page called Category 1 and the resulting URL is:

now let’s say that you visit the home page, go to a different category page (maybe Category 2), then from that category page you click on the first category page (Category 1). this time, the exact same category page of content loads, but the URL string now reads:

so, because of the different URL string, you really have two versions of the exact same category page.

this is not only common with CMS driven websites but also very common with various types of website user tracking solutions (a method to track a user’s path through a site).

this problem also exists with websites that use session IDs to track a user through the site, appending each URL string with a unique session ID number. every new visit by a user (or a search engine spider, for example) generates a brand new iteration of every page they follow, because the URL string for a static page looks different every time. such as:


both of these URLs are the exact same page, but because of the appended session ID string at the end, the engines see these as multiple pages, not one.
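one common fix is to canonicalize URLs by stripping the tracking parameter before the engines ever see it. a minimal sketch, assuming the session parameter is named something like "sessionid" or "sid" (substitute whatever your platform actually appends; the URLs below are made up):

```python
# sketch: strip session-id query parameters so two iterations of the same
# page collapse to one canonical URL. the parameter names and URLs are
# hypothetical examples.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {"sessionid", "sid"}

def canonicalize(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TRACKING_PARAMS]
    # rebuild the URL without the tracking params (and without a fragment)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

a = canonicalize("http://www.example.com/dogfood.html?sessionid=12345")
b = canonicalize("http://www.example.com/dogfood.html?sessionid=67890")
print(a == b)  # the two session-id iterations now compare as one page
```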

so what’s the problem?

the problem is that search engines see multiple versions of a single page and:

1. think you are trying to submit more than one copy of a page in order to spam the index (not nice)
2. the engine must now try and determine which version of the page is the most important or most relevant (diluting the effectiveness of the page as a whole)

you should never let the engines try and decipher on their own what version of a page is the right version of a page.

each page of content on your site should have a single URL assigned to it, one that is unique from every other page on your site.

once you’ve overcome your URL obstacles, don’t forget to 301 redirect the legacy iterations of each page to the proper version.
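in practice the 301s usually live in the web server config (mod_rewrite and the like), but here’s a minimal sketch of the lookup logic, with hypothetical legacy and canonical paths:

```python
# sketch: map legacy URL iterations to their canonical version with a
# permanent (301) redirect. the paths below are made-up examples; in a
# real deployment this mapping typically lives in the web server config.
REDIRECTS = {
    "/index.php?cat=1&ref=2": "/category-1",
    "/index.php?cat=1":       "/category-1",
}

def respond(path):
    """Return (status, location) for a requested path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]  # permanent redirect to the canonical URL
    return 200, path                 # canonical pages serve as-is

print(respond("/index.php?cat=1"))
```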