one excellent way to make your website better optimized for natural search is to use search engine friendly URL strings. in the “old” days, a search-engine-friendly URL was the difference between something like:

http://www.gravy.com/good/gravy.html

and a dynamic, database-driven URL string along the lines of:

http://www.grossgravy.com/index.asp?pid=156263&cid=33452&msscsid=1293n8913n9923h97123b78g9512h35945

the latter used to really annoy the search engine spiders: they are designed for scale and had a hard time parsing such monstrosities. they would hit a URL like that and simply disregard it.

the bots got smarter over time, first learning to understand URL strings with one or two dynamic variables (?, &, =, etc.), and they can now understand and index URLs with multiple variables.

that is not to say that they appreciate these types of URLs; my guess is that they would prefer the shorter, more SEO-friendly strings over the dynamic strings any day. in addition, it has long been thought that the shorter the URL string, the better optimized it is and the more relevance/weight the bots give it. the understanding was that the further down the URL path a page sits, the less relevant its content must be (this was with regard to URLs that used folder structures). for example:

http://www.gravy.com/tasty.html

was given more weight/relevance than the following URL…

http://www.gravy.com/peanut/buttery/tasty/gravy.html

i’m no longer 100% sold on the idea that the latter URL is given less weight. i strongly believe that as long as the string is bot-friendly, sitting a few folder levels down in the hierarchy will not penalize a given page. this is open for debate, of course.

even so, i still follow the practice of keeping URL strings as short as possible, with top-level category pages no more than one folder level off the root and sub-category pages no more than two. old habits die hard.

since the bots are better at indexing complex URL strings, the new(er) factor to consider when architecting your URLs is injecting keyword text and phrases into the strings themselves. this is fairly easy to do if your site is a basic HTML-type site, since you can create real folders and name your HTML files yourself. on dynamically driven websites, you (or your developer) need a good understanding of URL-rewriting techniques, such as mod_rewrite on the Apache web server or ISAPI Rewrite for IIS (Windows) servers.
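for example, on Apache you could drop a mod_rewrite rule into an .htaccess file to serve a keyword-rich URL from the real dynamic script behind it. a minimal sketch, where the script name (product.php) and the slug parameter are hypothetical:

# .htaccess — minimal mod_rewrite sketch (script name and parameter are made up)
RewriteEngine On

# don't rewrite requests for real files or directories
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d

# serve e.g. /peanut-buttery-gravy/ from /product.php?slug=peanut-buttery-gravy
RewriteRule ^([a-z0-9-]+)/?$ product.php?slug=$1 [L,QSA]

the visitor (and the bot) only ever sees the clean keyword-rich URL; the query-string monstrosity stays hidden on the server side.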

the best practice is to separate words with hyphens “-” to simulate a “space” between them. Google has recently claimed that underscores “_” are acceptable as well, but i would stick with hyphens (old habits).
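if you already have underscore URLs live, one rough approach on Apache is to 301-redirect them to their hyphenated equivalents. the sketch below fixes one underscore per pass, so a URL with several underscores will chain a few redirects before settling:

# redirect e.g. /dog_food_tips.html to /dog-food-tips.html, one underscore per pass
RewriteEngine On
RewriteRule ^([^_]*)_(.*)$ /$1-$2 [R=301,L]

chained 301s are not ideal, so if your pages are underscore-heavy, fixing the links at the source is the cleaner route.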

if you have a website that sells products of varying types that can be categorized well, you would do yourself a favor by optimizing your URL strings with keywords. let’s say you sell pet products online and you have a category for dog foods. a possible URL structure might be:

http://www.petproducts.com/dog-food/puppies/
or
http://www.petproducts.com/dog-food/adult/

or something like…

http://www.petproducts.com/dog-food.asp
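tying this back to the URL-rewriting discussion above: on Apache, the category/sub-category form could be mapped to the underlying dynamic script with something like the sketch below (the script name and parameter names are made up; ISAPI Rewrite on IIS uses a very similar syntax for .asp sites):

# map e.g. /dog-food/puppies/ to /products.php?cat=dog-food&sub=puppies
RewriteEngine On
RewriteRule ^([a-z0-9-]+)/([a-z0-9-]+)/?$ products.php?cat=$1&sub=$2 [L,QSA]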

then, if the landing page that the keyword-rich URL points to contains the same keywords as the URL string, you are creating a better-optimized page than if the URL had no keywords injected into it at all.

this is really the coup de gravy of optimization… having a keyword-rich URL string, with the same keywords in your Title tag and in your body copy.
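as a rough illustration (the page content here is made up), the landing page at http://www.petproducts.com/dog-food/puppies/ might look like:

<!-- hypothetical landing page served at /dog-food/puppies/ -->
<html>
<head>
  <!-- the same keywords that appear in the URL string -->
  <title>Puppy Dog Food | Pet Products</title>
</head>
<body>
  <h1>Dog Food for Puppies</h1>
  <p>our puppy dog food is formulated for growing dogs...</p>
</body>
</html>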

now go out there and force your web developers to create keyword-optimized URL strings.
