i have the pleasure of working with some of the largest retailer and publisher brands in the US. but on occasion i have the opportunity to work on something a little different. i don’t know if this has ever really been covered on the blogosphere (i haven’t checked), but i thought i’d write a post regarding non-profit (or not-for-profit) organizations and search engine optimization.

even though an NPO website’s core purpose is not to “sell” anything, in reality “sales” and “conversions” are every bit as important for an NPO website as they are for any retailer website. driving traffic to the NPO site and trying to convert those visitors into donors is the name of the game. very similar to a retailer driving traffic to their site and hoping that visitors will buy their products.

what makes an NPO unique within the world of search engine optimization?

the fact that they raise money for a cause, not for financial gain’s sake, and the fact that many people can relate to goodwill causes.

so what?

these ideals are what make NPOs a “natural” for gaining quality, relevant inbound links, as well as for gaining attention simply by making users aware of the organization.

i can try all i want to get people to link to a retailer site and the success rate is painfully low. but, i can target the right websites and request a link back to an NPO and the success rate is much higher.

in addition, most if not all NPOs have third-party affiliations of some kind, and those affiliates are more than happy to link to the NPO site. these affiliations can be in the form of corporate sponsors, politicians, government agencies, celebrities, etc. that’s a large pool of possible high-quality inbound links. and at no cost.

wow, aren’t all us website owners jealous now? (i am)

but seriously, the pool of resources available to an NPO is just endless. retailers could only hope to have this kind of influence over inbound links (even if they paid for them).

but the opportunity/advantage doesn’t stop with links; it also shows up in an NPO’s ability to drive traffic to its site by increasing awareness on the web.

what NPOs understand all too well is that more people would donate to various causes if they were aware those causes existed.

funding a documentary or any sort of traditional media tactic (TV, print, radio) costs an NPO a lot of money, but an NPO can run press releases on a regular basis and distribute them over the web cheaply. via syndication, an NPO can push a tremendous amount of awareness-building information out to the masses for very little cost.

in addition, the creation and distribution of video is much cheaper now than it has ever been. the opportunities for NPOs to disseminate their video assets and increase awareness are great.

also, just like retailers who get new product in every year or season, medical breakthroughs and environmental discoveries happen on a regular basis. these are opportunities for NPOs to “spread the word” via the web.

in short, non-profits should take heed and realize the vast potential SEO holds for their organization and its long-term effect on fundraising.

wow, Google is now allowing site-owners to submit an XML sitemap feed to help index all of their site’s video content.

very cool.

this is a great supplement to the optimizing video for natural search post i ran in the past.

now you can provide some meta data around each video file on your site (a title and a description).

this is a major step forward for those with websites whose video content is not indexed or optimized for natural search at all.
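
to give an idea of what such a feed involves, here’s a rough sketch in python that builds one. the urls, titles and descriptions are made up for illustration, and the exact element names/namespace should be double-checked against Google’s video sitemap documentation before you submit anything (i’m going from memory here):

# build a simple video sitemap feed; values below are placeholders
from xml.sax.saxutils import escape

videos = [
    {
        "page": "http://www.example.com/videos/how-to-make-gravy/",
        "file": "http://www.example.com/media/how-to-make-gravy.flv",
        "title": "How to Make Gravy",
        "description": "A five minute walkthrough of making gravy from pan drippings.",
    },
]

def build_video_sitemap(videos):
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
        '        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">',
    ]
    for video in videos:
        lines.extend([
            "  <url>",
            "    <loc>%s</loc>" % escape(video["page"]),
            "    <video:video>",
            "      <video:content_loc>%s</video:content_loc>" % escape(video["file"]),
            "      <video:title>%s</video:title>" % escape(video["title"]),
            "      <video:description>%s</video:description>" % escape(video["description"]),
            "    </video:video>",
            "  </url>",
        ])
    lines.append("</urlset>")
    return "\n".join(lines)

print(build_video_sitemap(videos))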

i wrote in the past about Google offering excellent low- or no-cost website tools for the small to medium-sized business website. they also offer an excellent set of site analysis features called Google Webmaster Tools, which gives you detailed insight into linking information, crawler errors, XML sitemap feeds and a view into how Google “sees” your site.

well as of today, Google Webmaster Tools has made itself even better…

there is a new “content analysis” feature that gives you insight into: title tag issues (duplicate or too-short titles, for example), meta description tag issues and non-indexable content issues (flash, images, etc.).

how freakin’ cool.

there is also additional insight into Google’s crawl of your XML sitemap feed and any errors it encounters with that crawl.

the free tools that Google is offering the website owner these days are fantastic. keep ’em coming Google…

one excellent way to better optimize your website for natural search is to have search engine friendly URL strings. in the “old” days, a search engine friendly URL string was the difference between something like:

http://www.gravy.com/good/gravy.html

and something along the lines of dynamic, database-driven website URL strings such as:

http://www.grossgravy.com/index.asp?pid=156263&cid=33452&msscsid=1293n8913n9923h97123b78g9512h35945

the latter used to really annoy the search engine spiders as they were/are designed for scale and had a hard time parsing such monstrosities. they would hit something like that and would just disregard it.

the bots became smarter and better over time, first learning to understand URL strings with one or two dynamic variables (?, &, =, etc.) and now able to understand/index URL strings with multiple variables.

that is not to say that they appreciate these types of URLs; my guess is that they would prefer the shorter, more SEO-friendly strings over the dynamic strings any day. in addition, it has long been thought that the shorter the URL string, the more search engine optimized it is and the more relevance/weight the bots give it. the understanding was that the further down the URL path, the less relevant the content must be (this was with regard to URLs that used folder structures). for example:

http://www.gravy.com/tasty.html

was given more weight/relevance than the following URL…

http://www.gravy.com/peanut/buttery/tasty/gravy.html

i’m not 100% sold on the idea that the latter URL is given less weight any longer. i strongly believe that as long as the string is bot-friendly, being a few folder levels down in the hierarchy will not penalize a given page. this is open for debate of course.

even so, i still follow the practice of using shorter URL strings as much as possible with the higher level category pages being no more than one folder level off the root and sub-category level pages being two folder levels off the root. old habits die hard.

since the bots are better at indexing complex URL strings, the new(er) factor to consider when architecting your URLs is to inject keyword text and phrases into your URL strings. this is fairly easy to do if your site is a basic HTML-type site, since you have the ability to create real folders and actually name your HTML files yourself. on dynamically driven websites, you really have to be a developer with experience creating dynamic websites and a good understanding of URL re-writing techniques using the Apache web server (mod_rewrite) or ISAPI Rewrite for IIS (Windows) servers.

the best practice is to separate words by hyphens “-” to simulate a “space” in the words. Google has recently claimed that underscores “_” are acceptable as well but i would stick with hyphens (old habits).
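
if your page names are generated programmatically, a tiny helper that lowercases the text, drops punctuation and joins the words with hyphens keeps things consistent. a quick python sketch (the function name and rules are just my own convention):

import re

def slugify(text):
    # lowercase, strip anything that isn't a letter, number, space or hyphen,
    # then join the remaining words with hyphens
    text = text.lower()
    text = re.sub(r"[^a-z0-9\s-]", "", text)
    return "-".join(text.split())

# "Dry Dog Food for Puppies" -> "dry-dog-food-for-puppies"
print(slugify("Dry Dog Food for Puppies"))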

if you have a website that sells products of varying types but that can be categorized well, you would do well to optimize your URL strings with keywords. let’s say you sell pet products online and you have a category for dog foods. a possible URL structure to follow might be:

http://www.petproducts.com/dog-food/puppies/
or
http://www.petproducts.com/dog-food/adult/

or something like…

http://www.petproducts.com/dog-food.asp

then, if the landing page that the keyword-rich URL points to contains the same keywords that the URL string contains, you are really creating a better optimized page than if the URL didn’t have any keywords injected into it at all.

this is really the coup de gravy of optimization… having a keyword-rich URL string, with the same keywords in your Title tag and in your body copy.

now go out there and force your web developers to create keyword optimized URL strings.
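
to give them a head start: i mentioned Apache and ISAPI Rewrite above, but another way to get there on a dynamically driven site is to have the application itself map the keyword path straight to the internal category id. a minimal sketch in python using the Flask framework, with a made-up lookup table standing in for the real product database:

from flask import Flask, abort

app = Flask(__name__)

# hypothetical mapping of keyword url segments to internal category ids
CATEGORY_IDS = {
    ("dog-food", "puppies"): 101,
    ("dog-food", "adult"): 102,
}

@app.route("/<category>/<life_stage>/")
def category_page(category, life_stage):
    category_id = CATEGORY_IDS.get((category, life_stage))
    if category_id is None:
        abort(404)
    # a real site would render the category template here;
    # a placeholder string keeps the sketch self-contained
    return "category page for internal id %d" % category_id

if __name__ == "__main__":
    app.run()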

if you have a website with a lot of pages (one hundred or more) and it is well indexed in the search engines, you really have a great resource for developing your own organic link building campaign.

link building is the act of soliciting other websites to create hyperlinks that point back to your site, in order to increase your ranking and relevance to the search engines.

paid link building entails paying someone to link back to you. organic link building is obtaining links that don’t cost you a dime.

but what many people seem to forget is that their own web site is a possible resource for quality, keyword-optimized links.

let me explain…

over time, your site will build up PageRank. and if you follow good SEO tactics and continually add quality content to your site, you will have a web site with a PageRank of four/five or better.

now this means that any plain text hyperlink you put on your site to link to other pages within your site will pass along a decent amount of PageRank and will help provide some weight behind whatever keyword phrase you utilize within that hyperlink.

the engines, especially Google, like to see keyword optimized text within hyperlinks that subsequently link to landing pages that contain the same text within their Title tags, their header tag(s), their body copy, etc.

so technically, just by utilizing your own website, you can create a decent amount of link popularity around a few key terms by creating the right hyperlinks and pointing them to their respective landing pages.

if you have a handful of terms that you consider the most important to your website, consider creating a single landing page for each one of these terms and then creating a site-wide footer with these keyword optimized hyperlinks that point to these respective pages.

in essence you would be passing a good deal of PageRank/weight/relevance for these terms (by having these links on every page of your site) and applying this to these landing pages.
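
here’s a minimal sketch, in python, of generating that kind of footer block. the keyword phrases and landing page paths are placeholders — swap in your own:

# generate a site-wide footer block from a handful of key terms, each pointing
# at its own landing page, with the keyword phrase as the anchor text
FOOTER_LINKS = {
    "dry dog food": "/dog-food/dry/",
    "puppy food": "/dog-food/puppies/",
    "cat food": "/cat-food/",
}

def build_footer(links):
    items = []
    for phrase, path in sorted(links.items()):
        items.append('<a href="%s">%s</a>' % (path, phrase))
    return '<div id="footer-links">' + " | ".join(items) + "</div>"

print(build_footer(FOOTER_LINKS))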

couple that with an organic link building campaign outside of your website and you’ll be sitting pretty.

in the early days of search engine optimization, it was common practice to try and “stuff” the meta keyword field with various keywords that a webmaster wanted associated with a given web page. spam tactics took hold early and often.

the meta description tag had suffered the same fate as the meta keyword tag. many people tried spamming the field to influence/trick the search engines into thinking a page had certain relevance that was not necessarily true.

as a result of all this tomfoolery, Google and other engines began to discount the value of the meta tags (keyword and description) and their content.

while i recommend against adding meta keywords to every page of your site (waste of time), i highly recommend creating unique, human-friendly meta description tags for every page of your site.

while Google may no longer assign ranking value to the meta description tag, it plays a BIG role in conveying a strong marketing message in the search results and from a usability standpoint.

try the following two examples on for size:

example 1:
Dry Dog Food from DogFoodExperts.com
Find the largest selection of dry dog food on the internet at DogFoodExperts.com. We offer dry dog foods from brands such as Purina, Gravy Train, Tasty Eats and more. Free shipping on orders over $75.

example 2:
dog food
HOME | DOG FOOD | CAT FOOD | ABOUT US Terms and Conditions – Privacy Policy – Copyright…

which one would you rather click on?

the first example clearly speaks to the content found on the subsequent landing page, and it lets the end user make a more intelligent decision about which link to follow.

the second example has an un-optimized title tag and no meta description tag at all. when you omit the meta description tag, the search engine will tend to grab the first piece of text it can find on the page and use that as the description. in many cases, that text will be a top navigation bar, hence the “home”, “about us”, etc.
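
if you want a quick-and-dirty way to find the offenders on your own site, a little python script like the following will flag pages that are missing a meta description or that share one with another page. the urls are placeholders and the regex approach is admittedly crude:

import re
import urllib.request

PAGES = [
    "http://www.example.com/",
    "http://www.example.com/dog-food/dry/",
]

# rough pattern; assumes the name attribute comes before the content attribute
DESC_RE = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
    re.IGNORECASE | re.DOTALL,
)

seen = {}
for url in PAGES:
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
    match = DESC_RE.search(html)
    if not match:
        print("missing meta description:", url)
        continue
    description = match.group(1).strip()
    if description in seen:
        print("duplicate description:", url, "matches", seen[description])
    else:
        seen[description] = url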

as you can see, the meta description tag can go a long way toward influencing your conversion rates.

a more qualified lead will almost always convert better than a casual observer who is not sure of what they’ll find when they click on a non-descript link in the search results.

keep all this information in mind when building and/or optimizing your site. your customers will thank you.

what every website owner needs to understand is that the most important aspect of their website is its content. this might sound silly, but it can’t be stated enough.

i think too many people lose sight of this fact.

the more new, relevant, keyword-optimized content you add to your site on a regular basis, the happier the search engines will be with your site.

try to make it a habit to add new pages on a regular basis, like three new content pages per week, every week.

the addition of content on a regular basis gets noticed by the spiders and gives them reason to re-visit your site more often. always a good thing.

and don’t forget to give each page a unique, keyword optimized title tag and meta description.

also, make sure you provide the spiders an easy way to find these new pages, with plain text hyperlinks pointing to them. and for that matter, put a keyword or three into each hyperlink that speaks to the content of the new landing page. optimized internal linking goes a long way with the engines.

now if only i had the wherewithal to follow my own advice and add content regularly.

: (

a common problem with large websites that are driven by content management systems (CMS) is the case of over indexation.

what the heck is that?

a website can end up hurting itself based on how its URL strings are generated. i’ll explain…

let’s say you visit the following site:

http://www.overindexed.com/

then you click on a category page called Category 1 and the resulting URL is:

http://www.overindexed.com/cat1.asp?=linkHome

now let’s say that you visit the home page, go to a different category page (maybe Category 2), and then from that category page you click through to the first category page (Category 1). this time, the exact same page of content loads, but the URL string now reads:

http://www.overindexed.com/cat1.asp?=cat2

so you really have, because of the different URL string, two versions of the exact same category page.

this is not only common with CMS driven websites but also very common with various types of website user tracking solutions (a method to track a user’s path through a site).

this problem also exists with websites that use session ids to track a user through the site by appending a unique session id number to the URL strings. every new visit by a user (or a search engine spider) generates a brand new iteration of every page they follow, because the URL string for what is really a static page looks different every time. for example:

http://www.overindexed.com/index.asp?mscssid=H9TQE6XGE5SP8J6AMGSHN5NXNN1F8TRE

and

http://www.overindexed.com/index.asp?mscssid=S6WGD5…

both of these URLs are the exact same page, but because of the appended session id string at the end, the engines see them as multiple pages, not one.

so what’s the problem?

the problem is that search engines see multiple versions of a single page and:

1. they think you are trying to submit more than one copy of a page in order to spam the index (not nice)
2. they must now try to determine which version of the page is the most important or most relevant (diluting the effectiveness of the page as a whole)

you should never make the engines decipher on their own which version of a page is the right one.

each page of content on your site should have a single URL assigned to it, and that URL should be unique from every other page on your site.

if you are able to overcome your URL obstacles, don’t forget to 301 redirect the legacy iterations of each page to its proper version once the site is fixed.
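
here’s a rough python sketch of what that canonicalization might look like: strip the session/tracking parameters out of the query string and, if anything was removed, 301 the request over to the cleaned-up URL. the parameter names are just examples:

from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

# session/tracking parameters to strip; "mscssid" and "ref" are placeholders
TRACKING_PARAMS = {"mscssid", "ref"}

def canonical_url(url):
    # drop any query parameters that only exist for tracking purposes
    parts = urlsplit(url)
    kept = [(key, value) for key, value in parse_qsl(parts.query)
            if key.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

url = "http://www.overindexed.com/index.asp?mscssid=H9TQE6XGE5SP8J6AMGSHN5NXNN1F8TRE"
clean = canonical_url(url)
if clean != url:
    # in the web application itself, this is where you would answer
    # with a 301 redirect to the cleaned-up url
    print("301 ->", clean)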

i spoke previously about 301 Permanent Redirects and how they are an effective way to maintain/pass along your site’s PageRank and inbound links to the new site and/or landing page.

one thing i’d like to mention: when 301’ing a site (or its landing pages) from one domain to a completely new domain, Google may take some time to resolve the change within its index.

for example, if your “old” site is:

www.domain1.com

and the new site is:

www.newdomain.com

and you’ve 301 redirected the home page of the old site to the home page of the new one, and redirected all the major landing pages to the new site’s URL structure.

even with all of that in place, the change in the actual domain name may cause up to a 2-3 month lag in resolution within Google. scary indeed.

patience is definitely a virtue here.
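
while you wait it out, it’s worth confirming that the old URLs really are returning a 301 with a Location header pointing at the new domain. a quick python sketch (the domains and paths are placeholders):

import http.client

OLD_HOST = "www.domain1.com"
NEW_HOST = "www.newdomain.com"
PATHS = ["/", "/dog-food/dry/"]

for path in PATHS:
    # ask the old host for each page and inspect the status and Location header
    conn = http.client.HTTPConnection(OLD_HOST)
    conn.request("HEAD", path)
    response = conn.getresponse()
    location = response.getheader("Location") or ""
    if response.status == 301 and NEW_HOST in location:
        print("ok:", path, "->", location)
    else:
        print("check this one:", path, response.status, location)
    conn.close()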

the typical result i’ve seen after the resolution is a return to previous site rankings/traffic, followed by marked increases in the months after that.

: )

this is merely a rant to say that i can’t stand the term “web 2.0”. the term has no real meaning and no real definition.

i have such a hatred for the term because i am surrounded by it on a daily basis and yet no one can explain it to me fully or give me examples of what it is.

some things that get mentioned are:

video optimization
blogging
AJAX
user reviews and interaction
wikis
blah blah blah

stupid. i want to gag myself when people try to sound like they know what they are talking about by using the term web 2.0

end of rant.