Tuesday, December 18, 2007

Getting Listed on Search Engines Part III - Directories

There are a number of ways to get listed in a search engine, and quite possibly the slowest is direct site submission. Inbound and outbound links are very important to search engines. One popular way to gain inbound links to your website is by getting listed in a directory.

A web directory is a site on the World Wide Web that specializes in linking to other web sites and categorizing those links. Listings are generally based upon the whole web site rather than a single page or a set of keywords, and inclusion is usually limited to one or two categories. Once you submit to a directory, its editors will evaluate your site and approve or deny the listing.

Most Popular Directory

The dmoz Open Directory Project is the largest, most comprehensive human-edited directory. Many search engines use listings from the Open Directory Project, including AOL Search, Netscape, Google, Lycos and HotBot. According to their website, hundreds of other search engines use their data as well.

Most importantly, the Open Directory Project is 100% free: there is no charge to submit a site or to use the data from their directory. Upon submission, sites are reviewed by volunteer editors, who determine whether or not they are suitable for listing.

Open Directory Project at dmoz.org

Other Directories
There are other directory services besides the dmoz Open Directory Project.

Business.com is the leading directory for business products and services, with over 65,000 categories.

Yahoo! Directory is operated by Yahoo! and can increase the likelihood that your site will eventually appear in their search engine results pages. For more information on submission, visit their Directory Help Page.

The Librarians' Internet Index is maintained and organized by librarians into 14 main topics and nearly 300 related topics.

How Directories Help Listings

Getting your web site listed in good directories is one of the best ways to get quality inbound links, which are highly respected by search engines. Submissions to these directories are reviewed and approved by human editors and organized into categories and subcategories. Pages listed in directory categories therefore tend to have high quality and relevance, and search engines take this into account.

Questions, comments and criticism are always welcome; search engine optimization and web development are lifelong learning adventures.

Friday, November 30, 2007

Getting Listed On Search Engines Part II - Blogs

The best way to get listed and rank highly is to use as many SEO techniques as you can. The happier you make the search engines, the higher your pages will soar in the result pages.

The Art and Zen of Blogging
Search engines really enjoy blogs because they have RSS feeds, frequent content updates and lots of links. On Blogger, if you have post pages enabled in the archive settings, each post gets its own page with a search-engine-friendly name generated from your title, linked from your front page.
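
As a rough illustration, that title-to-filename transformation looks something like the following Python sketch (the exact rules Blogger applies are an assumption here; this just shows the general lowercase-and-dashes idea):

```python
import re

def slugify(title):
    """Turn a post title into a search-engine-friendly file name.

    Roughly mimics what Blogger does when post pages are enabled:
    lowercase the title, drop punctuation, and join the words with
    dashes. Blogger's exact rules are an assumption here.
    """
    title = title.lower()
    # Drop anything that is not a letter, digit, space or dash
    title = re.sub(r"[^a-z0-9\s-]", "", title)
    # Collapse runs of whitespace and dashes into single dashes
    return re.sub(r"[\s-]+", "-", title).strip("-") + ".html"
```

For example, `slugify("Hello, World!")` produces `hello-world.html`, which is exactly the kind of keyword-bearing file name search engines like.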

Try to start off with about 5 to 10 content-rich posts before you start pushing them to the search engines. Make sure these are not just articles copied and pasted from other people's websites. You can use other sites for ideas about writing posts, but keep the words your own. Plagiarism is frowned upon in school by teachers and staff; it is also frowned upon by search engines.

To make your blog even more exciting to search engines, create a free FeedBurner account and encourage people to subscribe to your site by adding the chicklets for My Yahoo!, iGoogle and RSS. Create a Yahoo! account if you do not already have one, add your feed to your My Yahoo! page, and make sure it sits on top of all the others. Adding your RSS feed to a My Yahoo! page gives you a higher chance of being indexed by Yahoo!, and likewise adding it to iGoogle helps with Google.

The addition of a Digg button for each article is also nice; Digg will create a page with a link to your article where people can leave comments. Make sure you digg your own blog posts so Digg is aware that your post exists. Your title and description should be relevant to what your post is about.

If at all possible, try to get some people to link to your blog too; the more links to your pages the merrier. Just make sure people use the keywords you want to rank for in the links to your site.

So you have about 5 to 10 entries of relevant, wonderful content. How do you tell the search engines you exist?

Blog Pinging
There are services that exist strictly for finding new blog entries and publishing them. Rather than waiting around for these services to come find you, pinging tells them, "Hey! I have updated my blog, come spider me!" In order to do this you must execute a blog ping.

Depending on your blog service, a ping may be sent out automatically whenever you publish a new post, but only to a few search engines. There are websites that will alert a bunch of blog search engines for you. These "blog pinging" websites should only be used for blogs; otherwise it is considered a spam ping, and you will upset the search engines and pinging services. Also, do not ping more than once per blog post: you tell them where you are, and they will eventually get to you.

When you do a ping, make sure that you select relevant services; if your blog is entirely in English, do not ping non-English services. Also make sure you select the big blog search engines:

  • technorati.com
  • Google Blog Search
  • my.yahoo.com
  • blo.gs - owned by Yahoo!
  • Feed Burner - owned by Google
Some of the blog pinging services include different search engines. I personally like to use the Blog Flux Pinger for pinging, along with some of the non-pinging blog tools provided by Pingoat.
  • Pingoat - Does not require registration and has some nice tools besides blog pinging: a spider simulator, a sitemap.xml generator and keyword analysis.
  • Ping-o-Matic! - Does not require registration; strictly a blog pinging utility. They report statistics on how many pings have been issued.
  • Blog Flux Pinger - Requires registration and requires that your blog be at least a week old or have 5 entries. They provide a bunch of other services, such as a directory, a page rank checker, free custom polls and a number of web hosting service reviews. It also supports all five of the services I listed above as being important.
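
Under the hood, all of these services speak the same tiny XML-RPC protocol: a single weblogUpdates.ping call carrying your blog's name and URL. Here is a minimal Python sketch; the Ping-o-Matic endpoint shown is its public XML-RPC URL, but treat any endpoint as an assumption and check the service's own documentation:

```python
import xmlrpc.client

def ping_payload(blog_name, blog_url):
    """Build the XML-RPC request body for a weblogUpdates.ping.

    This is the call every blog pinging service ultimately makes on
    your behalf.
    """
    return xmlrpc.client.dumps((blog_name, blog_url), "weblogUpdates.ping")

def ping_blog(service_url, blog_name, blog_url):
    """Actually send the ping to a service's XML-RPC endpoint.

    The response is a struct with 'flerror' and 'message' fields.
    """
    server = xmlrpc.client.ServerProxy(service_url)
    return server.weblogUpdates.ping(blog_name, blog_url)

# Example (endpoint assumed to be Ping-o-Matic's public XML-RPC URL):
# ping_blog("http://rpc.pingomatic.com/", "My Blog", "http://example.com/")
```

Remember the advice above: one ping per post, and only for actual blogs.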

Questions, comments and criticism are always welcome; search engine optimization and web development are lifelong learning adventures.

Getting Listed On Search Engines Part I - Where Am I

You can almost always track where your visitors are coming from, and there are some excellent tools to help you keep a good eye on your traffic. There is the roll-your-own variety, hosted on your own server, and the pre-made, already-hosted, ready-to-use third-party variety. My favorite approach is to use one of each if you can.

Self Hosted Website Traffic Analysis
My favorite tool in this genre is AWStats. This is a free tool that will closely examine your log files from Apache, IIS, WebSTAR, mail servers, WAP and proxy servers, streaming servers and some FTP servers. Based on the data collected from these log files, it generates graphs, charts and a plethora of other data about your servers.

AWStats reports the number of unique visitors, pages, hits and bandwidth, broken down by month, day of month, day of week and hour! AWStats will also tell you which country your visitors are from, which spiders have been to your site, when each spider was last seen, and how many hits they generated. This tool also reports duration of visits, entry/exit statistics by page, visitor operating system and browser, screen sizes, keywords and keyphrases. There are also plugins you can add to AWStats, or you can write your own. The possibilities are truly endless.

AWStats can be a bit cumbersome to set up, though, and requires you to know your way around Apache configuration files. You need to be able to set up Allow and Deny rules for the AWStats folder and have Apache generate custom log files that AWStats can read and use.
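
As a rough illustration, the relevant Apache pieces look something like this (the paths and the allowed address are examples of my own, not values AWStats requires):

```apache
# Have Apache write a combined-format log that AWStats can parse
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog /var/log/apache2/access.log combined

# Keep the generated statistics pages to yourself
<Directory "/usr/local/awstats/wwwroot">
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1
</Directory>
```

You would then point the `LogFile` directive in your awstats.conf at that same access log.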

The statistics that can be collected server side are very pure, since we don't have to depend on client-side scripts to say "Hey, I'm at your site and this is how I found you!" We do still have to depend on client-side scripts to retrieve screen sizes.

There are other free tools out there, such as Webalizer, WebTrends and W3Perl, but I have personally not had any experience with them.

3rd Party Website Traffic Analysis
My favorite tool in this genre is Google Analytics. If you don't have a Google account yet, go sign up for one! Once you sign up for this free service and create an analytics profile for your site, Google will give you a little piece of JavaScript to put on every page you would like to track. Google is really good about providing easy-to-follow, step-by-step instructions for getting this done.
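
For reference, the snippet Google hands you looks roughly like the classic Urchin version below; the UA-XXXXXX-X account number is a placeholder for your own, and the exact code may differ, so always paste what Google gives you rather than this sketch:

```html
<!-- Google Analytics (classic Urchin tracker); goes just before </body> -->
<script src="http://www.google-analytics.com/urchin.js" type="text/javascript"></script>
<script type="text/javascript">
_uacct = "UA-XXXXXX-X";  /* placeholder: your own account number */
urchinTracker();
</script>
```

It needs to appear on every page you want tracked, which is why putting it in a shared footer template is the usual approach.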

You can have up to 50 different profiles set up, which means you can track up to 50 different web sites from one account. There are approximately 80 different reports, each of which can be shaped to your liking. You can view traffic by geographic region, how long visitors have stayed on a specific page, the number of visits, and the number of pages per visit. Most importantly, you can tell where the traffic came from: directly typing in the URL, search engines, or referring sites. If visitors came from a search engine, it will list which keywords were used to find you.

Google Analytics is not without its drawbacks, though. The results may be slightly skewed: some ad-filtering programs and browser extensions block the Urchin JavaScript used by Google Analytics, which causes those visitors to not be counted. If a visitor doesn't have JavaScript turned on in their browser, they aren't tracked either. Privacy applications such as Tor will show users coming from IP addresses that aren't theirs, so a Tor user in California may show up as coming from Germany.

Do not let the drawbacks of Google Analytics scare you away; it is an excellent tool, especially if you do not have access to the server log files.

Am I Listed?

Once you have made your website a candy emporium for search engines and have the ability to see where your traffic is coming from, you need to get listed on the big 3.

  • To check and see if you are listed on Google, do a search at their site for site:YOURFULLURL
  • To check and see if you are listed on Yahoo!, do a search at their site for your partial URL; everything after http:// will work, or everything after http://www.
  • To check and see if you are listed on MSN, do a search at their site for your full URL
So you have now determined whether you are listed or not. If you are, then great: get to optimizing and stop reading now. Go create some candy for your search engine friends! If you aren't listed, then we have a bit of work to do.

Questions, comments and criticism are always welcome; search engine optimization and web development are lifelong learning adventures.

Thursday, November 29, 2007

What Is Search Engine Optimization

A good place to start is to address some common questions you may have about what SEO is and the means of implementing it.

What does SEO stand for?

SEO stands for Search Engine Optimization. It is the process of making a website friendly to search engines and end users. The end result is an increased volume of relevant traffic to your site.

Why not use an SEO consultant?

From my recent experience with SEO consultants, they tend to be marketing maniacs with minimal knowledge of technology and just enough basic knowledge of search engines to trick people into paying for their services.

They tend to promise you top 10 slots, but what they don't tell you is that these are top 10 spots in niche markets with very little traffic. It is easy to obtain a top 10 position when there are only 50 pages in the search results.

An SEO consultant can also cost you thousands of dollars just to teach you information that is readily available. Here is the pricing plan for a popular SEO consultant:
  • 15 Minute Phone Conversation With No Previous Review of Web Site- $150
  • 1 Hour Phone Conversation With Site Review - $1500
  • Indepth Audit Report and 1 Hour Phone Conversation - $6000
Most likely all the information provided to you about search engine optimization can be derived from free sources.

What is important to search engines?

Search engines look for a variety of things; I tend to view them as hungry children looking for candy. One of the single most delicious items is relevant, unique content. Without tasty content you will not do well in your result page rankings.
  • Headings and Bolds - H tags and bolded words have emphasis to the reader and also have emphasis to a search engine.
  • Title Tags - The titles of your pages must be relevant to your content.
  • Keywords - Words that accurately describe the content of your site.
    • Keyword Placement - A search engine will enjoy your keywords at the top of the page more than the ones at the bottom of the page.
    • Keyterm Proximity - Search engines assume if keyterms are placed close to each other they are probably related to each other.
    • Keyword Density - This is calculated as the percentage of your content made up of keywords and should be between 4% and 8%; if it is over 10% you may be blacklisted for keyword stuffing and your page may not appear in results. There are free tools to calculate this.
    • Keywords in File Names - Using keywords in a file name can change the results of your listings. Just remember to use dashes '-' to separate your words.
  • Image Alt Text and Titles - Short, descriptive phrases tend to do well here.
  • Meta Tags - Google doesn't look at meta tags any more, and other search engines do not look deeply into these. Although they are still nice to have! (Thanks to James for pointing this out)
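
The keyword density figure from the list above is simple enough to compute yourself. Here is a minimal Python sketch; it handles single-word keywords only, and the word-splitting rule is my own assumption rather than any search engine's actual tokenizer:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words taken up by a single keyword.

    A toy version of the 4%-8% density calculation; how a real
    search engine tokenizes text is an assumption here.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return 100.0 * hits / len(words)
```

Run it over your page copy for each keyword you care about, and rewrite any section that strays past the 10% danger zone.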

What will get me blacklisted from a search engine?

A variety of things will get you blacklisted; as long as you avoid shady techniques that try to trick the search engine, you will be fine. They may be hungry children, but they are intelligent hungry children. Here is what you should not do!
  • Cloaking - Serving different content based upon the User-Agent or IP address of a visitor. This is done to serve up a tasty page to search engines and the opposite to users. Avoid this at all costs.
  • Keyword Stuffing - Overuse of keywords in content and metatags. Stick to between 4 and 8 keywords and check that your keyword density is between 4% and 8%.
  • Doorway Pages - This is a form of cloaking. Highly optimized pages that might use a redirect or meta-refresh to push users to a different page.
  • Hidden Text - Text that is the same color as your background that the search engine spiders can see but users can not.
  • Broken Links - If you can't write a page correctly why should you be listed?
  • Duplicate Content - This just clutters up the search results, and all but one page will be ignored.
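
On the broken links point, you can catch them yourself before a spider does. Here is a minimal Python sketch that pulls every link out of a page; in a real checker you would then request each collected URL (with urllib, say) and flag the ones that don't come back with a 200:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href on a page so the links can be checked.

    Only the extraction half of a broken-link checker; actually
    requesting each URL and testing the status code is left as the
    (network-dependent) second half.
    """
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)
```

Feed it your page markup with `parser.feed(html)` and then walk `parser.links`.
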
To get an idea of how search engines function, read The Anatomy of a Large-Scale Hypertextual Web Search Engine by Sergey Brin and Lawrence Page, the founders of Google.

Questions, comments and criticism are always welcome; search engine optimization and web development are lifelong learning adventures.