Solutions for Affiliates and Site Owners with Dynamic Websites

By Joshua Sloan

Joshua is an eMarketing professional with insights into the experiences of website owners, affiliates, SEO consultants and online advertisers. He is currently involved with online marketing at 1&1 Internet (www.1and1.com), the world's largest web hosting company, whose U.S. offices are located in Philadelphia, PA.


As an online marketer who has also managed an affiliate program, I know that one of the most common types of affiliate is the owner of a virtual mall, Web shopping portal, or search portal. While many of these dynamically built sites are visually impressive, they face a major battle to gain inclusion in natural search engine results because many of the generated URLs are not search engine friendly.

I want to offer you some solutions, and hopefully convince you to consider URL rewriting as an important tool for getting more pages into the search engines.

Sites that are popular or well marketed don't have to worry about their natural visibility in search engine results as much as lesser-known sites that depend on those results to drive traffic, but search engine results do matter. By implementing a solution for the dynamic URL "problem," an affiliate or Web publisher can dramatically improve the traffic and earning potential of their site.

Although many elements go into optimizing both static and dynamic sites for search engines, such as proper use of content, the robots.txt file, metadata, and link popularity, there is no single optimization approach that works equally well for all Web pages, all Web sites, or all search engines. This is why "natural optimization" has become more popular recently.

Still, the majority of affiliates with dynamic, database-driven sites can benefit from some search engine optimization (SEO) and the creation of search engine friendly URLs. Think about all the good content, such as product descriptions, reviews, and discussions, that a dynamic site might have "trapped" behind a bot-impenetrable URL.

While search bots are getting better at indexing dynamic sites, there are still challenges in getting many pages listed by search engines. I hope to show you some techniques that can help if you have such a site.

If you are not a skilled Web programmer, consult one to study the links I am giving you. There are also commercial products to help you accomplish the task, but I am pointing you towards free solutions as well.


Problem #1: Session ID Variables

Most search bots, except the new MSN bot, will not follow links with session IDs assigned to them. Google can parse one or two non-alphanumeric parameters and characters such as "&" and "=", but beyond that, other characters will be regarded as "stop text."

Session IDs can also catch the search bot in a perpetual loop: each page that is requested contains links to other pages, and each of those linked URLs contains the current session ID, making them different URLs every time the page is requested. That means a vast number of unique URLs is continually created for the bot to spider and index, and because bots will only go so deep, many of the site's pages are left unindexed.
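As a side note, if your dynamic site happens to run PHP under Apache's mod_php, one way to avoid the problem entirely is to tell PHP never to embed session IDs in URLs and to track sessions with cookies instead. This is only a minimal sketch of the idea, assuming your host allows PHP directives in a .htaccess file; otherwise the same settings belong in php.ini:

  # .htaccess sketch -- assumes PHP runs as an Apache module (mod_php)
  php_flag session.use_trans_sid off    # never append PHPSESSID to links
  php_flag session.use_only_cookies on  # track sessions via cookies only

With session IDs kept out of the URLs, every request for a given page produces the same address, and the spider trap described above never forms.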

Problem #2: Special Characters in the URL

As with session IDs, any URL containing special characters such as "?", "=", "&", ";", and "+" can create problems for search engine spiders.

For example, if your site has pages with URLs such as http://www.mysimon.com/4013-4237_8-0.html?qt=fish&tag=ksrch.tpcat, you may have problems with search engines indexing and following such links.

Imagine the search value of having that same page optimized for search engines with a URL such as: http://www.mysimon.com/products/topic_fish/fish.html
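To make that concrete, here is a minimal mod_rewrite sketch of how such a mapping could work on an Apache server. The pattern and target below are hypothetical, reusing the example URL above; a real rule would depend on how your scripts expect their parameters:

  # .htaccess sketch -- internal rewrite; the browser and the spider both
  # see only the friendly URL
  RewriteEngine On
  # /products/topic_fish/fish.html -> /4013-4237_8-0.html?qt=fish&tag=ksrch.tpcat
  RewriteRule ^products/topic_([^/]+)/[^/]+\.html$ /4013-4237_8-0.html?qt=$1&tag=ksrch.tpcat [L]

Your page templates would also need to emit the friendly form in their links, so that spiders only ever encounter the clean URLs.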

The Solutions Are Out There

There are a handful of solutions to help you create more search-friendly URLs. Both commercial and free options exist; I prefer the free ones.

For ASP.NET websites, the free answer to making friendly URLs is at:

• http://msdn.microsoft.com/asp.net/using/building/web/default.aspx?pull=/library/en-us/dnaspp/html/URLRewriting.asp

Commercial products to help make dynamic Windows Web sites search engine friendly:

• Exception Digital Enterprise Solutions (http://www.xde.net/index.jsp?tool=xqasp-deep-web) offers software, named XQASP, that changes dynamic URLs into static ones by removing the "?" in the query string and replacing it with "/", thereby allowing search engine spiders to index the dynamic content. $250 per domain (single server).

• URL Rewrite http://www.smalig.com/URL_rewrite-en.htm €23

• OPURL http://www.opcode.co.uk/components/rewrite.asp $45 Euro

• ISAPI_Rewrite http://www.isapirewrite.com/ $69

• Mod Rewrite for IIS http://www.iismods.com/URL-rewrite/index.htm $39.90

• DCSearchSafe http://www.cftagstore.com/index.cfm/page/viewtag/tagId/50 $100 Euro

For Linux/Unix/Apache-hosted sites, which might use PHP, CGI, or other programming to serve dynamic pages, the free solution is the mod_rewrite Apache module, which exists on the majority of Web servers. This requires no reprogramming of the site; it is just a URL translation layer for pages and directories.

Check with your hosting company to see whether this solution is already installed before proceeding. Using mod_rewrite to rewrite URLs requires the server to allow a .htaccess file on the domain, and enabling that may require someone with administrative access to the hosting account.
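As a rough illustration, a .htaccess file for this purpose can be only a few lines long. The script and parameter names below are hypothetical placeholders rather than part of any particular product:

  # .htaccess sketch -- assumes mod_rewrite is enabled and AllowOverride
  # permits rewrite directives in this directory
  RewriteEngine On

  # /products/fish.html -> /catalog.php?category=fish (internal rewrite)
  RewriteRule ^products/([A-Za-z0-9_-]+)\.html$ catalog.php?category=$1 [L]

  # /reviews/123/ -> /review.php?id=123
  RewriteRule ^reviews/([0-9]+)/?$ review.php?id=$1 [L]

Spiders and visitors request the static-looking URLs on the left, and Apache quietly maps them to the dynamic scripts on the right.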
You (or your programmer) can easily learn about using mod_rewrite to create search-friendly URLs by visiting the following sites:
• http://httpd.apache.org/docs/misc/rewriteguide.html
• http://www.cre8asiteforums.com/viewtopic.php?p=2346&highlight=#2346
• http://httpd.apache.org/docs/mod/mod_rewrite.html
• http://www.freewebmasterhelp.com/tutorials/htaccess/

Free Mod_rewrite Rule Generator
• http://www.webmaster-toolkit.com/mod_rewrite-rewriterule-generator.shtml
This RewriteRule generator will take a dynamic URL and generate the correct syntax to place in a .htaccess file, allowing the URL to be rewritten in a format suitable for spidering. You can use it to rewrite a directory or a page name.
• A good article for PHP programmers (which does not require using mod_rewrite): http://www.stargeek.com/php-seo.php

Caution: mod_rewrite can also be used to detect search bots (by user agent or IP address), allowing you to serve the bot a search-friendly URL. However, this could be considered a form of cloaking, as it involves a user-agent/IP check. Example: if the user agent is Googlebot, drop the session ID or rewrite the URL; if it is a human with a browser, allow the normal URL to be used with special characters or session IDs. This can be done for multiple bots, but it requires a list of all known ones. Such a list can be obtained at http://fantomaster.com/fasvsspy01.html (for a fee) or at http://www.iplists.com/ (free of charge). Remember that serving content different from what a human visitor would see IS considered cloaking and is banned by most search engines.
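For illustration only, the bot-detection pattern described above boils down to a conditional rewrite like the following sketch (the "sid" parameter name is a hypothetical placeholder). Keep the warning above firmly in mind before using anything like it:

  # .htaccess sketch -- user-agent based rewriting; because bots are shown
  # different URLs than humans, most search engines treat this as cloaking
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
  RewriteCond %{QUERY_STRING} ^sid=[^&]+$
  # Serve the bot the same page with the session ID stripped; the trailing "?"
  # clears the query string. Human visitors keep the original URL untouched.
  RewriteRule ^(.*)$ /$1? [L]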

In conclusion, you now have a starting point for getting dozens, if not hundreds, of additional pages into natural search results. The value of this should be obvious when weighed against paid search advertising (or when combined with it!). If your site is dynamic and you choose to use an SEO company, it is a must that they consider at least one of the solutions presented here. It's far more cost-effective than a complete site redesign and will show a much faster ROI than some other techniques you may be offered.
