How to switch an existing site to friendly URLs

By • Oct 3rd, 2008 • Category: Advice

Update: In mid 2009 I made a free service to do this for you. Check out

About a month ago I had the privilege of getting to switch a client’s existing site of 500+ nasty-looking .cfm URLs to 510 prettier, search-engine-friendly ones.

The real trick was doing so while keeping their current rankings intact, because their existing pages already ranked well in search engines and had a ton of incoming links. I also wasn’t being paid to rewrite each and every page.

There are plenty of tutorials available that explain how to make pages with search-friendly URLs, but from what I found, not one covered how to do it if you do not want to change your existing pages.

This tutorial covers a trick I learned that will allow you to permanently redirect your old URLs to new, friendlier ones, and have those still load the same existing file at the old, ugly URL.

Essentially what this tutorial does is show you how to:

  1. Make your existing URLs permanently redirect to new, search-friendly URLs
  2. Make each new pretty URL map back to the same page at the old, ugly URL
  3. Make each page load while keeping the new friendly URL in the address bar
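The three steps above can be sketched as a minimal .htaccess. The file names and the pretty path here are placeholders for illustration, not taken from any real site:

RewriteEngine On
RewriteBase /

# Step 3: once the request has been internally rewritten (step 2),
# stop processing so steps 1 and 2 don't bounce it back and forth forever
RewriteCond %{ENV:REDIRECT_STATUS} 200
RewriteRule .* - [L]

# Step 1: permanently redirect the old ugly URL to the new pretty one
RewriteRule ^old-page\.cfm$ /new-page/ [R=301,L]

# Step 2: load the old file when the pretty URL is requested; no R flag,
# so no redirect header is sent and the pretty URL stays in the address bar
RewriteRule ^new-page/$ /old-page.cfm [L]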

It’s not hard to make a new site with search engine friendly URLs, because with new sites the ugly URLs that run the pages are never seen. Therefore you don’t have to worry about what happens when someone types one in. Because of this, it’s a good idea to always start a new site with pretty URLs.

With existing sites, however, it’s important to make sure all URLs are accounted for, otherwise the rankings you had on the old URLs will be lost and you will be penalized for duplicate content.

What you have to do in this case is tell the old URLs to point to the new URLs, and then tell the new URLs where to find the file that needs to be loaded. If your old URL also happens to be the name of the file (as was the case with this client) you are typically out of luck and have to rename all of the files (unless you know me).

Here is an example of my trick:

RewriteEngine On
RewriteBase /

# Force no-www preference (replace example.com with your own domain)
RewriteCond %{HTTP_HOST} !^example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

# Force www preference
#RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
#RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Fix never-ending redirects
RewriteCond %{ENV:REDIRECT_STATUS} 200
RewriteRule .* - [L]

# Make the ugly addresses look pretty
RewriteRule ^index\.html$ / [R=301,L]
RewriteRule ^ugly\.html$ /pretty/ [R=301,L]

# Make the dynamic addresses pretty too
RewriteCond %{QUERY_STRING} ^id=1$
RewriteRule ^products\.php$ /wheat-beer/? [R=301,L]
RewriteCond %{QUERY_STRING} ^id=2$
RewriteRule ^products\.php$ /coffee-beer/? [R=301,L]
RewriteCond %{QUERY_STRING} ^id=3$
RewriteRule ^products\.php$ /beer/seasonal/? [R=301,L]

# Make the pretty addresses load the same content
RewriteRule ^pretty/$ /ugly.html [L]
RewriteRule ^wheat-beer/$ /products.php?id=1 [L]
RewriteRule ^coffee-beer/$ /products.php?id=2 [L]
RewriteRule ^beer/seasonal/$ /products.php?id=3 [L]

# Add the trailing slash if missing
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.+[^/])$ /$1/ [R=301,L]

There’s a lot of money in SEO and in being able to quickly give existing sites search-friendly URLs. Because of this I’m not going to go into great detail about how the .htaccess file works, but I will point out the important parts. I will leave learning mod_rewrite and regular expressions, as well as unpaid troubleshooting, up to you.

If you know what a .htaccess file does then you will recognize that the top of it turns the Apache rewrite engine on and declares the base directory. The second and third parts are strictly for good SEO and force either the no-www or the www version of your domain. All SEOs should know that under no circumstance should the same content load at more than one address.

The fourth section is what keeps the script from redirecting endlessly. Without this block telling the server to stop, it would keep bouncing from the old URL to the new one and back again, and so on. Firefox will eventually give you the “redirecting in a way that will never complete” error if you forget this step.

Fifth, the old static pages are permanently redirected to the new friendly URLs, and sixth, the dynamic pages are as well. These rules will need to be tweaked when passing more than one parameter.
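As a hypothetical sketch of that tweak, a product page taking two parameters could be handled like this. The parameter names and the pretty path are invented for illustration, and the parameters in the QUERY_STRING pattern must appear in the same order your links actually use:

# Redirect products.php?cat=2&id=5 to its pretty address
RewriteCond %{QUERY_STRING} ^cat=2&id=5$
RewriteRule ^products\.php$ /ale/stout/? [R=301,L]

# And rewrite the pretty address back to the real script
RewriteRule ^ale/stout/$ /products.php?cat=2&id=5 [L]

The trailing ? on the redirect target is what strips the old query string off the new URL.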

Seventh, the server is told to go back to the file at the ugly URL to load the content, but no redirect header is sent so the address will not change. Then finally the server is told to add the trailing slash if someone forgot to type it in.

My friendly URLs demonstration uses this .htaccess file to show that the method actually works. Notice the links are all in the form of the old URLs, but when visited or indexed they will show the pretty URLs.

By switching the URLs of my client’s site to ones containing relevant keywords, I was able to move them up several positions in search engines for nearly all of their products. They now have a PageRank of 5, although there is still a lot more that could be done to their site.



3 Responses

  1. Very clever, so this needs to be added to the .htaccess file on every folder right?


  2. Hi there,

    Very interesting and thanks for the guide.
    I am not really familiar with all this web fixing stuff, so do I need to write this code in Notepad and then upload it to my site, or do I need to go through .htaccess?

    Love to hear back


  3. I made a free service that will do this for you. Check out

