Monday, 8 September 2008
Basics of Web Site Optimisation - Rule 12
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 12 rule: Test your own site with multiple web browsers!
These days it takes just a few minutes to download and install several different web browsers, and I recommend you do so to test your own site. Nothing beats a quick visual check that your web site still looks okay - especially in a browser that you do not often use yourself.
Check your web traffic reports to see which browsers account for 90-95% of your traffic, and then make sure your site renders/displays as you wish in each of them - so that 90-95% of the time your visitors see what you intended.
Your traffic reports will also tell you which visitors turned into form submitters and became potential customers. Make sure your forms work correctly in their browsers before you publish changes!
Reasoning:
1. Even if you are following the standards perfectly, the web browsers may not be - it is just so darn tricky to know for sure. Best to check it visually, using the tools that your clients use.
2. It is very quick and simple to do - so do it. You will also see your site in new ways, which might lead to further design improvements.
And that is my Rule 12. I will be uploading the others as time allows!
Tuesday, 2 September 2008
Basics of Web Site Optimisation - Rule 11
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 11 rule: Do not use JavaScript for navigation effects, especially not dynamic menus
JavaScript is a small but very powerful programming language used to give web sites more functionality than is available in plain static HTML. It is a fantastic language from many perspectives that I will not go into in this entry. People use it for form validation, image manipulation, dynamic menus and other miscellaneous bits that respond to what the visitor is doing, or to where the visitor comes from, so that the visitor has an interactive experience with the web site.
Read more about JavaScript at wikipedia.
I follow the mantra "Use as little JavaScript as possible, whilst still making the web site look good and work well." I use CSS as much as possible to make the site look good, and as little JavaScript as possible - just enough to provide a little tasteful motion, and to do some form validation, which partially stems the flood of requests that could otherwise be submitted via the contact forms.
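To make the alternative concrete, here is a minimal sketch of a navigation menu built from plain HTML links and styled with CSS. Search engine spiders can follow every one of these links, and no JavaScript is involved (the class name and page names are purely illustrative):
<!-- Plain HTML links: crawlable by search engines, usable in any browser -->
<style type="text/css">
ul.mainnav { list-style: none; margin: 0; padding: 0; }
ul.mainnav li { display: inline; margin-right: 1em; }
ul.mainnav a { color: #336699; text-decoration: none; }
ul.mainnav a:hover { text-decoration: underline; }
</style>
<ul class="mainnav">
<li><a href="/index.html">Home</a></li>
<li><a href="/products.html">Products</a></li>
<li><a href="/contact.html">Contact</a></li>
</ul>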
Reasoning:
1. Search engines cannot follow dynamic JavaScript menus, thus large parts of your site will go unindexed.
2. Your site will appear smaller and less important to the search engines when they rank you against your competitors.
3. Your site will appear badly organised and less usable to the search engines when they rank you against your competitors.
4. Different versions of browsers treat JavaScript differently, meaning even more testing and "fiddling" in order to get your JavaScript to work correctly and consistently across all your targeted browsers.
And that is my Rule 11. I will be uploading the others as time allows!
Monday, 25 August 2008
Basics of Web Site Optimisation - Rule 10
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 10 rule: Create a web site/shortcut icon
This is one of the simplest and most overlooked goodies in a web site's "bag of tricks"!
Ever wondered why some web sites have a little icon in the address bar? For instance, the little white-on-orange "B" in the address bar right now? It is because they have created a "favicon.ico" in the root of their web directory. Not only is it displayed by the browser in the address bar when your visitor arrives, but if the visitor bookmarks the site, it is displayed in the bookmark view as well - which makes your site stand out from all the others without a favicon.ico! (And vice-versa: it makes your site stand out negatively if all the others have one.)
The favicon.ico files that I have created were all based on the old specification: 16x16 pixels in the ICO format. These days it is possible to use bigger images in different formats, but I still prefer the older specification, as it is a very small overhead for your visitor and whatever old browser they may be using 5 years from now. Read more in wikipedia's Favicon entry!
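Most browsers will pick up /favicon.ico from your web root automatically, but it does no harm to declare it explicitly in each page's header section. A minimal sketch:
<!-- In each page's head section; favicon.ico sits in the web root -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon">
<link rel="icon" href="/favicon.ico" type="image/x-icon">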
Reasoning:
1. This is more a marketing and branding tool/optimisation than a search engine optimisation. It is extremely subtle in the world of too many messages and by using it, you have an opportunity to reinforce your company/site's brand in the visitor's mind - a very good thing. In the world of marketing, brand awareness is key - no one is going to buy "you" if they don't even "know you" - so first seek to build awareness, then start your "trust campaign".
2. It looks professional and is coming to be expected these days.
3. A standing reminder in every bookmarker's bookmark list is a great place to be.
And that is my Rule 10. I will be uploading the others as time allows!
Saturday, 23 August 2008
Basics of Web Site Optimisation - Rule 9
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 9 rule: Use these 9 Meta Tags
These are probably "overkill" as far as Meta Tag usage is concerned, but I prefer this approach to a lesser one, until I learn something concrete that differs from my own experience - which has included some rapid and sustained visitor growth on the sites where I applied them.
Place Meta Tags in the header sections of all of your pages. There are smart tools for generating the content-related ones, but it is better if they fit into your whole marketing and web site design approach, which makes them very manageable by hand on static web sites. On dynamic sites, use a clever algorithm to help you produce carefully crafted statements.
Make sure the web page content, including words, links, image names - everything where you can specify text, all aligns with the Keywords, Description and Abstract Meta Tags!
<META name="Keywords" content="[5 comma separated keywords that are present at least twice or more on the page!]">
<META name="Description" content="[enter a short statement containing as many of the keywords as possible]">
<META name="Date" content="[last update date]">
- NOT STRICTLY legitimate or required as a meta tag, but I use it to track the date I last changed the page - and use this date to help track which page version the search engines have in their cache
<META name="abstract" content="[enter a short statement containing as many of the keywords as possible, possibly reusing the Description meta tag]">
<META name="revisit-after" content="[a number that is tuned to the amount of maintenance you do on the site - initially I use 7 because I perform so much tuning and generally I roll out content in a staged approach to ensure high quality and maintain control over the site's visitor growth] days">
<META name="rating" content="general">
- there are a number of options here for instance "adult", but I only use general
<META name="next" content="[choose or use your site metrics to research the next web page from your site that most visitors normally go to]">
- some web browsers (Firefox) will pre-cache the html page you specify here, making the user experience of your web site seem quicker, if they actually follow your "directions". Very difficult to get right without careful web site design, and if wrong actually wastes your visitors' bandwidth and makes their internet experience slower unnecessarily.
<META name="robots" content="index,follow">
- instruct search engines to index this web page, and follow all links on the page
- in the old days of restricted bandwidth, and depending on what I was marketing on the site, I would instruct the search engine to "noindex"
<META content="text/html; charset=ISO-8859-1">
- very important to clarify to the web browsers what the page content is, and which character set is intended otherwise some foreign visitors could end up with strange symbols unintentionally, making your web site difficult to use
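Putting it all together, a page's header section might look like the sketch below. The site, keywords and dates are purely illustrative - yours must be aligned with the actual visible content of the page:
<head>
<title>Fresh Bananas - Fruit Delivery in London</title>
<META http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<META name="Keywords" content="bananas, fruit delivery, London, fresh fruit, banana boxes">
<META name="Description" content="Fresh bananas and fruit delivery across London - order banana boxes online.">
<META name="abstract" content="Fresh bananas and fruit delivery across London.">
<META name="Date" content="2008-08-23">
<META name="revisit-after" content="7 days">
<META name="rating" content="general">
<META name="robots" content="index,follow">
<link rel="next" href="http://www.example.com/order.html">
</head>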
Reasoning:
1. By using Meta Tags at all, the search engines will rank you higher.
2. By using the correctly completed Meta Tags, the search engines will rank you even higher
3. They should repeat the key messages you are trying to convey to your visitors - they serve as a checkpoint for detecting whether your message is clear and whether the page is correctly aligned to achieve maximum impact with the search engines AND your visitors.
4. Search engine spiders, web crawling bots, web caching servers and even web site blocker software use the tags in their "decision" software to decide whether to analyse, cache, or allow the page to be displayed.
And that is my Rule 9. I will be uploading the others as time allows!
Friday, 15 August 2008
Basics of Web Site Optimisation - Rule 8
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 8 rule: Register on Web Directories
Web Directories are huge databases of links and brief descriptions of web sites, and they are usually maintained by actual humans. They are not Search Engines, but in some ways they are similar: they encourage web masters/site owners to register so that they become ever bigger repositories of information, similar to library catalogues.
To be accepted, your site needs to be up to a certain standard - which is a good thing! The editors of these Web Directories see 1000's of sites and it should be reassuring to you that someone has assessed your application form, and your web site, and found you/your idea/your web site acceptable to the rest of the world!
For more up-to-date information on Web Directories, see wikipedia's List of Web Directories and get registered everywhere suitable!
Your time is obviously limited, and in general the most important directory is the Open Directory Project, so start there. Be very careful to select the right categorisation for your web site. VERY CAREFUL! As with all good information clustering, and all good search engine ranking systems, things need to be as aligned as possible for optimal benefit. Be as specific with your categorisation as possible, and make sure your web site reflects that profile!
Reasoning:
1. All of the major search engines use the web directories as a starting point for their "entire web crawl".
2. Search engines rate sites quite highly when they have been reviewed by the real human editors of the various Web Directories and accepted into them.
3. Search engines will use the web directory classification of your web site to create a cluster for your site, and clustered information/sites rank more highly than complete "unknowns" for search engines.
4. While you are registering, and you are accepted, you will find competitor web sites that you can examine for marketing insights.
5. After you are accepted, it should be possible for you to form alliances (link swap) with some of the other sites registered in the same area, or in similar but alternate categories of the web directory - a very good thing!
And that is my Rule 8. I will be uploading the others as time allows!
Monday, 4 August 2008
Basics of Web Site Optimisation - Rule 7
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 7 rule: Use Cascading Style Sheets (CSS) to configure your site's look and feel and store in separate files
Most of the target audience of these rules commission web sites from friends, friends-of-friends, or recommended small web development shops... who often do not know about, or do not know enough about, Cascading Style Sheets (CSS).
The basics: with your CSS stored in a separate file, you can edit the look and feel of your entire web site without having to know much about what you are doing. CSS is a simple, easy-to-understand set of configurable options, and there are plenty of cheap (even free) tools to help you make your site exactly the way you would like - independently of some technical person. (I once taught a very busy CEO just enough CSS for him to take ownership of that aspect of his 1000 page web site!) Name the different style sets something useful, such as "ProductDescription", which makes them easier to use throughout your site.
People who implement web sites are typically more interested in the underlying functionality and technical aspects than in how pretty the site looks, or how well it matches the site owner's evolving branding efforts. By making use of CSS, anyone can be placed in charge of experimenting and finding the combinations of colours, font styles, text sizes, margins, borders and spacing that work in harmony to create a truly unique and excellent web experience.
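As a minimal sketch (the file name and style values are illustrative), each HTML page includes one shared stylesheet in its header section, and the stylesheet defines named styles such as the "ProductDescription" mentioned above:
<!-- In every page's head section: -->
<link rel="stylesheet" type="text/css" href="/styles/site.css">
/* In /styles/site.css - change a value here once, and every page follows: */
.ProductDescription {
font-family: Verdana, Arial, sans-serif;
font-size: 0.9em;
color: #333333;
margin: 0.5em 1em;
}
Any page can then mark up its product text with <p class="ProductDescription">...</p> and pick up the shared styling automatically.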
To see what you can do with CSS, have a look at The CSS Zen Garden. This site contains beautiful samples of completely different competitive CSS entries for the same web page... and it is amazing what the entrants have come up with - radical differences!
In some ways this CSS rule is related to my rule 5: Comply with web standards - html, as CSS is also governed by W3C standards! You can teach yourself about it by following the CSS tutorials from the W3C.
And, because there is a standard, the W3C have once again provided an online checking facility for validating your own, or your supplier's, efforts: simply submit your HTML or CSS web document to the Online Cascading Style Sheet (CSS) Checker.
Reasoning:
1. With CSS you can evolve a really good looking web site over time - and it is almost impossible to get a web site perfect in just one attempt!
2. Anyone can configure and maintain them - making your site's look and feel very easy to update and preferably maintained by someone who really cares!
3. Separate your CSS definitions from your HTML web pages by placing them in separate files that you include in your header tags. Web browsers cache the CSS definition files and thus your HTML pages are smaller, easier to maintain, faster to download and even more focussed for search engine optimisation strategies.
4. By using CSS it is possible to test look-and-feel updates against the production web site without having to go live first! Modern web browsers can be configured to use alternative CSS definition files against any web site!
And that is my Rule 7. I will be uploading the others as time allows!
Monday, 28 July 2008
Basics of Web Site Optimisation - Rule 6
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 6 rule: Find valuable reciprocal link partners, manage them and the relationship!
There is a reason it is called the World Wide Web rather than the World Wide Collection of Islands: the connections/links between web sites! It is most important that you get your site connected, preferably with partners that you treat as the valuable partners they should be - provided you choose wisely!
A good process is to scour the internet for sites that meet your criteria, and then make contact with their web master very politely. This is hard work and you will need to approach each target, depending on the target, with a customised strategy.
You will need to manage, over a long period, those that you have identified as potentials, those you have contacted and are awaiting a response from, those that rejected you, and those that accepted you. I used a spreadsheet that I updated every time I found a useful-looking partner, and once a month (or once a quarter) I would actually do the work of checking links and emailing web masters.
The problem with the web sites/masters that accept you is that you need to check back on them from time to time to ensure they have not dropped you silently (and are thus receiving web visitors from you, for free). It is very bad practice, but it happens. Another reason to check is that web sites are sometimes sold, or change focus... and the new focus may not be something you wish to be associated with.
You can use the W3C Online Link Checker (remember the W3C from my Web Site Optimisation Rule 5?) - and you should do so regularly anyway, as your internal links will keep changing. Simply enter the page name you would like to check for broken links.
I have used Xenu Link Sleuth in the past, which is easy to use, easy to install, and very fast to run! And it is free!
Reasoning:
1. Reality - Even though you build it, no one is going to come... by random chance! Links are what make the WWW work. If you want to be part of the working WWW, you have to take part. :)
2. Brand awareness - You have to be in the "market place" for potential visitors to even know you exist. In the online world, this means you need to be seen, and preferably with the right people!
3. Information clustering, horizontal - People looking for a particular kind of web site will follow any link that looks similar to what they are looking for at that moment. So if your site is focussed on selling bananas, you might like to link up with related sites focussed on selling other fruits. Depending on what you are selling, you may even wish to link to other sites selling bananas - especially if your regions do not overlap!
4. Information clustering, vertical - Similar to horizontal, but this time looking for upstream/downstream (vertical) industry partners. So the banana farmers, the distributors, the retailers. Each of these partners could tell a very good story to their web site visitors - imagine a visitor to the "farm's site" following a link to the "distributor's site" and then to your site, and buying bananas from you!
5. Search Engine Optimisation (SEO) - The more links that are pointing to your site, the better your SEO ranking will be - thus giving you a better chance at appearing earlier in the search results for instance Google would return to queries that match your site. Combining this reason with the Information Clustering reasons, and you have an even more powerful effect that you will benefit from!
6. Market space - By following how visitors find you, and where they go from your site, you will get a better idea of your online market space. If it does not match what you expect or want, then, using reciprocal links, you can shift your online market space more favourably towards where you want it to be.
7. (Rarely) A new source of income - Check your reports to see which of your reciprocal link partners are more valuable to you, and to which you are more valuable. Perhaps, if the volumes of traffic you direct to the other site(s) are truly massive, you can even establish a new (small) source of income.
And that is my Rule 6. I will be uploading the others as time allows!
Tuesday, 22 July 2008
Basics of Web Site Optimisation - Rule 5
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 5 rule: Comply with the web standards - html
It is pretty amazing to me that many "professional" web site developers do not know that there is actually a consortium of key organisations behind a set of standards for the technologies and protocols used on the internet. The World Wide Web Consortium (W3C) has created, ratified and published standards for HTML and related protocols for years and years!
For web site purposes, in the past I standardised pages on an intermediate standard (the HTML 4.01 Transitional Specification), because the web browsers I was testing with (Opera, Internet Explorer, Firefox, Netscape) generally rendered (displayed) the result as I designed it, or acceptably close to that (or to the client's requirement).
While researching this entry, I noticed that the W3C have just released an "Editor's Draft" of the latest specification on 17 July 2008. Check it out at HTML 5. I would not consider standardising on this one just yet though.
So how do you know whether your web developer has even attempted to create standards-compliant web pages for you? You could do it the hard way: use your browser's functionality to view the HTML source of your web page and look for the "DOCTYPE" declaration on the first line, for example:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
But verifying that your page claims to comply with one of the standards is not good enough! The wonderful W3C provide very useful and usable online Quality Assurance tools. Use the HTML Validator: simply enter the address of the page you would like to validate (the page has to be publicly accessible) and see what results you get back!
Alternatively there is a little program you can run, also via the W3C, that will give the same results. It is freely available and its name is "tidy" or "HTML Tidy". You can get information and download details from here.
I recommend, if you are interested (this is not difficult, actually), that you get a little training in the subject matter! You can follow the W3 Schools Online Web Tutorials (note that W3Schools is not affiliated with the W3C, despite the name).
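For reference, here is a minimal sketch of a page that should pass the validator under HTML 4.01 Transitional (the title and content are placeholders):
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title>Example Page Title</title>
</head>
<body>
<h1>Example heading</h1>
<p>Example paragraph content.</p>
</body>
</html>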
Reasoning:
1. The more closely your web pages comply with the standard, the greater the chance that your site will render as you intend in your visitors' different browsers - an excellent thing!
2. The more closely your web pages comply with the standard, the greater the chance that your site will be successfully scraped by the search engine bots - a very good thing!
3. If your pages are NOT COMPLIANT, your search engine RANKINGS will be NEGATIVELY AFFECTED - a VERY BAD thing!
4. If your pages are NOT COMPLIANT, they will take longer to render in a browser as the browser will do its best to guess at correcting the page and this takes extra processing time - a VERY BAD thing!
5. By complying with the standard, you gain tool and developer freedom should you one day decide to maintain your site in a different tool or by a different developer (no lock-in) - a good thing.
And that is my Rule 5. I will be uploading the others as time allows!
Wednesday, 16 July 2008
Basics of Web Site Optimisation - Rule 4
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people a handful of strategically important things over the past few years. So... here goes again, this time in a way that I can simply refer back to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and decide for yourself. They are, after all, common sense and common knowledge... like most things I blog about!
My number 4 rule: Analyse your site traffic
If you are hosting your site with a basic web hosting ISP, you should have been supplied with at least a basic (or even advanced) traffic reporting system. Use it! Figure out which reports are useful (I will provide some of my own ideas in this post) and monitor them!
If you do not have the luxury of being supplied with a traffic reporting system to help you with your analysis, then I recommend using Google Analytics. It is free and does give you quite a bit of useful basic information, and it is all accessible online. There is however the issue of privacy, but this is something that every responsible person should evaluate for themselves and decide on - I am not getting into that debate here!
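For reference, the Google Analytics tracking snippet of this era looks roughly like the sketch below. It goes just before the closing </body> tag of every page, with your own account ID in place of the "UA-XXXXXXX-X" placeholder:
<script type="text/javascript">
// Load ga.js over http or https, whichever matches the current page
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
<script type="text/javascript">
// Record this page view against your Google Analytics account
var pageTracker = _gat._getTracker("UA-XXXXXXX-X");
pageTracker._trackPageview();
</script>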
In between the easy, the free, and the raw examination of log files, there is a range of commercial tools that will process your raw log files and present them as very sophisticated and valuable reports! Personally I am not going to recommend any of them, as my exposure is limited - and if you are planning on using such tools yourself, you have probably moved beyond the "Basics of Web Site Optimisation" category. :)
The final option is to actually take the raw web server log files and read them. As insane as this may appear at first glance, they are actually fairly interpretable, although you run a high risk of "not being able to see the wood for the trees" - you may be overwhelmed! You will need a capable editor and the ability to do complex searches - regular-expression-based searches are a real win!
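To give you an idea, a single line of a typical Apache "combined" format log records the visitor's IP address, the date, the request, the status code, the bytes sent, the referring page and the browser - something like this illustrative line:
192.0.2.1 - - [16/Jul/2008:10:15:32 +0200] "GET /products.html HTTP/1.1" 200 5124 "http://www.google.com/search?q=bananas" "Mozilla/5.0 (Windows; U; Windows NT 5.1)"
Multiply that by one line per requested page, image and stylesheet, and you can see both the richness and the potential for overwhelm.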
The kinds of reports these tools (or this analysis) provide you with are listed below in my reasoning section.
Reasoning:
1. How else are you able to run little experiments with different pages, with different site structures, with different little things you are going to do, unless you can measure and verify the effect?
2. Visitors - daily, weekly, monthly and yearly. NOT visits! Visitors are actual entities (people or web bots) looking at your site; visits only give you a rough indication of how engaging they found it.
3. Bounce Rate - how many people found your site, instantly did not like it, and left. This gives you an indication that your Search Engine Optimisation strategy might be bringing you the wrong visitors, or that you should change your content to give these visitors something to stick around for!
4. Average Time on Site - this one is interesting. Search engine bots normally "read" a lot faster than humans, but these days they are coded to be "sensitive" to the web server and act more like a human reader. Keep an eye on the trend and work out whether your pages are too long or complicated.
5. Pages/Visit - anything greater than 1 is good obviously. Compare this figure with the Average Time on Site though and again work out how fast or slow your visitors are reading. Check what your market is. Make sure it is consistent to have people reading detailed technical information slowly and carefully, versus quick brochure type information which can be gleaned in seconds, especially if there are "pretty pictures"
6. Top Content - monitor your top 10 pages, or top 20% of all your pages. Make sure 80% of your traffic is going to them. Make sure most of your best income-spinners are on these pages. Do not ever ever ever "break" these - everyone loves them, the search engines love them. Be VERY CAREFUL with them
7. Top Landing Pages - check that your new pages fall into this list as well as your Top Content
8. Top Exit Pages - it may make sense for you to think about deleting some of the really bad pages you have, except if you actually have had sales from them or need them to compete in a "me too" market place
9. Keywords - of course! How are people finding your site - give them more of what they want, or change the pages you have in order to make them more findable by searchers. This is in effect your market place competitive effort - make sure you are attracting your intended market with your offering! Alternatively (or additionally) identify an under-served market niche you are already attracting customers for, and could easily move into potentially. (there will be some rules about this in the future)
10. All Traffic Sources - here you want to compare the traffic you get directly (people who know and love you or are recommended to your site by their friends), versus your top "reciprocal link partners", versus your top search engine traffic suppliers. Make sure you have balanced your risks. Search engines change algorithms all the time and you could be dropped for periods. Same with link partners. And if you are not getting any direct traffic - you should definitely do some more off line marketing and public relations campaigning - think about loyalty clubbing and newsletters, etc!
11. Where your visitors are geographically dispersed - make sure your target region is correctly reflected, as it is no good attracting visitors from Timbuktu if you are trying to sell items in London (and vice-versa).
12. New Visitor versus Returning Visitor - new is always good - you are getting new traffic and new potential customers. Returning is very important though - this is your loyalty and/or "love" factor. If you are not getting "repeat business" then you are doing something very wrong potentially - it depends on what you are using your web site for though. Check how many pages the repeaters view, how much time they spend, and compare against your new visitors. Get a feeling for your "market"
13. Use combinations of the above reports to check how many times your enquiry form was opened, how many times it was submitted, how many visitors and repeat visitors you had, and how many new visitors actually submitted on their first visit (rare, but it does occur).
All of these reports are about trend analysis. It is not useful to get excited or depressed about a random spot measurement - you need to see the changes over time! There are times, however, once you become very familiar with your traffic patterns (e.g. you know when the Google web spider indexes your site), when spot analysis can get exciting - e.g. you have just launched a new section of your site and Google has finally located it and is indexing it. (Believe me, this is exciting if your site is dynamic and you have just published 800 new pages on an existing 100 page site...)
With a bit of thought, some enquiry monitoring in your office, and the above reports, you should be able to verify where your income is actually coming from. Many people are very surprised to learn the truth. Make sure you know!
There are many more reports and combinations of statistics you can look at - it all depends on your needs. The ones above I usually find sufficient for mine, and most small to medium businesses I have helped in the past can use them to improve their offering, their processes, their web site, and above all their business.
And that is my Rule 4. I will be uploading the others as time allows!
My number 4 rule: Analyse your site traffic
If you are hosting your site at a basic web hosting ISP, you should have been supplied at least a basic (or advance) traffic reporting system. Use the system! Figure out what reports are useful (I will also be providing some of my ideas in this post) and monitor them!
If you do not have the luxury of being supplied with a traffic reporting system to help you with your analysis, then I recommend using Google Analytics. It is free and does give you quite a bit of useful basic information, and it is all accessible online. There is however the issue of privacy, but this is something that every responsible person should evaluate for themselves and decide on - I am not getting into that debate here!
Inbetween the easy, the free, and the raw examination of the log files, you have a range of commercial tools that will process your raw log files and present them as very very sophisticated and valuable reports! Personally I am not going to recommend any of them as my exposure is very limited, and also, if you're planning on using these tools yourself, you've probably moved into a "NON Basics of Web Site Optimisation" categorisation. :)
The final option is to actually take the raw web server log files and read them. As insane as this may appear at first glance, they are actually fairly interpretable, although you have the high risk of "not being able to see the wood for all the trees" - you may be overwhelmed! You will need some non-basic editor and ability to do complex searches, and regular expression based searches are a real win!
The kinds of reports these tools, or this analysis provides you with are listed below in my reasoning section.
Reasoning:
1. How else are you able to run little experiments with different pages, with different site structures, with different little things you are going to do, unless you can measure and verify the effect?
2. Visitors - daily, weekly, monthly and yearly. NOT VISITS! Visitors are physical entities (people or webbots) looking at your site. VISITS gives you an indication of how exciting they found your site, but not a good one
3. Bounce Rate - how many people found your site and instantly did not like and left. Gives you an indication that your Search Engine Optimisation strategy might be bringing you the wrong visitors, or that you should change your content to give these visitors something to stick around for!
4. Average Time on Site - is interesting. Search engine web bots normally "read" a lot faster than humans, but they are coded these days to be "sensitive" to the web server and act more like a human reader. But keep an eye on the average trend and work out if your pages are too long or complicated.
5. Pages/Visit - anything greater than 1 is good obviously. Compare this figure with the Average Time on Site though and again work out how fast or slow your visitors are reading. Check what your market is. Make sure it is consistent to have people reading detailed technical information slowly and carefully, versus quick brochure type information which can be gleaned in seconds, especially if there are "pretty pictures"
6. Top Content - monitor your top 10 pages, or top 20% of all your pages. Make sure 80% of your traffic is going to them. Make sure most of your best income-spinners are on these pages. Do not ever ever ever "break" these - everyone loves them, the search engines love them. Be VERY CAREFUL with them
7. Top Landing Pages - check that your new pages fall into this list as well as your Top Content
8. Top Exit Pages - it may make sense for you to think about deleting some of the really bad pages you have, unless you have actually had sales from them or need them to compete in a "me too" market place
9. Keywords - of course! How are people finding your site? Give them more of what they want, or change the pages you have in order to make them more findable by searchers. This is in effect your market place competitive effort - make sure you are attracting your intended market with your offering! Alternatively (or additionally), identify an under-served market niche that you are already attracting visitors for and could potentially move into easily. (There will be some rules about this in the future.)
10. All Traffic Sources - here you want to compare the traffic you get directly (people who know and love you, or are recommended to your site by their friends) with your top "reciprocal link partners" and your top search engine traffic suppliers. Make sure you have balanced your risks: search engines change algorithms all the time and you could be dropped for periods, and the same goes for link partners. And if you are not getting any direct traffic, you should definitely do some more offline marketing and public relations campaigning - think about loyalty clubs, newsletters, etc!
11. Where your visitors are geographically dispersed - make sure your target region is correctly reflected, as it is no good attracting visitors from Timbuktu if you are trying to sell items in London (and vice-versa), etc.
12. New Visitors versus Returning Visitors - new is always good: you are getting new traffic and new potential customers. Returning is very important though - this is your loyalty and/or "love" factor. If you are not getting "repeat business" then you are potentially doing something very wrong, although it depends on what you are using your web site for. Check how many pages the repeaters view and how much time they spend, and compare against your new visitors. Get a feeling for your "market"
13. Use combinations of the above reports to check how many times your enquiry form was opened, how many times it was submitted, how many visitors and repeat visitors you had, and how many new visitors actually submitted on their first visit (rare, but it does occur). A small sketch of this kind of funnel arithmetic follows below.
All of these reports are about trend analysis. It is not useful to get excited or depressed about a spot measurement you do randomly - you need to see the changes over time! There are times, however, once you become very familiar with your traffic patterns (eg you know when the Google web spider indexes your site), when spot analysis can get exciting - eg you have just timed and launched a new section of your site and Google has finally located it and is indexing it. (Believe me, this is exciting if your site is dynamic and you have just published 800 new pages on an existing 100 page site...)
With a bit of thought and enquiry monitoring in your office, and the above reports, you should be able to verify where your income is actually coming from. Many people are very surprised to learn the truth. Make sure you know!
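As a minimal sketch of that funnel arithmetic (the figures and page names are hypothetical), this is all point 13 amounts to:

  // Minimal funnel sketch: hypothetical monthly figures read off your traffic reports.
  var visitors       = 1000; // unique visitors this month
  var enquiryOpens   = 100;  // views of the enquiry form page, eg /contact/enquiry-form.html
  var enquirySubmits = 10;   // views of the "thank you" page, eg /contact/enquiry-sent.html

  // Each step of the funnel, expressed as a percentage.
  var consideredRate = (enquiryOpens   / visitors)     * 100; // 10% considered contacting you
  var followThrough  = (enquirySubmits / enquiryOpens) * 100; // 10% of those followed through
  var leadRate       = (enquirySubmits / visitors)     * 100; // 1% of all traffic became sales leads

A falling consideredRate points at the quality of your traffic; a falling followThrough points at the form itself.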
There are many more reports, and combinations of statistics you can look at - it all depends on what your needs are. These above I find sufficient for mine usually, and most small to medium businesses I have helped in the past can use them to improve their offering, to improve their processing, to improve their web site, and above all to improve their business.
And that is my Rule 4. I will be uploading the others as time allows!
Tuesday, 15 July 2008
Basics of Web Site Optimisation - Rule 3
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 3 rule: Choose or move to a good web site/domain name
Reasoning:
1. A great deal of emphasis is placed on web site / domain names by search engines when they rank your site
2. A great site name is easy for visitors to remember, to type, to tell their friends about, and their friends will also be able to spell!
3. A great name does not get easily confused with another site
4. Choose carefully between .com and .co.uk/.za/.ch/etc! Where are your visitors? What are you providing? Where would your visitors EXPECT you to be? Your domain name suffix also sets up expectations for visitors who do not know you at all and are trying to distinguish you from your millions of competitors!
5. Similar to my previous rule regarding good page and sub-directory names, the closer your site name is to English and your "market speak", the better it will do. Eg, bad:
http://www.baddayatwork.com, versus
good:
http://www.good-day-at-beach.com
6. Try to make sure that each of your web pages' content somehow relates to your domain name - you do this either by creating very small, very focussed web sites, or by choosing broader domain names that can more easily be applied to the different types of content on each page
7. One day, when you are busy swapping links with valuable online partners, the closer your domain name is aligned with theirs (or the broader it is), the better it will be,
eg, bad:
valuable partner's site is all about pencils, eg http://www.pencils.com, your site is named:
http://www.erasers.com, versus
good:
http://www.pencil-erasers.com
best:
When the search engines look from pencils.com to your pencil-erasers.com and back again, they will identify a cluster of related sites, and thus rank both of your sites higher!
As a final note - it both does and does not matter what domain name you eventually choose. In the initial stages it helps to have a good one; after your site is well known, and after you are receiving as many enquiries as you can cope with, it really does not matter any more. How many sites have "bad" names yet are now household names - Yahoo, Google - these were not common English words before they became very well known web sites! And in my industry it is common for the "gurus" to create site names by concatenating their first and last names - just look at the list of blogs I read on the right of this page to see what I mean!
And that is my Rule 3. I will be uploading the others as time allows!
Thursday, 10 July 2008
Basics of Web Site Optimisation - Rule 2
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 2 rule: Good page & sub-directory names
Reasoning:
1. What is "good"? Good is quite hard to explain concisely. It is a collection of decisions that combine to effectively target your audience, as well as search engine spiders. "Good" should align as closely as possible with how your target audience think, and with the keywords and phrases they would submit in order to find your content and what you are offering. This is very hard to do - there is a branch of study called "Information Architecture" working on this, as are the real search engine scientists. It is simplest for me to illustrate with clear examples:
2. Examples of bad names:
http://your.site.com/i.html
http://your.site.com/b/a/d.html
http://your.site.com/bestpracticeagilesoftwaredevelopment.html
http://your.site.com/agilesoftwaredevelopmentbestpractice.html
3. Examples of good names:
http://your.site.com/best-practice/agile-software-development.html
http://your.site.com/agile-software-development/best-practice.html
4. I chose the good examples carefully, as I also wanted to illustrate some of the subtleties involved in choosing good names. This is where Competitive Advantage and a bit of luck truly take their course. I can imagine someone using a search engine would enter things like:
"best practice for agile software development"
"best practices of agile software development"
as well as
"agile software development best practice"
"agile software development best practices"
Your research, how you interact with your industry, how you speak to your clients and how they in turn speak to you - these are the factors that determine how you should name your pages and sub-directories. Of course, if you have a new web site and have not had much exposure to your target client base, then you are playing the guessing game, which is okay! Do not panic - just make sure you realise that, and reduce your risk by running experiments and monitoring the results! (See my previous rule about how to measure your web site ROI very simply.)
5. Did you notice above that I also replaced the spaces in potential search phrases with "-" in my page and sub-directory names? As much as possible your web site structure should reflect natural language usage. (A small sketch of this transformation follows this list.)
6. Did you also notice that well chosen sub-directories quickly give visitors insight into what other content might live in a particular sub-directory? Make sure the themes in each of your well structured sub-directories are consistent. Consistency - relating things that are similar, linking them together, linking to external related sites, and having external related sites link back to your pages - all adds up and counts in your search engine rating against your competitor web sites. (Synergy!)
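As a minimal sketch of point 5, the phrase-to-page-name transformation is mechanical enough to automate (toPageName is a hypothetical helper, not something your hosting provides):

  // Minimal sketch: turn a natural language phrase into a "good" page name.
  function toPageName(phrase) {
    return phrase
      .toLowerCase()                 // keep URLs consistently lower case
      .replace(/[^a-z0-9\s-]/g, "")  // drop punctuation and other odd characters
      .replace(/\s+/g, "-");         // replace runs of spaces with hyphens
  }

  toPageName("Agile Software Development Best Practice");
  // gives "agile-software-development-best-practice"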
And that is my Rule 2. I will be uploading the others as time allows!
Monday, 7 July 2008
Basics of Web Site Optimisation - Rule 1
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 1 rule: No email addresses on the web site
Reasoning:
1. SPAM, SPAM, and more SPAM. Any email address on the web site that is not sitting on some kind of island (most people with a web site actually want to be found... right?) will be picked up by the email scraper web bots that spammers (or their suppliers) use, and will subsequently be mail-bombed by them (even though it is illegal)
2. If it happens to be a personal email address, then all the above spamming - or other events such as the person being on leave, being absent, or leaving the company - can result in potentially very important emails lying in an Inbox for an unacceptable amount of time, and in a bad image of the company. (Acceptable web response time seems to be shrinking almost daily - I would say half an hour these days is pushing the limit for modern web users.) Closing a spammed-to-death email account is also a very painful IT process, especially if it is tied to the user's LAN account, as is frequently the case.
3. If it happens to be an organisational email address (eg info, help, request), this is slightly better than number 2 above: at least more than one (non-administrator) person can access the account should they need to, and such "one way" email accounts can be created, deleted, and easily replaced as they become spammed-to-death.
So what is the better way? I recommend using a standard HTML form (or a handful of them, as your needs dictate - based on the categorisation of queries you are hoping will be submitted via your site). A minimal sketch of such a form follows the reasoning below.
Reasoning:
1. Forms, via the web server behind them, can be set up to email an organisational email address very easily - the same benefit as having an email address on the site!
2. You can put a few very simple fields on the form that will help with the eventual processing of the query (very useful once things start getting busier - trust me, I have a couple of T-shirts on this one: things, if they go as planned, get busy much quicker than someone can be found to assist!)
3. You can add some very lightweight validation to the form to help clients give you the information you need in as few correspondences as possible (again, the busier you get, the less time you have for this, and the longer clients have to wait between first submission and final conclusion)
4. Forms can be submitted by anyone who is browsing your site, whereas email facilities might not be available to someone browsing from their phone, an internet cafe, a library, or a free kiosk, for instance
5. Many people are using web based email accounts these days, and it requires some effort to get your email address over into their web mail client
6. The fastest way to actually get a client to submit something usable to you has to be a competitive advantage - fewer distractions, fewer potential glitches, fewer context switches, fewer window switches, etc!
7. You get automatic Return On Investment (ROI) tracking, as forms can have unique names, and submitted forms (the "thank you" pages) can also have unique names. This allows you, by hand or with a basic web traffic reporting system, to see how many visitors (not visits!) your web site received and how many times your enquiry form was opened (= the number of people who considered sending you a query), and to compare those figures with how many times the submitted form was opened (= visitors who wanted to become your client). Eg: 1000 visitors, enquiry form opened 100 times, submission form opened 10 times means that 1% of your traffic resulted in sales leads. Possible interpretations would be that your SEO is attracting the wrong kind of visitor, and/or that the enquiry form is too difficult to use or does not inspire sufficient confidence to be submitted, etc.
8. In all online systems I have worked with or advised on, there is always a manager who also wants some idea of the amount of work coming in, how quickly it is processed, and an ability to do something if something goes wrong. Forms can be posted to multiple email addresses - or the email server can create backups and/or copies - all online, all at the manager's fingertips: in his/her Inbox (or rule-based sub-folder)
9. The email server can also send an auto-response to the client, so long as an email address is entered into a field that can be easily processed. (This is good, as the potential client will then give you more time than average to follow up with a real response - especially if you include the vital information that the submitted form has been received and will be processed as soon as possible, between office hours of 8:00 to 18:00 GMT+2, for instance. And: thank you for your submission, we appreciate you very, very much! And by the way, here is a backup email address you can reply to, quoting this reference number, if you do not hear anything within 1 working day. The email server and/or the script doing the emailing can generate the reference tracking number for you.)
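Here is that minimal sketch. The action URL /cgi-bin/enquiry is a hypothetical server-side script - whatever your host provides for emailing form submissions goes there - and the validation deliberately stays lightweight, as per point 3:

  <!-- Minimal enquiry form sketch; /cgi-bin/enquiry is a hypothetical mailing script -->
  <form name="enquiry-form" action="/cgi-bin/enquiry" method="post"
        onsubmit="return validateEnquiry(this);">
    Name: <input type="text" name="name" />
    Email: <input type="text" name="email" />
    Category: <select name="category">
      <option>Sales enquiry</option>
      <option>Support request</option>
    </select>
    Message: <textarea name="message" rows="5" cols="40"></textarea>
    <input type="submit" value="Send enquiry" />
  </form>

  <script type="text/javascript">
  // Lightweight validation: catch the obvious gaps before the form is posted.
  function validateEnquiry(form) {
    if (form.name.value == "" || form.message.value == "") {
      alert("Please fill in your name and your message.");
      return false; // stops the submission
    }
    if (form.email.value.indexOf("@") < 0) {
      alert("Please enter a valid email address so we can reply to you.");
      return false;
    }
    return true; // lets the browser post the form
  }
  </script>

Give the form page and the "thank you" page the unique names discussed in point 7, and the ROI tracking comes for free.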
Nice, I think. This approach has always worked very well for me. Perhaps it will for you too.
What should you do if you absolutely must still have an email address on the web site? (And, to be honest, everyone using the forms method above still does...)
1. Create a tidy image of an organisational address, and use that instead
2. It is possible to get "clever" with some JavaScript and essentially provide the user, on demand, with a clickable HTML version of the email address that spam bots cannot get to - or, less "clever", generate the address for them to paste into their email client. (There are plenty of places to download and customise such scripts, or any JavaScript programmer could put one together (or download it) for you in a few minutes. A minimal sketch follows this list.)
That is the way I do it. There is another way some people use:
3. Obfuscate (lightly scramble) the email address so that the scraper-bots cannot detect that it is an email address, and use that. People try all sorts of techniques, but I am unsure how effective they really are - eg joeATsoapDOTcom, joeRemoveThis@soapRemoveThisToo.com
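As a minimal sketch of the "clever" JavaScript option in point 2 (the address parts and the "contact" element id are hypothetical), the idea is to keep the address out of the raw HTML and only assemble it in the visitor's browser:

  // Minimal sketch: build a clickable mailto link in the browser, so the address
  // never appears as plain text in the HTML that scraper-bots read.
  var user = "info";             // hypothetical address parts - kept separate on purpose
  var domain = "your-site.com";
  var address = user + "@" + domain;

  var link = document.createElement("a");
  link.href = "mailto:" + address;
  link.appendChild(document.createTextNode(address));

  // Assumes the page contains an element like <span id="contact"></span>
  document.getElementById("contact").appendChild(link);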
And that is my Rule 1. I will be uploading the others as time allows!