Wednesday, 30 July 2008

I had a little free time over lunch, so I browsed over to the latest CIO articles and came across a series of three that I quite enjoyed: "Three Keys to Getting Your Projects Under Control"
Part 1 (Plug Leaks)
Part 2 (Have an Idea)
Part 3 (Go Granular)
The series attracted my attention because I recently attended the BCS miniSPA event, which replays the six most highly voted sessions from the full SPA event held in March earlier this year.
At the miniSPA, I attended Marina Haase's workshop on Best Practices for Finding Your Way into New Projects - Quickly.
Beyond the obviously huge number of good ideas that attendees put forward during the individual brainstorming slot, Marina also introduced us, during the group work, to analogy/metaphorical brainstorming techniques. I had heard about these and practised them a little privately on some problems, but never in a group. The effects were pretty amazing, and I definitely recommend you find out more about the technique and try it!
Edward de Bono also promotes using analogies for creative thinking in his books on Lateral Thinking and the Six Thinking Hats, although he advocates random selection of the scenario that is used to pull/generate ideas.
Monday, 28 July 2008
Basics of Web Site Optimisation - Rule 6
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 6 rule: Find valuable reciprocal link partners, manage them and the relationship!
There is a reason it is called the World Wide Web rather than the World Wide Collection of Islands: the connections/links between web sites! It is vital to get your site connected, preferably with partners you treat as the valuable partners they should be, provided you choose them wisely!
A good process is to scour the internet for sites that meet your criteria and then, very politely, make contact with their webmasters. This is hard work, and you will need to approach each target with a customised strategy.
You will need to manage, over the long term, the sites you identify as potentials, those you have contacted and are awaiting a response from, those that rejected you, and those that accepted you. I used a spreadsheet that I updated every time I found a useful-looking partner, and once a month or once a quarter I would actually do the work of checking links and emailing webmasters.
The problem with the sites/webmasters that accept you is that you need to check back on them from time to time to ensure they have not dropped you silently (while still receiving web visitors from you, for free). It is very bad practice when this happens. Another reason to check is that web sites are sometimes sold, or change focus - and the new focus may not be something you wish to be associated with.
You can (and should do regularly anyway, since your internal links will keep changing) use the W3C Online Link Checker (remember the W3C from my Web Site Optimisation Rule 5?). Simply enter the URL of the page you would like to check for broken links.
I have used Xenu Link Sleuth in the past, which is easy to use, easy to install, and very fast to run! And it is free!
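If you would rather script that periodic audit than click through every partner page by hand, a few lines of Python will do it. This is only a minimal sketch of the idea; the partner URLs below are hypothetical placeholders for your own spreadsheet entries.

# Minimal reciprocal-link audit sketch: fetch each partner page and check
# whether our URL still appears on it. All URLs below are hypothetical.
from urllib.request import urlopen

# Partner page that should link back to us -> the URL they agreed to carry.
PARTNERS = {
    "http://www.pencils.example.com/links.html": "http://www.pencil-erasers.example.com",
    "http://www.fruit.example.com/partners.html": "http://www.pencil-erasers.example.com",
}

def check_link_backs(partners):
    for page, our_url in partners.items():
        try:
            html = urlopen(page, timeout=10).read().decode("utf-8", errors="replace")
        except OSError as exc:  # URLError and timeouts are subclasses of OSError
            print(f"UNREACHABLE  {page}: {exc}")
            continue
        status = "OK" if our_url in html else "DROPPED US"
        print(f"{status:12} {page}")

if __name__ == "__main__":
    check_link_backs(PARTNERS)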
Reasoning:
1. Reality - Even though you build it, no one is going to come... by random chance! Links are what make the WWW work. If you want to be part of the working WWW, you have to be linked into it. :)
2. Brand awareness - You have to be in the "market place" for potential visitors to even know you exist. In the online world, this means you need to be seen, and preferably with the right people!
3. Information clustering, horizontal - People looking for particular kinds of web sites tend to look for them! By this I mean they will follow any potential link that looks similar to what they are looking for at that moment in time. So if your site is focussed on selling bananas, you might like to link up with related sites focussed on selling other fruits. Depending on what you are selling, you may even wish to link to other sites selling bananas - especially if your regions do not overlap!
4. Information clustering, vertical - Similar to horizontal, but this time looking for upstream/downstream (vertical) industry partners. So the banana farmers, the distributors, the retailers. Each of these partners could tell a very good story to their web site visitors - imagine a visitor to the "farm's site" following a link to the "distributor's site" and then to your site, and buying bananas from you!
5. Search Engine Optimisation (SEO) - The more links pointing to your site, the better your SEO ranking will be, giving you a better chance of appearing earlier in the results a search engine such as Google returns for queries that match your site. Combine this reason with the information clustering reasons and the effect is even more powerful!
6. Market space - By following how visitors find you, and where they go from your site, you will get a better idea of your online market space. If it does not match what you expect or want, then, using reciprocal links, you can shift your online market space more favourably towards where you want it to be.
7. (Rarely) A new source of income - Check your reports to see which of your reciprocal link partners are more valuable to you, and to which of them you are more valuable - perhaps, if the volumes of traffic you direct to the other site(s) are truly massive, you can even establish a new (small) source of income.
And that is my Rule 6. I will be uploading the others as time allows!
Wednesday, 23 July 2008
Scaling Software Agility by Dean Leffingwell
Dean is the founder and former CEO of Requisite, Inc., the company behind the requirements gathering and analysis tool RequisitePro. It seems his vast experience, from startup through to merging with IBM, has touched on a number of key software development issues, and he is now consulting very successfully and writing good books!
I picked up Scaling Software Agility at a book store/stall at SPA 2008 as it seemed to have a couple of new things to say, or at least say them in new ways - and I was very pleased with my choice!
I believe there is something for everyone in this book - whether you are new to agile or an experienced practitioner. The book touches on a number of topics and leads you from brief "beginner" chapters through to more interesting ones that are very relevant in today's software development arena - the scaling of agility.
Things that stand out in my memory of this book are the application of valuable software quality and management metrics, and the many strategies Dean suggests for countering the arguments that typical organisational "police" will use against an attempt to "go [more] agile" - arguments that can inadvertently lead to "acceptable failure" or, worse, a "death march".
Usually corporations do not react to the infiltration of agile practices because they are kept within [small] team perimeters, "flying under the radar". If you have a requirement to scale agile, then by definition you have more people and teams to be concerned about. There is more visibility, and more attention from people who might strategically oppose changes they do not understand, or oppose the department(s) or programme(s) the change is being attempted in - key strategic people whose existence and concerns you never previously knew about are now watching your every move.
Available from http://www.amazon.co.uk and http://www.amazon.com.
Why I recommend Scaling Software Agility:
Reason 1: Part 1 covers the essentials of Agile, Waterfall, XP, RUP, Scrum, Lean Software, DSDM and FDD in 85 pages!
Reason 2: Part 2 follows with more depth on the seven agile practices that work: Agile Component Team, Agile Planning and Tracking, Iterations, Small Frequent Releases, Agile Testing, Continuous Integration, Retrospectives.
Reason 3: The seven practices Leffingwell recommends for scaling agile:
- "Intentional Architecture": approaches to tackling large software systems with agile architecture
- "Scalable Lean Requirements": three simple topics that avoid the analysis-paralysis failure mode: vision, roadmap and just-in-time (JIT) elaboration
- "Systems of Systems and the Agile Release Train": how to plan and deliver complex software components with interdependencies
- "Managing Highly Distributed Development": very difficult, and a problem every successful software programme eventually faces - sooner or later the team is too big to fit in one room, on one floor, of one building, in one city, in one country, and practices have to be developed to support software built by many different people in different locations
- "Impact on Customers and Operations": how to convince marketing, product owners and programme owners that agile is a good thing for them
- "Changing the Organisation": how to address the arguments and fallacies the corporate immune system will throw around as things become more agile
- "Measuring Business Performance": real, usable, useful management metrics for controlling and managing large-scale [agile] software development efforts
Thank you for reading my recommendations!
Building Scalability and Achieving Performance
This is a short and very useful read on scaling architectures - InfoQ brought together three key architects, with backgrounds at Twitter, eBay and Betfair, to share their experiences, the approaches they take, and some tool recommendations. Practical advice! Building Scalability and Achieving Performance.
Tuesday, 22 July 2008
Basics of Web Site Optimisation - Rule 5
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 5 rule: Comply with the web standards - HTML
It is pretty amazing to me that there are many "professional" web site developers who do not know that there is a consortium of key organisations behind a set of standards for the technologies and protocols used on the internet. The World Wide Web Consortium (W3C) has created, ratified and published standards for HTML and related protocols for years and years!
For web site work in the past, I standardised pages on an intermediate standard (the HTML 4.01 Transitional specification), as the web browsers I was testing with (Opera, Internet Explorer, Firefox, Netscape) generally rendered (displayed) the result I designed and implemented, or something acceptably close to it (or to the client's requirement).
While researching this entry, I noticed that the W3C released an "Editor's Draft" of the latest specification, HTML 5, on 17 July 2008. Check it out - though I would not consider standardising on it just yet.
So how do you know whether your web developer has even attempted to create standards-compliant web pages for you? You could do it the hard way - use your browser's functionality to view the HTML source of your web page and look for the DOCTYPE declaration on the first line, for example:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
But verifying that your page claims to comply with one of the standards is not good enough! The wonderful W3C provide very useful and usable online quality assurance tools. Use the HTML Validator - simply enter the URL of the page you would like to validate (the page has to be publicly accessible) and see what results come back!
Alternatively, there is a little program you can run yourself, also from the W3C stable, that does essentially the same checking as the online validator above. It is freely available and is called "tidy" or "HTML Tidy". You can get information and download details from the HTML Tidy project page.
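If you want to script that check, tidy is easy to drive from Python. A minimal sketch, assuming tidy is installed and on your PATH; the filename is a hypothetical placeholder.

# Minimal sketch: run HTML Tidy over a local page and report problems.
# Tidy's exit code is 0 for a clean page, 1 for warnings, 2 for errors.
import subprocess

def validate(path):
    result = subprocess.run(["tidy", "-quiet", "-errors", path],
                            capture_output=True, text=True)
    if result.returncode == 0:
        print(f"{path}: no problems found")
    else:
        label = "warnings" if result.returncode == 1 else "errors"
        print(f"{path}: tidy reported {label}:")
        print(result.stderr)

validate("index.html")  # hypothetical filename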
I recommend, if you are interested (this is actually not difficult), that you get a little training in the subject! The W3Schools Online Web Tutorials are a good, free place to start (note that W3Schools is not affiliated with the W3C, despite the name).
Reasoning:
1. If your web pages comply with the standard, there is a much better chance that your site will render as you intend in your visitors' different browsers - an excellent thing!
2. If your web pages comply with the standard, there is a much better chance that your site will be crawled successfully by the search engine bots - a very good thing!
3. If your pages are NOT COMPLIANT, your search engine RANKINGS will be NEGATIVELY AFFECTED - a VERY BAD thing!
4. If your pages are NOT COMPLIANT, they will take longer to render in a browser as the browser will do its best to guess at correcting the page and this takes extra processing time - a VERY BAD thing!
5. By complying with the standard, you gain tool and developer freedom should you one day decide to maintain your site in a different tool or by a different developer (no lock-in) - a good thing.
And that is my Rule 5. I will be uploading the others as time allows!
Friday, 18 July 2008
Ethical Office Politics
I have been meaning to read this article for about a month now, and finally got the time this morning! Adrian, the author, covers quite a few topics throughout the piece, and I found it an insightful and well-thought-out argument.
Ethical Office Politics
I think he does a good job on most of the issues I have studied, heard about, thought about and/or experienced.
Wednesday, 16 July 2008
Basics of Web Site Optimisation - Rule 4
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 4 rule: Analyse your site traffic
If you are hosting your site with a basic web-hosting ISP, you should have been supplied with at least a basic (or advanced) traffic reporting system. Use it! Figure out which reports are useful (I give some of my ideas below) and monitor them!
If you do not have the luxury of a supplied traffic reporting system to help with your analysis, then I recommend Google Analytics. It is free, gives you quite a bit of useful basic information, and is all accessible online. There is the issue of privacy, of course, but that is something every responsible person should evaluate and decide on for themselves - I am not getting into that debate here!
In between the easy, the free, and the raw examination of log files, there is a range of commercial tools that will process your raw log files and present very sophisticated and valuable reports. Personally, I am not going to recommend any of them, as my exposure is very limited - and if you are planning on using such tools yourself, you have probably moved beyond the "Basics of Web Site Optimisation" category. :)
The final option is to take the raw web server log files and read them. As insane as this may appear at first glance, they are actually fairly interpretable, although you run a high risk of "not being able to see the wood for the trees" - you may be overwhelmed! You will need a capable editor and the ability to do complex searches; regular-expression-based searches are a real win!
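To give you a flavour, here is a minimal Python sketch of reading a raw log directly. It assumes the common Apache "combined" log format and a hypothetical access.log filename; lines in any other shape are simply skipped.

# Minimal raw-log analysis sketch for the Apache "combined" format:
# host ident user [time] "request" status bytes "referer" "user-agent"
import re
from collections import Counter

LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

visitors = set()          # distinct hosts seen (people or web bots)
pages = Counter()         # requests per page -> "Top Content"
referers = Counter()      # where visitors came from -> "Traffic Sources"

with open("access.log") as log:   # hypothetical filename
    for line in log:
        m = LINE.match(line)
        if not m:
            continue  # malformed or unexpected format: skip it
        visitors.add(m.group("host"))
        pages[m.group("path")] += 1
        if m.group("referer") not in ("", "-"):
            referers[m.group("referer")] += 1

print("distinct visitors (by host):", len(visitors))
print("top content:", pages.most_common(10))
print("top traffic sources:", referers.most_common(10))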
The kinds of reports these tools (or this analysis) provide are listed in my reasoning section below.
Reasoning:
1. How else are you able to run little experiments with different pages, with different site structures, with different little things you are going to do, unless you can measure and verify the effect?
2. Visitors - daily, weekly, monthly and yearly. NOT visits! Visitors are distinct entities (people or web bots) looking at your site; visits give you an indication of how engaging they found it, but not a good one
3. Bounce Rate - how many people found your site, instantly disliked it, and left. This indicates that your Search Engine Optimisation strategy might be bringing you the wrong visitors, or that you should change your content to give those visitors something to stick around for!
4. Average Time on Site - an interesting one. Search engine web bots normally "read" a lot faster than humans, but these days they are coded to be "sensitive" to the web server and act more like a human reader. Keep an eye on the average trend and work out whether your pages are too long or complicated.
5. Pages/Visit - anything greater than 1 is obviously good. Compare this figure with Average Time on Site, though, and work out how fast or slowly your visitors are reading. Check what your market is: detailed technical information should be read slowly and carefully, whereas quick brochure-type information can be gleaned in seconds, especially if there are "pretty pictures"
6. Top Content - monitor your top 10 pages, or the top 20% of all your pages. Make sure 80% of your traffic is going to them, and that most of your best income-spinners are on these pages. Do not ever, ever "break" these - everyone loves them, the search engines love them. Be VERY CAREFUL with them
7. Top Landing Pages - check that your new pages make it into this list, as well as into your Top Content
8. Top Exit Pages - it may make sense to think about deleting some of the really bad pages you have, unless you have actually had sales from them or need them to compete in a "me too" market place
9. Keywords - of course! See how people are finding your site, then give them more of what they want, or change your pages to make them more findable by searchers. This is in effect your competitive effort in the market place - make sure you are attracting your intended market with your offering! Alternatively (or additionally), identify an under-served market niche you are already attracting customers for and could potentially move into easily. (There will be some rules about this in the future.)
10. All Traffic Sources - compare the traffic you get directly (people who know and love you, or were recommended your site by friends) versus your top "reciprocal link partners" versus your top search engine traffic suppliers. Make sure you have balanced your risks: search engines change algorithms all the time and you could be dropped for periods, and the same goes for link partners. If you are not getting any direct traffic, you should definitely do some more offline marketing and public relations campaigning - think about loyalty clubs, newsletters, etc.!
11. Geographic dispersion of visitors - make sure your target region is correctly reflected; it is no good attracting visitors from Timbuktu if you are trying to sell items in London (and vice versa)
12. New versus Returning Visitors - new is always good: you are getting new traffic and new potential customers. Returning is very important though - this is your loyalty and/or "love" factor. If you are not getting "repeat business" you are potentially doing something very wrong, though it depends on what you are using your web site for. Check how many pages the repeat visitors view and how much time they spend, and compare against your new visitors. Get a feeling for your "market"
13. Use combinations of the above reports to check how many times your enquiry form was opened, how many times it was submitted, how many visitors and repeat visitors you had, and how many new visitors actually submitted on their first visit (rare, but it does occur)

All of these reports are about trend analysis. It is not useful to get excited or depressed about a random spot measurement; you need to see the changes over time! There are times, however, once you become very familiar with your traffic patterns (e.g. you know when the Google web spider indexes your site), when spot analysis can get exciting - e.g. you have just timed and launched a new section of your site and Google has finally located and is indexing it. (Believe me, this is exciting if your site is dynamic and you have just published 800 new pages on an existing 100-page site...)
With a bit of thought, some enquiry monitoring in your office, and the above reports, you should be able to verify where your income is actually coming from. Many people are very surprised to learn the truth. Make sure you know!
There are many more reports and combinations of statistics you can look at - it all depends on your needs. The ones above I usually find sufficient for mine, and most small to medium businesses I have helped in the past can use them to improve their offering, their processing, their web site, and above all their business. A couple of the simpler metrics are sketched in code below.
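As promised, a minimal sketch of computing a few of them once you have per-visitor page counts (for example, collected by the log-reading sketch earlier in this post). The sample data is invented purely for illustration.

# Minimal metrics sketch: bounce rate, pages per visitor, returning share.
from collections import Counter

# visitor host -> number of pages viewed this period (invented sample data)
pages_per_visitor = Counter({"203.0.113.7": 1, "198.51.100.2": 5, "192.0.2.9": 1})
returning = {"198.51.100.2"}  # visitors also seen in an earlier period

total = len(pages_per_visitor)
bounces = sum(1 for n in pages_per_visitor.values() if n == 1)
print(f"bounce rate:     {bounces / total:.0%}")
print(f"pages/visitor:   {sum(pages_per_visitor.values()) / total:.1f}")
print(f"returning share: {len(returning & set(pages_per_visitor)) / total:.0%}")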
And that is my Rule 4. I will be uploading the others as time allows!
Tuesday, 15 July 2008
Best Practice Applied In Wrong Context - Example 1
A friend of mine was ranting the other day. He had just done an iteration retrospective with his development team wherein they took a look at their quality metrics and discovered that "quality" had actually dropped off even though his team had spent more time than ever before on the client's official Quality Process.
I discussed this further with him, and we agreed that quality is a mostly subjective concept when it comes to software... we agreed that it cannot be objectively measured... that it cannot be artificially injected "to meet the required metric"... and that it is something software quality assurers infer by monitoring the various metric trends that make sense in the particular environment/context the software is being developed in and for.
So with all this agreement, I asked him to explain further.
This story is probably a symptom of why I think there is so much cynicism in the software industry. Read on if you dare!
It turns out that the client has a well worked out, well defined Quality Process that they are extremely happy with. The Process is guaranteed to prevent massive loss of life, loss of income, costs spiralling out of control, and so on - you can imagine why a large company invests huge amounts of time and money in creating a bullet-proof Quality Process: to manage risk, whatever that risk is.
Okay... so why is my friend ranting? His team followed the process, apparently passed a bunch of procedural milestones, and everyone was happy. Yet when he and his team looked at the metrics they had defined for measuring quality, they noticed that the number of issues had risen, and that some important tests had not been run early enough in the iteration to find issues they could respond to before the end of the iteration. There were known open issues, and there were issues that had been addressed but not signed off. Waste was accumulating that had not been a problem before.
How did this happen? The people whose responsibility it was to run the tests and provide the early feedback had been too busy ensuring the team met the Quality Process requirements - they had been documenting, reviewing, getting documentation reviewed, and spending a large amount of time away from the product they were responsible for delivering. They were going off on a tangent from the users' needs.
And it showed.
There is no happy ending here - key client representatives (project stakeholders, but not users) have to ensure that their organisation's process is followed, even when they know, and everyone else knows, that the process is not adding value - indeed, as above, that it is actually diminishing value. It often appears that a group of client representatives needs to experience failure and pain several times before they will attempt to address a badly formulated - or, as in the example above, badly placed - process. Sometimes, regrettably, these lessons are only learned during retrenchment phases.
Yuck!
Basics of Web Site Optimisation - Rule 3
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 3 rule: Choose or move to a good web site/domain name
Reasoning:
1. A great deal of emphasis is placed on web site / domain names by search engines when they rank your site
2. A great site name is easy for visitors to remember, to type, to tell their friends about, and their friends will also be able to spell!
3. A great name does not get easily confused with another site
4. Choose carefully between .com and .co.uk/.za/.ch/etc.! Where are your visitors? What are you providing? Where would your visitors EXPECT you to be? Your domain name suffix also sets up expectations for visitors who do not know you at all and are trying to distinguish you from your millions of competitors!
5. Similar to my previous rule about good page and sub-directory names: the closer your site name is to English and to your "market speak", the better it will do. E.g. bad:
http://www.baddayatwork.com, versus
good:
http://www.good-day-at-beach.com
6. Try to make sure each of your web pages' content somehow relates to your domain name - do this either by creating very small, very focussed web sites, or by choosing broader domain names that can more easily apply to the different types of content on each page
7. One day, when you are busy swapping links with valuable online partners, the closer your domain name is aligned with theirs (or broad enough to relate), the better,
eg, bad:
valuable partner's site is all about pencils, eg http://www.pencils.com, your site is named:
http://www.erasers.com, versus
good:
http://www.pencil-erasers.com
best:
when the search engines look across from pencils.com to your pencil-erasers.com and vice versa, they will identify a cluster of related sites, and thus rank both your sites higher!
As a final note - it both does and does not matter what domain name you eventually choose. In the initial stages it helps to have a good one; after your site is well known, and you are receiving as many enquiries as you can cope with, it really does not matter. How many sites have "bad" names yet are now household names - yahoo, google - these were not common English words before they became very well-known web sites! And in my industry it is common for the "gurus" to create site names by concatenating their first and last names - just look at the list I read on the right of this page to see what I mean!
And that is my Rule 3. I will be uploading the others as time allows!
Thursday, 10 July 2008
Basics of Web Site Optimisation - Rule 2
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 2 rule: Good page & sub-directory names
Reasoning:
1. What is "good"? Good is quite hard to explain concisely. It is a collection of decisions that combine to effectively target your audience, as well as the search engine spiders. "Good" should align as closely as possible with how your target audience thinks, and with the keywords and phrases they would submit to find your content and your offering. This is very hard to do - there is a branch of study called "Information Architecture" working on it, as are the real search engine scientists. It is simplest for me to illustrate with clear examples:
2. Examples of bad names:
http://your.site.com/i.html
http://your.site.com/b/a/d.html
http://your.site.com/bestpracticeagilesoftwaredevelopment.html
http://your.site.com/agilesoftwaredevelopmentbestpractice.html
3. Examples of good names:
http://your.site.com/best-practice/agile-software-development.html
http://your.site.com/agile-software-development/best-practice.html
4. I chose the good examples carefully, as I also wanted to illustrate some of the subtleties involved in choosing good names. This is where competitive advantage, and a bit of luck, truly takes its course. I can imagine someone using a search engine would enter things like:
"best practice for agile software development"
"best practices of agile software development"
as well as
"agile software development best practice"
"agile software development best practices"
Your research, how you interact with your industry, and how you speak to your clients and they to you - these are the factors that determine how you should name your pages and sub-directories. Of course, if you have a new web site and have not had much exposure to your target client base, then you are playing a guessing game, which is okay! Do not panic - just make sure you realise it, and reduce your risk by running experiments and monitoring the results! (See my previous rule about how to measure your web site ROI very simply.)
5. Did you notice above that I also replaced the spaces in potential search phrases with "-" in my page and sub-directory names? As much as possible, your web site structure should reflect natural language usage. (A short sketch after this list shows the idea.)
6. Did you also notice that well-chosen sub-directories quickly give visitors insight into what other content might live in a particular sub-directory? Make sure the themes in each of your well-structured sub-directories are consistent. Consistency, relating similar things, linking them, linking out to related external sites, and having those sites link back to your pages all add up and count in your search engine rating against competitor web sites. (Synergy!)
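As promised, a minimal Python sketch of turning the natural-language phrases your visitors might search for into hyphenated page and sub-directory names. The stop-word list is my own illustrative choice, not a standard.

# Minimal sketch: derive hyphenated page/sub-directory names from phrases.
import re

STOP_WORDS = {"for", "of", "the", "a", "an", "in"}  # illustrative only

def slug(phrase):
    # Lower-case the phrase, keep the meaningful words, join with hyphens.
    words = [w for w in re.findall(r"[a-z0-9]+", phrase.lower())
             if w not in STOP_WORDS]
    return "-".join(words)

def page_path(directory_phrase, page_phrase):
    return f"/{slug(directory_phrase)}/{slug(page_phrase)}.html"

print(page_path("best practice", "agile software development"))
# -> /best-practice/agile-software-development.html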
And that is my Rule 2. I will be uploading the others as time allows!
My number 2 rule: Good page & sub-directory names
Reasoning:
1. What is "good"? Good is quite hard to explain concisely. It is a collection of decisions that combine to effectively target your audience, as well as search engine spiders. "Good" should align as closely as possible to how your target audience would think, and how they would submit keywords and phrases in order to find your content, in order to find what you are offering. This is very hard to do - there is a branch of study called "Information Architecture" that is working on this, as are the real search engine scientists. It is simplist for me to illustrate with clear examples..
2. Examples of bad names:
http://your.site.com/i.html
http://your.site.com/b/a/d.html
http://your.site.com/bestpracticeagilesoftwaredevelopment.html
http://your.site.com/agilesoftwaredevelopmentbestpractice.html
3. Examples of good names:
http://your.site.com/best-practice/agile-software-development.html
http://your.site.com/agile-software-development/best-practice.html
4. I chose the good example carefully as I also wanted to illustrate some of the subtleties involved in choosing good names. And this is where Competitive Advantage and a bit of luck truly takes its course. I can imagine someone using a search engine would enter things like:
"best practice for agile software development"
"best practices of agile software development"
as well as
"agile software development best practice"
"agile software development best practices"
How you research, how you interact with your industry, and how you and your clients speak to each other - these are the factors that determine how you should name your pages and sub-directories. Of course, if you have a new web site and have not had much exposure to your target client base, then you are playing a guessing game, which is okay! Do not panic - recognise that you are guessing, and reduce your risk by running experiments and monitoring the results! (see my previous rule about measuring your web site ROI very simply)
5. Did you notice above that I also replaced the spaces in potential search phrases with "-" in my page and sub-directory names? As much as possible, your web site structure should reflect natural language usage (see the sketch after this list).
6. Did you also notice that well chosen sub-directory names quickly give visitors insight into what other content a particular sub-directory might hold? Make sure the theme within each of your sub-directories is consistent. Consistency, relating similar things, linking them together, linking out to related external sites, and having those sites link back to your pages all add up and count in your search engine rating against your competitor web sites. (synergy!)
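To make point 5 concrete, here is a minimal sketch of turning a natural-language phrase into a hyphenated page or directory name. The helper name `slugify` and the example phrases are purely illustrative - any language would do, I have simply used TypeScript here:

```typescript
// Turn a natural-language phrase into a hyphenated page or directory name.
function slugify(phrase: string): string {
  return phrase
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9\s-]/g, "") // drop punctuation
    .replace(/\s+/g, "-")         // replace runs of spaces with a hyphen
    .replace(/-+/g, "-");         // collapse repeated hyphens
}

console.log(slugify("Agile Software Development")); // -> "agile-software-development"
console.log(slugify("Best Practice!"));             // -> "best-practice"
```

Whether you generate names by hand or with a little helper like this, the point is the same: the URL should read like the phrase your visitor would actually type.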
And that is my Rule 2. I will be uploading the others as time allows!
Monday, 7 July 2008
Basics of Web Site Optimisation - Rule 1
This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So...here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here, read my Web Site Optimisation Rules and you decide. They are, after all, common sense, and common knowledge....like most things I blog about!
My number 1 rule: No email addresses on the web site
Reasoning:
1. SPAM, SPAM, and more SPAM. Any email address on a web site that is not sitting on some kind of island (and most people with a web site actually want to be found... right?) will be picked up by the email-scraper bots that spammers (or their suppliers) use, and will subsequently be mail-bombed (even though that is illegal)
2. If it is a personal email address, then all of the above spamming, or events such as the person being on leave, being absent, or leaving the company, can leave potentially very important emails lying unread in an Inbox for an unacceptable amount of time, which reflects badly on the company (what counts as an acceptable wait on the web shrinks almost daily it seems - I would say half an hour these days is pushing the limit for modern web users). Closing a spammed-to-death email account is also a very painful IT process, especially if it is tied to the user's LAN account, as is frequently the case.
3. If it is an organisational email address (e.g. info, help, request), that is slightly better than number 2 above: at least more than one (non-administrator) person can access the account should they need to, and such "one way" accounts can be created, deleted and replaced easily as they become spammed to death.
So what is the better way? I recommend using a standard HTML form (or a handful of them, as your needs dictate - based on how you categorise the queries you hope will be submitted via your site). A sketch of how such a form can be processed follows the reasoning list below.
Reasoning:
1. Forms, via the web server behind them, can very easily be set up to email an organisational address - the same benefit as having an email address on the site!
2. You can put a few very simple fields on the form that will help with the eventual processing of the query (very useful once things start getting busier - trust me, I have a couple of T-shirts on this one: if things go as planned, they get busy far quicker than extra help can be found!)
3. You can add some very lightweight validation to the form, to help clients give you the information you need in as few exchanges as possible (again, the busier you get, the less time you have for this, and the longer clients wait between first submission and final conclusion)
4. Forms can be submitted by anyone who is browsing your site, whereas email facilities might not be available to someone browsing from their phone, an internet cafe, a library, or a free kiosk, for instance
5. Many people use web-based email accounts these days, and it takes some effort to get your email address over into their web mail client
6. The fastest way to get a client to submit something usable to you has to be a competitive advantage - fewer distractions, fewer potential glitches, fewer context switches, fewer window switches, etc!
7. You get automatic Return On Investment (ROI) tracking, because both the forms and their confirmation pages can have unique names. This lets you, by hand or with a basic web traffic reporting system, see how many visitors (not visits!) your web site received, how many times your enquiry form was opened (= the number of people who considered sending you a query), and how many times the confirmation page was opened (= visitors who wanted to become your clients). E.g. 1000 visitors, enquiry form opened 100 times, confirmation page opened 10 times means that 1% of your traffic resulted in sales leads. Possible interpretations: SEO is attracting the wrong kind of visitor, and/or the enquiry form is too difficult to use or does not inspire enough confidence to be submitted, etc.
8. In every online system I have worked with or advised on, there is always a manager who wants some idea of the amount of work coming in and how quickly it is processed, plus the ability to step in if something goes wrong. Forms can be posted to multiple email addresses - or the email server can create backups and/or copies - all online, all at the manager's fingertips: in his/her Inbox (or a rule-based sub-folder)
9. The email server can also send an auto-response to the client, so long as an email address is entered into a field that can easily be processed. This is good because the potential client will then give you more time than average to follow up with a real response - especially if you send the vital information that the submitted form has been received and will be processed as soon as possible, between office hours of 8:00 and 18:00 GMT+2, for instance. (And thank you for your submission, we appreciate you very much! By the way, here is a backup email address you can reply to if you hear nothing within 1 working day, quoting this reference number.) The email server and/or the script doing the emailing can generate that reference tracking number for you.
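To tie several of these points together, here is a minimal sketch of a form-to-email handler. It assumes a Node.js server with the Express and Nodemailer packages; the field names, addresses, office hours and reference format are all illustrative, not prescriptive - adapt them to your own set-up:

```typescript
import express from "express";
import nodemailer from "nodemailer";

const app = express();
app.use(express.urlencoded({ extended: true })); // parse standard HTML form posts

// Illustrative mail server; point this at your own SMTP host.
const mailer = nodemailer.createTransport({ host: "mail.example.com", port: 25 });

app.post("/enquiry", async (req, res) => {
  const { name, email, message } = req.body;

  // Point 3: lightweight validation, so usable queries arrive first time.
  if (!name || !message || !email || !/^\S+@\S+\.\S+$/.test(email)) {
    res.status(400).send("Please fill in all fields with a valid email address.");
    return;
  }

  // Point 9: a simple generated reference tracking number.
  const reference = `ENQ-${Date.now()}`;

  // Points 1 and 8: deliver to an organisational address, copy the manager.
  await mailer.sendMail({
    from: "web@example.com",
    to: "info@example.com",
    cc: "manager@example.com",
    subject: `Web enquiry ${reference} from ${name}`,
    text: message,
  });

  // Point 9: auto-response so the client knows the query arrived.
  await mailer.sendMail({
    from: "info@example.com",
    to: email,
    subject: `We received your enquiry (${reference})`,
    text:
      "Thank you for your submission. We will respond as soon as possible, " +
      "between office hours of 8:00 and 18:00 GMT+2. Please quote " +
      `${reference} in any follow-up.`,
  });

  res.send(`Thank you - your reference number is ${reference}.`);
});

app.listen(3000);
```

In practice you would redirect to a uniquely named confirmation page rather than send inline text - that is what gives you the enquiry-form versus confirmation-page counts (and the 1% lead figure) that point 7 relies on.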
Nice, I think. This approach has always worked very well for me. Perhaps it will for you too.
What to do if you absolutely must still have an email address on the web site? And, to be honest, everyone using the forms method above still does...
1. Create a tidy image of an organisational address and use that instead (scraper bots read text, not pictures)
2. Get "clever" with some javascript that assembles a clickable HTML version of the email address on demand, which spam bots cannot get at - or, less "clever", generates the address for the user to paste into their email client. (There are plenty of places to download and customise such scripts, or any javascript programmer could put one together for you in a few minutes - see the sketch after this list.)
That's the way I do it. There is another way some use:
3. Obfuscate (lightly scramble) the email address so that the scraper bots cannot detect that it is an email address, and use that. People try all sorts of techniques, but I am unsure how effective they really are - e.g. joeATsoapDOTcom, joeRemoveThis@soapRemoveThisToo.com
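For option 2, here is a minimal sketch of the javascript trick (written as TypeScript for the browser, to match the other sketches). The address parts and the `contact` element id are illustrative; the point is simply that the full address never appears verbatim in the HTML source:

```typescript
// Assemble the address at run time; scrapers reading the raw HTML see only parts.
const user = "info";          // illustrative local part
const domain = "example.com"; // illustrative domain

const link = document.createElement("a");
link.href = "mailto:" + user + "@" + domain;
link.textContent = user + "@" + domain;

// Attach to a placeholder element such as <span id="contact"></span>.
document.getElementById("contact")?.appendChild(link);
```

Bear in mind that a scraper willing to execute javascript can still recover the address; this only defeats the simple text-matching bots.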
And that is my Rule 1. I will be uploading the others as time allows!
Wabi-sabi
I stumbled onto "wabi-sabi" and thought "Yes!" - a new saying for me that fits in nicely with this blog! It has apparently even been used in Agile and Wiki discussions! (no guessing where I found it!)
According to our great source of free intelligent information, Wikipedia, the term is of Japanese origin. The entry quotes Richard R. Powell, who summarised its meaning: "[wabi-sabi] nurtures all that is authentic by acknowledging three simple realities: nothing lasts, nothing is finished, and nothing is perfect."
Furthermore, the deeper meaning is expressed by Andrew Juniper: "if an object or expression can bring about, within us, a sense of serene melancholy and a spiritual longing, then that object could be said to be wabi-sabi."
I like it!
Thank you for your support!
Tuesday, 1 July 2008
Cool way of visualising Eclipse's history
Here is a little video visualising the various Eclipse contributors over the years checking source files, documents and images into the repository. It makes for a few minutes of interesting viewing! Eclipse Code Swarm (short version)