Wednesday 23 July 2008

Building Scalability and Achieving Performance

This is a short and very useful read on scaling architectures - InfoQ got three key architects, with backgrounds at Twitter, eBay and Betfair, to share their experiences, the approaches they take, and some tool recommendations. Practical advice! Building Scalability and Achieving Performance.

Tuesday 22 July 2008

Basics of Web Site Optimisation - Rule 5

This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So... here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and you decide. They are, after all, common sense and common knowledge... like most things I blog about!


My number 5 rule: Comply with web standards - HTML

It is pretty amazing to me that there are many "professional" web site developers who do not know that there is a consortium of key organisations behind a set of standards for the technologies and protocols used on the internet. The World Wide Web Consortium (W3C) has created, ratified and published standards for HTML and related protocols for years and years!

In the past I standardised my web pages on an intermediate standard (the HTML 4.01 Transitional specification), because the web browsers I was testing with (Opera, Internet Explorer, Firefox, Netscape) generally rendered (displayed) the result I designed and implemented, or something acceptably close to it (or to the client's requirement).

While researching this entry, I noticed that the W3C have just released an "Editor's Draft" of the latest specification on 17 July 2008. Check it out at HTML 5. I would not consider standardising on this one just yet though.

So how do you know if your web developer has even attempted to create standards-compliant web pages or a standards-compliant web site for you? You could do it the hard way - use your browser's functionality to view the HTML source of your web page and look for the "DOCTYPE" declaration on the first line, for example:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">

But verifying that your page claims to comply with one of the standards is not good enough! The wonderful W3C provide very useful and usable online Quality Assurance tools! Use the HTML Validator - simply enter the address of the page you would like to validate (the page has to be publicly accessible), and see what results you get back!
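If you would rather script that quick first-line check than view the source by hand, here is a minimal Python sketch. It only confirms a DOCTYPE declaration is present - the validator above does the real work - and the URL is a placeholder, not a real site:

# Minimal sketch: fetch a page and check whether it begins with a
# DOCTYPE declaration. The URL below is a placeholder.
from urllib.request import urlopen

def has_doctype(url):
    with urlopen(url) as response:
        # The DOCTYPE, if present, must appear before any other markup,
        # so the first few hundred bytes are enough.
        head = response.read(512).decode("utf-8", errors="replace")
    return head.lstrip().upper().startswith("<!DOCTYPE")

print(has_doctype("http://www.example.com/"))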

Alternatively there is a little program you can run, also available via the W3C, that will give you similar results. It is freely available, and its name is "tidy" or "HTML Tidy". It performs checks very similar to those of the online W3C HTML Validator linked above! You can get information and download details from here.
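If you want to batch this up over many saved pages, a rough Python sketch along these lines works - it assumes the tidy executable is installed and on your PATH, and that it follows tidy's usual exit-code convention:

# Sketch: run HTML Tidy over a saved page and report its verdict.
# Assumes the "tidy" executable is installed and on your PATH.
import subprocess

def check_with_tidy(path):
    # -q suppresses the summary chatter; -e reports errors and warnings
    # without writing out a cleaned-up copy of the file.
    result = subprocess.run(["tidy", "-q", "-e", path],
                            capture_output=True, text=True)
    # Tidy conventionally exits 0 when clean, 1 for warnings, 2 for errors.
    verdict = {0: "clean", 1: "warnings", 2: "errors"}
    print(path + ":", verdict.get(result.returncode, "unknown"))
    print(result.stderr)

check_with_tidy("index.html")  # index.html is a placeholder file name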

I recommend, if you are interested (this is not difficult, actually), that you get a little training in the subject matter! The W3Schools Online Web Tutorials are a good place to start!

Reasoning:
1. If your web pages comply with the standard, there is a greater chance that your site will render as you intend in your visitors' different browsers - an excellent idea!
2. If your web pages comply with the standard, there is a greater chance that your site will be crawled successfully by the search engine bots - a very good thing!
3. If your pages are NOT COMPLIANT, your search engine RANKINGS may be NEGATIVELY AFFECTED - a VERY BAD thing!
4. If your pages are NOT COMPLIANT, they can take longer to render in a browser, as the browser will do its best to guess at correcting the page, and this takes extra processing time - a VERY BAD thing!
5. By complying with the standard, you gain tool and developer freedom should you one day decide to maintain your site in a different tool or by a different developer (no lock-in) - a good thing.

And that is my Rule 5. I will be uploading the others as time allows!

Friday 18 July 2008

Ethical Office Politics

I have been meaning to read this article for about a month now, and finally got the time this morning! Adrian, the author of the article, covers quite a few topics throughout the piece, and I found it an insightful and well-thought-out argument.
Ethical Office Politics

I think he does a good job of covering most of the issues I have studied, heard about, thought about and/or experienced.

Wednesday 16 July 2008

Basics of Web Site Optimisation - Rule 4

This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So... here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and you decide. They are, after all, common sense and common knowledge... like most things I blog about!

My number 4 rule: Analyse your site traffic

If you are hosting your site at a basic web hosting ISP, you should have been supplied with at least a basic (or advanced) traffic reporting system. Use the system! Figure out which reports are useful (I will also be providing some of my ideas in this post) and monitor them!

If you do not have the luxury of being supplied with a traffic reporting system to help you with your analysis, then I recommend using Google Analytics. It is free and gives you quite a bit of useful basic information, all accessible online. There is, however, the issue of privacy, but this is something that every responsible person should evaluate for themselves and decide on - I am not getting into that debate here!

In between the easy, the free, and the raw examination of the log files, there is a range of commercial tools that will process your raw log files and present them as very sophisticated and valuable reports! Personally, I am not going to recommend any of them, as my exposure to them is very limited - and besides, if you're planning on using these tools yourself, you've probably moved beyond a "Basics of Web Site Optimisation" categorisation. :)

The final option is to actually take the raw web server log files and read them. As insane as this may appear at first glance, they are actually fairly interpretable, although you run the high risk of "not being able to see the wood for the trees" - you may be overwhelmed! You will need a capable editor and the ability to do complex searches; regular-expression-based searches are a real win! (see the sketch below)
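To make this concrete, here is a small Python sketch that parses lines in the common Apache "combined" log format - if your server logs in a different format you will need to adjust the regular expression, and access.log is a placeholder file name:

# A rough parser for the Apache "combined" access log format, assuming
# lines like:
#   host ident user [time] "request" status bytes "referer" "user-agent"
import re

LOG_LINE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\S+) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse(line):
    # Returns a dict of named fields, or None for lines that do not match.
    match = LOG_LINE.match(line)
    return match.groupdict() if match else None

with open("access.log") as log:
    for line in log:
        entry = parse(line)
        if entry and entry["status"].startswith("4"):
            # Surface the 404s and friends - broken links cost you visitors.
            print(entry["host"], entry["request"], entry["status"])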

The kinds of reports these tools, or this analysis provides you with are listed below in my reasoning section.

Reasoning:
1. How else are you able to run little experiments with different pages, with different site structures, with different little things you are going to do, unless you can measure and verify the effect?
2. Visitors - daily, weekly, monthly and yearly. NOT VISITS! Visitors are physical entities (people or web bots) looking at your site. Visits give you an indication of how exciting they found your site, but not a good one.
3. Bounce Rate - how many people found your site, instantly did not like it, and left. This gives you an indication that your Search Engine Optimisation strategy might be bringing you the wrong visitors, or that you should change your content to give these visitors something to stick around for!
4. Average Time on Site - is interesting. Search engine web bots normally "read" a lot faster than humans, but they are coded these days to be "sensitive" to the web server and act more like a human reader. But keep an eye on the average trend and work out if your pages are too long or complicated.
5. Pages/Visit - anything greater than 1 is good, obviously. Compare this figure with the Average Time on Site, though, and again work out how fast or slow your visitors are reading. Check what your market is: it is consistent for detailed technical information to be read slowly and carefully, versus quick brochure-type information which can be gleaned in seconds, especially if there are "pretty pictures"
6. Top Content - monitor your top 10 pages, or top 20% of all your pages. Make sure 80% of your traffic is going to them. Make sure most of your best income-spinners are on these pages. Do not ever ever ever "break" these - everyone loves them, the search engines love them. Be VERY CAREFUL with them
7. Top Landing Pages - check that your new pages fall into this list as well as your Top Content
8. Top Exit Pages - it may make sense for you to think about deleting some of the really bad pages you have, except if you have actually had sales from them or need them to compete in a "me too" marketplace
9. Keywords - of course! How are people finding your site? Give them more of what they want, or change the pages you have in order to make them more findable by searchers. This is, in effect, your marketplace competitive effort - make sure you are attracting your intended market with your offering! Alternatively (or additionally), identify an under-served market niche you are already attracting customers for, and could potentially move into. (there will be some rules about this in the future)
10. All Traffic Sources - here you want to compare the traffic you get directly (people who know and love you, or are recommended to your site by their friends), versus your top "reciprocal link partners", versus your top search engine traffic suppliers. Make sure you have balanced your risks. Search engines change algorithms all the time and you could be dropped for periods. Same with link partners. And if you are not getting any direct traffic, you should definitely do some more offline marketing and public relations campaigning - think about loyalty clubs and newsletters, etc.!
11. Where your visitors are geographically dispersed - make sure your target region is correctly reflected, as it is no good attracting visitors from Timbuktu if you are trying to sell items in London (and vice-versa), etc.
12. New Visitor versus Returning Visitor - new is always good - you are getting new traffic and new potential customers. Returning is very important though - this is your loyalty and/or "love" factor. If you are not getting "repeat business" then you are potentially doing something very wrong - though it depends on what you are using your web site for. Check how many pages the repeaters view, how much time they spend, and compare against your new visitors. Get a feeling for your "market"
13. Use combinations of the above reports to check how many times your enquiry form was opened, how many times it was submitted, how many visitors and repeat visitors you had, and how many new visitors actually submitted on their first visit (rare, but it does occur)
All of these reports are about trend analysis. It is not useful to get excited or depressed about a spot measurement you do randomly. You need to see the changes over time! There are times however, once you become very familiar with your traffic patterns (eg you know when the Google web spider indexes your site), when spot analysis can get exciting - eg you have just timed and launched a new section of your site and Google has finally located it and is indexing it. (believe me, this is exciting if your site is dynamic and you have just published 800 new pages on an existing 100-page site...)
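For a flavour of how a few of these reports fall out of raw logs, here is a small self-contained Python sketch that approximates visitors, pages/visitor and top content. It pairs naturally with the parser sketch in the raw-log option above; note that unique hosts undercount real people sitting behind shared proxies - real tools use cookies or sessions:

# Rough approximations of a few of the reports above, computed from
# parsed log entries (dicts with "host" and "request" keys, as produced
# by the parser sketch earlier in this post).
from collections import Counter

def summarise(entries):
    hosts = Counter()   # rough "visitors", keyed by client host
    pages = Counter()   # page views per path
    for entry in entries:
        hosts[entry["host"]] += 1
        # A request looks like "GET /path HTTP/1.1"; keep just the path.
        parts = entry["request"].split()
        if len(parts) == 3:
            pages[parts[1]] += 1
    views = sum(pages.values())
    print("unique visitors (approx):", len(hosts))
    if hosts:
        print("pages/visitor (approx):", round(views / len(hosts), 2))
    print("top content:", pages.most_common(10))

# Tiny hand-made sample so the sketch runs on its own:
summarise([
    {"host": "1.2.3.4", "request": "GET /index.html HTTP/1.1"},
    {"host": "1.2.3.4", "request": "GET /contact.html HTTP/1.1"},
    {"host": "5.6.7.8", "request": "GET /index.html HTTP/1.1"},
])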

With a bit of thought and enquiry monitoring in your office, and the above reports, you should be able to verify where your income is actually coming from. Many people are very surprised to learn the truth. Make sure you know!

There are many more reports, and combinations of statistics, you can look at - it all depends on what your needs are. The ones above are usually sufficient for mine, and most small to medium businesses I have helped in the past can use them to improve their offering, their processing, their web site, and above all their business.

And that is my Rule 4. I will be uploading the others as time allows!

Tuesday 15 July 2008

Best Practice Applied In Wrong Context - Example 1

A friend of mine was ranting the other day. He had just done an iteration retrospective with his development team wherein they took a look at their quality metrics and discovered that "quality" had actually dropped off even though his team had spent more time than ever before on the client's official Quality Process.

I discussed this further with him and we agreed that quality is a mostly subjective concept when it comes to software ... we agreed that it can't be objectively measured ... we agreed that it can't be artificially injected "to meet the required metric" ... we agreed that it is something that Software Quality Assurers infer based on monitoring the various metric trends that make sense in the particular environment/context that the software is being developed in/for.

So with all this agreement I asked him to explain further.

This story is probably a symptom of why I think there is so much cynicism in the software industry. Read on if you dare!

It turns out that the client has a well worked out, well defined Quality Process that they are extremely happy with. This Process is guaranteed to prevent massive loss of life/income/spiralling out-of-control costs/etc. - you can imagine why a large company invests huge amounts of time and money in creating a bullet-proof Quality Process: to manage risk, whatever that risk is.

Okay ... so why is my friend ranting? His team followed the process, apparently passed a bunch of procedural milestones, and everyone was happy. Yet when he and his team looked at the metrics they had defined for measuring quality, they noticed that the number of issues had risen, and that some important tests had not been run early enough in the iteration to find issues they could respond to before the end of the iteration. There were known open issues, and there were issues that had been addressed but had not been signed off. There was waste accumulating that had not been a problem before.

How did this happen? The people whose responsibility it was to run the tests and provide the early feedback had been too busy ensuring the team met the Quality Process requirements - they had been documenting, reviewing, getting documentation reviewed, and spending a large amount of time away from the product they were responsible for delivering. They were drifting away from the users' needs.

And it showed.

There is no happy ending here - key client representatives (project stakeholders, but not users) have to ensure that their organisation's process is followed, even if they know, and everyone else knows, that the process is not adding value and that indeed, as above, it is actually diminishing value. It often appears that a group of client representatives needs to experience failure and pain several times before they will attempt to address a badly formulated - or, as in the example above, badly placed - process. Sometimes, regrettably, these lessons are learned during retrenchment phases.

Yuck!

Basics of Web Site Optimisation - Rule 3

This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So... here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and you decide. They are, after all, common sense and common knowledge... like most things I blog about!

My number 3 rule: Choose or move to a good web site/domain name

Reasoning:
1. A great deal of emphasis is placed on web site / domain names by search engines when they rank your site
2. A great site name is easy for visitors to remember, to type, and to tell their friends about - and their friends will also be able to spell it!
3. A great name does not get easily confused with another site
4. Choose carefully between .com and .co.uk/.za/.ch/etc.! Where are your visitors? What are you providing? Where would your visitors EXPECT you to be? Your domain name suffix also sets up expectations for visitors who do not know you at all and are trying to distinguish you from your millions of competitors!
5. Similar to my previous rule regarding good page and sub-directory names, the closer your site name is to English and your "market speak", the better it will do, eg, bad:
http://www.baddayatwork.com, versus
good:
http://www.good-day-at-beach.com
6. Try to make sure each of your web pages' content somehow relates to your domain name - you do this either by creating very small web sites that are very focussed, or by choosing broader domain names that can be more easily applied to the different types of content on each page
7. One day when you are busy swapping links with valuable online partners, the closer your domain name is aligned with theirs (or the broader it is), the better it will be,
eg, bad:
valuable partner's site is all about pencils, eg http://www.pencils.com, and your site is named:
http://www.erasers.com, versus
good:
http://www.pencil-erasers.com
best:
when the search engines look from pencils.com to your pencil-erasers.com and back again, they will identify a cluster of related sites, and thus rank both your sites higher!

As a final note - it both does and does not matter what domain name you eventually choose. In the initial stages it helps to have a good one; after your site is well known, and after you are receiving as many enquiries as you can cope with, it really does not matter. How many sites have "bad" names yet are now household names - Yahoo, Google - these were not common English words before they became very well known web sites! And in my industry it is common for the "gurus" to create site names from the concatenation of their first and last names - just look at the list I read on the right of this page to see what I mean!

And that is my Rule 3. I will be uploading the others as time allows!

Thursday 10 July 2008

Basics of Web Site Optimisation - Rule 2

This post is mainly aimed at small to medium businesses that are just starting out and are keen to get something going, or have just gone live. I can't tell you how many times I have taught people over the past few years just a handful of strategically important things. So... here goes again, this time in a way that I can now simply refer to. As for my credibility - I would rather not divulge that here; read my Web Site Optimisation Rules and you decide. They are, after all, common sense and common knowledge... like most things I blog about!

My number 2 rule: Good page & sub-directory names

Reasoning:
1. What is "good"? Good is quite hard to explain concisely. It is a collection of decisions that combine to effectively target your audience, as well as search engine spiders. "Good" should align as closely as possible with how your target audience thinks, and with the keywords and phrases they would submit in order to find your content and what you are offering. This is very hard to do - there is a branch of study called "Information Architecture" working on this, as are the real search engine scientists. It is simplest for me to illustrate with clear examples:

2. Examples of bad names:
http://your.site.com/i.html
http://your.site.com/b/a/d.html
http://your.site.com/bestpracticeagilesoftwaredevelopment.html
http://your.site.com/agilesoftwaredevelopmentbestpractice.html

3. Examples of good names:
http://your.site.com/best-practice/agile-software-development.html
http://your.site.com/agile-software-development/best-practice.html

4. I chose the good examples carefully, as I also wanted to illustrate some of the subtleties involved in choosing good names. This is where Competitive Advantage and a bit of luck truly take their course. I can imagine someone using a search engine would enter things like:
"best practice for agile software development"
"best practices of agile software development"
as well as
"agile software development best practice"
"agile software development best practices"

Depending on your research, on how you interact with your industry, and on how you speak to your clients and how they in turn speak to you - these factors determine how you should name your pages and sub-directories. Of course, if you have a new web site and you have not had much exposure to your target client base, then you are playing a guessing game, which is okay! Do not panic - just make sure that you realise it, and reduce your risk by running experiments and monitoring the results! (see my previous rule about how to measure your web site ROI very simply)

5. Did you notice above that I also replaced the spaces in potential search phrases with "-" in my page and sub-directory names? As much as possible, your web site structure must reflect natural language usage... (a small sketch after this list shows one way to automate this)

6. Did you also notice that well-chosen sub-directories quickly give visitors insight into what other content might be in a particular sub-directory? Make sure the themes in each of your well-structured sub-directories are consistent. Consistency, relating things that are similar, linking them, and linking to external related sites (and having external related sites link back to your pages) are things that all add up and count in your search engine ranking against your competitor web sites. (synergy!)
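As promised in point 5, here is a purely illustrative Python sketch that turns a phrase your market might search for into a page name of this style - lower-case words joined by hyphens:

# Purely illustrative: turn a search phrase into a hyphenated page name.
import re

def slugify(phrase):
    # Keep runs of letters and digits, lower-cased, joined by hyphens.
    words = re.findall(r"[a-z0-9]+", phrase.lower())
    return "-".join(words)

print(slugify("Agile Software Development"))  # agile-software-development
print(slugify("Best Practice!"))              # best-practice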

And that is my Rule 2. I will be uploading the others as time allows!
