I’m used to working either as a contractor or a permie for other companies, normally on projects of one to six months’ duration, in technology areas I know well. In those scenarios I can usually estimate quite accurately.

However, my current project is a departure for me. I am writing some software on my own for a customer (i.e. I’m not working within a company), in an industry I didn’t know (manufacturing), using technologies I had never really used before (WPF/WCF/XP Embedded/real-time). Oh, and the requirements (as always) weren’t exactly set in stone. That isn’t the customer’s fault – it’s a fact of life that if you’re writing something new, people don’t know what they want until they see something to base their ideas on.

The customer needs to know up front, roughly, whether they can afford it, and whether they’ll have something soon enough to meet their business needs (i.e. they may have a limited window of opportunity). So, how do you come up with an estimate of how long it’s going to take?

Well, I broke the project down into what seemed like logical chunks/modules/sub-tasks, estimated those, and came up with 4 months… fine, everybody’s happy… except that was a year ago! I was out by a factor of 3! WTF!

Now, I know developers are notoriously optimistic with their estimates, but as I said, I’m normally not too shabby. However, I hadn’t factored in just how much time it would take to gain experience with the new technologies. And I don’t just mean learning them (e.g. I thought I had already learned WPF) – I mean using them, in anger, on a real project, solving real problems.

So, lesson learned. Estimating is fine if you have real experience in everything you’re using on the project. For anything new, take your original estimate and multiply it by 3 :)
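
If you wanted to turn that rule of thumb into a back-of-the-envelope calculation, it might look something like this. This is just a toy sketch: the module names and week counts are made up for illustration, and the 3x multiplier is simply the factor this project turned out to need.

```python
# Toy estimate calculator: sum per-module estimates, inflating any module
# that relies on technology you've never used in anger.
# All module names and numbers are illustrative, not from a real plan.

NOVELTY_MULTIPLIER = 3  # the factor this project turned out to need

# (module, estimate in weeks, uses unfamiliar technology?)
modules = [
    ("data acquisition", 3, True),   # e.g. real-time / XP Embedded work
    ("services layer",   4, True),   # e.g. WCF, never used before
    ("UI",               5, True),   # e.g. WPF, "learned" but never used in anger
    ("reporting",        4, False),  # familiar ground
]

def adjusted_estimate(modules, multiplier=NOVELTY_MULTIPLIER):
    """Return (naive, adjusted) totals in weeks."""
    naive = sum(weeks for _, weeks, _ in modules)
    adjusted = sum(
        weeks * multiplier if unfamiliar else weeks
        for _, weeks, unfamiliar in modules
    )
    return naive, adjusted

if __name__ == "__main__":
    naive, adjusted = adjusted_estimate(modules)
    print(f"Naive estimate:    {naive} weeks (~{naive / 4.3:.0f} months)")
    print(f"Adjusted estimate: {adjusted} weeks (~{adjusted / 4.3:.0f} months)")
```

With these made-up numbers, the naive total comes out at 16 weeks (roughly the 4 months I originally quoted), while the adjusted total is 40 weeks – and note the multiplier only hits the unfamiliar modules, which is exactly where my own overrun came from.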