Wednesday, 22 January 2014

Estimation (Effort and Duration) #2/2

Yesterday I spoke about the art of estimation. I highlighted how valuable it is to be able to plan and estimate. I also highlighted that estimation is about getting a number that is close enough for the purpose, not about being accurate.

To get to the meat of things... here is my recipe for estimation success... It won't surprise you to see me say that the key elements of estimation are:
  1. Understand the task, i.e. what needs to be produced
  2. Comparison with similar tasks already completed
  3. Decomposition of tasks into smaller, more measurable activities
  4. Attention to detail (sufficient for our purposes)
The first (requirements) is obvious, but is very often not given enough attention to detail, resulting in an incomplete set of items to be produced. In a SAS context, this list might include technical objects such as Visual Analytics reports, stored processes, information maps, macros, DI Studio jobs, table metadata, library metadata, job schedules, security ACTs and ACEs, userIDs, data sets, views, and control files. On the business side, your list might include a user guide, training materials, a schedule of training delivery, a service model that specifies who supports which elements of your solution, a service catalog that tells users how to request support services, and a process guide that tells support teams how to fulfil support requests. And on the documentation side, your list might include requirements, designs & specifications, and test cases.

Beyond identifying the products of your work, you'll need to identify what inputs are required in order to allow you to perform the task.

I'll offer further hints, tips and experience on requirements gathering in a subsequent article.

With regard to comparisons, we need to compare our planned task with similar tasks that have already been completed (and hence we know how many people worked on them and how long they took). When doing this we need to look for differences between the tasks and take account of them by adjusting our estimate up or down from the time the completed task actually took. In doing this, we're already starting to decompose the task because we're looking for partial elements of the task that differ.

Decomposition is the real key, along with a solid approach to understanding what each of the sub-tasks does. As you decompose a unique task into more recognisable sub-tasks, you'll be able to more confidently estimate the effort/duration of the sub-tasks.

As we decompose the task into smaller tasks, we must be clear which of the decomposed tasks is responsible for producing each of the deliverable items. We need to look out for intermediate items that are produced by one sub-task as input to another; and we must pay the same attention to inputs, being certain that we understand the inputs and outputs of each sub-task.
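To make that roll-up concrete, here's a minimal sketch in SAS (the sub-task names and effort figures are invented for illustration, not taken from any real plan): record each sub-task with its estimated effort, then sum the estimates to get the overall figure for the parent task.

  data subtasks;
    length subtask $ 32;
    input subtask $ effort_days;          /* list input: sub-task name, then estimated days */
    datalines;
  gather_requirements 2
  build_di_job 5
  unit_test 2
  write_user_guide 1
  ;
  run;

  proc means data=subtasks sum maxdec=1;  /* total effort for the parent task */
    var effort_days;
  run;

The same idea scales to whatever level of decomposition you settle on; the point is that each row carries its own inputs, outputs and estimate.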

I'll offer a deeper article on decomposition next week.

You're probably thinking that requirements, comparisons, and decomposition are quite obvious. So they should be! We already established that we all perform estimations every day of our lives. All I've done is highlight the things that we do subconsciously. But there is one more key element: attention to detail. We must pay attention to understanding each sub-task in our decomposition. We must be sure to understand its inputs, its outputs, and how we're going to achieve the outputs from the inputs.

Having a clear understanding of the inputs, the outputs and the process is crucial, and it can often help to do the decomposition with a colleague. Much like with pair programming, two people can challenge each other's understanding of the task in hand and, in our context, make sure that the inputs, outputs and process of each sub-task are jointly understood.

I hope the foregoing has helped encourage you to estimate with more confidence, using your existing everyday skills. However, we should recognise that the change in context from supermarket & driving to software development means that we need a different bank of comparisons. We may need to build that bank with experience.

To learn more about estimating, talk to your more experienced colleagues and do some estimating with them. I'm not a great fan of training courses for estimation. I believe they're too generic. In my opinion, you're far better off learning from your colleagues in the context of the SAS environment and your enterprise. However, to the extent that some courses offer practical exercises, and those exercises offer experience, I can see some merit in courses.

Good luck!

Tuesday, 21 January 2014

Estimation (Effort and Duration) #1/2

Estimation: The art of approximating how much time or effort a task might take. I could write a book on the subject (yes, it'd probably be a very dull book!); it's something that I've worked hard at over the years. It's a very divisive subject: some people buy into the idea whilst others persuade themselves that it can't be done. There's a third group which repeatedly try to estimate but find their estimates wildly inaccurate and seemingly worthless, and so they eventually end up in the "can't be done" camp.

Beware. When we say we can't do estimates, it doesn't stop others doing the estimation on our behalf. The end result of somebody else estimating how long it'll take us to deliver a task is a mixture of inaccuracy and false expectations placed upon us. Our own estimates of our own tasks should be better than somebody else's estimate.

My personal belief is that anybody can (with a bit of practice and experience) make decent estimates, but only if they perceive that there is a value in doing so. In this article I'll address both: value, and how to estimate.

So, let's start by understanding the value to be gained from estimation. The purpose is not to beat up the estimator if the work takes longer than estimated! All teams need to be able to plan - it allows them to balance their use of constrained resources, e.g. money and staff. No team has enough staff and money to do everything it wants at the same time. Having a plan, with estimated effort and duration for each activity, helps the team keep on top of what's happening now and what's happening next; it allows the team to start things in sufficient time to make sure they're finished by when they need to be; and it allows the team to spot early when a task probably won't get done in time, so that something can be done to address the issue.

Estimates form a key part of a plan. For me, the value of a plan comes from a) the thought processes used to create the plan, and b) the information gained from tracking actual activity against the planned activities and spotting the deviations. There's very little value, in my opinion, in simply having a plan; it's what you do with it that's important.

Estimates for effort or duration of individual tasks combine to form a plan - along with dependencies between tasks, etc.

Okay, so what's the magic set of steps to get an accurate, bullet-proof estimate?...

Well, before we get to the steps, let's be clear that an estimate is (by definition) not accurate nor bullet-proof. I remember the introduction of estimation in maths classes in my youth. Having been taught how to accurately answer mathematical questions, I recall that many of us struggled with the concept of estimation. In hindsight, I can see that estimating required a far higher level of skill than cranking out an accurate answer to a calculation. Instead of dealing with the numbers I was given, I had to make decisions about whether to round them up or round them down, and I had to choose whether to round to the nearest integer, ten or hundred before I performed the eventual calculation.

We use estimation every day. We tot-up an estimate of the bill in our heads as we walk around the supermarket putting things into our trolley (to make sure we don't go over budget). We estimate the distance and speed of approaching vehicles before overtaking the car in front of us. So, it's a skill that we all have and we all use regularly.

When I'm doing business analysis, I most frequently find that the person who is least able to provide detail on what they do is the person whose job is being studied. It's the business analyst's responsibility to help coax the information out of them. It's a skill that the business analyst needs to possess or learn.

So, it shouldn't surprise us to find that we use estimation every day of our life yet we feel challenged to know what to do when we need to use the same skills in a different context, i.e. we do it without thinking in the supermarket and in the car, yet we struggle when asked to consciously use the same techniques in the office. And, let's face it, the result of an inaccurate estimate in the office is unlikely to be as damaging as an inaccurate estimate whilst driving, so we should have confidence in our office-based estimations - if only we could figure out how to do it!

We should have confidence in our ability to estimate, and we should recognise that the objective is to get a value (or set of values) that are close enough; we're not trying to get the exact answer; we're trying to get something that is good enough for our purposes.

Don't just tell your project manager the value that you think they want to hear. That doesn't help you, and it doesn't help the project manager either.

Don't be afraid to be pessimistic and add a fudge factor or contingency factor. If you think it'll take you a day then it'll probably take you a day and a half! I used to work with somebody who was an eternal optimist with his estimations. He wasn't trying to pull the wool over anybody's eyes, he honestly believed his estimations (JH, you know I'm talking about you!). Yet everybody in the office knew that his estimations were completely unreliable. Typically, his current piece of work would be "done by lunchtime" or "done by the end of today". We need to be realistic with our estimations, and we need to look back and compare how long the task actually took with our estimate. If you estimated one day and it took two, make sure you double your next estimation. If somebody questions why you think it'll "take you so long", point them to your last similar piece of work and tell them how long it took you.
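To make that look-back concrete, here's a hedged sketch (the figures are invented): divide actual effort by estimated effort for a handful of recent tasks, average the ratios, and apply that calibration factor to your next raw estimate.

  data history;
    input estimate_days actual_days;
    ratio = actual_days / estimate_days;   /* how far out each past estimate was */
    datalines;
  1 2
  3 4
  2 3
  ;
  run;

  proc sql;
    select mean(ratio)     as calibration_factor,
           5 * mean(ratio) as adjusted_estimate   /* assuming a raw next estimate of 5 days */
    from history;
  quit;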

When creating and supplying an estimate it's worth thinking about three values: the most likely estimate, the smallest estimate, and the biggest estimate. For example, if I want to estimate the time it might take to develop a single DI Studio job of significant complexity, perhaps I can be sure it'll take at least a couple of days to develop the job, and certainly no more than two weeks. Armed with those boundaries, I can more confidently estimate that it'll take one week.
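One common way to turn those three values into a single figure (my suggestion, not something the example above prescribes) is a PERT-style weighted average: (optimistic + 4 x most likely + pessimistic) / 6. Plugging in the boundaries above, with an assumed most-likely value of five days, lands close to the one-week estimate.

  data _null_;
    optimistic  = 2;    /* "at least a couple of days"                     */
    pessimistic = 10;   /* "certainly no more than two weeks", in working days */
    most_likely = 5;    /* assumed for this sketch                         */
    pert = (optimistic + 4*most_likely + pessimistic) / 6;
    put 'PERT-style estimate (days): ' pert 4.1;
  run;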

If you're not confident with your estimates, try supplying upper and lower bounds alongside them, so that the recipient of your estimates can better understand your degree of confidence.

Tomorrow I'll get into the meat of things and offer my recipe for estimation success.

Monday, 20 January 2014

Business Intelligence (BI) Evolution

I recently stumbled upon an interesting series of papers from IBM. They're entitled Breaking Away With Business Analytics and Optimisation. The informative and deep-thinking series talks about the need for data and good analytical processes, but it also highlights the need for a vision and a focus for our activities; IBM describes this as "breakthrough ideas". I interpret it as meaning the creation of competitive advantage, i.e. doing something better than the competition.

One particular paper in the series that caught my attention was Breaking away with business analytics and optimisation: New intelligence meets enterprise operations. Page 3 of this paper contains a neat new interpretation of the traditional BI evolution diagram. I've shown IBM's diagram above.

Traditionally the BI evolution diagram shows evolution from historic (static) reporting, through data exploration and forecasts, to predictive models and real-time systems, i.e. a gradual transition from "rear-view mirror" reporting to influencing the future. The IBM diagram contains more dimensions and focuses on the drive to achieve competitive advantage ("Breakaway"). Nice. This diagram certainly earns a place alongside the traditional form.

Tuesday, 14 January 2014

NOTE: Thoughts on Lineage

I got quite a lot of interested feedback on the BI Lineage post I made last week. My post highlighted a most informative article from Metacoda's Paul Homes.

Paul himself commented on my post and offered an additional tip. Here's what Paul said:
I agree it would be nice if BI developers could do their own scans without relying on unrestricted admins to do them ahead of time. This would be similar to how DI developers can do their own impact analysis for DI content in SAS Data Integration Studio. Ideally, as with DI, they could be done dynamically, without having to do a scan and have a BI Lineage custom repository to store them in.

In the meantime, one tip I'd suggest to make it easier for the BI developers, is that BI Lineage scans can also be scheduled. An unrestricted admin can schedule a scan, at a high level in the metadata tree, to be done every night for example.
A useful tip indeed. Thanks Paul.

Monday, 13 January 2014

2014, The Year of Personal Data

If 2013 was the year of wearable, personal devices then 2014 will be the year of personal data. In 2013 we saw a huge rise in popularity of wearable devices for measuring steps walked, distance travelled, pulse, calories consumed, and a lot more besides. These devices, and the smartphone, PC and cloud software that accompanied them, put us on the first few rungs of the business intelligence lifecycle - principally allowing us to do historic reporting.

I believe 2014 will see a great evolution of our use of personal data. Rather than the "rear view mirror" historic reporting that we've seen in 2013, we'll see software that predicts your future activity and offers advice and recommendations on how to positively influence your outcomes. It's not beyond the bounds of possibility for your smartphone to start prompting you to go for a walk at lunchtime in order to meet your weekly target for steps, or to consume no more than 400 calories at dinner in order to avoid busting your weekly calorie target. And those are just simplistic examples.

As an example of the lengths to which you can go to perform data mining on your personal data, I highly recommend the recent report in Wired of the astrophysicist who diagnosed himself with Crohn's disease. A fascinating story.

Friday, 10 January 2014

NOTE: Wrap-Up on Test Coverage and MCOVERAGE

I've spent this week describing the functionality and purpose of the MCOVERAGE system option introduced in SAS V9.3. Coverage testing is an important consideration for your testing strategy - it's important to know how much of your code has been tested.

As its name suggests, MCOVERAGE only logs macro coverage. It's a great shame that there isn't an equivalent for DATA steps. Perhaps it will be added in due course, to DATA steps or DS2, or both.

With some planning, and judicious use of some post-processing capability to make sense of the log(s), MCOVERAGE can be an important tool in your testing arsenal.
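As a reminder of the general pattern from earlier in the week, the sketch below switches the option on with a log destination, exercises a macro through its tests, switches it off, and then reads the coverage log back in. The path and macro name are placeholders, and the input statement assumes the space-delimited record layout described in the earlier posts.

  options mcoverage mcoverageloc="/tmp/mcoverage.log";    /* placeholder log location      */

  %my_macro(param=value);    /* run your test cases here (placeholder macro)               */

  options nomcoverage;

  data coverage;             /* post-process the coverage log                              */
    infile "/tmp/mcoverage.log";
    input record_type line_from line_to macro_name $;     /* assumed record layout         */
  run;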

I note that HMS Analytical Software GmbH's testing tool (SASunit) includes coverage testing through the use of MCOVERAGE. I've not used SASunit myself, and I can't speak for how complete, reliable and supported it may be, but if you're interested in learning more I suggest you read the SASUnit: General overview and recent developments paper from the 2013 PhUSE conference and take a look at SASunit's SourceForge pages.

What is your experience with using coverage testing and/or MCOVERAGE? Post a comment, I'd love to hear from you.

MCOVERAGE:

NOTE: Macros Newness in 9.4 and 9.3 (MCOVERAGE), 6-Jan-2014
NOTE: Macro Coverage in Testing (MCOVERAGE), 7-Jan-2014
NOTE: Making Sense of MCOVERAGE for Coverage Testing of Your Macros, 8-Jan-2014
NOTE: Expanding Our Use of MCOVERAGE for Coverage Analysis of our Macro Testing, 9-Jan-2014
NOTE: Wrap-Up on Test Coverage and MCOVERAGE, 10-Jan-2014 (this article!)