Many problems are easier to solve when you have data. However, there is a difference between having data and using data.
Several years ago, I worked with an organization that was experiencing system outages. After months of outages and no effective action, they appointed an Operations Analyst to collect data and get to the bottom of the problem.
Having Data Isn’t Enough
Once they had data, the managers met monthly to review it. At the beginning of the meeting, the Operations Analyst presented a pie chart showing the “outage minutes” (number of minutes a system was unavailable) from the previous month. It was clear from the chart which system was the biggest source of outages for the month.
The manager for that system spent the next 40 minutes squirming as the top manager grilled him. At the end of the meeting, the top manager sternly demanded, “Fix it!”
When I arrived, they had many months of data. But, whether they’d made improvements remained a mystery.
I looked at trends in the total number of outage minutes each month. I plotted trends for each application and created time series to see if there were any temporal patterns. That’s as far as I could get with the existing data. I needed to know not just the number of minutes a system was down, but how many employees and customers couldn’t work when a particular system was down.
One system had a lot of outage minutes, but only a handful of specialists who supported an uncommon legacy product used it. Another system didn’t fail often, but when it did, eight hundred employees were unable to access holdings for any customers.
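To make the comparison concrete, here’s a minimal sketch in Python, with hypothetical numbers, of the weighting involved: multiplying each system’s outage minutes by the number of people blocked when it is down turns raw downtime into person-minutes of lost work.

```python
# Minimal sketch with hypothetical numbers: weight raw outage minutes
# by how many people each outage actually blocks.
outages = [
    # (system, outage minutes per month, people blocked when it is down)
    ("legacy specialist tool", 900, 5),     # fails often, few people affected
    ("customer holdings system", 60, 800),  # fails rarely, blocks everyone
]

for system, minutes, people in outages:
    impact = minutes * people  # person-minutes of lost work
    print(f"{system}: {minutes} outage minutes -> {impact:,} person-minutes lost")
```

On these made-up numbers, the “noisy” system costs 4,500 person-minutes while the “quiet” one costs 48,000: exactly the kind of reversal that raw outage minutes hide.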
They had lots of data. But, they weren’t using data in a way that helped them solve problems. They weren’t looking at trends in total outage minutes. The pie charts they loved so much showed the proportion of the whole, not whether the total number was increasing or decreasing over time. Because they didn’t understand the impact, they wasted time chasing insignificant problems.
Presentation Matters
I presented the data in a different way, which led to a different set of questions and more data gathering. That data eventually helped this group of managers focus their problem-solving (and stop pointing the roving finger of blame).
Without data, all you have to go on is your intuition and experience. If you’re lucky, you may come up with a fix that works. But most good problem solvers don’t rely on luck. In some cases, you may have a good hunch about what the problem is. Back up your hunches with data. In either case, I’m not talking about a big measurement program. Aim for “good enough” and “just enough” data. Start by looking for existing useful data, as I did for the call center I helped.
Quantitative and Qualitative Data
But what kind of data do you need? Not all problems involve easy-to-count factors such as outage minutes, stories completed, or defects reported after release.
If you are looking at perceptions and interactions, you’ll probably use qualitative data. Qualitative data focuses on experiences and qualities that we can observe but cannot easily measure. Nothing wrong with that. It’s what we have to go on when the team is discussing teamwork, relationships, and perceptions. Of course, there are ways to measure some qualitative factors. Subjective reports are often sufficient (and less costly). Often, you can gather this sort of data quickly in a group meeting.
If you are using quantitative data, it’s often best to prepare data relevant to the focus prior to the problem-solving meeting. Otherwise, you’ll have to rely on people’s memory and opinion, or spend precious time looking up the information you need to understand the issue.
To Choose Which Data, Start with These Questions
When I’m thinking about what data would be useful to understand a problem, I start with a general set of questions:
What are the visible symptoms?
What other effects can we observe?
Who cares about this issue?
How does the issue impact a particular person/group?
What is the impact on our organization?
These questions may lead closer to the real problem, or at least confirm the direction. Based on what I find, I may choose where to delve deeper and get more specific:
When does the problem occur?
How frequently does it occur?
Is the occurrence regular or irregular?
Which factors might contribute to the problem situation?
What other events might influence the context?
Does it always happen, or is it an exception?
Under what circumstances does the problem occur?
What are the circumstances under which it doesn’t occur?
Presentation makes a big difference. It may mean the difference between effective action and inaction, as was the case with the call center.
Data Deepens Understanding of Issues
In a retrospective (a special sort of problem-solving meeting), data can make the difference between superficial, ungrounded quick fixes and the deeper understanding that leads to more effective action, whether your data is qualitative or quantitative.
Here are some examples of how I’ve gathered data for retrospectives and other problem-solving meetings.
| Data Type | Method | Examples | Notes |
|---|---|---|---|
| Qualitative | Spider or Radar Chart | Use of XP practices. Satisfaction with various factors. Adherence to team working agreements. Level of various factors (e.g., training, independence). | Shows both clusters and spreads. Highlights areas of agreement and disagreement. Points towards areas for improvement. |
| Qualitative | Leaf Charts | Satisfaction. Motivation. Safety. Severity of issues. Anything for which there is a rating scale. | Use a pre-defined rating scale to show frequency distribution in the group. Similar to bar charts, but typically used for qualitative data. |
| Qualitative | Sail Boat (Jean Tabaka) | Favorable factors (wind), risks (rocks), unfavorable factors (anchors). | Metaphors such as this can prompt people to get past habitual thinking. |
| Qualitative | Timelines | Project, release, or iteration events over time. Events may be categorized using various schemes, for example positive/negative, technical/non-technical, or levels within the organization (team, product, division, industry). | Shows patterns of events that repeat over time. Reveals pivotal events (with positive or negative effects). Useful for prompting memories and showing that people experience the same event differently. |
| Qualitative | Tables | Team skills profile (who has which skills, where there are gaps). | Shows relationships between two sets of information. Shows patterns. |
| Qualitative | Trends | Satisfaction. Motivation. Safety. Severity of issues. Anything for which there is a rating scale. | Changes over time. |
| Quantitative | Pie Charts | Defects by type, module, or source. Severity of issues. | Shows frequency distribution. |
| Quantitative | Bar Charts | Bugs found in testing by module alongside bugs found by customers by module. | Frequency distribution, especially when there is more than one group of things to compare. Similar to histograms, but typically used for quantitative data. |
| Quantitative | Histograms | Distribution of the length of outages. | Frequency of continuous data (not categories). |
| Quantitative | Trends | Defects. Outages. Stories completed. Stories accepted/rejected. | Shows movement over time. Trends are often more significant than absolute numbers in spotting problems, and may point you to areas for further investigation, which may become a retrospective action. |
| Quantitative | Scatter Plots | Size of project and amount over budget. | Shows the relationship between two variables. |
| Quantitative | Time Series | Outage minutes over a period of time. Throughput. | Shows patterns and trends over time. Use when the temporal order of the data might be important, e.g., to see the effects of events. |
| Quantitative | Frequency Tables | Defects. Stories accepted on the first, second, or third demo. | A frequency table may be a preliminary step for other charts, or stand on its own. |
| Quantitative | Data Tables | Impact of not-ready stories. | Shows the same data for a number of instances. |
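As a rough illustration of two of the quantitative views above, here is a short sketch (Python with matplotlib; the numbers are made up) that plots a monthly trend of total outage minutes, which is what the pie charts could not show, next to a histogram of outage lengths.

```python
# Sketch of two quantitative views from the table, using made-up data.
# Requires matplotlib; all numbers are hypothetical.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
total_outage_minutes = [320, 280, 350, 300, 410, 390]   # trend over time
outage_lengths = [5, 12, 7, 45, 3, 60, 8, 15, 90, 10]   # minutes per outage

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Trend: is the total rising or falling?
ax1.plot(months, total_outage_minutes, marker="o")
ax1.set_title("Total outage minutes per month")
ax1.set_ylabel("minutes")

# Histogram: frequency of continuous data (not categories).
ax2.hist(outage_lengths, bins=5)
ax2.set_title("Distribution of outage lengths")
ax2.set_xlabel("minutes per outage")

plt.tight_layout()
plt.show()
```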
Spot On!
It seems to me that the hardest question, and one that seldom gets asked, is “How do we know?” — that it’s even a problem (and for whom)? That we will have fixed it after some remedy? And the follow-up: how can we find out?
Of course, asking such basic questions only happens when you come from an attitude of “I don’t know” — dangerous words in some environments.
Great idea to use data in the retrospective to substantiate the problem and make the discussion more focused.
Thank you, Esther, for the great article.