Data Driven

We see a lot written about data-driven decisions. For example, an article on the Harvard Business Review website begins: “Not a week goes by without us publishing something here at HBR about the value of data in business. Big data, small data, internal, external, experimental, observational — everywhere we look, information is being captured, quantified, and used to make business decisions.” And, in an article on scholastic.com, the first of ten ‘truths’ about data-driven decisions is: “If you’re not using data to make decisions, you’re flying blind.”

It would seem that everyone is busy gathering, cleaning, and crunching large amounts of data prior to making a decision. Should you follow their lead? Perhaps an episode from my past can shed some light on this question.

A True Story

A number of years ago I was working for a small consulting company when a client requested help analyzing the performance of a voice response system that was being developed for a new internet-based telephone service. The client was the director of a department in a large telecommunications company. The project, which today might be called descriptive analytics, involved writing a SAS program to analyze the call transaction data from the voice response system. I was not enthusiastic about working on this project, but I was available, and we didn’t want to disappoint an important client.

I started by reviewing the system flowcharts. The system was designed to handle both customer service and inbound sales calls. Callers were first asked if they were existing customers. If they answered no, they were asked if they had seen the website; then they were asked if they had received a mail advertisement; and finally they were asked if they had a promotion code. If they answered yes to this last question, they were asked to enter the code. If they didn’t answer within a short time, or if they entered an invalid number, they were asked to enter it again. None of these questions could be bypassed, and only after all of them had been answered would a potential customer be connected to a representative.
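To see how sequential prompts like these compound, here is a minimal back-of-envelope sketch in Python. The per-prompt hang-up rates are hypothetical illustrations, not figures from the actual system:

    # Each entry is (prompt, hypothetical probability the caller hangs up there).
    prompts = [
        ("existing customer?",      0.05),
        ("seen the website?",       0.08),
        ("received a mail ad?",     0.08),
        ("have a promotion code?",  0.08),
        ("enter the code (+retry)", 0.12),  # timeouts and invalid entries add risk
    ]

    reached = 1.0  # fraction of callers still on the line
    for name, p_hangup in prompts:
        reached *= 1.0 - p_hangup
        print(f"after '{name}': {reached:.0%} of callers remain")

    print(f"overall abandonment: {1.0 - reached:.0%}")

Even with modest per-prompt rates like these, the compounded abandonment comes out near 35%, which is roughly what our first report would later show.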

After looking at the flowcharts, I went over to talk to Jim, the person I was working for at the client site. I told him that I thought the system was badly designed: many customers would get frustrated by the difficulty of getting through to a representative, they would hang up, and sales would be lost. He replied that the project team was very keen on gathering data on their marketing efforts, and that in any case we hadn’t been asked to evaluate the system, only to analyze the data.

I didn’t argue. I wrote the SAS program, and in due course the voice response system went live. Our first report, which showed that 35% of callers were hanging up, prompted a panicked response from the project team. As a result, Jim suggested that maybe we should make some recommendations to the project team. So I put together a presentation, and several days later Jim and I met with the project team in a large conference room.

I pointed out, as gently as I could, that it was not a good idea to make it difficult for potential customers to get through to your sales representatives, and that each additional question the system asked provided another opportunity to hang up. I further pointed out that the potential lost sales could easily be 10 times the value of any cost savings generated by the system. My words got through, and after I finished there was complete agreement that changes should be made. It was decided that we would meet again to put together a plan to revise the system.
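For readers who want to see where a multiple like 10x can come from, here is a minimal worked example. Every number in it is hypothetical; the actual call volumes and dollar values were never part of this story:

    # Hypothetical figures chosen only to illustrate the shape of the argument.
    calls_per_month  = 10_000
    abandonment_rate = 0.35    # the rate our first report showed
    conversion_rate  = 0.20    # assumed: abandoned callers would have converted at this rate
    revenue_per_sale = 500     # assumed revenue per sale, in dollars

    lost_revenue = calls_per_month * abandonment_rate * conversion_rate * revenue_per_sale

    saving_per_call = 2        # assumed screening savings per call, in dollars
    savings = calls_per_month * saving_per_call

    print(f"lost revenue per month: ${lost_revenue:,.0f}")  # $350,000
    print(f"cost savings per month: ${savings:,.0f}")       # $20,000
    print(f"ratio: {lost_revenue / savings:.1f}x")          # 17.5x

Under assumptions like these the lost revenue dwarfs the savings; the exact multiple matters far less than its order of magnitude.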

I was feeling a lot better about the project; it was getting a lot more interesting, and I might actually make a difference.

However, before we could meet again, word came down that, because of the poor financial performance of the new service, senior executives had eliminated the systems budget; there would be no changes to the system; the service would be allowed to die.

3 thoughts on “Data Driven”

  1. John Clifford

    This was a great case study, but I had a slightly different moral/conclusion. In my mind, perhaps a better title for your posting would be “Wrong data driven” or “Right data driven.”

    For me, data-driven is always a good idea, BUT only if you are looking at the right data!

    Here’s why:
    I think that you DID do data-driven analysis at the beginning of the project: knowing (and probably backed up by research somewhere) that more phone-tree prompts equals more abandoned calls, you counted the number of prompts (5) and correctly described the new system as having too many prompts, which would cause too many abandoned calls. As you predicted, going live with the system was a disaster. Had you been able to make the financial argument based on even a rough estimate of the number of abandoned calls, they would (should?) not have gone live.

    Your story reminds me of Doug Hubbard’s book How to Measure Anything, where he talks about (I’m probably getting the exact wording wrong) the value of information: figuring out what you can learn and the cost of learning it. In your case study, a literature review of how the number of phone-tree prompts correlates with the number of abandoned calls (even better, if there are results on phone trees for new customers) would have allowed you to estimate that 5 prompts could cause some range of abandoned calls. I’m guessing that even the low end would have been 10%. Coupling that with your calculations on the potential value of each abandoned call, I’m sure the system’s savings would still have been dwarfed by the potential lost revenue.

    I would modify the scholastic.com truth that you quote to be “If you’re not using the RIGHT data to make decisions, then you’re flying blind.”

    I would also modify the explanation of “truth” #7, “Don’t shoot first and ask questions later,” where the article says: “You should figure out exactly what questions you want the data to answer before you tackle …” I would delete the rest of the sentence about turnkey vs. custom solutions, change “the data” to just “data” (the truth seems to assume you have questions and you have data, and the only issue is how to connect them), and end with “anything.” Stating the obvious (but sometimes forgotten) point: you need to figure out the questions to answer, and then figure out what data can help you answer them. Both of these should happen WAY before you start making decisions about the solution.

    Thanks for this post, which is a great reminder not to get focused on the data and the data analysis before you focus on the problem and on how data can help.

  2. Ignacy Kaliszewski

    “Don’t put the cart before the horse.”
    In his post of February 11, Robert Rose comments on the recent fever about data-driven decision making. It seems to me that the problem is even more general than the one conveyed by the manifesto stemming from Robert’s company name, “Human Centered Operations Research” (for me, the two go together).
    A human being, whose free will is not constrained, has been, is, and will remain sovereign in his/her decision making. There have been many prophets and many ideas that have proclaimed the end of this paradigm, to no effect as yet. Perceptrons, neural networks, artificial (recently: computational) intelligence, big data, robots, and many more have been, and still are, marketed as means to easily replace the traditional reasoning of flesh-and-blood beings.
    With experience in academia and in business (in that order), I have my own idea of how to sell analytic tools to managers (to be bought by free will, not by any form of mental enforcement): come out with a tool (a table, a spreadsheet, an application) so attractive and so easy that a manager finds it wise and interesting to play with in his/her idle time. To this end, follow the “simple is beautiful” principle: make it easy. Observe that managers are ALWAYS busy. Success comes only when the manager learns something from the tool and eventually absorbs the idea as his/her own. It is a truly human-centric approach, and it does not work otherwise.
    Ignacy Kaliszewski
    Systems Research Institute, Polish Academy of Sciences, Warsaw.

  3. Huw Evans

    I like John Seddon’s systems thinking approach: listen to the client, user, customer … then design something effective, then seek to make it more efficient, and then, finally, make it sustainable. The case study is one of ineffective management, dysfunctional even … Just because a van delivering bread has an accident doesn’t mean bread is dangerous.
