We see a lot written about data-driven decisions. For example, an article on the Harvard Business Review website begins: “Not a week goes by without us publishing something here at HBR about the value of data in business. Big data, small data, internal, external, experimental, observational — everywhere we look, information is being captured, quantified, and used to make business decisions.” And, in an article on scholastic.com, the first of ten ‘truths’ about data-driven decisions is: “If you’re not using data to make decisions, you’re flying blind.”
It would seem that everyone is busy gathering, cleaning, and crunching large amounts of data prior to making a decision. Should you follow their lead? Perhaps an episode from my past can shed some light on this question.
A True Story
A number of years ago I was working for a small consulting company when a client requested help analyzing the performance of a voice response system being developed for a new internet-based telephone service. The client was the director of a department in a large telecommunications company. The project, which today might be called descriptive analytics, involved writing a SAS program to analyze the call transaction data from the voice response system. I was not enthusiastic about working on this project, but I was available, and we didn’t want to disappoint an important client.
I started by reviewing the system flowcharts. The system was designed to handle both customer service and inbound sales calls. Callers were first asked if they were existing customers. If they answered no, they were asked if they had seen the website; then they were asked if they had received a mail advertisement; and finally they were asked if they had a promotion code. If they answered yes to this last question, they were asked to enter the code. If they didn’t answer within a short time, or entered an invalid code, they were asked to enter it again. None of these questions could be bypassed, and only after all of them had been completed would a potential customer be connected to a representative.
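The gated flow above can be sketched as a small simulation. To be clear, the function and prompt names below are my own illustration, not anything from the actual system; the sketch simply makes the design problem visible: every new caller hears the full sequence of questions, plus any retries, before reaching a representative, and each prompt is one more chance to hang up.

```python
def run_call_flow(answers, promo_entry=None, max_retries=1):
    """Simulate the gated question sequence described in the flowcharts.

    `answers` maps each question to True/False; `promo_entry` is the code
    the caller types (None simulates a timeout). Returns the ordered list
    of prompts played before the caller is connected. All names here are
    hypothetical stand-ins for the real system's prompts.
    """
    prompts = ["existing_customer"]
    if answers.get("existing_customer"):
        # Existing customers go straight to customer service.
        return prompts + ["connect_to_service"]

    # New callers cannot bypass any of these marketing questions.
    for question in ("seen_website", "received_mailer", "has_promo_code"):
        prompts.append(question)

    if answers.get("has_promo_code"):
        prompts.append("enter_code")
        retries = 0
        # A timeout or non-numeric entry triggers a repeat of the prompt.
        while (promo_entry is None or not promo_entry.isdigit()) \
                and retries < max_retries:
            prompts.append("enter_code_again")
            retries += 1

    # Only after the full sequence does the caller reach a representative.
    prompts.append("connect_to_sales")
    return prompts
```

Running this for a new caller with no promotion code yields four questions before the connection; a caller who fumbles the code entry hears even more. Each item in that list is a point where a frustrated caller can abandon the call.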
After looking at the flowcharts, I went over to talk to Jim, my contact at the client site. I told him that I thought the system was badly designed: many potential customers would get frustrated by the difficulty of reaching a representative, they would hang up, and sales would be lost. He replied that the project team was very keen on gathering data on their marketing efforts, and that in any case, we hadn’t been asked to evaluate the system, only to analyze the data.
I didn’t argue. I wrote the SAS program, and in due course the voice response system went live. Our first report, which showed that 35% of callers were hanging up, prompted a panicked response from the project team. As a result, Jim suggested that maybe we should make some recommendations to the project team. So I put together a presentation, and several days later Jim and I met with the project team in a large conference room.
I pointed out, as gently as I could, that it was not a good idea to make it difficult for potential customers to get through to your sales representatives, and that each new question the system asked provided another opportunity to hang up. I further pointed out that the potential lost sales could easily be worth 10 times any cost savings generated by the system. My words got through, and after I finished there was complete agreement that changes should be made. It was decided that we would meet again to put together a plan to revise the system.
I was feeling a lot better about the project; it was getting a lot more interesting, and I might actually make a difference.
However, before we could meet again, word came down that, because of the poor financial performance of the new service, senior executives had eliminated the systems budget: there would be no changes to the system, and the service would be allowed to die.