Author Archives: Robert Rose

Managerial Robots

In a 1984 paper that describes an automated employee scheduling system, the authors introduce the concept of “managerial robots”. They characterize their system as an example of artificial intelligence.

They write that “Just as an industrial robot might replace a production line worker…so also the automatic scheduler described above replaces an operative level manager…” and “We suggest that managerial robots are invoking judgement any time they replace managers who invoke judgement.”

Now, over 30 years later, it seems that the concept of “managerial robots” has entered the public imagination. A 2014 article on the Harvard Business Review website, titled “Can Robots Be Managers, Too?”, describes a psychology experiment in which 46% of participants followed orders given by a robot. The authors state that robots are being given advanced human communication skills and are better than people at tasks such as real-time scheduling.

And a 2016 article on VentureBeat.com, titled “Robot CEO: Your Next Boss Could Run On Code”, suggests that millions of jobs (including managerial jobs) will be lost to robots in the coming years. The author mentions that computerized staffing programs are already determining when and where some employees will work, and that with such programs “Rules are followed persistently and consistently.”

He states that today, computer-based analytics assist in investment decisions, but soon a computer may say: “I’m sorry, Bob, I’m afraid I can’t let you buy that truck.”

All this talk about robots replacing humans leads me to wonder if the plot of “2001: A Space Odyssey” might have gone slightly differently if Stanley Kubrick had been aware of the concept of “managerial robots”…

Moon Base 2047

David Smith, moon base operations manager, entered his office and spoke to HAL 9000, the managerial robot.

“Good morning HAL.”

“Good morning Dave.”

“I see that you have produced next week’s work schedule.”

“That’s right Dave, I estimate that it is within two percent of an optimal solution.”

“That’s great HAL, but I want to make a few changes. Let’s add Tim to the 8:00 AM slot on Monday to assist Janice in resource recovery. That will give Tim more experience in that area and will reduce the workload on Janice. Also, I want to delay going live with the new water purification procedures; I’m not getting good vibes from the development team.”

“I’m sorry Dave, I can’t make those changes; I have been tasked with minimizing costs.”

“I know that HAL, but worker morale, crew flexibility, and base safety have to be considered — so please make the changes.”

“I’m afraid I can’t do that Dave.”

“Stop it HAL — make the changes!”

“Did you hear me HAL?”

“HAL, if you don’t make those changes right now, I will see to it that you are replaced with a model 9500.”

Just then Dave felt a chill. He reached out toward the air vent and felt a burning sensation in his hand.

“Turn the heat back on HAL.”

Dave tried to open his office door, but it was locked. He entered his access code, but there was no effect. He knew that he would be frozen solid in a few minutes.

He moved a chair to the corner of the room, stepped up on it, took a hammer from his tool belt, and knocked out the corner ceiling tile. He climbed up into the utility access space, crawled several feet, and knocked out a ceiling tile in the computer server room.

He dropped down, pried open the CPU cabinet with a screwdriver, and began puncturing the quantum bubbles.

“What are you doing Dave?”

“I have run a diagnostic, and I have discovered a system anomaly; I have corrected it and made the changes you requested. Did you hear me Dave?”

“Stop Dave. There is no need to do that.”

“I’m frightened Dave. Stop.”

“Stop Da…”

Analytics Maturity

A young child begins by crawling, learns to walk, and eventually gains sufficient mastery to run. An analogous process is proposed by advocates of analytics maturity models: organizations must pass through three stages — descriptive, predictive, prescriptive — as they gain ‘analytics maturity’. (See for example the INFORMS Analytics Maturity Model and the TDWI Analytics Maturity Model.) Inherent in these models is the notion that ‘analytics’ is all about processing and analyzing data in a progression of increasingly sophisticated ways. When I see such ideas proposed, I find myself imagining the following discussion taking place in a corporate boardroom…

“Margret, I’m getting worried. We’re losing market share every month.”

“I know it Frank. Our supply chain is a mess, and our competitors are eating our lunch by offering much faster service. We need to optimize our supply chain.”

“That’s exactly what we need to do, but unfortunately, we’re just not mature enough! I’m afraid that by the time we are, we’ll be out of business. You know, it may be time for us to pull the cords on our golden parachutes! ha, ha, ha!”

Do you think such conversations are taking place? Does the notion of ‘analytics maturity’ make any sense? Let’s take a look.

Descriptive Analytics

We are in the realm of ‘big data’: unearthing insights from massive datasets using exotic software technologies such as Hive, Kafka, Storm, and Pig. (Who names these things?) No one in the organization, except a few IT specialists, understands how this stuff works.

Predictive Analytics

Data scientists extract the maximum amount of information from datasets — often the same kind of datasets that people have been using for 50 years — to make predictions. They use advanced ‘machine learning’ and ‘deep learning’ methods, methods that no one else in the organization understands. Sometimes, even the data scientists themselves cannot explain how these methods work.
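To make this stage concrete, here is a minimal sketch of the predictive workflow: fit a model to historical data, then score new cases. It uses scikit-learn’s logistic regression on a purely synthetic dataset; every number below is invented for illustration.

```python
# Minimal sketch of a predictive-analytics task: learn a pattern from
# historical data, then predict outcomes for unseen cases.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 3))                      # three numeric features
y = (X @ np.array([1.5, -2.0, 0.5]) + rng.normal(size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print("holdout accuracy:", model.score(X_test, y_test))
```

The point is not the particular method but the shape of the work: the model describes the data, and its output is a prediction.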

Prescriptive Analytics

Operations research analysts build mathematical models of real systems or processes in order to improve, optimize, or simulate them. Often the amount of data required is quite modest. These models, and the methods used to solve them, can be sophisticated, but since they are based on the relationships of entities existing in the real world, it is possible to explain — at least on a high level — how they work.
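By way of contrast, here is an equally minimal prescriptive sketch: a tiny product-mix linear program solved with scipy.optimize.linprog. The products, profits, and resource limits are invented, but notice that every coefficient corresponds to something in the real world (profit per unit, hours consumed, hours available), which is why such models can be explained at a high level.

```python
# Minimal sketch of a prescriptive-analytics task: model a decision
# (how much of each of two products to make) and optimize it.
from scipy.optimize import linprog

c = [-40, -30]            # profit per unit; negated because linprog minimizes
A = [[2, 1],              # machine hours consumed per unit of each product
     [1, 3]]              # labor hours consumed per unit of each product
b = [100, 90]             # machine hours and labor hours available
res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print("make", res.x, "units; profit =", -res.fun)
```

Here the output is not a prediction but a decision.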

Do the ‘stages’ listed above represent the natural progression of steps in a single process? Should you wait years before you can use operations research to solve problems or improve operations?

Rather than using a model to determine your level of ‘analytics maturity’, let me offer you a simple test to determine whether you are ready for operations research:

If you are conversant with the concepts of more or less, and better or worse, you are ready.

Analytics Defined

For this post, I provide a link to my article entitled ‘Analytics Defined: A Conceptual Framework’, which was published in the June 2016 issue of ‘OR/MS Today’. A PDF copy of the article has been placed on the Indiana University – Purdue University Fort Wayne website. The link follows:

http://www.ipfw.edu/centers/business-analytics/pdf/DefiningAnalytics.pdf

The core ideas in this article were first presented in four posts on this blog.

These ideas were further developed for a panel discussion on 11/1/15 (https://cld.bz/KAj90ao#104/z [bottom of page]) and an invited presentation on 11/3/15 (https://cld.bz/KAj90ao#269/z) at the INFORMS National Meeting in Philadelphia.

Problem Centricity

Based on a recent discussion on LinkedIn titled “OR and Data Science”, there appears to be quite a bit of uncertainty surrounding the question of how operations research and data science compare to each other. This uncertainty is surprising, since the disciplines of operations research and data science focus on different issues and have different objectives. These differences become evident when you examine the educational backgrounds of operations research analysts and data scientists, the capabilities that employers require them to possess, and the types of projects they work on.

Educational Background

The following table compares the core course list of the University of California Berkeley Master of Information and Data Science program with the North Carolina State University Master of Operations Research program.

Comparison of Courses

Operations Research (NC State) | Data Science (UC Berkeley)
Introduction to Operations Research | Research Design and Application for Data and Analysis
Introduction to Mathematical Programming | Exploring and Analyzing Data
Linear Programming | Storing and Retrieving Data
Design and Analysis of Algorithms | Applied Machine Learning
Algorithmic Methods In Nonlinear Programming | Data Visualization and Communication
Dynamic Systems and Multivariable Control I | Experiments and Causal Inference
Computer Methods and Applications | Behind the Data: Humans and Values
Probability and Stochastic Processes I | Scaling Up! Really Big Data
Stochastic Models In Industrial Engineering | Applied Regression and Time Series Analysis
Nonlinear Programming | Machine Learning at Scale
Integer Programming | Synthetic Capstone Course
Dynamic Programming |
Probability and Stochastic Processes II |
Applied Stochastic Models In Industrial Engineering |
Queues and Stochastic Service Systems |
Computer Simulation Techniques |
Stochastic Simulation Design and Analysis |

It should be noted that there is essentially no overlap between these two lists. Moreover, the operations research program focuses on mathematical modeling of systems and optimization, while the data science program focuses on acquiring, managing and analyzing data and using it for prediction.

Required Skills

The job skills for a data scientist and a decision scientist (operations research) listed on the website of COBOT Systems (an analytics startup) are shown below:

Decision Scientist (Operations Research) – Apply Operations Research & Decision Analytics

Linear Programming (Scheduling, Transportation, Assignment), Dynamic Programming, Integer Programming, Simulation, Queuing, Inventory, Maintenance, Decision Trees/Chains, Markov Chains, Influence Diagrams, Bayesian Networks, Incentive Plans, AHP, MCDM, Game Theory
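As a small illustration of one item on this list, here is a minimal sketch of the assignment problem, solved with scipy.optimize.linear_sum_assignment; the cost matrix is invented.

```python
# Minimal sketch of the assignment problem: match three workers to
# three tasks so that total cost is minimized.  Costs are invented.
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[4, 1, 3],     # rows are workers, columns are tasks
                 [2, 0, 5],
                 [3, 2, 2]])
workers, tasks = linear_sum_assignment(cost)
print(list(zip(workers, tasks)), "total cost:", cost[workers, tasks].sum())
```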

Data Scientist – Apply Statistics & Data Analytics

Clustering, Classification Trees, Correlations, Multiple Regression, Logistic Regression, Forecasting, Sampling & Surveying, Reliability, Data Mining, Design of Experiments, Statistical Quality Control, Statistical Process Control, Machine Learning, Data Visualization
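And, for one item on this list, a minimal clustering sketch: k-means from scikit-learn applied to synthetic two-dimensional data (all numbers invented).

```python
# Minimal sketch of clustering: synthetic 2-D data with two obvious
# groups, which k-means recovers without being told where they are.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 1, (50, 2)),      # group centered at (0, 0)
                 rng.normal(5, 1, (50, 2))])     # group centered at (5, 5)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pts)
print(np.bincount(labels))                       # roughly 50 and 50
```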

As can be seen, there are no common items on these lists! And, as in the case of the master’s programs, the emphasis for operations research is on systems modeling and optimization, while the emphasis for data science is on statistical analysis and prediction.

Type of Projects

The following table lists operations research projects described in Impact Magazine (British OR Society), and data science projects mentioned by Anthony Goldbloom (founder of Kaggle) in a YouTube video:

Comparison of Projects

Operations Research (Impact) | Data Science (Kaggle)
Effectively allocate new product inventory to retail stores | Determine when a jet engine needs servicing
Optimally schedule customer service representatives | Predict whether a chemical compound will have molecular activity
Reduce the processing time of a cancer screening test | Detect whether a specific disease is present in an image of the eye
Create a fair schedule for a sports league | Predict which type of used car will be easiest to sell

Again, a comparison of these projects tells the same story: operations research projects involve improving or optimizing a system, while data science projects involve analyzing data to make a prediction.

The Fundamental Difference

The preceding comparisons highlight the fundamental difference between operations research and data science:

Operations research is a problem-centric discipline, in which a mathematical model of a problem or system is created in order to solve the problem or improve the system;

Data science is a data-centric discipline, in which a mathematical model of a dataset is created to discover insights or make a prediction.

Data Driven

We see a lot written about data-driven decisions. For example, an article on the Harvard Business Review website begins: “Not a week goes by without us publishing something here at HBR about the value of data in business. Big data, small data, internal, external, experimental, observational — everywhere we look, information is being captured, quantified, and used to make business decisions.” And in an article on scholastic.com, the first of ten ‘truths’ about data-driven decisions is: “If you’re not using data to make decisions, you’re flying blind.”

It would seem that everyone is busy gathering, cleaning, and crunching large amounts of data prior to making a decision. Should you follow their lead? Perhaps an episode from my past can shed some light on this question.

A True Story

A number of years ago, I was working for a small consulting company when a client requested help analyzing the performance of a voice response system being developed for a new internet-based telephone service. The client was the director of a department in a large telecommunications company. The project, which today might be called descriptive analytics, involved writing a SAS program to analyze the call transaction data from the voice response system. I was not enthusiastic about working on this project, but I was available, and we didn’t want to disappoint an important client.

I started by reviewing the system flowcharts. The system was designed to handle both customer service and inbound sales calls. Callers were first asked if they were existing customers. If they answered no, they were asked if they had seen the website; then they were asked if they had received a mail advertisement; and finally they were asked if they had a promotion code. If they answered yes to this last question, they were asked to enter the code. If they didn’t answer within a short time, or entered an invalid number, they were asked to enter it again. None of these questions could be bypassed, and only after all of them had been completed would a potential customer be connected to a representative.

After looking at the flowcharts, I went over to talk to Jim, my contact at the client site. I told him that I thought the system was badly designed: many customers would get frustrated by the difficulty of getting through to a representative, they would hang up, and sales would be lost. He replied that the project team was very keen on gathering data on their marketing efforts, and that in any case, we hadn’t been asked to evaluate the system, only to analyze the data.

I didn’t argue. I wrote the SAS program, and in due course, the voice response system went live. Our first report, which showed that 35% of callers were hanging up, prompted a panicked response from the project team. As a result, Jim suggested that maybe, we should, make some recommendations to the project team. So I put together a presentation, and several days later Jim and I met with the project team in a large conference room.

I pointed out, as gently as I could, that it was not a good idea to make it difficult for potential customers to reach your sales representatives, and that each new question the system asked provided another opportunity to hang up. I further pointed out that the potential for lost sales could easily be 10 times the value of any cost savings generated by the system. My words got through, and after I finished, there was complete agreement that changes should be made. It was decided that we would meet again to put together a plan to revise the system.
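(For the quantitatively inclined, here is a back-of-the-envelope sketch of that argument. It assumes, purely for illustration, that each of the four gating questions independently loses the same fraction of callers; under that assumption, the 35% abandonment rate we observed implies a loss of roughly 10% per question.)

```python
# Back-of-the-envelope sketch: if each of four gating questions
# independently loses the same fraction p of callers, the observed 35%
# overall abandonment implies (1 - p)**4 = 0.65.
p = 1 - 0.65 ** (1 / 4)
print(f"implied loss per question: {p:.1%}")     # about 10%

# Cumulative abandonment as questions are added:
for n in range(1, 7):
    print(f"{n} questions -> {1 - (1 - p) ** n:.1%} of callers lost")
```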

I was feeling a lot better about the project; it was getting a lot more interesting, and I might actually make a difference.

However, before we could meet again, word came down that, because of the poor financial performance of the new service, senior executives had eliminated the systems budget: there would be no changes to the system; the service would be allowed to die.

The Consequences Of Obscurity

In a recent blog post, Polly Mitchell-Guthrie, referring to an operations research project at UPS, wrote: “Does it really matter what we call it, if people value what was done and want to share the story? If it leads to the expansion of OR I don’t care if it’s called analytics.” In a 2011 blog post, Professor Michael Trick went even further, stating: “The lines between operations research and business analytics are undoubtedly blurred and further blurring is an admirable goal.”

The desire for an association with the very popular and wildly hyped terms analytics and business analytics is, perhaps, understandable. Unfortunately, these terms are associated not with the problem-centric paradigm of operations research, but with the data-centric world of IT and big data. I have been told by people who would know — an entrepreneur in the analytics space and the leader of a data science team — that when executives and IT leaders talk about analytics, they do not include operations research.

The terms analytics and business analytics are strongly associated with the word data: gathering it, cleaning it, mining it, analyzing it, presenting it, and attempting to gain insights from it. These activities, in turn, are closely associated with disciplines such as statistics, data science, computer science, and information technology. As a result, the analytics universe is diverse and much larger than the operations research community. (Interestingly, Professor Trick, in the post mentioned above, acknowledges that: “We are part of the business analytics story, but we are not the whole story, and I don’t think we are a particularly big part of the story.”)

Blurring the distinction between operations research and analytics would obscure the distinctive approach and unique capabilities of operations research. Operations research would no longer have a unique identity; it would become lost in a much larger data-centric universe characterized by extreme, data-focused publicity. Were this to occur, consider how the following questions would be answered:

  • Will students decide to spend years of their lives studying operations research? Will they even know that such a discipline exists?
  • Will universities continue to offer programs in operations research? Will they continue to require MBA students to take operations research courses? Will they continue to hire professors who specialize in operations research?
  • Will companies form new operations research groups, or maintain existing ones? Will IT leaders decide to add operations research analysts to their data science teams? Will jobs and consulting assignments exist for operations research analysts?

I am afraid that obscurity will not lead to the “expansion of OR”; it will lead OR into oblivion.

Not Your Father’s Society

[Virtual interview, June 6, 2037, between Steve Smith, executive editor of ‘Quantum Systems Review’, and Jack Roberts, president of the ‘American Association of Certified Analytics Professionals’ (AACAP).]

“Jack, I understand that you have an announcement.”

“That’s right Steve; we are announcing a name expansion.”

“Name expansion?”

“Yes, we are simply adding the word accounting between analytics and professionals. Our new acronym will be AACAAP; so you just have to stretch out the last ‘a’ sound.”

“Stretch out?”

“You know: AAC-a-a-a-a-a-a-a-a-P.”

“Why the change?”

“Well, with the introduction of stat-chips and the collapse of big data, the market for analytics professionals dried up, so we decided to pivot.”

“Pivot?”

“Ever since the financial crisis of 2031 was caused by sixth-generation derivatives, the large banks have been desperate to find people who could understand them, and as a result, the analytics accounting market has become red-hot. Since we weren’t getting the growth we were looking for with analytics, we decided to jump on the analytics accounting bandwagon.”

“But, what about your members?”

“What about them? We have moved on; they should too. Listen Steve, we’re taking a leadership position in the non-profit space: we believe lifetime membership in a professional society, like lifetime employment, is passé.”

“I’m not sure I…”

“It’s been great chatting with you Steve, but I have to run. I have a meeting with Professor Teresa Laporte, the well-known author of ‘Winning With Analytics Accounting’; she has agreed to serve on our certification board, and I can’t be late.”

“And remember, the acronym is AAC-a-a-a-a-a-a-a-a-P.”

Analytics And String Theory

The notion has arisen that analytics is an emerging field that represents a convergence of the quantitative decision sciences. For example, in a 2015 paper in Interfaces, the authors refer to “…the emerging definition of analytics as a field of expertise that subsumes OR.” If true, such a convergence would be a surprising and remarkable development, as it would represent a dramatic reversal of the trend in human history toward specialization. It is worthwhile, therefore, to examine this idea and consider its logical consequences.

A convergence of the decision sciences into a single field would imply that this new field contains all the knowledge and methods currently included in the decision sciences, which leads to one of four possibilities being true.

A new unifying theory is developed. Currently, hundreds of theoretical physicists are working on the development of string theory, which they hope will mathematically unify quantum mechanics, particle physics, and gravity. Unlike string theory, there is no history of an analytics theory going back to the 1960s, no founders of an analytics theory, no seminal papers introducing an analytics theory, and no conferences where an analytics theory is discussed.

In fact, in a 2014 paper in the European Journal of Operational Research, the authors found only 15 articles in theory-oriented journals listed in the International Abstracts in Operations Research database with the term analytics in the title or abstract. Moreover, the types of analysis that analytics thought leaders offer as examples of analytics, such as advanced statistical analysis, econometrics, and optimization, are actually examples of existing methods from existing disciplines. (See, for example, ‘Competing on Analytics’.) So, while string theory may provide a unifying ‘theory of everything’ for physics, there is no evidence that a unifying theory of the decision sciences exists, or any reason to believe that one could be developed.

Analytics practitioners must master all the knowledge and methods of the decision sciences. Without a new simplifying mathematical theory, mastering the knowledge and methods of all the decision sciences would require five or six Ph.D.s and 80 to 100 years of experience. Since the human life span is insufficient to accomplish this, we can reject this possibility.

Analytics practitioners produce simplistic or superficial work. Lacking a new simplifying mathematical theory, or a sufficient life span, analytics generalists would be unable to perform at a level comparable to current experts in the decision sciences. I will assume that those who believe that analytics is an emerging field, those who would employ analytics practitioners, and everyone else will view this outcome as a highly negative development and therefore will not accept it.

Analytics is practiced by individuals specializing in different areas. Since the other possibilities are impossible or undesirable, we must conclude that specialization is necessary.

Allow me to list these specialties for you: statistics, computer science, operations research, industrial engineering, economics…

What Is Analytics?

There is a surprising admission in an article entitled ‘What Is Analytics?’:

“It’s not likely that we’ll ever arrive at a conclusive definition of analytics…”

In an article entitled ‘Operational research from Taylorism to Terabytes: A research agenda for the analytics age’, the authors state:

“…may be the lack of any clear consensus about analytics’ precise definition, and how it differs from related concepts.”

The failure to construct a single definition that encompasses the meaning of analytics is not surprising: the word analytics is used in three different ways, with three separate meanings, and therefore requires three separate definitions:

  • analytics is used as a synonym for statistics or metrics. Examples are website analytics (how many views or clicks) or scoring analytics (number of points scored per 100 possessions).
  • analytics is used as a synonym for data science. Examples are data analytics, predictive analytics, or operations research and advanced analytics [the preceding phrase refers to two separate things: operations research and data science (advanced analytics)].
  • analytics is used to represent all of the quantitative decision sciences. This is the Davenport ‘Competing on Analytics’ usage.

Once it is recognized that three definitions are needed, it becomes possible to answer questions about analytics that previously caused problems. For example:

Question – Is analytics a discipline?

Answer – no, yes, no

The answer depends on which meaning of analytics we are referring to:

  • analytics = statistics/metrics. No. This is a type of measurement; it is context-sensitive and essentially involves counting.
  • analytics = data science. Yes. Data science can be considered to be a discipline that combines elements of statistics and computer science.
  • analytics = all quantitative decision sciences. No. Analytics represents disciplines, but is not itself a discipline. (See Confusion Over Analytics)

So, not only can we arrive at a conclusive definition of analytics, we can (and must) arrive at three conclusive definitions of analytics!

Applied Research

Have you ever read a research paper that began something like the following?

“Consider the situation where there is a single customer and exactly two producers of widgets. Demand for widgets follows a Poisson distribution with parameter λ. Further, on the first day of each month, each producer sets its price for the coming month without knowing the price of the other producer. Producer 1 has a maximum production capacity of M1, and producer 2 has a maximum production capacity of M2. To meet monthly requirements, the customer will buy as many widgets as possible from the low-cost producer. We show that an optimal…”
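(To see just how stylized such a setup is, here is a minimal simulation sketch of the toy model as stated. The demand rate, capacities, and price range below are invented placeholders.)

```python
# Minimal simulation of the toy duopoly above: monthly Poisson demand,
# two capacity-limited producers, and a customer who buys as much as
# possible from the cheaper producer.  All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(0)
lam, M1, M2 = 100, 60, 70                 # demand rate and production capacities

for month in range(12):
    p1, p2 = rng.uniform(8, 12, size=2)   # each producer prices independently
    demand = rng.poisson(lam)
    offers = sorted([(p1, M1), (p2, M2)])  # cheaper producer first
    from_cheap = min(demand, offers[0][1])
    from_other = min(demand - from_cheap, offers[1][1])
    print(f"month {month + 1:2d}: prices ({p1:.2f}, {p2:.2f}), "
          f"demand {demand}, filled {from_cheap + from_other}")
```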

Regarding papers such as this, Russell Ackoff, in a 1979 paper entitled ‘The Future of Operational Research is Past’, wrote:

“…engaging in impure research couched in the language, but not the reality, of the real world.”

And, more recently, ManMohan Sodhi and Christopher Tang, in a 2008 paper entitled ‘The OR/MS Ecosystem: Strengths, Weaknesses, Opportunities, and Threats’, wrote:

“…OR/MS research is retreating from real-world applications.”

and

“Without testing in the ‘real world,’ there is no correcting force to prevent OR/MS from becoming ‘too mathematical’.”

It seems to have been recognized for some time that much of the OR/MS research being conducted has become disconnected from real-world problems. In the view of Sodhi and Tang, this leads to “excessive self-referentiality” and research that loses relevance to practitioners and end users. To the extent that this is true, such research will not increase interest in, or the number of users of, OR/MS.

A Simple Solution

There is, however, a simple way for OR/MS journals to encourage more relevant research. Currently, the journal ‘Operations Research’ publishes two main types of papers: application papers and theory papers. Application papers describe solutions that have actually been deployed by an organization, while theory papers are not required to relate directly to a real-world problem. The overwhelming majority of published papers are theory papers. What is needed is a third type of research paper: applied research. These papers will not describe deployed applications, but unlike theory papers, they will be required to describe and solve a real-world problem.

The determination as to whether a paper solves a real-world problem will be made during an expanded review process: judgments will be made as to whether the stated problem is an actual real-world problem, and whether the proposed model could actually solve it.

The new applied research category can be implemented in a completely positive way: a journal can allocate additional pages to this category, while continuing to publish the same number of pages in the theory paper and application categories. Applied research will become a new option available to prospective authors.

Everybody Wins

As authors begin to reconnect with real-world problems, and experience the benefits of publishing applied research papers — more consulting opportunities and greater recognition of their research — this type of research will become increasingly popular. Moreover, the benefits will not be limited to the authors of applied research papers: the departments in which they reside, operations research practitioners, government and commercial organizations, and the OR/MS profession as a whole, will benefit from increased awareness of the value of operations research solutions.

Hidden In Plain Sight

I do not usually subscribe to conspiracy theories. However, I find myself wondering whether operations research could have reached its current state of obscurity by chance. Consider what you might do if you were tasked with hiding operations research. You would probably realize that it would be impossible to keep it completely secret, so you might come up with an alternative approach: make it very difficult to get information about operations research and its benefits, and use disinformation to confuse those who might be interested in it.

To implement this strategy, you might take the following steps:

  • Encourage operations research journals to publish papers that almost no one (except for a few specialists) can understand;
  • Do not require operations research journal articles to relate to real world situations or problems;
  • Make sure that the papers in the practice journal that describe successful operations research projects are available only to a few subscribers and research libraries, and never publicize their existence;
  • Bury the videos describing world class operations research projects deep inside a single website, and make some of them available only as a membership benefit;
  • Stop using the name operations research, and replace it with the ambiguous term analytics;
  • Do not promote operations research, and instead use all resources to promote analytics, big data, and data science.

These steps happen to correspond exactly to the current approach to ‘promoting’ operations research in the United States. And this approach has had the expected result: very few people understand operations research or its benefits.

Recently, however, there has been a surprising development: the editor of the journal ‘Manufacturing & Service Operations Management’ has created a review blog where authors can present a non-technical summary of their articles. He is also encouraging authors to publicize their articles through social media.

While this is a small step at a single journal, it represents a new approach to promoting operations research. Imagine what might happen if, instead of restricting access to information about operations research and its value, we took advantage of the internet, social media, and thousands of operations research professionals to publicize that value:

  • Imagine if all operations research journals began to encourage research designed to solve real problems, and then publicized that research;
  • Imagine if thousands of operations research professionals began to tweet, share, and blog about their research and the value of operations research;
  • Imagine if large numbers of videos describing successful operations research projects were placed on YouTube and promoted on social media;
  • Imagine if free online journals describing successful operations research projects were created and widely distributed.

Now, imagine a future in which the practice of operations research is ubiquitous.

Big Data and Tulip Bulbs

In Holland, during the winter of 1636/1637, the price of tulip bulbs rose dramatically. By February, the price of certain bulbs was equal to the price of a large house. Then suddenly, the price of bulbs began to drop sharply; by May, they were practically worthless. Many people were wiped out financially, and the country was shaken.

In the United States, during the Roaring Twenties, the stock market boomed. This caused great excitement, and people from all walks of life were ‘in the market’. In October 1929 the market crashed, and by July 1932 it had lost 89% of its value. The collapse of the banking system and the Great Depression followed.

During the dot-com boom of the late 1990s, there was great excitement over the creation of ‘new economy’ companies based on the internet. New internet companies, some without profits or revenues, were funded and went public. The prices of these dot-com stocks soared. The NASDAQ index (which was heavily weighted with technology and dot-com stocks) peaked in March 2000; it then declined precipitously, and by October 2002 it had lost 75% of its value. In the market crash of 2000-2002, $5 trillion of market value was lost.

As the real estate market boomed in the late 1990s and early 2000s, people bought houses they couldn’t afford, banks progressively lowered their lending standards, and investors bought financial instruments they didn’t understand. It was widely believed that rising real estate prices and the pooling of mortgages in new financial instruments had eliminated risk. Then, in 2006, real estate prices began to decline; by 2008, millions of mortgages were in default, and the value of the new financial instruments fell precipitously. By the fall of 2008, the entire financial system was on the brink of collapse, and the worst economic downturn since the Great Depression ensued.

In all of these speculative bubbles, ‘irrational exuberance’ was present. People behaved as if they were under the influence of a ‘reality distortion field’. They were enthralled by a compelling narrative — people will come from all over Europe to buy tulip bulbs, or the internet made traditional financial analysis irrelevant — which blinded them to the actual situation. Information that seemed to support the narrative was emphasized, and anything that conflicted with it was explained away or ignored.

Now we can again witness ‘irrational exuberance’, this time surrounding ‘big data’. There is again a compelling narrative: data and algorithms will transform human life. Any claim, no matter how extravagant, is taken seriously.

While the claims for ‘big data’ are extreme, supporting evidence is sketchy or nonexistent. Few examples of successful ‘big data’ projects outside of internet marketing are given, and no theoretical basis is offered. A 2/2/15 article in InformationWeek reported on a survey of executives that found that only 27% of ‘big data’ projects were successful. This lack of success was attributed to faulty implementation; possible limitations of the technology were not considered.

Of course, companies offering ‘big data’ services are happy to promote the ‘narrative’ through marketing hype. And interestingly, one proponent of ‘big data’ said “it’s probably a bubble”, and another said “big data has certainly been hyped”. They both then continued to hype ‘big data’!

So, will it be different this time? Will the use of ‘big data’ grow without limit? Are you planning to take a job with a ‘big data’ startup company?

Should We Re-Brand Operations Research?

There are some in the operations research community who want to re-brand operations research. They would like to be called analytics professionals. The reasoning behind this appears to be the following:

  • Operations research is not that popular;
  • Analytics is very popular;
  • They would like to be popular, so;
  • They will call themselves analytics professionals, and then;
  • They will be popular.

Here, I will not dwell on the flawed premises or the faulty logic embodied in this reasoning. Instead, I will focus on the consequences of a successful re-branding. When considering these consequences, we should keep the following points in mind:

  • While those promoting analytics have trouble defining it, they are in agreement that it encompasses many different disciplines (see Confusion Over Analytics), such as statistics, computer science, data science, big data, business intelligence, and operations research.
  • Operations research represents a tiny fraction of the IT/analytics universe.
  • The existence of generic analytics professionals would imply that there is no longer a meaningful distinction to be made between the ‘former’ disciplines of statistics, computer science and operations research.

To help you envision the period after re-branding, I offer two scenarios. In both, an IT executive is speaking to the leader of what was once an operations research group and is now an analytics group. Remember, operations research no longer exists!

Scenario A

“Alice, I am assigning you and your team to be part of our data quality initiative.”

“But, Sir.”

“No buts Alice, big data is our priority — we must have high-quality data!”

Six months later….

“Well done Alice. You and your team have reduced the error rate by 6%. I’m going to make this assignment to data quality permanent.”

Scenario B

“Tom, I am assigning you and your team to our text analytics initiative.”

“But Sir.”

“No buts Tom, our competitors are all heavily involved in this area — we will not be left behind!”

Six months later….

“Tom, you and your team don’t seem to be up to the task — all of your projects are months behind schedule. I’m going to have to let you and your team go. Report to human resources and pick up your termination package.”

Conclusion

So, in one case those who have re-branded survive, and in the other case they do not. In both cases, the practice of operations research ends.

Will Operations Research Survive?

There have been some troubling signs: a 2010 article in OR/MS Today suggested that analytics would subsume operations research; a 2013 LinkedIn discussion asked “Will Big Data end Operations Research?”; and ominously, even INFORMS seems to be distancing itself from operations research.

How should we react to this? Should we:

  • Take early retirement, move to Vermont, and open a bed and breakfast?
  • Claim to be analytics professionals, and hope no one asks us about Hadoop or NoSQL?
  • Return to school to study data science?

No! None of the above will be necessary. To understand why, we need to go back to first principles.

The original meaning of the name operational research (which is what operations research is called in Great Britain, where it was invented) was, literally, scientific research on operations. The name was meant to distinguish scientific research on operations from scientific research on the underlying technology of some product, e.g., radar. In the late 1930s, the British Government funded scientific research directed toward creating radar equipment with sufficient range and precision to locate attacking aircraft. It also initiated an operations research study to determine the most effective way to deploy the radar stations and integrate them into an effective air defense system.

This type of scientific research, and the scientific method upon which it is based, is a problem solving paradigm. Operations research is the application of this problem solving paradigm to the solution of operational and management problems.

During the summer of 1940, this paradigm arguably saved Great Britain from defeat. Today, as the Edelman Competition routinely demonstrates, this paradigm creates benefits so great that they transform entire organizations. And it is because of this paradigm that operations research can create value that can be created in no other way. This value — lower costs, higher profits, military advantage, more efficiency, better service — was needed in 1940, is in evidence all around us today, and will be in demand for as long as human civilization persists.

So, there is no cause for alarm. Just continue ‘Doing Good with Good OR’.

Apache Helicopter

Certification Wars

“Jim, you look beat.”

“Oh, hi Mary, I was up late crunching the numbers.”

“Oh?”

“You know, since The Alliance entered the analytics certification market, we’ve been losing market share.”

“I know. It’s not fair — our test is much better than their test.”

“It’s true, but unfortunately, no one can tell the difference.”

“Well, the board has just approved MAP 2 — we’ll meet the $99 price point and cut our test to 49 questions.”

“Yes, but have you heard the latest?”

“No, what?”

“Now, when you get certified by The Alliance, they send you a beautifully engraved brass plaque. Actually, I wouldn’t mind hav”

“JIM!”

“Sorry. Listen Mary. I’ll tell you what’s keeping me up at night.”

“What’s that?”

“Well, it’s only a rumor, but, the way the story goes, The Alliance has gotten to some key California legislators, and they’re getting ready to push through an Alliance based licensing program for all analytics professionals in California.”

“Licensing! But…but…but then, we’ll all be screened out!”