
Conduct Analytics the Way A Military General Would Conduct A Battle Campaign

Prithvi Chandrasekhar, Managing Director - Credit Risk & Analytics, InCred

“Amateurs discuss strategy. Real generals discuss logistics.” I find that this military truism has a direct parallel in business analytics: “Amateurs discuss big data and machine learning. Real generals discuss experimental design.” The implication: the big change in analytic practice I’d like to see is a renewed focus on the ancient, unglamorous discipline of experimentation (aka Randomized Control Trials, or A/B testing) rather than on shiny new big-data or machine-learning toys.

I’ve worked in analytics for almost two decades now. I learnt this trade at Capital One, at a time before the term analytics was coined (in those days it was called Information Based Strategy or IBS, an acronym that was open to other interpretations). Over the years, analytics has matured to a point where it is no longer a geeky fringe interest. It is now an essential part of how most industries - BFSI, telecom, retail, transportation, health care and many others - operate.

Yet, it remains widely misunderstood, and therefore widely mispracticed, even by serious professionals with a real stake in making analytics work. The biggest, most common misunderstanding: lots of energy expended on cool big-data mining and AI, unsupported by rigorous experimentation.

This came back to mind recently when I was having a beer with an old friend and former client, now CMO of a Canadian telecommunications company. His company had a ton of data; really big data: location data that told him where his customers were at any time of day, text and call metadata that told him who his customers communicated with and how often, millions of customer service call recordings, billing data, payments data, and more.

My friend had made a name for himself, and earned his promotion to CMO, as the one who had brought exciting, cutting-edge data science to the telco. He had the vision to perceive the telco’s data as a treasure trove. Therefore, he fought hard for expensive IT projects to organize the telco’s data in a Hadoop cluster. He hired a well-qualified Business Analytics department to mine this data. The data scientists did excellent work. They used the best analysis techniques available. They used Support Vector Machines for segmentation, instead of demographics, to find tight clusters of less price-sensitive customers. They used XGBoost to squeeze more Gini out of the upsell propensity model.
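The Gini coefficient mentioned here is the standard yardstick for a propensity model: it equals 2 × AUC − 1, where AUC is the probability that the model scores a randomly chosen responder above a randomly chosen non-responder. A minimal sketch of the metric, with invented scores and outcomes rather than the telco’s data:

```python
def gini(scores, outcomes):
    """Gini coefficient of model scores against binary outcomes.

    Gini = 2 * AUC - 1, where AUC is computed by comparing every
    positive example's score with every negative example's score
    (ties count as half a win).
    """
    pos = [s for s, y in zip(scores, outcomes) if y == 1]
    neg = [s for s, y in zip(scores, outcomes) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    return 2 * auc - 1

# Illustrative upsell-propensity scores and observed take-up (1 = accepted).
scores = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
outcomes = [1, 1, 0, 1, 0, 0]
print(round(gini(scores, outcomes), 3))  # 0.778
```

A model that ranks customers at random scores a Gini of 0; a perfect ranker scores 1.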

One analytics project that went viral in the management suite used the customer’s tone of voice, recorded on a retention call, to predict which customers were most likely to churn. My friend gave a demo of this project at a board meeting to show how his analytics investments were making a real difference.

Yet, almost a decade after its inception, Business Analytics remained a support function, working mostly within its own silo. My friend, the analytics evangelist, was becoming more skeptical. The most important business decisions, like pricing, were made the way they had always been made - based on experience and judgement, with no data scientists at the decision-making table. Tellingly, attrition rates didn’t fall despite the iconic voice-data-mining project.

We got talking more deeply about the voice-data-mining project to understand how it had ended. The natural next step would have been to offer a better price/service package to the customers most likely to churn, and to test it against a business-as-usual control. However, Operations didn’t have the flexibility to offer different packages to different customers at the call-center level. So, this iconic project remained a show pony despite the quality of the big-data analytics, the insight, and the management support.

The frustration of this story really resonated with me, because this is the narrative arc followed by most analytics projects: great data, amazing technique, cool insights, no action, and therefore no business impact.

My guidance (and the Capital One approach to analytics) would be to turn the action cycle upside down. Start with the treatment; in my Canadian friend’s case, that would be the change in the price or service packages offered to retain customers. Test this treatment in the real world against a randomized control, on customers selected with very simple analytics. Mine the results of the test with big-data techniques to further refine the strategy. Very few people would disagree with the principle of experiment-based learning; it is, after all, the foundation of the scientific method. So why doesn’t this happen?
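The experiment-first loop can be sketched in a few lines. Everything here is a hedged illustration - the customer IDs, the 50/50 split, and the stand-in churn outcomes are assumptions, not the telco’s actual setup:

```python
import random

def assign(customer_ids, treatment_share=0.5, seed=42):
    """Randomly split customers into treatment and control groups."""
    rng = random.Random(seed)
    treatment, control = [], []
    for cid in customer_ids:
        (treatment if rng.random() < treatment_share else control).append(cid)
    return treatment, control

def churn_rate(churned_ids, group):
    """Fraction of a group that churned during the test window."""
    return sum(1 for cid in group if cid in churned_ids) / len(group)

# Randomize 1,000 illustrative customers; treatment gets the new package.
treatment, control = assign(range(1000))

# Stand-in outcomes observed at the end of the test window (uncorrelated
# with assignment here, so the measured lift should be near zero).
churned = {cid for cid in range(1000) if cid % 7 == 0}
lift = churn_rate(churned, control) - churn_rate(churned, treatment)
print(f"churn lift (control - treatment): {lift:+.3f}")
```

Only once a test like this shows a real lift do the heavier big-data tools earn their keep, mining the results to sharpen who gets which treatment.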

Typically, the reason this doesn’t happen is institutional. Experimental learning is difficult and expensive. Outsiders - consultants, academics, and Business Analytics departments - do not have the authority, knowledge or resources to make this experimentation happen. Fantastic big-data and machine-learning tools, by contrast, are now relatively cheap and easily available. The temptation is to buy into the belief that these amazing tools can create something out of nothing - to quickly sell an impressive project, to publish a journal paper - and to worry about the experimentation later. Unfortunately, this usually leads to cycles of frustration and cynicism.

So, my guidance and exhortation to my analytics colleagues is to be like the general who obsesses about logistics, and to obsess about experimental learning. Resist the temptation to build gee-whizz analytics without the right experimental foundation; that results in a battlefield strategy without logistics support - a path to defeat. Instead, focus first on getting operational, real-world, in-market experiments in place, even if imperfect. Our big-data and machine-learning tools can kick in subsequently to game-changing effect, enabling triumph on the battlefield.