There are two ways of segmenting customers. The most common is to decide in advance how many segments we want and what their general parameters are. Nonprofits do this a lot: they segment mailing lists into small, medium, and large donors. But this approach opens the door to misleading views of customers.

For example, consider segmenting a roomful of people by height. One way would be to decide on the categories (short, average, and tall) and then assign each person to the proper category. The problem with this approach is most clearly seen if the roomful of people is at the four-year-old story time at the library. There are actually only two categories of people here, not three: short four-year-olds and taller moms.

In essence, such a segmentation approach takes an opaque block of data and arbitrarily cuts it into smaller opaque segments. This approach was common in the "decision support systems" of the previous decade, which mostly evolved into "decision justification systems" used to justify arbitrary management decisions.

The other way is much harder: "listen" to the data and see how customers actually group, or "cluster." By listening to the data we can spot our flawed preconceptions about categories. This approach shines light through the block of data, illuminating where the data actually lies, like sonar showing where fish are schooling.

The second approach is where the art of data mining comes in. In true data mining, we don't start out knowing the right questions to ask, let alone the correct answers. We let the data guide us to them, using tools like cluster analysis to hear what the data is saying.

One axis of measurement should be customer equity. We look for significant groupings of customers that provide clues about why some customers are more profitable than others.
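To make the height example concrete, here is a minimal sketch of "listening to the data" with a simple one-dimensional k-means clustering routine. The heights are invented for illustration: a story-time room of four-year-olds (around 40 inches) and their moms (around 65 inches). Rather than imposing short/average/tall categories, we let the algorithm find where the data actually clusters.

```python
import statistics

def kmeans_1d(values, k, iters=50):
    """Cluster 1-D values into k groups; return the sorted cluster means."""
    # Start the centers spread evenly across the range of the data.
    lo, hi = min(values), max(values)
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest center.
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Move each center to the mean of its assigned values.
        centers = [statistics.mean(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical heights in inches: six four-year-olds and five moms.
heights = [39, 40, 41, 40, 42, 38, 64, 66, 63, 67, 65]
print(kmeans_1d(heights, k=2))  # two tight clusters: kids (~40) and moms (~65)
```

Asking for three clusters instead of two would dutifully split one of these natural groups in half, which is exactly the trap of deciding the categories before looking at the data.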
We also look for clues about how to move them up the profit chain.

For example, at one nonprofit I worked with, cluster analysis showed that the natural clustering of patrons was tied more closely to how many times they had contributed than to the grand total of their gifts. Once we understood that faithfulness in giving mattered more than a one-time "flash in the pan," we could properly cultivate our long-time, profitable donors rather than chasing after another long-shot "flash."

Progressive Insurance has put such data-driven segmentation into practice, refining its customers into much more meaningful clusters than its competition. It has been reported that Progressive scores vehicles according to 12 different metrics, and credit along 16 different metrics. The results illustrate what we know intuitively: not all young drivers are high risk, and not all older drivers are low risk. The analysis not only allows Progressive to avoid turning away potential customers unnecessarily, but also to calculate appropriate premiums that reach positive customer equity faster.

Look for a lot of activity in data mining and data visualization in the next few years. "In five years, 100 million people will be using an information-visualization tool on a near daily basis. And products that have visualization as one of their top three features will earn $1 billion per year," says Ramana Rao, founder and chief technology officer of Inxight Software.

A good book to read is The Visual Display of Quantitative Information by Edward R. Tufte (1983, Graphics Press). The title is dry, but the book is filled with examples of elegant, deceptively simple graphical techniques.

Some lessons to help you gain insights from your data:
- Do you segment data according to your own rules, or listen to the data?
- When did you last re-examine your segmentation strategies?
- Are your segments statistically significant?
- How do you visualize your data to see if segments are meaningful?
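The nonprofit donor example above can be sketched in a few lines. The donor records and the cutoff values here are hypothetical, chosen purely for illustration: a one-time "flash" donor can match a faithful donor's lifetime total while looking nothing like them, so segmenting by total alone lumps the two together.

```python
# Hypothetical donor records: gift count and lifetime total in dollars.
donors = {
    "A": {"gifts": 12, "total": 600},  # faithful: $50 x 12
    "B": {"gifts": 10, "total": 500},  # faithful
    "C": {"gifts": 1,  "total": 550},  # one-time "flash in the pan"
    "D": {"gifts": 11, "total": 440},  # faithful, smaller gifts
    "E": {"gifts": 1,  "total": 100},  # one-time small donor
}

def segment_by_total(d, cutoff=450):
    """Rule-imposed segmentation: 'large donors' by lifetime total."""
    return {name for name, rec in d.items() if rec["total"] >= cutoff}

def segment_by_faithfulness(d, min_gifts=3):
    """Data-suggested segmentation: repeat givers vs. one-timers."""
    return {name for name, rec in d.items() if rec["gifts"] >= min_gifts}

print(sorted(segment_by_total(donors)))         # the flash donor C slips in
print(sorted(segment_by_faithfulness(donors)))  # cultivation targets: A, B, D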
If you don't have data mining and visualization expertise in-house (few do), hire a professional for a short-term contract.

If you don't know what's happening in your customer database, you're "flying blind." And, like a plane without proper instrumentation, you'll eventually crash and burn on the mountain you should have seen, but didn't.