When Standardized Data Becomes Substandard
In industry after industry, the emergence of standardization, whether by law or by user consensus, has enabled tremendous advantages in economy, efficiency and scale.
Electric appliances are an obvious example. They could never have become the mass market conveniences we know today without the standardization of house current and outlets. Digital devices are another example; PCs could not have become the indispensable tools we now take for granted without standardized operating systems. Manufacturing processes, product branding and quality control systems are also among the beneficiaries of standardization. That list goes on and on.
At the same time, however, it's important to recognize that standardization also carries tradeoffs. Consider, for example, the case of medical treatment or of academic testing, where individual differences are paramount. Those tradeoffs can be especially significant when standards emerge too soon -- before the contours of the subject matter are fully mapped, or when its complexity defies comprehensive mapping -- because the standard ends up aiming at a target that is still moving.
What brings this to mind are the recent acquisitions of Tableau Software and Looker by Salesforce and Google, respectively. I believe both were made in recognition of the emerging power of analytics, driven by big data, to generate business intelligence and reshape the world's commercial environment. The business rationale is understandable.
The dynamics of today's market push companies like Salesforce and Google -- with extensive networks of users across verticals and industries who demand faster access to data and quicker ways to make sense of it -- to acquire technologies that ingest and draw connections between massive sets of data. This kind of functionality is necessary to deliver on the promises of advanced analytics, intelligence and visualization. It's among the reasons why Salesforce also acquired an analytics provider, Datorama, last year and why Intuit, a company focused on the financial market, acquired Origami Logic for its advanced data integration, ingestion and analytics capabilities.
A premium has been placed on companies that can deliver seamless data connector technologies and advanced intelligence solutions that are adaptable across vertical markets. However, there is the added complexity in bringing together all these capabilities -- the many non-integrated, often SaaS-based platforms -- into a unified solution accessible via the cloud. Salesforce and Google face immense challenges unifying data sources, connectors and analytics platforms before these acquisitions will truly benefit the end user.
On the other hand, one of the obvious advantages of consolidating under the banners of these market leaders is the ultimate standardization of their offerings. This would essentially transform their key features into commodities that can be offered at prices that rivals can't match. For example, in pricing its business intelligence platform for large enterprises, Microsoft charges users less than $10 a month -- an amount that rival business intelligence vendors simply couldn't sustain. Being in a position to scale a product or service enough to enjoy the associated economies is a huge business advantage.
That said, however, the types of data collected, the categories into which they are sorted and the algorithms by which that information is interpreted are not set in stone. While they may be slow to change, business environments, as well as the relationships of people to products and services, will continue to evolve in ways that standardized approaches, particularly those which have become locked into legacy systems, can miss. And as the selection of alternative marketing software platforms shrinks, that risk grows.
What, exactly, are those risks? For one thing, the practice of customer mapping has become as crowded and complex as the range of behavior it attempts to capture. A recent analysis by ChiefMarTec.com identifies more than 7,000 separate marketing technology solutions currently on the landscape -- a staggering number that has seen double-digit growth every year since 2011 and shows no signs of slowing down.
While many of them will eventually disappear as a result of technical difficulties, redundancy, consolidation, high cost or other factors, some of them have distinctive qualities with real predictive value. Accurately modeling a target market could easily involve a handful of critical metrics coming from different sources. But unless they're compatible with the leading vendors' standardized offerings, their value may never find its way to the marketing professional.
Ultimately, marketing is all about understanding people -- how they want to be connected with and catered to. And more data from more people can make campaigns more effective.
For better and for worse, the world and the people in it are highly diverse. Human diversity, in turn, begets data diversity, not homogeneity and standardization. The standardization of data may ultimately do more to hinder the ability of marketers to effectively model and reach targets than to enhance it.
Missing The Mark
The risks of relying on standardized data at the expense of non-standard sources take on particular urgency because of the high cost of marketing initiatives. For marketing professionals, realizing an acceptable return on investment from the allocation of funds that grows out of their data analysis is critical. But if that analysis can't include valuable data from non-standard sources, the resulting spend and mix of marketing tools risks being misplaced.
Standardizing onto limited data sets has real-life consequences, like missing out on innovation. It means relinquishing the ability to harness the collective intelligence and power that different datasets can provide. For the consumer, that can translate into reduced personalization, a lack of authenticity and an erosion of trust in the seller. For the vendor, it means a reduced ability to convert potential buyers into actual customers.
Data diversity enhances market intelligence. Good intelligence needs to reflect the wide-ranging and distinctive qualities of each customer's journey.