21st-century consumer packaged goods (CPG) and fast-moving consumer goods (FMCG) manufacturers face a challenge.
Competitive and market forces and a shifting landscape of consumer demands have put these companies under increasing pressure to find top- and bottom-line growth in places besides increased sales volume.
Over the past decade, an evolving practice that goes by any number of names (Revenue Growth Management, Net Revenue Management, Strategic Revenue Management) has emerged alongside the advent of increasingly rich data sets and practices.
The pitch sounds tantalizingly straightforward: predictive analytics, leveraged across pricing, promotion, product and marketing strategy to uncover untapped opportunities for expanding margins, lowering risks and optimizing spend.
Straightforward, maybe, but certainly not simple.
Orchestrating data, product and people for CPG revenue management
“This is where the power of data can really shine: intelligently sussing out an optimal solution from complicated systems with hundreds of thousands, or millions, of possible permutations.” (Dan Hopewell, UX/Service Design Principal at Kin + Carta)
Revenue Management: From “complicated” to “complex”
CPG revenue management is, by its very nature, a multi-dimensional enterprise. Nothing happens in isolation.
An interconnected set of revenue and cost levers, each as central to business success as the next, are manipulated and optimized in unison. Activities such as promotional planning, pricing strategy and price–pack architecture, portfolio and assortment strategy, marketing mix, and trade spend management are tweaked in concert over a field of hundreds, perhaps thousands, of SKUs. In conjunction, there is demand forecasting that accounts for complicated interactions across various product lines, pack sizes, vendors and channels. This forecasting includes factors such as substitutability, complementarity, cross elasticity, cannibalization and seasonality.
All of this with an eye trained holistically on both the top and bottom lines.
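To make one of these forecasting interactions concrete, cross-price elasticity can be estimated from paired observations of one product's demand and a related product's price. The sketch below uses the midpoint (arc) formula with entirely hypothetical numbers; a positive value suggests the two products are substitutes, a negative value that they are complements.

```python
def cross_price_elasticity(qty_before, qty_after, price_before, price_after):
    """Arc (midpoint) cross-price elasticity: the % change in product A's
    demand divided by the % change in product B's price."""
    pct_qty = (qty_after - qty_before) / ((qty_after + qty_before) / 2)
    pct_price = (price_after - price_before) / ((price_after + price_before) / 2)
    return pct_qty / pct_price

# Hypothetical example: a rival SKU's price rises from 2.00 to 2.50 and
# our SKU's weekly units climb from 400 to 440.
elasticity = cross_price_elasticity(400, 440, 2.00, 2.50)
print(round(elasticity, 3))  # 0.429 -> positive: the two SKUs act as substitutes
```

In practice these coefficients would be estimated econometrically across the whole assortment, but even this toy version shows why pricing one SKU can never be analyzed in isolation.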
This juncture is where the power of data can shine: intelligently figuring out an optimal solution from complicated systems with hundreds of thousands, or millions, of possible permutations. And if CPG revenue management were “merely” a complicated problem—an optimization problem that only needs raw number-crunching power—that might be that.
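To give a sense of that combinatorial scale, consider a deliberately simplified sketch (toy numbers and a hypothetical constant-elasticity demand model, not a production approach): even a tiny assortment explodes into trillions of whole-portfolio permutations, although any single SKU can still be searched exhaustively.

```python
from itertools import product

PRICE_POINTS = [1.99, 2.29, 2.49, 2.79, 2.99, 3.49]
# promo mechanic -> (demand lift multiplier, fixed promo cost), made-up values
PROMOS = {"none": (1.00, 0.0), "display": (1.15, 40.0),
          "feature": (1.25, 55.0), "display+feature": (1.40, 90.0)}

# A 10-SKU assortment with 6 price points and 4 promo mechanics per SKU
# already yields (6 * 4) ** 10 = 24 ** 10, roughly 6.3e13 portfolio plans.
permutations = (len(PRICE_POINTS) * len(PROMOS)) ** 10

def expected_margin(price, promo, base_units=100.0, unit_cost=1.20,
                    elasticity=-1.8, ref_price=2.49):
    """Toy constant-elasticity demand model: margin net of promo cost."""
    lift, promo_cost = PROMOS[promo]
    units = base_units * (price / ref_price) ** elasticity * lift
    return units * (price - unit_cost) - promo_cost

# Exhaustive search is trivial for one SKU (24 options)...
best_price, best_promo = max(product(PRICE_POINTS, PROMOS),
                             key=lambda opt: expected_margin(*opt))
# ...but brute force across the whole portfolio is hopeless, which is why
# real systems lean on optimization and machine learning instead.
```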
Successful operationalization, however, unfurls a web of overlapping technical and human factors that tip the scales further into something truly complex.
A complex system is, among other things, one that is rife with tradeoffs and contradictory priorities; one that lends itself less to identifying singular “optimal” solutions than to finding the appropriate equilibrium between diverse goals in tension.
And the operational complexity of revenue optimization starts with wrangling the data itself.
The wealth of data available to CPGs today presents a double-edged sword.
On the one hand, these rapidly proliferating data sources—including granular shopper-level POS data light-years beyond the monolithic and aggregated sell-in data of years past—open up new frontiers for micro-targeting and precision planning.
This wealth of data, though, presents daunting challenges to efficient operationalization.
Cleansing, harmonizing and integrating the first-, second- and third-party CPG data from disparate upstream sources and then effectively deploying them across an equally diverse array of downstream retailers and channels is no trivial task. Merely preparing data for even a simple analysis can easily take up more than half of any data team’s time. And the complexity of data sources, destinations and necessary transformations only compounds for manufacturers operating across multiple categories and markets, with their inconsistent retail practices and data availability.
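As a minimal illustration of that harmonization step (the feed layouts, column names and values here are hypothetical), the sketch below maps two retailers' differently shaped sell-out extracts onto one canonical schema:

```python
from datetime import datetime

CANONICAL = ("product_code", "week_ending", "units", "revenue")

def harmonize(rows, column_map, date_format):
    """Rename retailer-specific columns to the canonical schema and
    normalize week-ending dates to ISO-8601."""
    harmonized = []
    for row in rows:
        record = {column_map[col]: value for col, value in row.items()}
        record["week_ending"] = (datetime.strptime(record["week_ending"], date_format)
                                 .date().isoformat())
        harmonized.append(record)
    return harmonized

# Two made-up feeds describing the same week, shaped very differently.
retailer_a = [{"UPC": "012345678905", "WeekEnding": "2024-06-01",
               "Units": 120, "Dollars": 298.80}]
retailer_b = [{"ean": "4006381333931", "week_end_date": "01/06/2024",
               "qty_sold": 85, "revenue_local": 240.55}]

combined = (
    harmonize(retailer_a, {"UPC": "product_code", "WeekEnding": "week_ending",
                           "Units": "units", "Dollars": "revenue"}, "%Y-%m-%d")
    + harmonize(retailer_b, {"ean": "product_code", "week_end_date": "week_ending",
                             "qty_sold": "units", "revenue_local": "revenue"}, "%d/%m/%Y")
)
```

Real pipelines add currency conversion, deduplication, hierarchy mapping and validation on top of this, which is exactly why the work consumes so much of a data team's time.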
These are ultimately the challenges—and the tradeoffs—of scale.
But if a firm’s commitment to data is to be anything more than a one-off analysis, the processes and infrastructure for ingesting and extracting data deserve as much strategic forethought as any modern software build, and a level of governance befitting a full-fledged business function.
This is about striking a strategic balance between the richness and granularity needed to take full advantage of the power of modern data, and the standardization and global consistency needed to efficiently leverage this power at scale, while maintaining the flexibility required to adapt to both local constraints and changing circumstances.
Because change, while often unpredictable, is inevitable.
“Change is constant, accelerating and often outside of your control.” (Dan Hopewell, UX/Service Design Principal at Kin + Carta)
Adaptation and continuous learning
Change is constant, accelerating and often outside of your control.
From unforeseen shifts in the competitive landscape to evolving consumer preferences to changing regulatory and trade environments, set-it-and-forget-it approaches to data can reach their sell-by dates in the blink of an eye. Historical data, and the models that depend on it, will need constant reassessment, just as business domain rules should be expected to evolve continually in response to the world around them.
This includes permanent tectonic shifts (such as renewed trade barriers and new consumer protection legislation) as much as temporary earthquakes requiring companies to adapt on the fly (like sudden supply chain disruptions in the Suez Canal or the unexpected demand shocks of a global pandemic).
Anticipating change—indeed, embracing it—is why modern paradigms of both software development and business strategy place learning cycles and iterative building at the center of their philosophies.
Data systems, too, need to be architected upfront with adaptability and learning built-in. Expect from the start that your models will need continuous retraining, that new and better data will need to be integrated, and that any historical data will need to be continually re-validated and recontextualized.
Automation can go a long way towards making such iterative change routine and relatively frictionless, akin to CI/CD in modern software delivery. But it’s just as critical to operationalize learning—to implement feedback loops as a deliberate and intrinsic part of both your tooling and your business processes from the start.
This means instrumenting your systems to gather an ongoing stream of telemetry data about your data.
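One concrete form such telemetry can take is a drift check that flags when incoming data has shifted away from what a model was trained on, automatically routing the signal into a retraining loop. The sketch below uses a crude mean-shift score with made-up numbers and a hypothetical threshold; production systems would typically use richer metrics such as PSI or Kolmogorov–Smirnov tests.

```python
from statistics import mean, stdev

RETRAIN_THRESHOLD = 2.0  # hypothetical policy: flag shifts beyond 2 sigma

def drift_score(baseline, current):
    """How far the current data's mean has drifted from the baseline,
    measured in baseline standard deviations."""
    return abs(mean(current) - mean(baseline)) / stdev(baseline)

# Made-up weekly unit sales: a stable baseline vs. a sudden demand shock.
baseline_units = [100, 104, 97, 101, 99, 103, 98, 102]
current_units = [131, 128, 140, 126, 135, 133, 138, 129]

needs_retraining = drift_score(baseline_units, current_units) > RETRAIN_THRESHOLD
print(needs_retraining)  # True: feed this signal into the retraining pipeline
```

The point is less the specific metric than the feedback loop: the check runs continuously, and its output is an operational event your processes are designed to act on.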
It also means recognizing where and how to capitalize on human judgement efficiently.
Humans in the loop
Every data challenge is ultimately a human one.
Revenue growth is everyone’s job. Internally, CPG revenue management activities and outcomes span both organizational silos (sales, marketing, finance, supply chain management, IT and operations) and levels of altitude (from the global/portfolio level down to individual business units, markets and accounts/stores).
Externally, manufacturers must also achieve alignment with their own customers: retailers. Insights from shopper-level sell-out data offer tools to make negotiations with retail partners more rigorously quantitative and value-focused, and open up new pathways to creative collaboration for shared value creation. These opportunities are compelling, and they underscore the ever-present imperative to deliver win–win strategies across an ecosystem of partners.
The array of markets, channels, retailers and categories that CPG revenue optimization must navigate—an interlocking ecosystem of partnerships, relationships and stakeholders—presents companies with a messy web of competing goals and interests they need to serve. And turning any recommendation into action will ultimately hinge on questions of feasibility, prioritization and relationship management.
Even when data successfully drives your decision-making, the humans in the loop can never be treated as an afterthought.
More to the point, the question of data and how its tooling can fit into, support and enhance the end-to-end decision journeys of the stakeholders consuming it ought to be approached with the same level of intention as any user or customer journey.
Orchestrating data, tools and people
The promise of data-driven CPG planning is compelling and increasingly urgent. But the complexities are just as real, and ever-growing. It’s the challenge of operationalizing complex data at scale, and doing so in a way that balances power with efficiency. It’s the challenge of responding and adapting to unforeseen—but inevitable—change. And, ultimately, it’s the challenge of integrating data and people, tools and processes, into an effective cross-functional business capability.
To answer these challenges, Kin + Carta draws on our legacy of experience and thought leadership in product development, engineering and design: adopting, on the one hand, a product thinking approach to data science and engineering and, on the other, a human-centered commitment to designing holistic product and service journeys.
In our view, these twin paradigms together hold the key to scalable value creation from data insights, not just for CPG revenue management but for any complex data initiative.
A product approach to data engineering
Embracing a product mindset means leveraging all the lessons modern product development has to offer.
It means engineering adaptive systems with continuous learning and iteration baked into the development process.
It means designing scalable but flexible architectures built on microservices. It means adopting CI/CD pipeline automation, continuous monitoring and other modern DevOps practices.
It ultimately means shipping tools that offer cross-functional teams self-service data insights, highly available and tailored to their needs.
And this equation goes far beyond just engineering practices.
It also means adopting a product thinking approach across every facet of the data product lifecycle. Product thinking means understanding the goals and needs of your end-users. It means prioritizing against a background of business objectives, constraints and tradeoffs. It means roadmaps that are as responsive and adaptable as the systems they represent.
Applying product thinking to your data systems means looking at them from every angle through the overlapping lenses of business viability, user experience and usability, and technical feasibility. And it all adds up to sustainable value from roadmap to delivery to product support.
A service orientation for seamless data journeys
Successfully navigating such complex ecosystems—indeed, often ecosystems of ecosystems—also means zooming out holistically beyond just the code.
Your success hinges as much on the process design, change management and operational support driving internal adoption as it does on your product engineering itself.
Realizing data-driven decision-making as a cohesive business capability requires thoughtfully embedding your data and tools within their respective business processes. It means coordinating internal and external data processes, deliberately weaving together operational details (both visible and invisible) into an efficient whole.
An ecosystem-level view means envisioning the creation, delivery and support of data insights as a comprehensive end-to-end service within your business. And successfully pulling this off means intentionally mapping out your full data and human journeys and indexing both on critical “moments of truth.”
By situating data and code within a human-centered outlook, a service design perspective helps you leverage technology as a force multiplier, one ultimately in service of your people and your business.