I was queuing for a flight late last year when two people standing behind me started talking about how disappointing their trip had been. They were logistics consultants, and they were lamenting how one of their clients was struggling in the wake of a business process change that another firm - a tech consultancy - had agitated for their mutual client to adopt. The client in question purchased and warehoused perishable items, hundreds of thousands of different SKUs that they distributed to retailers ranging from tiny independents to large global chains. The distribution operation was built on efficiencies: fill delivery trucks to a minimum of 80% capacity and deliver products to customers on routes optimized for time and energy consumption. Cheap distribution keeps product prices down, which makes their retail clients more competitive.

The upstart tech consultants pushed for flexibility: more frequent deliveries made to the same stores throughout the day would keep shelves stocked, better matching stock to store. If there's a run on one item it can be replenished much sooner, resulting in fewer lost sales. Unfortunately, more frequent deliveries required more frequent truck dispatch; trucks could only be dispatched more frequently if they spent less time being loaded with products, so the load level of a truck fell below 50% of capacity. Expedited dispatch also meant ad-hoc rather than fixed routes, which resulted in driver confusion and receiving delays that translated into higher energy and labor costs of distribution. In the end, the intra-day restocking didn't recapture enough of the revenue lost to empty shelves to justify either lower wholesale margins or higher retail prices.
The two standing behind me were exasperated that their client "listened to those [other] consultants!"
Distribution is not a high-value-added function. Distribution can get stuff from one esoteric locale to another, but that isn't the miracle of modern supply chains. The magic they create is doing so for mere pennies. Cheap distribution can make something produced tens of thousands of miles away price competitive with the same something produced a few doors down the street. Distribution is about efficiency, because efficiency translates into price. When you're distributing hundreds of thousands of different SKUs, you're a distributor of commodities, and whether toothpaste or tungsten the vast majority of commodity purchases are driven by price, not convenience. Capturing incremental lost sales sounds like a good idea until it meets the cold, hard reality of the price sensitivity of existing sales.
This got me to reflect: why would anybody in a distribution business agree to do something so patently counter to the fundamental economics of their business model?
They're not a company I'm doing business with, and I didn't strike up a relationship with the frustrated consultants, so I don't know this specific situation for fact. But I've seen this pattern at a number of companies now, and I suspect it's the same phenomenon at work.
The short version: companies have forgotten how they function. They've lost institutional knowledge of their own operations.
We've been automating business calculations since the tabulator was introduced in the late 19th century, and business processes since the debut of the business computer in the late 1940s. Early on, business technology was mostly large-scale labor-saving data processing. It wasn't until the 1950s (well, more accurately, the rise of COBOL in the 1960s) that we really started capturing business workflows in code. Although few could appreciate it at the time, this marked the beginning of businesses becoming algorithm companies: all kinds of rule-based decision making such as accounting were moved into code; complex rules for functions like pricing and time-sensitive decisions such as trading quickly followed suit. As general-purpose business computers became smaller and cheaper in the '70s and '80s, the reach of computer technology spread to every corner of the organization. As it did, business workflows from order entry to just-in-time manufacturing to fulfillment were automated.
The work that had previously taken people days or weeks could be done in minutes or even seconds. People with intimate knowledge of the business were liberated from computational and administrative burden. The business could grow without adding staff, and suffered fewer mistakes for that growth. Computer technology fueled labor productivity throughout the '80s and '90s as more and more business processes were automated.
Then something happened that went largely unnoticed: those business people who had devised and performed the manual processes that defined the software solutions built in the '70s and '80s retired. It went unnoticed because their knowledge was captured in code, and the developers knew the code. And the new business people hired in to replace the old were trained in how to run the code, so there was no interruption in normal business operations.
Then the original developers disappeared, either because they aged out, got better opportunities with other firms (tech people tend to change jobs frequently), or got caught up in the offshoring thing in the early 2000s. No matter how it happened, the original developers left the scene and were replaced by new people.
At this point, the cracks started to appear.
Business people knew what they did with the code and tech people knew what the code did, but neither knew why. While regular operations didn't suffer, irregular operations caused fits because nobody knew what a measured response to them looked like. This led to bad decision-making about the software. Among other things, the new business people didn't know the simple protocols their predecessors followed to contain a crisis. While the new people had the tactical knowledge to execute tasks in the software, they didn't know how to use the software in tandem with manual procedures to efficiently respond to an irregular operating situation. On the other side of the knowledge base, the new tech people didn't know why extreme scenarios weren't accommodated in the code. Again, lacking the meta-knowledge of how to procedurally minimize a crisis inside and outside the software, they had to defend why the software wasn't "robust" enough to deal with the crisis at hand. Since anything and everything can be codified, they had no ready answers, nor could they chart a procedural course outside of code.
With management screaming about escalating costs and poor customer service and making assurances to higher-ups that This Will Never Happen Again, the decision was made for them. So the software bloated with new rules, complexity (If This Then That), and rarely invoked features that made the software very well prepared to respond to the last crisis. Of course, given the nature of irregular operations, it didn't entirely accommodate the next crisis. Thus went the progressive cycle of code bloat and fragility.
Once the old code became so cumbersome and brittle that executives were terrified of it, they were compelled to sponsor re-invention-through-technology initiatives. These immediately defaulted into feature parity exercises because nobody on the re-invention team had sufficient business context to imagine the business differently from how it operated (tactically) today. Because the new generation of business people had never been required to master the business rules in the same way that a London taxi driver had to have the Knowledge, business users were beholden to yesterday's status quo as codified in the software that ran the business. In addition, these replatforming exercises were characterized by a shift in authority: the business people executed the rules, they didn't shape them. Tech people were the ones who manipulated the code behind the rules; they were the new shapers. Tech, not business, is the authority in replatforming initiatives.
The depletion of business knowledge and the shift of the curation of that knowledge from business to tech lead to the scenario described above: no resident adult present who can authoritatively explain why flexibility would blow out the economics of a mature distribution utility. While tech people are great at compilers and build pipelines, they're crap at business economics. Without a meta understanding of business operations, a re-invention or replatforming initiative will be little more than a high-cost, high-intensity exercise that gets the business no further than where it is today.
I've seen plenty of companies where business understanding has been depleted. Re-learning the fundamentals is an expensive proposition. So how do we rebuild a lost institutional memory? We'll look at the ways of doing that next month.