I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Friday, March 31, 2017

Questions of Worth

Price is the critical determining factor in a purchase when I know exactly what I'm buying. If I want a new case for my tablet, and I know the case that I want, it's worth a considerable amount of my time to find the lowest price on offer for that case. A penny saved and all that.

Utility purchases are driven by price sensitivity. If I can't really say that one product is a premium offering relative to another, I'll buy whatever is cheapest. I need calories after I run; a breakfast bar will do, and it doesn't need to be a designer breakfast bar.

While I was writing chapter 3 of my book, Activist Investing in Strategic Software, I spent time researching the rise of centralized procurement departments in the 1990s. De-centralization in the 1980s created inefficiencies in cost management: it wasn't uncommon to find that one division was paying far more than another division for an identically skilled position supplied by the same vendor. Centralized purchasing found efficiencies by standardizing roles and position specifications and granting preferred partner status to contract labor firms. In theory, standardized buying lifted the burden of negotiation from individual department managers and found cost efficiencies for the company. Buyers could define what they were buying in more atomic units; sellers swapped margin for volume.

And tech labor became a utility.

Procurement's ascendance didn't create industrial IT (there were already willing buyers and sellers of narrow skill-sets), but it certainly threw copious amounts of fertilizer on it. Within a few years, we saw significant expansion of contract labor firms (or "services", or "consulting", whichever you prefer): firms like Accenture and Infosys grew rapidly, while firms like IBM ditched hardware for services. Buying became an exercise in sourcing the lowest unit cost any vendor was willing to supply for a particular skill-set. Selling became a race to the bottom in pricing. In this way, tech labor was cast as a utility, like the indistinguishable breakfast bar mentioned above.

In captive IT, the notion of a "knowledge worker" that came to prominence in the 1980s was trampled by the late 1990s. Knowledge workers are a company's primary labor force, but through the miracle of standardization, tech people became collections of skills, and subsequently interchangeable go-bots. By extension, tech became a secondary labor force to its clients. Labor extracted rents from the client for which it toiled, but labor had no equity in the outcomes that it achieved. Tech labor was wage work. It might be high-priced wage work, but it's wage work nonetheless.

With all cash and no equity, employees now had clear rules of the game, too. Certifications became the path to higher salaries. It didn't matter whether you were competent; what mattered was that Sun certified you as a Java developer, the Scrum Alliance as a Scrum Master, PMI as a Project Manager, or any employer as a Six Sigma black belt. In exchange for minor rent extraction by agencies exploiting an industrialized labor market, buyers received third-party reinforcement of their contract labor model.

With all the ink being spilled on subjects that managers of enterprises like to traffic in - things like Agile delivery, product organizations, platforms, disruptive technologies, and the idea economy (obviously, some more meaningful than others) - it's difficult to understand how companies still choose to source labor like it's 1997. The people I need to build long-lived products on my-business-as-a-platform-as-a-service using emerging technologies don't fit any definition of standard procurement. These aren't left-brain skills; they're right-brain capabilities. If you buy the cheapest knob-twisters that money can buy, how could you possibly expect creative thought and innovative output?

At the same time, it isn't that surprising. Procurement sources all kinds of contract labor, from executive assistants to accountants to recruiters. Yes, technologies like Office, SAP and LinkedIn are fantastic, but they're not exactly the equivalent of serverless in tech. If the bulk of the labor you source is check-the-box, why would you expect - or more to the point, how could you be expected to comprehend - that tech is unique? Accounting is, well, accounting after all. It's not a hotbed of innovation. In fact, it's usually bad news when it is a hotbed of innovation. "Innovation" in tech is - particularly for non-tech managers and administrators - just a buzzword.

In enterprises with dominant procurement functions, "worth" is a function of "cost", not "outcome". If we rent labor on the basis of how much a unit of effort denominated in time will cost, the "worth" of a development capability is the sum of the labor times its unit cost. We therefore value scale because we assume productivity is an implied constant. If we don't understand sausage-making, we simply assume that more gears in the sausage-making machine will yield more sausage. We fail to appreciate the amount of energy necessary to drive those gears, the friction among them, and the distance those gears create between hoofed animal and grill-ready skinned product.

Thus we end up with a payroll of hundreds doing the work of dozens.

Our economic definition of "worth" precludes us from understanding what's going on. We have the labor, so it must be some sort of operational deficiency. We look to process and organization, coaches and rules. All of which is looking in the wrong place. We're not a few coaches and a little bit of process removed from salvation. We staffed poorly, plain and simple.

What a development capability is "worth" has to be correlated to the value it yields, not metered effort or even productive output. Something isn't "worth" what we're willing to pay for it, but what it would cost to replace it with something that provides the same degree of satisfaction. If we're getting the output of dozens, we should be paying for dozens. The capability of high-yield dozens will be more dear on a unit cost basis. But clear accounting of systemic results will favor the cost of polyskilled dozens over locally optimized, low-capability, monoskilled masses.
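
To put rough numbers on it - every figure below is invented purely for illustration - here is a small sketch, in Python, of the two definitions of "worth":

    # Illustrative arithmetic only: headcounts, rates, and hours are made up.
    def annual_cost(headcount: int, hourly_rate: float, hours_per_year: int = 2000) -> float:
        """Cost-based 'worth': headcount x unit cost x metered time."""
        return headcount * hourly_rate * hours_per_year

    # A locally optimized, monoskilled payroll of hundreds...
    industrial = annual_cost(headcount=200, hourly_rate=45)     # $18.0M per year

    # ...versus polyskilled dozens at a dearer unit cost.
    professional = annual_cost(headcount=24, hourly_rate=120)   # $5.76M per year

    # If both deliver the same business outcome, their outcome-based "worth" is
    # identical - yet the cost-based view prices the larger team at roughly 3x.
    print(f"industrial:   ${industrial:,.0f}")
    print(f"professional: ${professional:,.0f}")
    print(f"cost ratio:   {industrial / professional:.1f}x")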

This is the economics of "worth".

Tuesday, February 28, 2017

Our Once and Future Wisdom: Re-acquiring Lost Institutional Knowledge

Last month we looked at the loss of institutional memory and the reasons for it. This month, we look at our options for re-acquiring it.

The erosion of business knowledge is not a recent phenomenon. Management textbooks dating at least as far back as the 1980s included stories of employees performing tasks whose purpose they didn't really understand. The classic reference case was usually some report people spent hours crafting every month and distributed to dozens of managers and executives, none of whom read it because they didn't know what it was for. Those execs never put a stop to it because they assumed another exec knew why it was important. Then, during the much-anticipated system replacement, some business analyst tracked down the person who wrote the report specs so long ago; after he was done laughing, he told the business analyst that the crisis that triggered the need for the report had ended many years earlier, and that he couldn't believe they were still wasting time producing it.

This story always seemed apocryphal - of course that could happen, but people are smart enough that it wouldn't really happen - until I saw it firsthand at an investment bank just six years ago.

Natural attrition (retirement) and forced attrition (layoffs) have long robbed companies of their knowledge workers. The rise of automation has simply made their loss more acutely painful. Knowledge appears in the accounts only as a cost - the salaries of experienced and tenured employees on the income statement; the value of their knowledge has no representation on the balance sheet. Extracting greater cash flows through payroll reduction is value-destructive in ways that accountants cannot (or at any rate, do not) measure.

If we have a business that hasn't yet gone full zombie, one we want to pull back from the brink, what can we do to re-build business knowledge? There aren't a lot of high cards we can draw, but playing them in the right combination offers us a strategy. None of these are discrete solutions; they are a collection of non-mutually-exclusive tools that we can use to bridge a knowledge gap.

Tool 1: Dolly the Sheep

Companies that are heavily rule-based - think insurance - eagerly moved their business rules into code. Those rules were easy to move into code; they're just as easy to move back into a human-readable format. Hire some developers fluent in a legacy technology, make sure you have an objective way of auditing their extraction of the rule base, and identify a cadre of employees who can understand those rules well enough to more comprehensively catalog and contextualize them. It's cheap (people paid to document code will be less expensive than people paid to create code), it's hygienic (preservation of business information is a good thing), and it makes our business rules accessible to a wide audience spanning business users, managers, business analysts, and quality assurance engineers.
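
As a sketch of what an audited extraction might produce - the rule, the field names, and the source reference here are all hypothetical - each rule comes out of the code with a traceable source and a slot for business context that people still have to fill in:

    from dataclasses import dataclass

    @dataclass
    class ExtractedRule:
        rule_id: str
        source: str             # where in the legacy code the rule was found
        condition: str          # the rule as extracted, in plain language
        action: str
        business_context: str = "UNKNOWN - to be supplied by domain reviewers"

    # One hypothetical rule lifted from a legacy rating program.
    catalog = [
        ExtractedRule(
            rule_id="UW-0042",
            source="POLICYRT.CBL, paragraph 2300-RATE-SURCHARGE",
            condition="insured age > 75 AND policy type = 'TERM'",
            action="apply a 1.35 premium surcharge factor",
        ),
    ]

    for rule in catalog:
        print(f"{rule.rule_id}: IF {rule.condition} THEN {rule.action}")
        print(f"  context: {rule.business_context}")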

Of course, this is data, not information. A working foundation of facts is better than none, but facts are of limited value without context. And, while it's easy to reverse-engineer facts like rules, it's not so easy to forensically reconstruct the business contexts that encapsulate those rules. A clone of something extinct - our lost business knowledge - runs the risk of suffering severe defects. For example, ghost code - code that is not commented out but that can never actually execute because its conditions are never met - is likely to be confused for real code in a reverse-engineering exercise. The facts are fantastic to have, but facts are not knowledge.
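
A contrived example of the kind of ghost code that can fool a reverse-engineering pass - the flag and the rule are invented, but the pattern is common:

    # The flag below was hardcoded off after some long-forgotten cutover, so the
    # branch it guards can never execute - yet nothing about it looks dead, and a
    # rule-extraction pass would happily catalog it as current business policy.
    LEGACY_MIGRATION_MODE = False

    def shipping_surcharge(order_total: float) -> float:
        if LEGACY_MIGRATION_MODE and order_total > 10_000:
            # Ghost code: a fossil of an old migration, not a rule the business
            # actually applies today.
            return order_total * 0.05
        return 0.0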

Tool 2: Seek the Jedi Masters

Somebody (well, somebodies) figured out how to automate the business. There are people behind the systems to which we're bound today. Why not put them back on the payroll? If they're still alive (always a good start), local and accessible, and grateful to the company for the income that put food on their table and their children through college, welcome them home. Techniques like value stream mapping bring them back to a business-operations mindset, allowing the business "why" in their heads to be extracted in a structured and coherent manner.

Of course, this isn't as simple as it sounds. Former colleagues won't come cheap. A knowledge worker who was forced out years ago may not feel inclined to share the wealth of their knowledge. The business will have evolved since the time these knowledge workers left. Corporate policies may also interfere with a re-recruitment campaign: one company I worked with forbade engaging contractors for more than 24 months, while another forbade contracting former employees at all.

You could also hire people who work for a primary competitor: in his book The Competitive Advantage of Nations, Michael Porter pointed out how industries tend to form in clusters, so if you're in an industry that isn't post-consolidation there's a good chance you've got a direct competitor nearby, offering a source of business knowledge you can recruit from. Again, this isn't as easy as it sounds. It's hard enough determining whether the people in our own business really understand the business "why" behind the things that they do, or whether they just know the complex motions they go through. It's even harder to do that with people grounded in another company's domain: if our business knowledge is in short supply, we won't have the fluency to ask the abstract questions that gauge their comprehension of the business; plus, we may use fundamentally different language to describe the same operations. If their knowledge is too finely grained - that is, too specific to the context of our competitor - it won't travel: they're a subject matter expert in our competitor's operations, not in the industry. And if our loss of business fluency was the result of corporate blood-letting, it's highly likely that our competitor up the street has done much the same, and will be no richer in domain expertise than we are.

One final word of caution: we have to challenge the "why" the experts give us. Ten years ago, I was leading an inception for a company replacing its fleet maintenance systems. The existing system was a combination of a custom AS/400-based RPG system (that had started life on a System/34), a fat-client Visual Basic application, and thin-client Java tools, all of which required manual (operator) steps for data integration with one another. The user got to a step in the workflow in the VB application, then transferred data updates to the AS/400 and resumed there, then transferred data updates to the Java application and resumed there, all over a period of days or even weeks, and often going back and forth. Their experts genuinely knew their business, but they had grown so accustomed to the data transfer steps that those steps ended up baked into the initial value stream maps we produced. It took a lot of challenging the "why" on those specific portions of the value stream before they understood how a simple shared database would eliminate lots of no-value-added inventory control steps.

Still, maintaining a connection with the people who were there at the creation helps us identify the things we most need to know if we're going to evolve or pivot from what we have today. In much the same way that air traffic controllers are trained to manage traffic manually in the event the software fails them at a critical moment, former knowledge workers can help re-build our knowledge from the ground up.

Tool 3: Buy Before you Try

If you're on your way to becoming a zombie company, why not eat someone else's brain? Re-constructing a lost capability is expensive, so buying a competent operating company - along with its digital assets - is a shortcut. This assumes that you as the buyer can make an informed decision about the competency of the people you're acqui-hiring. It also assumes that the people in the acquired company stick around after the acquisition.

A reverse-acquisition can take one company's girth and bloat and wed it to another company's core nimbleness and agility. But M&A is ego-driven: the CEO or board member who wants to do a deal will see the deal through regardless of the state of the acquirer or the target. A few years ago, I worked with a holding company that had bought two competing firms that collected data about banks and sold it on a subscription basis. As their product was becoming digital, the value of the data they sold was plummeting (as most data tends to do when it becomes digital), so we helped them define a strategy to combine the companies and transform them from providers of data to providers of digital tools. Three days into the inception, we were frustrated that the workshops had ended up with incomplete and unsatisfactory levels of detail. We hypothesized that the reason was that the experts weren't all that expert. On day 4 we ran a series of experiments in our workshops to test this hypothesis, and in the process confirmed that the activities they performed in the acquisition, curation, publication and distribution of the data they sold were performed for reasons of rote, not reason. The inception was successful in that it exposed an inability to execute on the strategy in the manner they had hoped, which led to an entirely different approach to execution.

Buying is a shortcut, and as Murphy's Law teaches us, a shortcut is the longest distance between two points.

More modestly, we can simply license technology to replace major portions of legacy systems, and train or hire experts in that technology. This, though, substitutes solution knowledge for business knowledge, and the former isn't necessarily a proxy for the latter: even though commercial ERP systems have largely replaced home-grown ones, those commercial solutions are highly customized to the needs of each business.

Tool 4: Play Calvinball

Business media barrage us with exhortations to be internal drivers of digital "disruption" because it puts our competitors at a disadvantage, challenging their leadership by forcing them to chase after us. But disruption is also a means of rebuilding lost business knowledge: if we change the rules of the game, we're less restrained by our current assets and procedures. The more we can change, the more we set a new agenda in the competitive landscape. Ideally, we should be playing Calvinball, making up the rules as we go along.

Disruption is a tool, not a solution. Disruption may be constrained by prevailing legislation and regulation, and regulators tend to look at established firms differently from upstarts - if they look at the upstarts at all. In the wake of the 2008 financial crisis, bank lending declined in response to higher capital requirements against risk-weighted assets and tighter lending standards; marketplace lenders skirted balance sheet restrictions and lending regulations simply by not being chartered banks. That allowed marketplace lenders to underwrite loans with much more flexibility than a bank. The door to this type of disruption was closed to banks. As with Calvinball, it is the player with the ball who makes the rules, and banks (like many other regulated businesses) aren't the ones holding the ball.

Plus, when we build on existing business rules rather than replace them, we haven't escaped our dependency on fundamental knowledge that we don't have. Re-imagining how an existing offering is packaged, distributed or even consumed doesn't alleviate the need to understand the core characteristics of that offering.

Making the Best of a Bad Hand

Re-gaining lost business knowledge is a slow, sometimes difficult, and usually expensive proposition. Pushing too hard to re-acquire it is like beginning to learn calculus and non-Euclidean geometry the night before a comprehensive final exam: a grade of D- would be a small miracle. But, since strong business knowledge is key to executing any business strategy pursuing growth or evolution, a grade of D- isn't going to cut it.

Worry less about the slow rate of re-acquisition and think instead about where you want your business fluency to be in 6 months, 12 months, and beyond, and how much more effective your organization will be at those times. That guides the extent to which you employ each of the four techniques described here, and how they get you to a greater state of fluency so that you can operationalize the business strategy. For example, contracting legacy-language developers to capture encoded logic and hiring in a couple of retired employees for value stream mapping sessions, all in exchange for donuts and a fat payday for a few months, may be an effective and inexpensive precursor to an acquisition, or provide suitable grounding to initiate disruptive change that re-writes the rules of an industry.

This requires us to prioritize organizational learning alongside operating performance and delivery goals. The latter two are quantifiably measurable and glare at us from our financial statements; the former is not and does not. A commitment to learning is an investment that needs board-level visibility and air cover: without the learning there is no execution, and without the execution the strategy is just an elaborate PowerPoint. Board-level patience isn't infinite, so in exchange for an investment in learning, line management will have to commit to strategic execution - even if it has to commit to execution before it has re-learned as much as it would like.

The alternatives are to be acquired (sooner rather than later, at peak value for the book of business the company still commands) or to slide into slow obsolescence (and concomitant market irrelevance). Since it gives the people in the company a fighting chance, trading a commitment to learn for a commitment to strategic execution is a fair exchange.

Tuesday, January 31, 2017

Where Has All the Business Knowledge Gone?

I was queuing for a flight late last year when two people standing behind me started talking about how disappointing their trip had been. They were consultants in logistics, and they were lamenting how one of their clients was struggling in the wake of a business process change that another firm - a tech consultancy - had agitated for their mutual client to adopt. The client in question purchased and warehoused perishable items, hundreds of thousands of different SKUs that they distributed to retailers ranging from tiny independents to large global chains. The distribution operation was built on efficiencies: fill delivery trucks to a minimum of 80% capacity and deliver products to customers on routes optimized for time and energy consumption. Cheap distribution keeps product prices down, which makes their retail clients more competitive. The upstart tech consultants pushed for flexibility: more frequent product deliveries made to the same stores throughout the day would keep shelves stocked, so the distributor could better match stock to store. If there's a run on one item it can be replenished much sooner, resulting in fewer lost sales. Unfortunately, more frequent deliveries required more frequent truck dispatch; trucks could only be dispatched more frequently if they spent less time being loaded with products, so the load level of a truck fell to below 50% of capacity; expedited dispatch also meant ad-hoc rather than fixed routes, which resulted in driver confusion and receiving delays that translated into higher energy and labor costs of distribution. The intra-day re-stocking didn't recapture enough of the revenue lost to empty shelves to justify either lower wholesale margins or higher retail prices.

The two standing behind me were exasperated that their client "listened to those [other] consultants!"

Distribution is not a high-value-added function. Distribution can get stuff from one far-flung locale to another, but that isn't the miracle of modern supply chains. The magic they create is doing so for mere pennies. Cheap distribution can make something produced tens of thousands of miles away price competitive with the same something produced a few doors down the street. Distribution is about efficiency, because efficiency translates into price. When you're distributing hundreds of thousands of different SKUs, you're a distributor of commodities, and whether toothpaste or tungsten, the vast majority of commodity purchases are driven by price, not convenience. Capturing incremental lost sales sounds like a good idea until it meets the cold, hard reality of the price sensitivity of existing sales.
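
A back-of-the-envelope sketch makes the point; every number below is invented, since I don't know the client's actual figures:

    # Hypothetical figures only: truck capacity, dispatch cost, and load factors
    # are made up to illustrate the trade-off, not taken from the client.
    TRUCK_CAPACITY_UNITS = 10_000
    COST_PER_DISPATCH = 400.0    # assumed driver, fuel, and handling cost per run

    def cost_per_unit(load_factor: float) -> float:
        """Distribution cost per unit delivered falls as the truck gets fuller."""
        return COST_PER_DISPATCH / (TRUCK_CAPACITY_UNITS * load_factor)

    old = cost_per_unit(0.80)    # routes built around a minimum 80% fill
    new = cost_per_unit(0.50)    # flexible intra-day dispatch, under 50% fill

    print(f"cost/unit at 80% fill: ${old:.4f}")    # $0.0500
    print(f"cost/unit at 50% fill: ${new:.4f}")    # $0.0800
    print(f"added cost per unit:   ${new - old:.4f}")

    # The added cost lands on every unit shipped, while the recovered margin only
    # accrues on the units that would otherwise have gone unsold - a steep hurdle
    # for a price-sensitive commodity business.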

This got me to reflect: why would anybody in a distribution business agree to do something so patently counter to the fundamental economics of their business model?

They're not a company I'm doing business with, and I didn't strike up a relationship with the frustrated consultants, so I don't know this specific situation for a fact. But I've seen this pattern at a number of companies now, and I suspect it's the same phenomenon at work.

The short version: companies have forgotten how they function. They've lost institutional knowledge of their own operations.

We've been automating business calculations since the tabulator was introduced in the late 19th century, and business processes since the debut of the business computer in the early 1950s. Early on, business technology was mostly large-scale labor-saving data processing. It wasn't until the rise of COBOL in the 1960s that we really started capturing business workflows in code. Although few could appreciate it at the time, this marked the beginning of businesses becoming algorithm companies: rule-based decision making such as accounting was moved into code; complex rules for functions like pricing and time-sensitive decisions such as trading quickly followed suit. As general purpose business computers became smaller and cheaper in the '70s and '80s, the reach of computer technology spread to every corner of the organization. As it did, business workflows from order entry to just-in-time manufacturing to fulfillment were automated.

The work that had previously taken people days or weeks could be done in minutes or even seconds. People with intimate knowledge of the business were liberated from computational and administrative burden. The business could grow without adding staff, and suffered fewer mistakes for that growth. Computer technology fueled labor productivity throughout the '80s and '90s as more and more business processes were automated.

Then something happened that went largely unnoticed: the business people who had devised and performed the manual processes that defined the software solutions built in the '70s and '80s retired. It went unnoticed because their knowledge was captured in code, and the developers knew the code. And the new business people hired in to replace the old were trained in how to run the code, so there was no interruption in normal business operations.

Then the original developers disappeared, either because they aged out, got better opportunities with other firms (tech people tend to change jobs frequently), or got caught up in the offshoring wave of the early 2000s. No matter how it happened, the original developers left the scene and were replaced by new people.

At this point, the cracks started to appear.

Business people knew what they did with the code and tech people knew what the code did, but neither knew why. While regular operations didn't suffer, irregular operations caused fits because nobody knew what a measured response to them looked like. Those fits led to bad decision-making about the software. Among other things, the new business people didn't know the simple protocols their predecessors had followed to contain a crisis. While the new people had the tactical knowledge to execute tasks in the software, they didn't know how to use the software in tandem with manual procedures to efficiently respond to an irregular operating situation. On the other side of the knowledge base, the new tech people didn't know why extreme scenarios weren't accommodated in the code. Without the meta-knowledge of how to procedurally minimize a crisis inside and outside the software, they had to defend why the software wasn't "robust" enough to deal with the crisis at hand. Since anything and everything can, in principle, be codified, they had no ready answers, nor could they chart a procedural course outside of the code.

With management screaming about escalating costs and poor customer service, and making assurances to higher-ups that This Will Never Happen Again, the decision was made for them. So the software bloated with new rules, complexity (If This Then That), and rarely invoked features, leaving the software very well prepared to respond to the last crisis. Of course, given the nature of irregular operations, it wasn't particularly well prepared for the next one. Thus went the progressive cycle of code bloat and fragility.
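
The shape of that bloat is familiar; as a contrived sketch (the flags and scenarios are invented), each crisis leaves behind its own rarely exercised branch:

    def allocate_inventory(order_qty: int, *,
                           blizzard_2010_mode: bool = False,
                           supplier_recall_2013_mode: bool = False,
                           port_strike_2015_mode: bool = False) -> str:
        # Each branch encodes the response to one past crisis; none of them helps
        # with the crisis the business hasn't seen yet.
        if blizzard_2010_mode:
            return f"hold {order_qty} units for northeast priority routing"
        if supplier_recall_2013_mode:
            return f"quarantine {order_qty} units pending supplier audit"
        if port_strike_2015_mode:
            return f"reroute {order_qty} units via inland hub"
        return f"allocate {order_qty} units via standard routing"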

Once the old code became so cumbersome and brittle that executives were terrified of it, they were compelled to sponsor re-invention-through-technology initiatives. These immediately defaulted into feature parity exercises, because nobody on the re-invention team had sufficient business context to imagine the business differently from how it operated (tactically) today. Because the new generation of business people had never been required to master the business rules in the same way a London taxi driver has to master the Knowledge, business users were beholden to yesterday's status quo as codified in the software that ran the business. In addition, these replatforming exercises were characterized by a shift in authority: the business people executed the rules; they didn't shape them. Tech people were the ones who manipulated the code behind the rules; they were the new shapers. Tech, not business, is the authority in replatforming initiatives.

The depletion of business knowledge, and the shift of the curation of that knowledge from business to tech, lead to the scenario described above: no resident adult present who can authoritatively explain why flexibility would blow out the economics of a mature distribution utility. While tech people are great at compilers and build pipelines, they're crap at business economics. Without a meta understanding of business operations, a re-invention or re-platforming initiative will be little more than a high-cost, high-intensity exercise that gets the business no further than where it is today.

I've seen plenty of companies where business understanding has been depleted. Re-learning the fundamentals is an expensive proposition. So how do we re-build a lost institutional memory? We'll look at the ways of doing that next month.