I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Saturday, September 30, 2017

Innovation Exhaustion

"We've tried nothin' and we're all out of ideas."

-- Ned Flanders' father, "The Simpsons", season 8, episode 8: "Hurricane Neddy"

We're constantly being told by the popular business press that we live in an "ideas economy," where survival is a function of disruption because consumer behaviors and emerging technologies are conspiring to obsolete the economics of established businesses. There are plenty of examples - music publishing, mass-market retailing, local transportation - where new entrants have left creative destruction in their wake.

Management consultants love to trot this stuff out, because fear of the unknown (who will destroy your company?) twinned with the tantalizing prospect of runaway riches (you could be the next air-b-n-amazon-uber-twit-book!) makes for eager and pliable clients.

What those management consultants don't tell you is that it's expensive being in the ideas business. R&D isn't cheap: there is far more demand for engineering labor than there are engineers to hire. And a lot of R&D is a dead end: you have to try many things before you find something that pays for itself. Costly research that tells you only what not to do is cold comfort when you're trying to figure out what you should be doing.

It also appears that the economics of the "ideas economy" have been slowly eroding for a very long time. According to one recent paper, the number of people working in research has grown at a much faster rate than the economy itself. Consider semiconductors: "The number of researchers required to double chip density today is more than 18 times larger than the number required in the 1970s." By comparison, prices in general have risen only about 6.5 times since 1970. Yowza.
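For the back-of-the-envelope inclined, the gap between those two multiples can be sketched in a few lines. This uses only the round numbers quoted above, and treats research headcount as a crude proxy for nominal research cost:

```python
# Back-of-the-envelope: compare the growth in research headcount needed to
# double chip density against the rise in the general price level since 1970.
# Both multiples come from the figures quoted above; illustrative only.
researcher_multiple = 18.0   # researchers per density doubling, 1970s -> today
inflation_multiple = 6.5     # rise in the general price level since 1970

# Even after deflating by inflation, the effort per doubling has grown
# several-fold in real terms.
real_cost_multiple = researcher_multiple / inflation_multiple
print(round(real_cost_multiple, 1))  # -> 2.8
```

Even in inflation-adjusted terms, each doubling now takes nearly three times the research effort it once did - the paper's point in miniature.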

Of course, that could mean there are too many slackers in research jobs, or that we have more eggheads than our economies can afford. But the real culprit seems to be a scarcity of ideas: they're just getting harder to find. As Izabella Kaminska wrote in the FT, "if research productivity is declining it stands to reason it is being offset by increased research effort. This essentially implies that it is getting harder to find new ideas as research progresses."

A big reason for this is economic maturation. The early stages of the industrial revolution and the microcomputer revolution produced far more impressive productivity gains than their later years did: once the factories were mechanized and the back-office accounting computerized, the big and easy gains had been made.

But even Amazon is showing signs of innovation fatigue. In the last 6 years, sales are up 5x, but employee headcount is up 10x. Liabilities are growing as fast as cash, suggesting free cash flow isn't improving with time. And, Amazon's growth rate is far lower than what Wal-Mart's was at a similar point in its history. If growth has slowed, capital intensity is up, and total labor spend is up, either platform monopoly economics aren't what we think they're supposed to be or they will take a very, very long time to materialize. This isn't to say Amazon isn't going to grow, or be a threat to traditional retailers and other industries, but it is to say that even Amazon is showing evidence of idea exhaustion.
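Those round multiples imply a simple, if sobering, bit of arithmetic (illustrative only; these are the approximations above, not audited figures):

```python
# If sales grew 5x while headcount grew 10x over the same six years,
# then revenue per employee fell by half. Multiples as quoted above.
sales_multiple = 5.0
headcount_multiple = 10.0

revenue_per_employee_multiple = sales_multiple / headcount_multiple
print(revenue_per_employee_multiple)  # -> 0.5
```

Revenue per employee falling by half over six years is not what maturing platform monopoly economics are supposed to look like.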

What about the major disruption that appears to be on the horizon, like distributed ledger technology?

Blockchain could eliminate redundancies across companies, reduce fees for simple transfers, and usher in all kinds of innovation. It can, but the economics won't materialize as rapidly as they did for the disruptive ideas of yore. As long as the network is stubbornly difficult to secure and access, trust will remain with the institutions using the network, not the network itself. As long as trust remains with institutions and not the network, those institutions will have no choice but to maintain their own ledgers for a long time. That means that companies in ecosystems that adopt distributed ledger technology will find opportunities for innovation and gain some efficiencies, but will not be able to exploit its full potential for quite some time. Innovation and productivity from disruptive ideas, while still present, will fall short of potential.

This is the storyline with all emerging disruptive technologies. We may get autonomous long-haul trucks but we will still require drivers in the cab, we have shared ledgers but a lot of that data will remain duplicated throughout consuming organizations, we allow initial coin offerings but we regulate them as securities. There are economic benefits, sure, but the economic windfall they promise is just out of reach. That revolutionary new economy is delayed at the airport.

The chattering classes are telling us that we live in an "ideas economy" a full half-century after it was ripe to traffic in ideas. A more appropriate term might be the "ambition economy", because to reap the benefits of the possible requires a significant break away from the known and familiar. That's more than innovation driven by a single firm; it requires moving ecosystems of consumers, suppliers and regulators.

For those caught in the crossfire - firms that don't much like the prospect of winning a participant trophy in a costly innovation arms race, and don't have the gravitas to lead ecosystem change - what alternative do you have? We'll look at the options next month.

Thursday, August 31, 2017

Partners

"Greed and patience don't live together very well."

-- Keith Jackson, ESPN 30 for 30, Who Killed the USFL?

Businesses rely on a network of suppliers to operate and grow: providers of components, back-office operations, distribution, marketing, retail, information technology, and even office supplies. They do this for a variety of reasons: areas of specialty (assembling large finished goods is different from manufacturing small, precision components), depth of expertise (some companies are better at selling things than making things), accessibility of labor (it is difficult to hire in a location where too many jobs chase too few employees), and appeal to people with a given skill set (a wholesaler doesn't offer much career growth for an attorney).

All relationships, whether personal or commercial, are based on need. A buyer looking for widgets will go to another supplier if they can't get the widgets they're looking for. Similarly, a seller may choose not to sell widgets to a cheapskate buyer, and will find other customers instead.

Although rationally the statement of cash flows should triumph in commercial relationships, we're very often asked to extend our balance sheets to help someone else. In a commercial context, this is the point at which terms like "supplier" and "vendor", "client" and "customer" are ditched in favor of the aspirationally higher ground implied by the word "partner". Rather than evaluating transactionally (this relationship comes at a high cost to me), we evaluate strategically (this relationship is important to me).

Choosing to underwrite a shortcoming in a relationship is a leap of faith that there will be tangible or intangible rewards for doing so. The executive who keeps changing the specifications but always gives a glowing recommendation; the company that provides your firm with annual revenue if not timely cash flows. A partner puts up with deficiencies because they get much more out of the relationship.

Partnership, then, encompasses more than a relationship of need: it is a relationship that both parties find worth making sacrifices to sustain. When we partner, we each agree to the ebbs and flows in the relationship - "in sickness and in health" - and that we will not merely tolerate, but accommodate. A seller that has to roll somebody off a team because they can't travel; a buyer that has to reduce its spend. In these situations, a partner sets aside the short-term impairment for the long-term benefits of continuity and consistency.

Of course, there are more benefits than merely convenience. Each partner changes independently, and those changes keep the partnership relevant and fresh. In the process, each learns continuously from the other, evolves what they do and matures how they do it. Strong partnerships make stronger individuals.

Partnership implies equivalency. Yet the commercial world is full of alleged "partnerships" that are superior-subordinate, making them inherently unbalanced. Companies stay in condescending or even abusive relationships because they're afraid of the uncertainty of the alternative. Sellers do this because suckling at the teat of easy revenue is far easier than hustling new business. Buyers do this because they feel held hostage by a supplier. Even though it comes at a high commercial cost (squeezed margins) and high human cost (second class status and compromised careers for those involved), such business "partnerships" can last for a long time.

Egalitarian partnership, then, is more often wishful thinking than willful practice.

Whatever else they may do, partners do not try to get the better of one another. If one party feels it has to out-maneuver the other in every contract negotiation, pad or dispute every invoice, cast doubt on quality or contribution well after delivery as a means of finagling a discount, or flout payment terms, it isn't a partnership. This isn't competition that makes for stronger individuals and better outcomes; it's subversion that prioritizes individual gain over mutual outcome.

There's nothing wrong with transactional relationships, and if we're honest, most commercial engagements don't have the potential to become genuine partnerships. Partnership is investment, and like all investments, there are only so many you can make and maintain. Over-using the term and confusing one type of relationship for another does the people and companies you do business with a disservice, because it implies a commitment to them that you're not making. Transact faithfully with all (the world is a better place when it gets by on trust), and partner intensely with those who equally benefit from your association.

Monday, July 31, 2017

Invest in What You Know

Every day, millions of people buy expensive things they don't know much about: cars and residential homes, enterprise software and entire enterprises. Having a deep pocket - or investiture by people with deep pockets - is the only qualification required for an individual to have buying authority. As we saw previously, emotions have a share - often a disproportionate one at that - in buying decisions. This makes value a relative rather than an absolute concept, and absurd as a summable metric.

When purchases get large, we re-cast them as investments. As assets, acquisitions appreciate in value on their own (e.g., real estate) or they enable us to derive greater economic value than we otherwise would without them: a truck depreciates in value, but it is inefficient to run a flower delivery business without a truck, so having a truck on the asset line of the balance sheet boosts revenue on the income statement. Unfortunately, a lack of expertise in the things that we buy tends to give non-economic factors an important role in the decision. We may know horticulture and the aesthetics of flower arrangement but not know much about forecasting operating costs and reliability in city driving, so our business investment comes down to factors like style, comfort, or just liking one salesperson over another.

Purely financial investments aren't immune to this, either. We're not experts in industrials or tech firms or utilities or the ETFs that collect them, so we develop criteria (consistent dividends, revenue growth), create justification frameworks (safety, income), and consult experts (research firms), but in the end we follow our emotions (I soooooooo love their products I'll park my IRA in their stock). We want to equate investing with rationality, but a lack of expertise - and the pressure to make investments - make it anything but rational.

Many people in the tech industry - myself included - have advocated recasting technology as a financial phenomenon that yields returns rather than an operating cost to be minimized. Well, more accurately, recasting some portion of technology this way. We don't need to measure return-on-the-time-and-expense-system: it's a tax on our business and all we want to do is pay as little per staff member as we possibly can. But we can't expect to create high-risk call options (R&D) or make strategic capital allocations (platforms) with stay-in-your-swim-lane staffing and structure. Form follows finance: because of the outsized effect that finance has on operations, we start by changing the funding model, which clears the path for new structure and process. Follow the money.

If tech is going to function as a financial rather than an operating phenomenon, it must take its guiding principles from financial investing. Benjamin Graham implied and Peter Lynch practiced the idea that you should only invest in what you know. Get to know the industry dynamics, the company in particular, and the people operating it before pledging any capital. You'll still have disappointments, but far fewer surprises. This separates thoughtful investing from reckless gambling.

In technology, "investing in what you know" requires substantial business domain knowledge and tech fluency, with generous helpings of behavioral science and economics. Successive waves of efficiency gains mean we can't take intimate business knowledge for granted any more. All the organization, process, and ceremonies won't compensate for a lack of these things, the evidence of which is seen in the reference cases of Product organizations that create confusion rather than cohesion, and in the large replatforming initiatives that require additional cash calls and goal reductions to be deemed successes. If we're going to "invest in what we know", the leadership imperative is to secure the fundamentals so that we have the basic competencies in place.

But that presents us with a recursive investing challenge. Developing the capability to competently invest in technology is an investment itself, and must be held to the same standard: are we investing in something we know? Do we know what we're looking for in that investment into capability? Or will investments in our future leaders be more emotional than rational?

Friday, June 30, 2017

The Value Myth

When we think about value, we think in terms of hard measures like increasing revenue or decreasing cost, or soft measures like increasing customer satisfaction or reducing customer friction. This all sounds great, but we know in practice that value is not as concrete as we would like to believe: projections are conjecture, multiple forces determine the results we get, and counterfactuals can't be tested, so we can never know for a fact whether we'd have been better off doing something different - or nothing at all - given how circumstances played out. Good as it might be that value allows people to relate their actions to hoped-for outcomes, it is naive to think those outcomes will result from the sum of the actions we take. Business is far more complex, and far messier.

Saying otherwise is disingenuous, because it gives business a theoretical tidiness that it simply does not possess. Perhaps this is inevitable when non-business people like program managers (coordinator-administrators) and developers (engineer-nerds) traffic in business concepts (finance). Whatever the reason, it isn't helpful if tech wants to be taken seriously by the professionals - particularly the finance professionals - who run the business. Showing a direct line-of-sight from tech or process to business outcome sets up tech to get played and manipulated. Going from tech to business value in one step is a short-cut to being relegated by the board; it is not a path to business relevancy.

In this series of posts, I've taken a different tack, focusing on value and worth as behavioral rather than economic concepts. It stands to reason that if value and worth are in the eye of the beholder, their definitions will be heavily influenced by individual biases. Success of any business initiative comes down to behaviors, so the better we understand those the better we understand the complexity of what value really is in a complex business context.

Value is different things to different people for vastly different reasons. Consider insurance claims. During a storm, high winds blow a tree down and onto a house, collapsing a section of the roof. Insurance adjusters don't care about the aesthetics of different colored roof shingles used in the repair. The adjuster only cares that the roof is repaired and the house won't be taking on ballast the next time it rains. The insurance adjuster is under orders to repair the house with minimum impact to the insurance company's cash flow, and is therefore focused on the utility (which is easy to quantify), not the aesthetics (which are not). That the first thing any prospective buyer will point out is that those green roof shingles clash with the existing gray roof shingles appears nowhere in the adjuster's "cost of repair" spreadsheet: whether green or gray or fluorescent pink, those shingles will keep out the rain and the snow and the critters. For the insurance company, the asset they insure was repaired with minimum impact on their cash flow.

The same applies to tech. An engineer infatuated with the tech stack supporting a hopelessly implemented feature set. A user clinging to an interface backed by an impenetrable monolith of code. The CFO who is tone deaf to responsiveness and excessive defects, solely because of the price. Try as we like to frame "business value" as an absolute, in practice it is a relative concept, interpreted and reconciled to the motivations and desires of each individual in the value chain. What gets measured is what gets managed, so "business value" becomes the means through which individual value is realized: we need this over-hyped cutting-edge tech stack because it will help us deliver it faster; how conveniently coincident that experience with that over-hyped cutting-edge technology flatters the resumes of the people working on it.

This creates cascading re-interpretations and re-assertions that smother value, ironically justified by the pursuit of value. A firm I audited years ago had, some months prior, formed a cross-functional committee of tech, business and finance to choose a mobile development toolkit. Business believed it needed a mobile solution, tech aspired to create one, but the board didn't share in the enthusiasm. In the end, finance won out, choosing the tool that cost the least but that tech found unstable and yielded software solutions the business didn't much care for. The tool was never used. Instead, developers rolled their own frameworks and infrastructure, below the radar of the CFO and with a wink-and-a-nod agreement with their business partner that they would do so. The purchased framework had negative value to its intended constituents, to a point that tech believed there was more value (and with the complicity of business in the decision, to the business as well) in creating proprietary development infrastructure. Whether spending twice for infrastructure was tech rescuing the "value" jeopardized by a crap product, or tech being intransigent and subversive to the board's agenda, all depends on your definition of value under the circumstances.

We don't win the triple crown of value all that often. Marketing doesn't appreciate losing the pricey boutique firm it could talk to each and every day, but tech looks like stars to the CFO for sending the work to a cheap offshore supplier. The CIO doesn't like being held hostage by employees who used an obscure tech stack in the name of getting something done "faster", only to make it debilitatingly expensive as they exit the firm and go into private practice. Eliyahu Goldratt pointed out the tradeoff between local and systemic optimization a long, long time ago. Local optimization infiltrates every value calculation, in temporal ways that defy models of value.

We want to believe that "value" is an absolute measure of something that improves the condition of the enterprise: In unitate est virtus. But we know that people have different interpretations and goals that materially impact the business outcome, so value is a weighted sum of disparate, unexposed agendas. The larger the enterprise, the more complex the calculus.

Value is money, and where there is money, there is politics. With that in mind, value is perhaps best understood as something Mike Royko taught us about the fundamentals of politics many years ago: Ubi est mea?

"Where's mine?"

Wednesday, May 31, 2017

Questions of Value

In March, we looked at questions of worth. This month, we look at "questions of value".

In the dictionary, value is defined by worth, and worth is defined by value. Why ask the question twice? Because even if they refer to the same thing, the words mean different things in different circumstances. In economic terms, "worth" refers to stored value, such as accumulated financial reserves (one's "net worth") or the price we're willing to pay to replace something we already own. We use the word "value" in reference to economic (or other) power unleashed by something that we have or do. An object has sentimental "value" to which we ascribe an inexplicably high economic "worth". An investment in a truck yields economic "value" on the income statement well above the worth we ascribe to it on the balance sheet, because without it we couldn't achieve delivery efficiencies.

Value traffics in motion, worth in storage.

Value is what we're willing to pay for something in exchange for the returns it provides. We value cars for reasons ranging from their resale value to the status we think they project to the friends and strangers who see us driving them. We value houses for the school districts we can put our kids in, the relative price of houses nearby, their convenience to how we live and make our living, and the status that living in that post code conveys.

In software, we want to make decisions about where we invest based on value. But because we can't predict the future, value is conjecture. This forces us to ask: what defines value? And who defines value?

Value, like love, is a many-splendored thing. There is value derived from features, there is value derived from construction, and there is value amplified by not spending too much. An asset that does many things, is low maintenance, and costs little will be higher yield than one that does few things, is high maintenance, and costs dearly. The problem of defining value is the problem of projection, because there are no absolutes in those projections.

This creates a bit of a problem, because value is a future-tense term, and we can't know with much certainty what the most important characteristics are to realizing that value. This becomes a big problem when we want to "buy for value". Worth is bankable, if vulnerable to erosion; value is in the eye of the beholder and may never materialize.

Consider a house. Buyers define all kinds of evaluation criteria: proximity to public transportation, newer appliances, rooms and a layout that accommodate their possessions and lifestyle. But a house is a building, and its utility is a function of its construction as much as its design. Since most home buyers aren't carpenters or plumbers or electricians, they're not able to judge the quality of the build. They rely on the opinions of experts. Hence we have inspectors, licensed in most states and built into residential contract law, to provide an expert opinion on the house.

Selection criteria and expert opinions only go so far, though. A homeowner doesn't really know if a house is what they want until they've lived in it for a while, and besides, some of their criteria will be contradictory and some of their priorities will be out of order. Inspectors have limited expertise with building codes, practices and materials, and they're only spending a couple of hours looking over the carpentry, masonry, electrical, plumbing and mechanical of an entire building that took hundreds of person-days to build. For all the sweating and scrutiny, at best our opinions tell us what we shouldn't buy; they don't necessarily tell us what we should. All house purchases are compromises, and in the end, the purchase is made for substantially - perhaps even largely - emotional reasons, and complex ones at that.

This applies to all kinds of purchases where "value" is a factor. Like cars: we develop criteria (seats, storage, zero-to-sixty speed), poll experts (trade press like Consumer Reports and Car & Driver), but still make a decision that is partially - even largely - informed by emotions. Look, it's got four doors, space for the kids & clubs, it's fuel efficient, and will you just look at those shouty rims?

Questions of value become even more conflicted when multiple stakeholders have different ways of calculating value, and ambiguous authority in setting it. We'll take a closer look at that next month.

Sunday, April 30, 2017

Concrete Versus Abstract

Until a few years ago, enterprise software development was pretty easy to justify and execute because the income statement was the primary customer. Automating back office tasks, expanding market reach, creating customer self-service tools, even legacy technology replacements were all investments that could be explained in a straightforward manner as taking costs out or capturing revenue that would otherwise have been lost. Business cases weren't all that complex, and results were easy to direct.

The nature of enterprise tech investments has changed. We want consumer-facing tech that is less workflow-driven and more situationally adaptive. We want platforms on which we can quickly build complex applications, not a collection of solutions that we tie together with jumbled integration and overlapping data warehouses. These investments are justified not by cost or revenue, but by a future defined by new consumer behaviors and new services: a future we do not control, over which we cannot judge our influence, and which, by its emergent nature, offers no baseline for measurement. As tech investments become more ambitious, their investment criteria become more nebulous, their justification more speculation and hope, and their actual impact more difficult to trace and validate.

The landscape has become far more abstract, too. For one thing, the business threats have become less physical. A business had months to prepare for a rival building an outlet down the street, but can't immediately see (let alone know how to respond to) a purely digital competitor slowly siphoning away customers. For another, the tech is less tangible. Technology was easier to grasp when you could map software solutions to a handful of rack-mounted servers and desktop PCs; try explaining cloud-based AI architecture to non-tech buyers as anything other than a black box.

Yet enterprise execution remains rooted in the concrete. Companies achieve scale and efficiency through disciplined execution: develop patterns of operations, sweat the details, codify procedures, and spread them through an ever-growing network. The more cookie-cutter, the more predictable; the more efficient, the more cash flow from operations.

I've written before that this encourages heavy levels of debt finance, and that debt finance stifles investment by crowding it out: debt not only consumes cash that could otherwise be used for investment, it discourages high-risk investments with unknown returns. Debt finance binds a company to a tomorrow that is the same as today. Visions are equity plays, not debt ones.

Capital structures aside, there's another aspect to this: enterprise leadership that isn't equipped for the challenge. The enterprise leader must be capable of forming a considered opinion on tech and business matters, to know whether someone is giving them the straight truth or blowing sunshine up their backside. That leader must be able to describe the world through the lens of a plausibly achievable future state, not something that comes across as improbable sci-fi, yet more substantive than a how-things-have-always-been-only-faster state. And that leader must be pragmatic enough to walk the fine line between letting knowledge workers hold the future hostage to their subject matter expertise and overwhelming them with change fatigue, all while maturing them into the future operating leaders of the business.

This requires a leader with depth of knowledge in operational, technical and financial matters. It also requires an ability to think abstractly and translate abstraction into concrete action. He or she will have to articulate an integrated business & technology operating model, restructure finance from cost-driven to investment-driven, and change recruiting and retention and contracting practices. Plus, she or he must be able to explain why the current modus operandi is geared toward running the wrong kind of business (an efficient opco), and what is necessary for it to become the business it needs to be (an efficient opco ingesting the innovation generated by a biz&tech platformco).

This leader cannot over-emphasize one area - the transformation, the vision, the tech, the finance - above others. Doing so creates an imbalance that will lead to organ rejection by the established enterprise. Key enterprise constituencies don't react favorably to having their area of specialization demoted. It starts whispers that "this person leading our so-called replatforming just doesn't get the business" that undermine their leadership.

The hard-driving entrepreneur, the professional administrator, the technology futurist are the wrong leader archetypes. This calls for statesmanship, someone who can project from policy to practice in a complex corporate and competitive landscape, translate goals and changes into multiple business tongues (the executor, the developer, the middle manager), behave diplomatically while remaining above corporate politics, and be patient for new business principles to sprout within the enterprise.

Of course, people who fit this bill are as rare as hen's teeth. Operating companies don't incubate abstract thinkers; they incubate concrete thinkers, because they reward concrete execution. Nor does it help that we've taken authority away from middle managers instead of developing them into the next generation of leaders.

The firm that recognizes the scope of this leadership challenge but can't staff the role from within or without will resort to the multi-headed leadership team (at least one business lead and one tech lead, potentially more from across the corporation depending on the political landscape) and hope that the whole will be equal to the sum of its parts. Conway's Law guarantees that the outcome will be plagued with local optimizations that inhibit - and potentially impair - the hoped-for outcomes.

Ambitious transformative tech investments may be viable, and even necessary for survival. They need vision and execution, cooperation and skills. But they're going absolutely nowhere without the right leaders and leadership.

Friday, March 31, 2017

Questions of Worth

Price is the critical determining factor in purchasing decisions. If I want a new case for my tablet, and I know the case that I want, it's worth a considerable amount of my time to find the lowest price on offer for that case. A penny saved and all that.

Utility purchases are driven by price sensitivity. If I can't really say that one product is a premium offering over another, I'll go cheap. I need calories after I run; a breakfast bar will do; I don't need a designer breakfast bar.

While I was writing chapter 3 of my book, Activist Investing in Strategic Software, I spent time researching the rise of centralized procurement departments in the 1990s. Decentralization in the 1980s created inefficiencies in cost management: it wasn't uncommon to find one division paying far more than another for an identically skilled position supplied by the same vendor. Centralized purchasing found efficiencies by standardizing roles and position specifications and granting preferred-partner status to contract labor firms. In theory, standardized buying lifted the burden of negotiation from individual department managers and found cost efficiencies for the company. Buyers could define what they were buying more atomically; sellers swapped margin for volume.

And tech labor became a utility.

Procurement's ascendance didn't create industrial IT (there were already willing buyers and sellers of narrow skill-sets), but it certainly threw copious amounts of fertilizer on it. Within a few years, we saw significant expansion of contract labor firms (or "services", or "consulting", whichever you prefer): firms like Accenture and Infosys grew rapidly, while firms like IBM ditched hardware for services. Buying became an exercise in sourcing the lowest unit cost any vendor was willing to supply for a particular skill-set. Selling became a race to the bottom on pricing. In this way, tech labor was cast as a utility, like the indistinguishable breakfast bar mentioned above.

In captive IT, the notion of the "knowledge worker" that came to prominence in the 1980s was stampeded by the late 1990s. Knowledge workers are a company's primary labor force, but through the miracle of standardization, tech people became collections of skills, and subsequently interchangeable go-bots. By extension, tech became a secondary labor force to its clients. Labor extracted rents from the client for which it toiled, but labor had no equity in the outcomes it achieved. Tech labor was wage work. It might have been high-priced wage work, but it was wage work nonetheless.

With all cash and no equity, employees now had clear rules of the game, too. Certifications became the path to higher salaries. It didn't matter whether you were competent: Sun certified you as a Java developer, the Scrum Alliance as a Scrum Master, PMI as a Project Manager, any employer as a Six Sigma black belt. In exchange for minor rent extraction by agencies exploiting an industrialized labor market, buyers received third-party reinforcement of their contract labor model.

With all the ink being spilled on subjects that managers of enterprises like to traffic in - things like Agile delivery, product organizations, platforms, disruptive technologies, and the idea economy (obviously, some more meaningful than others) - it's difficult to understand how companies still choose to source labor like it's 1997. The people I need to build long-lived products on my-business-as-a-platform-as-a-service using emerging technologies don't fit any definition of standard procurement. These aren't left-brain skills; they're right-brain capabilities. If you buy the cheapest knob-twisters money can buy, how can you possibly expect creative thought and innovative output?

At the same time, it isn't that surprising. Procurement sources all kinds of contract labor, from executive assistants to accountants to recruiters. Yes, technologies like Office, SAP and LinkedIn are fantastic, but they're not exactly tech's equivalent of serverless. If the bulk of the labor you source is check-the-box, why would you expect - or more to the point, how could you be expected to comprehend - that tech is unique? Accounting is, well, accounting, after all. It's not a hotbed of innovation. In fact, it's usually bad news when it is a hotbed of innovation. To non-tech managers and administrators in particular, "innovation" in tech is just a buzzword.

In enterprises with dominant procurement functions, "worth" is a function of "cost", not "outcome". If we rent labor on the basis of how much a unit of effort denominated in time will cost, the "worth" of a development capability is the sum of the labor times its unit cost. We therefore value scale because we assume productivity is an implied constant. If we don't understand sausage-making, we simply assume that more gears in the sausage-making machine will yield more sausage. We fail to appreciate the amount of energy necessary to drive those gears, the friction among them, and the distance those gears create between hoofed animal and grill-ready skinned product.

Thus we end up with a payroll of hundreds doing the work of dozens.

Our economic definition of "worth" precludes us from understanding what's going on. We have the labor, so it must be some sort of operational deficiency. We look to process and organization, coaches and rules. All of which is looking in the wrong place. We're not a few coaches and a little bit of process removed from salvation. We staffed poorly, plain and simple.

What a development capability is "worth" has to be correlated to the value it yields, not to metered effort or even productive output. Something isn't "worth" what we're willing to pay for it, but what it would cost to replace it with something providing the same degree of satisfaction. If we're getting the output of dozens, we're willing to pay for dozens. The capability of high-yield dozens will be more dear on a unit-cost basis. But a clear accounting of systemic results will favor the cost of polyskilled dozens over locally optimized, low-capability monoskilled masses.
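The arithmetic behind the two accountings of "worth" is easy to sketch. All of the figures below (headcounts, hourly rates, outcomes per year) are hypothetical, chosen only to illustrate the comparison:

```python
# Two accountings of a development capability's "worth".
# All figures are hypothetical, for illustration only.

HOURS_PER_YEAR = 2000

def metered_cost(headcount: int, hourly_rate: float) -> float:
    """'Worth' as procurement computes it: units of effort times unit cost."""
    return headcount * hourly_rate * HOURS_PER_YEAR

def cost_per_outcome(headcount: int, hourly_rate: float, outcomes: int) -> float:
    """'Worth' anchored to results: what each delivered outcome actually costs."""
    return metered_cost(headcount, hourly_rate) / outcomes

# A low-unit-cost monoskilled mass versus high-yield polyskilled dozens,
# both delivering the same ten outcomes in a year.
mass = cost_per_outcome(headcount=120, hourly_rate=50, outcomes=10)
dozens = cost_per_outcome(headcount=24, hourly_rate=120, outcomes=10)

print(f"mass:   ${mass:,.0f} per outcome")    # $1,200,000 per outcome
print(f"dozens: ${dozens:,.0f} per outcome")  # $576,000 per outcome
```

On a unit-cost basis the dozens look expensive ($120 versus $50 an hour); accounted by outcome, they cost less than half as much.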

This is the economics of "worth".

Tuesday, February 28, 2017

Our Once and Future Wisdom: Re-acquiring Lost Institutional Knowledge

Last month we looked at the loss of institutional memory and the reasons for it. This month, we look at our options for re-acquiring it.

The erosion of business knowledge is not a recent phenomenon. Management textbooks dating at least as far back as the 1980s included stories of employees performing tasks for reasons they didn't really understand. The classic reference case was usually a report that people spent hours crafting every month and distributed to dozens of managers and executives, none of whom read it because they didn't know what it was for. Those execs never put a stop to it because each assumed another exec knew why it was important. Then, during the much-anticipated system replacement, some business analyst tracked down the person who wrote the report specs so long ago; after he was done laughing, that person explained that the crisis that triggered the need for the report had ended many years earlier, and he couldn't believe they were still wasting time producing it.

This story always seemed apocryphal - of course it could happen, but surely people are smart enough that it wouldn't really happen - until I saw it firsthand at an investment bank just six years ago.

Natural (retirement) and forced attrition (layoffs) have long robbed companies of their knowledge workers. The rise of automation has simply made their loss more acutely painful. Accounting for knowledge hits the income statement in the form of the salaries of experienced and tenured employees; unfortunately, the value of their knowledge has no representation on the balance sheet. Extracting greater cash flows through payroll reduction is value-destructive in ways that accountants cannot (or at any rate, do not) measure.

If we have a business that hasn't yet gone full zombie that we want to pull back from the brink, what can we do to re-build business knowledge? There aren't a lot of high cards we can draw, but playing them in the right combination offers us a strategy. None of these are discrete solutions; they are a collection of non-mutually-exclusive tools that we can use to bridge a knowledge gap.

Tool 1: Dolly the Sheep

Companies that are heavily rule-based - think insurance - eagerly moved their business rules into code. Those rules were easy to move into code; they're just as easy to move back into a human-readable format. Hire some developers fluent in the legacy technology, make sure you have an objective way of auditing their extraction of the rule base, and identify a cadre of employees who understand those rules well enough to comprehensively catalog and contextualize them. It's cheap (people paid to document code will be less expensive than people paid to create code), it's hygienic (preservation of business information is a good thing), and it makes our business rules accessible to a wide audience spanning business users, managers, business analysts, and quality assurance engineers.

Of course, this is data, not information. A working foundation of facts is better than none, but facts are of limited value without context. And, while it's easy to reverse-engineer facts like rules, it's not so easy to forensically construct the business contexts that encapsulate those rules. A clone of something extinct - our lost business knowledge - runs the risk of suffering severe defects. For example, ghost code - code that is not commented out but will conditionally never be executed - is likely to be confused for real code in a reverse-engineering exercise. The facts are fantastic to have, but facts are not knowledge.
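As an (entirely invented) illustration of the ghost code hazard: the surcharge branch below is not commented out, but its guard can never hold for any order the system processes today, so the rule it encodes is extinct. A mechanical rule-extraction pass would still catalog it as a live business rule.

```python
from datetime import date

def shipping_surcharge(order_total: float, order_date: date) -> float:
    # Crisis-era rule: a 2% surcharge imposed during a (hypothetical)
    # 1999 fuel spike. For any order dated after 1999 this branch is
    # unreachable - ghost code that a naive reverse-engineering of the
    # rule base would faithfully, and wrongly, preserve as a live rule.
    if order_date.year < 2000:
        return order_total * 0.02
    return 0.0

print(shipping_surcharge(100.0, date(2017, 9, 30)))  # prints 0.0
```

Only someone who knows the business context - the fuel spike ended - can tell the extraction team that this "rule" belongs in the history books, not the new rule catalog.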

Tool 2: Seek the Jedi Masters

Somebody (well, somebodies) figured out how to automate the business. There are people behind the systems to which we're bound today. Why not put them back on the payroll? If they're still alive (always a good start), local and accessible, and grateful to the company for the income that put food on their table and their children through college, welcome them home. Techniques like value stream mapping bring them back to a business-operations mindset, allowing the business "why" in their heads to be extracted in a structured and coherent manner.

Of course, this isn't as simple as it sounds. Former colleagues won't come cheap. A knowledge worker who was forced out years ago may not feel inclined to share the wealth of their knowledge. The business will have evolved since the time these knowledge workers left. Corporate policies may also interfere with a re-recruitment campaign: one company I worked with forbade engaging contractors for more than 24 months, while another forbade contracting former employees at all.

You could also hire people who work for a primary competitor. In his book The Competitive Advantage of Nations, Michael Porter pointed out that industries tend to form in clusters; if your industry isn't post-consolidation, there's a good chance you've got a direct competitor nearby, offering a source of business knowledge you can recruit from. Again, this isn't as easy as it sounds. It's hard enough determining whether the people in our own business really understand the business "why" behind the things they do, or whether they just know the complex motions they go through. It's even harder to do that with people grounded in another company's domain: if our business knowledge is in short supply, we won't have the knowledge to ask the abstract questions that gauge their comprehension of the business; plus, we may speak fundamentally different languages to describe the same operations. If their knowledge is too finely grained - that is, too specific to the context of our competitor - it won't travel: they're a subject matter expert in our competitor's operations, not in the industry. And if our loss of business fluency was the result of corporate blood-letting, it's highly likely that our competitor up the street has done much the same, and will be no richer in domain expertise than we are.

One final word of caution: we have to challenge the "why" the experts give us. Ten years ago, I was leading an inception for a company replacing its fleet maintenance systems. The existing system was a combination of a custom AS/400-based RPG system (that had started life on a System/34), a fat-client Visual Basic application, and thin-client Java tools, all of which required manual (operator) steps for data integration with one another. The user got to a step in the workflow in the VB application, then transferred data updates to the AS/400 and resumed there, then transferred data updates to the Java application and resumed there, all over a period of days or even weeks, often going back and forth. Their experts genuinely knew their business, but they had grown so accustomed to the data transfer steps that those steps ended up baked into the initial value stream maps we produced. It took a lot of challenging the "why" on those specific portions of the value stream before they understood how a simple shared database would eliminate many no-value-added inventory control steps.

Still, maintaining a connection with the people who were there at the creation helps us identify the things so important for us to know if we're going to evolve or pivot from it. In much the same way as air traffic controllers are taught how to land planes in the event the software fails on them at a critical moment, former knowledge workers can help re-build our knowledge from the ground up.

Tool 3: Buy Before you Try

If you're on your way to becoming a zombie company, why not eat someone else's brain? Re-constructing a lost capability is expensive, so buying a competent operating company - along with its digital assets - is a shortcut. This assumes that you as the buyer can make an informed decision about the competency of the people you're acqui-hiring. It also assumes that the people in the acquired company stick around after the acquisition.

A reverse-acquisition can take one company's girth and bloat and wed it to another company's core nimbleness and agility. But M&A is ego-driven: the CEO or board member who wants to do a deal will see the deal through regardless of the state of the acquirer or target. A few years ago, I worked with a holding company that had bought two competing firms that collected data about banks and sold it on a subscription basis. As their product became digital, the value of the data they sold was plummeting (as most data tends to do when it becomes digital), so we helped them define a strategy to combine the companies and transform them from providers of data into providers of digital tools. Three days into the inception, we were frustrated that the workshops had ended up with incomplete and unsatisfactory levels of detail. We hypothesized that the experts weren't all that expert. On day 4 we ran a series of experiments in our workshops to test this hypothesis, and in the process confirmed that the activities they performed in the acquisition, curation, publication and distribution of the data they sold were performed for reasons of rote, not reason. The inception was successful in that it exposed an inability to execute on the strategy in the manner they had hoped, which led to an entirely different approach to execution.

Buying is a shortcut, and as Murphy's Law teaches us, a shortcut is the longest distance between two points.

More modestly, we can simply license technology to replace major portions of legacy systems, and train or hire experts in that technology. This, though, substitutes solution knowledge for business knowledge, and the former isn't necessarily a proxy for the latter: even though commercial ERP systems have largely replaced home-grown ones, those commercial solutions are highly customized to the needs of the business.

Tool 4: Play Calvinball

The business media barrage us with exhortations to be internal drivers of digital "disruption," because disruption puts our competitors at a disadvantage, challenging their leadership by forcing them to chase after us. But disruption is also a means of rebuilding lost business knowledge: if we change the rules of the game, we're less restrained by our current assets and procedures. The more we change, the more we set a new agenda in the competitive landscape. Ideally, we should be playing Calvinball, making up the rules as we go along.

Disruption is a tool, not a solution. Disruption may be constrained by prevailing legislation and regulation, and regulators tend to look at established firms differently from upstarts - if they look at them at all. In the wake of the 2008 financial crisis, bank lending declined in response to higher capital requirements against risk-weighted assets and tighter lending standards; marketplace lenders skirted balance-sheet restrictions and lending regulations simply by not being chartered banks. This allowed marketplace lenders to underwrite loans with much more flexibility than a bank could. The door to this type of disruption was closed to banks. As with Calvinball, it is the player with the ball who makes the rules, and banks (like many other regulated businesses) aren't the ones holding the ball.

Plus, when we build on existing business rules rather than replace them, we're not moving away from a dependency on fundamental knowledge that we don't have. Re-imagining how an existing offering is packaged, distributed or even consumed doesn't alleviate the need to understand the core characteristics of that offering.

Making the Best of a Bad Hand

Re-gaining lost business knowledge is a slow, sometimes difficult, and usually expensive proposition. Pushing too hard to re-acquire it is like beginning to learn calculus and non-Euclidean geometry the night before a comprehensive final exam: a grade of D- would be a small miracle. But, since strong business knowledge is key to executing any business strategy pursuing growth or evolution, a grade of D- isn't going to cut it.

Worry less about the slow rate of re-acquisition and think instead about where you want your business fluency to be in 6 months, 12 months, and beyond, and how much more effective your organization will be at those times. That guides the extent to which you employ each of the four techniques described here, and how they get you to a greater state of fluency so that you can operationalize the business strategy. For example, contracting legacy-language developers to capture encoded logic and hiring back a couple of retired employees for value stream mapping sessions - all in exchange for donuts and a fat payday for a few months - may be an effective and inexpensive precursor to an acquisition, or provide suitable grounding to initiate disruptive change that re-writes the rules of an industry.

This requires us to prioritize organizational learning alongside operating performance and delivery goals. The latter two are quantifiably measurable and glare at us from our financial statements; the former is not and does not. A commitment to learning is an investment that needs board-level visibility and air cover: without the learning there is no execution, and without the execution the strategy is just an elaborate PowerPoint. Board-level patience isn't infinite, so in exchange for an investment in learning, line management will have to commit to strategic execution - even if it has to commit to execution before it has re-learned as much as it would like.

The alternatives are to be acquired (sooner rather than later, at peak value for the book of business the company still commands) or to slide into slow obsolescence (and concomitant market irrelevance). Since it gives the people in the company a fighting chance, trading a commitment to learn for a commitment to strategic execution is a fair exchange.

Tuesday, January 31, 2017

Where Has All the Business Knowledge Gone?

I was queuing for a flight late last year when two people standing behind me started talking about how disappointing their trip had been. They were consultants in logistics, lamenting how one of their clients was struggling in the wake of a business process change that another firm - a tech consultancy - had agitated for their mutual client to adopt. The client in question purchased and warehoused perishable items: hundreds of thousands of different SKUs that it distributed to retailers ranging from tiny independents to large global chains. The distribution operation was built on efficiencies: fill delivery trucks to a minimum of 80% capacity, and deliver products to customers on routes optimized for time and energy consumption. Cheap distribution keeps product prices down, which makes their retail clients more competitive. The up-start tech consultants pushed for flexibility: more frequent product deliveries made to the same stores throughout the day would keep shelves stocked, so the client could better match stock to store. If there's a run on one item, it can be replenished much sooner, resulting in fewer lost sales. Unfortunately, more frequent deliveries required more frequent truck dispatch; trucks could only be dispatched more frequently if they spent less time being loaded, so the load level of a truck fell below 50% of capacity; expedited dispatch also meant ad-hoc rather than fixed routes, which resulted in driver confusion and receiving delays that translated into higher energy and labor costs of distribution. The intra-day re-stocking didn't capture enough of the revenue lost to empty shelves to justify either lower wholesale margins or higher retail prices.

The two standing behind me were exasperated that their client "listened to those [other] consultants!"

Distribution is not a high-value-added function. Distribution can get stuff from one esoteric locale to another, but that isn't the miracle of modern supply chains. The magic they create is doing so for mere pennies. Cheap distribution can make something produced tens of thousands of miles away price-competitive with the same something produced a few doors down the street. Distribution is about efficiency, because efficiency translates into price. When you're distributing hundreds of thousands of different SKUs, you're a distributor of commodities, and whether it's toothpaste or tungsten, the vast majority of commodity purchases are driven by price, not convenience. Capturing incremental lost sales sounds like a good idea until it meets the cold, hard reality of the price sensitivity of existing sales.

This got me to reflect: why would anybody in a distribution business agree to do something so patently counter to the fundamental economics of their business model?

They're not a company I'm doing business with, and I didn't strike up a relationship with the frustrated consultants, so I don't know this specific situation for fact. But I've seen this pattern at a number of companies now, and I suspect it's the same phenomenon at work.

The short version: companies have forgotten how they function. They've lost institutional knowledge of their own operations.

We've been automating business calculations since the tabulator was introduced in the late 19th century, and business processes since the debut of the business computer in the late 1940s. Early on, business technology was mostly large-scale labor-saving data processing. It wasn't until the 1950s (well, more accurately, the rise of COBOL in the 1960s) that we really started capturing business workflows in code. Although few could appreciate it at the time, this marked the beginning of businesses becoming algorithm companies: all kinds of rule-based decision making such as accounting was moved into code; complex rules for functions like pricing and time-sensitive decisions such as trading quickly followed suit. As general purpose business computers became smaller and cheaper in the '70s and '80s, the reach of computer technology spread to every corner of the organization. As it did, business workflows from order entry to just-in-time manufacturing to fulfillment were automated.

The work that had previously taken people days or weeks could be done in minutes or even seconds. People with intimate knowledge of the business were liberated from computational and administrative burden. The business could grow without adding staff, and suffered fewer mistakes for that growth. Computer technology fueled labor productivity throughout the '80s and '90s as more and more business processes were automated.

Then something happened that went largely unnoticed: the business people who had devised and performed the manual processes that defined the software solutions built in the '70s and '80s retired. It went unnoticed because their knowledge was captured in code, and the developers knew the code. And the new business people hired to replace the old were trained in how to run the code, so there was no interruption in normal business operations.

Then the original developers disappeared, either because they aged out, got better opportunities with other firms (tech people tend to change jobs frequently), or got caught up in the offshoring thing in the early 2000s. No matter how it happened, the original developers left the scene and were replaced by new people.

At this point, the cracks started to appear.

Business people knew what they did with the code, and tech people knew what the code did, but neither knew why. While regular operations didn't suffer, irregular operations caused fits, because nobody knew what a measured response to them was. Those fits led to bad decision-making about the software. Among other things, the new business people didn't know the simple protocols their predecessors had followed to contain a crisis. While the new people had the tactical knowledge to execute tasks in the software, they didn't know how to use the software in tandem with manual procedures to efficiently respond to an irregular operating situation. On the other side of the knowledge base, the new tech people didn't know why extreme scenarios weren't accommodated in the code. Again, without the meta-knowledge of how to procedurally minimize a crisis inside and outside the software, they had to defend why the software wasn't "robust" enough to deal with this crisis. Since anything and everything can, in theory, be codified, they had no ready answers, nor could they chart a procedural course outside of code.

With management screaming about escalating costs and poor customer service, and making assurances to higher-ups that This Will Never Happen Again, the decision was made for them. So the software bloated with new rules, complexity (If This Then That), and rarely invoked features, all of which made the software very well prepared to respond to the last crisis. Of course, given the nature of irregular operations, it wasn't entirely accommodative to the next crisis. Thus went the progressive cycle of code bloat and fragility.

Once the old code became so cumbersome and brittle that executives were terrified of it, they were compelled to sponsor re-invention-through-technology initiatives. These immediately defaulted into feature parity exercises, because nobody on the re-invention team had sufficient business context to imagine the business differently from how it operated (tactically) today. Because the new generation of business people had never been required to master the business rules in the same way a London taxi driver has to have The Knowledge, business users were beholden to yesterday's status quo as codified in the software that ran the business. In addition, these replatforming exercises were characterized by a shift in authority: the business people executed the rules; they didn't shape them. Tech people were the ones who manipulated the code behind the rules; they were the new shapers. Tech, not business, became the authority in replatforming initiatives.

The depletion of business knowledge and the shift of the curation of that knowledge from business to tech leads to the scenario described above: no resident adult present who can authoritatively explain why flexibility would blow out the economics of a mature distribution utility. While tech people are great at compilers and build pipelines, they're crap at business economics. Without a meta understanding of business operations, a re-invention or re-platforming initiative will be little more than a high-cost, high-intensity exercise that gets the business no further than where it is today.

I've seen plenty of companies where business understanding has been depleted. Re-learning the fundamentals is an expensive proposition. So how do we re-build a lost institutional memory? We'll look at the ways of doing that next month.