I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Thursday, June 30, 2016

In Tech, Portfolio Management is a Metaphor, Not a Way of Life

A few years ago, I wrote about the chronic misuse of the word "governance" in technology. The word "portfolio" is suffering the same fate.

The reason for introducing the word "portfolio" into tech management is that some portion of tech spend is really an investment in the business, something that differentiates it and gives it a competitive edge - for example, mobile or web client-facing apps, or internal software that codifies workflows to capture efficiencies. This is fundamentally different from utility tech, the basic technology a company needs to function, such as ERP or email. To get a better return from our strategic opportunities, we should think in terms of investing-like behaviors and outcomes, as opposed to traditional project-oriented ones (such as time and budget). The word "portfolio" enters the lexicon because when we have multiple technology investments, we logically have a collection of them. Since most firms have more opportunities for investment than capital to invest, there is a degree of selectivity about what gets included in the investment portfolio.
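To make that selectivity concrete, here is a toy sketch of choosing among candidate investments when capital is the binding constraint. The names and numbers are entirely hypothetical, and a greedy pick by value-per-dollar is just one simple heuristic, not a prescription:

```python
# Illustrative only: more candidate investments than capital to fund them,
# so something has to be left out of the "portfolio".
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    cost: float            # capital required
    expected_value: float  # however the firm chooses to denominate it

def select(candidates, budget):
    """Greedily pick by value-per-dollar until the budget is exhausted."""
    chosen, remaining = [], budget
    for c in sorted(candidates, key=lambda c: c.expected_value / c.cost, reverse=True):
        if c.cost <= remaining:
            chosen.append(c)
            remaining -= c.cost
    return chosen

picks = select(
    [Candidate("mobile app", 4.0, 9.0),
     Candidate("workflow automation", 3.0, 5.0),
     Candidate("ERP upgrade", 5.0, 4.0)],
    budget=8.0,
)
print([c.name for c in picks])
```

Even in this toy form, the limits the post goes on to describe are visible: the "expected value" figures are soft estimates, and every position taken is long.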

While I was writing chapter 8 of Activist Investing in Strategic Software, it occurred to me that the use of the word "portfolio" in technology has increased in recent years. Unfortunately, the activities described by aspirant "technology investment portfolio managers" are a very small fraction of those characteristic of financial portfolio management. The principal problem is that the word "portfolio" suggests a degree of decision-making flexibility that the captive technology investor doesn't really enjoy. Consider:

  1. Although we can create diversity in our investment outcomes, the strategic software investor is limited to a single investment type - e.g., an operationally-based delivery. A financial investor has many more choices of vehicles and outcomes than a captive investor does. There are no fixed income products available to the captive technology investor; it's all equity. Plus, although we can run multiple experiments to qualify an investment and accelerate our delivery frequency to get things in production faster, all of our investment positions are inherently long. The only short position we can take in something is not to invest.
  2. While there are countless investment opportunities, it's rare that a company can pick and choose every investment it makes. Some investments are forced on it by regulation; others by competition; still others by reliability of dilapidated legacy assets; and sometimes because the boss says this or that is what we're going to do and that's all there is to it. A captive technology investment portfolio isn't as discretionary as a financial one.
  3. Our investment goals are denominated in different and sometimes only quasi-economic measures of value. All financial investments denominate their performance in the same measure, even those that have explicit non-financial goals. As I've written before, it's futile to concoct synthetic measures like "business value".
  4. Most often, we can't measure the impact of any given investment in isolation of all other changes a business makes to itself and those that happen in its commercial ecosystem: we change the economics, processes, technology, and policy of our business all the time, while our commercial partners are also making the same changes to theirs. Isolating an outcome to a single decision or action (like a specific tech investment) is very difficult. In addition, because counterfactuals are unprovable, we can't measure whether an alternative investment would have yielded a better or worse outcome. In contrast, we can measure our results of financial portfolios against how well "Hindsight Capital Partners" performed over the same time period.
  5. Venture capital is a high risk business that has more misses than hits. The success rate doesn't improve when the VC is corporate rather than financial. A company doesn't have unlimited capital to experiment at scale and can't afford to have a low success rate on investment decisions.

As much as we may want to be investors, the portfolio metaphor is very limited in captive investing situations. The captive portfolio lacks diversity, its measurement is inherently imprecise, and both performance and competence as investors are as much matters of opinion as of fact. It's potentially dangerous, too: a portfolio managed poorly can damage a company's operational solvency (that is, its capability to get things done through technology) by making a mess of its capital management.

The premise of the book I'm writing - that activist investing behaviors yield better performing tech investments - is also substantially a metaphor. There are real aspects to it, specifically that shareholder activist behaviors - being investigative, inquisitive, interrogative and invasive - are highly suitable to captive technology investing. But "shareholder activism" goes only so far: we can't buy out other "investors" and we don't gain control of the board through proxy fights. We have fewer levers to throw to change outcomes, and virtually all of them are operational (process, scope, personnel) in nature: few are the captive investments that can show better performance on the basis of financial engineering alone.

While metaphors are limited, they do help us to interpret our world differently. When we think in investment terms, we see financial expectations and possibilities much more clearly than when we think strictly in operating terms. When we think in shareholder activism terms, we understand the importance of good governance structures and mechanics, and the need for diligence by those investing in the business through technology. Interpreting delivery of strategic software through these lenses adds dimensions that make the operations that create them more value-generative to their host businesses.

But we do ourselves no favors by getting carried away with it. As helpful as the portfolio metaphor is, it's just a metaphor - not a way of life.

Tuesday, May 31, 2016

The Times, We've No Idea How Much They're Changin', Part III

In the last post, we looked at the changing relationship between people and possessions, particularly how the perception of land has changed. But there's more to this than just changes in dwelling and mortgage finance. Land was part of a land-labor-laborer troika, a relationship that has existed since the dawn of humanity. Land could feed and protect the laborer and be a means to a better life, but not without expending labor to till the soil and construct buildings. Just as the perception of land is changing, the perception of labor is changing as well, which erodes the relevance of this troika on an individual level.

Over the course of centuries, labor evolved from a means of subsistence, to a means of income, to a means of achieving economic prosperity (through entrepreneurism and meritocracies). It was also a source of individual identity and self-worth: to be employed communicated one's ability to contribute to society. Yet labor has taken a beating in recent years. A two-decade period of nearly uninterrupted growth that began in the early 1980s made labor for the real economy king by the late 1990s (peak employment and household income levels), but a couple of recessions soon thereafter meant it went from being scarce to abundant before the first decade of the new century was out. Meanwhile, real economy jobs lost favor to jobs created by ever-cheaper capital, through things like venture-capital subsidized start-ups, property flipping, and day trading. These latter jobs produced far more lottery winners (remember equity options in the 1990s?) than their real economy counterparts. Other factors, like the erosion of job stability, the replacement of pensions with 401(k)s, and more recent worry over the gig economy and robots and automation displacing real economy labor, have cast labor as transient and eroded its perception as a cornerstone of societal durability.

Labor has gone from champ to chump in less than 20 years. It's no wonder every spring commencement speech contains the obligatory "pursue your passion" line: the not-so-subtle message is that labor on its own is no longer a thing of fulfillment.

This belies the idea that labor is a vehicle to achieving quality of life. Because real wages have been stagnant for nearly a generation, there's less perceived value to labor. If you don't feel that your lot will improve through your labor, you're less inclined to labor for anything better.

Not to mention that for many people there's less urgent need to labor. Previous generations accumulated a lot of wealth, creating familial support nets. Empty-nester parents welcome their children back to live with them in cavernous houses (unlike the small houses of their grandparents and great-grandparents, to which they couldn't have returned even if they'd wanted to). The cash they accumulated through investing has become inter-generational transfer payments. Why work to meet your own subsistence needs if you don't have to?

Labor, which was once at best a key to individual freedom and at worst a denial of recreation time, is being cast as indentured servitude (we're forced to work by the system) or an emotional prison (the job isn't someone's passion, it's just the job they could find).

This points to a change in the definition of freedom.

In the 1960s, we had a generation push back on going to war because it didn't understand why it should have to (e.g., go fight a proxy war in a far away land). Today, we have a generation pushing back on going to work because it doesn't understand why it should have to (e.g., if we have so much food and so much stuff, why do we have to solve the same problems over and over again that just lead us to more food and more stuff?) Freedom is becoming independence from the need to solve basic problems long ago solved, like transportation and food, in order to pursue Maslow-like self-actualization. Freedom is no longer something each person achieves individually by "working the land"; it's provided by a sophisticated, intricate system that charges rents in exchange for alleviating the burdens - both cost and time - of ownership.

We started this blog series with the observation that change can be a long time coming, particularly because economic habits die hard. We'll still have household formation, individual property ownership, and most of us will have to work for a living for a long time to come. Tax receipts are substantially derived from income. And the land-labor-laborer troika has been displaced before (productivity increases during the industrial revolution de-valued labor for two generations) only to find its way back (the rise of America as an economic power in the 19th century). Yet these ideas around labor are in the ether, to the point that Switzerland will hold a referendum next month on whether to provide a universal basic income for every citizen.

Which brings us back to where we started. The winds of change blowing today would alter fundamental economic relationships that have been around for centuries, and the technology is here now to make them practical realities. The technology exists today to change the shelter, transportation and investment activity of an individual to allow them more free time to pursue personal fulfillment, but that's no guarantee that a critical mass of people will take it up, or will do so before societal winds change direction. The times, they have a-changed, and they'll continue to do so. Change happens at a societal pace, not a technological one.

Saturday, April 30, 2016

The Times, They're Still A-Changin', Part II

I ended the previous post by stating that the stage is set for more radical change. Why?

Consider the changing attitude toward land, property and shelter.

For the pre-1965 generation, land meant a lot of things. It was wilderness to be fought with, to be made into a suitable place to live. It was where you built your shelter. It was how you earned or supplemented your living, by farming, mining, logging, or guiding. It was sweat: always one more addition, improvement, or repair to be made. It was the story of America as taught to schoolchildren: land was the reason why the Pilgrims came, and "taming the land" was said to bring out the best in the early settlers who were themselves held out as American heroes and role models. It was very personal history, too: settlers intertwined family stories - and legends - with the land itself.

Land was freedom, and property ownership was independence.

Property was extraordinarily important to the American psyche. A great deal of desirable land remained remote and undeveloped until well after the Second World War (the Interstate network wasn't begun until the Eisenhower administration). Familial bonds with land were strong, particularly where property passed from generation to generation. Schools taught the history of European settlement, including battles with indigenous peoples and noteworthy settlers. A lot of the materials, tools and trade were similar to those employed by generations past, so people could relate to how their grandparents had lived. Activities like camping and hunting gave young people first hand experience of how the European settlers and indigenous people lived, reinforcing the perception of land as well as the myths about it. Land was a great investment ("they're not making more of it" as the saying went), appreciating in value virtually everywhere and almost without interruption. In part, this built up a perception of value in land. And in part, it was a reminder that you weren't too far removed from the rough-and-tumble of the wilderness.

Land no longer captures the American imagination quite so much. More people live in dense urban areas, a large number of them rent, and those who do own expect to move - either upsizing or downsizing - long before their mortgage matures. People who bought property from 2002 through 2008 suffered financially and emotionally with the housing collapse. High crop yields create less demand for farm acreage and farming families. Land has been repurposed: farms near urban areas were more valuable as residential subdivisions, and previously remote rustic areas have been developed into recreation communities or suburbs. Building and zoning regulations restrict what property owners can and cannot do with their land. Early to mid 20th century industrial manufacturers favored rural or suburban locations with the space for industrial plants; 21st century providers of services and digital products - that is, where the American economy has been shifting for the last 40 years - favor densely populated urban centers. History classes emphasize the high cost - war, disease, resettlement - borne by indigenous peoples at the hands of European settlers. Activities like camping are now either cheap vacation choices (ever notice how many private campgrounds there are near floating casinos?), or some combination of tests of strength balanced with stewardship.

Land is no longer freedom. Renting is freedom: renting allows you to have many different living experiences and gives you the freedom to change your living accommodation based on your lifestyle, rather than having your lifestyle dictated by the land. Land is red tape and well rehearsed ceremonies to purchase plots or pull permits; it ties you down to a mortgage and a location.

The changing perception of land also reflects the fact that shelter and sustenance - things directly related to land ownership and management - are problems long ago solved on a mass scale. There is little value in rehashing them again and again on an individual basis, when we could use our life's energy to solve the next wave of challenges, from sustainability to space exploration.

There's been a similar change in attitude toward another symbol of independence, the automobile. Because it was a way to get away ("freedom of the open road") or quite literally a means of getaway (think John Dillinger, or Bonnie and Clyde), the automobile captured the American imagination. But the automobile has changed from a symbol of freedom and possibility to one of captivity (monthly payments) and inescapable utility (suburban communities aren't designed with walking in mind, and suburban mass transit is inefficient). The car that sits idle in the garage nearly 99% of the time isn't untapped potential, it's a tax of modern living.

The things which were the physical incarnation of freedom for prior generations have become symbols of economic entrapment to newer ones. Per the previous post, technology enabling things like the "sharing economy" aren't leading change as much as they're riding the changing wave of sentiment.

This wave has a long way to go before it crests. The shift in attitudes toward land and transportation portends a change in asset ownership and associated activities like lending and insurance that we've long taken for granted. That doesn't mean a concentration of assets in the hands of a few owners: technologies like blockchain make it easier to fractionalize ownership. This will allow people to invest in small fractions of many residential properties bundled into a single instrument, and do so incrementally over a long period of time. In essence, they would live in a rented house, but own small fractions of many others. Just as people have shown a preference for passive over active investing, future generations may find it appealing to be able to invest in residential real estate without the need to mortgage future cash flows for a specific spot of dirt in an asset appreciation lottery.

Of course, that's all "some day". But change is afoot, and the stage is set for still more change that goes beyond assets, to the nature of labor itself. We'll look at that in the next post.

Thursday, March 31, 2016

The Times, They Have A-Changed, Part I

"This technology revolution was not invented by robo-advisers. They have simply noticed, and taken advantage of, a broader and deeper shift towards passive investment through ETFs and index funds."

-- John Gapper, Robots are Better Investors than People

We like to think of "technology revolutions", but as Mr. Gapper points out, revolutions aren't led by technology. The landscape is littered with shuttered technology companies that showed that a thing was possible, but failed to foment a "revolution".

Revolutions happen once we have critical mass of people with different attitudes and behaviors. Consider the changes in investing behaviors referred to above. Once investors realized that actively managed funds charged higher fees but performed no better (and frequently worse) than passively managed funds, they switched. Today, as investors come to realize that financial advisors are charging fees for putting their money in passive funds, they're amenable to robo-advisors that charge lower fees for doing exactly the same thing.

A change from human advisor to robo-investing won't happen at the pace set by technology, however. It took a long time for investors to change their preference from active to passive funds. Index funds first appeared in the 1970s, as did research showing that active funds didn't consistently outperform the broader market. Yet it took decades for investors to invest differently.

Why did it take so long? Attitudes, prejudices and predispositions take a long time to change. Sometimes they don't: those who hold a belief may never let it go, even in the face of overwhelming evidence to the contrary. And, people financially, politically or emotionally invested in something will fight to preserve it. Active fund managers initially derided passive funds, while today, facing massive capital outflows, they're fighting for survival. Those who stand to lose from change will also fight back with marketing designed to co-opt the next generation, such as the way manufacturers of sweet fizzy drinks simultaneously play to an older generation's nostalgia while encouraging them to create a sentimental moment - vis-a-vis their product - with a grandchild.

No matter how much technology we throw at something, entrenched behaviors don't start to change until a generation comes along that isn't as emotionally committed to them. And that still only describes how change starts, not how it finishes - and it can take a very long time to run its course. To understand the dynamics of change, we need to look at both ends of the generational spectrum: as people enter, people also leave. This is most obvious in the workforce where, in general, people join around age 18 and leave around age 65.

The United States has just completed a major generational shift, not so much for the one arriving as the one that has recently left.

The Great Depression and the Second World War shaped American domestic and foreign policy well into the 1990s. Yet as long ago as 1965, America had reared a generation without any direct experience of either, making them less constrained by the values held by the people who had come before them. And, starting in 1965, a generation began to arrive born to parents who were themselves raised in the wake of the Depression and WWII. Prior to 1945, most everybody had direct experience of the privations of one or both. After 1965, we had generations grow up in households where those two seminal events were things their grandparents told them about from time to time, and which they only briefly studied in high school history classes.

Despite the social upheaval that coincided with the maturation of this post-depression-and-war generation (the late 1960s), the value system of the pre-1965 generation dominated American society and the American workplace, first through sheer numbers (those who held it made up the bulk of the working population) and later through positions of seniority (older people were more likely to hold executive positions).

The numbers are now vastly different. People born in 1945 reached age 65 in 2010. There are very few in the American workforce with a direct experience of life during WWII, let alone the Great Depression. Nor are there too many who are just one generation removed from those events (that is, grew up in households directly influenced by them); those who are one-generation-removed will largely exit the American workforce by 2030.

It's no coincidence that we've seen more change in the last decade than we arguably did in the three previous decades combined. But this is not so much because a new generation arrived, bringing with it new expectations and demands, as because the old generations left, relinquishing the top rung of authority and social influence. Out of numbers and out of power, those value systems no longer hold sway. Although we live and work in a "post-1965" world today, it took over 40 years - two additional generations - for that to happen.

Because change is a function of society more than technology, it's slow in coming but swift once it arrives. And, while a lot of change happened with the completion of the pre- to post-1965 shift (at least, in the workforce), the stage is set for still more revolutionary change. We'll look at specific examples in the next post.

Monday, February 29, 2016

How an Operational Gap Becomes a Generation Gap Becomes a Valuation Gap

A decade or so ago, when an IT organization (captive or company) hit rock bottom - bloated payroll, lost confidence and ruptured trust resulting from rampant defects, rocky deployments, functional mis-fits, and long delivery cycles - it would give anything a try, even that crazy Agile stuff. It didn't matter whether it was incumbent management with their backs against the wall, or new management brought in to clean house: desperate times called for desperate measures. To people looking to shake things up, Agile made intuitive sense and a lot of it was appealing, even if its proponents had a bit too much evangelical zeal and some of it sounded a little strange. So they'd bite the bullet and do it, selectively anyway: a build server was easy to provision; releasing software for testing every couple of weeks was done easily enough; and getting everybody in the same physical workspace (or at least in close proximity to one another) could be done with a little arm-twisting. Of course, developers were instructed to only pair program "opportunistically", and automating QA was a fight nobody wanted to fight, so testers would sit with the team and test incrementally but go on testing the way they always had. Still, even with compromises, there was a good story to tell, usually to do with much higher quality and eliminating rework, and that made Agile A Good Thing for all parties concerned.

Fast forward to today, and we see that Agile is much more ambitious. A few short years ago we were content to have an automated build execute every few minutes; today we want every check-in to trigger a build that is promoted through progressively more complex tests in virtual environments managed, instantiated and torn down by scripts. We used to be content releasing for user testing every other week, and to production every couple of months; we now aspire to release to production many times a day. We used to want Master Story Lists to guide incremental development; today we want to iteratively experiment through code and have the feedback inform requirements definition and prioritization. We used to be satisfied with better delivery of internally-facing software; today we want to evolve software products that are used by people across our ecosystem, from interested parties to customers to investors to employees. Today, Agile wants to push against everything that creates an artificial or temporal constraint, be it organization, management, accounting policy, or even capital structure.
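The pipeline ambition described above - every check-in promoted through progressively more expensive stages, halting at the first failure - can be sketched in miniature. This is a generic illustration, not any specific CI product; the stage names and commit IDs are hypothetical:

```python
# A minimal sketch of a deployment pipeline: a commit is promoted through
# stages in order, and the first failing stage halts promotion. A real
# pipeline would shell out to build tools and provision throwaway
# environments inside each stage.
def run_pipeline(commit, stages):
    """Return (stages passed, name of failing stage or None)."""
    promoted = []
    for name, step in stages:
        if not step(commit):
            return promoted, name  # failed stage halts promotion
        promoted.append(name)
    return promoted, None

# Hypothetical stage implementations, cheapest first.
stages = [
    ("compile",           lambda c: True),
    ("unit tests",        lambda c: True),
    ("integration tests", lambda c: c != "bad-commit"),
    ("deploy",            lambda c: True),
]

promoted, failed_at = run_pipeline("abc123", stages)
print(promoted, failed_at)
```

The design point the post makes is the ordering: cheap, fast feedback first, so a bad commit never consumes the expensive environments and never reaches production.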

Although Agile has evolved, the entire tech world hasn't moved with it. In fact, some of it hasn't moved at all: it's still common to see non-Agile organizations that do big up-front design; work in functional and skill silos; have manual builds, deployments and testing; and make "big bang" releases. And, it's still common for them to face a "rock bottom" moment where they conclude maybe it's finally time to look into this Agile stuff.

As hard as it was a decade ago to inject Agile into a non-Agile organization, it's much harder today for a non-Agile organization to complete a transformation. This seems counterintuitive: since the non-Agile to Agile path is so well trod, it should be much easier than it was in those pioneering days of yore. But although there have never been more tools, frameworks, languages, books, blogs, and countless other resources available to the individual practitioner aspiring to work differently, organizational change tends not to be self-directed. The challenge isn't taking an organization through the same well-established game plan, it's finding the people - the transformational leaders - who are willing to shepherd it through its journey.

Unfortunately, re-living the same internal negotiations to reach the same compromises, solving technical and organizational problems long ago solved, only to end up with an operating model that falls considerably short of today's state of practice, is not a destination opportunity for an experienced change leader. Even assuming, as my colleague Alan Fiddes pointed out, that the change agents brought in still have the vocabulary to carry on arguments last fought so long ago, any change agent worth their salt isn't going to reset their career clock back a decade, no matter the financial inducement.

This might simply mean that the approach to change itself is what has to change: require far less shepherding from without by expecting more self-directed change from within, brought about by setting the right goals, creating the right incentives (motivate people) and measuring the right things (what gets measured is what gets managed). Why shouldn't it be self-directed? It isn't unreasonable to expect people in a line of work as dynamic as software development to keep their skills sharp and practices current. For people leading an organization that's a little dated in how it develops software, then, the question to hold people to isn't "why aren't you doing Agile" but "we're going to deploy any time and all the time effective fiscal Q3, so how are you going to operate to be able to support that?" It's amazing what people will do when sufficiently motivated, change agents be damned.

Whether there's a more effective means of change or not, being far adrift of the state of practice points to a more severe threat to the business as a whole: a generation gap.

* * *

Three decades ago, the state of practice didn't vary that much across companies. Yes, there were people coding C over Rdb deployed in VMS on minicomputers and people coding COBOL over IMS deployed in OS/390 on mainframes, but the practice landscape wasn't especially diverse: waterfall prevailed and a lot of code was still data-crunching logic run in batch. At the time, captive IT, consulting firms, governments, new tech firms (think Microsoft in the mid-80s), and established tech stalwarts (H-P, IBM) could reasonably expect to compete for the same labor. College grads in Computer Science or Management Information Systems learned practices that reinforced the modus operandi common to all consumers of business computing.

The landscape has changed. Practices are far less homogeneous, as they've had to evolve to accommodate a diverse community of interactive users pushing for features and functionality with little tolerance for failure. The familiar combatants still slug it out for labor, but must now also compete against tech product firms untethered to any legacy practices, norms, policies or technologies. Today's grads are trained in new practices and expect their employer to practice them, too.

Companies lagging behind in their state of practice will struggle to compete for newly minted labor: why would somebody with highly marketable tech skills go to work at a place stuck in the past, when they can work in a more current - even cutting edge - environment?

This isn't just a hiring problem. A practice gap is fuel for a generation gap if it deflects young, skilled people from becoming employees. By failing to hire the next generation employee, a company develops an intrinsic inability to understand its next generation customer.

A company isn't going to reach a new generation of buyer - consumer or business - if it is tone deaf to them. A company ignorant of the next generation's motivations, desires, values and expectations has little chance of recognizing what it isn't doing to win their attention, let alone their business. Since social systems are self-reinforcing, a company is unlikely to break the deadlock of ignorance and inaction.

Failing to bridge a generation gap not only cuts a business off from growth opportunities, it sets the stage for long-term irrelevance. Investors recognize this, even when management does not. Growth changes from being a "risk" to being an "uncertainty", and when that happens a company's future¹ is no longer priced at a premium, but at a discount. In this way, an operational gap becomes a generation gap becomes a valuation gap.

Outdated practices are an indicator that management has its head buried in the sand: it faces a problem it can't see and doesn't know how to solve, and it is starved of the information it needs because it has chosen to cut itself off from the source. The motivation to change how you practice shouldn't be to become more competitive today, but to still be relevant tomorrow.


[1] By way of example, Yahoo net of its Alibaba holding and cash has frequently been valued at or near $0 by investors in recent years.

Sunday, January 31, 2016

Are Microservices to Ecosystems as Core Competencies were to Conglomerates?

As far back as the 19th century, industrial firms pursued vertical integration strategies. The thinking was that by owning the supply chain from raw materials to retail outlets, a firm had direct control over its entire cost structure, making it better able to squeeze efficiencies out of it and being less susceptible to supply shocks. This was important because, for large industrial firms, competing on price was the primary strategy for winning market share.

During the 1950s and '60s, companies also pursued conglomerate strategies: bringing seemingly unrelated businesses under one roof, sometimes seeking synergies (as Sears did owning a retail merchandiser and retail brokerage - "buy your stocks where you buy your socks"), and sometimes not (as LTV did owning a steel company and an airline). The rationale for the conglomerate was entirely financial: cheap (mostly debt) capital allowed large companies to grow through acquisition, and regulators were less likely to block acquisitions of unrelated firms on monopolistic grounds.

By the 1980s, both strategies had begun to lose favor. The financial benefit had evaporated: high interest rates clobbered the profits of debt-fueled acquisitions and forced divestiture. But the operating benefits weren't there, either. Different types of businesses (manufacturing, distribution, retail) require different types of leadership and have very different cultures. And, within each of those businesses, some functions are differentiating (such as fleet optimization for a logistics company) while some functions are not (nobody beats their competitors by having a superior accounting back office). Management thinking embraced "core competencies": own and hone the things that differentiate, rent and commoditize the things that do not. This also allowed for better matching of capital with company: the risks and returns from a company that owns a fleet of railcars are easier to assess than the risks and returns from a company that owns ore mines, railcars, and foundries. By breaking them up, the individual investor can choose what types of businesses to expose their capital to (a raw materials company, or an asset company, or a refining company), and the pricing of that capital more accurately reflects the risks.

Tech firms today are embracing the "vertical integration" and "conglomerate" strategies of yore by positioning themselves as "platform" and "ecosystem" companies. The thinking is that by combining multiple and diverse capabilities into a single offering, a company creates both efficiencies and synergistic value for counterparties in some activity, such as crowdsource funding or payments. The ecosystem strategy takes this even further, combining unrelated capabilities under one roof (eBay buying Skype in 2005, SAP developing HANA in 2010, Facebook buying Oculus in 2014), often justifiable if only because digital commerce is still in its infancy and nobody is really sure what's going to work and what's not.

But what if you could extract utility-like functionality from within an ecosystem into an independent company? Take payroll as an example: rather than have every Human Resources platform company maintain its own team of people to write and maintain state and federal withholding rules, hive those off into an independent business that makes them available as a metered service offering, charging a tiny usage tax (say, $0.001) each time it's invoked. The technology to do this is certainly available: code the algorithms as a microservice or a blockchain smart contract, and deploy them in an elastic cloud environment (as usage will tend to spike with pay cycles).
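A minimal sketch of what such a metered service might look like, assuming purely illustrative withholding rates and a simple per-invocation meter (all names, rates, and jurisdictions here are hypothetical, not real tax rules):

```python
# Hypothetical sketch of a metered payroll-withholding service.
# Rates are illustrative placeholders, not actual tax rules.

PER_CALL_FEE = 0.001  # the tiny "usage tax" charged per invocation

# Illustrative flat withholding rates keyed by jurisdiction
RATES = {"federal": 0.12, "IL": 0.0495, "CA": 0.06}

class WithholdingService:
    def __init__(self):
        self.invocations = 0  # meter used for usage-based billing

    def withhold(self, gross_pay, jurisdictions):
        """Return total withholding for one paycheck; meter the call."""
        self.invocations += 1
        return round(sum(gross_pay * RATES[j] for j in jurisdictions), 2)

    def usage_bill(self):
        """Amount the subscribing platform owes for this billing cycle."""
        return self.invocations * PER_CALL_FEE

svc = WithholdingService()
amount = svc.withhold(2000.00, ["federal", "IL"])
print(amount)            # 2000*0.12 + 2000*0.0495 = 339.0
print(svc.usage_bill())  # 1 invocation * $0.001
```

In practice the rules would sit behind a versioned API, redeployed only when tax law changes, while the metering side scales elastically with pay-cycle spikes.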

To the HR platform company, there are a lot of good reasons to do this. It monetizes a non-value-generative activity (nobody subscribes to a payroll system because its implementation of the withholding rules is better than everybody else's). It throws off utility-like revenue streams that move in lock step with the broader job market. It disaggregates a staid HR utility function that needs to be deployed infrequently (potentially only as often as once per year, when new tax laws come into effect) from more innovative ones that are more exploratory in nature and demand frequent deployments (such as time entry performed through emerging wearable tech). It separates a legacy, risk-averse tech culture from a cutting-edge, risk-prone one. It takes a high fixed cost for maintaining a non-differentiating capability off the P&L (teams maintaining rule-heavy legacy code are rarely inexpensive). Its stable cash flows would be attractive to debt finance, better aligning capital with investment in HR technology. And it removes an asymmetric risk that can be more accurately insured (smothered in a platform, correct calculations offer no financial upside, while a faulty calculation exposes it to costly litigation and reputational damage).

True, it eliminates a barrier to entry for future competitors. And, while the market would support a handful of utilities to prevent monopoly, thin competition would give those utilities oligopolistic pricing power. But it creates a one-time financial windfall for the first movers, laggards would be pressured to subscribe by shareholders demanding the same structural benefits to the income statement, and low switching costs would keep utility pricing power in check.

Given that tech is in a period of both cheap capital (interest rates remain low, VC levels remain high, and companies such as Alibaba and Facebook can make acquisitions inexpensively with their own high-priced shares) and rapid growth (growth in consumption of tech products such as social media today mirrors growth in consumption of manufactured goods in the 50s and 60s), it's little surprise that we're seeing a return to the industrial strategies of the past. But technologies like microservices and blockchain could be the modern equivalent of "core competencies", ready to sweep through businesses. Blockchain proponents already champion the potential of decentralized autonomous organizations (DAOs). With MBAs now eschewing investment banking in favor of tech companies, financial engineering of this nature isn't too far away.

Thursday, December 31, 2015

Counterfactuals

Earlier this year, my house should have burned to the ground.  A CR2032 battery exploded and caught fire in a confined place dense with flammable objects.  But my house didn't burn down: at the moment the battery exploded, I was sitting a few feet away from it. I heard a loud bang, investigated, and stamped out the fire within a few seconds.

I wasn't planning to be there at the time. A series of minor reschedules and reprioritizations had me sitting in my home office at the moment the battery exploded. I happened to be in the right place at the right time to prevent something from happening.  It was business as usual for the rest of the day.

There is no law or reason governing why I was there at the moment the battery exploded, or for the battery to have exploded at precisely that moment.  What if my schedule isn't rearranged and I'm not home when the battery explodes? What if it happens hours earlier or later when everybody is asleep in the house? I can't help but think that there are more quantum realities where something bad happened than there are where something bad did not happen.

But that isn't necessarily true.  The battery could have exploded at a time when nobody was around to put out the fire, and the ensuing fire may simply have died out, causing no damage.  And maybe the chemical process that triggered the explosion was itself an anomaly of circumstances set in motion by my presence in the room at that particular moment.  I have to be content with the only certainty I have: a battery exploded and nothing bad happened.

Counterfactuals are messy because they're not provable.  Strategy and investing, execution and operations are loaded with counterfactuals.  They're each similarly messy because we don't really know what would have happened had we chosen another course of action.  Conventional wisdom says that Kodak should have bet on the digital technology it invented, and not dithered around with its legacy analogue business.  But what if Kodak had invested heavily in digital and its distribution partners had made good on their threats to shift the lucrative film processing business to Fuji?  Would Kodak's bet on the future have imploded for being starved of cash flow?  Hindsight isn't 20/20, it's narrative fallacy.

Just as it's difficult to have retrospective strategic certainty, it's next to impossible to forecast the outcomes of alternative paths, decisions or circumstances.  It sounds simple enough: invest in the highest-return opportunities, and consistently assess the performance of and the risks to delivery.  But we're forced to interpret and project from incomplete and imperfect information. Our plans and forecasts are just as likely to be narrative fallacy as fact-based assessments.  Microsoft paid dearly for Skype, but we can't know what might have been had it been acquired by Google or Facebook - and the possibilities may very well have justified the premium Microsoft paid.

The landscape of strategic software investing and governance is riddled with speculation.  As much as we would like to think we've made the best decision available to us, or dodged a bullet, we rarely know with certainty.  No matter how formulaic we're told that portfolio management or delivery operations can be, they will never be as optimal or predictable as promised, because business is imprecise and untidy.