I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Saturday, April 30, 2016

The Times, They're Still A-Changin', Part II

I ended the previous post by stating that the stage is set for more radical change. Why?

Consider the changing attitude toward land, property and shelter.

For the pre-1965 generation, land meant a lot of things. It was wilderness to be fought with, to be made into a suitable place to live. It was where you built your shelter. It was how you earned or supplemented your living, by farming, mining, logging, or guiding. It was sweat: always one more addition, improvement, or repair to be made. It was the story of America as taught to schoolchildren: land was the reason why the Pilgrims came, and "taming the land" was said to bring out the best in the early settlers who were themselves held out as American heroes and role models. It was very personal history, too: settlers intertwined family stories - and legends - with the land itself.

Land was freedom, and property ownership was independence.

Property was extraordinarily important to the American psyche. A great deal of desirable land remained remote and undeveloped until well after the Second World War (the Interstate network wasn't begun until the Eisenhower administration). Familial bonds with land were strong, particularly where property passed from generation to generation. Schools taught the history of European settlement, including battles with indigenous peoples and noteworthy settlers. A lot of the materials, tools and trade were similar to those employed by generations past, so people could relate to how their grandparents had lived. Activities like camping and hunting gave young people first hand experience of how the European settlers and indigenous people lived, reinforcing the perception of land as well as the myths about it. Land was a great investment ("they're not making more of it" as the saying went), appreciating in value virtually everywhere and almost without interruption. In part, this built up a perception of value in land. And in part, it was a reminder that you weren't too far removed from the rough-and-tumble of the wilderness.

Land no longer captures the American imagination quite so much. More people live in dense urban areas, a large number of them rent, and those who do own expect to move - either upsizing or downsizing - long before their mortgage matures. People who bought property from 2002 through 2008 suffered financially and emotionally with the housing collapse. High crop yields create less demand for farm acreage and farming families. Land has been repurposed: farms near urban areas were more valuable as residential subdivisions, and previously remote rustic areas have been developed into recreation communities or suburbs. Building and zoning regulations restrict what property owners can and cannot do with their land. Early to mid 20th century industrial manufacturers favored rural or suburban locations with the space for industrial production; 21st century providers of services and digital products - that is, what the American economy has been shifting toward for the last 40 years - favor densely populated urban centers. History classes emphasize the high cost - war, disease, resettlement - borne by indigenous peoples at the hands of European settlers. Activities like camping are now either cheap vacation choices (ever notice how many private campgrounds there are near floating casinos?), or some combination of tests of strength balanced with stewardship.

Land is no longer freedom. Renting is freedom: renting allows you to have many different living experiences and gives you the freedom to change your living accommodation based on your lifestyle, rather than having your lifestyle dictated by the land. Land is red tape and well-rehearsed ceremonies to purchase plots or pull permits; it ties you down to a mortgage and a location.

The changing perception of land also reflects the fact that shelter and sustenance - things directly related to land ownership and management - are problems long ago solved on a mass scale. There is little value in rehashing them again and again on an individual basis, when we could use our life's energy to solve the next wave of challenges, from sustainability to space exploration.

There's been a similar change in attitude toward another symbol of independence, the automobile. Because it was a way to get away ("freedom of the open road") or quite literally a means of getaway (think John Dillinger, or Bonnie and Clyde), the automobile captured the American imagination. But the automobile has changed from a symbol of freedom and possibility to one of captivity (monthly payments) and inescapable utility (suburban communities aren't designed with walking in mind, and suburban mass transit is inefficient). The car that sits idle in the garage nearly 99% of the time isn't untapped potential, it's a tax on modern living.

The things which were the physical incarnation of freedom for prior generations have become symbols of economic entrapment to newer ones. Per the previous post, technology enabling things like the "sharing economy" aren't leading change as much as they're riding the changing wave of sentiment.

This wave has a long way to go before it crests. The shift in attitudes toward land and transportation portends a change in asset ownership and associated activities like lending and insurance that we've long taken for granted. That doesn't mean a concentration of assets in the hands of a few owners: technologies like blockchain make it easier to fractionalize ownership. This will allow people to invest in small fractions of many residential properties bundled into a single instrument, and do so incrementally over a long period of time. In essence, they would live in a rented house, but own small fractions of many others. Just as people have shown a preference for passive over active investing, future generations may find it appealing to be able to invest in residential real estate without the need to mortgage future cash flows for a specific spot of dirt in an asset appreciation lottery.
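To make the fractional-ownership idea concrete, here's a toy sketch - the unit names, property values and stake sizes are all invented for illustration, and a real instrument would be far more involved. Holding stakes as integer basis points keeps the arithmetic exact:

```python
# Hypothetical bundle of residential properties and one investor's stakes.
# Stakes are held in basis points (1 bp = 0.01% ownership) so the math
# stays in integers, as a real fractionalized instrument would.
properties = {"unit_a": 300_000, "unit_b": 450_000, "unit_c": 275_000}

# Accumulated incrementally over time, a little with each paycheck.
stakes_bp = {"unit_a": 40, "unit_b": 20, "unit_c": 100}

portfolio_value = sum(properties[p] * bp // 10_000 for p, bp in stakes_bp.items())
print(portfolio_value)  # 1200 + 900 + 2750 = 4850
```

The investor lives in a rented house, yet holds $4,850 of exposure spread across three properties instead of a mortgage on one.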

Of course, that's all "some day". But change is afoot, and the stage is set for still more change that goes beyond assets, to the nature of labor itself. We'll look at that in the next post.

Thursday, March 31, 2016

The Times, They Have A-Changed, Part I

"This technology revolution was not invented by robo-advisers. They have simply noticed, and taken advantage of, a broader and deeper shift towards passive investment through ETFs and index funds."

-- John Gapper, Robots are Better Investors than People

We like to think of "technology revolutions", but as Mr. Gapper points out, revolutions aren't led by technology. The landscape is littered with shuttered technology companies that showed that a thing was possible, but failed to foment a "revolution".

Revolutions happen once we have critical mass of people with different attitudes and behaviors. Consider the changes in investing behaviors referred to above. Once investors realized that actively managed funds charged higher fees but performed no better (and frequently worse) than passively managed funds, they switched. Today, as investors come to realize that financial advisors are charging fees for putting their money in passive funds, they're amenable to robo-advisors that charge lower fees for doing exactly the same thing.
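The arithmetic behind that switch is worth spelling out. A quick sketch - the 7% gross return and the three fee levels are hypothetical round numbers, not market data:

```python
# Compound a $10,000 investment for 30 years at a 7% gross annual return,
# net of three hypothetical fee levels. The only difference is the fee.
def final_value(principal: float, gross_return: float, annual_fee: float, years: int) -> float:
    return principal * (1 + gross_return - annual_fee) ** years

for label, fee in [("active fund, 1.00% fee", 0.0100),
                   ("advisor on passive funds, 0.75% fee", 0.0075),
                   ("robo-advisor, 0.25% fee", 0.0025)]:
    print(f"{label}: ${final_value(10_000, 0.07, fee, 30):,.0f}")
```

On identical gross performance, the lowest-fee path finishes roughly $13,500 ahead of the highest over 30 years - a gap large enough to change behavior once investors noticed it.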

A change from human advisor to robo-investing won't happen at the pace set by technology, however. It took a long time for investors to change their preference from active to passive funds. Index funds first appeared in the 1970s, as did research showing that active funds didn't consistently outperform the broader market. Yet it took decades for investors to invest differently.

Why did it take so long? Attitudes, prejudices and predispositions take a long time to change. Sometimes they never change: those who hold a belief may never let it go, even in the face of overwhelming evidence to the contrary. And people who are financially, politically or emotionally invested in something will fight to preserve it. Active fund managers initially derided passive funds, while today, facing massive capital outflows, they're fighting for survival. Those who stand to lose from change will also fight back with marketing designed to co-opt the next generation, such as the way manufacturers of sweet fizzy drinks play to an older generation's nostalgia while encouraging them to create a sentimental moment - via their product - with a grandchild.

No matter how much technology we throw at something, entrenched behaviors don't start to change until a generation comes along that isn't as emotionally committed to them. And that still only describes how change starts, not how it finishes - and it can take a very long time to run its course. To understand the dynamics of change, we need to look at both ends of the generational spectrum: as people enter, people also leave. This is most obvious in the workforce where, in general, people join around age 18 and leave around age 65.

The United States has just completed a major generational shift, not so much for the one arriving as the one that has recently left.

The Great Depression and the Second World War shaped American domestic and foreign policy well into the 1990s. Yet as long ago as 1965, America had reared a generation without any direct experience of either, making it less constrained by the values held by the people who came before. And, starting in 1965, a generation began to arrive born to parents who had themselves been raised after the Depression and the war. Prior to 1945, nearly everybody had direct experience of the privations of one or both. After 1965, generations grew up in households where those two seminal events were things their grandparents told them about from time to time, and which they only briefly studied in high school history classes.

Despite the social upheaval that coincided with the maturation of this post-depression-and-war generation (the late 1960s), the value system of the pre-1965 generation dominated American society and the American workplace, first through sheer numbers (those who held it made up the bulk of the working population) and later through positions of seniority (older people were more likely to hold executive positions).

The numbers are now vastly different. People born in 1945 reached age 65 in 2010. There are very few in the American workforce with a direct experience of life during WWII, let alone the Great Depression. Nor are there too many who are just one generation removed from those events (that is, grew up in households directly influenced by them); those who are one-generation-removed will largely exit the American workforce by 2030.

It's no coincidence that we've seen more change in the last decade than we arguably did in the three previous decades combined. But not so much because of a new generation arriving, bringing with it new expectations and demands, as much as the old generations leaving and relinquishing the top rung of authority and social influence. Out of numbers and out of power, those value systems no longer hold sway. Although we live and work in a "post-1965" world today, it took over 40 years - two additional generations - for that to happen.

Because change is a function of society more than technology, it's slow in coming but swift once it arrives. And, while a lot of change happened with the completion of the pre- to post-1965 shift (at least, in the workforce), the stage is set for still more revolutionary change. We'll look at specific examples in the next post.

Monday, February 29, 2016

How an Operational Gap Becomes a Generation Gap Becomes a Valuation Gap

A decade or so ago, when an IT organization (captive or company) hit rock bottom - bloated payroll, lost confidence and ruptured trust resulting from rampant defects, rocky deployments, functional misfits, and long delivery cycles - it would give anything a try, even that crazy Agile stuff. It didn't matter if it was incumbent management with their backs against the wall, or new management brought in to clean house: desperate times called for desperate measures. To people looking to shake things up, Agile made intuitive sense and a lot of it was appealing, even if its proponents had a bit too much evangelical zeal and some of it sounded a little strange. So they'd bite the bullet and do it, selectively anyway: a build server was easy to provision, releasing software for testing every couple of weeks was done easily enough, and getting everybody in the same physical workspace (or at least in close proximity to one another) could be done with a little arm-twisting. Of course, developers were instructed to only pair program "opportunistically", and automating QA was a fight nobody wanted to fight, so testers would sit with the team and test incrementally but go on testing the way they always had. Still, even with compromises, there was a good story to tell, usually to do with much higher quality and eliminating rework, and that made Agile A Good Thing for all parties concerned.

Fast forward to today, and we see that Agile is much more ambitious. A few short years ago we were content to have an automated build execute every few minutes; today we want every check-in to trigger a build that is promoted through progressively more complex tests in virtual environments managed, instantiated and torn down by scripts. We used to be content releasing for user testing every other week, and to production every couple of months; we now aspire to release to production many times a day. We used to want Master Story Lists to guide incremental development; today we want to iteratively experiment through code and have the feedback inform requirements definition and prioritization. We used to be satisfied with better delivery of internally-facing software; today we want to evolve software products that are used by people across our ecosystem, from interested parties to customers to investors to employees. Today, Agile wants to push against everything that creates an artificial or temporal constraint, be it organization, management, accounting policy, or even capital structure.
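The check-in-to-production ambition described above can be sketched schematically; the stage names and checks below are illustrative placeholders, not any particular CI tool's API:

```python
# Each check-in is promoted through progressively more demanding stages;
# promotion stops at the first failure, and success means a production deploy.
from typing import Callable, List, Tuple

def promote(commit: str, stages: List[Tuple[str, Callable[[str], bool]]]) -> bool:
    for name, check in stages:
        if not check(commit):
            print(f"{commit}: stopped at '{name}'")
            return False
        print(f"{commit}: passed '{name}'")
    return True  # reached production

# Placeholder checks standing in for real builds, test suites, and the
# scripts that instantiate and tear down virtual environments.
pipeline = [
    ("compile and unit tests", lambda c: True),
    ("integration tests in a scripted environment", lambda c: True),
    ("performance and security tests", lambda c: True),
    ("deploy to production", lambda c: True),
]

promote("commit-abc123", pipeline)
```

The point of the structure is that deployment frequency is limited only by how fast the stages run, not by a release calendar.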

Although Agile has evolved, the entire tech world hasn't moved with it. In fact, some of it hasn't moved at all: it's still common to see non-Agile organizations that do big up-front design; work in functional and skill silos; have manual builds, deployments and testing; and make "big bang" releases. And, it's still common for them to face a "rock bottom" moment where they conclude maybe it's finally time to look into this Agile stuff.

As hard as it was a decade ago to inject Agile into a non-Agile organization, it's much harder today for a non-Agile organization to complete a transformation. This seems counterintuitive: the non-Agile-to-Agile path is so well trod that it should be much easier than it was in those pioneering days of yore. But although there have never been more tools, frameworks, languages, books, blogs, and other resources available to the individual practitioner aspiring to work differently, organizational change tends not to be self-directed. The challenge isn't taking an organization through the same well-established game plan, it's finding the people - the transformational leaders - willing to shepherd it through its journey.

Unfortunately, re-living the same internal negotiations to reach the same compromises, and solving technical and organizational problems long ago solved, only to end up with an operating model far short of today's state of practice, is not a destination opportunity for an experienced change leader. Even assuming, as my colleague Alan Fiddes pointed out, that the change agents brought in still have the vocabulary to carry on arguments last fought so long ago, any change agent worth their salt isn't going to reset their career clock back a decade, no matter the financial inducement.

This might simply mean that the approach to change itself has to change: require far less shepherding from without, and expect more self-directed change from within, brought about by setting the right goals, creating the right incentives (motivating people) and measuring the right things (what gets measured gets managed). Why shouldn't it be self-directed? It isn't unreasonable to expect people in a line of work as dynamic as software development to keep their skills sharp and their practices current. For people leading an organization that's a little dated in how it develops software, then, the question to put to people isn't "why aren't you doing Agile?" but "we're going to deploy any time and all the time effective fiscal Q3, so how are you going to operate to support that?" It's amazing what people will do when sufficiently motivated, change agents be damned.

Whether there's a more effective means of change or not, being far adrift of the state of practice points to a more severe threat to the business as a whole: a generation gap.

* * *

Three decades ago, the state of practice didn't vary that much across companies. Yes, there were people coding C over Rdb deployed in VMS on minicomputers and people coding COBOL over IMS deployed in OS/390 on mainframes, but the practice landscape wasn't especially diverse: waterfall prevailed and a lot of code was still data-crunching logic run in batch. At the time, captive IT, consulting firms, governments, new tech firms (think Microsoft in the mid-80s), and established tech stalwarts (H-P, IBM) could reasonably expect to compete for the same labor. College grads in Computer Science or Management Information Systems learned practices that reinforced the modus operandi common to all consumers of business computing.

The landscape has changed. Practices are far less homogeneous, as they've had to evolve to accommodate a diverse community of interactive users pushing for features and functionality with little tolerance for failure. The familiar combatants still slug it out for labor, but must now also compete against tech product firms untethered to any legacy practices, norms, policies or technologies. Today's grads are trained in new practices and expect their employer to practice them, too.

Companies lagging behind in their state of practice will struggle to compete for newly minted labor: why would somebody with highly marketable tech skills go to work at a place stuck in the past, when they can work in a more current - even cutting edge - environment?

This isn't just a hiring problem. A practice gap is fuel for a generation gap if it deflects young, skilled people from becoming employees. By failing to hire the next generation employee, a company develops an intrinsic inability to understand its next generation customer.

A company isn't going to reach a new generation of buyer - consumer or business - if it is tone deaf to them. A company ignorant of the next generation's motivations, desires, values and expectations has little chance of recognizing what it isn't doing to win their attention, let alone their business. Since social systems are self-reinforcing, a company is unlikely to break the deadlock of ignorance and inaction.

Failing to bridge a generation gap not only cuts a business off from growth opportunities, it sets the stage for long-term irrelevance. Investors recognize this, even when management does not. Growth changes from being a "risk" to being an "uncertainty", and when that happens a company's future[1] is no longer priced at a premium, but a discount. In this way, an operational gap becomes a generation gap becomes a valuation gap.

Outdated practices are an indicator that management has its head buried in the sand: it has a problem it can't see and doesn't know how to solve, and it is starved for information because it has elected to disassociate itself from the source. The motivation to change how you practice shouldn't be to become more competitive today, but to still be relevant tomorrow.



[1] By way of example, Yahoo net of its Alibaba holding and cash has frequently been valued at or near $0 by investors in recent years.

Sunday, January 31, 2016

Are Microservices to Ecosystems as Core Competencies were to Conglomerates?

As far back as the 19th century, industrial firms pursued vertical integration strategies. The thinking was that by owning the supply chain from raw materials to retail outlets, a firm had direct control over its entire cost structure, making it better able to squeeze efficiencies out of it and being less susceptible to supply shocks. This was important because, for large industrial firms, competing on price was the primary strategy for winning market share.

During the 1950s and '60s, companies also pursued conglomerate strategies: bringing seemingly unrelated businesses under one roof, sometimes seeking synergies (as Sears did owning a retail merchandiser and retail brokerage - "buy your stocks where you buy your socks"), and sometimes not (as LTV did owning a steel company and an airline). The rationale for the conglomerate was entirely financial: cheap (mostly debt) capital allowed large companies to grow through acquisition, and regulators were less likely to block acquisitions of unrelated firms on monopolistic grounds.

By the 1980s, both strategies had begun to lose favor. The financial benefit had evaporated: high interest rates clobbered the profits of debt-fueled acquisitions and forced divestiture. But the operating benefits weren't there, either. Different types of businesses (manufacturing, distribution, retail) require different types of leadership and have very different cultures. And, within each of those businesses, some functions are differentiating (such as fleet optimization for a logistics company) while some functions are not (nobody beats their competitors by having a superior accounting back office). Management thinking embraced "core competencies": own and hone the things that differentiate, rent and commoditize the things that do not. This also allowed for better matching of capital with company: the risks and returns from a company that owns a fleet of railcars are easier to assess than the risks and returns from a company that owns ore mines, railcars, and foundries. When conglomerates are broken up, the individual investor can choose what types of businesses to expose their capital to (a raw materials company, or an asset company, or a refining company), and the pricing of that capital more accurately reflects the risks.

Tech firms today are embracing the "vertical integration" and "conglomerate" strategies of yore by positioning themselves as "platform" and "ecosystem" companies. The thinking is that by combining multiple and diverse capabilities into a single offering, a company creates both efficiencies and synergistic value for counterparties in some activity, such as crowdsource funding or payments. The ecosystem strategy takes this even further, combining unrelated capabilities under one roof (eBay buying Skype in 2005, SAP developing HANA in 2010, Facebook buying Oculus in 2014), often justifiable if only because digital commerce is still in its infancy and nobody is really sure what's going to work and what's not.

But what if you could extract utility-like functionality from within an ecosystem into an independent company? Take payroll as an example: rather than have every Human Resources platform company need its own team of people to write and maintain state and federal withholding rules, hive those off into an independent business that makes them available as a metered service offering, charging a tiny usage tax (say, $0.001) each time it's invoked. The technology to do this is certainly available: code the algorithms as a microservice or a blockchain smart contract, and deploy them in an elastic cloud environment (as usage will tend to spike with pay cycles).
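A sketch of what such a metered service might look like - the 12% flat rate and the per-call fee are stand-ins, since real withholding logic is a large, frequently versioned rule set:

```python
# Minimal sketch of a metered withholding utility: the subscriber pays a
# tiny fee per invocation rather than maintaining the rules themselves.
class WithholdingService:
    FEE_PER_CALL = 0.001  # the "tiny usage tax" charged on each invocation

    def __init__(self) -> None:
        self.billable_calls = 0

    def federal_withholding(self, gross_pay: float) -> float:
        """Return withholding for one pay period (placeholder flat rate)."""
        self.billable_calls += 1
        return round(gross_pay * 0.12, 2)  # hypothetical 12% flat rule

    def invoice(self) -> float:
        """Amount owed by the subscribing HR platform this billing cycle."""
        return self.billable_calls * self.FEE_PER_CALL

svc = WithholdingService()
for gross in [2500.00, 3100.00, 1800.00]:
    svc.federal_withholding(gross)
print(round(svc.invoice(), 6))  # three calls at $0.001 each
```

The revenue line is pure volume: it scales with pay cycles and headcount across every subscriber, which is what gives it the utility-like character described above.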

To the HR platform company, there are a lot of good reasons to do this. It monetizes a non-value-generative activity (nobody subscribes to a payroll system because its implementation of the withholding rules is better than everybody else's). It throws off utility-like revenue streams that move in lockstep with the broader job market. It disaggregates a staid HR utility function that needs to be deployed infrequently (potentially only as often as once per year, when new tax laws come into effect) from more innovative ones that are more exploratory in nature and demand frequent deployments (such as time entry performed through emerging wearable tech). It separates a legacy and risk-averse tech culture from a cutting-edge risk-prone one. It takes a high fixed cost for maintaining a non-differentiating characteristic off the P&L (teams maintaining rule-heavy legacy code are rarely inexpensive). Its stable cash flows would be attractive to debt financing, better aligning capital with HR technology investment. It removes asymmetric risk that can be more accurately insured (smothered in a platform, correct calculations offer no financial upside, while a faulty calculation exposes it to costly litigation and reputational damage).

True, it eliminates a barrier to entry for future competitors. And, while the market would support a handful of utilities to prevent monopoly, thin competition would give those utilities oligopolistic pricing power. But it creates a one-time financial windfall for the first movers, laggards would be pressured to subscribe by shareholders demanding the same structural benefits to the income statement, and low switching costs would keep utility pricing power in check.

Given that tech is in a period of both cheap capital (interest rates remain low, VC levels remain high, and companies such as Alibaba and Facebook can make acquisitions inexpensively with their own high-priced shares) and rapid growth (growth in consumption of tech products such as social media today mirrors growth in consumption of manufactured goods in the 50s and 60s), it's little surprise that we're seeing a return to industrial strategies past. But technologies like microservices and blockchain could be the modern equivalent of "core competencies", sweeping through businesses. Blockchain proponents already champion the potential of decentralized autonomous organizations (DAOs). With MBAs now eschewing investment banking in favor of tech companies, financial engineering of this nature isn't too far away.

Thursday, December 31, 2015


Earlier this year, my house should have burned to the ground. A CR2032 battery exploded and caught fire in a confined place dense with flammable objects. But my house didn't burn down: at the moment the battery exploded, I was sitting a few feet away from it. I heard a loud bang, investigated, and stamped out the fire within a few seconds.

I wasn't planning to be there at the time. A series of minor reschedules and reprioritizations had me sitting in my home office at the moment the battery exploded. I happened to be in the right place at the right time to prevent something from happening. It was business as usual for the rest of the day.

There is no law or reason governing why I was there at the moment the battery exploded, or for the battery to have exploded at precisely that moment. What if my schedule isn't rearranged and I'm not home when the battery explodes? What if it happens hours earlier or later, when everybody in the house is asleep? I can't help but think that there are more quantum realities where something bad happened than there are where something bad did not.

But that isn't necessarily true. The battery could have exploded at a time when nobody was around to put out the fire, but the ensuing fire may have simply died out, causing no damage. And maybe the chemical process that triggered the explosion was itself an anomaly of circumstances set in motion by my presence in the room at that particular moment. I have to be content with the only certainty I have, which is that a battery exploded and nothing bad happened.

Counterfactuals are messy because they're not provable. Strategy and investing, execution and operations are loaded with counterfactuals. They're each similarly messy because we don't really know what would have happened had we chosen another course of action. Conventional wisdom says that Kodak should have bet on the digital technology it invented, and not dithered around with its legacy analogue business. But what if Kodak had invested heavily in digital, and its distribution partners had made good on their threats to shift the lucrative film processing business to Fuji? Would Kodak's bet on the future have imploded for being starved of cash flow? Hindsight isn't 20/20, it's narrative fallacy.

Just as it's difficult to have retrospective strategic certainty, it's next to impossible to forecast the outcomes of alternative paths, decisions or circumstances. It sounds simple enough: invest in the highest-return opportunities, and consistently assess the performance of and the risks to delivery. But we're forced to interpret and project from incomplete and imperfect information. Our plans and forecasts are just as likely to be narrative fallacy as fact-based assessments. Microsoft paid dearly for Skype, but we can't know what might have been had it been acquired by Google or Facebook - and the possibilities may very well have justified the premium Microsoft paid.

The landscape of strategic software investing and governance is riddled with speculation. As much as we would like to think we've made the best decision available to us, or dodged a bullet, we rarely know with certainty. No matter how formulaic we're told portfolio management or delivery operations can be, they will never be as optimal or predictable as promised, because business is imprecise and untidy.

Monday, November 30, 2015

Corporate Middle Management as an Autopoietic System

[T]he aim of such systems is ultimately to produce themselves: their own organization and identity is their most important product.

-- Gareth Morgan, Images of Organization, p. 236.

In the early 1970s, biologists Humberto Maturana and Francisco Varela coined the term autopoiesis to describe the self-maintaining nature of living cells: biological cells produce the components that maintain the structure that creates more components (in this case, more cells). This is in contrast to allopoietic systems, which use components (raw materials such as silicon and plastic) to produce things (mobile phones and computers) that are distinct from the thing that created them (the factory where they are made).

In the mid-1980s, Gareth Morgan applied the concept of autopoiesis to organizational study.

They do so by engaging in circular patterns of interaction whereby change in one element of the system is coupled with changes elsewhere, setting up continuous patterns of interaction that are always self-referential. They are self-referential because a system cannot enter into interactions that are not specified in the pattern of relations that define its organization.

-- Ibid, p. 236

A few months ago, I described the organizational pathologies that we see in slow-growth, (mostly) equity-funded firms: because such a firm faces no real pressure (no credit rating to support, no competitive threat to revenue), it will suffer from operations bloat. A significant source of that bloat is a large middle management.

Left unchecked, tech organizations will grow vertically around line activities (software products that support business functions) and horizontally around shared services (testing, infrastructure). They will also establish one or more program management offices to navigate delivery of complex business initiatives across the fractured organizational landscape. For every management expansion, there is an equal and opposite hiring spree. The host business will mirror tech's management structures, creating business product managers opposite technology line managers, and a business PMO responsible for business change management functions opposite the tech PMO responsible for delivery of tech assets. This management sprawl happens for a variety of reasons: people are promoted into management for fear they might quit; IT gets burned by a delivery failure and creates new hierarchy in response; a senior business manager doesn't want to be seen as mapping to a low level of the tech organization; a new boss prefers to delegate rather than get his hands dirty with the details; there is a low level of trust between tech and business, and matching staff one-to-one is how equilibrium is maintained; and so forth.

Middle management is not a value-generative function. Because they are not engineers or analysts, middle managers don't directly contribute to solution development. Instead, they negotiate on behalf of their sphere of responsibility with other middle managers. They create documentation templates, project control forms, release and implementation workflows, and program checklists to form contracts with other managers that secure the time and attention of the people who do contribute to solution development. These contracts implicitly protect every middle manager by sharing responsibility (my work was dependent on somebody else who failed to deliver) and deflecting responsibility (another initiative took priority so we had no choice but to let this other deadline slip). The web of contracts allows middle management to self-perpetuate.

[W]e can describe autopoietic systems as those producing more of their own complexity than the one produced by their environment.

-- Carlos Gershenson, Requisite Variety, Autopoiesis, and Self-Organization.

It also serves as the fuel for growth: perpetual negotiation spurs middle management to expand its library of templates, forms, workflows and checklists. That, in turn, adds to the structure, because it requires more middle managers to fill out more program documentation. For example, a few years ago, the logical database storage for a legacy asset was overwhelmed when a new type of transaction was implemented; since then, the infrastructure department requires every new initiative to complete a "long-term storage analysis forecast". But, some initiatives don't generate many transactions at all and will have little impact on storage allocated to any asset. The managers in those initiatives don't have to fill out a "long-term storage analysis forecast", but must still fill out a "storage analysis forecast exemption" form to document why management concluded the forecast document wasn't necessary.

In this way, middle management is autopoietic: fed by a flow of documentation, it creates components (middle managers) that maintain the structure (the bloated middle management) that creates new components (more middle managers).

* * *

[T]he brain does not process information from an environment, and does not represent the environment in memory. Rather, it establishes and assigns patterns of variation and points of reference as expressions of its own mode of organization. The system thus organizes its environment as part of itself. If one thinks about it, the idea that the brain can make representations of its environment presumes some external point of reference from which it is possible to judge the degree of correspondence between the representation and the reality. This implicitly presumes that the brain must have a capacity to see and understand its world from a point of reference outside itself. Clearly this cannot be so[.]

-- Morgan, pages 237-8.

Suppose we want to introduce Agile into an organization because we want delivery to be more efficient and effective, and we want a better relationship between business and technology. One way we think we can do that is by simplifying our management processes and making them more collaborative, and Agile appears to offer us a means of doing that.

If we have a large middle management function, we can't expect that Agile will simplify our requirements, development, release, or change management activities. What we should expect is that Agile will get co-opted by the very structures that it is there to disrupt. A middle manager cannot comprehend Agile as a different means to an end, because the only end a middle manager is pursuing is successful contract negotiation with other middle managers. A release plan becomes a closed-ended project plan, Stories in an iteration become a commitment, tasks for each Story become the coin of negotiation with other managers for their "resources". Adopting Agile - everything from adaptive planning to continuous delivery - requires a level of abstract thinking about why we do the things we do and how they lead to a delivery outcome that will be well beyond that of an incumbent middle management. Any middle manager capable of abstract thinking will have left the organization long ago: survival requires concrete thinking within a very narrow scope of self-referential activity.

When we recognize that identity involves the maintenance of a recurring set of relations, we quickly see that the problem of change hinges on the way systems deal with variations that influence their current mode of operation. Our attention is thus drawn to system processes that try to maintain identity by ignoring or counteracting threatening fluctuations, and to the way variations can lead to the emergence of new modes of organization.

-- Ibid, p. 239.

For a large middle management, Agile is not a welcome change. If we have business and tech people working together daily rather than having a temporally shifted conversation through documentation, if we have technology generalists rather than specialists, if we capture knowledge in automated tests and ops scripts, we need far less intermediation in the delivery process. This obviates the need for an expansive middle management function.

An autopoietic system is capable of autoimmune responses. Co-opting, described above, is one. Ignoring is another: enough people refusing to change can force management to re-think its commitment to that change. Subverting is a third: creating obstacles and impediments that sow uncertainty and doubt about the change's effectiveness, all in the service of reinforcing middle management's identity. Autoimmune forces are powerful: a function that exists solely for its own perpetuation - even when not by charter, but as a matter of social contract among its members - will become shrill in its own defense.

The policeman is not here to create disorder. The policeman is here to preserve disorder.

-- Richard J. Daley, 48th Mayor of Chicago

What does change look like under these circumstances?

Clearly, it isn't willing acceptance by the incumbents. We can't expect actors in a system to accept change that results in the destruction of that system. As Upton Sinclair famously wrote, "it is difficult to get a man to understand something, when his salary depends upon his not understanding it."

By definition, change of an autopoietic system must be triggered internally, and happen as a result of randomness. Morgan argues that "random variation provides the seed of possibility that allows the emergence and evolution of new system identities." Random changes create the possibility of new relations, which, if they're not absorbed or stifled by other parts of the system, can lead to new identities. Morgan adds that "Human ideas and practices seem to develop in a similar manner, exerting a major transformational effect once they acquire a critical level of support." Nassim Taleb makes the same argument in his book, Fooled by Randomness.

The corporate change leader doesn't have the luxury of time for the forces of randomness to reform an entrenched middle management. There are two policies that can accelerate structural change: disallowing self-referential justification for middle management practices (that is, expanding its scope of reference so that every action must be justified by a delivery outcome, not a middle management negotiation), and aggressively dismantling the middle management structures themselves. The former, if not compromised, puts an incumbent invested in the status quo on the defensive and robs it of its raison d'être. It also helps to identify the people in middle management who are reformable and coachable. The latter reduces the need for the former.

Bad events in organisations are generally the product of bad systems rather than bad people ... [W]e need to go on and ask what it is about modern corporate life that has made such misbehavior not only possible but appear increasingly common.

-- John Kay, Organisations advance by asking "what went wrong" rather than "who is to blame"

It's wishful thinking to believe that an incumbent middle management will "see the light" once introduced to a different set of practices, mechanics and tools. But the broader corporate reality tells us otherwise. When we introduce change, we quickly come into direct conflict with a self-referential ecosystem that, despite obvious internal contradictions and shortcomings, has an extraordinarily strong survival instinct. We also discover latent, institutionalized corporate misanthropy directed at users, customers, suppliers and business partners. A change toward Agile, and the value system it represents, is less enabling, and more threatening, than we'd like to think. To be successful as change agents, we have to dismantle the structures, processes and people behind the status quo while simultaneously replacing them with a new normal.

Saturday, October 31, 2015

Potential and Motivation

When a management with a reputation for brilliance tackles a business with a reputation for bad economics, it is the reputation of the business that remains intact.

-- Warren Buffett

For a business to have "potential", it needs opportunity, money, willingness, talent, and aptitude. Yes, a business without all of these things still has potential: it might be poorly funded but have knowledge-acquisitive people and a clear opportunity; it may have weak capability but good cash flow. But somebody agreeing to lead or acquire a business because of its "potential" still needs to compensate for deficiencies in any of these areas.

Potential needs a wake-up call if it is to be realized. Sometimes, our source of motivation is external: new competitors, or a threat to our business model. It can be internal: an activist investor on the board forces the CEO to accept new operating targets.

Potential and motivation are both the purview of leadership. A good leader creates potential where there is none: putting someone in a stretch role, freeing capital for reinvestment through a restructuring, diversifying the product line by acquiring a business, hiring a strong product team to engineer and market new offerings. A good leader motivates with incentives and rewards, culture and risk-taking: define business objectives that require cleverness and innovation and not just operational efficiency; flatten and simplify the organization, distribute authority and responsibility; recognize team behaviors over individual heroism.

The intersection of motivation and potential is never as far along each axis as we hope, because both are constrained by our ambition (or lack thereof), and by fear. We can't imagine things any different so we fail to define an opportunity. We see competitive threats developing but we lack the will to act. We are reluctant to make difficult staff cuts to free up money. We don't recognize the outdated skills of our people, the absence of abstract thinking, the dearth of current technologies and practices, so we do nothing to upgrade our talent base. We're afraid of offending our current employees by changing the compensation structure. We are more comfortable micromanaging, demanding commitment, and promoting people through hierarchy than we are giving people autonomy and rewarding them for innovation.

Worst of all is when we simply talk ourselves down: of course we want to get to such-and-such state someday, but we can't possibly make so much change so quickly. This amounts to regulatory capture of a leader's ambition - and by extension, of a business' potential - by the business itself.

The difference between a change leader and a caretaker isn't lofty vision or inspiring words; it's the ability to create maximum potential within the organization, and the motivators that drive people's behaviors and actions to realize it.