I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Saturday, December 31, 2016

Myths of Replatforming

Replatforming is all the rage these days.

Platforms are conceptually popular with investors: in theory, a platform makes the mundane portions of a business efficient, scalable and adaptable, allowing a company to release the creative talents of its people to pursue growth and innovation. Replatforming makes for a convincing story following an acquisition because it explains to investors how deal synergies will be achieved, sets a tone of equivalency among employees of both the acquired and acquiring firm, describes a vehicle through which the company will reinvent itself with modern technology and practices, and conveys pragmatism in the need to clean house of dilapidated infrastructure. The replatformed business promises innovation at scale and larger cash flows from operations that, combined, position it to grow both organically and through acquisition. Destiny awaits.

In practice, replatforming is messy. Core systems are a complex web of integrated rules and functions spun over generations. They're difficult to disentangle because we have all the wrong people: most of those with the business understanding that went into building those systems have long since left, while the architects and senior engineers who created the outer layers - the web and mobile components - are still on the payroll. Replatforming initiatives become a black hole almost from the start because people don't know how to "eat the elephant": their ways of working are outdated, they struggle to come to terms with the totality of what needs to change, and they can't envision a future that is much different from the present.

I've had a front row seat to a number of replatforming initiatives over the years, and I've seen several myths that plague them from the inside.

We'll build the platform first, then change the organization once it's live. Employees will see replatforming as an exercise in re-creating software. The existing systems have their shortcomings, but we run a multi-billion dollar business on that code, so we must be pretty good at software. Let's stick to what we know to create the software - because that's what the business needs, right? - and then we'll change process and organization once it's up and running. The fatal flaw in this line of thinking is Conway's Law: software mirrors the communication patterns of the organization that develops it. When the goal of replatforming is to rapidly innovate at scale (as it is usually alleged to be), you have to start with cross-functional, integrated teams with devolved authority that can autonomously deliver end-to-end solutions. Coming from traditional organizational structures of hierarchy, shared services and specialists, that's a lot of change. It creates confusion and disorder that just about everybody is uncomfortable with. It also makes for a slow start on developing the software, which makes middle managers nervous; the more nervous they get, the more inclined they'll be to abandon organizational change. An over-managed, hierarchical, siloed organization will develop over-engineered, tightly-coupled, brittle software; no after-the-fact restructure will compensate for that because you're bound by Conway's Law. As the saying goes, 'tis always best to begin as one intends to proceed. Replatforming doesn't succeed unless organizational change precedes.

We need a product organization. As business operations become more encoded in algorithm and less executed through manual labor, we need long-term stewardship of our software from both our business and engineering communities. The popular way of doing this is by creating a product organization. Don't bother doing that if you can only define their responsibilities in a self-referential manner (e.g., "the product owner owns the product"), if you disproportionately define the scope of their responsibilities as user experience, if they have no direct accountability for how their outcomes impact the line of the business, and if you're substituting systems knowledge (how things work) for business knowledge (why things work). Not only is this not value-generative, it adds a layer of decision-making intermediation and creates ambiguity in responsibilities for requirements, analysis, design, and prioritization. It's made much worse when product managers are alleged to have authority, but have their decisions reversed by higher-ups or invalidated by people with stronger business knowledge. Per the prior paragraph, we need to create the organization that will both create and perpetuate the platform. Creating a mature product organization is hard. Better to encapsulate product responsibility into the line of the business (where it arguably should be in the first place, and not adjunct to it, but that's another blog for another day) than to create a product organization stuck in perpetual adolescence. The latter will result in systemic distortions in the code, courtesy of Conway's Law.

We need to put all business requests on hold for the next [long period of time] while we replatform. The older a company is and the more acquisitions it has been involved with, the more far-reaching and complex its core systems will be. The more depleted it is of business knowledge - the why, not the how - the more mysterious those systems will be. Employees will be predisposed to define a new platform through the lens of feature parity with the old. The lower the caliber of business knowledge (that is, the more that system knowledge substitutes for business knowledge), the higher the degree of feature parity that employees will insist defines the minimum viable product - a position reinforced by traditional software development methods that released software once every few months, not days or hours. Additionally, ambitious replatforming efforts lay bare deficiencies in organization, skills, capability, knowledge, process, and infrastructure. Changing those takes time. These two points are conjoined to form the mistaken belief that business needs will have to take a back seat while people figure all this stuff out - and that they would have to be deferred regardless, because there's no way to go live with a partial solution, and we can't very well pursue a moving target. To destroy this myth, start with that "long period of time" the business is being asked to wait. It always seems to be on the order of a year to a year and a half. At the very least double it, because nobody's estimate for something they've never done before is going to be particularly accurate. Think your customers will wait that long - two to three years - for you to get your act together? At best, a fork-lift replacement will get you tomorrow where you needed to be yesterday. From an engineering perspective, progressive strangulation of incumbent systems is almost always a question of will, not possibility. From a business perspective, progressive strangulation is a question of personas and user journeys, not features and functionality.
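
To make the engineering point concrete, here is a minimal sketch - not a prescription - of what progressive strangulation can look like: a thin routing facade that sends migrated user journeys to the new platform while everything else continues to be served by the incumbent system. The journey names and endpoints below are hypothetical, and a real facade would more likely live in an API gateway or edge proxy than in a standalone script.

    # A minimal, hypothetical sketch of a "strangler" routing facade.
    # The endpoints and journey names are illustrative, not real systems.

    NEW_PLATFORM = "https://new-platform.internal"   # hypothetical new platform
    LEGACY_SYSTEM = "https://legacy-core.internal"   # hypothetical incumbent system

    # Journeys are cut over one at a time; everything else stays on the incumbent.
    MIGRATED_JOURNEYS = {
        "quote-and-bind",
        "policy-renewal",
    }

    def route(journey: str) -> str:
        """Return the base URL that should serve a given user journey."""
        return NEW_PLATFORM if journey in MIGRATED_JOURNEYS else LEGACY_SYSTEM

    if __name__ == "__main__":
        # As journeys migrate, entries move into MIGRATED_JOURNEYS and the
        # legacy system is progressively starved of traffic.
        for journey in ("quote-and-bind", "claims-intake", "policy-renewal"):
            print(f"{journey:15} -> {route(journey)}")

The value of the sketch is its shape, not its code: routing by journey lets you go live one persona and journey at a time, instead of waiting for feature parity with the old system.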

We'll build it the way we should have built it if we knew then what we know now. It's tempting to think our best bet is to re-build something with the wisdom our people gained from the experience of building it the first time round. That's a risky bet. It assumes they've drawn the right conclusions from their experiences. That's a lot to hope for, since it requires a lot of introspection, critical analysis, and personal acknowledgement of error - something most people aren't predisposed to do. It also assumes they've kept current with technology and process. Microservices, containerization, elastic cloud, and continuous delivery are significant departures from where software development was just a decade ago. The people who got the company into the mess it's in aren't the people who will get the company out of it. In their hands, you'll get what you've already got, only worse, like multi-generational inbred aristocracy. Replatforming requires forward-thinking technology, ideas, and execution. Changing culture, capability and mind-set requires a major transfusion; be prepared to lose a lot of blood.

Aging infrastructure, cheap capital and a dearth of innovation have fueled consolidation in a variety of industries. Replatforming will be with us for as long as those conditions exist. The myths don't have to be.

Wednesday, November 30, 2016

The Patrol Method and Objections to Self-Directed Agile Teams

In the previous post, we saw there are quite a few similarities between the Patrol method and self-directed Agile teams. It stands to reason that the resistance, doubt and objections faced by each from sponsors, leaders and members alike will be very similar. If that's the case, one can learn from the other.

These excerpts from the 1950s edition of the Scoutmaster's Handbook will sound familiar if you've ever tried to implement a self-directed Agile team:

"Some don't grasp the possibilities of the patrol method, and subsequently don't see the importance of it." If you can't appreciate the adaptability and agility of an organization characterized by strong capability and strong leadership throughout the ranks - one where people are continuously completing, learning, and adjusting to an extent that they perform best with team-level autonomy rather than top-down hierarchy - you won't see the value in a method designed to bring that about.
"Some lack faith in boys' ability to carry out responsibilities." You have to be willing to trust in people's capacity to both learn and make good decisions. Of course, trust is earned and not given, and while it takes time to develop it takes only seconds to erode. But an organization is a lot more efficient when it gets by on trust, because trust requires far less supervision and administration. Again, you have to be committed to what the right organization can deliver, and not make the organization and it's people hostage to the deliverables themselves.
"Some give up if it doesn't function perfectly right off the bat." Self-directed teams are a departure from traditional command-and-control hierarchy. The initial experience - and the initial results - can be very poor if self-direction is adopted swiftly and suddenly: self-direction requires very different behaviors and these can take a long time to develop. Until they do, teams will experience mission confusion as they come to grips with new expectations, interruption caused by external dependencies and boundaries they now have the responsibility to negotiate and manage, and seizure as people struggle to come to terms with responsibility for an entire solution rather than discreet, small tasks. How the team struggles with Tuckman's stages of group development will be mirrored in its results: at best a roller coaster where results are up one iteration then down the next, at worst a flatline where they struggle to get anything across the line. If we're not cognizant of (or better still, actually measuring) the development of new organizational "muscle memory", then the appearance of chaos within the team twined with few deliverables to show will cause the faint of heart to bail before the team gets through forming, norming and storming, to actually performing.
"Others don't like to part with authority. They found a chance to play and show off their specialties and don't realize they're stifling leadership development." Some people responsible for delivery of large programs or simple projects may be more comfortable concentrating authority rather than distributing it. This is an indication that they're more worried about the success of what they're responsible for than they are building up the people who can successfully deliver. While they may be good executors - a safe pair of hands you can trust to get something across the finish line - they're not good organization builders who can sustain what they create. Curiously enough, while self-directed teams are often accused of "not scaling", it is the characteristics common to command-and-control hierarchies - asymmetric knowledge distribution and concentrated decision-making - that don't scale.
"Still others have the old idea of training by mass instruction too ingrained in their systems to change." The best way of learning is to do something and not just study it. But on-the-job training can be very expensive, especially if somebody does, and re-does, and re-re-does, and still doesn't get a basic grasp of what it is they're doing. There is a certain allure to separating skill acquisition from skill application, if for no other reason than we can measure the effort spent on skill acquisition in the form of number of hours spent in classroom training and number of people who earn certifications. Things like training and certifications are proxies for competency: if we earn the certifications we will know what we're doing which will jump-start our execution. This makes people feel good about progress toward plan. But giving training to an individual and that individual demonstrating competency are entirely different things. The prior is useless without the latter.
"Also, some by temperament aren't suited to this way of training and are happier in a system other than the Patrol method." Quite a few people can't function without a strong hierarchy. Asking them to perform in a system defined by collaborative teams of peers who are self-directing themselves toward reaching a goal is an unreasonable expectation, and potentially damaging, to somebody who simply can't function that way.

From this, we can glean the characteristics a leader committing to the formation of self-directed teams needs to have if they are to succeed.

Humility - Self-directed teams are about their members, not their leaders.

Patience - A self-directed team will suffer setbacks and disappointments, particularly in the early going. The point isn't to obsess about failure, but to make sure every member of the team learns from it and doesn't make the same mistakes again.

Faith in people - If you believe people need to be told what to do, rather than trusting that they can figure out what they need to do, you'll not get far with self-directed teams.

Belief in the wisdom of crowds - You have to believe that the whole is greater than the sum of the parts, and that a hive-mind will come up with a better solution through execution and continuous feedback than one dictated to them by a small cabal of people.

The prior two points suggest that you have to espouse a professional rather than an industrial mindset. If you can do that, you have the mindset for self-directed teams.

Commitment to the method - Above all else, you have to believe that if faithfully applied, the method will create the conditions - generalized skills, strong leaders, and independently-functioning units - under which a program is far more likely to succeed than if it is delivered in a command-and-control style. If you lack confidence in the values behind the method, you'll be quick to abandon practicing the method.

"The troop that is run as a club, with the Scoutmaster as boss, dies when the boss leaves." Hierarchies with central decision-making can be effective, but they are brittle because of their dependence on a handful of key people. If your goal is to build a resilient, evolutionary, adaptive organization, the price of admission is decentralization. Decentralization requires empowerment; empowerment requires atomic leadership and capability. The history of organizational development teaches us that the process of building an organization that can function this way is a very difficult one indeed.

Monday, October 31, 2016

The Patrol Method and Self Directed Agile Teams

As part of my research into method earlier this year, I picked up a 1959 edition of the Scoutmaster's Handbook. The core of the philosophy for a Scout troop was what Robert Baden-Powell, the founder of Scouting, called the Patrol method. The Scoutmaster's Handbook and the early editions of the Boy Scouts of America's Boy Scout Handbook were mostly written by the same person, William "Green Bar Bill" Hillcourt, and revised over many years.

The Patrol method Hillcourt described in the Scoutmaster's Handbook was essentially a self-directed Agile team. The key characteristics are:

  • Small team size: patrols are 6 to 8 people. Several patrols form a troop, but patrols are autonomous.
  • Pairing: experienced people teach new people on the team.
  • Continuous feedback: the Court of Honor is "...a peer system in which Scouts discuss each other's behaviors and is part of the self-governing aspect of Scouting."
  • Servant leadership: achieved through an emphasis on service to others (an expectation shared by all), as well as stressing that the highest leadership roles are expected to assist those in the troop to train themselves as opposed to telling them what to do. "A Scoutmaster's job is to help boys grow - by encouraging them to learn for themselves."
  • Hands-on over theory: "No meeting should be inside - all activities should be outdoors".
  • Respond to change: "If the planned program doesn't work, be resourceful. Throw some out, if necessary, to suit conditions."
  • Transparency: "Encourage members of the troop committee to attend regularly."
  • Stakeholder management: "When they come, have something definite for them to do."
  • Chickens and pigs: "Keep visitors on the side lines. Most of the time visitors come to see what is happening. Don't let them interrupt the meeting."
  • Generalize skills by rotating pairs and responsibilities through the duty roster instead of allowing people to specialize in tasks.
  • Tool construction: pioneering techniques forge useful tools from available resources that make you more productive and comfortable.
  • Each team owns the plan: troop goals and patrol objectives are set by members of the patrols themselves, not dictated by the adult leadership.
  • Adaptability in technique: "Fortunately, there is no standard way of planning the program of a troop. A group of robots using a standard pattern in exactly the same fashion would pretty soon kill Scouting. Each troop works out its own way..."
  • A code with positive goals: the Scout Oath and Laws provide a value system for conduct, in much the same way that the Agile Manifesto is a value system for software delivery.

There are many more similarities I could draw out between the Patrol method and Agile teams. The point isn't to suggest that the concept of self-directed teams is influenced by the Boy Scouts - it doesn't matter whether it is or not. Or that there are no new leadership philosophies under the sun - servant leadership concepts are at least 2,500 years old at this point.

The point is to learn from what the people championing that method experienced when they applied it: intransigent doubt that the method can work because it turns leadership responsibility over to the team, or that learning-by-doing is inferior to training by mass instruction (e.g., doesn't provide value for money, or isn't as effective).

If the concepts aren't new, the objections to them aren't, either. The strengths of a self-directed team might be self-evident to the initiated, but they're not an easy sell to those who are not - for reasons that have been with us since time immemorial. We can learn from their setbacks.

In the next post, we'll look at how champions internalized objections to the method, and what they observed happened when the method was compromised for sake of implementation.

Friday, September 30, 2016

Ecosystems and the Energy Source of Last Resort

It's fashionable for a company to proclaim itself an ecosystem. A mobile phone company makes handsets for users and curates an app marketplace for developers, creating a virtuous cycle in which an ever-increasing collection of apps attracts an ever-increasing population of consumers. They have the benefit of steady cash flows from existing customers and constant growth from new ones attracted by an increasingly complex array of products. There are a number of self-proclaimed commercial ecosystems, ranging from online lending to conglomerates of retail, credit and loyalty.

Markets are kind of like ecosystems in the way participants reinforce one another. Buyers and sellers come together in sufficient numbers to perpetuate a market. As more buyers emerge, more sellers offer their wares in the market, which attracts still more buyers. An increase in the number of buyers triggers more complex and diverse products, making the ecosystem more interesting, if not more robust. To wit: demand for tomatoes triggers cultivation of different varieties, some of which are resistant to disease or insects that others are not, increasing the resiliency of the lycopene trade.

Ecosystems aren't inherently complex: a simple terrarium consisting of a lamp, dirt, water and seeds will yield plants. Commercial ecosystems aren't complex, either. We can stand up marketplaces for mobile phone software or money lending or property investing. In doing so, we hope to encourage people to risk their labor by writing software they hope people will buy, or their capital in the hope that it will find a worthy investment. With the right marketing and promotion (i.e., fertilizer) we might attract ever more buyers and ever more sellers, creating an ever-growing community.

One thing an ecosystem needs to survive is a constant supply of energy. The sun provides an uninterrupted supply of energy to the Earth. It can be consumed immediately (e.g., through photosynthesis). It can also be stored: liquefied dinosaurs are effectively stored energy that originated with the sun. Energy from the sun can be concentrated in many other forms, making it accessible to parts of the planet even when they're not directly exposed to it. This allows the formation of more complex life and lifestyles. Some spot on the Earth may suffer drought or fire or some other disaster that wipes out the basic plant life that supports more complex life forms, but the constant energy from el sol means that a devastated area has a source of energy it can draw on to re-develop.

In commercial ecosystems, capital is energy. Markets are highly vulnerable to periodic contractions of liquidity. Both asset classes and tech products fall out of favor, destroying the fortunes of sellers quickly (bank equity values in 2008) or slowly (Blackberry software developers from 2008 onward). Turn off the lamp and the terrarium becomes barren.

Markets require a constant supply of capital in the same way that ecosystems need a constant supply of energy to survive volatility and seizures. In financial markets, there are market makers who guarantee a counterparty to every trade and buyers of last resort who provide liquidity in the event of a sudden seizure of market activity. It's the latter - the Federal Reserve and the European Central Bank buying sovereign and commercial paper as well as lending to banks with the expectation that they will do the same - who act as the constant supply of energy that keeps commercial ecosystems functioning. Markets will surge and markets will plunge, but it is the "energy source of last resort" that sees markets through the peaks and troughs.

Economic cycles - credit or tech - aren't new phenomena. When they turn, they expose the fragility of the businesses at their mercy. Late last year, lending marketplaces found themselves with plenty of loans they could write but few willing to buy them. The solution they turned to was to introduce a buyer of last resort, initially in the form of banks and eventually in the form of investment vehicles they created themselves.

Any self-proclaimed ecosystem without a backstop buyer - that is, without a constant and reliable source of energy - will be at the mercy of commercial cycles. Mr. Market will not hesitate to turn off the terrarium lamp when the cycle tells him to do so. Once off, he is not so willing to turn it on again. But he might not reach for the switch in the first place - and might very well be first to harvest green shoots after a devastation - as long as there is an energy source of last resort.

Wednesday, August 31, 2016

Method, Part II

Last month, we looked at method as a codification of experience borne of values and expressed through rules, guidelines, practices, policies, and so forth. This month, we'll take a look at the relationship between method and the things that influence it, and that it influences.

The principal framework is an article by Cliff Jacobson describing the change in method that impacted camping and outdoor activity starting in the 1950s; I'll draw comparisons to changes in method in software development. When we think of method in software, we generally think big: "Agile versus Waterfall". But there are more subtle changes that happen in method, specifically through the codification of skill into tools.

Plus ça change...

* * *

Method...

... and Values

"Environmental concerns? In those days, there were none. Not that we didn’t care, you understand. We just didn’t see anything wrong with cutting trees and restructuring the soil to suit our needs. Given the primitive equipment of the day, reshaping the land was the most logical way to make outdoor life bearable.
"In 1958 Calvin Rutstrum brought out his first book, The Way of the Wilderness. Suddenly, there was new philosophy afield. Calvin knew the days of trenched tents and bough beds were numbered. His writings challenged readers to think before they cut, to use an air mattress instead of a spruce bed. Wilderness camping and canoeing were in transition."
-- Cliff Jacobson

As values regarding nature changed from "tame the land" to "conservation", the method of camping had to change. Of course, it took a long time for the new values to settle in. And even once it did, it took a long time for practitioners to change what they did. Resistance to change is a powerful thing, and both practitioner and gear lag would have kept practitioners executing to an old value set in the field for a long, long time.

Values changed in software, too. When users of software were largely internal to a company, before software became such a high cost item for non-tech companies, and before software was weaponized, development moved at a much slower and more deliberate pace. Once the values changed, the method of software delivery was also pressured to change. The change in relationship between humanity and the outdoors is similar to the change in relationship between companies and their software.

... and Skills

But this narrative applies both to a wholesale change in method and to the transition from skills-centric to tool-centric method.

"I discovered the joys of camping at the age of 12 in a rustic Scout camp set deep in the Michigan woods. It was 1952, just before the dawn of nylon tents and synthetic clothes. Aluminum canoes were hot off the Grumman forms, though I’d never seen one. Deep down, I believed they’d never replace the glorious wood-ribbed Old Towns and Thompsons."

Early backpackers had to make do with bulky tarps, fashioning poles and tent pegs from branches - sometimes, even sapling trees - in their campsites. The emphasis among the early outdoorsmen was on the skill of adapting the environment to human survival, and achieving Spartan levels of comfort was a symbol of mastery. Being able to fashion poles and pegs from tree limbs was important in the 1940s as tents didn't necessarily come with them. This was not only destructive, it became unnecessary with the evolution of lightweight and portable aluminum poles and stakes. In a relatively short period of time, being good at pioneering became, at best, only useful in an emergency (you need to fashion a tent peg because you discover you've lost some aluminum ones).

Building Agile trackers in spreadsheets and crafting them anew with each project was somewhat akin to fashioning new tent pegs every time you go camping. Creating a new tracker with each project was a waste of money, and having 5 different teams with 5 different trackers was confusing. The advent of cheap commercial trackers made this unnecessary. Still a good skill to have in an emergency - say, when a project tracker is so badly polluted with low-priority Stories and tasks that it's an impediment to a clean and quick start - but fashioning a tracker is no longer itself a core skill.

... and Tools

"The emphasis had shifted from skills to things."

Early tools supporting a method are crude, often hand made and narrow in their usefulness, and several tend to spring up at the same time. The emphasis is on skills.

But with the ever-increasing popularity of the activity (trips to the Boundary Waters, or Agile software development) come the tools. Skills take time to learn and master. Tools make the activity at hand easier to perform, and consequently more accessible and more enjoyable to more people because they're more successful at it. Canoes are made of strong yet lightweight materials so they're more tolerant of misuse while simultaneously easier to portage. Sleeping bags are made of synthetic materials that are water resistant (unlike down) so that somebody who does a sloppy job at packing a canoe pack won't suffer if the bilge water soaks the contents of the bag.

Of course, tools can be a source of efficiency or a source of trouble. A hatchet makes it easy to build safe, small fires out of short cut sticks. But a hatchet can cause grave injury to somebody if they don't know the proper method for chopping wood with it. Nobody is likely to suffer bodily harm from an over-engineered build script (no matter how many felonious thoughts cross the mind of other people in the team) but an overloaded, single-stage build that reduces build frequency and that fails frequently will cause more harm than good.

"Today, high-tech gear and high-powered salesmanship have become a substitute for rock-solid outdoor skills."

As the complexities of a method get codified into the gear, it becomes difficult to separate one from the other. The tools become a proxy for the method because the state-of-practice matures in conjunction with improvements in science (materials or software) and affordability. Today, we create sophisticated, multi-stage pipelines that instantiate their deployment environments and deploy on every commit. 15 years ago, it was amazing to have a build run every few minutes that would both compile source code and run tests. We can't imagine forging our own crude tools to do basic tasks, or even why we'd want to do it.

Newer tools don't lend themselves to older practices. Tightly rolling a modern (down) sleeping bag won't get it into its stuffsack. Managing cloud instances like rack-mounted servers in a physical data center will run up the bills really fast.

This can be a serious point of confusion for middle managers tasked with making their organization "Agile". If we use Jira, Jenkins and jUnit we must be Agile.

... and People

"I felt quite inadequate, like a peasant in Camelot."

Tools can render entire skill sets irrelevant. The right-brain creativity to fashion a tracker for some specific project was no longer needed when the commercial tracking tools arrived. It became a left-brain activity of making sure all the requirements were entered into the tracker and configuring canned status reports. Suddenly, the thing somebody did that was an act of value has been rendered obsolete by the gear.

The information modeler who was capable of telling the right story based on the nature of the task and team is shoved aside by the efficient administrator whose primary job is to maintain team hygiene. It's entirely possible that the hygienist doesn't really understand why they perform the tasks they perform, but they've been told to hustle people to stand-up and make sure people update status on their (virtual) cards. They're also much cheaper than the craftsman they replaced.

This is pretty destabilizing to people. Where Cliff Jacobson was made to feel inadequate by the gear (and the associated cost), the individual can be stripped of their own sense of self-worth by a change in the method. This can happen when the method changes owing to the values (we need to deploy daily and Waterfall won't let us do that). You might have fancied yourself pretty good at software within your organization, but now the boss is telling you that your worldview is out of touch, you're not up to scratch, and you're not only told that you're going to do it differently, but how you're going to do it. That's not likely to elicit warm and welcoming feelings. Just the opposite.

But it can also happen when the change in method is a shift from skills to things. Suddenly anybody can appear to be good at project tracking. That can stir resentment that encourages resistance to the tools and pining for the spreadsheets.

The reverse - a sudden shift from tools to skills - has no less an impact. There are development stacks that are entirely tool driven. When the boss comes in and announces that all vendor dependencies in the code and process gotta go, the tool dependency no longer compensates for weak skills. The person accustomed to going glamping may not much care for back country backpacking.

... and Basics

"Chemical fire-starters take the place of correct fire making; indestructible canoes are the solution to hitting rocks; bizzard-proof tents become the answer to ones inability to stormproof conventional designs; GPS positioning has replaced a map and a compass. And the what-if-you-get-your-down-bag-wet attitude attracts new converts every year. In the end, only the manufacturers win."

Cliff Jacobson argues that tools are a poor substitute for skills. Where they support the value system - Leave No Trace camping - they're welcome. But where they are simply gadgets for convenience or separate the individual from the experience, they're not. They're also predatory, exploiting a person's laziness, or fear of being unable to master a skill, or feeling of inadequacy in dealing with challenging situations that might arise.

To some extent, the impulses that spurred the software craftsmanship movement are likely similar to those of Messrs. Rutstrum and Jacobson:

"'I’ve canoed and camped for nigh on seventy years and have never got my down bag wet,' he bellered. 'People who get things wet on trips don’t need new gear. They need to learn how to camp and canoe!'"

Pack correctly and paddle competently and you'll never sleep in a soggy bag. We don't need armies of people mindlessly executing test scripts if we build in quality in the first place.

... and the future of method.

Method is a mirror, not a driver. It reflects values and experience, it doesn't create them. Values shift as our priorities change. Experience changes as we learn what new technologies allow, and sometimes re-learn discipline long lost. Method reflects this; it doesn't inform or define this. The values and experience are there, or they are not.

Method is never a destination. It's an echo.

Sunday, July 31, 2016

Method, Part I

Last month, I went on a 6 day, 55 mile canoe trip with several friends. I last went canoe camping in the 1980s with the Boy Scouts. Being out of practice, I bought books from as far back as the 1950s to as recent as 2010 on canoe camping, studying everything from gear and technique to meal planning and water purification.

Some things haven't changed much over the years: Duluth packs are still in fashion because of their low profile when placed inside a canoe and on your back while portaging one. Some things have changed a lot: plastic barrels with harnesses have replaced the old wooden Wanigan boxes. Some things seem to be over-engineered replacements: you could use a GPS, but a map and compass work really well, don't require batteries, and cost a lot less.

No surprise that the method I learned for canoe camping (and backpacking in general) in the 1980s is different from the method practiced today. The method has changed for a lot of reasons. One is technology: materials science has changed what we pack and how we pack, as gear is lighter and easier to compact. Another is economic affluence: we no longer make things once we're in camp; we buy them in advance and pack them in. Yet another is environmental standards: leave-no-trace has us carry food in the thinnest of packaging since we pack it all out.

The method I learned wasn't exactly state-of-the-art, even for the 1980s. For one thing, the leaders had learned method a decade (or more) before I joined. For another, the Boy Scout troop I was with had acquired most of its gear in the 70s, and some of it dated to the 60s. Gear was expensive, so upgrading it wasn't an option. While our method was effective, it was far from cutting edge.

Clearly, to make a canoe camping trip in 2016, my method needed to change.

* * *

Method is the codification of experience into rules, guidelines, policy, principles, behaviors, norms and so forth. Method is intangible, but it has tangible manifestations: gear and tools are derived from method so that it can be followed, and performed with efficiency.

One reason we develop methods is so that people new to a craft can learn it in a safe and responsible manner: if newbies can build a cooking fire without burning down the forest, they don't go hungry and future campers will have a chance to enjoy the same forest. Sound method spares disaster and frustration. Another reason for developing method is that it allows us to codify knowledge and build on collective craft, pushing the boundaries of technique and gear: the risk of forest fire means we're better off cooking over stoves rather than campfires, which encourages research into energy sources and stoves, which creates safer, lighter and higher density energy sources for cooking, which allows more people to backpack safely for longer periods in remote areas.

There are methods for all kinds of things. NASA's Manned Spacecraft Center defines a method for putting human beings into space and bringing them back alive. FASB defines methods for accounting practices.

Methods are developed by people who have first-hand experience of what works fantastically well, sorta OK, and not at all. This is why people who define methods have to be hands on: method defined by people without experience is just pontification. But those same people need to be abstract thinkers, aware of the unchangeable forces that need to be dealt with. We have to perform a barbecue roll of a spacecraft throughout a journey to the moon, otherwise it'll freeze on one side and burn up on the other.

Methods are a means of practicing a value system. We perform fire safety following this protocol because we're more concerned with the damage we would cause if we lose control of the fire than we are for having a heat source to cook dinner. Values are powerful, because they supersede any rule or practice that is part of the method. If the RCS ring is activated on a Gemini spacecraft, the astronauts have to come home: the procedure is irreversible and we're more concerned with bringing them back alive than pursuing mission objectives that could put their lives in danger. Values are non-negotiable - in theory, anyway. Compromise your values and you cast doubt over the integrity of the method you insist that everybody follow. If you can take fire out of the ring to perform some stunt, it clearly doesn't matter whether fire stays in the ring or not, so "fire safety" must not be that important to you.

Methods define responsibilities and authority. I'm the voyageur, so I'm the navigator of this canoe trip. I'm Capcom, so only I talk to the astronauts. Changes in method are threatening to people where they upset how they understand power dynamics in a group.

The world around us changes, so method needs to evolve. Knowing how to make a backpack (Boy Scout Fieldbook, 1967) isn't particularly useful with increased affluence and advances in materials science. People also need to evolve with method to stay on top of practices, as does their gear: practitioner lag sows seeds of confusion, while gear lag can make some activities impractical, if not impossible.

Except when stipulated by law (e.g., accounting standards), there is no shortage of methods for performing the same activity. The National Outdoor Leadership School teaches a method for canoe camping and backpacking that is different from the Boy Scouts', which is different from Outward Bound's, which is different from the method taught by countless other organizations.

When everybody in a group is trained in the same method, we have uniformity of conduct. The group can be expected to socially reinforce the method should one member slack off in a key area (sanitizing cookware) for reasons of convenience (hates doing the washing up).

When people in a group are trained in different methods, or some are trained in no method at all, the differences can range from trivial (you packed in magnesium? That's great mountain man, we have plenty of matches in multiple waterproof containers) to severe (I'm sure you're not planning to burn down the forest, greenhorn, but fire stays in the ring just the same). The differences can be learning experiences, or competitions, or sources of conflict and dysfunction within the party. People can get pretty worked up when every ounce of their being tells them that something is important (secure the canoes overnight because a strong gust of wind can flip - and wreck - a canoe) but other members don't feel as strongly (securing is overkill as gusts that high are highly unusual).

Differences in a small group are mirrored by the differences among method "experts". In the same article in the Summer 2016 edition of Boundary Waters Journal, one expert writes "[getting wet on portages] is completely preventable", while another, 6 pages later in the same article, writes: "Forget about keeping your feet dry. They will be wet." Not only will the disagreements be mirrored, so will the intensity of the arguments about them. This comes as no surprise as method is a proxy for a value system, but the arguments are rarely over the values, and usually over the practices themselves.

Sometimes, we set out to learn a new method by choice. Other times, we're forced to. The means of teaching method range from immersive to suggestive. At one extreme are those who teach method by breaking us down and rebuilding us, to strip away our preconceived notions and ideas, developing new muscle memories for a way of doing things. At the other are those who observe what we do today, suggesting and coaching how we could do something differently, allowing us to decide and trusting that the obviously superior method will prevail.

The student of a new method may embrace it enthusiastically. Or may never be convinced it is better than the one long practiced. And if not a willing adherent to a new method, being told that you have to change your method is threatening. It says that the expertise you developed is irrelevant, unnecessary, inferior, or just plain wrong. It erodes your seniority within a system that is based on that method. It's tough being told "the world has moved on from your understanding of it."

We have many methods for developing software, too. We'll look at how we apply those in Part II.

Thursday, June 30, 2016

In Tech, Portfolio Management is a Metaphor, Not a Way of Life

A few years ago, I wrote about the chronic misuse of the word "governance" in technology. The word "portfolio" is suffering the same fate.

The reason for introducing the word "portfolio" into tech management is that some portion of tech spend is really an investment in the business, something that differentiates it and gives it a competitive edge - for example, mobile or web client-facing apps, or internal software that codifies workflows to capture efficiencies. This is fundamentally different from utility tech, the basic technology a company needs to function, such as ERP or email. To get a better return from our strategic opportunities, we should think in terms of investing-like behaviors and outcomes, as opposed to traditional project-oriented ones (such as time and budget). The word "portfolio" enters the lexicon because when we have multiple technology investments, we logically have a collection of them. Since most firms have more opportunities for investment than capital to invest, there is a degree of selectivity of what gets included in the investment portfolio.

While I was writing chapter 8 of Activist Investing in Strategic Software, it occurred to me that the use of the word "portfolio" in technology has increased in recent years. Unfortunately, the activities described by aspirant "technology investment portfolio managers" are a very small fraction of those characteristic of financial portfolio management. The principal problem is that the word "portfolio" suggests a degree of decision-making flexibility that the captive technology investor doesn't really enjoy. Consider:

  1. Although we can create diversity of our investment outcomes, the strategic software investor is limited to a single investment type - e.g., an operationally-based delivery. A financial investor has many more choices of vehicles and outcomes than a captive investor does. There are no fixed income products available to the captive technology investor; it's all equity. Plus, although we can run multiple experiments to qualify an investment and accelerate our delivery frequency to get things in production faster, all of our investment positions are inherently long. The only short position we can take in something is not to invest.
  2. While there are countless investment opportunities, it's rare that a company can pick and choose every investment it makes. Some investments are forced on it by regulation; others by competition; still others by reliability of dilapidated legacy assets; and sometimes because the boss says this or that is what we're going to do and that's all there is to it. A captive technology investment portfolio isn't as discretionary as a financial one.
  3. Our investment goals are denominated in different and sometimes only quasi-economic measures of value. All financial investments denominate their performance in the same measure, even those that have explicit non-financial goals. As I've written before, it's futile to concoct synthetic measures like "business value".
  4. Most often, we can't measure the impact of any given investment in isolation of all other changes a business makes to itself and those that happen in its commercial ecosystem: we change the economics, processes, technology, and policy of our business all the time, while our commercial partners are also making the same changes to theirs. Isolating an outcome to a single decision or action (like a specific tech investment) is very difficult. In addition, because counterfactuals are unprovable, we can't measure whether an alternative investment would have yielded a better or worse outcome. In contrast, we can measure our results of financial portfolios against how well "Hindsight Capital Partners" performed over the same time period.
  5. Venture capital is a high risk business that has more misses than hits. The success rate doesn't improve when the VC is corporate rather than financial. A company doesn't have unlimited capital to experiment at scale and can't afford to have a low success rate on investment decisions.

As much as we may want to be investors, the portfolio metaphor is very limited in captive investing situations. It lacks diversity, is inherently imprecise, and both performance and competency as investors are as much a matter of opinion as fact. It's potentially dangerous, too, because managed poorly, a company can damage its operational solvency (that is, its capability to get things done through technology) by making a mess of its capital management.

The premise of the book I'm writing - that activist investing behaviors yield better-performing tech investments - is also substantially a metaphor. There are real aspects to it, specifically that shareholder activist behaviors - being investigative, inquisitive, interrogative and invasive - are highly suitable to captive technology investing. But "shareholder activism" only goes so far: we can't buy out other "investors" and we don't gain control of the board through proxy fights. We have fewer levers to throw to change outcomes, and virtually all of them are operational (process, scope, personnel) in nature: few are the captive investments that can show better performance on the basis of financial engineering alone.

While metaphors are limited, they do help us to interpret our world differently. When we think in investment terms, we see financial expectations and possibilities much more clearly than when we think strictly in operating terms. When we think in shareholder activism terms, we understand the importance of good governance structures and mechanics, and the need for diligence by those investing in the business through technology. Interpreting delivery of strategic software through these lenses adds dimensions that make the operations that create them more value-generative to their host businesses.

But we do ourselves no favors by getting carried away with it. As helpful as the portfolio metaphor is, it's just a metaphor - not a way of life.

Tuesday, May 31, 2016

The Times, We've No Idea How Much They're Changin', Part III

In the last post, we looked at the changing relationship between people and possessions, particularly how the perception of land has changed. But there's more to this than just changes in dwelling and mortgage finance. Land was part of a land-labor-laborer troika, a relationship that has existed since the dawn of humanity. Land could feed and protect the laborer and be a means to a better life, but not without expending labor to till the soil and construct buildings. Just as the perception of land is changing, the perception of labor is changing as well, which erodes the relevance of this troika on an individual level.

Over the course of centuries, labor evolved from a means of subsistence, to a means of income, to a means of achieving economic prosperity (through entrepreneurism and meritocracies). It also was a source of individual identity and self-worth: to be employed communicated one's ability to contribute to society. Yet labor has taken a beating in recent years. A two decade period of nearly uninterrupted growth that began in the early 1980s made labor for the real economy king by the late 1990s (peak employment and household income levels), but a couple of recessions soon thereafter meant it went from being scarce to abundant before the first decade of the new century was out. Meanwhile, real economy jobs lost favor to jobs created by ever-cheaper capital, through things like venture-capital subsidized start-ups, property flipping, and day trading. These latter jobs produced far more lottery winners (remember equity options in the 1990s?) than their real economy counterparts. Other factors, like the erosion of job stability, the replacement of pensions with 401(k)s, and more recent worry over the gig economy and robots and automation displacing real economy labor, have cast labor as transient and eroded its perception as a cornerstone of societal durability.

Labor has gone from champ to chump in less than 20 years. It's no wonder every spring commencement speech contains the obligatory "pursue your passion" line: the not-so-subtle message is that labor on its own is no longer a thing of fulfillment.

This belies the idea that labor is a vehicle to achieving quality of life. Because real wages have been stagnant for nearly a generation, there's less perceived value to labor. If you don't feel that your lot will improve through your labor, you're less inclined to labor for anything better.

Not to mention that for many people there's less urgent need to labor. Previous generations accumulated a lot of wealth, creating familial support nets. Empty-nester parents welcome their children back to live with them in cavernous houses (compared to the small houses of their grandparents and great-grandparents where they couldn't return even if they had wanted to). The cash they accumulated through investing has become inter-generational transfer payments. Why work to meet your own subsistence needs if you don't have to?

Labor, which was once at best a key to individual freedom and at worst a denial of recreation time, is being cast as indentured servitude (we're forced to work by the system) or an emotional prison (the job isn't someone's passion, it's just the job they could find).

This points to a change in the definition of what freedom is.

In the 1960s, we had a generation push back on going to war because it didn't understand why it should have to (e.g., go fight a proxy war in a far away land). Today, we have a generation pushing back on going to work because it doesn't understand why it should have to (e.g., if we have so much food and so much stuff, why do we have to solve the same problems over and over again that just lead us to more food and more stuff?) Freedom is becoming independence from the need to solve basic problems long ago solved, like transportation and food, to pursue Maslow-like self-actualization. Freedom is no longer something each person achieves individually by "working the land"; it's provided by a sophisticated, intricate system that charges rents in exchange for alleviating burdens - both cost and time - of ownership.

We started this blog series with the observation that change can be a long time coming, particularly because economic habits die hard. We'll still have household formation, individual property ownership, and most of us will have to work for a living for a long time to come. Tax receipts are substantially derived from income. And the land-labor-laborer troika has been displaced before (productivity increases during the industrial revolution de-valued labor for two generations) only to find its way back (the rise of America as an economic power in the 19th century). Yet these ideas around labor are in the ether, to the point that Switzerland will hold a referendum next month on whether or not to provide a universal basic income for every citizen.

Which brings us back to where we started. The winds of change blowing today would alter fundamental economic relationships that have been around for centuries, and the technology is here now to make them practical realities. The technology exists today to change the shelter, transportation and investment activity of an individual, allowing them more free time to pursue personal fulfillment, but that's no guarantee that people will embrace it in sufficient numbers, or will do so before societal winds change direction. The times, they have a-changed, and they'll continue to do so. Change happens at a societal pace, not a technological one.

Saturday, April 30, 2016

The Times, They're Still A-Changin', Part II

I ended the previous post by stating that the stage is set for more radical change. Why?

Consider the changing attitude toward land, property and shelter.

For the pre-1965 generation, land meant a lot of things. It was wilderness to be fought with, to be made into a suitable place to live. It was where you built your shelter. It was how you earned or supplemented your living, by farming, mining, logging, or guiding. It was sweat: always one more addition, improvement, or repair to be made. It was the story of America as taught to schoolchildren: land was the reason why the Pilgrims came, and "taming the land" was said to bring out the best in the early settlers who were themselves held out as American heroes and role models. It was very personal history, too: settlers intertwined family stories - and legends - with the land itself.

Land was freedom, and property ownership was independence.

Property was extraordinarily important to the American psyche. A great deal of desirable land remained remote and undeveloped until well after the Second World War (the Interstate network wasn't begun until the Eisenhower administration). Familial bonds with land were strong, particularly where property passed from generation to generation. Schools taught the history of European settlement, including battles with indigenous peoples and noteworthy settlers. A lot of the materials, tools and trade were similar to those employed by generations past, so people could relate to how their grandparents had lived. Activities like camping and hunting gave young people first hand experience of how the European settlers and indigenous people lived, reinforcing the perception of land as well as the myths about it. Land was a great investment ("they're not making more of it" as the saying went), appreciating in value virtually everywhere and almost without interruption. In part, this built up a perception of value in land. And in part, it was a reminder that you weren't too far removed from the rough-and-tumble of the wilderness.

Land no longer captures the American imagination quite so much. More people live in dense urban areas, a large number of them rent, and those who do own expect to move - either upsizing or downsizing - long before their mortgage matures. People who bought property from 2002 through 2008 suffered financially and emotionally in the housing collapse. High crop yields create less demand for farm acreage and farming families. Land has been repurposed: farms near urban areas were more valuable as residential subdivisions, and previously remote rustic areas have been developed into recreation communities or suburbs. Building and zoning regulations restrict what property owners can and cannot do with their land. Early-to-mid-20th-century industrial manufacturers favored rural or suburban locations with the space industrial production required; 21st-century providers of services and digital products - the direction the American economy has been shifting for the last 40 years - favor densely populated urban centers. History classes emphasize the high cost - war, disease, resettlement - borne by indigenous peoples at the hands of European settlers. Activities like camping are now either cheap vacation choices (ever notice how many private campgrounds there are near floating casinos?) or tests of strength balanced with stewardship.

Land is no longer freedom. Renting is freedom: renting allows you to have many different living experiences and to change your accommodation based on your lifestyle, rather than having your lifestyle dictated by the land. Land is red tape and well-rehearsed ceremonies to purchase plots or pull permits; it ties you down to a mortgage and a location.

The changing perception of land also reflects the fact that shelter and sustenance - things directly related to land ownership and management - are problems long ago solved on a mass scale. There is little value in rehashing them again and again on an individual basis, when we could use our life's energy to solve the next wave of challenges, from sustainability to space exploration.

There's been a similar change in attitude toward another symbol of independence, the automobile. Because it was a way to get away ("freedom of the open road") or quite literally a means of getaway (think John Dillinger, or Bonnie and Clyde), the automobile captured the American imagination. But the automobile has changed from a symbol of freedom and possibility to one of captivity (monthly payments) and inescapable utility (suburban communities aren't designed with walking in mind, and suburban mass transit is inefficient). The car that sits idle in the garage nearly 99% of the time isn't untapped potential; it's a tax of modern living.

The things which were the physical incarnation of freedom for prior generations have become symbols of economic entrapment to newer ones. Per the previous post, technologies enabling things like the "sharing economy" aren't leading change as much as they're riding the changing wave of sentiment.

This wave has a long way to go before it crests. The shift in attitudes toward land and transportation portends a change in asset ownership and associated activities like lending and insurance that we've long taken for granted. That doesn't mean a concentration of assets in the hands of a few owners: technologies like blockchain make it easier to fractionalize ownership. This would allow people to invest in small fractions of many residential properties bundled into a single instrument, and to do so incrementally over a long period of time. In essence, they would live in a rented house, but own small fractions of many others. Just as people have shown a preference for passive over active investing, future generations may find it appealing to be able to invest in residential real estate without the need to mortgage future cash flows for a specific spot of dirt in an asset appreciation lottery.
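To make the fractional-ownership idea more concrete, here is a minimal sketch in Python. It is purely illustrative: the property names, share counts, and purchase amounts are hypothetical, and a real instrument would involve legal structure, pricing, and settlement far beyond this.

```python
from dataclasses import dataclass, field

@dataclass
class Property:
    address: str          # hypothetical identifier for a single home
    total_shares: int     # fixed number of fractional shares outstanding

@dataclass
class BundledInstrument:
    """A single instrument holding small stakes in many residential properties."""
    holdings: dict = field(default_factory=dict)  # address -> shares owned

    def invest(self, prop: Property, shares: int) -> None:
        # Incremental purchase: add a small stake rather than buying a whole house
        self.holdings[prop.address] = self.holdings.get(prop.address, 0) + shares

    def ownership_fraction(self, prop: Property) -> float:
        return self.holdings.get(prop.address, 0) / prop.total_shares

# Example: an investor drips small amounts into many properties over time
portfolio = BundledInstrument()
maple = Property("12 Maple St", total_shares=10_000)
oak = Property("48 Oak Ave", total_shares=10_000)
portfolio.invest(maple, 25)   # a tiny stake in one house
portfolio.invest(oak, 40)     # and another in a different one
print(portfolio.ownership_fraction(maple))  # 0.0025
```

The point of the sketch is simply that the investor's exposure is spread thinly across many properties rather than concentrated in one mortgage.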

Of course, that's all "some day". But change is afoot, and the stage is set for still more that goes beyond assets, to the nature of labor itself. We'll look at that in the next post.

Thursday, March 31, 2016

The Times, They Have A-Changed, Part I

"This technology revolution was not invented by robo-advisers. They have simply noticed, and taken advantage of, a broader and deeper shift towards passive investment through ETFs and index funds."

-- John Gapper, Robots are Better Investors than People

We like to think of "technology revolutions", but as Mr. Gapper points out, revolutions aren't led by technology. The landscape is littered with shuttered technology companies that showed that a thing was possible, but failed to foment a "revolution".

Revolutions happen once we have critical mass of people with different attitudes and behaviors. Consider the changes in investing behaviors referred to above. Once investors realized that actively managed funds charged higher fees but performed no better (and frequently worse) than passively managed funds, they switched. Today, as investors come to realize that financial advisors are charging fees for putting their money in passive funds, they're amenable to robo-advisors that charge lower fees for doing exactly the same thing.

A change from human advisor to robo-investing won't happen at the pace set by technology, however. It took a long time for investors to change their preference from active to passive funds. Index funds first appeared in the 1970s, as did research showing that active funds didn't consistently outperform the broader market. Yet it took decades for investors to invest differently.

Why did it take so long? Attitudes, prejudices and predispositions take a long time to change. Sometimes they never do: those who hold a belief may never let it go, even in the face of overwhelming evidence to the contrary. And people financially, politically or emotionally invested in something will fight to preserve it. Active fund managers initially derided passive funds, while today, facing massive capital outflows, they're fighting for survival. Those who stand to lose from change will also fight back with marketing designed to co-opt the next generation, such as the way manufacturers of sweet fizzy drinks play to an older generation's nostalgia while encouraging them to create a sentimental moment - vis-a-vis their product - with a grandchild.

No matter how much technology we throw at something, entrenched behaviors don't start to change until a generation comes along that isn't as emotionally committed to them. And that still only describes how change starts, not how it finishes - and it can take a very long time to run its course. To understand the dynamics of change, we need to look at both ends of the generational spectrum: as people enter, people also leave. This is most obvious in the workforce where, in general, people join around age 18 and leave around age 65.

The United States has just completed a major generational shift, not so much for the one arriving as the one that has recently left.

The Great Depression and the Second World War shaped American domestic and foreign policy well into the 1990s. Yet as long ago as 1965, America had reared a generation without any direct experience of either, making it less constrained by the values held by the people who had come before it. And, starting in 1965, a generation began to arrive born to parents who had themselves been raised after the Depression and the war. Prior to 1945, almost everybody had direct experience of the privations of one or both. After 1965, generations grew up in households where those two seminal events were things their grandparents told them about from time to time, and which they only briefly studied in high school history classes.

Despite the social upheaval that coincided with the maturation of this post-depression-and-war generation (the late 1960s), the value system of the pre-1965 generation dominated American society and the American workplace, first through sheer numbers (those who held it made up the bulk of the working population) and later through positions of seniority (older people were more likely to hold executive positions).

The numbers are now vastly different. People born in 1945 reached age 65 in 2010. There are very few in the American workforce with direct experience of life during WWII, let alone the Great Depression. Nor are there many who are just one generation removed from those events (that is, who grew up in households directly influenced by them); those who are one generation removed will largely exit the American workforce by 2030.

It's no coincidence that we've seen more change in the last decade than we arguably did in the three previous decades combined. But that's not so much because a new generation has arrived, bringing with it new expectations and demands, as because the old generations have left, relinquishing the top rung of authority and social influence. Out of numbers and out of power, those value systems no longer hold sway. Although we live and work in a "post-1965" world today, it took over 40 years - two additional generations - for that to happen.

Because change is a function of society more than technology, it's slow in coming but swift once it arrives. And, while a lot of change happened with the completion of the pre- to post-1965 shift (at least, in the workforce), the stage is set for still more revolutionary change. We'll look at specific examples in the next post.

Monday, February 29, 2016

How an Operational Gap Becomes a Generation Gap Becomes a Valuation Gap

A decade or so ago, when an IT organization (captive or company) hit rock bottom - bloated payroll, lost confidence and ruptured trust resulting from rampant defects, rocky deployments, functional misfits, and long delivery cycles - it would give anything a try, even that crazy Agile stuff. It didn't matter if it was incumbent management with their backs against the wall or new management brought in to clean house: desperate times called for desperate measures. To people looking to shake things up, Agile made intuitive sense and a lot of it was appealing, even if its proponents had a bit too much evangelical zeal and some of it sounded a little strange. So they'd bite the bullet and do it, selectively anyway: a build server was easy to provision; releasing software for testing every couple of weeks was done easily enough; and getting everybody in the same physical workspace (or at least in close proximity to one another) could be done with a little arm-twisting. Of course, developers were instructed to only pair program "opportunistically", and automating QA was a fight nobody wanted to fight, so testers would sit with the team and test incrementally but go on testing the way they always had. Still, even with compromises, there was a good story to tell, usually to do with much higher quality and eliminating rework, and that made Agile A Good Thing for all parties concerned.

Fast forward to today, and we see that Agile is much more ambitious. A few short years ago we were content to have an automated build execute every few minutes; today we want every check-in to trigger a build that is promoted through progressively more complex tests in virtual environments managed, instantiated and torn down by scripts. We used to be content releasing for user testing every other week, and to production every couple of months; we now aspire to release to production many times a day. We used to want Master Story Lists to guide incremental development; today we want to iteratively experiment through code and have the feedback inform requirements definition and prioritization. We used to be satisfied with better delivery of internally-facing software; today we want to evolve software products that are used by people across our ecosystem, from interested parties to customers to investors to employees. Today, Agile wants to push against everything that creates an artificial or temporal constraint, be it organization, management, accounting policy, or even capital structure.
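The promotion pipeline described above can be sketched in a few lines. This is a simplified, hypothetical illustration in Python - the stage names, checks, and provisioning hooks are placeholders, not any particular CI product's API - meant only to show the shape of "every check-in promoted through progressively more rigorous stages in disposable environments".

```python
# Each stage is a (name, check) pair, ordered from fast/cheap to slow/thorough.
# The checks here are trivial placeholders standing in for real test suites.
STAGES = [
    ("commit build & unit tests", lambda: True),
    ("integration tests",         lambda: True),
    ("end-to-end tests",          lambda: True),
]

def provision_environment(stage: str) -> None:
    # Stand-in for scripts that instantiate a disposable virtual environment
    print(f"provisioning environment for: {stage}")

def tear_down_environment(stage: str) -> None:
    # Stand-in for scripts that destroy the environment when the stage completes
    print(f"tearing down environment for: {stage}")

def promote_build(commit: str) -> bool:
    """Promote a single check-in through progressively more rigorous stages."""
    for stage, check in STAGES:
        provision_environment(stage)
        try:
            if not check():
                print(f"{commit} failed at: {stage}")
                return False
        finally:
            tear_down_environment(stage)
    print(f"{commit} passed every stage and is a release candidate")
    return True

promote_build("commit-1a2b3c")  # in a real pipeline, triggered on every check-in
```

The design point is that each stage runs in an environment created and destroyed on demand, and a failure anywhere halts promotion, which is what makes "release to production many times a day" plausible.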

Although Agile has evolved, the entire tech world hasn't moved with it. In fact, some of it hasn't moved at all: it's still common to see non-Agile organizations that do big up-front design; work in functional and skill silos; have manual builds, deployments and testing; and make "big bang" releases. And, it's still common for them to face a "rock bottom" moment where they conclude maybe it's finally time to look into this Agile stuff.

As hard as it was a decade ago to inject Agile into a non-Agile organization, it's much harder today for a non-Agile organization to complete a transformation. This seems counterintuitive: since the non-Agile-to-Agile path is so well trod, it should be much easier than it was in those pioneering days of yore. But although there have never been more tools, frameworks, languages, books, blogs, and countless other resources available to the individual practitioner aspiring to work differently, organizational change tends not to be self-directed. The challenge isn't taking an organization through the same well-established game plan; it's finding the people - the transformational leaders - willing to shepherd it through its journey.

Unfortunately, re-living the same internal negotiations to reach the same compromises, and solving technical and organizational problems long ago solved, only to end up with an operating model that still falls far short of today's state of practice, is not a destination opportunity for an experienced change leader. Even assuming, as my colleague Alan Fiddes pointed out, that the change agents brought in still have the vocabulary to carry on arguments last fought so long ago, any change agent worth their salt isn't going to reset their career clock back a decade, no matter the financial inducement.

This might simply mean that the approach to change itself is what has to change: require far less shepherding from without by expecting more self-directed change from within, brought about by setting the right goals, creating the right incentives (motivate people) and measuring the right things (what gets measured is what gets managed). Why shouldn't it be self-directed? It isn't unreasonable to expect people in a line of work as dynamic as software development to keep their skills sharp and practices current. For people leading an organization that's a little dated in how it develops software, then, the question to hold people to isn't "why aren't you doing Agile?" but "we're going to deploy any time and all the time effective fiscal Q3, so how are you going to operate to support that?" It's amazing what people will do when sufficiently motivated, change agents be damned.

Whether there's a more effective means of change or not, being far adrift of the state of practice points to a more severe threat to the business as a whole: a generation gap.

* * *

Three decades ago, the state of practice didn't vary that much across companies. Yes, there were people coding C over Rdb deployed in VMS on minicomputers and people coding COBOL over IMS deployed in OS/390 on mainframes, but the practice landscape wasn't especially diverse: waterfall prevailed and a lot of code was still data-crunching logic run in batch. At the time, captive IT, consulting firms, governments, new tech firms (think Microsoft in the mid-80s), and established tech stalwarts (H-P, IBM) could reasonably expect to compete for the same labor. College grads in Computer Science or Management Information Systems learned practices that reinforced the modus operandi common to all consumers of business computing.

The landscape has changed. Practices are far less homogeneous, as they've had to evolve to accommodate a diverse community of interactive users pushing for features and functionality with little tolerance for failure. The familiar combatants still slug it out for labor, but must now also compete against tech product firms untethered to any legacy practices, norms, policies or technologies. Today's grads are trained in new practices and expect their employer to practice them, too.

Companies lagging behind in their state of practice will struggle to compete for newly minted labor: why would somebody with highly marketable tech skills go to work at a place stuck in the past, when they can work in a more current - even cutting edge - environment?

This isn't just a hiring problem. A practice gap is fuel for a generation gap if it deflects young, skilled people from becoming employees. By failing to hire the next generation employee, a company develops an intrinsic inability to understand its next generation customer.

A company isn't going to reach a new generation of buyer - consumer or business - if it is tone deaf to them. A company ignorant of the next generation's motivations, desires, values and expectations has little chance of recognizing what it isn't doing to win their attention, let alone their business. Since social systems are self-reinforcing, a company is unlikely to break the deadlock of ignorance and inaction.

Failing to bridge a generation gap not only cuts a business off from growth opportunities, it sets the stage for long-term irrelevance. Investors recognize this, even when management does not. Growth changes from being a "risk" to being an "uncertainty", and when that happens a company's future¹ is no longer priced at a premium, but a discount. In this way, an operational gap becomes a generation gap becomes a valuation gap.

Outdated practices are an indicator that management has its head buried in the sand: it has a problem it can't see, doesn't know how to solve, and is starved of the information it needs because it has elected to disassociate itself from the source. The motivation to change how you practice shouldn't be to become more competitive today, but to still be relevant tomorrow.


¹ By way of example, Yahoo, net of its Alibaba holding and cash, has frequently been valued at or near $0 by investors in recent years.

Sunday, January 31, 2016

Are Microservices to Ecosystems as Core Competencies were to Conglomerates?

As far back as the 19th century, industrial firms pursued vertical integration strategies. The thinking was that by owning the supply chain from raw materials to retail outlets, a firm had direct control over its entire cost structure, making it better able to squeeze efficiencies out of it and being less susceptible to supply shocks. This was important because, for large industrial firms, competing on price was the primary strategy for winning market share.

During the 1950s and '60s, companies also pursued conglomerate strategies: bringing seemingly unrelated businesses under one roof, sometimes seeking synergies (as Sears did in owning a retail merchandiser and a retail brokerage - "buy your stocks where you buy your socks"), and sometimes not (as LTV did in owning a steel company and an airline). The rationale for the conglomerate was entirely financial: cheap (mostly debt) capital allowed large companies to grow through acquisition, and regulators were less likely to block acquisitions of unrelated firms on monopolistic grounds.

By the 1980s, both strategies had begun to lose favor. The financial benefit had evaporated: high interest rates clobbered the profits of debt-fueled acquisitions and forced divestiture. But the operating benefits weren't there, either. Different types of businesses (manufacturing, distribution, retail) require different types of leadership and have very different cultures. And, within each of those businesses, some functions are differentiating (such as fleet optimization for a logistics company) while some functions are not (nobody beats their competitors by having a superior accounting back office). Management thinking embraced "core competencies": own and hone the things that differentiate, rent and commoditize the things that do not. This also allowed for better matching of capital with company: the risks and returns of a company that owns a fleet of railcars are easier to assess than those of a company that owns ore mines, railcars, and foundries. Breaking conglomerates up lets individual investors choose what types of businesses to expose their capital to (a raw materials company, an asset company, or a refining company), and the pricing of that capital more accurately reflects the risks.

Tech firms today are embracing the "vertical integration" and "conglomerate" strategies of yore by positioning themselves as "platform" and "ecosystem" companies. The thinking is that by combining multiple and diverse capabilities into a single offering, a company creates both efficiencies and synergistic value for counterparties in some activity, such as crowdfunding or payments. The ecosystem strategy takes this even further, combining unrelated capabilities under one roof (eBay buying Skype in 2005, SAP developing HANA in 2010, Facebook buying Oculus in 2014), often justifiable if only because digital commerce is still in its infancy and nobody is really sure what's going to work and what's not.

But what if you could extract utility-like functionality from within an ecosystem into an independent company? Take payroll as an example: rather than have every Human Resources platform company field its own team of people to write and maintain state and federal withholding rules, hive those off into an independent business that makes them available as a metered service offering, charging a tiny usage tax (say, $0.001) each time it's invoked. The technology to do this is certainly available: code the algorithms as a microservice or a blockchain smart contract, and deploy them in an elastic cloud environment (as usage will tend to spike with pay cycles).
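As a rough sketch of what such a metered withholding microservice might look like, consider the Python fragment below. Everything in it is an illustrative assumption - the flat rates bear no resemblance to real tax tables, the endpoint and per-call charge are invented, and a production service would handle authentication, persistence, and scaling - but it shows the shape: a small stateless calculation behind an HTTP interface, with every invocation accruing a tiny usage fee.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

PER_CALL_FEE = 0.001           # the hypothetical usage tax per invocation
FLAT_FEDERAL_RATE = 0.12       # placeholder rates; real withholding tables are far richer
STATE_RATES = {"IL": 0.0495, "CA": 0.06}

usage_meter = 0                # count of billable invocations

class WithholdingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        global usage_meter
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        gross = float(payload.get("gross_pay", 0))
        state = payload.get("state", "IL")

        # The "utility" calculation: apply federal and state rules to one pay event
        withheld = gross * (FLAT_FEDERAL_RATE + STATE_RATES.get(state, 0))
        usage_meter += 1       # each call accrues the $0.001 metered charge

        body = json.dumps({
            "withheld": round(withheld, 2),
            "billed_to_date": round(usage_meter * PER_CALL_FEE, 3),
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # In the elastic-cloud scenario, many such instances would spin up around pay cycles
    HTTPServer(("", 8080), WithholdingHandler).serve_forever()
```

The economics follow directly from the shape of the code: the calculation itself changes rarely (only when tax law does), while the metering ticks over with every pay run across every subscriber.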

To the HR platform company, there are a lot of good reasons to do this. It monetizes a non-value-generative activity (nobody subscribes to a payroll system because its implementation of the withholding rules is better than everybody else's). It throws off utility-like revenue streams that move in lock step with the broader job market. It disaggregates a staid HR utility function that needs to be deployed infrequently (potentially only as often as once per year, when new tax laws come into effect) from more innovative ones that are more exploratory in nature and demand frequent deployments (such as time entry performed through emerging wearable tech). It separates a legacy, risk-averse tech culture from a cutting-edge, risk-prone one. It takes a high fixed cost for maintaining a non-differentiating characteristic off the P&L (teams maintaining rule-heavy legacy code are rarely inexpensive). Its stable cash flows would be attractive to debt finance, better aligning capital with investment in HR technology. And it removes an asymmetric risk that can be more accurately insured (smothered inside a platform, correct calculations offer no financial upside, while a faulty calculation exposes the company to costly litigation and reputational damage).

True, it eliminates a barrier to entry for future competitors. And, while the market would support a handful of utilities to prevent monopoly, thin competition would give those utilities oligopolistic pricing power. But it creates a one-time financial windfall for the first movers; laggards would be pressured to subscribe by shareholders demanding the same structural benefits to the income statement; and low switching costs would keep utility pricing power in check.

Given that tech is in a period of both cheap capital (interest rates remain low, VC levels remain high, and companies such as Alibaba and Facebook can make acquisitions inexpensively with their own high-priced shares) and rapid growth (growth in consumption of tech products such as social media today mirrors growth in consumption of manufactured goods in the 1950s and '60s), it's little surprise that we're seeing a return to industrial strategies past. But technologies like microservices and blockchain could do to today's platforms and ecosystems what "core competencies" did to the conglomerates: sweep through businesses and break them apart. Blockchain proponents already champion the potential of decentralized autonomous organizations (DAOs). With MBAs now eschewing investment banking in favor of tech companies, financial engineering of this nature isn't too far away.