I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Monday, October 31, 2022


A few months ago I was asked to review a product strategy a team had put together. I had to give them the unfortunate feedback that what they had created was a document with a lot of words, but those words did not articulate a strategy.

There is a formula for articulating strategy. In his book Good Strategy Bad Strategy, Richard Rumelt puts forward the three essential elements of a strategy. It must:

  1. Identify a need or opportunity (the why)
  2. Offer a guiding policy for responding to the need or opportunity (the what)
  3. Define concrete actions for executing the policy (the how)

There’s more to it, of course. The need or opportunity has to be well structured and specific. The guiding policy must be focused on the leverage that a company can uniquely bring to bear (this is effectively the who that a company is) as well as anticipate the reaction of other market participants. The actions must be, well, actionable.

What we see too often passed off as strategy are goals (“grow the business by xx% in the next y years” is a goal, not a strategy); vision statements (“we want to be the premier provider of aquatic taxidermy products” is a lofty if vain ambition); or statements that are effectively guiding policies (“to be the one-stop shop for all of our customers’ aquatic taxidermy needs”) without the need (why) articulated or the actions (how) defined.

I’ve seen the aftermath of a number of failed strategic planning initiatives. Each time, the initiatives failed to articulate at least two, and sometimes all three, of the aforementioned elements that compose a strategy. The postmortems to understand why these initiatives failed exposed a few consistent patterns.

One pattern is that the people involved in the strategic thinking did not truly come to grips with what is actually going on in a company’s environment. To understand “what’s going on” requires collating the relevant facts (internal and external) into a cohesive analysis. That, in turn, requires a great deal of situational awareness: an honest assessment of a company’s capabilities, a high degree of customer empathy, and a fair bit of macroeconomic understanding. It also requires a sense of timeliness: not so immediate as to be just a tactical assessment (your competitors are easier to do business with through digital channels than you are), nor so far in the future as to be purely speculative (ambient computing). All too often, the definition of the opportunity is derived - in many cases, copied verbatim - from some other source: an analyst report, somebody else’s PowerPoint making the rounds inside the company, or the company’s most recent annual report. Or it is a truism (the world of aquatic taxidermy is going digital). Or worse still, it is a tautology (customers will buy aquatic taxidermy products through digital channels and from physical store locations, from specialist retailers and general merchandisers).

Defining the opportunity through a thorough understanding of what’s going on is hard. It’s also awkward, an exercise of the blindfolded people describing the pachyderm. And that’s ok. It takes several iterations, it requires diversity of participants, and while there will be many moments when the activity feels like churn (and not the kind of churn that yields butter or ice cream), it is worth the investment of time. The “what’s going on” is, arguably, the most important thing in formulating a strategy. If the “what’s going on” is wrong, the opportunity isn’t clear, and as a result the most eloquent guiding policy and the most definitive of actions will not solve the right problem. By way of analogy, directional North Stars are great, but in the field we still largely navigate by compass. A compass is low tech. It works through attraction to a magnetic field that serves as a close-enough proxy to true north, which we correct for with declination. As John Kay showed, the most successful companies navigate by muddling through.

Another pattern: whereas the would-be strategic thinkers spend comparatively little time defining the opportunity, they are obsessed with formulating the equivalent of the guiding policies. Some of this is likely a function of professional programming: if, for the totality of your career, the boss has supplied you with the reason why you do the things that you do, it isn’t natural to start a new initiative by asking “why”. Just the opposite. But the biggest reason for focusing on the guiding policies is that the strategic thinkers believe they are being paid to come up with clever statements of what a company should do. No surprise that strategic planning exercises tend to produce a lot of “what to do” options, which they present as a portfolio of strategic opportunities. Yes, the portfolio passes the volume test applied to any strategic planning initiative: too few slides suggest the team just faffed about for several weeks. So what we get is a shotgun blast of strategy: dozens of “what to do” options, only some (not many, let alone all) of which are complementary to one another. Plenty of things to try, but they’re just that: things to try. They don’t converge at cohesive interim states where the company is poised to engage in a next level of exploitation of an opportunity or need, exploitation that is amplified through development of the unique capabilities the company brought to the table in the first place. This is not a strategy as much as it is a task list of very coarse-grained things to maybe do, at some point, and see what happens.

The fear of not having a sufficient quantity of clever “whats” is understandable, but misplaced. ‘Tis better to have a few very powerful statements of “why” that tell the executive leadership team and the board very concrete things they do not know about their company or market, with very strong statements of “what” to do about them.

The third pattern contributing to strategic planning failure is the aversion to defining the concrete actions necessary to operationalize a strategy. As damaging as getting the why wrong is to the validity of the what, glossing over or ignoring the how renders a guiding policy into a fairy tale. Figuring out the how is, for a lot of people, the least attractive part of strategy formulation: it requires coming face to face with organizational headwinds - the learned helplessness, the dearth of domain knowledge, the resistance to change - that characterize legacy organizations. Operationalization - especially in an environment with decades-old legacy systems compounded on top of one another - is where great ideas go to die: we could never do that here, you don’t know the history, it doesn’t work like that, and so forth. Yet a strategy without a clear path of execution is just a theory. No company has the luxury of not starting from where it is today. Strategy has to meet a company where it is at. This isn't big up-front design; it's just a first iteration of the end-to-end to establish that execution is in fact plausible, supplemented with a now / next / later roadmap to define a plausible path of evolution.

The aversion to defining execution of a proposed strategy stems from at least two sources. One is the tedium of deep diving into operational systems to figure out what is possible and what is not, and then to turn the tables and interrogate in detail the things we can do, changing the question from “why we can’t” to “why we can”. But the more compelling reason that strategic thinkers avoid detailing execution, in my observation, is the fear that a single ground truth could undo the brilliance of a strategy. Strategy is immutably perfect in the vacuum of a windowless conference room. It doesn’t do so well once it makes first contact with reality. And that is the real-world problem for the person academically defining strategy in the absence of execution: when given a choice, a company will always choose as Ernst Stavro Blofeld did in the movie version of From Russia With Love: although Kronsteen’s plan may very well have been perfect, it was Klebb who, despite execution failure (engineered through improvisation by James Bond), was the only person who could achieve the intended objective. Strategy doesn't achieve outcomes. Delivery does.

I’ve worked with a number of people who insist they no longer wish to work in execution or delivery roles, only strategy. Living in an abstract world detached from operational constraints is great, but abstract worlds don’t generate cash flow for operating companies. The division of strategy and delivery is a professional paradox: if you do not wish to work in delivery, by definition you cannot work in strategy.

Strategy is genuinely hard. It isn’t hard because it bridges the gap between what a company is today and what it hopes to be in the future (the what). It’s hard because good strategy clearly defines what a company is and is not today (the who), what the opportunities are and are not for it in the future (the why), and the actionable steps it can take to making that future a reality (the how), orchestrated via compelling guiding policies (the what).

Successful business execution is difficult. Successful business strategy is even more difficult. If you want to work in strategy, you better know what it is you're signing up for.

Friday, September 30, 2022

Management is Getting Things Done Through People

Last year, I wrote that one of the core capabilities of an Agile manager is to "create and adapt the mechanical processes and social systems through which the team gets things done." I went on to describe a little bit of what allows the Agile manager to succeed at this; it merits a bit more commentary.

The mechanical processes a team performs matter because orchestrating the right activities in the right sequence at the right times is essential to exploring a problem space and evolving solutions. The Agile toolkit is well established and doesn't merit detailing here, but it is worth pointing out there are a lot of different techniques, activities, and ceremonies (and a multitude of variations therein) that the Agile manager can reach for; the Agile manager must be sufficiently fluent in the mechanics to know which are appropriate, and which are not, for the circumstances. Yes, process matters.

But the mechanical processes themselves aren't enough: the manager has to create the right social system in which people can participate effectively. That's much harder than executing to a script prescribed by a mechanical process because the manager has to understand the skills and aptitudes, strengths and weaknesses, competencies and limitations of the people within the team.

Sometimes people are exactly as advertised: a subject matter expert, a department director, a knowledge worker. And sometimes they're not. This person isn't an expert in supply chain - familiar with the abstract patterns and business processes of supply chain management, with first-hand experience in a variety of implementations - they actually only have experience in how this one company manages its supply chain, and only in how it operates, not how it was designed in the first place. That department director is a director in title only, because they've delegated all responsibility to subordinates and require the decisions they must make to be framed as single-alternative choices. And that knowledge worker is really only fluent in which buttons to press at which part of a process and how to fix common exceptions, but has no idea why they do what they do.

To create functional social systems, the Agile manager must be able to meet the people in their team where those people are at. That means some degree of fluency in the subject matter for which an expert has been staffed, some degree of familiarity with the types of decisions the department director makes, some recognition of the wisdom a seasoned knowledge worker possesses. In last year's post, I suggested this is a function of both EQ and professional experience. EQ is key to awareness, but not technical understanding. Professional experience can be the source of technical understanding, but is limited because no manager has experience with every domain and every context they're asked to manage.

What the manager does not know through experience the manager must try to learn through theory by conducting independent research and investigation into the respective domains to understand the context of the various participants. The important thing for the manager isn't to become a domain expert, but to be sufficiently fluent in the terminology to use and the questions to ask to assess, engage and manage people of various backgrounds and various capability levels.

Finding the right level of fidelity with which to engage members of the team is a critical component of social system formation. To wit: asking people without first-hand knowledge of “what good looks like” in supply chain management to design a next-generation process will yield a "faster horses" solution that is, at best, less bad than what the company has today. Suppose the manager is able to recognize this deficiency; that recognition is fantastic, but it doesn't give the manager license to tell the team to down tools until the SME who isn't quite a SME is replaced. The fact is, no team is perfect, and the manager has to work with the people they’re staffed with. If a genuine SME can’t be sourced full time, it is incumbent on the manager to (a) create a social system within which the not-quite-a-SME is able to contribute to the best that their knowledge allows, so that the team can make meaningful progress; and (b) in recognition of the not-quite-a-SME's limitations, to see that the team validates proposed solutions and models against established industry models (that the manager may very well have to self-source), potentially supplemented with slivers of time from experts from analyst organizations or specialists (that may have to be sourced from the manager's network).

How the manager deals with circumstances like this is the difference between a person mindlessly executing a mechanical process and a person steering a team toward an outcome. The former is an executor; the latter is a manager.

Sometimes, of course, it just isn't there to be done. About a decade ago, I was part of a technology team partnering with a financial services data business to replatform its operations and core systems. Before the first day was out, it was patently obvious the SMEs and knowledge workers could regurgitate the keys they pressed in the monthly process they followed, but had no understanding of why they did the things they did, nor could they articulate the value of the data they provided to the services for which their customers consumed it. Unsurprisingly, the management - none of whom had first-hand knowledge of the business itself, having been installed following the acquisition of the company by private equity - refused to accept our day one conclusion. So on day two we performed workshops that laid bare the knowledge deficit. We abandoned the inception because the people they had brought simply lacked the wisdom that comes with knowledge, and no amount of sourced content and slivers of expert time could compensate: we concluded we weren’t even going to get a faster-horses solution out of that group. You can't get blood from a stone, so there's no point continuing to squeeze.

Trust the process? Sure. But any process is only as effective as the people participating in it, and participation is a function of the underlying social system of the team. Creating an effective social system is at the heart of the definition of what management is: getting things done through people.

Taking people at face value based on their title is an abdication of management's responsibility. There are no free rides in management, be it project, program, product, department, division, or executive. You have to know your people; to know your people you have to meet them where they’re at; to meet them where they’re at you have to understand their context; to understand their context you have to have some familiarity with their domain. The manager who fails to do these things is not a manager; they’re an individual contributor with a highfalutin' title.

Wednesday, August 31, 2022

What Does God Need With a Starship?

Andy Kessler wrote in the WSJ this month about the value of being a contrarian. Contrarians have a reputation for being cynics or curmudgeons because they’re out of step with mainstream thinking. And it’s true that being contrarian solely for the purpose of resisting or denying change is generally not helpful. But contrarian thinking can bring a lot of constructive insight.

For quite a few years now, I’ve written about the value of activist investing, which at its best challenges institutional thinking - and, when necessary, institutional reporting - for the benefit of those invested in the business outcome. Activist investors are contrarian thinkers. An effective activist investor sources their own ground truths, creates their own hypotheses from the data, and advocates for those alternative hypotheses. This is true for public company investors and captive IT investors alike. The activist investor in a public company visits company installations, talks to customers, analyzes the footnotes of SEC filings, and develops hypotheses that management may not see or may be choosing not to report. The activist investor in a captive IT investment does the same things: interviews members of the team, reviews code, and analyzes status reports to develop alternative interpretations about the actual progress of and threats to a program or product. The formula is the same: scrutinize the data you’re provided, get some of your own, recontextualize it, and draw your own conclusions. This is the critical thinking technique we were all taught in high school.

But every silver lining has a cloud. The activist isn’t always right. David Einhorn raised questions derived from ground truths and got it right about Allied Capital, St. Joe Company, and Lehman Brothers, but he got it wrong about Keurig. The value of Mr. Einhorn’s contrarian thinking wasn’t in its accuracy as much as in offering a fact-based challenge to management’s narrative. The activist articulates a narrative that reframes a situation and draws attention to a risk or deficiency that others don’t see or that management may be obfuscating. The thought exercise helps all investors re-evaluate what they believe their risk exposure in that specific position really is.

And it’s important to note that, like anything else, investor activism can be a charade. The public company investor may simply be generating doubt about a company to bolster a short position that can be quickly liquidated. Similarly, the captive IT steering committee member who is also a vendor rep may simply be fostering fear, uncertainty and doubt to drive more services revenue from an existing customer.

Perhaps most important of all, the activist investor isn’t very popular. Contrarian thinking takes us out of our comfort zone, makes us consider difficult possibilities, forces us to have data to support the thing that we desperately want to be true, and reminds us that we’re not as smart as we want to believe that we are. But more banally, challenging the board and management meeting-in and meeting-out wears on people. Contrarian thinkers are irritating. ’Tis best that you enjoy dining alone.

Being a contrarian is not the easiest path to take. John Kay once wrote that regulators, if they are not to be co-opted by the regulated, require "...both an abrasive personality and considerable intellectual curiosity to do the job." Contrarian thinking at its best.

Because sometimes, when everybody is too mesmerized or beat down or overwhelmed, or simply can't be bothered, for the sake of everybody concerned, somebody has to ask: “what does God need with a starship?”

Sunday, July 31, 2022


One of the benefits of being an agile organization is the elimination of IT shadows: the functions and activities that crop up in response to the inadequacy of the plans, competency and capacity of captive IT.

IT shadows appear in a lot of different forms. There are shadow IT teams of developers or data engineers that spring up in areas like operations or marketing because the captive IT function is slow to respond, if not outright incapable of responding, to internal customer demand. There are also shadow activities of large software delivery programs: the phases that get added long after delivery starts and well before code enters production, because integrating the code produced by dependent teams working independently is far more problematic than anticipated; the extended testing phases - or more accurately, testing phases that extend far longer than anticipated - because of poor functional and technical quality that goes undiscovered during development; the scope taken out of the 1.0 release, resulting in additional (and originally unplanned) releases to deliver the initially promised scope - releases that only offer the promise to deliver in the future what was promised in the past, at the cost of innovation in the present.

None of these functions and activities are planned and accounted for before the fact; they manifest themselves as expense bloat on the income statement as no-alternative, business-as-usual management decisions.

The historical response of captive IT to these problems was to pursue greater control: double down on big up-front design to better anticipate what might go wrong so as to prevent problems from emerging in the first place, supplemented with oppressive QA regimes to contain the problems if they did. Unfortunately, all the planning in the world can’t compensate for poor inter-team collaboration, just as all the testing in the world can’t inspect quality into the product.

Agile practices addressed these failures through teams able to solve for end-to-end user needs. The results, as measured and reported by Standish, Forrester, and others, were as consistent as they were compelling: Agile development resulted in far fewer delays, cost overruns, quality problems and investment cancellations than its waterfall counterpart. With enough success stories and experienced practitioners to go round, it’s no surprise that so many captive IT functions embraced Agile.

But scale posed a challenge. The Agile practices that worked so well in small to midsize programs now needed to support very large programs and large enterprise functions. How scale is addressed marks a critical distinction between the truly agile and those that are just trying to be Agile.

Many in the agile community solved for scale by applying the implicit agile value system, incorporating things like autonomous organizations (devolved authority), platforms (extending the product concept into internally-facing product capabilities) and weak ownership of code (removing barriers of code ownership). Unfortunately, all too many went down the path of fusing Agile with Waterfall, assimilating constructive Agile practices like unit testing and continuous build while simultaneously corrupting other practices like Stories (which become technical tasks under another name) and Iterations (which become increments of delivery, not iterations of evolving capability), ultimately subordinating everything under an oppressive regime pursuing adherence to a plan. Yes, oppressive: there are all too many self-proclaimed "Agile product organizations" where the communication flows point in one direction - left to right. These structures don’t just ignore feedback loops, they are designed to repress feedback.

If you’ve ever worked in or even just advocated for the agile organization, this compromise is unconscionable, as agile is fundamentally the pursuit of excellence - in engineering, analysis, quality, and management. Once Agile is hybridized into waterfall, the expectation for Agile isn’t excellence in engineering and management and the like; it is instead a means of increasing the allegiance of both manager and engineer to the plan. Iteration plans become commitments; unit tests become guarantees of quality.

Thus compromised, the outcomes are largely the same as they ever were: shadow activities and functions sprout up to compensate for IT’s shortcomings. The captive IT might be Agile, but it isn’t agile, as evidenced by the length of the shadows they cast throughout the organization.

Thursday, June 30, 2022

The New New New Normal

My blogs in recent months have focused on macroeconomic factors affecting tech, primarily inflation and interest rates and the things driving them: increased labor power, supply shortages, expansion of M2, and unabated demand. The gist of my arguments has been that although the long-term trend still favors tech (tech can reduce energy intensity as a hedge against energy inflation, reduce labor intensity as a hedge against labor inflation, and so forth), there is no compelling investment thesis at this time, because we’re in a state of both global and local socio-economic transition and there is simply too much uncertainty. Five-year return horizons are academic exercises in wishful thinking. Do you know any business leader who, five years ago, predicted with any degree of accuracy the economic conditions we face today and the conditions we experienced on the way to where we are today?

It is interesting how the nature of expected lasting economic change has itself changed in the last 2+ years.

A little over two years ago, there was the initial COVID-induced shock: what does a global pandemic mean to market economies? That was answered quickly, as the wild frenzy of adaptation made clear that supply in most parts of the economy would find a way to adapt, and demand wasn’t abating. Tech especially benefited as it was the enabler of this adaptation. Valuations ran wild as demand and supply quickly recovered from their initial seizures. Tech investments quickly became clear-cut winners.

As events of the pandemic unfolded, the question then became, "how will economies be permanently changed as a result of changes in business, consumer, labor, capital and government behavior?" The longer COVID policies remained in place, the more permanent the adaptations in response to them would become. For example, why live in geographic proximity to a career when one can pursue a career while living in geographic proximity to higher quality of life? Many asked this and similar questions, but not all did; among those that did, not all answered in the same way. This created an inevitable friction in the workforce. Not a year into the pandemic, the battle lines over labor policies were already being drawn between those with an economic interest in the status quo ante calling for a return to office (e.g., large banks) and those looking to benefit from improved access to labor and a lower cost base embracing a permanent state of location independence (e.g., Airbnb). Similar fault lines appeared in all sorts of economic activity: how people shop (brick-and-mortar versus online), how people consume first-run entertainment (theaters versus streaming), how people vacation, and on and on. Tech stood to benefit from both lasting pandemic-initiated change (as the enabler of the new) and the friction between the new and reversion to pre-pandemic norms (as the enabler of compromise - that is, hybrid - solutions). Tech investments again were winners, even if the landscape was a bit more polarized and muddled.

Just as the battles to define the soon-to-be-post-COVID normal were gearing up for consumers and businesses and investors, they were eclipsed by more significant changes that make economic calculus impossible.

First, inflation is running amok in the US for the first time in decades. While tame by historic US and global standards, current inflation is far higher than the low levels US voters have become accustomed to. High inflation creates political impetus to respond. Policy responses to inflation have not historically been benign: by way of example, the US only brought runaway 1970s inflation (in fact, stagflation - high unemployment and high inflation) under control with a hard economic landing in the form of a series of recessions in the late 1970s and early 1980s. With the most recent interest rate hike, recession expectations have increased among economists and business leaders. Mild or severe is beside the point: twelve months ago, while much of the economy recovered and some sectors even prospered, recession was not seen as a near-term threat. It is now. Go-go tech companies have particularly felt the brunt of this, as their investors’ mantra has done an abrupt volte-face from "grow" to "conserve cash". Tech went from unquestioned winner to loser on the merits of policy responses to inflation alone.

Second, war is raging in Europe, and that war has global economic consequences. Both Ukraine and Russia are major exporters of raw materials such as agricultural products and energy. A number of nations across the globe have prospered in no small part because of their ability to import cheap energy and cheap food, allowing them to concentrate on developing export industries of expensive engineering services and expensive manufactured products. Those nations have also had the luxury of time to chart a public policy course for evolving their economies toward things like renewable energy sources without disrupting major sectors of the population with things like unemployment, while domestic social policy has benefited from a "peace dividend" of needing to spend only minimally on defense. The prosperity of many of those countries is now under threat as war forces a re-sourcing of food and energy suppliers and threatens deprioritization of social policies. Worse still, input cost changes threaten the competitiveness of their industrial champions, particularly vis-à-vis companies in nations that can continue to do business with an aggressor state in Europe. The bottom line is, the economic parameters that we’ve taken for granted for decades can no longer factor into return-on-investment models. Tech as an optimizer and enabler of a better future is of secondary importance when countries are scrambling to figure out how to make sure there are abundant, cheap resources for people and production.

Tech went from darling to dreadful rather quickly.

It’s worth bearing in mind that these recent macro pressures could abate, quite suddenly. Recovery from a real-economy recession tends to be far faster than recovery from a recession in the financial economy. Such a recovery - notwithstanding the possibility of secular stagnation - would bring the economic conversation back to growth in short order. Additionally, regardless of the outcome, should the war in Europe end abruptly, realpolitik dictates a return to business-as-usual, which would mean a quick rehabilitation of Russia from pariah state to global citizen among Western nations. However, the longer these macro conditions last, the more they fog the investment horizon for any business.

Which brings us back to the investing challenge that we have today. In the current environment, an investment in tech is not a bet on how well it will perform under a relatively stable set of parameters such as pursuing stable growth or reducing costs relative to stable demand. A tech investment today is a bet on how well an investment’s means (the mechanisms of delivering that investment) and ends (the outcomes it will achieve) accurately anticipate the state of the world during its delivery and its operation. That’s not simple when so many things are in flux. We’re on our third “new normal” in two years. There is no reason to think a stable new normal is in the offing any time soon.

Tuesday, May 31, 2022

The Credit Cycle Strikes Back

A few months ago, I wrote that the capital cycle has become less important than the tech cycle. I’d first come across this argument in a WSJ article in 2014, and, having lived through too many credit cycles, it took me some time to warm up to it. The COVID-19 pandemic laid this bare: all the cheap capital in the world provided by the Fed would have done nothing if there wasn’t a means of conducting trade. Long before the pandemic, tech had already made it possible to conduct trade.

Capital has flexed its muscles in recent months, and the results aren’t pretty. The Fed has raised interest rates and made clear its intention to continue to increase them to rein in inflation. The results are what you’d expect: risk capital has retreated and asset values have fallen. Tech, in particular, has taken a beating. Rising inflation has been limiting household spending on things like streaming services, abruptly ending their growth stories. Tech-fueled assets like crypto have cratered. Many tech firms are being advised to do an immediate volte-face from “spend in pursuit of growth” to “conserve cash.”

But this doesn’t necessarily mean the credit cycle has re-established superiority over the tech cycle.

Capital is still cheap by historical standards. In real terms, interest rates are still negative at 5- and 10-year horizons. Rates are less negative than they were a year ago, but they’re still negative. Compare that to the relatively robust period of 2005, when real interest rate curves were positive. Less cheap isn’t the same as expensive. Plus, it’s worth pointing out that corporate balance sheets remain flush with cash.
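The "negative in real terms" claim is just the Fisher relation: the real rate is what remains of a nominal yield after expected inflation. The numbers below are purely illustrative, not actual market data, but they show how a seemingly positive nominal rate can still be deeply negative in real terms.

```python
# Real interest rate via the Fisher relation: (1 + nominal) / (1 + inflation) - 1.
# Figures are hypothetical, chosen only to illustrate the arithmetic.
def real_rate(nominal: float, inflation: float) -> float:
    """Real interest rate implied by a nominal rate and expected inflation."""
    return (1 + nominal) / (1 + inflation) - 1

# Hypothetical: a 3% nominal yield against 8% expected inflation
# still leaves the lender losing purchasing power.
r = real_rate(0.03, 0.08)
print(f"{r:.2%}")  # -4.63%
```

A borrower at that hypothetical rate is effectively being paid to borrow, which is why "less negative" is not the same as tight.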

Any credit contraction puts the most fringe (read: highest-risk) investments at greater risk. For example, a business that subsidizes every unit of consumption of its product or service is by definition operationally cash flow negative. Cheap capital made it economically viable for such a company to try to create or buy a market until it could find new sources of subsidy (i.e., advertisers) or exercise pricing power (start charging for use). If that moment didn’t arrive before credit tightening began, well, time’s up. The same applies to asset classes like crypto: when credit tightens, it’s risk off as investors seek safer havens.
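The subsidy math above can be sketched in a few lines: when every unit sold loses money, raised capital buys a fixed amount of time. All the figures here are hypothetical, chosen only to show the shape of the calculation.

```python
# Runway sketch for a subsidized-consumption business. Hypothetical figures.
def runway_months(cash: float, price: float, unit_cost: float,
                  units_per_month: float) -> float:
    """Months until cash runs out when each unit is sold below cost."""
    margin = price - unit_cost           # negative: every sale is subsidized
    burn = -margin * units_per_month     # cash consumed per month
    if burn <= 0:
        return float("inf")              # cash-flow positive: no fixed runway
    return cash / burn

# Hypothetical: $50M raised, selling a $10-cost unit for $8, 500k units/month
# -> $1M burned monthly, 50 months to find advertisers or pricing power.
print(runway_months(50_000_000, 8.0, 10.0, 500_000))  # 50.0
```

When credit tightens, the "raise more cash before the runway ends" option disappears, which is exactly the squeeze described above.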

The risk to the tech cycle is how far the Fed will push up interest rates to combat inflation.

Supply chains are still constrained and labor markets are still tight. Demand is outstripping supply, and that’s driving up the prices of what is available. Raising rates is a tool for reducing demand, specifically reducing credit-based purchases. Higher interest rates won’t put more products on the shelves or more candidates in the labor pool. If demand doesn’t abate - mind you, this is still an economy coming out of its pandemic-level limitations - inflationary pressures will continue, and the Fed has made clear they’ll keep increasing rates until inflation cools off. With other shocks lurking - a war in Europe, the threat of food shortages, the threat of rolling electricity blackouts - inflation could remain at elevated levels while capital becomes increasingly expensive. Of course, sustained elevated interest rates would have negative consequences for bond markets, real estate, durable goods, and so on. The higher the rates and the longer they last, the harder the economic landing.

That said, tech is the driver of labor productivity, product reach and distribution, and a key source of corporate innovation. The credit cycle would have to reach Greenspan-era interest rates before there would be a material impact on the tech cycle. And even then, it’s worth remembering that the personal computer revolution took root during a period of high interest rates. Labor productivity improvement was so great compared to the hardware and software costs that interest rates had no discernible effect.

The credit cycle is certainly making itself felt in a big way. But it’s more accurate to say for now that capital sneezed and tech caught a cold.

Saturday, April 30, 2022

Has Labor Peaked?

I wrote some time ago that labor is enjoying a moment. New working habits developed out of need during the pandemic that in many ways increased quality of life for knowledge workers. Meanwhile, an expansion of job openings and a contraction in the labor participation rate created a supply-demand imbalance that favored labor.

There appears to be confusion of late as to how to read labor market dynamics. With fresh unionization wins and increased corporate commitment to location-independent working, is labor power increasing? Or, with a declining economy and more people returning to the workplace (as evidenced by increases in the labor participation rate), is labor power near its peak?

The question “has labor peaked?” intimates a reversion to the mean, specifically that labor power will revert to where it was pre-pandemic (i.e., “workers won’t continue to enjoy so much bargaining power”). The argument goes that fewer people have left the workforce than have quit jobs for better ones; that hiring rates have increased along with exits; that the labor participation rate has ticked up slightly; that labor productivity has increased (thus lessening the need for labor); and that demand is cooling (per Q1 GDP numbers). Toss in 1970s-sized inflation compelling retirees to return to the workforce and there’s an argument to be made that labor’s advantages will be short lived.

But this argument is purely economic, focusing on scarcity in the labor market that has created wage pressure. For one thing, it ignores potential structural economic changes yet to play out, such as the decoupling of supply chains in the wake of new geopolitical realities. For another, it ignores real structural changes in the labor market itself, things like labor demographics (migrations from high-tax to low-tax states), increased workplace control by the individual laborer (less direct supervision when working from home), and improvements in work/life balance.

The question “has labor peaked?” becomes relevant only when there is an outright contraction in the job market. For now, the better question to ask is how durable the changes in the relationship between employers and employees are. It isn’t so much that labor has the upper hand as that labor has more negotiating levers than it did just a few years ago. The fact that there hasn’t been a mad rush to return to pre-pandemic labor patterns suggests employers are responding to structural changes in labor market dynamics.

Trying to call a peak in labor power misses the mark. And for now, the more important question still seems some way off from being settled.