I consult, write, and speak on running better technology businesses (tech firms and IT captives) and the things that make it possible: good governance behaviors (activist investing in IT), what matters most (results, not effort), how we organize (restructure from the technologically abstract to the business concrete), how we execute and manage (replacing industrial with professional), how we plan (debunking the myth of control), and how we pay the bills (capital-intensive financing and budgeting in an agile world). I am increasingly interested in robustness over optimization.

I work for ThoughtWorks, the global leader in software delivery and consulting.

Friday, January 22, 2010

Is Google to IBM as Apple is to Apple?

In the late 1970s, the microcomputer industry was still in its emergent stages. Microcomputers weren't nearly as powerful as mainframes and minicomputers, and there wasn't yet a clear "killer app" for them. But even then, it was obvious that microcomputers were going to have a significant impact on our lives. People bought computers for home and used them for work; some even brought them from home and used them at work. While the software was primitive, you could solve many different kinds of problems and perform sophisticated analyses more efficiently than ever before (e.g., the simple "what if" forecasting that we can perform in an open-source spreadsheet today was a major breakthrough in productivity 30 years ago). Having a microcomputer in the office was something of a status symbol, if a geeky one. And they made work fun.

The microcomputer industry had some other interesting characteristics.

Most corporate technology people (working in departments then known as "Information Systems" rather than "Information Technology") didn't take microcomputers all that seriously. They were seen as primitive devices with little computing power. Toys, really. From the perspective of the technology establishment, "micros" were only really useful if they ran terminal emulation software (emulating a VT100, 3270 or 5250) so they could connect to a more "serious" computer.

It was a highly fragmented market. There were lots of different combinations of architectures, operating systems and CPUs. There were also lots of different manufacturers, each offering its own standard and pursuing business users: firms such as Osborne, Sinclair, Commodore, Tandy and a rather distinctive firm called Apple Computer.

No one microcomputer platform was dominant. Each sought to develop and sponsor a library of applications and add-ons so it could sell hardware. For the most part, each relied on value-added resellers as its primary channel.

IBM took a different tack when they entered the microcomputer market. Rather than compete against the rest of the microcomputer industry, IBM created a new market for something they called a Personal Computer. Using off-the-shelf components, they built an open platform that anybody could replicate. The prevailing wisdom at the time was that "nobody ever got fired for buying IBM," which made personal computers a safe corporate investment. Through the combination of brand, applications and reach, the IBM PC became the standard.

For a few years, Apple and IBM waged a pitched battle. IBM, or perhaps more accurately the "personal computer" standard as defined by IBM, was victorious, and for all intents and purposes it remains the dominant platform today. And although IBM lost control of the standard they had created, they enjoyed strong hardware sales, while Apple was for many years relegated to niche markets such as education and desktop publishing.

Fast forward 30 years.

A handheld computer / smartphone industry has emerged in recent years, and it shares many of the characteristics of the early microcomputer business.

Smartphones have been underpowered for much of the past decade, but it's pretty obvious that they'll soon become very powerful and have a significant impact on our lives. The current "killer app", e-mail, is really a utility function. The smartphone equivalent of the microcomputer's "what if" capability hasn't yet been identified. But it will be, and these devices will change how we live and work. As with the early microcomputers, a lot of people have bought personal smartphones, and it's not uncommon for people to use their personal handheld for work (e.g., using an iPhone for maps and navigation). The smartphone a person carries is something of a status symbol, if a bit of a geeky one. And they're fun.

Until recently, we've been force-fitting big-screen (1440 x 900 pixel) experiences onto small handheld devices. That is, until the current generation of smartphones arrived, mobile devices were useful primarily as "internet terminals" rather than as application platforms, not unlike the terminal emulation of a previous generation.

It is a highly fragmented market, with competing CPUs and operating systems. There are also lots of different vendors with proprietary products, such as Nokia, BlackBerry, Palm and, once again, a firm called Apple.

No one platform is dominant. Each is seeking to create and sponsor a library of applications as a means to gain market share. Most sell through value-added resellers.

Google recently entered this market. In many ways they're taking the same approach as IBM: by offering an open platform, they've made it possible for anybody to build an Android-compatible phone. They've built out a sizable applications catalogue in a short amount of time. They also have brand and reach, although it can't yet be said that nobody ever got fired for buying Google.

It's interesting to see not only the same market phenomenon playing out with a different technology, but also that Apple (and specifically Steve Jobs) is once again at the epicenter of it.

Perhaps it will turn out differently this time. Apple has been through this same dynamic once before. They can also learn from Microsoft's unsuccessful attempts to make Windows Mobile a ubiquitous platform. And Google has entered the hardware business on the Android platform, but they're not a hardware company. However, none of this may matter. In the 1980s, the value was in the hardware, but the lion's share of revenue in the Android market won't be in hardware sales. This means Google is following a similar pattern but changing the attributes. They're not pursuing a 30-year-old strategy so much as updating that strategy to become the dominant provider in today's market.

No matter how this plays out, it's shaping up to be an epic battle for platform supremacy, just as we experienced 30 years ago. The microcomputer industry was highly innovative in the 1980s. It was an exciting business to be in. No doubt the same will be true of the smartphone / handheld computer business in the 2010s.

Mark Twain is credited with observing that "history doesn't repeat itself, but it does rhyme." We're witnessing that now. Best of all, we have a front-row seat.