If you haven’t yet read the Innovator’s Dilemma, and you are involved in startups or high tech in general, I urge you to find a copy and get familiar with the concepts (here’s a summary at Wikipedia). In a nutshell, the theory states that there are moments in business when following all the “correct” or apparent data will lead a healthy company or industry down the wrong path (check the above link for some good examples). I believe the disruptive moment has come for personal computing.
The era began in the 80s, sparked by the combination of hobbyist machines (the TI-99/4A, Apple ][, etc.) and work PCs (IBM, Compaq, etc.). Throughout the 80s, using computers was really an adventure: there was limited software, clunky hardware, practically no connectivity (that's right, not even Twitter!), and lots and lots of proprietary technology. Apple's Macintosh was marketed as the anti-Big Brother alternative to IBM PCs, despite the fact that Macs were closed systems and it was DOS that enabled openness, customization and flexibility. At the time, the Mac seemed poised to grab a big chunk of the personal computing market while PCs seemed destined for the office. Consumers didn't want the hassles of configuring home PCs, especially since the majority of the time they were just playing Oregon Trail and Carmen Sandiego anyway.
That is, until Windows 3.1 and then 95 showed up, each radically transforming the usability of PCs, while at the same time Apple "lost its way" with a series of ill-timed, flubbed products. PC manufacturers standardized around Windows, and consumers had a fairly reliable and easy-to-use computer in the PC, with the ability to pick and choose the hardware configurations they wanted. The 90s also birthed the laptop, and as laptops reached mass consumption they too offered massive flexibility, with some models offering literally tens of thousands of configuration combinations. Computers were fun, productive, (mostly) reliable, and enabled access to dialup, then high-speed, Internet services.
Fast forward to 2006. PC makers began prepping their Vista launches and marketed new models as Vista-ready. The Microsoft marketing machine was in the early stages of a rumored billion-dollar push. Consumers, for the most part, didn't really know or care much about Vista, as XP (with Service Pack 2) was quite reliable and, most importantly, provided good Internet access. Vista launched, and Microsoft made a critical error by not mandating higher quality standards within the Vista-Capable program. Basically, a massive wave of new PCs with terrible flaws hit the market, and both personal and professional computer users noticed. Virtually overnight, the ability to easily purchase a reliable computer disappeared.
Meanwhile in Cupertino, Steve Jobs moved the Mac to the Intel chipset and shipped OS X 10.4 ("Tiger"), but more importantly ushered in Boot Camp dual-booting and third-party virtualization from Parallels, both of which let the Mac run Windows. In fairly rapid fashion, not only did the Mac become the most reliable computer around, it also ran Windows (and ran it quite well). This gave even the most anti-Mac people (such as myself) the confidence that a MacBook was no longer a risky, fringe purchase: it could still run all the old software PC users were accustomed to, just in case.
To add a third twist, ASUS introduced the Eee PC last year, a "mini-notebook" that has now spawned a fairly large group of competitors. I question the real size of this market, but it's a key factor in our discussion of the disruption curve. These computers run a variety of operating systems and have virtually no customizable components. Whether or not I'm right about the true market opportunity, PC manufacturers are jumping on the bandwagon of ultra-small and/or non-configurable computers.
Now onto the disruption curve: I believe the era of customizing computers has come to an end. I posted a few weeks back on why I think the MacBook will take huge market share, and I believe it's not really because Apple is doing anything so "special" per se. I surmise that a transformative shift has already occurred, and whether by luck or by skill, Apple is poised properly for the next era of computer buying.
Consumers are speaking with their checkbooks. They are buying computers that are (1) easy to use, (2) reliable, (3) recommended by friends, and (4) capable of the one thing they really want to do most often: get online. The near-irony of the shift we are experiencing is that we are closer than ever to the computing era of the 1970s, with terminal-server interaction as the key model. It's no longer about massive differentials in software availability, since the browser is the most important desktop application we use today.
I believe the PC manufacturers have ample opportunity to take advantage of this shift as well, but I also believe most will not do so until it is far too late. As with any moment of disruption, it must be incredibly hard for a PC OEM to stop making laptop ordering systems with 4 pages of configurable options and reduce them to a few controlled selections. The current system works, even if growth has evaporated, and there is no groundswell of consumers or focus groups pointing the way toward offering less choice (one of the biggest problems with focus-grouping anything, btw – to a focus group, less is never more).
It'll be interesting to watch how PC manufacturers react to their changing world. I've been quite bullish on the Mac platform recently, but I don't think it's a done deal by a long shot. I truly don't think this has anything to do with Vista or OS X specifically; it has everything to do with understanding customers' needs. The irony, in my eyes, is that the "speeds and feeds" of PCs are more hidden than ever (as I learned from Ed Bott), as manufacturers have tried to downplay things like gigahertz and CPU caching. In other words, they're already marketing their wares based on the new reality; they just aren't shipping the right wares.