Success is a lousy teacher. It seduces smart people into thinking they can’t lose. And it’s an unreliable guide to the future. What seems the perfect business plan or latest technology today may soon be as out-of-date as the eight-track tape player, the vacuum-tube television, or the mainframe computer. I’ve watched it happen. Careful observation of many companies over a long period of time can teach you principles that will help with strategies for the years ahead.
Companies investing in the highway will try to avoid repeating the mistakes made in the computer industry over the past twenty years. I think most of these mistakes can be understood by looking at a few critical factors. Among them are negative and positive spirals, the necessity of initiating rather than following trends, the importance of software as opposed to hardware, and the role of compatibility and the positive feedback it can generate.
You can’t count on conventional wisdom. That only makes sense in conventional markets. For the last three decades the market for computer hardware and software has definitely been unconventional. Large established companies that one day had hundreds of millions of dollars in sales and lots of satisfied customers had disappeared in a short time. New companies, such as Apple, Compaq, Lotus, Oracle, Sun, and Microsoft, appeared to go from nothing to a billion dollars of revenue in a flash. These successes were driven, in part, by what I call the “positive spiral.”
When you have a hot product, investors pay attention to you and are willing to put their money into your company. Smart kids think, Hey, everybody’s talking about this company. I’d like to work there. When one smart person comes to a company, soon another does, because talented people like to work with each other. This creates a sense of excitement. Potential partners and customers pay more attention, and the spiral continues, making the next success easier.
Conversely, there is a negative spiral companies can get caught in. A company in a positive spiral has an air of destiny, while one in a negative spiral feels doomed. If a company starts to lose market share or delivers one bad product, the talk becomes “Why do you work there?” “Why would you invest in that company?” “I don’t think you should buy from them.” The press and analysts smell blood and begin telling inside stories about who’s quarreling and who’s responsible for mismanagement. Customers begin to question whether, in the future, they should continue to buy the company’s products. Within a sick company everything is questioned, including things that are being done well. Even a fine strategy can get dismissed with the argument “You are just defending the old way” and that can cause more mistakes. Then down the company spirals. Leaders such as Lee Iacocca who have been able to reverse a negative spiral deserve a lot of credit.
Throughout my youth the hot computer firm was Digital Equipment Corporation, known as DEC. For twenty years its positive spiral seemed unstoppable. Ken Olsen, the company’s founder, was a legendary hardware designer and a hero of mine, a distant god. In 1960 he had created the minicomputer industry by offering the first “small” computers. The earliest was the PDP-1, the ancestor of my high school’s PDP-8. A buyer, instead of paying the millions asked by IBM for its “Big Iron,” could get one of Olsen’s PDP-1s for $120,000. It wasn’t nearly as powerful as the big machines, but it could be used for a wide variety of applications. DEC grew to a $6.7 billion company in eight years by offering a wide range of computers in different sizes.
Two decades later, Olsen’s vision faltered. He couldn’t see the future of small desktop computers. Eventually he was forced out of DEC, and part of his legend now is that he is the man famous for repeatedly, and publicly, dismissing the personal computer as a passing fad. I am sobered by stories like Olsen’s. He was brilliant at seeing new ways of doing things, and then—after years of being an innovator—he missed a big bend in the road.
Another visionary who faltered was An Wang, the Chinese immigrant who built Wang Laboratories into the dominant supplier of electronic calculators in the 1960s. In the 1970s he ignored the advice of everyone around him and left the calculator market just before the arrival of low-cost competition that would have ruined him. It was a brilliant move. Wang reinvented his company to be the leading supplier of word-processing machines. During the 1970s, in offices around the world, Wang word-processing terminals began to replace typewriters. The machines contained a microprocessor but weren’t true personal computers, because they were designed to do only one thing—handle text.
Wang was a visionary engineer. The kind of insight that had led him to abandon calculators could have led to success in personal-computer software in the 1980s, but he failed to spot the next industry turn. Even though he developed great software, it was tied proprietarily to his word processors. His software was doomed once general-purpose personal computers appeared that could run a variety of word-processing software applications such as WordStar, WordPerfect, and MultiMate (which imitated Wang software). If Wang had recognized the importance of compatible software applications, there might not be a Microsoft today. I might be a mathematician or an attorney somewhere, and my adolescent foray into personal computing might be little more than a distant personal memory.
IBM was another major company that missed technological changes at the start of the PC revolution. The company’s leader had been a hard-driving former cash-register salesman, Thomas J. Watson. Technically, Watson wasn’t the founder of IBM, but it was thanks to his aggressive management style that by the early 1930s IBM dominated the market for accounting machines.
IBM began working on computers in the middle 1950s. It was one of many companies in the business vying for leadership in the field. Until 1964 each computer model, even from the same manufacturer, had had a unique design and required its own operating system and application software. An operating system (sometimes called a disk-operating system, or just DOS) is the fundamental software that coordinates a computer system’s components, tells them how to work together, and performs other functions. Without an operating system, a computer is useless. The operating system is a platform on which all the software programs for applications—such as accounting or payroll or word-processing or electronic-mail programs—are built.
Computers at different price levels had different designs. Some models were dedicated to scientific study, others to commerce. As I discovered when I wrote the BASIC for various personal computers, significant work was required to move software from one computer model to another. This was true even if the software was written in a standard language such as COBOL or FORTRAN. Under the direction of young Tom, as Watson’s son and successor was known, the company gambled $5 billion on the novel notion of scalable architecture—all the computers in the System/360 family, no matter what size, would respond to the same set of instructions. Models built with different technology, from the slowest to the fastest, from small machines that could fit into a normal office to water-cooled giants that sat in climate-controlled glass rooms, could run the same operating system. Customers could move their applications and peripherals, accessories such as disks, tape drives, and printers, freely from one model to the next. Scalable architecture completely reshaped the industry.
System/360 was a runaway success and made IBM the powerhouse in mainframe computers for the next thirty years. Customers made large investments in the 360, confident that their commitment to software and training would not be wasted. If they needed to move to a larger computer, they could get an IBM that ran the same system, and shared the same architecture. In 1977, DEC introduced its own scalable-architecture platform, the VAX. The VAX family of computers ultimately ranged from desktop systems to mainframe-size machine clusters and did for DEC what System/360 did for IBM. DEC became overwhelmingly the leader in the minicomputer market.
The scalable architecture of the IBM System/360 and its successor, the System/370, drove many of IBM’s competitors out of business and scared away potential newcomers. In 1970, a new competing company was founded by Eugene Amdahl, who had been a senior engineer at IBM. Amdahl had a novel business plan. His company, also called Amdahl, would build computers fully compatible with the IBM 360 software. Amdahl delivered hardware that not only ran the same operating systems and applications as IBM, but, because it took advantage of new technology, also outperformed IBM’s comparably priced systems. Soon Control Data, Hitachi, and Itel were also offering mainframes that were “plug-compatible” with IBM’s. By the mid-1970s, the importance of 360 compatibility was becoming obvious. The only mainframe companies doing well were those whose hardware could run IBM’s operating systems.
Before the 360, computer designs were intentionally incompatible with those from other companies, because the manufacturer’s goal was to make it discouragingly difficult and expensive for customers heavily invested in one company’s computers to switch to a different brand. Once a customer committed to a machine, he or she was stuck with offerings from the computer’s manufacturer, because changing the software was possible but prohibitively painful. Amdahl and the others ended that. Market-driven compatibility became an important lesson for the young personal-computer industry. It should also be remembered by those creating the highway. Customers choose systems that give them a choice of hardware suppliers and the widest variety of software applications.
While this was going on, I was busy enjoying school and experimenting with computers. I arrived at Harvard in the fall of 1973. In college there is a lot of posturing, and appearing to slack off was considered a great way to establish your coolness. Therefore, during my freshman year I instituted a deliberate policy of skipping most classes and then studying feverishly at the end of the term. It became a game—a not uncommon one—to see how high a grade I could pull while investing the least time possible. I filled in my leisure hours with a good deal of poker, which had its own attraction for me. In poker, a player collects different shards of information—who’s betting boldly, what cards are showing, what’s this guy’s pattern of betting and bluffing—and then crunches all that information together to devise a plan for his own hand. I got pretty good at this kind of information processing.
The experience of poker strategizing—and the money—were helpful when I got into business, but the other game I was playing, the postponing one, didn’t serve me well at all. But I didn’t know that then. In fact, I was encouraged that my dilatory practices were shared by a new friend, Steve Ballmer, a math major whom I met freshman year, when we lived in the same student dorm, Currier House. Steve and I led very different lives, but we were both trying to pare down to the minimum the course time needed to get top grades. Steve is a man of endless energy, effortlessly social. His activities took a lot of his time. By his sophomore year he was a manager of the football team, the advertising manager for the Harvard Crimson, the school newspaper, and president of a literary magazine. He also belonged to a social club, the Harvard equivalent of a fraternity.
He and I would pay very little attention to our classes and then furiously inhale the key books just before an exam. Once we took a tough graduate-level economics course together—Economics 2010. The professor allowed you to bet your whole grade on the final if you chose. So Steve and I focused on other areas all semester, and did absolutely nothing for the course until the week before the last exam. Then we studied like mad and ended up getting A’s.
After Paul Allen and I started Microsoft, however, I found out that that sort of procrastination hadn’t been the best preparation for running a company. Among Microsoft’s first customers were companies in Japan so methodical that the minute we got behind schedule they would fly someone over to baby-sit us. They knew their man couldn’t really help, but he stayed in our office eighteen hours a day just to show us how much they cared. These guys were serious! They would ask, “Why did the schedule change? We need a reason. And we’re going to change the thing that caused it to happen.” I can still feel how painful being late on some of those projects got to be. We improved and mended our ways. We’re still late with projects sometimes but a lot less often than we would have been if we hadn’t had those scary baby-sitters.
Microsoft started out in Albuquerque, New Mexico, in 1975 because that’s where MITS was located. MITS was the tiny company whose Altair 8800 personal-computer kit had been on the cover of Popular Electronics. We worked with it because it had been the first company to sell an inexpensive personal computer to the general public. By 1977, Apple, Commodore, and Radio Shack had also entered the business. We provided BASIC for most of the early personal computers. This was the crucial software ingredient at that time, because users wrote their own applications in BASIC rather than buying packaged applications.
In the early days, selling BASIC was one of my many jobs. For the first three years, most of the other professionals at Microsoft focused solely on technical work, and I did most of the sales, finance, and marketing, as well as writing code. I was barely out of my teens, and selling intimidated me. Microsoft’s strategy was to get computer companies such as Radio Shack to buy licenses to include our software with the personal computers they sold (the Radio Shack TRS-80, for example) and pay us a royalty. One reason we took that approach was software piracy.
In the early years of selling Altair BASIC, our sales had been far lower than the widespread usage of our software suggested they should be. I wrote a widely disseminated “Open Letter to Hobbyists” asking the early users of personal computers to stop stealing our software so that we could make money that would let us build more software. “Nothing would please me more than being able to hire ten programmers and deluge the hobby market with good software,” I wrote. But my argument didn’t convince many hobbyists to pay for our work; they seemed to like it and used it, but preferred to “borrow” it from each other.
Fortunately, today most users understand that software is protected by copyright. Software piracy is still a major issue in trade relations because some countries still don’t have—or don’t enforce—copyright laws. The United States insists that other governments do more to enforce copyright laws for books, movies, CDs, and software. We will have to be extremely careful to make sure the upcoming highway doesn’t become a pirate’s paradise.
Although we were very successful selling to U.S. hardware companies, by 1979 almost half of our business was coming from Japan, thanks to an amazing guy named Kazuhiko (Kay) Nishi. Kay telephoned me in 1978 and introduced himself in English. He had read about Microsoft and thought he should be doing business with us. As it happened, Kay and I had a lot in common. We were the same age, and he too was a college student on leave because of his passion for personal computers.
We met some months later at a convention in Anaheim, California, and he flew back with me to Albuquerque, where we signed a page-and-a-half contract that gave him exclusive distribution rights for Microsoft BASIC in East Asia. There were no attorneys involved, just Kay and me, kindred spirits. We did more than $150 million of business under that contract—more than ten times greater than we had expected.
Kay moved fluidly between the business cultures of Japan and those of the United States. He was flamboyant, which worked in our favor in Japan, because it bolstered the impression among Japanese businessmen that we were whiz kids. When I was in Japan we’d stay in the same hotel room and he’d be getting phone calls all night long booking millions of dollars of business. It was amazing. One time there were no calls between three and five in the morning, and so when a call came in at five o’clock, Kay reached for the phone and said, “Business is a little slow tonight.” It was quite a ride.
For the next eight years, Kay seized every opportunity. Once, in 1981, on a flight from Seattle to Tokyo, Kay found himself sitting next to Kazuo Inamori, the president of the giant $650 million Kyocera Corporation. Kay, who ran ASCII, his Japanese company, confident of Microsoft’s cooperation, successfully pitched Inamori on a new idea—a small laptop computer with simple software built in. Kay and I designed the machine. Microsoft was still small enough that I could play a personal role in the software development. In the United States, it was marketed by Radio Shack in 1983 as the Model 100 for as little as $799. It was also sold in Japan as the NEC PC-8200 and in Europe as the Olivetti M-10. Thanks to Kay’s enthusiasm, it was the first popular laptop, a favorite of journalists for years.
Years later, in 1986, Kay decided he wanted to take ASCII in a direction different from the one I wanted for Microsoft, so Microsoft set up its own subsidiary in Japan. Kay’s company has continued to be a very important distributor of software in the Japanese market. Kay, who is a close friend, is still as flamboyant and committed to making personal computers universal tools.
The global nature of the PC market will also be a vital element in the development of the information highway. Collaborations between American and European and Asian companies will be even more important for the personal computer than they have been in the past. Countries or companies that fail to make their work global will not be able to lead.
In January 1979, Microsoft moved from Albuquerque to a suburb of Seattle, Washington. Paul and I came home, bringing almost all of our dozen employees with us. We concentrated on writing programming languages for the profusion of new machines that appeared as the personal-computer industry took off. People were coming to us with all kinds of interesting projects that had the potential to turn into something big. Demand for Microsoft’s services exceeded what we could supply.
I needed help running the business and turned to my old Economics 2010 pal from Harvard, Steve Ballmer. After graduating, Steve worked as an associate product manager for Procter & Gamble in Cincinnati, where his work included a stint paying calls on small grocery stores in New Jersey. After a few years he decided to go to the Stanford Business School. When he got my call he had finished only one year and wanted to complete his degree, but when I offered him part ownership of Microsoft, he became another student on indefinite leave. Shared ownership through the stock options Microsoft offered most of its employees has been more significant and successful than anyone would have predicted. Literally billions of dollars of value have accrued to them. The practice of granting employee stock options, which has been widely and enthusiastically accepted, is one advantage the United States has that will allow it to support a disproportionate number of start-up successes, building on opportunities the forthcoming era will bring.
Within three weeks of Steve’s arrival at Microsoft, we had the first of our very few arguments. Microsoft employed about thirty people by this time, and Steve had concluded we needed to add fifty more immediately.
“No way,” I said. Many of our early customers had gone bankrupt, and my natural fear of going bust in a boom time had made me extremely conservative financially. I wanted Microsoft to be lean and hungry. But Steve wouldn’t relent, so I did. “Just keep hiring smart people as fast as you can,” I said, “and I will tell you when you get ahead of what we can afford.” I never had to because our income grew as fast as Steve could find great people.
My chief fear in the early years was that some other company would swoop in and win the market from us. There were several small companies making either microprocessor chips or software that had me particularly worried, but luckily for me none of them saw the software market quite the way we did.
There was also always the threat that one of the major computer manufacturers would take the software for their larger machines and scale it down to run on small microprocessor-based computers. IBM and DEC had libraries of powerful software. Again, fortunately for Microsoft, the major players never focused on bringing their computer architecture and software to the personal-computer industry. The only close call came in 1979, when DEC offered its PDP-11 minicomputer architecture in a personal-computer kit marketed by Heathkit. DEC didn’t completely believe in personal computers, though, and wasn’t really pushing the product.
Microsoft’s goal was to write and supply software for most personal computers without getting directly involved in making or selling computer hardware. Microsoft licensed the software at extremely low prices. It was our belief that money could be made betting on volume. We adapted our programming languages, such as our version of BASIC, to each machine. We were very responsive to all the hardware manufacturers’ requests. We didn’t want to give anyone a reason to look elsewhere. We wanted choosing Microsoft software to be a no-brainer.
Our strategy worked. Virtually every personal-computer manufacturer licensed a programming language from us. Even though the hardware of two companies’ computers was different, the fact that both ran Microsoft BASIC meant they were somewhat compatible. That compatibility became an important part of what people purchased with their computers. Manufacturers frequently advertised that Microsoft programming languages, including BASIC, were available for their computers.
Along the way, Microsoft BASIC became an industry software standard.
Some technologies do not depend upon widespread acceptance for their value. A wonderful nonstick frying pan is useful even if you’re the only person who ever buys one. But for communications and other products that involve collaboration, much of the product’s value comes from its widespread deployment. Given a choice between a beautiful, handcrafted mailbox with an opening that would accommodate only one size envelope, and an old carton that everyone routinely dropped all mail and messages for you into, you’d choose the one with broader access. You would choose compatibility.
Sometimes governments or committees set standards intended to promote compatibility. These are called “de jure” standards and have the force of law. Many of the most successful standards, however, are “de facto”: ones the market discovers. Most analog timepieces operate clockwise. English-language typewriter and computer keyboards use a layout in which the keys across the top letter row, left to right, spell QWERTY. No law says they must. They work, and most customers will stick with those standards unless something dramatically better comes along.
But because de facto standards are supported by the marketplace rather than by law, they are chosen for the right reasons and replaced when something truly better shows up—the way the compact disc has almost replaced the vinyl record.
De facto standards often evolve in the marketplace through an economic mechanism very similar to the concept of the positive spiral that drives successful businesses, in which success reinforces success. This concept, called positive feedback, explains why de facto standards often emerge as people search for compatibility.
A positive-feedback cycle begins when, in a growing market, one way of doing something gets a slight advantage over its competitors. It is most likely to happen with high-technology products that can be made in great volume for very little increase in cost and derive some of their value from compatibility. A home video-game system is one example. It is a special-purpose computer, equipped with a special-purpose operating system that forms a platform for the game’s software. Compatibility is important because the more applications—in this case, games—that are available, the more valuable the machine becomes to a consumer. At the same time, the more machines consumers buy, the more applications software developers create for it. A positive-feedback cycle sets in once a machine reaches a high level of popularity and sales grow further.
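The cycle described above can be made concrete with a toy simulation. This is only a sketch: the starting numbers, growth rates, and the squared-preference rule are my invented assumptions, not figures from the text. Two competing game machines start nearly even; each year buyers strongly favor the machine with more games, and developers favor the machine with more owners, so a small initial advantage compounds.

```python
# Toy model of a positive-feedback cycle between two game machines.
# All numbers are illustrative assumptions. Buyers and developers both
# prefer the leading platform superlinearly (weights are squared), which
# is what turns a small early lead into near-total dominance.

def simulate(years=25, buyers_per_year=1000, new_games_per_year=100):
    machines = {"A": 110.0, "B": 100.0}  # installed base; A starts 10% ahead
    games = {"A": 11.0, "B": 10.0}       # available game titles
    for _ in range(years):
        # Buyers pick a machine in proportion to the square of its game count.
        w = {name: games[name] ** 2 for name in machines}
        for name in machines:
            machines[name] += buyers_per_year * w[name] / sum(w.values())
        # Developers write games in proportion to the square of installed base.
        v = {name: machines[name] ** 2 for name in games}
        for name in games:
            games[name] += new_games_per_year * v[name] / sum(v.values())
    return machines, games

machines, games = simulate()
share_a = machines["A"] / (machines["A"] + machines["B"])
print(f"Machine A's installed-base share after 25 years: {share_a:.0%}")
```

With a merely proportional preference the initial 10 percent edge would persist but never grow; the squared preference is what produces the runaway, winner-take-most outcome.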
Perhaps the most famous industry demonstration of the power of positive feedback was the videocassette-recorder format battle of the late 1970s and early 1980s. The persistent myth has been that positive feedback alone caused the VHS format to win out over Beta, even though Beta was technically better. Actually, early Beta tapes only recorded for an hour—compared to three hours for VHS—not enough for a whole movie or football game. Customers care more about a tape’s capacity than some engineer’s specs. The VHS format got off to a small lead over the Beta format used by Sony in its Betamax player. JVC, which developed the VHS standard, allowed other VCR manufacturers to use the VHS standard for a very low royalty. As VHS-compatible players proliferated, video-rental stores tended to stock more VHS than Beta tapes. This made the owner of a VHS player more likely than a Beta owner to find the movie she wanted at the video store, which made VHS fundamentally more useful to its owners and caused even more people to buy VHS players. This, in turn, further motivated video stores to stock VHS. Beta lost out as people chose VHS in the belief that it represented a durable standard. VHS was the beneficiary of a positive-feedback cycle. Success bred success. But not at the expense of quality.
While the duel between the Betamax and VHS formats was going on, sales of prerecorded videocassettes to U.S. tape-rental dealers were almost flat, just a few million copies a year. Once VHS emerged as the apparent standard, in about 1983, an acceptance threshold was crossed and the use of the machines, as measured by tape sales, turned abruptly upward. That year, over 9.5 million tapes were sold, a more than 50 percent increase over the year before. In 1984, tape sales reached 22 million. Then, in successive years: 52 million, 84 million, and 110 million units in 1987, by which time renting movies had become one of the most popular forms of home entertainment, and the VHS machine had become ubiquitous.
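Treating the unit figures quoted above as a simple series makes the takeoff easy to see: growth more than doubles for two years running after the 1983 threshold, then tapers as the market saturates.

```python
# Year-over-year growth in U.S. prerecorded-tape sales, using the unit
# figures quoted in the text (in millions of tapes).

sales = {1983: 9.5, 1984: 22, 1985: 52, 1986: 84, 1987: 110}

years = sorted(sales)
growth = {}
for prev, curr in zip(years, years[1:]):
    growth[curr] = sales[curr] / sales[prev] - 1
    print(f"{curr}: {sales[curr]:>5} million (+{growth[curr]:.0%})")
```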
This is an example of how a quantitative change in the acceptance level of a new technology can lead to a qualitative change in the role the technology plays. Television is another. In 1946, 10,000 television sets were sold in the United States and only 16,000 in the next year. But then a threshold was crossed, and in 1948 the number was 190,000. In successive years it was 1 million units, followed by 4 million, 10 million, and steadily up to 32 million sold in 1955. As more television sets were sold, more was invested in creating programming, which in turn further enticed people to buy television sets.
For the first few years after they were introduced, audio compact disc (CD) players and discs didn’t sell well, in part because it was difficult to find music stores that carried many titles. Then, seemingly overnight, enough players were sold and titles were available, and an acceptance threshold was crossed. More people bought players because more titles were available, and record companies made more titles available on CDs. Music lovers preferred the new, high-quality sound and convenience of compact discs, and they became the de facto standard and drove LPs out of the record stores.
One of the most important lessons the computer industry learned is that a great deal of a computer’s value to its user depends on the quality and variety of the application software available for it. All of us in the industry learned that lesson—some happily, some unhappily.
In the summer of 1980, two IBM emissaries came to Microsoft to discuss a personal computer they might or might not build.
At the time, IBM’s position was unchallenged in the realm of hardware, with a more than 80 percent market share of large computers. It had had only modest success with small computers. IBM was used to selling big, expensive machines to big customers. Its management suspected that the company, which had 340,000 employees, would require the assistance of outsiders if it was going to sell little, inexpensive machines to individuals as well as companies anytime soon.
IBM wanted to bring its personal computer to market in less than a year. In order to meet this schedule it had to abandon its traditional course of doing all the hardware and software itself. So IBM had elected to build its PC mainly from off-the-shelf components available to anyone. This made a platform that was fundamentally open, which made it easy to copy.
Although it generally built the microprocessors used in its products, IBM decided to buy microprocessors for its PCs from Intel. Most important for Microsoft, IBM decided to license the operating system from us, rather than creating software itself.
Working together with the IBM design team, we promoted a plan for IBM to build one of the first personal computers to use a 16-bit microprocessor chip, the 8088. The move from 8 to 16 bits would take personal computers from hobbyist toys to high-volume business tools. The 16-bit generation of computers could support up to one full megabyte of memory—16 times as much as an 8-bit computer. At first this would be just a theoretical advantage because IBM initially intended to offer only 16K of memory, 1/64 of the total memory possible. The benefit of going 16-bit was further lessened by IBM’s decision to save money by using a chip that employed only 8-bit connections to the rest of the computer. Consequently, the chip could think much faster than it could communicate. However, the decision to use a 16-bit processor was very smart because it allowed the IBM PC to evolve and remain the standard for PCs to this day.
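The address-space arithmetic behind those figures can be checked directly. Typical 8-bit machines used 16-bit memory addresses, while the 16-bit 8088 used 20-bit addresses; the 16K starting configuration and one-megabyte ceiling are the figures given above.

```python
# Address-space arithmetic behind the jump from 8-bit to 16-bit machines.
# 8-bit-era machines: 16-bit addresses -> 2**16 bytes (64K).
# The 16-bit 8088: 20-bit addresses -> 2**20 bytes (one megabyte).

eight_bit_limit = 2 ** 16     # 65,536 bytes
sixteen_bit_limit = 2 ** 20   # 1,048,576 bytes

print(sixteen_bit_limit // eight_bit_limit)  # -> 16 times the memory

# IBM's initial configuration shipped with just 16K of RAM:
initial_ram = 16 * 1024
print(sixteen_bit_limit // initial_ram)      # -> 64, i.e. 16K is 1/64 of 1MB
```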
IBM, with its reputation and its decision to employ an open design that other companies could copy, had a real chance to create a new, broad standard in personal computing. We wanted to be a part of it. So we took on the operating-system challenge. We bought some early work from another Seattle company and hired its top engineer, Tim Paterson. With lots of modifications the system became the Microsoft Disk Operating System, or MS-DOS. Tim became, in effect, the father of MS-DOS.
IBM, our first licensee, called the system PC-DOS; the PC was for personal computer. The IBM Personal Computer hit the market in August 1981 and was a triumph. The company marketed it well and popularized the term “PC.” The project had been conceived by Bill Lowe and shepherded to completion by Don Estridge. It is a tribute to the quality of the IBM people involved that they were able to take their personal computer from idea to market in less than a year.
Few remember this now, but the original IBM PC actually shipped with a choice of three operating systems—our PC-DOS, CP/M-86, and the UCSD Pascal P-system. We knew that only one of the three could succeed and become the standard. We wanted the same kinds of forces that were putting VHS cassettes into every video store to push MS-DOS to become the standard. We saw three ways to get MS-DOS out in front. First was to make MS-DOS the best product. Second was to help other software companies write MS-DOS-based software. Third was to ensure MS-DOS was inexpensive.
We gave IBM a fabulous deal—a low, one-time fee that granted the company the right to use Microsoft’s operating system on as many computers as it could sell. This offered IBM an incentive to push MS-DOS, and to sell it inexpensively. Our strategy worked. IBM sold the UCSD Pascal P-System for about $450, CP/M-86 for about $175, and MS-DOS for about $60.
Our goal was not to make money directly from IBM, but to profit from licensing MS-DOS to computer companies that wanted to offer machines more or less compatible with the IBM PC. IBM could use our software for free, but it did not have an exclusive license or control of future enhancements. This put Microsoft in the business of licensing a software platform to the personal-computer industry. Eventually IBM abandoned the UCSD Pascal P-system and CP/M-86 enhancements.
Consumers bought the IBM PC with confidence, and in 1982, software developers began turning out applications to run on it. Each new customer, and each new application, added to the IBM PC’s strength as a potential de facto standard for the industry. Soon most of the new and best software, such as Lotus 1-2-3, was being written for it. Mitch Kapor, with Jonathan Sachs, created 1-2-3 and revolutionized spreadsheets. The original inventors of the electronic spreadsheet, Dan Bricklin and Bob Frankston, deserve immense credit for their product, VisiCalc, but 1-2-3 made it obsolete. Mitch is a fascinating person whose eclectic background—in his case as a disc jockey and transcendental meditation instructor—is typical of the best software designers.
A positive-feedback cycle began driving the PC market. Once it got going, thousands of software applications appeared, and untold numbers of companies began making add-in or “accessory” cards, which extended the hardware capabilities of the PC. The availability of software and hardware add-ons sold PCs at a far greater rate than IBM had anticipated—by a factor of millions. The positive-feedback cycle spun out billions of dollars for IBM. For a few years, more than half of all personal computers used in business were IBMs and most of the rest were compatible with its machines.
The IBM standard became the platform everybody imitated. A lot of the reason was timing and its use of a 16-bit processor. Both timing and marketing are key to acceptance with technology products. The PC happened to be a good machine, but another company could have set the standard by getting enough desirable applications and selling enough machines.
IBM’s early business decisions, caused by its rush to get the PCs out, made it very easy for other companies to build compatible machines. The architecture was for sale. The microprocessor chips from Intel and Microsoft’s operating system were available. This openness was a powerful incentive for component builders, software developers, and everyone else in the business to try to copy.
Within three years almost all the competing standards for personal computers disappeared. The only exceptions were Apple’s Apple II and Macintosh. Hewlett-Packard, DEC, Texas Instruments, and Xerox, despite their technologies, reputations, and customer bases, failed in the personal-computer market in the early 1980s because their machines weren’t compatible and didn’t offer significant enough improvements over the IBM architecture. A host of start-ups, such as Eagle and Northstar, thought people would buy their hardware because it offered something different and slightly better than the IBM PC. All of the start-ups either changed to building compatible hardware or failed. The IBM PC became the hardware standard. By the mid-1980s, there were dozens of IBM-compatible PCs. Although buyers of a PC might not have articulated it this way, what they were looking for was the hardware that ran the most software, and they wanted the same system the people they knew and worked with had.
It has become popular for certain revisionist historians to conclude that IBM made a mistake working with Intel and Microsoft to create its PC. They argue that IBM should have kept the PC architecture proprietary, and that Intel and Microsoft somehow got the better of IBM. But the revisionists are missing the point. IBM became the central force in the PC industry precisely because it was able to harness an incredible amount of innovative talent and entrepreneurial energy and use it to promote its open architecture. IBM set the standards.
In the mainframe business IBM was king of the hill, and competitors found it hard to match the IBM sales force and high R&D. If a competitor tried climbing the hill, IBM could focus its assets to make the ascent nearly impossible. But in the volatile world of the personal computer, IBM’s position was more like that of the leading runner in a marathon. As long as the leader keeps running as fast or faster than the others, he stays in the lead and competitors will have to keep trying to catch up. If, however, he slacks off or stops pushing himself, the rest will pass him by. There weren’t many deterrents to the other racers, as would soon become clear.
By 1983, I thought our next step should be to develop a graphical operating system. I didn’t believe we would be able to retain our position at the forefront of the software industry if we stuck with MS-DOS, because MS-DOS was character-based. A user had to type in often-obscure commands, which then appeared on the screen. MS-DOS didn’t provide pictures and other graphics to help users with applications. The interface is the way the computer and the user communicate. I believed that in the future interfaces would be graphical and that it was essential for Microsoft to move beyond MS-DOS and set a new standard in which pictures and fonts (typefaces) would be part of an easier-to-use interface. In order to realize our vision, PCs had to be made easier to use not only to help existing customers, but also to attract new ones who wouldn’t take the time to learn to work with a complicated interface.
To illustrate the huge difference between a character-based computer program and a graphical one, imagine playing a board game such as chess, checkers, Go, or Monopoly on a computer screen. With a character-based system, you type in your moves using characters. You write “Move the piece on square 11 to square 19” or something slightly more cryptic like “Pawn to QB3.” But in a graphical computer system, you see the board game on your screen. You move pieces by pointing at them and actually dragging them to their new locations.
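The contrast can be sketched in a few lines of code. This is a hypothetical illustration, not anything from the products discussed: a character-based program has to parse a typed command, while a graphical one simply receives the squares where the mouse was pressed and released.

```python
# Two interfaces to the same "move a piece" action (illustrative only).

def parse_text_move(command):
    """Character-based style: the user types 'move 11 19' and the
    program must parse the source and destination out of the string."""
    verb, src, dst = command.split()
    if verb != "move":
        raise ValueError("unknown command: " + verb)
    return int(src), int(dst)

def drag_move(press_square, release_square):
    """Graphical style: the windowing toolkit reports where the mouse
    button was pressed and released; there is nothing to parse."""
    return press_square, release_square

# Both interfaces yield the same underlying move.
print(parse_text_move("move 11 19"))   # (11, 19)
print(drag_move(11, 19))               # (11, 19)
```

The underlying game logic is identical; only the layer between the user and that logic differs, which is why the interface could change without changing what the programs actually did.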
Researchers at Xerox’s now-famous Palo Alto Research Center in California explored new paradigms for human-computer interaction. They showed that it was easier to instruct a computer if you could point at things on the screen and see pictures. They used a device called a “mouse,” which could be rolled on a tabletop to move a pointer around on the screen. Xerox did a poor job of taking commercial advantage of this groundbreaking idea, because its machines were expensive and didn’t use standard microprocessors. Getting great research to translate into products that sell is still a big problem for many companies.
In 1983, Microsoft announced that we planned to bring graphical computing to the IBM PC, with a product called Windows. Our goal was to create software that would extend MS-DOS and let people use a mouse, employ graphical images on the computer screen, and make available on the screen a number of “windows,” each running a different computer program. At that time two of the personal computers on the market had graphical capabilities: the Xerox Star and the Apple Lisa. Both were expensive, limited in capability, and built on proprietary hardware architectures. Other hardware companies couldn’t license the operating systems to build compatible systems, and neither computer attracted many software companies to develop applications. Microsoft wanted to create an open standard and bring graphical capabilities to any computer that was running MS-DOS.
The first popular graphical platform came to market in 1984, when Apple released its Macintosh. Everything about the Macintosh’s proprietary operating system was graphical, and it was an enormous success. The initial hardware and operating-system software Apple released were quite limited but vividly demonstrated the potential of the graphical interface. As the hardware and software improved, the potential was realized.
We worked closely with Apple throughout the development of the Macintosh. Steve Jobs led the Macintosh team. Working with him was really fun. Steve has an amazing intuition for engineering and design as well as an ability to motivate people that is world class.
It took a lot of imagination to develop graphical computer programs. What should one look like? How should it behave? Some ideas were inherited from the work done at Xerox and some were original. At first we went to excess with the possibilities, using nearly every font and icon we could. Then we realized that all this made the screen hard to look at, and we changed to more sober menus. We created a word processor, Microsoft Word, and a spreadsheet, Microsoft Excel, for the Macintosh. These were Microsoft’s first graphical products.
The Macintosh had great system software, but Apple refused (until 1995) to let anyone else make computer hardware that would run it. This was traditional hardware-company thinking: If you wanted the software, you had to buy Apple computers. Microsoft wanted the Macintosh to sell well and be widely accepted, not only because we had invested a lot in creating applications for it, but also because we wanted the public to accept graphical computing.
Mistakes such as Apple’s decision to limit the sale of its operating-system software for its own hardware will be repeated often in the years ahead. Some telephone and cable companies are already talking about communicating only with the software they control.
It’s increasingly important to be able to compete and cooperate at the same time, but that calls for a lot of maturity.
The separation of hardware and software was a major issue in the collaboration between IBM and Microsoft to create OS/2. The separation of hardware and software standards is still an issue today. Software standards create a level playing field for the hardware companies, but many manufacturers use the tie between their hardware and their software to distinguish their systems. Some companies treat hardware and software as separate businesses and some don’t. These different approaches will be played out again on the highway.
Throughout the 1980s, IBM was awesome by every measure capitalism knows. In 1984, it set the record for the most money ever made by any firm in a single year—$6.6 billion of profit. In that banner year IBM introduced its second-generation personal computer, a high-performance machine called the PC AT, which incorporated Intel’s 80286 microprocessor (colloquially known as the “286”). It was three times faster than the original IBM PC. The AT was a great success, and within a year had more than 70 percent of all business PC sales.
When IBM launched the original PC, it never expected the machine to challenge sales of the company’s business systems, although a significant percentage of the PCs were bought by IBM’s traditional customers. Company executives thought the smaller machines would find their place only at the low end of the market. As PCs became more powerful, to avoid having them cannibalize its higher-end products, IBM held back on PC development.
In its mainframe business, IBM had always been able to control the adoption of new standards. For example, the company would limit the price/performance of a new line of hardware so it wouldn’t steal business from existing, more expensive products. It would encourage the adoption of new versions of its operating systems by releasing hardware that required the new software or vice versa. That kind of strategy might have worked well for mainframes, but it was a disaster in the fast-moving personal-computer market. IBM could still command somewhat higher prices for equivalent performance, but the world had discovered that lots of companies made compatible hardware, and that if IBM couldn’t deliver the right value, someone else would.
Three engineers who appreciated the potential offered by IBM’s entry into the personal-computer business left their jobs at Texas Instruments and formed a new company—Compaq Computer. They built hardware that would accept the same accessory cards as the IBM PC and licensed MS-DOS so their computers were able to run the same applications as the IBM PC. The company produced machines that did everything the IBM PCs did and were more portable. Compaq quickly became one of the all-time success stories in American business, selling more than $100 million worth of computers its first year in business. IBM was able to collect royalties by licensing its patent portfolio, but its share of market declined as compatible systems came to market and IBM’s hardware was not competitive.
IBM delayed the release of its PCs with the powerful Intel 386 chip, Intel’s successor to the 286, to protect the sales of its low-end minicomputers, which weren’t much more powerful than a 386-based PC. The delay allowed Compaq to become the first company to introduce a 386-based computer, in 1986. This gave Compaq an aura of prestige and leadership that previously had been IBM’s alone.
IBM planned to recover with a one-two punch, the first in hardware and the second in software. It wanted to build computers and write operating systems, each of which would depend exclusively on the other for its new features so competitors would either be frozen out or forced to pay hefty licensing fees. The strategy was to make everyone else’s “IBM-compatible” personal computer obsolete.
The IBM strategy included some good ideas. One was to simplify the design of the PC by taking many applications that had formerly been selectable options and building them into the machine. This would both reduce costs and increase the percentage of IBM components in the ultimate sale. The plan also called for substantial changes in the hardware architecture: new connectors and standards for accessory cards, keyboards, mice, and even displays. To give itself a further advantage IBM didn’t release specifications on any of these connectors until it had shipped the first systems. This was supposed to redefine compatibility standards. Other PC manufacturers and the makers of peripherals would have to start over—IBM would have the lead again.
By 1984, a significant part of Microsoft’s business was providing MS-DOS to manufacturers that built PCs compatible with IBM’s systems. We began working with IBM on a replacement for MS-DOS, eventually named OS/2. Our agreement allowed Microsoft to sell other manufacturers the same operating system that IBM was shipping with its machines. We each were allowed to extend the operating system beyond what we developed together. This time, though, it wasn’t like MS-DOS: IBM wanted to control the standard to help its PC hardware and mainframe businesses, and it became directly involved in the design and implementation of OS/2.
OS/2 was central to IBM’s corporate software plans. It was to be the first implementation of IBM’s Systems Application Architecture, which the company ultimately intended to have as a common development environment across its full line of computers from mainframe to midrange to PC. IBM executives believed that using the company’s mainframe technology on the PC would prove irresistible to corporate customers who were moving more and more capabilities from mainframe and mini-computers to PCs. They also thought that it would give IBM a huge advantage over PC competitors who would not have access to mainframe technology. IBM’s proprietary extensions of OS/2—called Extended Edition—included communications and database services. And it planned to build a full set of office applications—to be called OfficeVision—to work on top of Extended Edition. The plan predicted these applications, including word processing, would allow IBM to become a major player in PC-application software, and compete with Lotus and WordPerfect. The development of OfficeVision required another team of thousands. OS/2 was not just an operating system, it was part of a corporate crusade.
The development work was burdened by demands that the project meet a variety of conflicting feature requirements as well as by IBM’s schedule commitments for Extended Edition and OfficeVision. Microsoft went ahead and developed OS/2 applications to help get the market going, but as time went on, our confidence in OS/2 eroded. We had entered into the project with the belief that IBM would allow OS/2 to be enough like Windows that a software developer would have to make at most only minor modifications to get an application running on both platforms. But once IBM insisted that the applications be compatible with its mainframe and midrange systems, what we were left with was more like an unwieldy mainframe operating system than a PC one.
Our business relationship with IBM was vital to us. That year, 1986, we had taken Microsoft public to provide liquidity for the employees who had been given stock options. It was about that time Steve Ballmer and I proposed to IBM that they buy up to 30 percent of Microsoft—at a bargain price—so it would share in our fortune, good or bad. We thought this might help the companies work together more amicably and productively. IBM was not interested.
We worked extremely hard to make sure our operating-system work with IBM succeeded. I felt the project would be a ticket to the future for both companies. Instead, it eventually created an enormous rift between us. A new operating system is a big project. We had our team working outside Seattle. IBM had teams in Boca Raton, Florida; Hursley Park, England; and later Austin, Texas.
But the geographical problems were not as bad as those that came from IBM’s mainframe legacy. IBM’s previous software projects almost never caught on with PC customers precisely because they were designed with a mainframe customer in mind. For instance, it took three minutes for one version of OS/2 to “boot” (to make itself ready for use after it was turned on). That didn’t seem bad to them, because in the mainframe world, booting could take fifteen minutes.
IBM, with more than 300,000 employees, was also stymied by its commitment to company-wide consensus. Every part of IBM was invited to submit Design Change Requests, which usually turned out to be demands that the personal-computer-system software be changed to fit the needs of mainframe products better. We got more than 10,000 such requests, and talented people from IBM and Microsoft would sit and discuss them for days.
I remember change request #221: “Remove fonts from product. Reason: Enhancement to product’s substance.” Someone at IBM didn’t want the PC operating system to offer multiple typefaces because a particular IBM mainframe printer couldn’t handle them.
Finally it became clear that joint development wasn’t going to work. We asked IBM to let us develop the new operating system on our own and license it to them cheaply. We’d make our profit by selling the same thing to other computer companies. But IBM had declared that its own programmers had to be involved in the creation of any software it considered strategic. And operating-system software clearly was that.
IBM was such a great company. Why should it have so much trouble with PC software development? One answer was that IBM tended to promote all its good programmers into management and leave the less talented behind. Even more significant, IBM was haunted by its successful past. Its traditional engineering process was unsuitable for the rapid pace and market requirements of PC software.
In April 1987, IBM unveiled its integrated hardware/software, which was supposed to beat back imitators. The “clone-killer” hardware was called PS/2 and it ran the new operating system, OS/2.
The PS/2 included a number of innovations. The most celebrated was the new “Microchannel” bus circuitry, which allowed accessory cards to connect to the system and permitted the PC hardware to be extended to meet such particular customer requirements as sound or mainframe communications capabilities. Every compatible computer included a hardware-connection “bus” to allow these cards to work with the PC. The PS/2’s Microchannel was an elegant replacement for the connection bus in the PC AT. But it solved problems that most customers didn’t have. It was potentially much faster than the PC AT’s bus. But in actual practice the speed of the bus hadn’t been holding anyone up, and therefore customers couldn’t get much benefit from this newly available speed. More important, the Microchannel didn’t work with any of the thousands of add-in cards that worked with the PC AT and compatible PCs.
Ultimately, IBM agreed to license the Microchannel, for a royalty, to manufacturers of add-in cards and PCs. But by then a coalition of manufacturers had already announced a new bus with many of the capabilities of the Microchannel but compatible with the PC AT bus. Customers rejected Microchannel in favor of machines with the old PC AT bus. The complement of accessory cards for the PS/2 never came close to the number available for PC AT-compatible systems. This forced IBM to continue to release machines that supported the old bus. The real casualty was that IBM lost control of personal-computer architecture. Never again would it be able to move the industry single-handedly to a new design.
Despite a great deal of promotion from both IBM and Microsoft, customers thought OS/2 was too unwieldy and complicated. The worse OS/2 looked, the better Windows seemed. Because we had lost the chance for compatibility between Windows and OS/2, and because OS/2 would not run on modest machines, it still made sense to us to continue to develop Windows. Windows was far “smaller”—meaning it used less hard-disk space and could work in a machine with less memory—so there would be a place for it on machines that could never run OS/2. We called this the “family” strategy: OS/2 would be the high-end system and Windows would be the junior member of the family, for smaller machines.
IBM was never happy about our family strategy, but it had its own plans. In the spring of 1988, it joined other computer makers in establishing the Open Software Foundation to promote UNIX, an operating system that had originally been developed at AT&T’s Bell Labs in 1969 but over the years had splintered into a number of versions. Some of the versions were developed at universities, which used UNIX as a working laboratory for operating-systems theory. Other versions were developed by computer companies. Each company enhanced UNIX for its own computers, which made its operating system incompatible with everyone else’s. This meant that UNIX had become not a single open system, but a collection of operating systems competing with one another. All the differences made software compatibility harder and held back the rise of a strong third-party software market for UNIX. Only a few software companies could afford to develop and test applications for a dozen different versions of UNIX. Also, computer-software stores couldn’t afford to stock all the different versions.
The Open Software Foundation was the most promising of several attempts to “unify” UNIX and create a common software architecture that would work on various different manufacturers’ hardware. In theory, a unified UNIX could get a positive-feedback cycle going. But despite significant funding, it turned out to be impossible for the Open Software Foundation to mandate cooperation from a committee of vendors who were competing for each sale. Its members, including IBM, DEC, and others, continued to promote the benefits of their particular versions of UNIX. The UNIX companies suggested their systems would benefit customers by offering them more choices. But if you bought a UNIX system from one vendor, your software couldn’t automatically run on any other system. This meant you were tied to that vendor, whereas in the PC world you have a choice of where to buy your hardware.
The problems of the Open Software Foundation and similar initiatives point up the difficulty of trying to impose a standard in a field in which innovation is moving rapidly and all the companies that make up the standards committee are competitors. The marketplace (in computers or consumer electronics) adopts standards because customers insist on standards. Standards are to ensure interoperability, minimize user training, and of course foster the largest possible software industry. Any company that wants to create a standard has to price it very reasonably or it won’t be adopted. The market effectively chooses a reasonably priced standard and replaces it when it is obsolete or too expensive.
Microsoft operating systems are offered today by more than nine hundred different manufacturers, which gives customers choices and options. Microsoft has been able to provide compatibility because hardware manufacturers have agreed not to allow modifications to our software that introduce incompatibility. This means that hundreds of thousands of software developers don’t need to worry about what PCs their software will run on. Although the term “open” is used in many different ways, to me it means offering choice in hardware and software applications to the customer.
Consumer electronics has also benefited from standards managed by private companies. Years ago consumer electronics companies often tried to restrict competitors from using their technology, but now all of the major consumer electronics makers are quite open to licensing their patents and trade secrets. The royalties for their products are typically under 5 percent of the cost of the device. Audiocassettes, VHS tapes, compact discs, televisions, and cellular telephones are all examples of technologies that were created by private companies that receive royalties from everyone who makes the equipment. Dolby Laboratories’ algorithms, for example, are the de facto standard for noise reduction.
In May 1990, the last weeks before the release of Windows 3.0, we tried to reach an agreement with IBM for it to license Windows to use on its personal computers. We told IBM we thought that although OS/2 would work out over time, for the moment Windows was going to be a success and OS/2 would find its niche slowly.
In 1992, IBM and Microsoft stopped their joint development of OS/2. IBM continued to develop the operating system alone. The ambitious plan for OfficeVision was eventually canceled.
Analysts estimate that IBM poured more than $2 billion into OS/2, OfficeVision, and related projects. If IBM and Microsoft had found a way to work together, thousands of people-years—the best years of some of the best employees at both companies—would not have been wasted. If OS/2 and Windows had been compatible, graphical computing would have become mainstream years sooner.
The acceptance of graphical interfaces was also held back because most major software-applications companies did not invest in them. They largely ignored the Macintosh and ignored or ridiculed Windows. Lotus and WordPerfect, the market leaders for spreadsheet and word-processing applications, made only modest efforts on OS/2. In retrospect, this was a mistake, and, in the end, a costly one. When Windows finally benefited from a positive-feedback cycle, generated by applications from many of the small software companies, the big companies fell behind because they didn’t move to Windows fast enough.
Windows, like the PC, continues to evolve. Microsoft has continued to add new capabilities to various versions. Anyone can develop application software that runs on the Windows platform, without having to notify or get permission from Microsoft. In fact, today there are tens of thousands of commercially available software packages for the platform, including offerings that compete with most Microsoft applications.
Customers express to me their worry that Microsoft, because it is, by definition, the only source for Microsoft operating-system software, could raise prices and slow down or even stop its innovation. Even if we did, we wouldn’t be able to sell our new versions. Existing users would not upgrade, and we wouldn’t get any new users. Our revenue would fall and many more companies would compete to take our place. The positive-feedback mechanism helps challengers as well as the incumbent. You can’t rest on your laurels, because there is always a competitor coming up behind you.
No product stays on top unless it is improved. Even the VHS standard will be replaced when better formats appear at reasonable prices. In fact, the era of VHS is almost over. Within the next several years we will see new digital tape formats, digital movie discs that put feature films on discs like a music CD, and eventually the information highway will enable new services such as video-on-demand, and VHS will be unnecessary.
MS-DOS is being replaced now. Despite its incredible strength as the leading operating system for personal computers, it is being replaced by a system with a graphical user interface. The Macintosh software might have become the successor to MS-DOS. So might OS/2 or UNIX. It appears that Windows has the lead for the moment. However, in high tech this is no guarantee we’ll have it even in the near future.
We have had to improve our software to keep up with hardware advances. Each subsequent version will only be successful with new users if current users adopt it. Microsoft has to do its best to make new versions so attractive in terms of price and features that people will want to change. This is hard because a change involves a big overhead for both developers and customers. Only a major advance is able to convince enough users it is worth their while to change. With enough innovation it can be done. I expect major new generations of Windows to come along every two to three years.
The seeds of new competition are being sown constantly in research environments and garages around the world. For instance, the Internet is becoming so important that Windows will only thrive if it is clearly the best way to gain access to the Internet. All operating-system companies are rushing to find ways to have a competitive edge in providing Internet support. When speech recognition becomes genuinely reliable, this will cause another big change in operating systems.
In our business things move too fast to spend much time looking back. I pay close attention to our mistakes, however, and try to focus on future opportunity. It’s important to acknowledge mistakes and make sure you draw some lesson from them. It’s also important to make sure no one avoids trying something because he thinks he’ll be penalized for what happened or that management is not working to fix the problems. Almost no single mistake is fatal.
Lately, under Lou Gerstner’s leadership, IBM has become far more efficient and regained both its profitability and its positive focus on the future. Although the continuing decline in mainframe revenues remains a problem for IBM, it will clearly be one of the major companies providing products for businesses and the information highway.
In recent years, Microsoft has deliberately hired a few managers with experience in failing companies. When you’re failing you’re forced to be creative, to dig deep and think, night and day. I want some people around who have been through that. Microsoft is bound to have failures in the future, and I want people here who have proved they can do well in tough situations.
Death can come swiftly to a market leader. By the time you have lost the positive-feedback cycle it’s often too late to change what you’ve been doing, and all the elements of a negative spiral come into play. It is difficult to recognize that you’re in a crisis and react to it when your business appears extremely healthy. That is going to be one of the paradoxes for companies building the information highway. It keeps me alert. I never anticipated Microsoft’s growing so large, and now, at the beginning of this new era, I unexpectedly find myself a part of the establishment. My goal is to prove that a successful corporation can renew itself and stay in the forefront.