When I was growing up, the world was in the grip of what was then called the Computer Age. Computers, everyone knew, were where things were going. And so we were all given training in computer science to prepare us for the Jobs of Tomorrow, which as everyone knew were in computer programming.
To program a computer meant poking little holes in punch cards, stacks and stacks and stacks of them, which you then handed in to the computer lab. When your turn came in the queue, they would feed your stack of cards into the computer; you would get your homework back the next day in the form of a printout. If, as often happened, you had made some small mistake—somewhere—and the computer, baffled, had responded with a string of hysterical gibberish, you simply repeated the whole fiddling, nitpicking exercise.
And for most of us, that was that. The premise, that we were all going to be computer programmers, was false, and we knew it. Computers were for geeks, science fiction enthusiasts and others even further beyond the pale. Though in some ways my own mildly obsessive-compulsive nature made me a natural for it, my teenage identity was even then coalescing around the idea that I was actually some kind of artsie, or at least destined for the humanities.
And besides, programming a computer meant working for the people who had computers—business, mostly, unless you were desperate enough to work for the government. Business? There were two kinds of people who went into business: those who had a lot of money, and those who wanted a lot. All right for some, but it wasn’t something you’d do, if you, like, wanted some meaning in your life.
If you want to know the impact Steve Jobs and the company he created, Apple, have had on society, consider that not a line of the foregoing still applies. We did not all have to learn arcane computer languages, as we thought: instead, the computers became so simple to use that, for some purposes, a swipe of the fingers was all that was required. Computers themselves are no longer great printout-belching beasts, remote and fearsome, guarded by a kind of priesthood with occasional visiting rights for the rest of us, but are in every home, indeed in every pocket or purse.
Perhaps most important, the lines that once separated the arts and sciences, or bohemia and business, have been blurred, if not erased. Computers today are not used merely for crunching numbers, maintaining inventories and other tedium: they have become the tools of artists, musicians, filmmakers, even writers. Whereas, before, technology was seen mostly as something used to make things, now it has been harnessed to create works of the imagination. Jobs took the computer away from the engineers, and gave it to the artists. He not only made geeks hip, but made everyone into a geek, at least a bit—including women, who, in the computer age’s clunky, beige early years, would not have been caught dead using a computer, should anyone have thought to ask them.
And the business that did so much to make that possible, Apple, became the template for a kind of business that would not have occurred to my teenage self: a business that not only made great products, but made the world a better place. Jobs not only made computers cool, he made business cool: how many thousands of young people, inspired by Apple’s example, went into business for themselves, as partners in small high-tech start-ups? At a time when the Big Three and other long-time industrial titans were being eclipsed by foreign competition, often from low-wage economies, Jobs showed how advanced economies could still compete: by innovation, design, quality.
If Jobs helped to bridge these different worlds, it was in part because in many ways he combined them in himself. An early technology enthusiast and budding entrepreneur, he was also a college dropout who studied Buddhism and, by his own account, did a lot of acid. As a child of the sixties in southern California, he was a product of the counterculture, but of a vintage that, rather than wanting to smash the system, sought to remould it to its own designs.
Yet he was every inch the businessman; in some ways, the very stereotype of a bullying, bragging, hypertensive control freak. Again, what made him unique was the combination. He had not only an entrepreneur’s sense of the possible, but a manager’s attention to process; not only an engineer’s eye for technical detail, but a designer’s eye for beauty; and he combined all these with an acute sensitivity to cultural trends and an unmatched flair for salesmanship. Some have compared him to Edison, some to Ford, some to Disney. In truth he was all three, with a side order of P.T. Barnum.
So much of what the computer became was made possible or driven by Apple that it’s difficult to separate the two, just as it’s difficult to separate Apple’s story from Jobs’s. Certainly we do not have to speculate on his importance to Apple. We know how Apple did with Jobs, both in its spectacular initial phase and in the remarkable string of successes it has enjoyed in the last 15 years, and we know how it did in the decade after it fired him: it was a mess. If Jobs had no other achievement than co-founding Apple, he would be a business legend; but to have taken a firm that was near bankruptcy and transformed it into, as of this August, the world’s most valuable company (it has since slipped to second) is without parallel.
It’s true that his reputation as an innovator can be overstated. There were flops, from Lisa to Ping, to go with the hits. To be sure, Apple was the first to see a market for personal computers, and the first to produce a commercially successful one: both remarkable achievements in themselves. On the other hand, the company’s other great early innovation, the point-and-click graphical user interface, which transformed personal computers from a plaything for hobbyists to universal necessity, was borrowed from Xerox.
That was the template more often followed in later years: though there were genuine innovations, such as the radical design of the iMac, the more usual formula was to take what others had done first and do it better. Apple was late to market with its first laptop computer, just as it followed others in developing a smartphone and let Microsoft do the running on the tablet; famously, Jobs dismissed the prospects of a portable music player shortly before bringing the iPod to market.
But that should not obscure Jobs’s particular contribution. It was not just that he saw the commercial possibilities in technologies that others had developed, or his famous emphasis on design and fanatical devotion to quality. Again, it’s the combination. What Jobs understood better than anyone was the connection between the little things and the grand strategic vision—that the difference between consumer acceptance and rejection of a technology often lay in the smallest details of the user experience.
That insight was at the core of his most striking gut call: staying with the closed, proprietary model, software and hardware combined, when the rest of the computer world was going open-source and non-proprietary—indeed, even after it was widely agreed that the proprietary model had been proved wrong. Granted, Microsoft’s willingness to license its software to all comers was the key to its own rise to dominance, consigning Apple at one point to just three per cent of the market. But over time, the tables began to turn. The viruses, incompatibilities and quality issues that went with the Microsoft model began to discredit it, while the virtues of a seamless integration between hardware and software became more apparent.
The turning point, I think, was the iPod: as a music player, not only was it demonstrably better designed than its many competitors, but there simply was no rival to iTunes (another Apple borrowing, or rather purchase: the software began as a third-party app called SoundJam). By the time the iPhone arrived, consumers were more than willing to enter Apple’s “walled garden,” having access only to such software as Apple thought fit they should have.
This is key. It isn’t just the hardware—it’s the software. As much as Apple’s design and quality may have caused consumers to fetishize their machines, what made them truly transformative was the software. The scale of the transformation is, again, utterly without parallel. If in its first incarnation Apple turned the computer industry upside down, after Jobs’s return it did no less to the music, film and publishing industries. It not only recast the relationship between consumers and creators of content, but between creators and distributors. And it made the creation of content easier and less costly than ever before. How many aspiring filmmakers, for example, are editing on Final Cut Pro—technology that would have been prohibitively expensive a generation ago?
For all these reasons, ordinary people had a relationship with Apple unlike that with any other company. In the same way, Jobs commanded a personal respect, even veneration, that extended far beyond the business world. I can’t think of any other business figure whose death would have prompted such widespread mourning, especially among people who would not otherwise have any interest in business. Nor can I think of another figure, in business or out, who had such a profound and widespread impact on the society he lived in. Edison may have given us sight (the electric light bulb) and sound (the phonograph), Ford may have given us a new means of locomotion, but Jobs expanded our brains. Quite literally, we now have access, via our iPhones, to all of the world’s knowledge, at all times. It is almost too incredible to comprehend.
We do not have to overlook his faults or exaggerate his virtues to say that for sheer consequence he had few equals: maybe one of the four or five most important people to walk the Earth in his lifetime. Maybe the most.