Tuesday, June 17, 2014

The Impossible Dream Of Perpetual Innovation

The Jillster confronts innovation in today's post. When this blogger labored in the groves at the Collegium Excellens, the Holy Grail was innovation: a new way of teaching, a new way of grading, a new... new. It was an endless canard. Since this blogger joined the ranks of the honorably(?) retired, the BIG thing in innovation in the higher ed biz has been the MOOC (Massive Open Online Course), in which thousands of students worldwide enroll in an online course taught by a rockstar professor and study whatever at the rockstar's virtual knee (or other low joints). Thousands enroll, but only a relative handful complete the work. Innovate or die, you fools. If this is (fair & balanced) Oz-like wizardry, so be it.

[x New Yorker]
The Disruption Machine
By Jill Lepore

In the last years of the nineteen-eighties, I worked not at startups but at what might be called finish-downs. Tech companies that were dying would hire temps—college students and new graduates—to do what little was left of the work of the employees they’d laid off. This was in Cambridge, near M.I.T. I’d type users’ manuals, save them onto 5.25-inch floppy disks, and send them to a line printer that yammered like a set of prank-shop chatter teeth, but, by the time the last perforated page coiled out of it, the equipment whose functions those manuals explained had been discontinued. We’d work a month here, a week there. There wasn’t much to do. Mainly, we sat at our desks and wrote wishy-washy poems on keyboards manufactured by Digital Equipment Corporation, left one another sly messages on pink While You Were Out sticky notes, swapped paperback novels—Kurt Vonnegut, Margaret Atwood, Gabriel García Márquez, that kind of thing—and, during lunch hour, had assignations in empty, unlocked offices. At Polaroid, I once found a Bantam Books edition of Steppenwolf (1978) in a clogged sink in an employees’ bathroom, floating like a raft. “In his heart he was not a man, but a wolf of the steppes,” it said on the bloated cover. The rest was unreadable.

Not long after that, I got a better assignment: answering the phone for Michael Porter, a professor at the Harvard Business School. I was an assistant to his assistant. In 1985, Porter had published a book called Competitive Advantage, in which he elaborated on the three strategies—cost leadership, differentiation, and focus—that he’d described in his 1980 book, Competitive Strategy. I almost never saw Porter, and, when I did, he was dashing, affably, out the door, suitcase in hand. My job was to field inquiries from companies that wanted to book him for speaking engagements. The Competitive Advantage of Nations appeared in 1990. Porter’s ideas about business strategy reached executives all over the world.

Porter was interested in how companies succeed. The scholar who in some respects became his successor, Clayton M. Christensen, entered a doctoral program at the Harvard Business School in 1989 and joined the faculty in 1992. Christensen was interested in why companies fail. In his 1997 book, The Innovator’s Dilemma, he argued that, very often, it isn’t because their executives made bad decisions but because they made good decisions, the same kind of good decisions that had made those companies successful for decades. (The “innovator’s dilemma” is that “doing the right thing is the wrong thing.”) As Christensen saw it, the problem was the velocity of history, and it wasn’t so much a problem as a missed opportunity, like a plane that takes off without you, except that you didn’t even know there was a plane, and had wandered onto the airfield, which you thought was a meadow, and the plane ran you over during takeoff. Manufacturers of mainframe computers made good decisions about making and selling mainframe computers and devising important refinements to them in their R. & D. departments—“sustaining innovations,” Christensen called them—but, busy pleasing their mainframe customers, one tinker at a time, they missed what an entirely untapped customer wanted, personal computers, the market for which was created by what Christensen called “disruptive innovation”: the selling of a cheaper, poorer-quality product that initially reaches less profitable customers but eventually takes over and devours an entire industry.

Ever since The Innovator’s Dilemma, everyone is either disrupting or being disrupted. There are disruption consultants, disruption conferences, and disruption seminars. This fall, the University of Southern California is opening a new program: “The degree is in disruption,” the university announced. “Disrupt or be disrupted,” the venture capitalist Josh Linkner warns in a new book, The Road to Reinvention (2014), in which he argues that “fickle consumer trends, friction-free markets, and political unrest,” along with “dizzying speed, exponential complexity, and mind-numbing technology advances,” mean that the time has come to panic as you’ve never panicked before. Larry Downes and Paul Nunes, who blog for Forbes, insist that we have entered a new and even scarier stage: “big bang disruption.” “This isn’t disruptive innovation,” they warn. “It’s devastating innovation.”

Things you own or use that are now considered to be the product of disruptive innovation include your smartphone and many of its apps, which have disrupted businesses from travel agencies and record stores to mapmaking and taxi dispatch. Much more disruption, we are told, lies ahead. Christensen has co-written books urging disruptive innovation in higher education (The Innovative University [2011]), public schools (Disrupting Class [2008]), and health care (The Innovator’s Prescription [2008]). His acolytes and imitators, including no small number of hucksters, have called for the disruption of more or less everything else. If the company you work for has a chief innovation officer, it’s because of the long arm of The Innovator’s Dilemma. If your city’s public-school district has adopted an Innovation Agenda, which has disrupted the education of every kid in the city, you live in the shadow of The Innovator’s Dilemma. If you saw the episode of the HBO sitcom “Silicon Valley” in which the characters attend a conference called TechCrunch Disrupt 2014 (which is a real thing), and a guy from the stage, a Paul Rudd look-alike, shouts, “Let me hear it, disss-ruppttt!,” you have heard the voice of Clay Christensen, echoing across the valley.

Last month, days after the Times’ publisher, Arthur Sulzberger, Jr., fired Jill Abramson, the paper’s executive editor, the Times’ "2014 Innovation Report" was leaked. It includes graphs inspired by Christensen’s “Innovator’s Dilemma,” along with a lengthy, glowing summary of the book’s key arguments. The report explains, “Disruption is a predictable pattern across many industries in which fledgling companies use new technology to offer cheaper and inferior alternatives to products sold by established players (think Toyota taking on Detroit decades ago). Today, a pack of news startups are hoping to ‘disrupt’ our industry by attacking the strongest incumbent—The New York Times.”

A pack of attacking startups sounds something like a pack of ravenous hyenas, but, generally, the rhetoric of disruption—a language of panic, fear, asymmetry, and disorder—calls on the rhetoric of another kind of conflict, in which an upstart refuses to play by the established rules of engagement, and blows things up. Don’t think of Toyota taking on Detroit. Startups are ruthless and leaderless and unrestrained, and they seem so tiny and powerless, until you realize, but only after it’s too late, that they’re devastatingly dangerous: Bang! Ka-boom! Think of it this way: the Times is a nation-state; BuzzFeed is stateless. Disruptive innovation is competitive strategy for an age seized by terror.

Every age has a theory of rising and falling, of growth and decay, of bloom and wilt: a theory of nature. Every age also has a theory about the past and the present, of what was and what is, a notion of time: a theory of history. Theories of history used to be supernatural: the divine ruled time; the hand of God, a special providence, lay behind the fall of each sparrow. If the present differed from the past, it was usually worse: supernatural theories of history tend to involve decline, a fall from grace, the loss of God’s favor, corruption. Beginning in the eighteenth century, as the intellectual historian Dorothy Ross once pointed out, theories of history became secular; then they started something new—historicism, the idea “that all events in historical time can be explained by prior events in historical time.” Things began looking up. First, there was that, then there was this, and this is better than that. The eighteenth century embraced the idea of progress; the nineteenth century had evolution; the twentieth century had growth and then innovation. Our era has disruption, which, despite its futurism, is atavistic. It’s a theory of history founded on a profound anxiety about financial collapse, an apocalyptic fear of global devastation, and shaky evidence.

Most big ideas have loud critics. Not disruption. Disruptive innovation as the explanation for how change happens has been subject to little serious criticism, partly because it’s headlong, while critical inquiry is unhurried; partly because disrupters ridicule doubters by charging them with fogyism, as if to criticize a theory of change were identical to decrying change; and partly because, in its modern usage, innovation is the idea of progress jammed into a criticism-proof jack-in-the-box.

The idea of progress—the notion that human history is the history of human betterment—dominated the world view of the West between the Enlightenment and the First World War. It had critics from the start, and, in the last century, even people who cherish the idea of progress, and point to improvements like the eradication of contagious diseases and the education of girls, have been hard-pressed to hold on to it while reckoning with two World Wars, the Holocaust and Hiroshima, genocide and global warming. Replacing “progress” with “innovation” skirts the question of whether a novelty is an improvement: the world may not be getting better and better but our devices are getting newer and newer.

The word “innovate”—to make new—used to have chiefly negative connotations: it signified excessive novelty, without purpose or end. Edmund Burke called the French Revolution a “revolt of innovation”; Federalists declared themselves to be “enemies to innovation.” George Washington, on his deathbed, was said to have uttered these words: “Beware of innovation in politics.” Noah Webster warned in his dictionary, in 1828, “It is often dangerous to innovate on the customs of a nation.”

The redemption of innovation began in 1939, when the economist Joseph Schumpeter, in his landmark study of business cycles, used the word to mean bringing new products to market, a usage that spread slowly, and only in the specialized literatures of economics and business. (In 1942, Schumpeter theorized about “creative destruction”; Christensen, retrofitting, believes that Schumpeter was really describing disruptive innovation.) “Innovation” began to seep beyond specialized literatures in the nineteen-nineties, and gained ubiquity only after 9/11. One measure: between 2011 and 2014, Time, the Times Magazine, The New Yorker, Forbes, and even Better Homes and Gardens published special “innovation” issues—the modern equivalents of what, a century ago, were known as “sketches of men of progress.”

The idea of innovation is the idea of progress stripped of the aspirations of the Enlightenment, scrubbed clean of the horrors of the twentieth century, and relieved of its critics. Disruptive innovation goes further, holding out the hope of salvation against the very damnation it describes: disrupt, and you will be saved.

Disruptive innovation as a theory of change is meant to serve both as a chronicle of the past (this has happened) and as a model for the future (it will keep happening). The strength of a prediction made from a model depends on the quality of the historical evidence and on the reliability of the methods used to gather and interpret it. Historical analysis proceeds from certain conditions regarding proof. None of these conditions have been met.

The Innovator’s Dilemma consists of a set of handpicked case studies, beginning with the disk-drive industry, which was the subject of Christensen’s doctoral thesis, in 1992. “Nowhere in the history of business has there been an industry like disk drives,” Christensen writes, which makes it a very odd choice for an investigation designed to create a model for understanding other industries. The first hard-disk drive, which weighed more than a ton, was invented at I.B.M., in 1955, by a team that included Alan Shugart. Christensen is chiefly concerned with an era, beginning in the late nineteen-seventies, when disk drives decreased in size from fourteen inches to eight, then from eight to 5.25, from 5.25 to 3.5, and from 3.5 to 2.5 and 1.8. He counts a hundred and sixteen new technologies, and classes a hundred and eleven as sustaining innovations and five as disruptive innovations. Each of these five, he says, introduced “smaller disk drives that were slower and had lower capacity than those used in the mainstream market,” and each company that adopted them was an entrant firm that toppled an established firm. In 1973, Alan Shugart founded Shugart Associates, which introduced a 5.25-inch floppy-disk drive in 1976; the company was bought by Xerox the next year. In 1978, Shugart Associates developed an eight-inch hard-disk drive; Christensen, who is uninterested in the floppy-disk-drive industry, classes the company as an entrant firm and credits it with disrupting established firms that manufactured fourteen-inch hard drives. In 1979, Alan Shugart founded Shugart Technology, which changed its name to Seagate Technology after Xerox threatened to sue. In 1980, Seagate Technology introduced the first 5.25-inch hard-disk drive; Christensen, at this point, classes Seagate as an entrant firm, and Shugart Associates as a failed incumbent, even though Shugart Associates was shifting its focus to what was then its very profitable floppy-disk-drive business. In the mid-eighties, Seagate—here considered by Christensen to be an established firm—delayed manufacturing 3.5-inch drives, which were valued by producers of portable computers and laptops, because its biggest customer, I.B.M., didn’t want them; I.B.M. wanted a better and faster version of the 5.25-inch drive for its full-sized desktop computers. Seagate didn’t start shipping 3.5-inch drives until 1988, and by then, Christensen argues, it was too late.

In his original research, Christensen established the cutoff for measuring a company’s success or failure as 1989 and explained that “ ‘successful firms’ were arbitrarily defined as those which achieved more than fifty million dollars in revenues in constant 1987 dollars in any single year between 1977 and 1989—even if they subsequently withdrew from the market.” Much of the theory of disruptive innovation rests on this arbitrary definition of success.

In fact, Seagate Technology was not felled by disruption. Between 1989 and 1990, its sales doubled, reaching $2.4 billion, “more than all of its U.S. competitors combined,” according to an industry report. In 1997, the year Christensen published The Innovator’s Dilemma, Seagate was the largest company in the disk-drive industry, reporting revenues of nine billion dollars. Last year, Seagate shipped its two-billionth disk drive. Most of the entrant firms celebrated by Christensen as triumphant disrupters, on the other hand, no longer exist, their success having been in some cases brief and in others illusory. (The fleeting nature of their success is, of course, perfectly consistent with his model.) Between 1982 and 1984, Micropolis made the disruptive leap from eight-inch to 5.25-inch drives through what Christensen credits as the “Herculean managerial effort” of its C.E.O., Stuart Mahon. (“Mahon remembers the experience as the most exhausting of his life,” Christensen writes.) But, shortly thereafter, Micropolis, unable to compete with companies like Seagate, failed. MiniScribe, founded in 1980, started out selling 5.25-inch drives and saw quick success. “That was MiniScribe’s hour of glory,” the company’s founder later said. “We had our hour of infamy shortly after that.” In 1989, MiniScribe was investigated for fraud and soon collapsed; a report charged that the company’s practices included fabricated financial reports and “shipping bricks and scrap parts disguised as disk drives.”

As striking as the disruption in the disk-drive industry seemed in the nineteen-eighties, more striking, from the vantage of history, are the continuities. Christensen argues that incumbents in the disk-drive industry were regularly destroyed by newcomers. But today, after much consolidation, the divisions that dominate the industry are divisions that led the market in the nineteen-eighties. (In some instances, what shifted was their ownership: I.B.M. sold its hard-disk division to Hitachi, which later sold its division to Western Digital.) In the longer term, victory in the disk-drive industry appears to have gone to the manufacturers that were good at incremental improvements, whether or not they were the first to market the disruptive new format. Companies that were quick to release a new product but not skilled at tinkering have tended to flame out.

Other cases in “The Innovator’s Dilemma” are equally murky. In his account of the mechanical-excavator industry, Christensen argues that established companies that built cable-operated excavators were slow to recognize the importance of the hydraulic excavator, which was developed in the late nineteen-forties. “Almost the entire population of mechanical shovel manufacturers was wiped out by a disruptive technology—hydraulics—that the leaders’ customers and their economic structure had caused them initially to ignore,” he argues. Christensen counts thirty established companies in the nineteen-fifties and says that, by the nineteen-seventies, only four had survived the entrance into the industry of thirteen disruptive newcomers, including Caterpillar, O. & K., Demag, and Hitachi. But, in fact, many of Christensen’s “new entrants” had been making cable-operated shovels for years. O. & K., founded in 1876, had been making them since 1908; Demag had been building excavators since 1925, when it bought a company that built steam shovels; Hitachi, founded in 1910, sold cable-operated shovels before the Second World War. Manufacturers that were genuinely new to excavation equipment tended to sell a lot of hydraulic excavators, if they had a strong distribution network, and then not do so well. And some established companies disrupted by hydraulics didn’t do half as badly as Christensen suggests. Bucyrus is the old-line shovel-maker he writes about most. It got its start in Ohio, in 1880, built most of the excavators that dug the Panama Canal, and became Bucyrus-Erie in 1927, when it bought the Erie Steam Shovel Company. It acquired a hydraulics-equipment firm in 1948, but, Christensen writes, “faced precisely the same problem in marketing its hydraulic backhoe as Seagate had faced with its 3.5-inch drives.”

Unable to persuade its established consumers to buy a hydraulic excavator, Bucyrus introduced a hybrid product, called the Hydrohoe, in 1951—a merely sustaining innovation. Christensen says that Bucyrus “logged record profits until 1966—the point at which the disruptive hydraulics technology had squarely intersected with customers’ needs,” and then began to decline. “This is typical of industries facing a disruptive technology,” he explains. “The leading firms in the established technology remain financially strong until the disruptive technology is, in fact, in the midst of their mainstream market.”

But, actually, between 1962 and 1979 Bucyrus’s sales grew sevenfold and its profits grew twenty-five-fold. Was that so bad? In the nineteen-eighties, Bucyrus suffered. The whole construction-equipment industry did: it was devastated by recession, inflation, the oil crisis, a drop in home building, and the slowing of highway construction. (Caterpillar sustained heavy losses, too.) In the early nineteen-nineties, after a disastrous leveraged buyout handled by Goldman Sachs, Bucyrus entered Chapter 11 protection, but it made some sizable acquisitions when it emerged, as Bucyrus International, and was a leading maker of mining equipment, just as it had been a century earlier. Was it a failure? Caterpillar didn’t think so when, in 2011, it bought the firm for nearly nine billion dollars.

Christensen’s sources are often dubious and his logic questionable. His single citation for his investigation of the “disruptive transition from mechanical to electronic motor controls,” in which he identifies the Allen-Bradley Company as triumphing over four rivals, is a book called The Bradley Legacy (1992), an account published by a foundation established by the company’s founders. This is akin to calling an actor the greatest talent in a generation after interviewing his publicist. “Use theory to help guide data collection,” Christensen advises.

He finds further evidence of his theory in the disruption of the department store by the discount store. “Just as in disk drives and excavators,” he writes, “a few of the leading traditional retailers—notably S. S. Kresge, F. W. Woolworth, and Dayton Hudson—saw the disruptive approach coming and invested early.” In 1962, Kresge (which traces its origins to 1897) opened Kmart; Dayton-Hudson (1902) opened Target; and Woolworth (1879) opened Woolco. Kresge and Dayton-Hudson ran their discount stores as independent organizations; Woolworth ran its discount store in-house. Kmart and Target succeeded; Woolco failed. Christensen presents this story as yet more evidence of an axiom derived from the disk-drive industry: “two models for how to make money cannot peacefully coexist within a single organization.” In the mid-nineteen-nineties, Kmart closed more than two hundred stores, a fact that Christensen does not include in his account of the industry’s history. (Kmart filed for bankruptcy in 2002.) Only in a footnote does he make a vague allusion to Kmart’s troubles—“when this book was being written, Kmart was a crippled company”—and then he dismisses this piece of counter-evidence by fiat: “Kmart’s present competitive struggles are unrelated to Kresge’s strategy in meeting the original disruptive threat of discounting.”

In his discussion of the steel industry, in which he argues that established companies were disrupted by the technology of minimilling (melting down scrap metal to make cheaper, lower-quality sheet metal), Christensen writes that U.S. Steel, founded in 1901, lowered the cost of steel production from “nine labor-hours per ton of steel produced in 1980 to just under three hours per ton in 1991,” which he attributes to the company’s “ferociously attacking the size of its workforce, paring it from more than 93,000 in 1980 to fewer than 23,000 in 1991,” in order to point out that even this accomplishment could not stop the coming disruption. Christensen tends to ignore factors that don’t support his theory. Factors having effects on both production and profitability that Christensen does not mention are that, between 1986 and 1987, twenty-two thousand workers at U.S. Steel did not go to work, as part of a labor action, and that U.S. Steel’s workers are unionized and have been for generations, while minimill manufacturers, with their newer workforces, are generally non-union. Christensen’s logic here seems to be that the industry’s labor arrangements can have played no role in U.S. Steel’s struggles—and are not even worth mentioning—because U.S. Steel’s struggles must be a function of its having failed to build minimills. U.S. Steel’s struggles have been and remain grave, but its failure is by no means a matter of historical record. Today, the largest U.S. producer of steel is—U.S. Steel.

The theory of disruption is meant to be predictive. On March 10, 2000, Christensen launched a $3.8-million Disruptive Growth Fund, which he managed with Neil Eisner, a broker in St. Louis. Christensen drew on his theory to select stocks. Less than a year later, the fund was quietly liquidated: during a stretch of time when the Nasdaq lost fifty per cent of its value, the Disruptive Growth Fund lost sixty-four per cent. In 2007, Christensen told Business Week that “the prediction of the theory would be that Apple won’t succeed with the iPhone,” adding, “History speaks pretty loudly on that.” In its first five years, the iPhone generated a hundred and fifty billion dollars of revenue. In the preface to the 2011 edition of The Innovator’s Dilemma, Christensen reports that, since the book’s publication, in 1997, “the theory of disruption continues to yield predictions that are quite accurate.” This is less because people have used his model to make accurate predictions about things that haven’t happened yet than because disruption has been sold as advice, and because much that happened between 1997 and 2011 looks, in retrospect, disruptive. Disruptive innovation can reliably be seen only after the fact. History speaks loudly, apparently, only when you can make it say what you want it to say. The popular incarnation of the theory tends to disavow history altogether. “Predicting the future based on the past is like betting on a football team simply because it won the Super Bowl a decade ago,” Josh Linkner writes in The Road to Reinvention. His first principle: “Let go of the past.” It has nothing to tell you. But, unless you already believe in disruption, many of the successes that have been labelled disruptive innovation look like something else, and many of the failures that are often seen to have resulted from failing to embrace disruptive innovation look like bad management.

Christensen has compared the theory of disruptive innovation to a theory of nature: the theory of evolution. But among the many differences between disruption and evolution is that the advocates of disruption have an affinity for circular arguments. If an established company doesn’t disrupt, it will fail, and if it fails it must be because it didn’t disrupt. When a startup fails, that’s a success, since epidemic failure is a hallmark of disruptive innovation. (“Stop being afraid of failure and start embracing it,” the organizers of FailCon, an annual conference, implore, suggesting that, in the era of disruption, innovators face unprecedented challenges. For instance: maybe you made the wrong hires?) When an established company succeeds, that’s only because it hasn’t yet failed. And, when any of these things happen, all of them are only further evidence of disruption.

The handpicked case study, which is Christensen’s method, is a notoriously weak foundation on which to build a theory. But, if the handpicked case study is the approved approach, it would seem that efforts at embracing disruptive innovation are often fatal. Morrison-Knudsen, an engineering and construction firm, got its start in 1905 and helped build more than a hundred and fifty dams all over the world, including the Hoover. Beginning in 1988, a new C.E.O., William Agee, looked to new products and new markets, and, after Bill Clinton’s election, in 1992, bet on mass transit, turning to the construction of both commuter and long-distance train cars through two subsidiaries, MK Transit and MK Rail. These disruptive businesses proved to be a disaster. Morrison-Knudsen announced in 1995 that it had lost three hundred and fifty million dollars, by which point the company had essentially collapsed—not because it didn’t disruptively innovate but because it did. Time, Inc., founded in 1922, auto-disrupted, too. In 1994, the company launched Pathfinder, an early new-media venture, an umbrella Web site for its magazines, at a cost estimated to have exceeded a hundred million dollars; the site was abandoned in 1999. Had Pathfinder been successful, it would have been greeted, retrospectively, as evidence of disruptive innovation. Instead, as one of its producers put it, “it’s like it never existed.”

In the late nineteen-nineties and early two-thousands, the financial-services industry innovated by selling products like subprime mortgages, collateralized debt obligations, and mortgage-backed securities, some to a previously untapped customer base. At the time, Ed Clark was the C.E.O. of Canada’s TD Bank, which traces its roots to 1855. Clark, who earned a Ph.D. in economics at Harvard with a dissertation on public investment in Tanzania, forswore Canada’s version of this disruptive innovation, asset-backed commercial paper. The decision made TD Bank one of the strongest banks in the world. Between 2002 and 2012, TD Bank’s assets increased from $278 billion to $806 billion. Since 2005, TD Bank has opened thirteen hundred branches in the United States, bought Commerce Bank for $8.5 billion, in 2008, and adopted the motto “America’s Most Convenient Bank.” With the money it earned by expanding its traditional banking services—almost four billion dollars a year during the height of the financial crisis, according to the Canadian business reporter Howard Green—it set about marketing itself as the bank with the longest hours, the best teller services, and free dog biscuits.

When the financial-services industry disruptively innovated, it led to a global financial crisis. Like the bursting of the dot-com bubble, the meltdown didn’t dim the fervor for disruption; instead, it fuelled it, because these products of disruption contributed to the panic on which the theory of disruption thrives.

Disruptive innovation as an explanation for how change happens is everywhere. Ideas that come from business schools are exceptionally well marketed. Faith in disruption is the best illustration, and the worst case, of a larger historical transformation having to do with secularization, and what happens when the invisible hand replaces the hand of God as explanation and justification. Innovation and disruption are ideas that originated in the arena of business but which have since been applied to arenas whose values and goals are remote from the values and goals of business. People aren’t disk drives. Public schools, colleges and universities, churches, museums, and many hospitals, all of which have been subjected to disruptive innovation, have revenues and expenses and infrastructures, but they aren’t industries in the same way that manufacturers of hard-disk drives or truck engines or dry goods are industries. Journalism isn’t an industry in that sense, either.

Doctors have obligations to their patients, teachers to their students, pastors to their congregations, curators to the public, and journalists to their readers—obligations that lie outside the realm of earnings, and are fundamentally different from the obligations that a business executive has to employees, partners, and investors. Historically, institutions like museums, hospitals, schools, and universities have been supported by patronage, donations made by individuals or funding from church or state. The press has generally supported itself by charging subscribers and selling advertising. (Underwriting by corporations and foundations is a funding source of more recent vintage.) Charging for admission, membership, subscriptions and, for some, earning profits are similarities these institutions have with businesses. Still, that doesn’t make them industries, which turn things into commodities and sell them for gain.

In The Innovative University (2011), written with Henry J. Eyring, who used to work at the Monitor Group, a consulting firm co-founded by Michael Porter, Christensen subjected Harvard, a college founded by seventeenth-century theocrats, to his case-study analysis. “Studying the university’s history,” Christensen and Eyring wrote, “will allow us to move beyond the forlorn language of crisis to hopeful and practical strategies for success.” On the basis of this research, Christensen and Eyring’s recommendations for the disruption of the modern university include a “mix of face-to-face and online learning.” The publication of The Innovative University, in 2011, contributed to a frenzy for Massive Open Online Courses, or MOOCs, at colleges and universities across the country, including a collaboration between Harvard and M.I.T., which was announced in May of 2012. Shortly afterward, the University of Virginia’s panicked board of visitors attempted to fire the president, charging her with jeopardizing the institution’s future by failing to disruptively innovate with sufficient speed; the vice-chair of the board forwarded to the chair a Times column written by David Brooks, “The Campus Tsunami,” in which he cited Christensen.

Christensen and Eyring’s recommendation of a “mix of face-to-face and online learning” was drawn from an investigation that involves a wildly misguided attempt to apply standards of instruction in the twenty-first century to standards of instruction in the seventeenth. One table in the book, titled “Harvard’s Initial DNA, 1636-1707,” looks like this:

[Table: “Harvard’s Initial DNA, 1636-1707”]
In 2014, there were twenty-one thousand students at Harvard. In 1640, there were thirteen. The first year classes were held, Harvard students and their “nonspecialized faculty” (one young schoolmaster, Nathaniel Eaton), enjoying “small, face-to-face classes” (Eaton’s wife, who fed the students, was accused of putting “goat’s dung in their hasty pudding”) with “high faculty empathy for learners” (Eaton conducted thrashings with a stick of walnut said to have been “big enough to have killed a horse”), could have paddled together in a single canoe. That doesn’t mean good arguments can’t be made for online education. But there’s nothing factually persuasive in this account of its historical urgency and even inevitability, which relies on a method well outside anything resembling plausible historical analysis.

Christensen and Eyring also urge universities to establish “heavyweight innovation teams”: Christensen thinks that R. & D. departments housed within a business and accountable to its executives are structurally unable to innovate disruptively—they are preoccupied with pleasing existing customers through incremental improvement. Christensen argues, for instance, that if Digital Equipment Corporation, which was doing very well making minicomputers in the nineteen-sixties and seventies, had founded, in the eighties, a separate company at another location to develop the personal computer, it might have triumphed. The logic of disruptive innovation is the logic of the startup: establish a team of innovators, set a whiteboard under a blue sky, and never ask them to make a profit, because there needs to be a wall of separation between the people whose job is to come up with the best, smartest, and most creative and important ideas and the people whose job is to make money by selling stuff. Interestingly, a similar principle has existed, for more than a century, in the press. The “heavyweight innovation team”? That’s what journalists used to call the “newsroom.”

It’s readily apparent that, in a democracy, the important business interests of institutions like the press might at times conflict with what became known as the “public interest.” That’s why, a very long time ago, newspapers like the Times and magazines like this one established a wall of separation between the editorial side of affairs and the business side. (The metaphor is to the Jeffersonian wall between church and state.) “The wall dividing the newsroom and business side has served The Times well for decades,” according to the Times’ Innovation Report, “allowing one side to focus on readers and the other to focus on advertisers,” as if this had been, all along, simply a matter of office efficiency. But the notion of a wall should be abandoned, according to the report, because it has “hidden costs” that thwart innovation. Earlier this year, the Times tried to recruit, as its new head of audience development, Michael Wertheim, the former head of promotion at the disruptive media outfit Upworthy. Wertheim turned the Times job down, citing its wall as too big an obstacle to disruptive innovation. The recommendation of the Innovation Report is to understand that both sides, editorial and business, share, as their top priority, “Reader Experience,” which can be measured, following Upworthy, in “Attention Minutes.” Vox Media, a digital-media disrupter that is mentioned ten times in the Times report and is included, along with BuzzFeed, in a list of the Times’ strongest competitors (few of which are profitable), called the report “brilliant,” “shockingly good,” and an “insanely clear” explanation of disruption, but expressed the view that there’s no way the Times will implement its recommendations, because “what the report doesn’t mention is the sobering conclusion of Christensen’s research: companies faced with disruptive threats almost never manage to handle them gracefully.”

Disruptive innovation is a theory about why businesses fail. It’s not more than that. It doesn’t explain change. It’s not a law of nature. It’s an artifact of history, an idea, forged in time; it’s the manufacture of a moment of upsetting and edgy uncertainty. Transfixed by change, it’s blind to continuity. It makes a very poor prophet.

The upstarts who work at startups don’t often stay at any one place for very long. (Three out of four startups fail. More than nine out of ten never earn a return.) They work a year here, a few months there—zany hours everywhere. They wear jeans and sneakers and ride scooters and share offices and sprawl on couches like Great Danes. Their coffee machines look like dollhouse-size factories.

They are told that they should be reckless and ruthless. Their investors, if they’re like Josh Linkner, tell them that the world is a terrifying place, moving at a devastating pace. “Today I run a venture capital firm and back the next generation of innovators who are, as I was throughout my earlier career, dead-focused on eating your lunch,” Linkner writes. His job appears to be to convince a generation of people who want to do good and do well to learn, instead, remorselessness. Forget rules, obligations, your conscience, loyalty, a sense of the commonweal. If you start a business and it succeeds, Linkner advises, sell it and take the cash. Don’t look back. Never pause. Disrupt or be disrupted.

But they do pause and they do look back, and they wonder. Meanwhile, they tweet, they post, they tumble in and out of love, they ponder. They send one another sly messages, touching the screens of sleek, soundless machines with a worshipful tenderness. They swap novels: David Foster Wallace, Chimamanda Ngozi Adichie, Zadie Smith. Steppenwolf is still available in print, five dollars cheaper as an e-book. He’s a wolf, he’s a man. The rest is unreadable. So, as ever, is the future. Ω

[Jill Lepore is the David Woods Kemper '41 Professor of American History at Harvard University as well as the chair of the History and Literature Program. She also is a staff writer at The New Yorker. Her latest books are The Story of America: Essays on Origins (2012) and Book of Ages: The Life and Opinions of Jane Franklin (2013). Lepore earned her B.A. in English from Tufts University, an M.A. in American Culture from the University of Michigan, and a Ph.D. in American Studies from Yale University.]

Copyright © 2014 Condé Nast Digital



This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves
