In 2007, Intel introduced non-silicon materials—known as high-k/metal gates (the “metal gate” refers to the transistor gate electrode and the “high-k” material to the transistor gate dielectric)—into microchips for the first time. This very technical fix was hugely important. Although non-silicon materials were already used in other parts of the microprocessor, their introduction into the transistor helped Moore’s law—the expectation that the power of microchips would double roughly every two years—continue on its path of delivering exponential growth in computing power. At that time there was real concern that Moore’s law was hitting a wall with traditional silicon transistors.

  “By opening the way to non-silicon materials it gave Moore’s law another shot in the arm at a time when many people were thinking it was coming to an end,” said Sadasivan Shankar, who worked on Intel’s material design team at the time and now teaches materials and computational sciences at the Harvard School of Engineering and Applied Sciences. Commenting on the breakthrough, the New York Times Silicon Valley reporter John Markoff wrote on January 27, 2007: “Intel, the world’s largest chip maker, has overhauled the basic building block of the information age, paving the way for a new generation of faster and more energy-efficient processors. Company researchers said the advance represented the most significant change in the materials used to manufacture silicon chips since Intel pioneered the modern integrated-circuit transistor more than four decades ago.”

  For all of the above reasons, 2007 was also “the beginning of the clean power revolution,” said Andy Karsner, the U.S. assistant secretary of energy for efficiency and renewable energy from 2006 to 2008. “If anyone in 2005 or 2006 told you their predictive models captured where clean tech and renewable energy went in 2007 they are lying. Because what happened in 2007 was the beginning of an exponential rise in solar energy, wind, biofuels, LED lighting, energy efficient buildings, and the electrification of vehicles. It was a hockey stick moment.”

  Last but certainly not least, in 2007 the cost of DNA sequencing began to fall dramatically as the biotech industry shifted to new sequencing techniques and platforms, leveraging all the computing and storage power that was just exploding. This change in instruments was a turning point for genetic engineering and led to the “rapid evolution of DNA sequencing technologies that has occurred in recent years,” according to Genome.gov. In 2001, it cost $100 million to sequence just one person’s genome. On September 30, 2015, Popular Science reported: “Yesterday, personal genetics company Veritas Genetics announced that it had reached a milestone: participants in its limited, but steadily expanding Personal Genetics Program can get their entire genome sequenced for just $1,000.”

  As the graphs on the next two pages display, 2007 was clearly a turning point for many technologies.

  Technology has always moved up in step changes. All the elements of computing power—processing chips, software, storage chips, networking, and sensors—tend to move forward roughly as a group. As their improving capacities reach a certain point, they tend to meld together into a platform, and that platform scales a new set of capabilities, which becomes the new normal. As we went from mainframes to desktops to laptops to smartphones with mobile applications, each generation of technology got easier and more natural for people to use than the one before. When the first mainframe computers came out, you needed to have a computer science degree to use them. Today’s smartphone can be accessed by young children and the illiterate.

  As step changes in technology go, though, the platform birthed around the year 2007 surely constituted one of the greatest leaps forward in history. It suffused every aspect of life, commerce, and government with a new set of capabilities to connect, collaborate, and create. Suddenly there were so many more things that could be digitized, so much more storage to hold all that digital data, so many faster computers and so much more innovative software that could process that data for insights, and so many more organizations and people (from the biggest multinationals to the smallest Indian farmers) who could access those insights, or contribute to them, anywhere in the world through their handheld computers—their smartphones.

  This is the central technology engine driving the Machine today. It snuck up on us very fast. In 2004, I started writing a book about what I thought then was the biggest force driving the Machine—namely, how the world was getting wired to such a degree that more people in more places had an equal opportunity to compete, connect, and collaborate with more other people for less money with greater ease than ever before. I called that book The World Is Flat: A Brief History of the Twenty-First Century. The first edition came out in 2005. I wrote an updated 2.0 edition in 2006 and a 3.0 edition in 2007. And then I stopped, thinking I had built a pretty solid framework that could last me as a columnist for a while.

  I was very wrong! Indeed, 2007 was a really bad year to stop thinking.

  I first realized just how bad the minute I sat down in 2010 to write my most recent book, That Used to Be Us: How America Fell Behind in the World It Invented and How We Can Come Back, which I coauthored with Michael Mandelbaum. As I recalled in that book, the first thing I did when I started working on it was to get the first edition of The World Is Flat off my bookshelf—just to remind myself what I was thinking when I started back in 2004. I cracked it open to the index, ran my finger down the page, and immediately discovered that Facebook wasn’t in it! That’s right—when I was running around in 2004 declaring that the world was flat, Facebook didn’t even exist yet, Twitter was still a sound, the cloud was still in the sky, 4G was a parking space, “applications” were what you sent to college, LinkedIn was barely known and most people thought it was a prison, Big Data was a good name for a rap star, and Skype, for most people, was a typographical error. All of those technologies blossomed after I wrote The World Is Flat—most of them around 2007.

  So a few years later, I began updating in earnest my view of how the Machine worked. A crucial impetus was a book I read in 2014 by two MIT business school professors—Erik Brynjolfsson and Andrew McAfee—entitled The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies. The first machine age, they argued, was the Industrial Revolution, which accompanied the invention of the steam engine in the 1700s. This period was “all about power systems to augment human muscle,” explained McAfee in an interview, “and each successive invention in that age delivered more and more power. But they all required humans to make decisions about them.” Therefore, the inventions of that era actually made human control and labor “more valuable and important.”

  Labor and machines were, broadly speaking, complementary, he added. In the second machine age, though, noted Brynjolfsson, “we are beginning to automate a lot more cognitive tasks, a lot more of the control systems that determine what to use that power for. In many cases today artificially intelligent machines can make better decisions than humans.” So humans and software-driven machines may increasingly be substitutes, not complements.

  The key, but not the only, driving force making this possible, they argued, was the exponential growth in computing power as represented by Moore’s law: the theory first postulated by Intel cofounder Gordon Moore in 1965 that the speed and power of microchips—that is, computational processing power—would double roughly every year, which he later updated to every two years, for only slightly more money with each new generation. Moore’s law has held up close to that pattern for fifty years.
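
  Put as a simple formula (my own back-of-the-envelope restatement, not Moore’s wording, with P0 standing for the starting processing power), a doubling every two years means that processing power after t years is roughly

  \[ P(t) \approx P_0 \cdot 2^{t/2} \]

  so fifty years of that pattern is twenty-five doublings, a factor of \(2^{25}\), or roughly thirty-three million times the starting power.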

  To illustrate this kind of exponential growth, Brynjolfsson and McAfee recalled the famous legend of the king who was so impressed with the man who invented the game of chess that he offered him any reward. The inventor of chess said that all he wanted was enough rice to feed his family. The king said, “Of course, it shall be done. How much would you like?” The man asked the king to simply place a single grain of rice on the first square of a chessboard, then two on the next, then four on the next, with each subsequent square receiving twice as many grains as the previous one. The king agreed, noted Brynjolfsson and McAfee—without realizing that sixty-three instances of doubling yields a fantastically big number: something like eighteen quintillion grains of rice. That is the power of exponential change. When you keep doubling something for fifty years you start to get to some very big numbers, and eventually you start to see some very funky things that you have never seen before.
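
  For readers who want to check the legend’s arithmetic, here is a minimal sketch in Python (my own illustration, not anything from Brynjolfsson and McAfee’s book) that tallies the grains square by square:

```python
# Chessboard rice legend: one grain on the first square, doubling on each
# of the remaining sixty-three squares.
grains_on_square = 1
total_grains = 0
for square in range(1, 65):
    total_grains += grains_on_square
    grains_on_square *= 2

print(f"Grains on the 64th square alone: {2**63:,}")   # about 9.2 quintillion
print(f"Grains on the whole board: {total_grains:,}")  # about 18.4 quintillion
```

  The whole board holds about 18.4 quintillion grains in all, which is the “fantastically big number” the authors describe.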

  The authors argued that Moore’s law had just entered the “second half of the chessboard,” where the doubling has gotten so big and fast we’re starting to see stuff that is fundamentally different in power and capability from anything we have seen before—self-driving cars, computers that can think on their own and beat any human in chess or Jeopardy! or even Go, a 2,500-year-old board game considered vastly more complicated than chess. That is what happens “when the rate of change and the acceleration of the rate of change both increase at the same time,” said McAfee, and “we haven’t seen anything yet!”

  So, at one level, my view of the Machine today is built on the shoulders of Brynjolfsson and McAfee’s fundamental insight into how the steady acceleration in Moore’s law has affected technology—but I think the Machine today is even more complicated. That’s because it’s not just pure technological change that has hit the second half of the chessboard. It is also two other giant forces: accelerations in the Market and in Mother Nature.

  “The Market” is my shorthand for the acceleration of globalization. That is, global flows of commerce, finance, credit, social networks, and connectivity generally are weaving markets, media, central banks, companies, schools, communities, and individuals more tightly together than ever. The resulting flows of information and knowledge are making the world not only interconnected and hyperconnected but interdependent—everyone everywhere is now more vulnerable to the actions of anyone anywhere.

  And “Mother Nature” is my shorthand for climate change, population growth, and biodiversity loss—all of which have also been accelerating, as they, too, enter the second halves of their chessboards.

  Here again, I am standing on the shoulders of others. I derive the term “the age of accelerations” from a series of graphs first assembled by a team of scientists led by Will Steffen, a climate change expert and researcher at the Australian National University, Canberra. The graphs, which originally appeared in a 2004 book entitled Global Change and the Earth System: A Planet Under Pressure, looked at how technological, social, and environmental impacts were accelerating and feeding off one another from 1750 to 2000, and particularly since 1950. The term “Great Acceleration” was coined in 2005 by these same scientists to capture the holistic, comprehensive, and interlinked nature of all these changes simultaneously sweeping across the globe and reshaping the human and biophysical landscapes of the Earth system. An updated version of those graphs was published in the Anthropocene Review on March 2, 2015; they appear later in this book.

  “When we started the project it was ten years since the first acceleration graphs had been published, which ran from 1750 to 2000,” explained Owen Gaffney, director of strategy for the Stockholm Resilience Centre and part of the Great Acceleration team. “We wanted to update the graphs to 2010 to see if the trajectory had altered any”—and indeed it had, he said: it had accelerated.

  It is the core argument of this book that these simultaneous accelerations in the Market, Mother Nature, and Moore’s law together constitute the “age of accelerations,” in which we now find ourselves. These are the central gears driving the Machine today. These three accelerations are impacting one another—more Moore’s law is driving more globalization and more globalization is driving more climate change, and more Moore’s law is also driving more potential solutions to climate change and a host of other challenges—and at the same time transforming almost every aspect of modern life.

  Craig Mundie, a supercomputer designer and former chief of strategy and research at Microsoft, defines this moment in simple physics terms: “The mathematical definition of velocity is the first derivative, and acceleration is the second derivative. So velocity grows or shrinks as a function of acceleration. In the world we are in now, acceleration seems to be increasing. [That means] you don’t just move to a higher speed of change. The rate of change also gets faster … And when the rate of change eventually exceeds the ability to adapt you get ‘dislocation.’ ‘Disruption’ is what happens when someone does something clever that makes you or your company look obsolete. ‘Dislocation’ is when the whole environment is being altered so quickly that everyone starts to feel they can’t keep up.”
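
  In the standard calculus notation Mundie is invoking (the definitions below are textbook ones, not anything specific to his remarks), if x(t) is position at time t, then

  \[ v(t) = \frac{dx}{dt}, \qquad a(t) = \frac{dv}{dt} = \frac{d^{2}x}{dt^{2}} \]

  His point is that, for the pace of change, not only is the velocity positive; the acceleration is too, so the curve keeps getting steeper.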

  That is what is happening now. “The world is not just rapidly changing,” adds Dov Seidman, “it is being dramatically reshaped—it is starting to operate differently” in many realms all at once. “And this reshaping is happening faster than we have yet been able to reshape ourselves, our leadership, our institutions, our societies, and our ethical choices.”

  Indeed, there is a mismatch between the change in the pace of change and our ability to develop the learning systems, training systems, management systems, social safety nets, and government regulations that would enable citizens to get the most out of these accelerations and cushion their worst impacts. This mismatch, as we will see, is at the center of much of the turmoil roiling politics and society in both developed and developing countries today. It now constitutes probably the most important governance challenge across the globe.

  Astro Teller’s Graph

  The most illuminating illustration of this phenomenon was sketched out for me by Eric “Astro” Teller, the CEO of Google’s X research and development lab, which produced Google’s self-driving car, among other innovations. Appropriately enough, Teller’s formal title at X is “Captain of Moonshots.” Imagine someone whose whole mandate is to come to the office every day and, with his colleagues, produce moonshots—turning what others would consider science fiction into products and services that could transform how we live and work. His paternal grandfather was the physicist Edward Teller, designer of the hydrogen bomb, and his maternal grandfather was Gérard Debreu, a Nobel Prize–winning economist. Good genes, as they say. We were in a conference room at X headquarters, which is a converted shopping mall. Teller arrived at our interview on Rollerblades, which is how he keeps up with his daily crush of meetings.

  He wasted no time before launching into an explanation of how the accelerations in Moore’s law and in the flow of ideas are together causing an increase in the pace of change that is challenging the ability of human beings to adapt.

  Teller began by taking out a small yellow 3M notepad and saying: “Imagine two curves on a graph.” He then drew a graph with the Y axis labeled “rate of change” and the X axis labeled “time.” Then he drew the first curve—a swooping exponential line that started very flat and escalated slowly before soaring to the upper outer corner of the graph, like a hockey stick: “This line represents scientific progress,” he said. At first it moves up very gradually, then it starts to slope higher as innovations build on innovations that have come before, and then it starts to soar straight to the sky.

  What would be on that line? Think of the introduction of the printing press, the telegraph, the manual typewriter, the Telex, the mainframe computer, the first word processors, the PC, the Internet, the laptop, the mobile phone, search, mobile apps, big data, virtual reality, human-genome sequencing, artificial intelligence, and the self-driving car.

  A thousand years ago, Teller explained, that curve representing scientific and technological progress rose so gradually that it could take one hundred years for the world to look and feel dramatically different. For instance, it took centuries for the longbow to go from development into military use in Europe in the late thirteenth century. If you lived in the twelfth century, your basic life was not all that different than if you lived in the eleventh century. And whatever changes were being introduced in major towns in Europe or Asia took forever to reach the countryside, let alone the far reaches of Africa or South America. Nothing scaled globally all at once.

  But by 1900, Teller noted, this process of technological and scientific change “started to speed up” and the curve started to accelerate upward. “That’s because technology stands on its own shoulders—each generation of invention stands on the inventions that have come before,” said Teller. “So by 1900, it was taking twenty to thirty years for technology to take one step big enough that the world became uncomfortably different. Think of the introduction of the car and the airplane.”

  Then the slope of the curve started to go almost straight up and off the graph with the convergence of mobile devices, broadband connectivity, and cloud computing (which we will discuss shortly). These developments diffused the tools of innovation to many more people on the planet, enabling them to drive change farther, faster, and more cheaply.

  “Now, in 2016,” he added, “that time window—having continued to shrink as each technology stood on the shoulders of past technologies—has become so short that it’s on the order of five to seven years from the time something is introduced to being ubiquitous and the world being uncomfortably changed.”

  What does this process feel like? In my first book about globalization, The Lexus and the Olive Tree, I included a story Lawrence Summers told me that captured the essence of where we’d come from and where we were heading. It was 1988, Summers recalled, and he was working on the Michael Dukakis presidential campaign, which sent him to Chicago to give a speech. A car picked him up at the airport to take him to the event, and when he slipped into the car he discovered a telephone fixed into the backseat. “I thought it was sufficiently neat to have a cell phone in my car in 1988 that I used it to call my wife to tell her that I was in a car with a phone,” Summers told me. He also used it to call everyone else he could think of, and they were just as excited.

  Just nine years later Summers was deputy treasury secretary. On a trip to the Ivory Coast in West Africa, he had to inaugurate an American-funded health care project in a village upriver from the main city, Abidjan, that was opening its first potable water well. What he remembered most, though, he told me, was that on his way back from the village, as he stepped into a dugout canoe to return downriver, an Ivory Coast official handed him a cell phone and said: “Washington has a question for you.” In nine years Summers went from bragging that he was in a car with a mobile phone in Chicago to nonchalantly using one in the backseat of his dugout canoe in Abidjan. The pace of change had not only quickened but was now happening at a global scale.