Rise of the Robots: Technology and the Threat of a Jobless Future

Other obvious applications for the Watson system are in areas like customer service and technical support. In 2013, IBM announced that it would work with Fluid, Inc., a major provider of online shopping services and consulting. The project aims to let online shopping sites replicate the kind of personalized, natural language assistance you would get from a knowledgeable sales clerk in a retail store. If you’re going camping and need a tent, you’d be able to say something like “I am taking my family camping in upstate NY in October and I need a tent. What should I consider?” You’d then get specific tent recommendations, as well as pointers to other items that you might not have considered.23
As I suggested in Chapter 1, it is only a matter of time before capability of that type becomes available via smartphones, and shoppers are able to access conversational, natural language assistance while in brick-and-mortar stores.

MD Buyline, Inc., a company that specializes in providing information and research about the latest health care technology to hospitals, likewise plans to use Watson to answer the far more technical questions that come up when hospitals need to purchase new equipment. The system would draw on product specifications, prices, and clinical studies and research to make specific and instant recommendations to doctors and procurement managers.24
Watson is also finding a role in the financial industry, where the system may be poised to provide personalized financial advice by delving into a wealth of information about specific customers as well as general market and economic conditions. The deployment of Watson in customer service call centers is perhaps the area with the most disruptive near-term potential, and it is likely no coincidence that within a year of Watson’s triumph on Jeopardy!, IBM was already working with Citigroup to explore applications for the system in the company’s massive retail banking operation.25

IBM’s new technology is still in its infancy. Watson, along with the competing systems that are certain to eventually appear, has the potential to revolutionize the way questions are asked and answered, as well as the way information analysis is approached, both within organizations and in engagements with customers. There is no escaping the reality, however, that a great deal of the analysis performed by systems of this type would otherwise have been done by human knowledge workers.

Building Blocks in the Cloud

In November 2013, IBM announced that its Watson system would move from the specialized computers that hosted it for the Jeopardy! matches to the cloud. In other words, Watson would now reside in massive collections of servers connected to the Internet. Developers would be able to link directly to the system and incorporate IBM’s revolutionary cognitive computing technology into custom software applications and mobile apps. This latest version of Watson was also more than twice as fast as its Jeopardy!-playing predecessor. IBM envisions the rapid emergence of an entire ecosystem of smart, natural language applications, all carrying the “Powered by Watson” label.26

The migration of leading-edge artificial intelligence capability into the cloud is almost certain to be a powerful driver of white-collar automation. Cloud computing has become the focus of intense competition among major information technology companies, including Amazon, Google, and Microsoft. Google, for example, offers developers a cloud-based machine learning application as well as a large-scale compute engine that lets developers solve huge, computationally intensive problems by running programs on massive supercomputer-like networks of servers. Amazon is the industry leader in providing cloud computing services. Cycle Computing, a small company that specializes in large-scale computing, was able to solve a complex problem that would have taken over 260 years on a single computer in just 18 hours by utilizing tens of thousands of the computers that power Amazon’s cloud service. The company estimates that prior to the advent of cloud computing, it would have cost as much as $68 million to build a supercomputer capable of taking on the problem. In contrast, it’s possible to rent 10,000 servers in the Amazon cloud for about $90 per hour.27

Just as the field of robotics is poised for explosive growth as the hardware and software components used in designing the machines become cheaper and more capable, a similar phenomenon is unfolding for the technology that powers the automation of knowledge work. When technologies like Watson, deep learning neural networks, or narrative-writing engines are hosted in the cloud, they effectively become building blocks that can be leveraged in countless new ways. Just as hackers quickly figured out that Microsoft’s Kinect could be used as an inexpensive way to give robots three-dimensional machine vision, developers will likewise find unforeseen—and perhaps revolutionary—applications for cloud-based software building blocks. Each of these building blocks is in effect a “black box”—meaning that the component can be used by programmers who have no detailed understanding of how it works. The ultimate result is sure to be that groundbreaking AI technologies created by teams of specialists will rapidly become ubiquitous and accessible even to amateur coders.
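To make the idea of a “black box” building block concrete, here is a minimal sketch, in Python, of how a developer with no knowledge of a cloud service’s internals might call it from an application. The endpoint URL, request format, and response field are hypothetical placeholders, not any real vendor’s API.

```python
import json
import urllib.request

# Hypothetical cloud-hosted NLP building block: the caller sends text
# and gets an answer back, with no visibility into the model inside.
API_URL = "https://api.example.com/v1/answer"  # placeholder endpoint

def ask(question: str) -> str:
    """Send a natural language question to the cloud service and
    return its answer. The service itself is a black box."""
    payload = json.dumps({"question": question}).encode("utf-8")
    request = urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["answer"]  # assumed response field

if __name__ == "__main__":
    print(ask("I am taking my family camping in upstate NY in October "
              "and I need a tent. What should I consider?"))
```

A few lines like these are the entire barrier to entry, which is precisely why such components can spread far beyond the specialists who built them.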

While innovations in robotics produce tangible machines that are often easily associated with particular jobs (a hamburger-making robot or a precision assembly robot, for example), progress in software automation will likely be far less visible to the public; it will often take place deep within corporate walls, and it will have more holistic impacts on organizations and the people they employ. White-collar automation will very often be the story of information technology consultants descending on large organizations and building completely custom systems that have the potential to revolutionize the way the business operates, while at the same time eliminating the need for potentially hundreds or even thousands of skilled workers. Indeed, one of IBM’s stated motivations for creating the Watson technology was to offer its consulting division (which, together with software sales, now accounts for the vast majority of the company’s revenues) a competitive advantage. At the same time, entrepreneurs are already finding ways to use the same cloud-based building blocks to create affordable automation products geared toward small or medium-sized businesses.

Cloud computing has already had a significant impact on information technology jobs. During the 1990s tech boom, huge numbers of well-paying jobs were created as businesses and organizations of all sizes needed IT professionals to install and administer personal computers, networks, and software. By the first decade of the twenty-first century, however, the trend began to shift as companies increasingly outsourced many of their information technology functions to huge, centralized computing hubs.

The massive facilities that host cloud computing services benefit from enormous economies of scale, and the administrative functions that once kept armies of skilled IT workers busy are now highly automated. Facebook, for example, employs a smart software application called “Cyborg” that continuously monitors tens of thousands of servers, detects problems, and in many cases can perform repairs completely autonomously. A Facebook executive noted in November 2013 that the Cyborg system routinely solves thousands of problems that would otherwise have to be addressed manually, and that the technology allows a single technician to manage as many as 20,000 computers.28
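Facebook has not published Cyborg’s internals, but the general pattern it illustrates, scanning a fleet for known problems and applying scripted fixes so that humans see only the exceptions, can be sketched briefly. The health checks and remediation playbook below are hypothetical stand-ins, not Facebook’s actual tooling.

```python
# Hypothetical remediation playbook mapping detected problems to
# scripted fixes; a real system would have far more entries.
REMEDIATIONS = {
    "disk_full": "clean_temp_files",
    "service_down": "restart_service",
}

def check_health(server):
    """Stand-in health check; a real monitor would query live metrics."""
    return "service_down" if server.endswith("7") else None

def apply_fix(server, action):
    """Stand-in for running a scripted repair job on the server."""
    print(f"{server}: applying {action}")
    return True

def sweep(servers):
    """One pass over the fleet: fix known problems automatically and
    escalate to a human only when no playbook entry applies."""
    for server in servers:
        problem = check_health(server)
        if problem is None:
            continue
        action = REMEDIATIONS.get(problem)
        if not (action and apply_fix(server, action)):
            print(f"escalating {server}: {problem}")  # human ticket

if __name__ == "__main__":
    fleet = [f"server-{n}" for n in range(20)]
    sweep(fleet)  # in production this loop would run continuously
```

The economics follow directly: when the routine cases never reach a person, one technician can plausibly oversee thousands of machines.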

Cloud computing data centers are often built in relatively rural areas where land and, especially, electric power are plentiful and cheap. States and local governments compete intensively for the facilities, offering companies like Google, Facebook, and Apple generous tax breaks and other financial incentives. Their primary objective, of course, is to create lots of jobs for local residents, but such hopes are rarely realized. In 2011, the Washington Post’s Michael Rosenwald reported that a colossal, billion-dollar data center built by Apple, Inc., in the town of Maiden, North Carolina, had created only fifty full-time positions. Disappointed residents couldn’t “comprehend how expensive facilities stretching across hundreds of acres can create so few jobs.”29 The explanation, of course, is that algorithms like Cyborg are doing the heavy lifting.

The impact on employment extends beyond the data centers themselves to the companies that leverage cloud computing services. In 2012, Roman Stanek, the CEO of GoodData, a San Francisco company that uses Amazon’s cloud services to perform data analysis for about 6,000 clients, noted that “[b]efore, each [client] company needed at least five people to do this work. That is 30,000 people. I do it with 180. I don’t know what all those other people will do now, but this isn’t work they can do anymore. It’s a winner-takes-all consolidation.”30

The evaporation of thousands of skilled information technology jobs is likely a precursor for a much more wide-ranging impact on knowledge-based employment. As Netscape co-founder and venture capitalist Marc Andreessen famously said, “Software is eating the world.” More often than not, that software will be hosted in the cloud. From that vantage point it will eventually be poised to invade virtually every workplace and swallow up nearly any white-collar job that involves sitting in front of a computer manipulating information.

Algorithms on the Frontier

If there is one myth regarding computer technology that ought to be swept into the dustbin, it is the pervasive belief that computers can do only what they are specifically programmed to do. As we’ve seen, machine learning algorithms routinely churn through data, revealing statistical relationships and, in essence, writing their own programs on the basis of what they discover. In some cases, however, computers are pushing even further and beginning to encroach into areas that nearly everyone assumes are the exclusive province of the human mind: machines are starting to demonstrate curiosity and creativity.

In 2009, Hod Lipson, the director of the Creative Machines Lab at Cornell University, and PhD student Michael Schmidt built a system that has proved capable of independently discovering fundamental natural laws. Lipson and Schmidt started by setting up a double pendulum—a contraption that consists of one pendulum attached to, and dangling below, another. When both pendulums are swinging, the motion is extremely complex and seemingly chaotic. Next they used sensors and cameras to capture the pendulum’s motion and produce a stream of data. Finally, they gave their software the ability to control the starting position of the pendulum; in other words, they created an artificial scientist with the ability to conduct its own experiments.

They turned their software loose to repeatedly release the pendulum and then sift through the resulting motion data and try to figure out the mathematical equations that describe the pendulum’s behavior. The algorithm had complete control over the experiment; for each repetition, it decided how to position the pendulum for release, and it did not do this randomly: it performed an analysis and then chose the specific starting point that would likely provide the most insight into the laws underlying the pendulum’s motion. Lipson notes that the system “is not a passive algorithm that sits back, watching. It asks questions. That’s curiosity.”31
The program, which they later named “Eureqa,” took only a few hours to come up with a number of physical laws describing the movement of the pendulum—including Newton’s Second Law—and it was able to do this without being given any prior information or programming about physics or the laws of motion.

Eureqa uses genetic programming, a technique inspired by biological evolution. The algorithm begins by randomly combining various mathematical building blocks into equations and then testing to see how well the equations fit the data.* Equations that fail the test are discarded, while those that show promise are retained and recombined in new ways so that the system ultimately converges on an accurate mathematical model.32
The process of finding an equation that describes the behavior of a natural system is by no means a trivial exercise. As Lipson says, “[P]reviously, coming up with a predictive model could take a [scientist’s] whole career.”33 Schmidt adds that “[p]hysicists like Newton and Kepler could have used a computer running this algorithm to figure out the laws that explain a falling apple or the motion of the planets with just a few hours of computation.”34
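Eureqa itself is far more sophisticated, but the loop just described, randomly generated equations, a fitness test against observed data, and recombination of the survivors, can be sketched in Python. Everything below (the building blocks, the hidden target law, the population sizes, and the depth limit) is an illustrative assumption, not Eureqa’s actual design.

```python
import random

random.seed(0)

# Observations of an unknown system (here secretly y = x**2 + x).
DATA = [(x / 10, (x / 10) ** 2 + x / 10) for x in range(-20, 21)]

# Mathematical building blocks the search is allowed to combine.
OPS = {"+": lambda a, b: a + b,
       "-": lambda a, b: a - b,
       "*": lambda a, b: a * b}

def random_tree(depth=3):
    """Randomly combine building blocks into a candidate equation."""
    if depth == 0 or random.random() < 0.3:
        return "x" if random.random() < 0.7 else random.uniform(-2, 2)
    return (random.choice(list(OPS)),
            random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Compute the value of a candidate equation at x."""
    if tree == "x":
        return x
    if not isinstance(tree, tuple):
        return tree  # a numeric constant
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def fitness(tree):
    """Fitness test: mean squared error against the observed data."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in DATA) / len(DATA)

def depth(tree):
    if not isinstance(tree, tuple):
        return 0
    return 1 + max(depth(tree[1]), depth(tree[2]))

def graft(a, b):
    """Recombine two promising equations: splice a piece of b into a."""
    if not isinstance(a, tuple) or random.random() < 0.3:
        return b if not isinstance(b, tuple) else random.choice(b[1:])
    op, left, right = a
    if random.random() < 0.5:
        return (op, graft(left, b), right)
    return (op, left, graft(right, b))

population = [random_tree() for _ in range(200)]
for _ in range(60):
    population.sort(key=fitness)
    survivors = population[:50]  # promising equations are retained
    children = []
    for _ in range(150):
        child = graft(random.choice(survivors), random.choice(survivors))
        # Keep equations from bloating; oversized ones are replaced
        # by fresh random ideas, which also adds variety.
        children.append(child if depth(child) <= 6 else random_tree())
    population = survivors + children

best = min(population, key=fitness)
print("best equation:", best)
print("error:", fitness(best))
```

With luck the search converges on something algebraically equivalent to the hidden law; either way, the loop shows how equations are bred against the data rather than derived from first principles.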

When Schmidt and Lipson published a paper describing their algorithm, they were deluged with requests for access to the software from other scientists, and they decided to make Eureqa available over the Internet in late 2009. The program has since produced a number of useful results in a range of scientific fields, including a simplified equation describing the biochemistry of bacteria that scientists are still struggling to understand.35
In 2011, Schmidt founded Nutonian, Inc., a Boston-area start-up company focused on commercializing Eureqa as a big data analysis tool for both business and academic applications. One result is that Eureqa—like IBM’s Watson—is now hosted in the cloud and is available as an application building block to other software developers.

Most of us quite naturally tend to associate the concept of creativity exclusively with the human brain, but it’s worth remembering that the brain itself, by far the most sophisticated invention in existence, is the product of evolution. Given this, perhaps it should come as no surprise that attempts to build creative machines very often incorporate genetic programming techniques. Genetic programming essentially allows computer algorithms to design themselves through a process of Darwinian natural selection. Computer code is initially generated randomly and then repeatedly shuffled using techniques that emulate sexual reproduction. Every so often, a random mutation is thrown in to help drive the process in entirely new directions. As new algorithms evolve, they are subjected to a fitness test that leads either to their survival or, far more often, to their demise. Computer scientist and consulting Stanford professor John Koza is one of the leading researchers in the field and has done extensive work using genetic algorithms as “automated invention machines.”*
Koza has isolated at least seventy-six cases where genetic algorithms have produced designs that are competitive with the work of human engineers and scientists in a variety of fields, including electric circuit design, mechanical systems, optics, software repair, and civil engineering. In most of these cases, the algorithms have replicated existing designs, but there are at least two instances where genetic programs have created new, patentable inventions.36 Koza argues that genetic algorithms may have an important advantage over human designers because they are not constrained by preconceptions; in other words, they may be more likely to result in an “outside-the-box” approach to the problem.37
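The mechanics just described, random initial code, crossover that emulates sexual reproduction, occasional mutation, and a fitness test, are easiest to see on a toy problem. The Python sketch below evolves a bit string toward an arbitrary goal; systems like Koza’s evolve actual program trees, but the shape of the selection loop is the same. All parameters here are illustrative assumptions.

```python
import random

random.seed(1)

LENGTH = 40          # size of each candidate "genome"
POP_SIZE = 100
MUTATION_RATE = 0.01  # per-bit chance of a random mutation

def random_individual():
    """Initial 'code' is generated completely at random."""
    return [random.randint(0, 1) for _ in range(LENGTH)]

def fitness(ind):
    """Fitness test: here, simply the number of 1 bits (goal: all ones)."""
    return sum(ind)

def crossover(mom, dad):
    """Emulate sexual reproduction: splice two parents at a random point."""
    cut = random.randrange(LENGTH)
    return mom[:cut] + dad[cut:]

def mutate(ind):
    """Every so often, flip a bit to push the search in new directions."""
    return [1 - bit if random.random() < MUTATION_RATE else bit
            for bit in ind]

population = [random_individual() for _ in range(POP_SIZE)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == LENGTH:
        break  # a perfect individual has evolved
    parents = population[: POP_SIZE // 5]  # only the fittest survive
    population = [mutate(crossover(random.choice(parents),
                                   random.choice(parents)))
                  for _ in range(POP_SIZE)]

population.sort(key=fitness, reverse=True)
print(f"generation {generation}: best fitness {fitness(population[0])}")
```

No individual is ever designed; fitter candidates simply out-reproduce the rest, which is what lets such systems stumble onto solutions a human designer, anchored by preconceptions, might never try.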
