In 50 years, humans will have long-term space colonies, according to the predictions of 1 in 3 Americans. A greater fraction of people think teleportation and lab-grown organs will become reality by 2064. The responses, based on a survey of 1,001 U.S. adults by the Pew Research Center, suggest that Americans feel both optimistic and apprehensive about future technology.
More than 65 percent of men thought technological change would lead to a future in which people’s lives are better, a sense of optimism shared by only 51 percent of women. Older, wealthier and more highly educated Americans were also more likely to feel positive about future technology.
When given a choice between futuristic inventions they could own, 9 percent of respondents chose a time machine, 4 percent wanted a personal robot servant and 2 percent wanted world peace.
But the 50-year predictions were what jumped out at us: 19 percent of respondents thought that humans would control the weather, and 51 percent said computers would be able to create art as well as humans do.
We could wait until 2064 and check whether these predictions come true. But we will probably be too busy teleporting our colleagues to the FiveThirtyEight Mars bureau to worry about the Pew study. So instead, we thought it would be worth looking at the pace of technological change — do the past 50 years of change suggest that the next 50 will be as dramatic as Americans believe?
Of course, how to measure the pace of technological change isn’t so obvious. But Ben Miller at the Information Technology and Innovation Foundation suggested four proxies that caught our eye. Let’s see how the past five decades look for each of them.
The first proxy is labor productivity. Economists love this one. The basic assumption is that better technology means lower costs and more output per hour of work. But when we looked at data on U.S. labor productivity since 1964, we found no evidence of a consistent rate of increase. Plenty of factors besides technology can change output, including wages and demand.
The second proxy is energy consumption. The basic theory here is that technological progress eats up energy. Except that ignores efficiency. As Miller puts it: “If I drive and you take the high-speed train, am I more technologically advanced than you because I used more energy — even though you got there first?”
The third proxy is computing speed. It rests on Moore’s Law, the observation that the number of transistors on a computer chip doubles roughly every two years. The history of computer performance suggests that speed has indeed increased. And if that speed is an indication of technological change, then the pace of change has accelerated. (Note: The y-axis below increases exponentially.)
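To get a feel for what doubling every two years implies, here is a minimal sketch. The function and its inputs are hypothetical round numbers for illustration, not actual chip data.

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward, assuming it doubles
    every `doubling_period` years (the Moore's Law rule of thumb)."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Over 50 years, doubling every 2 years compounds to 2**25 —
# roughly a 33-million-fold increase from whatever you start with.
growth_over_50_years = transistors(1, 2014, 2064)
print(round(growth_over_50_years))
```

That compounding is why the chart below needs an exponential y-axis: on a linear scale, everything before the last few doublings would be flattened against the bottom.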
The fourth proxy is patents. Because patents are meant to give inventors a monetary incentive to build their inventions, theory suggests they might be a good indicator of technological change. Here’s the data on the number of patents applied for and granted since 1964.
It looks like a pretty impressive and consistent increase. Except that not all patents involve genuinely new technology.
Technological change is happening, but whether it’s fast enough to meet the expectations of those survey respondents is another question. Judging by the 1964 New York World’s Fair, people have long dreamed big. The “picturephone” and “personal use of the computer” sound familiar these days, but the 1964 fair also foresaw “moon colonies,” which have yet to see the light of day, and “jetpacks,” which are still far from widespread use. That’s too bad for the 1 percent of Pew survey respondents who said they looked forward to owning a jetpack.