Innovation often looks like a series of divergent choices about when, how, and what to invent. In our case, the choice appears to be between developing AI tools that merely assist humans in their work and agents that automate entire tasks outright. On closer examination, however, this is a false choice. Autonomous agents that fully replace human labor will be developed regardless, because their utility is immense and no mere assistive tool can substitute for them. The only real choice is whether this technological revolution unfolds without us or whether we accelerate it ourselves. This cuts against a deeply held intuition. Humanity is frequently compared to a ship's captain who can plot a course, avoid storms, and select a destination. Yet this view is wrong.
To a first approximation, the future course of civilization is already fixed, predetermined by hard physical constraints combined with unavoidable economic incentives. Whatever choices we make now, humanity will develop roughly the same technologies, in roughly the same order, and in roughly the same manner. Humanity is less like a ship's captain and more like a stream rushing into a valley along the steepest path. Damming the flow, prohibiting certain technologies, or aggressively pursuing others may divert the current for a while, but such efforts only delay, and never prevent, our arrival at the valley floor. This may sound surprising. After all, the world stopped human cloning, and the United States nearly stopped nuclear power. Doesn't that demonstrate our ability to choose which technologies we develop? In fact, we have far less control over our technological future than these examples suggest. The technology tree is found, not made. Technological progress follows a logical sequence: each innovation rests on a foundation of prior discoveries, forming a dependency tree that constrains what we can develop, and when. You can't invent the telescope before learning to grind optical lenses, or develop electric lighting before learning to generate electricity.
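To make the dependency-tree picture concrete, here is a toy sketch in Python. The graph below is hypothetical, assembled from the essay's own examples plus a few obvious links; the point is only that any feasible history of invention must be a topological ordering of such a graph.

```python
# Toy illustration of the "technology tree" as a dependency graph:
# a technology becomes reachable only after all of its prerequisites.
# The edges are illustrative assumptions, not a complete model.
from graphlib import TopologicalSorter  # Python 3.9+

prerequisites = {
    "telescope": {"optical lens grinding"},
    "optical lens grinding": {"glassmaking"},
    "electric lighting": {"electricity generation"},
    "electricity generation": {"metalworking"},
    "glassmaking": set(),
    "metalworking": set(),
}

# Any valid development order must respect every dependency edge:
# glassmaking precedes lens grinding, which precedes the telescope.
order = list(TopologicalSorter(prerequisites).static_order())
print(order)
```

Whatever order the sorter emits, every prerequisite comes before the technology that depends on it; that is precisely the sense in which the tree constrains what we can develop, and when.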
This technology tree is not of our making; its shape was fixed by facts about nature that lie beyond our control. The evidence comes from two observations: first, technologies routinely emerge soon after they become possible, often discovered simultaneously by independent researchers who had never heard of each other; second, when isolated societies confront similar problems with limited resources, they converge on the same fundamental technologies. Simultaneous discovery is common. In the same year, 1886, Charles Martin Hall in the United States and Paul Héroult in France discovered the Hall–Héroult process, which is still used to smelt the world's supply of aluminum. Separated by an ocean, both independently arrived at the same method: dissolving aluminum oxide in molten cryolite and applying electrolysis. Neither was aware of the other's work. In the late 1930s, the British engineer Frank Whittle and the German engineer Hans von Ohain independently demonstrated the first jet engines. Von Ohain's design was the first to fly (1939), and he was reportedly also the first to run his engine (March 1937), followed closely by Whittle (April 1937).
Perhaps most strikingly, on February 14, 1876, Alexander Graham Bell's lawyer filed a patent application for the telephone on the same day that Elisha Gray's lawyer filed a caveat for an almost identical design. Bell's filing was recorded as the fifth of the day and Gray's as the thirty-ninth. Bell received the patent, and who history remembers as the inventor of the telephone turned on the timing of those filings. Such coincidences are not anomalies. As Robert K. Merton put it: "In principle, the pattern of independent multiple discoveries in science is the dominant pattern, not a subsidiary one." The residual cases that require special explanation are the singletons: discoveries made only once in the history of science. The pattern suggests that technologies emerge almost spontaneously once the necessary conditions are met; when the prerequisites fall into place, invention follows quickly. Take LLMs as an example. We can infer from NVIDIA's revenue data that the compute required to train a GPT-4-level model did not become available until around 2020. GPT-4 itself was trained just two years later, in 2022.
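To see why the timing is so tight, here is a back-of-envelope sketch of that inference. Every figure below is an order-of-magnitude assumption chosen for illustration, such as a commonly cited rough estimate of GPT-4-scale training compute and A100-class chip throughput; none of it is data from NVIDIA's filings.

```python
# Back-of-envelope: how many accelerators does a GPT-4-scale run need?
# All constants are rough, illustrative assumptions.
TRAIN_FLOPS = 2e25            # assumed total training compute (commonly cited rough estimate)
CHIP_FLOPS = 3e14             # assumed peak throughput of one A100-class chip (FLOP/s)
UTILIZATION = 0.4             # assumed fraction of peak sustained in practice
RUN_SECONDS = 90 * 24 * 3600  # assume a roughly three-month training run

chips = TRAIN_FLOPS / (CHIP_FLOPS * UTILIZATION * RUN_SECONDS)
print(f"~{chips:,.0f} A100-class chips")  # on the order of 20,000
```

Under these assumptions, a single run needs tens of thousands of datacenter accelerators; at roughly ten thousand dollars apiece, that is hardware on the order of hundreds of millions of dollars, and fleets of that size only began shipping around 2020, which is the inference the revenue data supports.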
Isolated civilizations converge on the same fundamental technologies. When Hernán Cortés arrived in the New World in 1519, he encountered a civilization that had been developing independently of his own for over 10,000 years. The Aztecs were unlike the Spanish in many ways: they spoke a different language, practiced different cultural traditions, and worshipped different gods.
Yet for all their differences, the similarities were striking. Both had independently developed intensive agriculture with irrigation and terracing. Both laid out their cities on rectangular street grids centered on public plazas. Both used the concept of zero, wove cotton into dyed clothing, traded in currency, and constructed massive stone structures. Both were hierarchical societies with a monarch at the top, a hereditary nobility, bureaucracies to administer taxation, and standing professional armies.
This pattern is the rule, not the exception. Metalworking, the wheel, writing, and bureaucratic states were each developed independently by civilizations separated by vast distances and spans of time. The specifics varied with local circumstances: writing appeared on clay tablets in Mesopotamia, on papyrus in Egypt, on bamboo strips in China, and on bark paper in Mesoamerica. Each civilization worked within its local constraints, using the materials available to it. But when confronted with similar problems, each developed similar technologies. These observations suggest that, for many problems, there is essentially one solution that works. Societies do not have a free hand in how they develop; they are fundamentally constrained by what works and what doesn't. Certain technological and social structures must emerge at given developmental stages, regardless of specific cultural choices.
This principle parallels convergent evolution in biology, where different lineages repeatedly arrive at the same solutions to similar problems. The cephalopod eye and the vertebrate eye evolved completely independently, yet converged on remarkably similar camera-type structures. Both have a cornea, a spherical lens for focusing light, an iris for controlling how much light enters, a retina for resolving fine detail, eye muscles for tracking movement, and an optic nerve for carrying visual information to the brain.
We do not control where technology goes. Nuclear energy is heavily regulated, but that does not imply humanity has much control over technology in general. It is easy to constrain a technology when ready substitutes work about as well at nearly the same cost. Without nuclear energy, humans can still burn coal, oil, or natural gas, or use hydroelectric power, to heat their homes and run their devices.
The true test of humanity's ability to control technology comes from technologies that provide unique capabilities with no real substitutes. Rather than nuclear energy, we should look at nuclear weapons. Nuclear weapons are orders of magnitude more powerful than conventional alternatives, which helps explain why many countries developed and continue to stockpile them despite international efforts to limit nuclear proliferation.
History is replete with similar examples. In the 15th and 16th centuries, Catholic monarchies attempted to limit the printing press through licensing and censorship, but ultimately failed to curtail the growth of Protestantism. Even though Britain made it illegal to export industrial machinery at the beginning of the 19th century, the designs were still smuggled abroad, allowing new industries to emerge in Europe and the United States. The United States attempted to restrict strong encryption technology in the 1990s by classifying it as a munition, prosecuting developers, and promoting mandatory government backdoors through the Clipper Chip, but these efforts collapsed within five years as encryption software spread globally through the Internet.
Human cloning appears to be a genuine counterexample, but consider the timescales involved. Even if fully developed, cloning would take decades, if not centuries, to deliver meaningful competitive advantages, and it has been only about one human generation since it became technologically feasible. That we have refrained for a single generation tells us little about humanity's ability to resist technologies that confer immediate and significant competitive advantages. The broader picture suggests that when a technology offers quick, overwhelming economic or military advantages to its adopters, efforts to prevent its development will fail. It may be possible to delay it or regulate its use, but abandoning it completely seems impossible. Transformative technologies will be developed anyway.
Full automation is inevitable. AI is a compelling example of a technology that will be hard to limit. Because machines can in principle perform any task, AI promises to boost productivity in virtually every sector of the economy. Any nation that declines to adopt it will quickly fall far behind the rest of the world, given the rapid economic growth its deployment is likely to unleash.