Blade Runner, released on June 25, 1982, cost about $30 million to make and earned just $41 million at the box office. It wasn’t a complete failure, but it was certainly a financial disappointment. Director Ridley Scott had already scored a science-fiction hit with Alien in 1979, and star Harrison Ford was a well-established name in Hollywood.
Instead of delivering a thrilling and action-packed blockbuster to rival the popular sci-fi film E.T. the Extra-Terrestrial, which also premiered that month, Scott presented audiences with a dystopian art-house film. Blade Runner transported viewers to a polluted and socially unequal version of Los Angeles in the year 2019, where “blade runners” hunted down bioengineered androids known as “replicants.” The film was characterized by its dark and rainy noir atmosphere.
Today, Blade Runner is considered a classic in the sci-fi genre, admired for its breathtaking visuals and profound exploration of a future that includes sentient artificial life. Many fans and film critics also hold its sequel, Blade Runner 2049, directed by Denis Villeneuve and released in 2017 (which also underperformed at the box office), in equally high regard, both visually and intellectually. Personally, I count myself among those fans who have watched and appreciated both films multiple times.
However, whenever I watch these films, I never think to myself, “This is what the future might actually be like.” The world-building, while visually and dramatically compelling, doesn’t add up. Even if we grant that every futuristic technology depicted is possible, the world built on top of those technologies isn’t believable.
The Blade Runner universe asks us to believe that the stagnation in technological progress that followed the 1960s either never occurs or ends quickly. It envisions a future where humanity achieves the advancements predicted by the futurists and technologists of the 1960s: artificial general intelligence, sentient AI, bioengineered androids with capabilities superior to humans, colonies established across the Solar System, and flying cars possibly using anti-gravity technology (which would also aid in space colonization).
Taken together, these remarkable advancements would seemingly answer those who are pessimistic about the future of technology and its impact on the economy. First, they would counter the notion that the inventions of the past half-century, primarily in information technology, are less consequential than those of the century before. Second, they would challenge the claim that groundbreaking ideas have become harder to find, demanding ever more researchers and ever larger budgets.
Let’s focus on artificial general intelligence (AGI), or human-level AI. Achieving it would arguably be the most significant scientific breakthrough in human history. Economist Anton Korinek and researcher Megan Juelfs of the University of Virginia have examined what would happen if advanced AI and robotics fully substituted for human labor. They suggest that such a development could raise economic growth dramatically, potentially by orders of magnitude.
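To get a feel for what “orders of magnitude” means here, consider a quick compounding exercise. The rates and horizon below are my own illustrative picks, not figures from Korinek and Juelfs: roughly 2 percent annual growth (near the postwar norm for rich countries) versus a hypothetical AI-driven 20 percent.

```python
# Illustrative only: how compounding turns a higher growth rate into
# orders-of-magnitude differences in output over decades.
# The rates and horizon are hypothetical, not from Korinek and Juelfs.

def compound(rate: float, years: int) -> float:
    """Total growth factor after `years` of constant annual `rate`."""
    return (1 + rate) ** years

baseline = compound(0.02, 50)  # ~2% a year, roughly the postwar norm
ai_boom = compound(0.20, 50)   # a hypothetical AI-driven boom

print(f"2% for 50 years multiplies output by about {baseline:.1f}x")
print(f"20% for 50 years multiplies output by about {ai_boom:,.0f}x")
```

Fifty years at 2 percent not quite triples output; fifty years at 20 percent multiplies it by roughly nine thousand, which is the kind of gap the phrase “orders of magnitude” is pointing at.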
AGI wouldn’t just mean machines working longer and more efficiently than humans; it would put an almost unlimited number of minds to work on complex problems as super-research assistants. It would revolutionize the very process of invention. In their paper “Artificial Intelligence and Economic Growth,” economists Philippe Aghion, Benjamin F. Jones, and Charles I. Jones explain that AI has the potential to transform how we create new ideas and technologies, facilitating the resolution of intricate problems and scaling up creative effort. In extreme scenarios, some believe AI could rapidly self-improve, leading to unbounded machine intelligence and/or economic growth.
So where does the world of Blade Runner fall short? With all that technology in hand, it should depict not one of cinema’s most famous dystopias but a world of much greater wealth, better health (given the historical link between technological progress and life expectancy), abundant resources, and enhanced problem-solving capacity. If replicants take over many jobs, a generous basic income could support displaced human workers. Climate change could be tackled by mega-machines extracting carbon from the atmosphere. Resource constraints could be overcome through asteroid mining. And for those facing overpopulation, opportunities await in the off-world colonies advertised in the first film. Life on Earth should be prosperous and sustainable for those who remain, too.
Even the two malevolent corporations in the films, the Tyrell Corporation and the Wallace Corporation, suggest the continued existence of entrepreneurial opportunities in a world where companies like Atari, Pan Am, and RCA thrive. This implies that the economically volatile 1970s in our reality were likely far more prosperous in the Blade Runner universe.
One might argue that all the benefits of technological progress and faster economic growth would be concentrated in the hands of a wealthy elite. The films portray such an outcome, with replicants essentially enslaved despite their sentience and most non-emigrant humans living in a dreary and environmentally devastated world beneath towering corporate skyscrapers. However, economic history suggests a different future, one that is far less dystopian. Over time, the consumption habits of the wealthy tend to become accessible to the broader population. Elon Musk doesn’t have exclusive access to a superior COVID-19 vaccine. SpaceX’s innovations in reducing the cost of space launches open up opportunities for regular people to travel to and live in space, not just the super-rich.
Furthermore, economic history shows that even during periods of rising inequality, faster productivity growth improves living standards. A chart created by Robert Lawrence of the Peterson Institute for International Economics demonstrates a close relationship between productivity and compensation. And although income inequality rose by roughly 25 percent between 1979 and 2016, according to the Congressional Budget Office, measures of after-tax income show that middle-class living standards rose by 42 percent over that period, and those of the bottom fifth by 70 percent. There is no compelling reason to believe these economic relationships would be fundamentally different in the future, unless the plot requires it.
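As a back-of-the-envelope check on the CBO figures cited above, those cumulative gains can be converted into average annual growth rates over the 1979–2016 span (the arithmetic below is my own illustration, not the CBO’s):

```python
# Convert the cited total income growth from 1979 to 2016
# into the average annual rates it implies.

def annualized(total_growth: float, years: int) -> float:
    """Average annual growth rate implied by `total_growth` over `years`."""
    return (1 + total_growth) ** (1 / years) - 1

years = 2016 - 1979               # 37 years
middle = annualized(0.42, years)  # middle-class after-tax income, +42%
bottom = annualized(0.70, years)  # bottom fifth, +70%

print(f"Middle class: about {middle:.2%} per year")
print(f"Bottom fifth: about {bottom:.2%} per year")
```

That works out to roughly 1 percent a year for the middle class and about 1.4 percent for the bottom fifth: modest next to postwar booms, but steady improvement rather than the immiseration Blade Runner imagines.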
While the economics depicted in Blade Runner may make some sense in light of Engels’ Pause, the stretch of the early Industrial Revolution when output expanded while living standards stagnated, it is unlikely that the filmmakers intentionally referenced it. During Engels’ Pause, real wages in certain occupations fell drastically, and many workers couldn’t be retrained to compete in the changing economy. Families suffered for generations, lives were disrupted, and social unrest was widespread. But that historical episode is not what the technological progress envisioned in Blade Runner would likely produce.
I believe that our appreciation for Blade Runner and Blade Runner 2049 will only grow in the years to come, driven more by what the films say about human interaction with AI than by their portrayal of the flaws of capitalism. Then again, Gene Roddenberry’s future may be the more appealing one.