Eli Dourado

Notes on technology in the 2020s

As we start a new decade, it’s a good time to reflect on expectations for the next 10 years. Tyler thinks the Great Stagnation could be ending. Caleb sees cracks. Noah expresses techno-optimism. In this post, my aim is not to predict an end or non-end to stagnation. Rather, it is to think through the particulars of how technology could evolve over the next decade. Then we can assess separately whether we should consider it the Roaring 20s or the Boring 20s.

What would constitute an end to the Great Stagnation? Any precise cutoff will be arbitrary, but for the sake of discussion, let’s say sustained growth in utilization-adjusted total factor productivity of 2 percent per year. By comparison, mean utilization-adjusted TFP growth from 1947 through 1972 was 2.1 percent. Since 2005, it has been 0.17 percent. (Note: it is important to use the utilization-adjusted series, as this corrects for the business cycle.)
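
To make those rates concrete, here is a minimal sketch (in Python, assuming simple geometric compounding of the average rates quoted above) of how much cumulative TFP growth a decade at each rate implies:

```python
# Cumulative TFP growth over a decade, compounding the average annual rates cited above.
def cumulative_growth(annual_rate_pct: float, years: int = 10) -> float:
    """Total growth after `years` of compounding at annual_rate_pct percent per year."""
    return (1 + annual_rate_pct / 100) ** years - 1

for label, rate in [("1947-1972 average", 2.1),
                    ("Post-2005 average", 0.17),
                    ("Proposed end-of-stagnation bar", 2.0)]:
    print(f"{label}: {cumulative_growth(rate):.1%} over 10 years")

# 1947-1972 average: 23.1% over 10 years
# Post-2005 average: 1.7% over 10 years
# Proposed end-of-stagnation bar: 21.9% over 10 years
```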

[Chart: Total factor productivity in the U.S. since 1947]

Whatever your cutoff for TFP growth, one of my convictions is that scientific breakthroughs alone are not enough to drive an end to the Great Stagnation. TFP only budges when new technologies are adopted at scale, and generally this means products, not just science. Science lays critical groundwork for new technology, but after all the science is done, much work remains. Someone must shepherd the breakthrough to the product stage, where it can actually affect TFP. This means building businesses, surmounting regulatory obstacles, and scaling production.

With that caveat firmly in mind, what will the next decade bring in terms of meaningful technological change? Here’s what I’m watching.

Biotech and health

We are coming off a huge win: two new mRNA COVID vaccines, conceived and brought to market in less than a year. The ability to encode and deploy arbitrary mRNA in our bodies sure seems like a game changer—it allows us to essentially program our cells to make whatever proteins we want. In the case of the COVID vaccines, the vaccine payload instructs our cells to make the coronavirus spike protein, which our immune system then learns to attack. Bert Hubert has a fascinating write-up of the “code” in the vaccine.

Bringing a brand new vaccine to market in less than a year—using a never-before-applied-in-humans-at-scale technology no less—is a world record, but it could have been even faster. As David Wallace-Wells emphasizes, Moderna’s vaccine was designed by January 13. We had it the whole time. Some delay was necessary to determine effective dosing. Some further regulatory delay may have been warranted to ensure the vaccine was safe and to ascertain its efficacy. But as Wallace-Wells indicates, the regulatory outcome was never really in doubt. “None of the scientists I spoke to for this story were at all surprised by either outcome,” he writes. “All said they expected the vaccines were safe and effective all along.”

What should we make of the fact that all of the scientists knew all along that Moderna’s vaccine would work? The question in my mind is: what other mRNA treatments have we had the whole time? What if I told you Moderna has an HIV vaccine candidate? HIV mutates far faster than SARS-CoV-2 and has evaded vaccine efforts for decades, so it may prove a more challenging foe. But don’t you wonder, if we treated the problem with real urgency, whether new mRNA technology could wipe out the AIDS epidemic this decade? I do.

And mRNA technology can be deployed against more than just viruses. Both Moderna and BioNTech have personalized vaccine candidates targeting cancer. Although called a “cancer vaccine,” the treatment is only administered once the subject has cancer—it isn’t preventative. The companies use an algorithm to analyze the genetic sequences of the tumor and the patient’s healthy cells and predict which molecules could be used to generate a strong immune response against the cancer. “I was actually witnessing the cancer cells shrinking before my eyes,” said Brad Kremer, a melanoma patient who received the BioNTech treatment. So let’s milk mRNA technology for all it’s worth this decade. It can save us from more than just a pandemic.

What about CRISPR? It is a great example of a technology that has not yet made a meaningful economic contribution. Although the technique for editing DNA was discovered in 2012—and a Nobel Prize was awarded to its two discoverers this year—no treatment using CRISPR has been approved outside of clinical trials. So far, its impact has been limited to making researchers more productive—not a bad thing, to be sure, but not close to CRISPR’s full potential. As trials progress, however, I do think some CRISPR treatments will come online in the next few years, especially those targeting genetic disorders that we have very limited means of otherwise treating.

DeepMind’s protein-folding breakthrough signals a promising decade for the science of proteomics. Most directly, being able to predict protein shapes will enable us to discover drugs more rapidly. Buuuut, because drug trials take many years, we might expect this technology not to really be felt by the general public until the 2030s.

What DeepMind’s achievement indicates to me the most is that machine learning is actually useful. This might seem obvious, but consider: most applications of machine learning so far—excluding autonomous vehicles, which have themselves not really arrived yet—are toys. I love watching AlphaZero crush Stockfish on YouTube, but chess is literally a game. GPT-3 produced some fun demos. AlphaFold heralds something different—non-toy superhuman performance is now here, and I am interested to see what else it can do. Aside from the aforementioned AVs, I expect it to be applied widely in other areas of biology. Again, it will take a long time for the breakthroughs to trickle down into products, but at least the 2030s should be sick. I mean, not sick. Healthy.

Let’s talk about life extension, one of my favorite biotech topics. 2020 was a big year for the Conboy Lab at Berkeley, which proved that all the weird past findings about “young blood” extending life were not actually due to any elixir in the blood of children (thank goodness). Rather, the rejuvenating aspects of young blood experiments were due to the dilution of harmful factors in old blood. By mechanically removing plasma and replacing it with saline and enough albumin to replace what was taken out, they diluted aged blood factors in both mice and humans and were able to rejuvenate germ layer tissues and improve cognition by reducing neuroinflammation.

These findings are exciting not only because they represent a scientific advance in understanding aging, but also because they herald the first real anti-aging product that could come to market. Therapeutic plasma exchange is FDA-approved (not for aging, but for a bunch of other conditions). I imagine there remain prohibitions on advertising that it can add years to your life, but it is safe, and a doctor can prescribe it off label. It’s also cheap. An automated plasmapheresis machine—which lets you do treatment after treatment—can be bought online for under $3,000. That is less than the cost of a single transfusion of young blood sold by the startup Ambrosia. How long until someone opens a clinic offering plasma dilution? I bet someone tries it in 2021. If it works, people will get over the weirdness, and it could be commonplace by 2030.

Another longevity product that is about to get hot: aging clocks based on DNA methylation or proteomics. Do you want to know how biologically old you are? Today, for a few hundred dollars, you can get a test that will tell you. As these tests become better and cheaper, self-experimenters are going to have a field day. With before-and-after aging tests, anyone who can get their hands on human growth hormone could replicate the protocol used by Fahy et al. to rejuvenate the thymus. Because the thymus is a central part of the immune system, and its decline is a major factor in aging, this is non-trivial rejuvenation. The Fahy study found that 12 months of treatment created about 2.5 years of epigenetic rejuvenation, with results accelerating in the last quarter of the trial.

There is a lot more in the Rejuvenation Roadmap—dozens of possible life-extending treatments are at various stages of development. There’s a good chance a few senolytic drugs will be approved by the end of the decade. As I noted yesterday at Fortune, we spend less than 1% of the NIH budget on aging biology—we should raise that by a lot.

Unlike others, I am not so bullish on metformin. It does seem to reduce all-cause mortality in Americans, but it may do so because 88% of Americans are metabolically unhealthy. If you are one of the healthy 12% (and you should strive to be), I don’t think metformin will do much for you.

One final biotech observation: every year, the Apple Watch gets a new health-related sensor. This year it was blood oxygen, pretty good for detecting if you might have COVID! Fast forward to 2030 and wearables will have at least 10 more health-related sensors than they do today. Some no-brainers are body temperature, blood pressure, and blood glucose sensors. What will the other 7 be? At some point, it becomes possible to replace a lot of primary care with continuous monitoring. A few smart algorithms to provide simple medical advice could improve population-level health without much cost. More data could also yield faster, more accurate, and of course more remote diagnoses when you do have to see a doctor.

There is a lot in biotech that is promising right now, but more than in any other field, it is important not to be seduced by sexy headlines showing rapid scientific progress. Don’t get complacent. Biology is proceeding faster than medical productivity because many of the wonderful discoveries are not being translated into approved treatments and products at a decent rate. Let’s salute and cheer for the discoveries, but spare more than a passing thought for the entrepreneurs trying to bring treatments to market.

Energy

The 2010s were the wind and solar decade. We observed stunning declines in the cost of both, although total deployment of wind and solar remains small—in 2019, wind and solar represented less than 9 percent of utility-scale electricity generation in the US. In the 2020s, cost declines will likely stall—wind and solar are already pretty cheap, so the declines of the past decade are not reproducible. Deployment, on the other hand, will accelerate.

Mass deployment of wind and solar will bring challenges. These sources are highly intermittent. When the wind suddenly stops blowing—which happens—we need a way to quickly make up the deficit. Each of the three electricity grids in the continental US—east, west, and Texas—has to remain in supply-demand balance every second of every day. We can use grid storage to smooth out some of the bumps, but storage remains expensive. To reach a grid powered entirely by today’s renewables, we would need storage at a price of $20 per kWh (with caveats).

That storage doesn’t all have to come from batteries, but let’s talk about batteries for a bit. Using Tesla’s grid-scale Powerpack as a data point, a 232 kWh battery today costs $125,793. That is a price of over $542/kWh. Through innovation, that price tag will come down over the course of this decade, but improvements on the supply side could easily get swamped by increases in demand. After all, this decade will also include a huge shift toward electric vehicles, which I will discuss below. When demand outpaces supply, prices tend to stay high, even when there is impressive innovation.
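
The per-kWh figure is just the quoted Powerpack price divided by its capacity; a quick sketch of the arithmetic, including how far prices would need to fall to hit the $20/kWh storage target mentioned above:

```python
# Grid battery cost per kWh, using the Powerpack figures cited above.
powerpack_price_usd = 125_793
powerpack_capacity_kwh = 232

cost_per_kwh = powerpack_price_usd / powerpack_capacity_kwh
print(f"Current cost: ${cost_per_kwh:,.0f}/kWh")            # ~$542/kWh

# Storage price needed for a grid powered entirely by today's renewables (with caveats).
target_cost_per_kwh = 20
print(f"Required decline: ~{cost_per_kwh / target_cost_per_kwh:.0f}x")  # ~27x
```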

With increased deployment of intermittent power generation, increased total demand for electricity due to electric vehicles, a high cost of grid storage, inadequate electricity transmission (have I mentioned that we often neglect to build in this country?), and strong political support for decommissioning fossil fuel plants, the 2020s may be a time of electric grid instability. This could be tempered to some extent by using car batteries as grid resources and through (politically unpopular) variable electricity prices.

Ultimately, we need scalable zero-carbon baseload energy, which means nuclear or geothermal. The problem with nuclear is the high cost. If you look at NuScale’s small modular reactor technology, they are targeting 6.5¢/kWh. That is baseload power, so not directly comparable to wind and solar’s intermittent generation costs, but even so, it isn’t the most competitive in today’s market. Furthermore, NuScale’s flagship project was just delayed three years and is now not scheduled to come online until 2030.

What is more plausible this decade is enhanced and advanced geothermal systems. The legacy geothermal industry is sleepy, tapping energy at traditional volcanic hydrothermal hotspots—forget about it. The next generation of the industry, however, is a bunch of scrappy startups manned by folks leaving the oil and gas industry. The startups I have spoken to think with today’s technology they can crack 3.5¢/kWh without being confined to volcanic regions. With relatively minor advancements in drilling technology compared to what we’ve seen over the last decade, advanced geothermal could reach 2¢/kWh and scale to become viable just about anywhere on the planet. Collectively, the startups are talking about figures like hundreds of gigawatts of generation by 2030. I’m watching this space closely; the Heat Beat blog is a great way to stay in the loop. As I wrote last month, permitting reform will be important.

Fusion continues to make technical progress. I expect we will get a demonstration of energy-positive fusion in this decade from one of several fusion startups or perhaps Lockheed Martin’s compact fusion reactor. But again: a demonstration is far from a change that transforms society. It will take further decades to deploy reactors onto the grid. By the time fusion gets there, the energy market will be quite different from when we started working on fusion reactors in the 1940s. Wind, solar, and hopefully geothermal will make electricity pretty cheap, and fusion will struggle to compete.

Consider: around half the cost of an advanced geothermal plant is drilling, and half is conversion equipment. Suppose the plant is amortized over 30 years (although many geothermal plants last longer), and after that period the conversion equipment needs to be replaced. But the hole in the ground does not need to be replaced! That means for the next 30 years, electricity can be generated at half the initial cost. Geothermal wells we dig this decade could be producing at less than 1¢/kWh by the 2050s. That is a tough market for fusion to break into. But fusion will still be a great source of power in applications where other sources aren’t available, such as in space.
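
Here is that arithmetic as a rough sketch, assuming the roughly 50/50 split between drilling and conversion equipment and the 2¢/kWh target cited above:

```python
# Rough sketch: geothermal generation cost after the original well is paid off.
# Assumes ~half of plant cost is drilling (sunk once the well exists) and
# ~half is surface conversion equipment (replaced after the amortization period).
initial_cost_per_kwh = 0.02      # $/kWh, the 2-cent advanced-geothermal target
equipment_share = 0.5            # only this portion recurs after year 30

post_amortization_cost = initial_cost_per_kwh * equipment_share
print(f"Cost after the 30-year amortization period: "
      f"{post_amortization_cost * 100:.1f} cents/kWh")       # ~1.0 cent/kWh
```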

The 2020s will be a big decade for sustainable aviation fuel (SAF). Commercial aviation can’t electrify—batteries will never match fossil fuels’ energy density. Given political realities, aviation has no choice but to decarbonize, which means either hydrogen fuel or SAF. Hydrogen fuel is much better than batteries, but still not as energy dense as fossil fuels or SAF, so my money is on SAF, and particularly on fuel made from CO₂ pulled from the atmosphere. It is easy to convert atmospheric CO₂ to ethanol in solution, and it is easy to upgrade ethanol into other fuels. But it is hard to separate ethanol from water without using a lot of energy—unless you have an advanced membrane as Prometheus Fuels does. I have written about Prometheus before and continue to follow them closely. Their technology could decarbonize aviation very suddenly.

One final note on energy: there may be very interesting geopolitical consequences in the decade ahead to America’s newfound energy independence. I could easily see, for example, the US deciding we actually don’t need an alliance with the Saudis after all, considering they are journalist-dismembering savages. If the US pulls out of Saudi Arabia, war between the Saudis and the Iranians becomes likely. Which means oil shipments to Asia get disrupted. Which means global chaos. This Zeihanesque scenario is only a scenario, but I’m watching for it.

Transportation

Here’s the thing about electric cars: they are better than regular cars. They have lower fuel costs. They have fewer moving parts and thus lower maintenance costs. They have higher low-end torque and faster acceleration. If you mainly drive to and from work and have a charger at home, you never have to stop for gas. Electric cars will win because they are better, and the shift will happen suddenly.

California will require that new cars purchased after 2035 have zero emissions. For most people, this will be a non-issue, as by 2035 most Californians would not dream of getting an internal combustion engine vehicle. I say this even as a relative battery price pessimist—or more accurately, I am a relative battery price pessimist because I think demand for batteries will be off the charts.

One area where batteries may not work (aside from aviation, already discussed) is trucking. Towing really heavy loads requires a lot of energy—hydrogen fuel cells will be more suited to interstate trucking. The transition from diesel to hydrogen in trucking will likely not be as automatic as the transition from gas to batteries in cars. It’s possible that truckers will need a bit of a push.

As cars shift to electric and trucks shift to hydrogen, air pollution will plummet, especially the currently unregulated ultrafine particles (less than 0.1 μm in diameter) that cause the worst health harms. There may still be larger particles from tires degrading and so on, but my view is that these do not cause serious health problems. Getting rid of the smallest particles, particularly from diesel fumes, will create health gains that may seem to appear out of nowhere. Fewer premature births, fewer cases of asthma, fewer cancers, fewer mystery illnesses.

Autonomous vehicles could finally happen at scale. Waymo is already in production with a driverless fleet in Phoenix. Tesla has a “full self-driving” computer which might not yet live up to the label, but is nonetheless very cool. Although we have all been continually disappointed by the promise of autonomy right around the corner, it does seem like it has to happen at some point. As sensors and computing power get cheaper, and machine learning algorithms get better, autonomy is inevitable. A decade is a long time, so I am reasonably confident it will happen in the 2020s. It could save a lot of lives. Autonomy, too, will accelerate the adoption of electric vehicles, as fleet companies will prefer the low maintenance costs of battery or fuel cell vehicles.

Let’s do aviation. As everyone knows, I am a huge fan of supersonics. I continue to cheer for my former colleagues at Boom, who will legitimately fly a supersonic aircraft in 2021. Supersonic travel will have an enormous impact on global business when it arrives at scale, but it’s looking like that won’t be in the 2020s—Boom’s most optimistic timeline per a recent article is a first full-scale airliner by 2026, then several years of certification tests, then a ramp in production to make it matter.

Other contenders: Aerion has a Mach-1.4 business jet design ready to enter production. The key question is whether they can raise the money to build the factory. The business jet market is small, so as an investor you really have to believe they will be able to parlay success on the AS2 into a future airliner program. Hermeus is working on a Mach-5 design point with 20 passengers. Exosonic is targeting Mach 1.8 with low boom technology that could allow it to operate over land. Gulfstream seems to have shuttered their supersonic business jet program, which was never announced in any case.

In addition to Boom’s XB-1 flight test program commencing in 2021, I’ll also be watching two other supersonic flight programs in the early 2020s. NASA’s X-59 will start flying over select cities in 2023 to collect data on acceptable levels of sonic boom. This will pave the way for new standards that unlock overland supersonic flight. NASA is literally making America boom again! In the next couple years, I also expect an unmanned demonstrator flight from Hermeus reaching as high as Mach 5. Awesome, right?

Aside from supersonics, the other exciting development in aviation is the proliferation of urban air mobility companies. Check out Joby, which recently acquired Uber’s Elevate division. Or Wisk, a joint venture between Boeing and Kitty Hawk. Both of these projects could enter service in the first half of the decade. Hyundai also has a new UAM division—look for their product to enter service near the end of the decade.

A key question in my mind regarding urban air mobility is whether regulations will allow autonomy. The business model doesn’t seem like it will work if you have to pay a pilot and lose the space associated with the pilot’s seat, which could otherwise serve an additional passenger. The FAA has been very incremental about allowing even small drones to fly beyond line-of-sight of the operator. In order for urban air mobility to compete with, say, Uber Black, FAA needs to adopt rules for low-altitude air traffic control (called UTM) and figure out a way to certify autonomous operations. Wisk seems pessimistic—they are targeting New Zealand as a first market.

Drone delivery is likely in the 2020s. FAA is about to issue a rule incrementally expanding drone operations, this time allowing operations at night and flights over crowds of people. This is how the FAA operates: use waiver authority to expand the scope of drone operations, and then once they are comfortable with that, make it into a generally applicable rule. I expect the process will incrementally allow bigger and bigger drone delivery programs until they become normal. Those of us who live within five miles of a Class B airport, however, may be out of luck the entire decade.

Let’s talk about tunnels. An efficiently governed country would need some tunnels, but perhaps not very many. China has added 25,000 km of (mostly non-tunnel) high-speed rail since 2008, and there is no technological reason why we couldn’t have done the same. But with a promiscuous distribution of the veto power, building long rail lines above ground becomes challenging. It may therefore be worth the high cost of tunneling to build new high-speed transport options.

The Boring Company has a small, near-operational “loop” under construction in Las Vegas. The project will whiz people around the convention center at up to 155 mph. Expansion plans include the Las Vegas Strip, the airport, and eventually connecting to Los Angeles. Another Boring project, currently mired in environmental review, is the DC-Baltimore Loop, which would connect the two cities’ downtowns in 15 minutes. All of Boring’s loops are designed to be compatible with hyperloop requirements, which would eventually enable 600-mph travel between major cities.

Although the full realization of this technology—a nationwide hyperloop network—is unlikely by 2030, even the 150-mph version is worth following. The time and hassle cost of travel is an important input into the gravity model of trade. I expect the DC-Baltimore Loop to significantly increase economic activity between the two cities—especially helping to revitalize Baltimore, as it would become easier to live there and work in DC.

Space

The big story in space technology for the next 10 years is Starship, as it will enable just about everything else. Let’s compare some launch costs. The Space Shuttle entered service in 1981 and launched successfully 134 times. Each launch cost an inflation-adjusted $1.8 billion. The payload cost to low-Earth orbit (LEO) was $65,400/kg. Today’s workhorse launch vehicle, the Falcon 9, can send cargo to LEO for $2,600/kg. That is a staggering decrease in launch costs.

Starship promises to take this trend much further. On Falcon 9, only the first stage is reusable, whereas on Starship, the entire system—both the booster and the space vehicle—is reusable. Starship runs on dirt cheap liquid methane instead of expensive rocket fuel. It is made out of stainless steel instead of more expensive traditional aerospace materials. SpaceX is talking about churning out Starships at a rate of one every 72 hours for a cost of $5 million each. Operating costs come down with a high flight rate, so Elon is figuring a $1.5-million fully burdened launch cost for 150 tons to LEO. That is $10/kg, more than 100 times cheaper than a Falcon 9 launch today.
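
These are SpaceX’s own aspirational targets rather than demonstrated prices, but the per-kilogram arithmetic is simple; a quick sketch using the figures above:

```python
# Launch cost per kilogram to LEO, from the figures cited above.
starship_launch_cost_usd = 1.5e6        # fully burdened target cost per launch
starship_payload_kg = 150 * 1000        # 150 metric tons to LEO

falcon9_cost_per_kg = 2_600             # Falcon 9 today
shuttle_cost_per_kg = 65_400            # Space Shuttle, inflation-adjusted

starship_cost_per_kg = starship_launch_cost_usd / starship_payload_kg
print(f"Starship target: ${starship_cost_per_kg:.0f}/kg")                          # $10/kg
print(f"vs Falcon 9: {falcon9_cost_per_kg / starship_cost_per_kg:.0f}x cheaper")   # ~260x
print(f"vs Shuttle: {shuttle_cost_per_kg / starship_cost_per_kg:,.0f}x cheaper")   # ~6,540x
```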

It gets even more insane. Because Starship is designed to be refuelable on orbit, its 150-ton payload capacity to LEO equals its payload capacity to anywhere in the solar system. You will be able to launch 150 tons to LEO, load up on fuel while orbiting Earth, and then fly the same payload the rest of the way to the moons of Jupiter. The whole thing could cost less than one Falcon 9 launch—which is limited to 15 tons to LEO in a reusable configuration or 4 tons to Mars in an expendable configuration.

Let’s apply the gravity model of trade once more, this time to commerce between Earth and LEO. Meta-analyses have found that trade (on Earth) is roughly inverse-linear in transport costs. If that holds for space, a 200x cost reduction in travel between Earth and LEO should increase “trade” between Earth and LEO by 200x. Commerce between the Earth and the moon, or between the Earth and Mars, starting from a base close to zero, would be stimulated even more.
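
In stylized gravity-model form (a sketch, not a precise law: T is trade between two locations, M is the economic mass of each, and c is the transport cost between them; the unit elasticity in transport costs is the rough meta-analysis finding referenced above):

```latex
T_{ij} \;\propto\; \frac{M_i \, M_j}{c_{ij}},
\qquad
\frac{T_{ij}^{\text{new}}}{T_{ij}^{\text{old}}}
\;=\; \frac{c_{ij}^{\text{old}}}{c_{ij}^{\text{new}}}
\;\approx\; 200
\quad \text{for a } \sim\!200\times \text{ reduction in transport cost.}
```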

It’s worth noting a second-order effect of cheap launch costs. When launch is expensive, more engineering has to go into the payload to ensure reliability. You don’t want to spend $1.8 billion on launch, and then find out, as NASA did with the Hubble Space Telescope, that your new satellite needs repairs. This dynamic has caused over-engineering of space payloads. With launch at a new low price of $10–20/kg, companies and research agencies will be able to reduce engineering expenses by simply taking on the risk of paying for another (cheap) launch.

Since my guiding star is economically noticeable technological progress, let’s talk about that. SpaceX first landed a rocket booster five years ago. They have been undercutting all other players in the medium-lift launch market ever since. But in the grand scheme of things, launch is still a small market. Aside from getting to watch cool livestreams of boosters landing, Falcon 9 has probably not made a noticeable impact on your life (unless you work in the space industry).

That is finally beginning to change with Starlink. As of this month, there are 955 Starlink satellites providing Internet access to thousands of users in a “better-than-nothing” beta test. The constellation size could go as high as 42,000 satellites. Internet speeds are already over 100 Mbps down—they seem to be only somewhat attenuated by bad weather. For many rural customers, the service is indeed much better than nothing—better than any other available alternative. With more (and more advanced) satellites in operation, speeds could reach a gigabit. With Starship, the cost of launching these thousands of satellites, and the speed at which the company could do so, will improve. Plan on a full buildout of the network this decade.

Starlink could be a cash cow. The service is not a good fit for most customers—urban populations are too dense and have too many alternative service providers for Starlink to be viable. Elon has said Starlink will serve the 3–4 percent hardest-to-reach customers. In addition to rural customers, it will presumably serve other niches like in-flight wifi on airplanes and Internet access for the crew on container ships.

Let’s call global telecommunications revenue $2.4 trillion. Assume Starlink can capture 3 percent of that. That is $72 billion per year in revenue, faaaaar more than SpaceX makes in launch. In 2019, the company had only $2 billion in revenue. Starlink is a money printer. And it makes you wonder, if SpaceX’s success so far has come on a budget of $2B in annual revenue, what would a $72B-per-year SpaceX do?
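
The revenue math is easy to sanity-check (the $2.4 trillion market size, 3 percent capture, and $2 billion 2019 revenue figures are the assumptions stated above):

```python
# Starlink revenue back-of-the-envelope, using the assumptions stated above.
global_telecom_revenue_usd = 2.4e12
starlink_share = 0.03
spacex_2019_revenue_usd = 2e9

starlink_revenue_usd = global_telecom_revenue_usd * starlink_share
print(f"Starlink revenue: ${starlink_revenue_usd / 1e9:.0f}B per year")   # $72B per year
print(f"Multiple of SpaceX's 2019 revenue: "
      f"{starlink_revenue_usd / spacex_2019_revenue_usd:.0f}x")           # 36x
```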

How about colonize Mars? I have a bet with Robin Hanson that a human will set foot on Mars by end of Q1 2030. I am not totally confident that this will happen (Robin gave me odds), but the scenario I think is most likely is the following: Starlink prints a lot of money, and SpaceX uses the money to pay for Mars colonization on Starship at a breakneck pace. That results in a human launch to Mars no later than January 2029, landing in September 2029. SpaceX President Gwynne Shotwell has said it will be a “major company fail” if humans are not flying on Starship (presumably just to LEO) by 2023. With Starlink revenue, SpaceX will be able to do the work on life support systems and mission planning to enable a human mission by 2029. NASA could be involved as a partner, but SpaceX would tolerate zero government obstacles.

Starship is also still in the running to be the landing vehicle for NASA’s Artemis missions. If it is not selected, that seems like a colossal error. To be sure, choosing Starship would represent a huge change of plans for NASA, which had been assuming a congressionally supported boondoggle relying on the Space Launch System, the Orion capsule, and a moon-orbiting Gateway. While Artemis’s goal is a human mission to the moon’s south pole by 2024, the schedule could easily slip. But by leveraging the new opportunities afforded by Starship, a permanent moon base by the end of the decade seems highly plausible.

With lower launch costs, what else is possible? Varda is a new company working on in-space manufacturing. Microgravity means that structures can be used that would collapse under their own weight on Earth. As a result, certain pharmaceuticals, fiber optics, semiconductor wafers, and nanotube materials can be manufactured in space that can’t be made on our planet. Lots of people want to bring manufacturing back to America, but putting manufacturing in orbit is much more exciting.

How about asteroid mining? I think this is still a ways off. There’s no question that it could be profitable someday. The street value of the materials on 16 Psyche back on Earth is $10 quintillion—even allowing for the inevitable hefty price slippage, space resource extraction could make a few trillionaires. I would love to be proven wrong, but I don’t think serious space mining will happen until the 2030s at the earliest. Again, however, cheap launch costs could be a game changer.

Information technology

Custom silicon is going to be huge. The rave reviews for Apple’s new custom system-on-a-chip platform demonstrate its inevitability. For machine learning, Cerebras has a wafer-scale SoC. Tesla’s “full self-driving” computer likewise uses custom silicon. Almost all computer hardware—anything that has any scale to it—will move in this direction, because the performance benefits are so large. In a way, it’s a repetition of what happened before in semiconductors: individual transistors gave way to the integrated circuit. This change simply takes integration another level further. Note: we lack the capability to manufacture these SoCs (at least good ones) in North America. Given their strategic importance, it may be worth remedying that.

The 2020s will be the decade that makes or breaks cryptocurrency. Well, nothing will ever break cryptocurrency—true believers will run the networks forever no matter what. But for cryptocurrency to have long-run value, I still hold that it needs to have mainstream uses. This means it needs to scale, it needs a good user experience, and normal people need to actually use it to transact. If it can’t reach that point by the end of the decade, I think it will have failed to live up to its promise. I am still cautiously optimistic. I think migration to proof-of-stake, lower transaction costs, more refined tools, and mature standards could lead to mainstreaming.

By the middle of the decade, augmented reality will be widely deployed, in the same way that smart watches are today. Glasses will be computing devices. Every big tech company has a glasses project at a relatively mature stage in the lab today. The need for the glasses to understand context could result in much smarter digital assistants than today’s Siri, Alexa, and so on.

Miscellaneous

I have an irrational love of vertical farming. The combination of LED lights, cheap electricity (for water pumps), direct-use geothermal heating, and smart machine learning algorithms that determine optimal nutrient distribution could yield better produce than conventionally farmed vegetables at competitive prices. By removing pesticides, optimizing varieties for nutrition and flavor instead of hardiness in the supply chain, and ensuring quick delivery to market, vertical farms could supply a healthier and more delicious future of food. Speaking of food, I predict plant-based “meat” will flop, but lab-grown real meat is worth keeping an eye on. Until then, eat humanely raised, grass-finished cows.

Construction tech is another area to watch. Whether it’s 3D-printed homes as imagined by Icon, or advanced manufactured housing as designed by Cover or Modal, there has to be a better way to build than our current stick-built paradigm. Housing costs have skyrocketed largely due to zoning rules, but construction technology is another lever by which we can increase housing productivity. This is another area where the barriers don’t seem to be primarily technological.


Collectively, these technologies add up to a lot of possibility. If we cure a bunch of diseases, slow down aspects of aging, realize cheap and emissions-free baseload energy, and deploy new modes of transportation and better construction technologies, we will almost certainly exceed 2 percent TFP growth. But we might not do these things.

It all depends on execution. The underlying science is there. The engineers are willing. Even the funding is available in most cases. But, as a society, how much urgency do we feel? Our culture does not prioritize progress—it fights, destructively, for status. And our politics reflects our culture.

I want to go faster.