Last month, the Chinese AI startup DeepSeek shocked the financial markets when it revealed that – for the relatively measly sum of $5 million – its AI capabilities equaled or surpassed those of giant U.S. companies that had plowed $1 trillion into developing their AI models. Shares of Nvidia dropped 17% in a single day of trading.
Since then, investors seem to have largely shrugged off any concerns. But that’s a big mistake – especially given that China followed the DeepSeek release with other, even better AI models. These include Alibaba’s Wanda and, more recently still, Kimi 1.5, which easily tops both DeepSeek and Wanda. Both DeepSeek and Kimi 1.5 can be downloaded for free.
China’s superior and more sustainable AI capabilities have blown a hole in any notion of present-day American exceptionalism. And they signal sharply rising risks to both the economy and the financial markets – making a prolonged market bloodbath look ever more probable.
The U.S. had been presumed to hold a big lead in AI. That lead was widely viewed as the apotheosis of our technologies, a triumphant demonstration of our ability to continually create world-beating tech products. Investors, buying into that narrative, pushed AI-related stocks to wild heights. Between early 2023 and early 2025, of the $17 trillion the S&P 500 tacked on in gains – the greatest two-year gain ever – more than 75%, or $13 trillion, came from the tech giants viewed as AI leaders. They were propelled by investors’ expectation that continuing advances in U.S. AI would turbocharge their growth indefinitely. Now the hollowness of that assumption is being exposed, as DeepSeek and the even better products that followed have put the entire American model to shame.
And given the outsized market weighting of these tech giants, if they falter, they will drag the market down with them in what could be an unprecedented decline.
It’s not just that China’s AI models are superior while costing a fraction of what U.S. companies spend on theirs. It’s also that they are based on an entirely different ethos. The U.S. model rests on the quest by giant U.S. tech companies to keep profits growing by selling ever more AI services to ever more customers. To do this – operating under the belief that money can buy anything – they are spending massive sums on huge data centers that provide ever more cloud capacity. They are also pushing the idea that their AI is a pathway to artificial general intelligence (AGI) that will yield massive rewards.
Both these pursuits will hit roadblocks. Insufficient resources, as we discuss below, will ultimately short-circuit expansion of data centers. And as I’ve discussed in prior blogs and in my previous investment letter The Complete Investor, AGI is a nonstarter, because it will never substitute for human creativity. The large language models that the companies are pouring resources into developing are a joke when it comes to matching human intuitive and creative leaps.
But AI still can be transformative when used for what it’s good at – as an ultra-high-powered assistant to humans, performing tasks that don’t require creativity, thereby freeing up humans to concentrate on creative solutions to real-world problems.
Resource scarcity, for example, is one of the biggest problems facing the world today. AI can help in many different ways – for example, by acquiring and indexing information and data about the earth far faster and more comprehensively than humans can. But making sense of this information to find solutions to resource scarcity will require creative leaps. Success would benefit the world enormously, resulting in far greater material gains than anything obtained from simply marketing AI for profit. The U.S. model not only fails to offer a way forward on resource scarcity; it makes the problem worse by consuming more resources.
By contrast, China’s AI models, by being shared with anyone who wants to download them, encourage using AI in ways that provide real-world benefits rather than enriching a few billionaire oligarchs (who, ironically, will end up major losers as the U.S. model they are promoting collapses). China’s ability to create superior AI for so much less money, and its willingness to share its models for free, should scare investors because it shows that China, unlike the U.S., has a sustainable, scalable pathway for developing AI.
Our Inadequate Electric Grid
As we said above, if the U.S. model isn’t sustainable and U.S. giants can’t keep profits flowing, the risks to the overall market are incalculable. One reason to think they’ll run into problems is our inadequate electric grid. It’s worth understanding just how big an obstacle this is.
The expanding AI data centers require immense and ever-rising amounts of reliable electricity. In the U.S., the electric grid – parts of which were built a century ago – is nowhere near up to the task. Keep in mind that supplying electricity requires two things. The first is generating it, from whatever energy source. The second is delivering it, which means connecting the generation source to the electric grid, which then transmits electricity to end users.
That means having an electric grid big enough, reliable enough, and interconnected enough to handle all the energy needed to meet the demand for electricity. Ours doesn’t cut it.
In May 2022, Reuters reported: “The decrepit power infrastructure of the world’s largest economy is among the biggest obstacles to expanding clean energy and combating climate change on the ambitious schedule laid out by U.S. President Joe Biden. His administration promises to eliminate or offset carbon emissions from the power sector by 2035 and from the entire U.S. economy by 2050. Such rapid clean-energy growth would pressure the nation’s grid in two ways: Widespread EV adoption will spark a huge surge in power demand; and increasing dependence on renewable power creates reliability problems on days with less sun or wind.”
And that was six months before AI startup OpenAI introduced ChatGPT, the chatbot that made large language models a worldwide phenomenon – and the basis of McKinsey’s prediction, two years later, that between 2024 and 2030, demand for U.S. electricity would soar by nearly 25%, driven by an explosion in data center demand.
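To put that 25% figure in perspective, here is a back-of-the-envelope annualization. The cumulative 25% is the McKinsey projection cited above; the conversion to a yearly rate is purely my own illustration:

```python
# Illustrative annualization of the projected ~25% rise in U.S.
# electricity demand between 2024 and 2030 (six years of growth).
total_growth = 0.25   # cumulative growth over the period (McKinsey projection)
years = 6             # 2024 -> 2030

# Compound annual growth rate implied by the cumulative figure
cagr = (1 + total_growth) ** (1 / years) - 1
print(f"Implied annual demand growth: {cagr:.1%}")  # roughly 3.8% per year
```

Against U.S. electricity demand that has been roughly flat for the past two decades, a sustained annual increase of that size is a step change.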
To meet this demand, our “decrepit grid” would have to expand capacity accordingly. Can it do so? That’s virtually impossible, given that there has been just de minimis expansion of the grid in more than a generation.
Recognition of the growing demand for electricity has set off a mad scramble to build additional sources of electricity generation and ensure it can be linked to the grid. But there seems no chance that major utilities will be able to keep up with demand. Even before AI entered the picture, McKinsey reported that utilities had waiting times of up to two years before being able to handle additional electricity.
Advocates of renewable energy like solar and wind would no doubt be thrilled to learn that some 2,600 gigawatts of solar and wind capacity – twice the capacity of today’s grid – could be available within the next few years. But then they’d learn that most of it will go unused because of problems connecting it to the grid. Wind and solar are intermittent energy sources that are unevenly distributed geographically. Using them requires some combination of long-distance transmission and storage batteries.
Long-distance transmission requires ultra-high-voltage (UHV) lines capable of carrying at least 800,000 volts. How many miles of UHV lines do we have? None. By contrast, China has about 50,000 miles of UHV lines and adds thousands of miles every year – another reason its AI model is scalable. Consider it a disheartening example of our short-sighted, short-term approach versus China’s long-term focus.
As for energy storage, lithium-ion batteries are generally considered the lowest-cost option. But recent estimates from various sources, including the Department of Energy, suggest storage still won’t be a solution on its own. Comparing current estimates of the cost of upgrading the grid against the cost of adding comparable capacity via storage, it turns out that – per unit of electricity delivered – upgrading is more than 90% less expensive. Adding storage capacity will be necessary but is hardly a stand-alone answer.
How much will it cost to upgrade the grid to accommodate the burgeoning growth in data centers and other growing users of electricity? There are no estimates recent enough to take into account the surge in AI-driven growth in data centers. But it’s safe to say it will be multiples of pre-2023 estimates, which were around $2 trillion.
Longer term, a major stumbling block could be copper. The following quote is from the chief operating officer of Rio Tinto, the world’s third-largest copper producer: “Think of increasing demand for things like electric vehicles, the copper plumbing in our houses … transmission lines, smartphones, electronic devices – the things you can’t live without all include copper… The U.S. is estimated to consume nearly 2 million metric tons of copper per year but only produces slightly over 1 million metric tons. By 2035, that demand is estimated to be around 4 to 5 million tons of copper a year. The question is, where are we going to get all that additional copper?”
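The arithmetic behind that question is stark. Using only the figures in the quote (and, as my own simplifying assumptions, taking the midpoint of the 4–5 million-ton range and holding domestic production flat), a rough sketch of the U.S. supply gap:

```python
# Rough U.S. copper supply-gap arithmetic from the quoted figures.
# All values are in million metric tons per year.
demand_today = 2.0       # "nearly 2 million metric tons"
production_today = 1.1   # "slightly over 1 million metric tons"
demand_2035 = 4.5        # midpoint of the quoted 4-5 million-ton range

gap_today = demand_today - production_today
# Assumes, purely for illustration, that domestic production stays flat.
gap_2035 = demand_2035 - production_today

print(f"Import gap today:   {gap_today:.1f} million t/yr")
print(f"Import gap by 2035: {gap_2035:.1f} million t/yr")
```

Even with flat demand elsewhere in the economy, the implied import gap roughly quadruples by 2035.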
It’s worth mentioning that some of the giant tech companies, including Microsoft, Amazon, Meta, and Alphabet, are seeking to bypass the grid by developing their own dedicated sources of electricity. Will this work? Since a data center requires electricity 24/7, that rules out wind and solar as stand-alone dedicated sources. Natural gas is limited by volatility in supplies and prices, and there are indications that natural gas production is leveling off.
But one source that has created something of mini mania is nuclear energy in the form of small modular reactors (SMRs). These are small enough to be used by a single data center, and they’re scalable in that more modules can be added if a company wanted to cluster data centers. Moreover, they avoid many of the dangers of large nuclear reactors. But they won’t be an important source of electricity for probably at least a decade. As of this writing there are no SMRs in operation. The first SMR, which would provide only nominal electricity-generating capacity, is scheduled to come online in about two years, but even that time frame is not guaranteed.