New Capital Management


DeepSeek or Swim: Has the AI Thesis Fundamentally Changed?

Amid AI’s rapid evolution, DeepSeek’s emergence has disrupted markets: the Chinese startup developed a high-performing model at a fraction of the cost spent by major tech firms. While this challenges assumptions about AI development, it doesn’t fundamentally alter the investment case for U.S. leadership. Investors should stay diversified and consider broader AI beneficiaries.


A Chinese startup has disrupted the AI landscape and sent shockwaves through markets. DeepSeek’s newly released large language model (LLM) challenges the dominance of the tech giants, boasting similar performance despite using less sophisticated chips and operating at a fraction of the cost. The news triggered a significant selloff in U.S. tech on January 27th, with the NASDAQ falling 3% and the S&P 500 falling 1.5%.

This breakthrough has implications for the AI revolution, mega cap tech companies and portfolio construction:

  • Reevaluating AI competitive moats and capex: DeepSeek’s model calls into question the competitive moats of U.S. tech firms and their multi-billion-dollar capex plans. DeepSeek was reportedly trained for just $6 million using older-generation Nvidia chips, compared with an estimated $60 million to train GPT-4 and a projected $1 billion for the next generation of LLMs. The model’s efficiency is perhaps even more disruptive: early benchmarks suggest that DeepSeek operates at just 3-5% of the computing cost of its competitors, making it over 20 times more efficient. This could challenge Big Tech’s pricing power and revenue models as the technology becomes commoditized. Still, multi-year AI buildout plans are unlikely to turn on a dime, and we expect these intentions to be reiterated in earnings calls. 

  • Balancing quality fundamentals with valuations and concentration: The Magnificent 7’s (“Mag 7”) valuations, at roughly 30x forward earnings, face increased scrutiny and potential compression. In addition, multi-decade-high index concentration has left many investors heavily overweight U.S. tech, exposing them to a repricing of expectations. Still, leading tech firms boast a robust earnings growth outlook: analysts expect Mag 7 profits to grow 23% y/y this quarter, compared with 10% for the rest of the index. Combined with robust infrastructure footholds and advanced R&D, these companies remain at the forefront of AI development and deployment. 

  • Capitalizing on broader AI beneficiaries: We expect continued strong demand for computing infrastructure (data centers, power) and for providers of AI applications (software companies, small caps, startups). Firms leveraging off-the-shelf LLMs could benefit from wider corporate margins and accelerated productivity gains, providing a tailwind for the sleeve of “AI adopters” largely left behind in the Mag 7-led rally.

  • Diversifying portfolios: Notably, Monday’s selloff was highly concentrated in AI-linked names, with Nvidia losing a spectacular $589 billion in market cap and other AI-related companies, such as semiconductor firms, data centers and power-linked companies, coming under pressure. However, 70% of S&P 500 stocks still finished in the green, and the equal-weight S&P 500 outpaced the cap-weighted index by its widest margin since July 2024. The unwinding of these imbalances could have further room to run, supporting market breadth this year and underscoring the need for diversification. 

While we don’t think this news fundamentally changes the case for U.S. exceptionalism or the sustainability of the AI capex underway, it does underscore the uncertainty around the competitive moats of U.S. tech firms, the potential for disruption and the vulnerability of highly concentrated portfolios. The landscape will continue to shift, but investors can still look forward to AI-driven boosts to growth, productivity and earnings while actively diversifying their exposures and leaning into managers that can capitalize on shake-ups in market leadership. 


End-user Pricing of Leading LLMs

Source: J.P. Morgan Asset Management, Google, OpenAI, Anthropic, DeepSeek. LLMs typically price their API access based on the number of "tokens" processed. A token roughly represents a word or part of a word.
Data are as of January 27, 2025
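
To make the token-based pricing model concrete, here is a minimal sketch of how an end user's API bill is computed. The per-million-token rates below are hypothetical placeholders for illustration, not the figures from the chart above; actual rates vary by provider and model.

```python
def api_cost_usd(input_tokens: int, output_tokens: int,
                 input_price_per_m: float, output_price_per_m: float) -> float:
    """Estimate the cost of one API call from per-million-token prices.

    LLM providers typically bill input ("prompt") and output ("completion")
    tokens at different rates; a token is roughly a word or part of a word.
    """
    return (input_tokens / 1_000_000 * input_price_per_m
            + output_tokens / 1_000_000 * output_price_per_m)

# Hypothetical rates: $2.50 per 1M input tokens, $10.00 per 1M output tokens.
cost = api_cost_usd(1_000_000, 500_000, 2.50, 10.00)
print(f"${cost:.2f}")  # $7.50
```

Under this billing model, a provider charging even a few percent of a competitor's per-token rate translates directly into a proportional cost reduction for end users, which is why the efficiency claims above matter for Big Tech's pricing power.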




