Third Quarter Equity Outlook: AI Continues to Boom

Posted October 21, 2025 at 11:15 am

John Osterweis, Greg Hermanski
Osterweis Capital Management

What Market Selloff? Thank you, AI (and the Tariff Retreat)

As the market marches to new highs, just two months after a violent 20% selloff in equities, we ask ourselves how the narrative shifted so quickly. Our simple answer is that economic fundamentals remain strong: corporate profits are rising, unemployment is low, and inflation is still under control for the moment. When the Trump administration threatened punishing tariffs that we believe risked an almost certain recession, driven in part by the ensuing inflation, the White House quickly retreated, leading to a snapback in equity prices. However, this still raises the question of why the economy continues to grow and hence why the stock market continues to move higher.

We believe the ever-expanding technology sector driven by the AI boom is playing a critical role in driving continued economic growth. We examine the role of AI in this outlook and conclude that direct participants and the broader economy are beginning to see a payoff, justifying the significant AI investments observed across sectors.

The AI Rollercoaster: Increasing Payoff Despite the Volatility

There have been several key milestones in the AI revolution over the past few years, most notably the public launch of ChatGPT in late 2022. But arguably the most critical recent milestone was the January 2025 release of DeepSeek’s reasoning Large Language Model (LLM) R1. At the time, markets were roiled by the revolutionary Chinese LLM, which promised nearly the same performance as incumbent models at a small fraction of the cost. The question on everyone’s mind was whether DeepSeek would advance or impair the ability of technology companies to monetize AI and whether their enormous investments would ever see a payoff. Our analysis leads us to conclude that the pivotal DeepSeek moment helped unleash and accelerate the next wave of AI innovation while actually improving the return on investment (ROI) on the enormous budgets being spent on LLMs.

Evolution of AI Scaling Laws

DeepSeek’s release demonstrated that highly capable AI models can be developed at a fraction of the previously estimated cost. With a reported spend of only $5.6 million for its final training run (roughly 95% cheaper than OpenAI’s o1 model), DeepSeek challenged the long-held belief that only tech giants with immense capital budgets could compete at the forefront of AI innovation. This revelation initially sparked considerable market volatility, particularly affecting semiconductor vendors and hyperscalers racing to build the next state-of-the-art model, as investors questioned the ROI and long-term sustainability of the massive capital expenditures directed towards building AI data centers in the U.S.

However, the industry swiftly reacted and adapted to the DeepSeek release, with the focus shifting towards a more sophisticated understanding of AI development. While pre-training of LLMs remains crucial, DeepSeek’s success highlighted the increasing importance of post-training and test-time scaling (see graphic below), which involves a series of steps deployed after feeding models with raw data to improve relevancy and specificity. Models are now able to break down complex tasks into smaller steps, explain each step, check for possible answers, and review their own rationales before coming up with a response. This has led to significant performance improvements and a focus on optimizing models for efficient and powerful reasoning capabilities that resemble how humans “think.”

Phases of AI Revolution and Scaling Laws
Source: Nvidia, Barclays Research.

In fact, the vast majority of state-of-the-art LLMs are now reasoning models that can think and reason like humans when tackling complex multi-step problems and tasks. Western AI labs have quickly caught up, matching and even surpassing some of DeepSeek’s initial benchmarks. Alphabet, for instance, has released two state-of-the-art reasoning models this year. OpenAI followed suit, releasing its newest mini models in mid-April 2025, simultaneously slashing costs by over 80% and further democratizing access to advanced AI capabilities.

Exponential Cost Reductions in Inferencing and Jevons Paradox

Jevons Paradox is the idea that when something becomes more efficient and cheaper to use, people end up using it even more. This principle has historically underpinned innovations across the tech industry — from semiconductor chips to the internet to ecommerce to cloud migration. So far, the current AI revolution is proving to be no exception, and DeepSeek has played a critical role. In fact, since the introduction of GPT-4 in March 2023, the costs of running AI queries have fallen by over 99%! It is now much cheaper and easier for consumers and businesses to use AI tools to analyze information, make predictions, and generate new content.

As a result of lower costs, there has been an explosion in how much information these LLMs are processing and how frequently they are being deployed. Nvidia’s CEO Jensen Huang recently emphasized this message. Models now routinely process over 100 trillion tokens (units of data processed by AI models), and a reasoning model can consume 20x more tokens and 150x more compute cycles compared to a non-reasoning model. This is evident in the dramatic increase in tokens processed by major tech companies like Alphabet, which recently reported processing a 50x year-over-year increase in tokens.

At the same time, capital expenditures for hyperscalers show no signs of plateauing. Since the “DeepSeek moment,” major players have announced further increases in CapEx, most notably the proposed $500 billion Stargate project from OpenAI and SoftBank, alongside increased CapEx commitments from Alphabet, Meta, Oracle, and others. In fact, the Big Five hyperscalers (Microsoft, Alphabet, Meta, Amazon, and Oracle) are expected to increase CapEx by 42% to over $350 billion in 2025, which in turn should translate to over 50% revenue growth for Nvidia in 2025, according to consensus estimates. Not only are hyperscalers and frontier AI labs still engaged in an arms race to achieve the ultimate goal of AGI (Artificial General Intelligence), they are also increasingly focused on serving the next wave of AI demand from the explosion of inferencing workloads.

Critically, we are now less concerned about the sheer magnitude of these CapEx investments, as we have increasing evidence of ROIs that are translating to accelerated revenue growth and improving profitability for hyperscalers:

  • Microsoft reported a 33% year-over-year growth rate in Azure revenue in the most recent quarter, mostly driven by AI deployments.
  • Alphabet’s Google Cloud segment has also been growing at approximately 30% year-over-year, and operating margins have dramatically improved from 9% to 18% in the past year, demonstrating increasing economies of scale.
  • Amazon, the largest and most established cloud provider, reported revenue growth of 17% year-over-year in its most recent quarter, while generating record operating margins of 40% in the quarter.
  • Oracle recently guided to cloud revenue growth of over 40% year-over-year in fiscal year 2026 and over 100% growth in their Remaining Performance Obligations (an indication of future revenue), which now stand at $270 billion.

Meanwhile, hyperscalers have continued to signal capacity constraints and demand outstripping supply, giving us confidence in the sustainability of growth and profitability well into the future.

Rapid Adoption of Agents and Increasing Productivity

With the exponential decline in inferencing costs, we have seen the rapid launch of agentic products and fast adoption of agents in the software space. Some prominent figures in the tech industry have declared 2025 as “the year of Agents,” whereby a wide array of AI agents can be seamlessly integrated into our daily personal and professional lives to improve our productivity and relieve us of tedious tasks.

Companies like Salesforce have demonstrated rapid iteration in product development. Agentforce 1.0 was released in September 2024, and the newest Agentforce 3.0 was just released in June 2025. Agentforce has won 8,000 deals to date, the fastest product ramp in Salesforce’s history. In May of this year, Alphabet rolled out its plan for a wide array of AI agents that can autonomously visit web pages, summarize content, or complete tasks like shopping, booking tickets, and making reservations.

As agents become an increasingly significant part of the workflow, higher productivity is becoming evident across industries. For example, Amazon has leveraged Amazon Q’s code transformation capabilities to dramatically improve the efficiency of Java upgrades, resulting in significant savings of $260 million and a staggering 4,500 developer-years. Amazon CEO Andy Jassy has even suggested that AI could lead to a decline in the number of Amazon employees over time, underscoring the transformative potential of AI on workforce dynamics and operational efficiency.

Not All Sunshine and Roses – AI’s Disruptive Power

We are well aware of the disruption that AI could potentially bring to existing business models. For example, software’s traditional model of per-seat, per-month subscriptions may take a hit with fewer people in the workforce. Hence, software companies are quickly adopting new revenue models, such as charging on a consumption basis or offering AI credit packs, to offset potential headwinds from seat compression. Concerns about Google’s search dominance were recently highlighted in Apple executive Eddy Cue’s testimony during Google’s antitrust trial, in which he stated that Google searches on Apple’s Safari browser had declined for the first time in 20 years. These are risks we monitor carefully.

Portfolio Positioning

We are diligently monitoring the rapidly evolving AI landscape to identify and invest in potential winners. We believe our portfolio is well positioned for secular growth in AI, with positions in what we consider to be the most valuable and defensible parts of the AI stack. This includes:

  • Electronic Design Automation Companies: Synopsys (SNPS), a critical player for chip design and development.
  • Semiconductor Vendors: Broadcom (AVGO) and Nvidia (NVDA), leaders in custom AI chips and GPUs.
  • Big Tech and Hyperscalers: Microsoft (MSFT), Alphabet (GOOG) and Amazon (AMZN), companies that lead in AI innovation and provide critical cloud infrastructure for customers.
  • Software Companies: Intuit (INTU) and Salesforce (CRM), companies that we think are going to emerge as AI winners, due to incumbency advantages and robust product roadmaps.

Despite rising geopolitical tensions and tariff-related macro concerns, we believe these companies are relatively insulated from these headwinds. Hence, we have treated bouts of market weakness as opportunities to tactically adjust and add to our positions, which reflect our long-term conviction in the transformative power of AI and our commitment to investing in quality growth companies.

Our quality growth framework remains consistent, regardless of the environment and shifts in the technology landscape. We continue to focus on businesses with the following three attributes: 1) durable competitive advantages that insulate against competition, 2) significant runway for growth via reinvestment, and 3) excellent governance. We seek to own a concentrated portfolio of these types of businesses, investing when valuations become attractive due to temporary headwinds. Ideally, we own these businesses for the long term, enjoying the twin benefits of significant annual profit growth and multiple expansion to drive attractive returns.

We would like to acknowledge our team’s Senior Analyst Jasmine Shen for her significant contribution to the discussion on AI in this outlook. And, as always, we thank you for your continued confidence in our management.

Disclosure: Interactive Brokers Third Party

Information posted on IBKR Campus that is provided by third-parties does NOT constitute a recommendation that you should contract for the services of that third party. Third-party participants who contribute to IBKR Campus are independent of Interactive Brokers and Interactive Brokers does not make any representations or warranties concerning the services offered, their past or future performance, or the accuracy of the information provided by the third party. Past performance is no guarantee of future results.

This material is from Osterweis Capital Management and is being posted with its permission. The views expressed in this material are solely those of the author and/or Osterweis Capital Management and Interactive Brokers is not endorsing or recommending any investment or trading discussed in the material. This material is not and should not be construed as an offer to buy or sell any security. It should not be construed as research or investment advice or a recommendation to buy, sell or hold any security or commodity. This material does not and is not intended to take into account the particular financial conditions, investment objectives or requirements of individual customers. Before acting on this material, you should consider whether it is suitable for your particular circumstances and, as necessary, seek professional advice.
