Shorting Nvidia has become a widely discussed contrarian trade just as artificial intelligence spending by Amazon, Google and other hyperscalers has accelerated in recent years. The basic bet is that the stock has run too far ahead of fundamentals. Yet a close read of the hard filings and the scale of cloud capital plans suggests the risk can run the other way: if the AI buildout keeps compounding at anything like its recent pace, betting against a primary chip supplier starts to resemble standing on the tracks in front of a freight train.
The key is to separate hype from audited numbers. Amazon’s latest annual report shows how fast real dollars are flowing into infrastructure, while Nvidia’s own filing reveals how tightly its fortunes are tied to that spending. Those documents do not prove that Nvidia’s share price is always rational, but they do show why a large, coordinated short position is effectively a wager against some of the largest disclosed technology capital budgets on record.
What Amazon’s CapEx really signals
Amazon’s most recent Form 10-K offers a clear window into how aggressively the company is investing behind AI and cloud. In that filing, Amazon reports cash capital expenditures of $77.7 billion for 2024, a figure that covers data centers, servers, networking equipment, and other long-lived assets. The document states that these numbers come from audited financial statements, meaning outside accountants have reviewed how that spending is categorized and reported. While the filing does not carve out a separate line labeled “AI,” the sheer size of the 2024 budget devoted to infrastructure underscores how central compute capacity has become to Amazon’s strategy.
The 2024 10-K also notes that Amazon employed more than 1.5 million people at year-end and that the company operates in more than 20 countries, which helps frame how a $77.7 billion annual capital program fits into a global footprint. For investors considering a short position in Nvidia, that $77.7 billion figure matters less as a precise tally of GPU orders and more as a directional signal: Amazon Web Services is committing tens of billions of dollars per year to the physical grid that underpins AI training and inference, and any sharp pullback in that commitment would likely surface first in future filings rather than in market chatter.
Nvidia’s dependence on hyperscaler demand
On the other side of the trade, Nvidia’s own SEC filing shows how much its business already reflects that wave of spending. In its Form 10-K for the fiscal year ended January 28, 2024, Nvidia reports Data Center revenue of $47.5 billion (presented in the filing as $47,525 million). That figure comes from audited segment disclosures, broken out specifically in a table for the Data Center unit. It captures sales of GPUs and related products that power training clusters and inference services at cloud providers, enterprise customers, and AI companies during that fiscal year.
Because the Data Center segment is reported separately, investors can see that Nvidia is no longer a business dominated by gaming graphics cards. The $47.5 billion in Data Center segment revenue for fiscal 2024 ties directly to the same class of infrastructure that Amazon is funding with its $77.7 billion cash capital program in calendar 2024. As hyperscalers expand their compute footprints, Nvidia has more potential volume to sell into. That link does not guarantee smooth growth every quarter, but it does mean any thesis for shorting Nvidia has to assume that either this spending slows sharply from 2024 levels or that alternative suppliers take share faster than the current filings suggest.
Why the $700 billion narrative is both powerful and fragile
Market commentary frequently cites a headline figure that Amazon, Google and peers are on track to pour roughly $700 billion into AI infrastructure over several years. Based on the available primary sources, that aggregate number is not directly confirmed by a single filing or a government dataset; instead, it generally appears in analyst models and secondary reporting that extrapolate from current capital expenditure levels and management commentary. Because there is no audited document in the cited filings that spells out “$700 billion” as a firm, multi-year commitment, it should be treated as an external estimate rather than a hard forecast.
Even so, the audited data from Amazon and Nvidia helps explain why that narrative has taken hold. When one company is already spending $77.7 billion in one year on capital projects and a supplier is already booking $47.5 billion in Data Center revenue over a comparable period, analysts can reasonably model a multi-hundred-billion-dollar buildout across a group of large cloud and internet firms. The risk for short sellers is that while the precise $700 billion estimate is unverified in the filings, the direction of travel is clear in the historical data through 2024: hyperscalers are committing extraordinary sums to compute capacity, and Nvidia is a major beneficiary in the current cycle according to its segment disclosures.
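To see how analysts can get from disclosed annual capex to a multi-hundred-billion-dollar aggregate, a back-of-envelope compounding sketch helps. The number of firms, base spend per firm, and growth rate below are illustrative assumptions, not figures from any filing:

```python
# Back-of-envelope sketch (illustrative assumptions, not disclosed figures):
# take a hyperscaler group's combined annual capex, grow it each year,
# and sum the cumulative total across the modeled window.

def cumulative_capex(base_annual_bn: float, growth_rate: float, years: int) -> float:
    """Sum annual capex (in $B) over `years`, compounding at `growth_rate`."""
    total = 0.0
    spend = base_annual_bn
    for _ in range(years):
        total += spend
        spend *= 1 + growth_rate
    return total

# Assumed inputs: four hyperscalers averaging $55B/yr each, growing 15% annually.
print(round(cumulative_capex(4 * 55, 0.15, 3)))  # ~764 ($B over three years)
```

Under these hypothetical inputs, three years of spending already exceeds $700 billion, which is why secondary estimates in that range are easy to generate even though no single filing commits to them.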
The vendor moat and switching-cost problem
One of the most common counterarguments to Nvidia’s valuation is that cloud giants will diversify away from a single supplier. Amazon has its own custom chips, and other hyperscalers have long touted in-house accelerators. That diversification risk is real, but the filings hint at why it may be slower and messier than some bearish models assume. When Amazon reports $77.7 billion in cash capital expenditures in an audited 2024 10-K, it is capturing not just chips but the entire stack of related assets: power, cooling, networking, storage, and the software work needed to tie them together. Once a data center is tuned for a given architecture, shifting to a completely different supplier can mean rewriting code, retraining staff, and redesigning racks, all of which impose multi-year switching costs that are difficult to quantify precisely from the filings alone.
Nvidia’s Data Center revenue of $47.5 billion for fiscal 2024, disclosed in its audited segment table, reflects years of integration work with major customers. The company does not just sell chips; it sells a software ecosystem, libraries, and reference designs that are embedded into how AI teams build and deploy models. That is where a vendor moat can emerge. Even if a hyperscaler wants more bargaining power, replacing a supplier that already underpins tens of billions of dollars in deployed hardware and software is unlikely to be a quick decision. Shorting Nvidia is therefore not just a bet on lower unit prices; it is a bet that very large engineering organizations will unwind deep technical dependencies faster than their own capital and revenue disclosures suggest is practical.
Separating audited figures from estimates
Not every number attached to this trade carries the same weight. Within Nvidia’s fiscal 2024 Form 10-K, the company reports a gross margin of approximately 74 percent for the year and total revenue of about $60.9 billion, which implies that the Data Center segment accounted for roughly 78 percent of overall sales. By contrast, if an analyst model assumes that, say, 79 percent of Nvidia’s Data Center revenue is tied directly to large cloud providers, that figure is an external estimate rather than a number confirmed in the filing, and it should be labeled as such rather than presented as an audited fact.
Similarly, a research note claiming that 22 percent of Amazon’s 2024 cash capital expenditures went to AI-specific projects relies on an allocation the 10-K does not break out, so it cannot be verified from the primary source alone. The same caution applies to claims that 93 percent of Nvidia’s Data Center revenue growth over a given period came from hyperscaler demand, or that 51 percent of Amazon’s infrastructure spending is concentrated in a particular region. Any such figure should identify whether it comes from an audited table, from company commentary, or from third-party modeling, so that readers can distinguish between hard data and interpretive analysis.
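The segment-share arithmetic from the audited figures can be checked directly. The inputs below are the revenue amounts as presented in Nvidia’s fiscal 2024 segment table, in millions of dollars:

```python
# Segment-share check using figures from Nvidia's fiscal 2024 Form 10-K
# ($ millions, as presented in the filing's segment disclosures).
total_revenue = 60_922   # fiscal 2024 total revenue
data_center = 47_525     # fiscal 2024 Data Center segment revenue

share = data_center / total_revenue
print(f"Data Center share of revenue: {share:.1%}")  # ~78.0%
```

Ratios like this one can be reproduced from the audited tables; percentages that cannot be rebuilt this way from the filing are, by definition, external estimates.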
*This article was researched with the help of AI, with human editors creating the final content.

Grant Mercer covers market dynamics, business trends, and the economic forces driving growth across industries. His analysis connects macro movements with real-world implications for investors, entrepreneurs, and professionals. Through his work at The Daily Overview, Grant helps readers understand how markets function and where opportunities may emerge.

