Visualizing Big Tech Company Spending On AI Data Centers
This was originally posted on our Voronoi app. Download the app for free on iOS or Android and discover incredible data-driven charts from a variety of trusted sources.
By Kayla Zh
Big Tech’s AI Data Center Costs
Big tech companies are aggressively investing billions of dollars in AI data centers to meet the escalating demand for computational power and infrastructure necessary for advanced AI workloads.
This graphic visualizes the total AI capital expenditures and data center operating costs for Microsoft, Google, Meta, and Amazon from January to August 2024.
AI capital expenditures are one-time or infrequent investments in long-term AI assets and infrastructure.
Data center operating costs are ongoing expenses for running and maintaining AI data centers on a day-to-day basis.
The data comes from New Street Research via JP Morgan and is updated as of August 2024. Figures are in billions. Operating costs include cash operating expenses, software, depreciation, and electricity.
Training AI Models Is Eating Up Costs
Below, we show the total AI capital expenditures and data center operating costs for Microsoft, Google, Meta, and Amazon.
Microsoft currently leads the pack in total AI data center costs, spending a total of $46 billion on capital expenditures and operating costs as of August 2024.
Microsoft also currently has the highest number of data centers at 300, followed by Amazon at about 215. However, variations in size and capacity mean the number of facilities doesn’t always reflect total computing power.
In September, Microsoft and BlackRock unveiled a $100 billion plan through the Global Artificial Intelligence Infrastructure Investment Partnership (GAIIP) to develop AI-focused data centers and the energy infrastructure to support them.
Notably, both Google and Amazon currently spend more than twice as much training their models as they do running them for their end-use customers (inference).
Training a major AI model is increasingly expensive, as it requires large datasets, complex calculations, and substantial computational resources, often involving powerful GPUs and significant energy consumption.
However, as the frequency and scale of AI model deployments continue to grow, the cumulative cost of inference is likely to surpass these initial training costs, which is already the case for OpenAI’s ChatGPT.
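To see why cumulative inference spend eventually overtakes a one-time training bill, consider a minimal back-of-the-envelope sketch. All figures below are illustrative assumptions, not reported numbers for any company:

```python
# Hypothetical break-even calculation: cumulative inference cost vs. a
# one-time training cost. Every figure here is an assumption for illustration.
TRAINING_COST = 100_000_000    # one-time training spend in USD (assumed)
COST_PER_QUERY = 0.002         # inference cost per query in USD (assumed)
QUERIES_PER_DAY = 100_000_000  # daily query volume (assumed)

# Daily inference spend scales linearly with query volume.
daily_inference_cost = COST_PER_QUERY * QUERIES_PER_DAY

# Days until cumulative inference spend matches the training cost.
days_to_break_even = TRAINING_COST / daily_inference_cost

print(f"Daily inference spend: ${daily_inference_cost:,.0f}")
print(f"Inference overtakes training cost after {days_to_break_even:.0f} days")
```

Under these assumed numbers, inference spend matches the training cost in well under two years; at higher query volumes the crossover comes even sooner, which is the dynamic the article describes for ChatGPT.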