Nvidia's H100 Microchips Projected to Consume Over 13,000 GWh in 2024, Surpassing the Annual Energy Consumption of Entire Nations Like Georgia and Costa Rica
Updated Feb 21, 2024
Nvidia’s H100 AI GPUs are taking the tech world by storm, but their reign comes at the price of a hefty energy bill. According to Stocklytics.com, these power-hungry processors are projected to consume a staggering 13,797 GWh in 2024, exceeding the annual energy consumption of nations like Georgia and Costa Rica.
These findings bring up concerns about the environmental impact and sustainability of this AI advancement.
Stocklytics financial analyst Edith Reads commented on the analysis:
AI, which often requires running computations on gigabytes of data, needs enormous computing power compared with ordinary workloads. And Nvidia’s cutting-edge H100 AI GPUs are leading the way, with energy consumption of over 13,000 GWh this year. Each H100 GPU, running at 61% annual utilization, consumes roughly 3,740 kilowatt-hours (kWh) of electricity annually, roughly the yearly electricity use of an average American household. While this figure might seem alarming, GPU efficiency may improve in the near future, offering a potential path toward more sustainable computing.
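The quoted figures can be sanity-checked with back-of-envelope arithmetic. The sketch below assumes an H100's thermal design power of roughly 700 W (a published spec for the SXM variant, not stated in the article); the 61% utilization and the 13,797 GWh total come from the article itself.

```python
# Back-of-envelope check of the figures quoted above.
# Assumption not in the article: H100 SXM TDP of ~700 W (0.7 kW).
TDP_KW = 0.7
HOURS_PER_YEAR = 8_760
UTILIZATION = 0.61  # annual utilization quoted in the article

# Per-GPU annual consumption: power x hours x utilization.
per_gpu_kwh = TDP_KW * HOURS_PER_YEAR * UTILIZATION
print(f"Per-GPU annual consumption: {per_gpu_kwh:,.0f} kWh")  # ~3,740 kWh

# Fleet size implied by the 13,797 GWh projection (1 GWh = 1,000,000 kWh).
total_gwh = 13_797
implied_gpus = total_gwh * 1_000_000 / per_gpu_kwh
print(f"Implied number of H100s: {implied_gpus:,.0f}")  # roughly 3.7 million
```

At these assumptions the per-GPU figure lands almost exactly on the article's 3,740 kWh, and the projected total implies a fleet on the order of 3.7 million GPUs.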
Nvidia Ventures into a $30 Billion Tailored Chip Market
Nvidia, a leading player in AI chip design, is broadening its scope by venturing into custom chip development for cloud computing and AI applications. The firm is looking to tap into the growing custom chip sector, projected to reach $10 billion this year and to double by 2025. The broader custom chip market reached roughly $30 billion in 2023, accounting for about 5% of annual chip sales.
Based in Santa Clara, California, Nvidia targets the changing needs of tech giants such as OpenAI, Microsoft, Alphabet, and Meta Platforms. The company is establishing a division focused on developing custom chips, including powerful artificial intelligence (AI) processors, for cloud computing firms and other enterprises.
Nvidia currently holds about 80% of the high-end AI chip market, a position that has driven its stock market value up 40% so far this year to $1.7 trillion, following a more than threefold increase in 2023.
Energy Challenge of Powering AI Chips
As Nvidia’s aspirations grow higher, concerns are emerging regarding the impact of the escalating energy requirements linked to its cutting-edge chip technologies.
According to Paul Churnock, Microsoft’s Principal Electrical Engineer of Datacenter Technical Governance and Strategy, the millions of Nvidia H100 GPUs installed by the end of 2024 will consume more energy than all the households in Phoenix, Arizona.
Successfully navigating these energy challenges while fostering innovation will shape the future landscape of AI computing and beyond. Amazon’s recent unveiling of its Arm-based Graviton4 and Trainium2 chips holds promise for efficiency gains.