When the whole world is celebrating AI's computing power, NVIDIA founder Jensen Huang has pointed to the next battlefield: electricity. This is not merely a choice of energy source; it is the first pin to burst the bubble of renewable-energy idealism, revealing the ultimate battle behind AI development, one involving tech giants, national strategies, and energy realities. (Background: The blue-white coalition proposed amending the law to extend the nuclear power deadline by 20 years, fighting to end the non-nuclear homeland policy.) (Background supplement: Meta signed a 20-year nuclear power agreement with U.S. Constellation Energy to back an entire nuclear reactor supporting AI computing power.)

This article is contributed by a writer with a bachelor's degree from National Tsing Hua University's Institute of Nuclear Science and Engineering and nearly ten years of media experience in technology and blockchain. You may not agree with nuclear energy, but you cannot have both an AI kingdom and a non-nuclear homeland.

When NVIDIA founder Jensen Huang made a surprise visit to Taiwan this morning (the 22nd) and spoke about how AI will change the world, he was like a magician pulling ever more powerful GPUs from his hat. Recently, however, his focus has quietly shifted from chips to a more fundamental and controversial topic: electricity. He said the future of AI is closely tied to energy, asking: "Why should we stigmatize energy...? Nuclear energy is an outstanding energy choice... I really hope the local government can address our needs and solve all the necessary problems, so we can set up our Asian headquarters here."

This statement sounds abrupt, even politically incorrect. In an era that champions ESG and embraces renewable energy, why would a global AI leader bet its future on an energy source fraught with historical burdens and public concerns?
What Jensen Huang sees is not the current stock price or next quarter's financial report, but the 'ultimate battle' of this AI revolution. The unspoken implication: the idealistic worship of renewable energy we have held for the past twenty years is about to be ruthlessly punctured by this 'electric tiger' called AI. This is not just a technical option; it is a paradigm shift that will reshape industrial structure, capital flows, and even geopolitics.

AI's Energy Hunger: An Endless Feast of Computing Power

To understand why nuclear energy has become an inevitable choice for AI, we must first grasp how terrifying AI's 'appetite' is, and how picky its 'diet' is. Traditional data centers have peak and off-peak periods, like an office building that is bustling by day and quiet at night. AI is different: the training of large language models in particular is a 24/7, never-ending marathon.

A single NVIDIA H100 GPU has a maximum power draw of up to 700 watts; one server can hold up to 8 such GPUs, and a server rack holds 4 such AI servers. A rack takes up roughly the footprint of a thick bookshelf, yet a single rack can draw up to 150 kW, consuming roughly 3,600 kWh per day. A data center may hold thousands of such racks: 5,000 racks would consume about 18 million kWh in a single day, and that may be just one computing cluster of one enterprise.

Operation also generates waste heat, which requires additional electricity to remove. According to the Science Channel, the most power-hungry part of an AI data center is actually the cooling: it consumes more energy than the AI servers (the GPU racks) themselves. Adding the cooling load, a single day of operation for a 5,000-rack AI center would consume about 40 million kWh.
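The consumption chain above can be sketched as a back-of-the-envelope calculation. All figures (700 W per H100, 150 kW per rack, 5,000 racks, the ~40 million kWh total) are the article's own illustrative numbers, not measured data:

```python
# Back-of-the-envelope AI data-center power arithmetic,
# using the article's illustrative figures.

GPU_WATTS = 700          # max draw of one NVIDIA H100 (W)
GPUS_PER_SERVER = 8      # GPUs per AI server
SERVERS_PER_RACK = 4     # AI servers per rack

# GPU draw alone, per rack (kW)
gpu_kw_per_rack = GPU_WATTS * GPUS_PER_SERVER * SERVERS_PER_RACK / 1000
print(gpu_kw_per_rack)   # 22.4 kW just for the GPUs

# The article cites ~150 kW per rack once CPUs, memory,
# networking, and power-conversion losses are included.
RACK_KW = 150
kwh_per_rack_per_day = RACK_KW * 24              # ~3,600 kWh per rack per day

RACKS = 5000
it_load_kwh_day = kwh_per_rack_per_day * RACKS   # 18,000,000 kWh/day for IT load

# Cooling can exceed the IT load itself; the article's total
# for a 5,000-rack site is ~40 million kWh/day.
TOTAL_KWH_DAY = 40_000_000
cooling_kwh_day = TOTAL_KWH_DAY - it_load_kwh_day
print(it_load_kwh_day, cooling_kwh_day)          # prints 18000000 22000000
```

Note that even these rough numbers show the cooling budget (22 million kWh/day here) exceeding the GPU racks' own consumption, matching the article's claim.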
When thousands of such GPUs form a computing cluster running day and night, the power demand is a stable, massive 'flat curve': it does not distinguish day from night and does not ease off in the evenings, early mornings, or on weekends. It is a giant beast that, once it starts devouring power, never stops. According to forecasts, by 2030 the global power demand of data centers will reach nearly 945 TWh annually, equivalent to the entire electricity consumption of Japan, with AI's share quadrupling over the same period.

This 'constant, high-power' demand directly excludes today's mainstream renewables, solar and wind, as primary options. This is not a denigration of renewable energy but a harsh reality: the sun does not shine at night, and the wind does not always blow. They are by nature 'intermittent.' To make them supply stable power 24/7, they must be paired with exorbitantly priced energy storage facilities, such as large-scale battery arrays. This not only raises costs dramatically but also wastes energy in conversion: power stored during the day, for example, may reach the AI load overnight at only around a 30% effective conversion rate. An AI operator must then factor these conversion losses and storage deployments into its planning, which adds enormous uncertainty and cost.

AI needs 'baseload power': energy that provides stable output all year round. Among all low-carbon or zero-carbon options, nuclear energy is the only one that fits this role perfectly. Nuclear power plants in the United States have a capacity factor (the ratio of actual output to maximum rated output) of over 92%, meaning they run at near-maximum output almost year-round. This stability is precisely what AI, this picky gourmand, demands most.
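The baseload argument comes down to capacity factors. A minimal sketch of how much energy each source actually delivers per megawatt of nameplate capacity, against a flat 24/7 load: the ~92% nuclear figure is the article's U.S. number, while the solar capacity factor and the ~30% effective storage rate are illustrative assumptions:

```python
# Energy actually delivered per MW of nameplate capacity over a year.
# Capacity factors: nuclear ~0.92 (article's US figure);
# solar ~0.25 is an assumed typical value for illustration.
HOURS_PER_YEAR = 8760

def annual_mwh(nameplate_mw, capacity_factor):
    """Energy delivered in one year (MWh) at a given capacity factor."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR

nuclear = annual_mwh(1, 0.92)   # ~8,059 MWh per MW of nuclear
solar = annual_mwh(1, 0.25)     # ~2,190 MWh per MW of solar
print(nuclear, solar)

# Intermittent output that must be time-shifted through storage is
# degraded further: at the article's ~30% effective rate, 1,000 MWh
# banked during the day delivers only ~300 MWh overnight.
delivered_overnight = 1000 * 0.30
print(delivered_overnight)
```

The point of the sketch: a flat AI load sized to nuclear's delivered energy would need several times the nameplate capacity in solar, plus storage whose losses compound the gap.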
Investment in Nuclear Power Plants: More Asset than Expense

Jensen Huang's appeal also foreshadows the next battlefield for tech giants: from competing for 'computing power' to controlling 'electric power' through vertical integration. In the past, the core competitiveness of tech giants lay in algorithms, chips, and data. In the future, whoever controls stable, cheap, large-scale low-carbon power will control the lifeblood of the AI era.

Behind this lies shrewd business arithmetic. For a company, electricity is an 'expense' that directly erodes profits. A power plant, by contrast, is an 'asset' that sits on the balance sheet and can even be depreciated for tax benefits. When giants like Meta, Amazon, and Microsoft start directly investing in nuclear power or signing long-term power purchase agreements (PPAs), they are not merely cutting operating costs; they are making a deep strategic play.

The classic case is the collaboration between Amazon AWS and Talen Energy. AWS signed a ten-year contract to purchase 960 MW of carbon-free power from the Susquehanna nuclear power plant in Pennsylvania to supply its data centers. The brilliance of the deal is that the data center is built right next to the plant, achieving 'direct delivery from the source,' maximizing energy efficiency and minimizing the risk of grid instability. This is not just buying electricity; it is internalizing energy as core infrastructure, freeing the company from dependence on traditional utilities and from price volatility.

This has given rise to an unprecedented 'Tech-Energy Complex.' Tech giants will no longer be mere energy consumers; they will become energy producers and dispatchers. And nuclear energy, especially the emerging small modular reactors (SMRs), will be the best puzzle piece for this vertical integration, thanks to flexible siting, short construction cycles, and higher safety potential.
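The 'expense versus asset' calculus above can be sketched with a simple straight-line depreciation comparison. Every number here (demand, grid price, plant cost, lifetime, operating cost, tax rate) is a hypothetical assumption for illustration, not a figure from the article or any real deal:

```python
# Hypothetical comparison: buying grid power (a pure operating
# expense) vs. owning generation (a depreciable asset).
# All numbers below are illustrative assumptions.

ANNUAL_MWH = 8_000_000        # data-center demand per year (MWh)
GRID_PRICE = 80               # $/MWh purchased from the grid
PLANT_COST = 6_000_000_000    # capital cost of an owned plant ($)
PLANT_LIFE_YEARS = 40         # depreciation period
OWN_OPEX_PER_MWH = 25         # fuel + operations for the owned plant ($/MWh)
TAX_RATE = 0.21               # corporate tax rate

# Buying: the whole bill hits the income statement as expense.
annual_purchase_cost = ANNUAL_MWH * GRID_PRICE

# Owning: capital becomes a balance-sheet asset; straight-line
# depreciation generates an annual tax shield.
depreciation = PLANT_COST / PLANT_LIFE_YEARS
tax_shield = depreciation * TAX_RATE
annual_owning_cost = ANNUAL_MWH * OWN_OPEX_PER_MWH + depreciation - tax_shield

print(annual_purchase_cost, annual_owning_cost)
```

Under these made-up numbers, owning comes out roughly half the cost of buying, and the company additionally gains price certainty for decades; that asymmetry, not the exact figures, is the strategic point.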
Imagine that in the future, every large AI data center park will be equipped with several SMRs next to it, forming a self-sufficient 'computing power-energy' island. This is the ultimate battle that tech giants are planning.