Authored by Kay Rubacek via The Epoch Times,
“The United States is in a race to achieve global dominance in artificial intelligence. Whoever has the largest AI ecosystem will set global AI standards and reap broad economic and military benefits.”
That’s the U.S. government’s own language. An arms race.
Artificial intelligence is no longer framed as a research project or an economic opportunity. It is being cast as a struggle for survival and global power, a modern Manhattan Project.
Yet just last week, on Aug. 26, the Congressional Research Service released a Frequently Asked Questions memo designed to help lawmakers get on the same page about the basics: what a data center is, how many exist, and how much electricity data centers consume.
If even government institutions are still aligning their understanding, citizens will need to move quickly to grasp what is happening and what it means for their daily lives.
The memo laid out in plain language what many assumed lawmakers already understood.
A data center is a specialized building that houses thousands of servers. There are about seven thousand worldwide, with the largest concentration in the United States, especially in Northern Virginia and Texas. In 2022, American data centers consumed about 176 terawatt-hours of electricity—roughly 4 percent of all U.S. demand, more than many entire states. Projections suggest an additional 35 to 108 gigawatts of demand by 2030. A mid-range estimate of 50 gigawatts is enough to power every home in California.
The very fact that such a memo was necessary highlights a structural reality: the pace of technological build-out is outstripping the pace of legislative comprehension. If institutions themselves are still catching up, it underscores how important it is for citizens to get informed now, before the costs mount even higher.
While Congress is being briefed on “Data Centers 101,” the executive branch has been preparing all year for the AI race that is already underway:
On January 20, 2025, the White House declared a National Energy Emergency.
On April 8, an order was issued to strengthen grid reliability, with the Department of Energy (DOE) tasked to model how AI demand would reshape the grid.
Four months later, on July 2, DOE’s report warned bluntly: “Retirements plus load growth increase risk of outages by 100x. Status quo is unsustainable.”
Just weeks later, on July 23, a new order accelerated federal permitting of data centers, opening federal lands to construction. And on July 25, the White House released America’s AI Action Plan, framing AI as the next great geopolitical race.
Energy Secretary Chris Wright put it plainly: “We are taking a bold step to accelerate the next Manhattan Project—ensuring U.S. AI and energy leadership.” So on one side of our government, institutions are receiving crash courses on the fundamentals. On the other, the executive branch is already issuing a call to arms.
For many Americans, the gap between government priorities and local realities shows up in one place: the monthly electric bill. Reports show that in Columbus, Ohio, households on standard utility plans saw increases of about $20 a month (roughly $240 a year) linked directly to AI data centers. In New Jersey, Pennsylvania, and Ohio this summer, bills jumped by $10 to $27 a month.
In Oregon last year, utilities warned regulators that consumers needed protection from rate hikes caused by data centers. And in the Mid-Atlantic, regulators cited data centers as one of the main reasons for projected 20 percent increases in household electricity costs by 2025.
The complaints about rising bills suggest something deeper. Citizens are starting to connect the dots before Washington fully has. If households can feel the costs already, then citizens cannot wait for official briefings; they must demand clarity and prepare themselves.

Part of the confusion comes from the nature of artificial intelligence itself. To most people, AI feels intangible. It lives in the “cloud.” You type a question, get an answer, and never see the machinery behind it. No one sends you a receipt for the power you used to get your answer.
But AI is not weightless. It runs on football-field-sized data centers, packed with servers that must run day and night. These machines use staggering amounts of electricity and water to stay cool. A Google search consumes about 0.3 watt-hours of electricity. An AI chatbot query can use up to ten times more—around three watt-hours.
Training a single large AI model consumes as much electricity as hundreds of U.S. homes use in a year.
Despite efficiency gains at the chip level, overall computing power draw is not shrinking. It is becoming ever more energy-hungry. Nvidia’s latest chips, for example, draw more kilowatts per server than previous generations as data centers race to keep up with demand.
And tracking AI power usage has become more difficult. After ChatGPT launched in November 2022, companies became more competitive and stopped publishing detailed energy-use reports, leaving analysts to rely on estimates. In the end, citizens are footing the bill for systems whose true power demands keep growing while public data about that demand grows ever more opaque.
At the same time, there is no consensus among experts on what AI means for humanity. Geoffrey Hinton, often called the “godfather of AI,” won a Nobel Prize for his foundational work on neural networks, yet left Google in 2023 to warn the public about the existential threat he believes AI poses to humanity.
Others in the field insist the opposite: that AI will cure diseases, solve climate change, and usher in an age of prosperity. The gap between these views is not narrow. It is a chasm.
When leaders call AI the “new Manhattan Project,” they reveal both ambition and blind spots. The original Manhattan Project was vast and secretive, but its end product was narrow: nuclear weapons. Its impact was geopolitical and terrifying, but its scope was limited to defense.
AI is different. It is already reshaping daily life: jobs, media, education, commerce, and even household energy bills. In 2025 alone, Google, Meta, Amazon, and Microsoft will spend over $350 billion on AI data centers. This amount is nearly double the inflation-adjusted cost of the Apollo program. These projects will require as much power as tens of millions of homes. And unlike nuclear weapons, which governments tightly controlled, AI is being rolled out by private corporations with minimal transparency. This is not simply another Manhattan Project. It is bigger, broader, and more disruptive.
The silent stage for this race is the U.S. electric grid, and American citizens are now forced to share their access to it. The White House itself admits that “the U.S. electric grid is one of the largest and most complex machines on Earth.”
It is also a machine under strain. The United States has three major grids: Eastern, Western, and Texas. Within those supergrids, electricity is managed by regional operators like PJM in the Mid-Atlantic, MISO in the Midwest, CAISO in California, and ERCOT in Texas. Today, the average household experiences one or two short outages a year, mostly from storms. But the Department of Energy now warns of a hundred-fold increase in blackout risk by 2030 if data center growth continues and plants keep closing on schedule.
Calling the U.S. grid “one of the largest and most complex machines on Earth” is not an exaggeration. But a machine that complex is also fragile. When you push it beyond its limits, for example, by adding the equivalent of every California household’s electricity demand in just a few years, the consequences will not be abstract. They will be rolling blackouts, instability, and higher costs in the places where citizens live.
The options given by the Department of Energy are stark: We can continue on the current path and expect regular blackouts. We can keep coal and nuclear plants online longer than planned, accepting the political and environmental trade-offs. We can build new supply along with transmission lines at a pace far faster than anything the U.S. has achieved in decades. Or we can slow down data center expansion, requiring companies to build only where abundant power is available. None are easy. But without action, the “status quo” means a future of rolling blackouts.
This brings us to the deeper question: Why has Congress only just now been briefed on the basics?
The answer is less about individual lawmakers and more about how institutions work. Legislative bodies are designed to deliberate slowly, to build consensus, and to check the pace of change. That design clashes directly with the speed of today’s AI build-out, which is driven by corporate competition and framed by the executive branch as an arms race.
The lag is systemic, not personal. But while the government catches up, citizens cannot afford to wait. The bills are already arriving, the outages are already looming, and the decisions shaping our energy future are being made now.
What’s certain is that AI is not intangible. It is not free. It is not frictionless. It is built on massive physical infrastructure that consumes enormous amounts of electricity and water today, and experts cannot accurately predict its future costs because the technology is changing too fast. It is reshaping society in real time: citizens are already paying more each month, facing higher risks of blackouts, and living through change faster than any in living memory.
Just as families once endured rationing and shortages for the Manhattan Project, today’s citizens are being asked to bear the unseen costs of an AI race they never chose. And as with the Manhattan Project, the greatest sacrifices will not be made in Washington boardrooms but in American households.
Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times or ZeroHedge.
Tyler Durden
Sat, 09/06/2025 – 22:10