Experts weigh in on the $500 billion Stargate project for artificial intelligence

At a press conference on Tuesday, President Trump announced the Stargate project, which he called “the largest AI infrastructure project in history.”

With the CEOs of OpenAI, Oracle, and SoftBank at his side, Trump said those companies and other private-sector partners would invest up to $500 billion in building data centers across the United States, with the first $100 billion coming this year. The announcement came a day after Trump rescinded former President Biden’s executive order on AI, which aimed to increase the safety of the technology.

While details about Stargate are scarce, artificial intelligence, energy and data center experts have had a range of reactions to the news.

What AI experts say about Stargate

Shelly Palmer, a technology expert and consultant, has argued that Stargate will give the United States a strategic advantage and that it will bring benefits that we cannot yet imagine. He wrote:

As for the 100,000 jobs the project is expected to create? Some construction jobs will be created while the data centers are being built, but many more (millions more) will be created when the data centers come online. We’ve never had cloud computing like this—there’s literally no way to calculate the economic impact of this much AI computing. It’s going to be massive.

There are many tech skeptics and it has become fashionable to denigrate and vilify big tech. For me, the Stargate project is the first step in securing the future of the American economy as well as our digital and cyber security. Every business will benefit from the power and promise of AI, and – like it or not, believe it or not – warfare will be dominated by AI. Today, the US has a clear lead. Project Stargate will help ensure it stays that way.

But not everyone is so bullish. Noted AI critic Gary Marcus replied to a post on X (formerly Twitter) in which OpenAI CEO Sam Altman said the project would be “great for our country” (itself a response to Elon Musk questioning the project’s financing). Marcus took issue with Altman’s rosy optimism:

Like much of what Sam says, this is based on an assumption, or in this case several assumptions:

1) The purely speculative assumption that LLMs, or whatever else OpenAI figures out how to build, will be hugely profitable. So far, infrastructure costs across the field (perhaps $250 billion) enormously outweigh total revenue, perhaps 50:1.

2) The entirely speculative assumption that any profits will actually do much to help the American people, as opposed to just enriching those who own the infrastructure. Yes, some people will be employed building data centers; but if the data centers succeed in producing better AI, many others will lose their jobs. The net effect is entirely unclear.

Meanwhile, Doug Calidas, senior vice president of government affairs for Americans for Responsible Innovation, told IEEE Spectrum that the core of this initiative may not be new.

My sense is that this is mostly a repackaging of commitments that have already been made (especially by SoftBank) coupled with the aspiration to raise even more money to achieve this higher goal. Given the extreme level of investment interest in the area and the players involved, I think it’s likely but not certain that they’ll pull it off, especially since Trump appears to be allowing them to pull funding from the Middle East. The spectacle surrounding the announcement and President Trump’s public support for the project will likely make it easier for them to hit their target.

What energy experts say about Stargate

The biggest ongoing debate in the energy sector over the past year has centered on how to deal with the coming onslaught of electricity demand from AI operations. Researchers have predicted that U.S. electricity demand could increase by as much as 15.8 percent over the next four years, due to the energy intensity of AI and data centers, and those figures don’t account for any additional electricity demand that might come from the Stargate project.

So when Trump announced the initiative, it was his accompanying energy plans that raised eyebrows among the energy crowd. On Monday, his first day in office, Trump declared a national energy emergency, halted offshore wind development and suspended payments from the Infrastructure Investment and Jobs Act (IIJA) and the Inflation Reduction Act (IRA), which largely supported clean energy projects.

Line Roald, an electrical power systems expert at the University of Wisconsin in Madison, calls the moves “a huge contradiction” and says she’s concerned that technology companies will get preferential treatment when connecting their data centers to the grid.

Curiously, at the same time that Trump is voicing his support for AI initiatives, he is also trying to limit the development of new wind generation. Wind is a cheap source of electricity that could help meet the needs of new AI infrastructure. To meet these electricity needs, we also need new power plants and transmission lines. These cost a lot of money to build, and the cost is usually covered by all consumers in the region where the data center is built. As data centers connect to the grid, they should pay their fair share of grid expansion. Otherwise, electricity prices could increase for everyone.

Costa Samaras, director of the Wilton E. Scott Institute for Energy Innovation at Carnegie Mellon University, says the rapid growth and localized impacts of electricity consumption in data centers are the biggest challenges to large-scale AI adoption, but they can be managed.

The easiest way to get things online quickly is to bring your own power. BYOP. And an even better way is to not just bring your own, but bring enough for the community. AI electrical load will only break the grid if we are not proactive and come together to manage it properly by deploying lots of new clean electricity, maximizing energy efficiency, and deploying virtual power plants. If we want to ensure our AI competitiveness and our national security, we don’t have the luxury of taking cheap, clean energy off the table.

Thomas Wilson is chief technical officer at the independent, nonprofit Electric Power Research Institute (EPRI). His organization doesn’t comment on specific policy statements, but offers some general insights on powering data centers:

New generation takes time to deploy, but wind, solar, batteries, and natural gas currently have the shortest lead times. So if the data center community is interested in speed, it will be looking at those in addition to long-term bets like advanced nuclear technology. New transmission also takes time. Data centers that operate flexibly, scaling down or self-powering when the grid is stressed, require less grid construction. And if technology companies can spread that computation over multiple interconnected data centers separated by tens of kilometers, instead of concentrating it all in one area, they can draw on multiple existing transmission lines. Both strategies could help them connect faster.

What data center experts say about Stargate

Data center providers have reason to be excited here. Kevin Cochrane, CMO of cloud infrastructure company Vultr, said he hoped the project would benefit the industry and also increase much-needed geographic diversity in computing access:

Stargate will act as a catalyst for data center providers of all types, in all geographies, to realize the importance of building the capacity needed to support the wholesale transformation of cloud and enterprise computing worldwide. Every national government needs a strategy for building critical infrastructure to support artificial intelligence. Data center capacity must be more widely distributed across regions, specifically capacity optimized for deploying next-generation GPUs with optimal energy efficiency and sustainability. Just as data center capacity was built to support the Internet revolution at the turn of the century, and later built to support new cloud services, we must now see a further hyper-expansion of data center capacity around the world.

Josh Mesout, chief innovation officer of another cloud computing provider, Civo, was also hopeful, but expressed concern that highly sought-after GPUs are in short supply and available only to large enterprises, as Civo noted in a recent report. Mesout warned that this huge investment should not benefit only OpenAI (as the Financial Times recently reported it might).

Any government support for AI initiatives should be welcomed. Much of the promised initial $100 billion will need to go toward bridging the GPU gap we saw in our research, along with power infrastructure improvements to keep the data centers running. Most importantly, the benefits of AI should be for everyone. Public and private organizations across all sectors can gain a huge amount from using AI to improve the lives of their customers and users. We’ve seen a big shift in the industry from training models to more expensive inference, so projects like Stargate should focus on keeping the cost of GPU access as low and as flexible as possible for businesses. While big firms like OpenAI and Oracle certainly have a lot to offer, it’s vital that all of this funding isn’t directed solely at apps like ChatGPT.

Data center provider IREN builds facilities powered by 100 percent renewable energy. IREN business director Kent Draper, while also enthusiastic, was concerned that energy constraints could hinder the effort. He also stressed the need for public investment in AI.

This announcement underscores the importance of next-generation data center capacity to support the growth of AI. It represents an inflection point, with the highest levels of government and the world’s largest companies working together to address data center supply shortages.

Access to power is a key barrier to new data center development. The timelines for securing new grid connections to build data centers are longer than ever before. The challenge will be managing already strained grids and ensuring the availability of power for data center development.

It is likely that much of the hyperscalers’ compute will be for proprietary use. Hopefully, increased public investment in AI infrastructure will help mitigate the risk of compute centralization.
