A new collaboration is working to develop models for faster setup of smaller-scale, real-time data processing centers.
EPRI, InfraPartners, NVIDIA and Prologis will assess how data centers in the 5- to 20-MW range can be built at or near utility substations that have available capacity.
The effort was announced Feb. 3 at the DTECH transmission and distribution conference in San Diego.
The goal is to speed deployment by making better use of underused infrastructure. The partners hope to have at least five pilot sites in development nationwide by the end of 2026 and to develop a replicable, scalable model for wider use.
The focus is on inference data processing, which supports artificial intelligence in nearly every sector of the economy, EPRI said.
Unlike AI model training, which often is carried out in larger facilities over longer time frames, AI inference provides real-time responses and can work from smaller facilities.
Distributing AI inference, rather than centralizing it at a single hyperscale facility, places computing closer to the end users of data, which can reduce response times.
EPRI said this edge-of-grid distribution also can reduce transmission congestion, improve system flexibility and help integrate renewable energy.
EPRI President Arshad Mansoor said: “This collaboration with Prologis, NVIDIA, InfraPartners and the utility community highlights the type of innovative actions required to meet the moment. Using existing grid capacity to bring inference compute closer to where it’s needed — quickly and reliably — is a win for all.”
Power industry R&D organization EPRI will identify areas with capacity and fiber connections that could host the pilot projects; later, it will collect and analyze the results to inform future best practices.
Industrial real estate investment trust Prologis will identify suitable land and buildings that could be used for rapid deployment and will coordinate development and planning.
Graphics processing unit designer/manufacturer NVIDIA will deliver optimized computing platforms, offer technical guidance and facilitate connections to potential customers.
Data center builder InfraPartners will provide AI data centers manufactured offsite and designed for high-density power and cooling.
Participating utilities will assess distribution capacity, guide siting and interconnection, and ensure operational requirements are met.
Marc Spieler, senior managing director for the global energy industry at NVIDIA, said: “AI is driving a new industrial revolution that demands a fundamental rethinking of data center infrastructure. By deploying accelerated computing resources directly adjacent to available grid capacity, we can unlock stranded power to scale AI inference efficiently.”