
Johnny Clemmons is global vice president and industries head at software firm SAP. Opinions are the author’s own.
Artificial intelligence is driving a massive and unprecedented infrastructure expansion, to the tune of $40 billion in annual spending on data centers in the U.S. alone.
The pace of data center construction is accelerating across the world. There’s OpenAI’s $400 billion Stargate initiative, which will add 10 gigawatts of data center capacity with partners Oracle and SoftBank, alongside Amazon’s $100 billion global data center expansion effort. Moody’s projects $3 trillion in global data center spending over the next five years.
As hyperscalers and others invest to meet AI's growing appetite for capacity, it's vitally important that they also bring more standardization and modularity to data center design and construction.
Ultimately, the architectural aesthetics of these projects are secondary to efficiency, security, safety and repeatability. This means that unless a particular project has unique environmental, siting or regulatory requirements, heavily customized designs are unnecessary, even frivolous.
For general building design, the occupant experience almost always comes first. But for data centers, while the safety and comfort of those who work onsite should be a high priority, so, too, is creating an optimal environment for server racks and their contents.
As much as stakeholders might be inclined to treat a data center like most other buildings, adding customizations and unique design elements to make the building more appealing and the occupant experience richer, those considerations take a back seat to standardization, predictability, modularity, repeatability and sustainability.
In fact, data centers may be more repeatable than most other types of building projects. Many of their critical components — power density, cooling strategies, security zones, structural loads and commissioning requirements — tend to follow predictable patterns. Once these patterns are identified — with the help of artificial intelligence, of course — they can be used to inform design and construction strategies from the start.
That, in turn, creates an opportunity to develop a repeatable blueprint or formula that data center developers can readily apply to future projects.
Excessive customization leads to longer timelines, higher and less predictable costs, quality inconsistencies and elevated risk. Modularization, on the other hand, allows projects to be designed, approved, built and commissioned faster, at a more predictable and often lower cost, with fewer surprises along the way.
Here are the specific advantages modularization brings:
Speed: Modularization and standardization can significantly speed up projects without compromising quality. The ability to develop one-megawatt modules, for example, with standardized HVAC configurations, materials and labor requirements, lets developers scale individual components and then assemble the project as a whole. AI tools can help design and configure the modules themselves.
Off-site fabrication of modules, as well as continuous commissioning within a project, where stakeholders can identify and address issues as they arise, offer additional time savings.
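To make the scaling idea concrete, here is a minimal sketch in Python of how a standardized one-megawatt module spec could be replicated to reach a target campus capacity. The module parameters, names and quantities below are illustrative assumptions, not any real vendor's specification.

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class ModuleSpec:
    """Illustrative spec for one standardized, repeatable 1 MW module."""
    capacity_mw: float = 1.0
    hvac_units: int = 4      # assumed standardized HVAC configuration
    rack_count: int = 120    # assumed racks per module
    crew_weeks: int = 6      # assumed onsite labor per module

def plan_campus(target_mw: float, spec: ModuleSpec) -> dict:
    """Scale one repeatable module spec into a campus-level plan."""
    modules = math.ceil(target_mw / spec.capacity_mw)
    return {
        "modules": modules,
        "hvac_units": modules * spec.hvac_units,
        "racks": modules * spec.rack_count,
        "crew_weeks": modules * spec.crew_weeks,  # much of this parallelizes off-site
    }

print(plan_campus(target_mw=48, spec=ModuleSpec()))
# {'modules': 48, 'hvac_units': 192, 'racks': 5760, 'crew_weeks': 288}
```

Because every module is identical, the bill of materials, HVAC count and labor estimate all fall out of a single multiplication rather than a bespoke takeoff for each building.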
Better cost control, if not cost reduction: Modularization brings greater cost certainty, and often outright savings, thanks to fewer customizations and economies of scale in procurement.
Tighter standardization for interior specifications: Within a data center, every tenth of a degree in temperature and every millimeter of space counts. Miscalculations can lead to expensive inefficiencies and even equipment failures. Repeatable, intelligent modeling and design draw on past performance data and design specs to optimize a data center's interior configuration.
Along these lines, AI can build and manage a reference library of best construction practices specific to data centers. This simplifies the build, particularly for those with less experience.
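As a rough sketch of what such a reference library might look like, the Python below keys a handful of invented practices by component tag and returns matches for a query. The entries and tags are hypothetical; a production system would more likely use semantic retrieval over real project documents.

```python
# Hypothetical reference library of data-center-specific build practices.
LIBRARY = [
    {"tags": {"cooling", "hvac"},
     "practice": "Commission CRAH units per module before rack install."},
    {"tags": {"power", "busway"},
     "practice": "Pre-fabricate overhead busway runs off-site."},
    {"tags": {"cooling", "containment"},
     "practice": "Verify hot-aisle containment seals at each bay."},
]

def lookup(query_tags: set[str]) -> list[str]:
    """Return every practice whose tags overlap the query."""
    return [entry["practice"] for entry in LIBRARY if entry["tags"] & query_tags]

print(lookup({"cooling"}))
# ['Commission CRAH units per module before rack install.',
#  'Verify hot-aisle containment seals at each bay.']
```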
Predictability in terms of equipment and labor requirements: Repeatable designs allow stakeholders to develop standard requirements for the parts, tools, equipment and skilled labor needed onsite, and for how long they'll be needed. Those baselines can then be adjusted to a specific project's site conditions.
Project tracking and quality control: The modular approach allows contractors and other project stakeholders to monitor construction processes, progress and quality in near real time to ensure everything is being delivered, built and installed to spec and to code. Because the work is predictable and planned from a template, AI can help here, too, monitoring a project and alerting the right people when discrepancies emerge.
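As a rough illustration of that kind of template-driven monitoring, the sketch below compares as-built milestone dates against a standard module schedule and flags slippage. The milestones, dates and tolerance are hypothetical placeholders.

```python
from datetime import date

# Hypothetical template schedule for one standardized module (planned dates).
TEMPLATE = {
    "slab_poured": date(2025, 3, 1),
    "shell_complete": date(2025, 4, 15),
    "hvac_installed": date(2025, 5, 20),
    "commissioning": date(2025, 6, 30),
}

ALERT_THRESHOLD_DAYS = 5  # assumed tolerance before escalating

def check_progress(actuals: dict[str, date]) -> list[str]:
    """Flag milestones that have drifted past the template plan."""
    alerts = []
    for milestone, planned in TEMPLATE.items():
        actual = actuals.get(milestone)
        if actual is None:
            continue  # milestone not yet reported
        slip = (actual - planned).days
        if slip > ALERT_THRESHOLD_DAYS:
            alerts.append(f"{milestone}: {slip} days behind plan")
    return alerts

print(check_progress({"slab_poured": date(2025, 3, 2),
                      "shell_complete": date(2025, 4, 28)}))
# ['shell_complete: 13 days behind plan']
```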
Sustainability tracking and reporting: Intelligent tools can help project stakeholders measure, track and report on carbon footprint, use of recycled materials and other important sustainability metrics to meet regulatory requirements and internal benchmarks and goals.
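At its simplest, that kind of tracking is material quantities multiplied by emission factors. The sketch below shows the arithmetic; the factor values are placeholders for illustration, where a real project would pull vetted coefficients from an environmental product declaration (EPD) database.

```python
# Placeholder emission factors (kg CO2e per unit), for illustration only.
EMISSION_FACTORS = {
    "concrete_m3": 300.0,
    "steel_kg": 1.85,
    "recycled_steel_kg": 0.6,
}

def embodied_carbon(quantities: dict[str, float]) -> float:
    """Sum embodied carbon (kg CO2e) across all tracked materials."""
    # Every material in the bill must appear in the factor table.
    return sum(EMISSION_FACTORS[m] * qty for m, qty in quantities.items())

bill_of_materials = {
    "concrete_m3": 2_000,
    "steel_kg": 150_000,
    "recycled_steel_kg": 50_000,
}

total = embodied_carbon(bill_of_materials)
print(f"{total / 1000:.1f} t CO2e")  # 907.5 t CO2e
```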
Put all these elements together and the result is a repeatable set of specs, processes and practices for bringing data centers online faster, at a more predictable, and hopefully lower, cost, without compromising performance. This is how our AI future should be built.






