Are Quality Concerns Impeding Data Center Construction?
Amid surging data center construction demands, quality concerns are hindering progress and efficiency, writes Matthew Kleiman.
June 26, 2024
For years, it’s been said that “data is the new oil.” There is truth to the phrase, but compute is the real future, and the proof is in the massive surge of data center construction occurring all over the world.
Over the past four years, hyperscale data center capacity has doubled, with the number of facilities reaching nearly 1,000 at the end of 2023. Further, Synergy Research Group predicts that data center capacity will double again in the next four years. As a result, global data center spending is projected to grow at an annual rate of 18% and reach $200 billion by 2028, according to research from Dell’Oro Group. The list goes on.
It’s clear that data center construction is in the midst of a gold rush, with contractors struggling to keep pace with market hype and demand. This critical infrastructure is foundational to supporting rapidly expanding AI workloads, which are projected to represent 25% of annual data center capex by 2028.
Unsurprisingly, the companies with the broadest data center footprint are the leading cloud providers: Amazon, Microsoft, and Google. These behemoths now account for 60% of all hyperscale data center capacity, followed by Meta, Alibaba, Tencent, Apple, and ByteDance.
In a relatively short period, data centers have become an essential piece of infrastructure, and industry experts don’t expect this to change anytime soon. Sam Altman, the CEO of OpenAI, recently stated: “I think compute is going to be the currency of the future. I think it will be maybe the most precious commodity in the world, and I think we should be investing heavily to make a lot more compute.”
The Elephant in the Data Center: Poor Work Quality
The rapid growth in data center construction doesn’t come without its challenges. Learning to deliver these complicated projects on compressed timelines, often by leveraging modular fabrication, has put general contractors and modular fabricators under immense pressure.
Traceability and accountability of work across the supply chain are challenging. Schedule demands require changes to existing procedures and methods, which in the current data center arms race often take a back seat to the pace of construction. As a result, quality issues are cropping up across the value chain far more often than they should.
One critical area to examine is the air and liquid cooling systems that keep temperatures under control. These systems are a necessity due to the enormous amount of heat data centers generate. Each cooling system requires complex mechanical and piping systems to work, and when piping flange bolt connections or the structural connections of a module aren’t properly tightened, a leak can easily occur, with significant safety and financial repercussions.
Another challenge we must address is arc flashes. An arc flash, or flashover, occurs when an electrical current leaves its intended path and flows through the air, either from one conductor to another or to ground, causing an arc blast pressure wave. Arc flashes are typically caused when electrical terminations are not properly seated, are not tightened to the correct specification, or are missed during a visual inspection.
Exposure to an arc flash can result in serious injuries or even death. Unfortunately, estimates suggest that 5-10 arc flash explosions occur in electrical equipment in the US every day. Arc flashes are a significant safety risk and, because they occur late in the process once power is introduced to the facility, a major cause of rework and commissioning delays.
Noticing a theme here? In this new world of hyper-growth, where providing more compute to the market as fast as possible is key, data center construction quality has not been properly prioritized. While regular inspections and quality control checks are standard practice during data center construction, they have largely become a reactive function. Will this traditional approach to quality control and progress monitoring continue to serve these ambitious goals?
The Root Cause of Poor Data Center Work Quality
With data center construction schedule delays and project cost overruns becoming more common, the current approach to quality must evolve and improve to meet the age of AI. Rework, caused by work done wrong, has become a major contributor to safety risks and unexpected delays on data center construction projects. For many companies, quality control (QC) and quality assurance (QA) have become reactive functions that leverage spot checks after work is completed. This leads to many issues slipping through the cracks, and workers repeating the same mistakes.
One main reason for this reactive approach to quality is the lack of QC/QA resources. As an industry, we are facing a severe worker shortage, and the gap is especially acute among quality personnel. Many projects simply don’t have the QC/QA team members required to watch every step of the construction process. Even if projects could find enough people to fill these roles, the manpower would simply be cost-prohibitive.
In particular, skilled labor with experience in the mechanical, electrical, and liquid cooling fields – often referred to as MEP, for Mechanical, Electrical, and Plumbing/Piping – is the hardest to find. To make matters worse, many data center owners are beginning to build in rural areas far from any major city.
Unfortunately, while these rural areas may provide greater access to a reliable power supply, they offer far less access to highly skilled MEP construction workers. As a result, both productivity and quality suffer.
To combat the worker shortage, many projects are implementing “digital checklists.” QCs use these checklists on mobile devices to answer a series of questions and record certain conditions, such as “no visible damage is seen.” While this method is effective at capturing data digitally for easy access in the future, it has many shortcomings.
First, checklists are often just long forms that are completed back in the office, long after the work was actually done. This is the equivalent of “digital pencil whipping” to complete a mandatory task, with no real quality control actually taking place.
Further, this process does a poor job of identifying the quality issues that may cause serious rework, such as arc flashes or leaks in piping systems.
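To make the shortcoming concrete, here is a minimal sketch of what a typical digital checklist entry captures. The field names are hypothetical, not drawn from any specific product. Notice that nothing in the record ties the answer to evidence gathered at the work front: a “pass” tapped in at the office hours later looks identical to one verified in person.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical checklist record; field names are illustrative only.
@dataclass
class ChecklistEntry:
    item_id: str           # e.g., a flange or termination identifier
    question: str          # the condition being attested to
    answer: bool           # pass/fail as tapped on a mobile device
    inspector: str
    recorded_at: datetime  # when the form was filled in

# This record is "complete" on paper, yet it carries no measurement
# and no proof that the check happened where and when the work did.
entry = ChecklistEntry(
    item_id="FLG-1042",
    question="No visible damage is seen",
    answer=True,
    inspector="J. Smith",
    recorded_at=datetime.now(timezone.utc),  # could be hours after the work
)
print(entry)
```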
An Integrated Approach to Improving Data Center Construction
To meet the rising demands for the future of compute, we must rethink how we approach quality in data center construction. We need a system that guides the next generation of construction workers through the work process so that each step of a procedure is crystal clear.
Next, accurate work quality data must be captured during the installation process – not after. Then, the necessary quality control documentation should be automatically generated in real time as the work is completed.
To take things one step further, many advanced contractors are even using Bluetooth-enabled tools to automate data collection and to verify that specifications, such as torque values, are being applied correctly. This granular information is one of the most effective ways to create quality control documentation as the work is being completed. Data collected directly from devices is as accurate as it gets – it cannot be misplaced or mistyped.
Additionally, since this information is often associated with unique MEP items in the work plan and schedule, it can provide real-time transparency into construction quality and progress, down to every weld, bolt, and electrical termination.
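As an illustration, here is a minimal sketch of how a torque reading streamed from a connected tool might be validated against spec and turned into a QC record tied to a unique MEP item. The names (TorqueReading, SPEC_RANGES, qc_record) and the torque ranges are assumptions made for the example, not any particular vendor’s API.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical torque specs per item type, in newton-meters.
# A real project would pull these ranges from engineering documents.
SPEC_RANGES = {
    "pipe_flange_bolt": (180.0, 200.0),
    "electrical_termination": (4.5, 5.5),
}

@dataclass
class TorqueReading:
    mep_item_id: str       # unique item in the work plan, e.g. "FLG-1042-B3"
    item_type: str         # keys into SPEC_RANGES
    value_nm: float        # torque reported by the connected tool
    recorded_at: datetime  # timestamped at the moment of installation

def qc_record(reading: TorqueReading) -> dict:
    """Validate a reading against spec and emit one QC documentation row."""
    low, high = SPEC_RANGES[reading.item_type]
    return {
        "mep_item_id": reading.mep_item_id,
        "value_nm": reading.value_nm,
        "spec": f"{low}-{high} Nm",
        "in_spec": low <= reading.value_nm <= high,
        "recorded_at": reading.recorded_at.isoformat(),
    }

# Example: a reading streamed from a Bluetooth-enabled torque wrench.
reading = TorqueReading("FLG-1042-B3", "pipe_flange_bolt", 192.3,
                        datetime.now(timezone.utc))
print(qc_record(reading))
```

Because the record is generated the moment the tool reports the reading, the documentation exists as soon as the work does, and an out-of-spec value can be flagged before the crew moves on to the next item.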
Embracing Technology for Next-Gen Data Center Builds
To ensure that the expected doubling of data center construction is as efficient and safe as possible, paper checklists must become a thing of the past.
Technology-enabled workflows can get every project stakeholder – the owner, general contractor, and subcontractors – on the same page with real-time visibility into project progress and quality.
Hyperscalers must prepare for the future and be ready to proactively prioritize quality, using proven technologies to protect their $200 billion in capex investments. Only then can they consistently deliver high-quality work while accelerating project schedules.
Matthew Kleiman is co-founder and CEO of Cumulus Digital Systems, a cloud-based platform enabling connected work in construction and manufacturing, and the author of Work Done Right: Using Systems Thinking to Guide Your Digital Transformation.