For Siemens, dealing with energy and supply chain for data centers is paramount

For Siemens, the supply chain for data centers is not necessarily about being the fastest but about being the most reliable, says Ciaran Flanagan, VP & Global Head of Data Centre Solutions at Siemens.

Siemens has been in the data center industry for some time, but it wasn’t until about a decade ago that the technology company decided not just to be a supplier, but also to be a partner to some of the bigger players in the industry.

According to Ciaran Flanagan, VP & Global Head of Data Centre Solutions at Siemens, the tech vendor now has a direct business model with a lot of the big data center operators, which includes the hyperscalers, the co-location providers, and the big enterprises.

Speaking to CRN Asia exclusively during Singapore Tech Week, Flanagan said that this model has worked out very well for Siemens over the last several years, with customer intimacy helping the company refine its portfolio, understand customer requirements, and stay competitive.

Looking at the Asia Pacific region, Flanagan believes there is a lot of growth potential to support the indigenous regional demand for cloud and traditional IT services as digitalization continues.

“I think this region is positioning itself conservatively to deal with what's coming from AI. We're hearing a lot about AI in North America, the Middle East, and perhaps India as well. I think this region has still to come to that table, but it will. Undoubtedly, it will,” said Flanagan.

Siemens in APAC and working with partners

While Siemens has a variety of business offerings, in the data center industry the technology vendor works with many of the co-location providers and telcos. Siemens also works with a number of enterprise customers in the region and is supporting the international hyperscaler customers coming to the region.

“Now, this region is growing fast. Over the last two or three years, we've grown very well with it as well. So, the hyperscalers are building so much, so fast, so big. They need a consistent, repeatable, and vanilla approach to how they're designed. Whereas an enterprise customer will have maybe a small data center in different cities with different environmental requirements, even different types of workloads. So, they need a little bit more of a bespoke solution. And often with the enterprise customers, we will lean very heavily on our partners. So sometimes we find that our partners have a very good relationship with some of those enterprise customers, and our partners act as their design authority,” he said.

“We support the partners; the partner supports the enterprise. With the hyperscalers and the large colocation data centers, we go direct, and we try to engage them directly in a design that they can use over and over and over again. So, it does have different requirements,” he added.

When working with partners, Flanagan said Siemens supports them in designing its technology into their solutions. In many cases, the relationship is not necessarily exclusive, either.

“So, we've got to compete. But what we want to make sure is that the partner understands the value proposition that the Siemens technology brings, and they can translate that into a solution they bring to the customer. So that requires us to train our partner and to provide them with technical skills. It requires us to support our partner with go-to-market investment. It requires us, in some cases with some technologies, to certify the partner,” said Flanagan.

For Flanagan, it is not a partnership where Siemens simply ships products and hopes things work out. Instead, the company prefers to stay close to its partners.

“That has been the model and that will continue to be the model,” said Flanagan.

Challenges in AI for data centers

Given the increasing use of AI, data centers are being stretched to supply the compute it demands. At the same time, Flanagan sees AI itself as a compelling and valuable way for customers to achieve what they want faster, without wasting so much time and money.

Specifically, Flanagan highlighted a Siemens platform for data centers called White Space Cooling Optimization, which can help the industry deal with these growing challenges. White Space Cooling Optimization employs a network of sensors, cooling unit controls, and an AI engine to match facility cooling with real-time IT load. Sensors are deployed in a dense network to measure temperature where it matters most – at the IT equipment air inlets. The temperature data is relayed to an AI engine, which can be hosted virtually or on dedicated hardware.

“We use AI to help our customers drive efficiency in their cooling load for their data centre. So, the AI algorithm learns how the data centre generates heat and where it needs to reject the heat, and it optimizes the cooling infrastructure to meet the requirement of just that heat and not overcool the data centre. We're also using AI in our digital twin technology, where we're helping our customers iteratively plan scenarios or simulate what might happen in a set of circumstances, whether it's in the construction phase or the operation phase,” explained Flanagan.
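To illustrate the kind of control loop Flanagan describes, here is a minimal, hypothetical sketch of sensor-driven cooling adjustment. It is not Siemens' White Space Cooling Optimization code: the setpoint, gain, and function names are assumptions, and a simple proportional rule stands in for the AI engine.

```python
# Illustrative sketch only -- not Siemens' White Space Cooling Optimization code.
# It mimics the loop described above: dense inlet-temperature sensors feed a
# controller that trims cooling-unit output toward a setpoint instead of
# overcooling the whole room. A simple proportional rule stands in for the AI engine.

SETPOINT_C = 24.0   # assumed target IT inlet temperature, degrees Celsius
GAIN = 0.05         # proportional gain for this toy controller

def adjust_cooling(inlet_temps_c, current_output_pct):
    """Return a new cooling-unit output (0-100%) from measured inlet temperatures."""
    hottest = max(inlet_temps_c)      # protect the worst-case inlet
    error = hottest - SETPOINT_C      # positive means the racks are running warm
    new_output = current_output_pct + GAIN * error * 100
    return min(100.0, max(0.0, new_output))

# Example: a row of racks running slightly warm at the inlets
readings_c = [22.1, 23.4, 25.2, 24.8, 23.9]
print(adjust_cooling(readings_c, current_output_pct=60.0))  # nudges output up to 66.0
```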

The other challenge is the supply chain. While Flanagan acknowledges that it's not as bad as in the pandemic era, the availability of infrastructure and equipment in a timely manner is still a concern, one he believes will continue for a little while longer.

“For the larger customers, the availability of power is a huge issue. And even for some enterprise customers where they want to build new data centers, they can sometimes find it difficult to find the power. And then the last piece that's concerning customers is the whole challenge around AI workload density, the energy density of AI workload,” he added.

Flanagan explained that if AI workloads push a data center from an average rack density of 10 kilowatts to an average of 100 kilowatts, operators have to bring in liquid cooling.
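For a rough sense of why that tenfold jump forces the change, the back-of-the-envelope arithmetic below applies the standard air-cooling relationship (heat removed = airflow × air density × specific heat × temperature rise). The 12 °C temperature rise is an assumed illustrative figure, not a number from Flanagan or Siemens.

```python
# Back-of-the-envelope airflow arithmetic behind the liquid-cooling point above.
# Figures are illustrative assumptions, not Siemens or Flanagan numbers.

RHO_AIR = 1.2     # kg/m^3, approximate density of air
CP_AIR = 1005.0   # J/(kg*K), specific heat of air
DELTA_T = 12.0    # K, assumed inlet-to-outlet temperature rise across the rack

def airflow_m3_per_s(rack_kw: float) -> float:
    """Volumetric airflow needed to carry rack_kw of heat away at DELTA_T."""
    return rack_kw * 1000.0 / (RHO_AIR * CP_AIR * DELTA_T)

for kw in (10, 100):
    print(f"{kw} kW rack -> ~{airflow_m3_per_s(kw):.1f} m^3/s of air")
# ~0.7 m^3/s at 10 kW versus ~6.9 m^3/s at 100 kW: the tenfold jump in airflow
# (and the fan power to move it) is why air alone becomes impractical and
# liquid cooling enters the picture.
```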

“They have tighter control loops. They've got to have different power distribution. And I don't think the industry has cracked that. And I think a lot of our customers are looking at us and our competitors, the big OEMs, to give them guidance. And I think we need to do that, and we will. We're working with people like NVIDIA, Open Compute, Intel, AMD, to make sure we understand what the future requirements will be, and we bring that back into our portfolio offer. But it's quite a remarkable time right now in this industry. It's all about having better supply chain management and how they can leverage AI. And then we're looking at energy,” said Flanagan.

In short, Flanagan said the supply chain for Siemens is not necessarily about being the fastest but about being the most reliable.

“That's what we're focused on. So, we're not trying to go to the market and say suddenly we can make it happen twice as fast. We're telling the market what we can do, and we're delivering that commitment. That's how we're dealing with supply chain problems,” he said.

Dealing with energy demand and future of data centers

Looking at the energy density situation in data centers, Flanagan highlighted how Siemens is working on liquid cooling and power delivery for data centers. He believes innovation in this domain over the next few years will solve the problem.

“I'm not convinced that we need to build an industry for every rack to be 500 kilowatts. I don't think that's it. But certainly, we need to build an industry that can respond to increased demand density. And this is only one piece of the puzzle. The second piece of the puzzle is what the semiconductor industry does. They will innovate as well. One of the issues that we're dealing with right now is the idea of AI workload power swings. It's really irritating the utilities. It makes life difficult for utilities. The Nvidias, the AMDs, the Intels, they'll figure that out. They will figure that out incrementally,” he said.

“And then the last piece of the puzzle is the utilities themselves, which is the ability of the utilities to bring on more power generation. The ability of the industry to do more generations behind the meter. All of these solutions will slowly come together and will slowly resolve the gap between the demand for energy and the supply. But what will not happen is it will not happen quickly,” he added.

As an industry, Flanagan explained, it's important to recognize this as a multi-year challenge. Over time, the industry will catch up and meet the demand.

“For Siemens, we're constantly looking at and working on our portfolio to make sure it's relevant from an electrification point of view, from an automation point of view, from a safety point of view. We're constantly updating our portfolio, and we continually make announcements about interventions and acquisitions and changes to our portfolio to meet our customer demand. I think that the real battleground for innovation is now going to become how we run the data center,” said Flanagan.

Flanagan also stated that Siemens will be at the forefront of innovating the software stack that manages the infrastructure.

“This isn't just about being able to respond to events; it's about being predictive. Based on a certain set of characteristics, we can predict what the data center operation requirement might look like. We would do that with a digital twin, or we would do that with an AI algorithm. By giving the customer that information, they can respond better, and they can run the data centers more effectively,” he explained.

For Flanagan, that is the future for Siemens and the data center industry. As he puts it, software-defined operations will eventually dominate how data centers are run.