AI is poised to take over the data center

There’s been a lot of hype around artificial intelligence (AI) lately, but a new Gartner report indicated that talk hasn’t turned into IT spending – yet. Once the money does start to flow in AI’s direction, Gartner predicted it will go to hardware, software and services that companies are already using. Taken together with a separate Dell’Oro Group forecast, the report suggests data centers will be at the heart of the AI transformation.

Cyxtera SVP and Field CTO Holland Barry told Silverlinings that the current AI bonanza has been a long time in the making. “The companies that are in part powering a lot of these large language model [LLM] solutions, we saw the seeds getting planted a couple years ago,” he said. Over the last 12 months, customers began “really asking for a lot of capacity.”

Barry said he doesn’t really see that changing anytime soon.

“We’re privy to RFPs, really large, to just deploy these massive farms of GPUs to do LLM activities. I don’t see that slowing down in the next couple of years,” he said.

His comments mesh with Dell’Oro’s report, which predicted 20% of Ethernet data center switch ports will be connected to AI servers by 2027. Further, Dell’Oro forecast AI will account for an even greater percentage of server revenue, approaching 50% in the same timeframe.

“Our current market sizing does not even encompass the full potential growth propelled by the rise of new generative AI applications and Large Language Models (LLMs),” Dell’Oro VP Sameh Boujelbene said in a statement.

“These groundbreaking AI applications necessitate a distinct AI network known as the back-end network, which serves as the infrastructure to connect all accelerated servers,” she continued. “This back-end network differs from the front-end network used to connect general-purpose servers, which is the main scope of this current Ethernet data center switch report.”

Ch-ch-changes

Cyxtera is one of the world’s largest data center providers, with over 60 facilities across more than 30 markets, and thus has a front-row seat to all the AI activity.

Right now, Barry said, a lot of the large-scale AI deployments in its data centers sit on the colocation side of things — that is, Cyxtera provides space within its data center and the customer builds out its own architecture. A customer can, for instance, run its workloads over 400G connections without forcing Cyxtera to upgrade its entire facility.

But while 25G and 100G are sufficient for most enterprise workloads today, Barry said he believes AI will drive conversations around capacity and availability — and is indeed already starting to. It’ll also require thoughtful consideration of how these workloads might impact Cyxtera’s internal SDN fabric and its ability to connect disparate points within the data center and beyond.

“These workloads demand a lot more from a network pipe perspective than other types of workloads. So how are we architecting a larger core, how are we facilitating 25-gig, 100-gig, up to 400-gig connectivity between compute nodes and storage,” he said. “We’re already seeing it.”

Barry said the shift is also pushing Cyxtera to rethink how it cools its racks, as it sees increased demand for liquid cooling and nascent interest in immersion cooling.

In a blog post, Dell’Oro’s Boujelbene also tipped the AI trend to drive momentum behind open standards and increased adoption of Ethernet network chips for AI workloads in place of today’s InfiniBand technology.
