🌅 Quantum Data Centers: The Next Horizon

And Meta’s $8.5bn/quarter data center habit, the summer of liquid cooling, and more!

Here’s what you should know today:

  • COOLING, TECH, AND POWER: Quantum Data Centers are the next frontier, DCF’s “Summer of Liquid Cooling”, and Nvidia’s robot algorithms

  • 🎧 LISTEN TO THIS: Yotta predicts the future with Digital Realty

  • BIG DEALS: Meta’s $8.5bn/quarter data center habit, Google builds amid water concerns, Stonepeak + CoreSite DC

- Cooling, tech, and power -

Why Quantum Data Centers are the Next Frontier

Quantum processors promise to revolutionize how we process data. However, integrating them into traditional data centers will be one of the industry’s next formidable challenges.

The challenge lies in the technical and environmental mismatches between classical and quantum computing paradigms. Quantum hardware is susceptible to noise and vibrations, making the typical data center a hostile environment. Additionally, the software architectures of classical and quantum systems don’t naturally align, creating further complications.

On-Premises Quantum Computing

Yuval Boger, Chief Commercial Officer at QuEra Computing, highlights the emerging demand for quantum computing: “Had you asked me two years ago, I would have said you’d be crazy to have an on-premises quantum computer. But it turns out that’s what everyone wants.”

This shift is partly driven by the need to minimize latency and keep sensitive data local. Traditionally, quantum hardware companies have offered cloud-based access to their devices, but the recent trend is toward installing them directly in data centers. However, the data center environment is challenging for quantum computers, which rely on stable conditions to maintain their delicate quantum states.

Jonathan Burnett, Technical Director at Oxford Quantum Circuits, underscores this issue: “You’ve actually got, in many ways, a horrible environment for anything quantum.”

Software Hurdles and Adaptations

The software side presents additional hurdles. Quantum computers operate on timescales of seconds, unlike the days-long computations typical of HPC tasks, necessitating software solutions to manage workloads efficiently.

Existing HPC facilities use workload managers that are not well-suited for the quick bursts of processing characteristic of quantum computers. To bridge this gap, companies like Oxford Quantum Circuits have developed workarounds, such as placing a software node in front of the quantum computer to mimic an HPC resource to the workload manager, while a separate job queue handles the quantum computations.
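
To make that concrete, here is a minimal Python sketch of the proxy pattern described above. Every name in it is an assumption invented for this newsletter, not Oxford Quantum Circuits’ actual software or any real workload manager’s API.

    import queue
    import threading
    import time

    # A separate queue for quantum work, decoupled from the HPC
    # workload manager's own scheduling.
    quantum_jobs = queue.Queue()

    def submit_from_workload_manager(job):
        # What the HPC scheduler "sees": a node that accepts a job
        # like any other compute resource, but merely enqueues it.
        quantum_jobs.put(job)

    def qpu_worker():
        # Drains the quantum queue in quick bursts; each job takes
        # seconds, regardless of the scheduler's long-job assumptions.
        while True:
            job = quantum_jobs.get()
            print(f"running circuit {job['circuit_id']} on the QPU")
            time.sleep(1)  # stand-in for a seconds-scale quantum run
            quantum_jobs.task_done()

    threading.Thread(target=qpu_worker, daemon=True).start()
    for i in range(3):
        submit_from_workload_manager({"circuit_id": i})
    quantum_jobs.join()

The value of the indirection is that the workload manager never has to understand quantum timescales: it schedules the proxy like any long-lived resource while the actual quantum jobs churn through their own queue.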

Future Integration and Evolution

David Rivas, CTO at Rigetti Computing, notes the evolving nature of quantum hardware integration: “We have to build quantum computing systems, not just qubits.” 
As quantum computers advance, the control systems that manage them will increasingly resemble HPC systems, facilitating deeper integration. This integration will allow quantum processors to communicate with classical resources over high-speed connections, potentially speeding up hybrid algorithms.
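
As a toy illustration of such a hybrid loop, here is a short sketch built around a mocked-up QPU call; the cosine “energy landscape” and all names are assumptions made for this example, not Rigetti’s stack. The classical side proposes circuit parameters, the “quantum” side returns a measured value, and it is exactly this round trip that faster connections would accelerate.

    import math
    import random

    def qpu_expectation(theta):
        # Stand-in for a QPU call: a real system would execute a
        # parameterized circuit and return a measured expectation value.
        return math.cos(theta)

    theta = random.uniform(0, 2 * math.pi)
    best = float("inf")
    for _ in range(100):
        candidate = theta + random.gauss(0, 0.1)  # classical proposal
        energy = qpu_expectation(candidate)       # quantum evaluation
        if energy < best:                         # classical accept step
            theta, best = candidate, energy
    print(f"estimated minimum {best:.3f} at theta = {theta:.3f}")

Each iteration is a full round trip between classical and quantum hardware, which is why low-latency links between the two matter so much for hybrid algorithms.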

Looking Ahead

Ultimately, as quantum computers become more powerful and their applications more diverse, customers will demand lower-level access to maximize performance. This will necessitate bespoke tooling to support these advanced hybrid computing environments.

As Rivas emphasizes, the future of quantum computing lies in creating comprehensive systems that integrate seamlessly with existing infrastructure, transforming these cutting-edge devices from scientific experiments into practical, robust computing solutions.

_____________________________________

More Cooling, Tech, and Power to Explore

  1. DCF covers the Liquid Cooling Coalition and the “Summer of Liquid Cooling”. Data Center Frontier just released a new piece on the launch of the “Liquid Cooling Coalition”, along with major developments in the liquid cooling industry in 2024.
    The piece touches on the growth of immersion cooling and recent innovations by Intel, Submer, Shell Lubricants, and others. If you’re interested in this technology, it’s a great breakdown of the current market.

  2. Microsoft hopes to transform how data centers use water. In its latest sustainability brief, Microsoft describes its goal of becoming water-positive by 2030.

    From optimizing cooling systems to integrating advanced technologies like direct-to-chip cooling, the company has reduced water intensity by over 80% since the early 2000s. Its efforts include using reclaimed water and harvesting rainwater, and it has implemented predictive models to minimize inefficiencies.

  3. Nvidia announces new AI tools to accelerate robot development. 

🎧 Listen to This:

Yotta does a fantastic podcast featuring George Rockett and various industry juggernauts. This episode with Digital Realty is a particularly good one.

- Big Deals -

Meta has an $8.5bn/Quarter Data Center Habit

Meta is investing heavily in AI infrastructure, with capital expenditure reaching $8.5 billion in the quarter it reported in July. This surge in spending is driven by the need for more servers, data centers, and networking equipment.

Revenue for Meta hit $39.07 billion, marking a 22% increase from last year and surpassing analyst expectations. The company projects next quarter’s revenue to be between $38.5 billion and $41 billion. Net income saw a significant jump to $13.46 billion, up 73%.

Despite coming in below analyst expectations of $9.51 billion, capex still grew 32.8% year-on-year. CFO Susan Li explained the strategy: “We’re employing a strategy of staging our data center sites at various phases of development.” This approach provides flexibility in meeting demand while controlling long-term spending.

Meta is preparing for substantial capex growth in 2025 as well. The company plans to leverage the infrastructure built for generative AI training for other purposes, such as ranking and recommendations. Li emphasized, “There’s a whole host of use cases for the life of any individual data center.”

CEO Mark Zuckerberg highlighted the future needs for AI development: “The amount of compute needed to train Llama 4 will likely be almost 10× more than what we used to train Llama 3.”

Meta’s rapid expansion of its AI capabilities carries some risk of overbuilding infrastructure in the short term (the company does not expect to profit from AI in 2024), but Zuckerberg has indicated he would rather have too much compute built than not enough.

_____________________________________

More Big Deals:

  • Groundbreaking announced for 2 Google DCs amid water usage concerns: Google is set to start building two new data centers in Dorchester County, S.C., amid ongoing protests over the tech giant's water usage. Google's existing Berkeley County center consumed 763.4 million gallons of water in 2023, drawing criticism from environmental groups for its impact on local water resources.

  • Synergy Research: Enterprise cloud spending topped $79B in Q2: This marks a 22% increase from last year. The Big Three—Amazon, Microsoft, and Google—dominate the market, with AWS leading at 32%. Notably, Oracle is making strides, now tied with Salesforce as the fifth largest cloud provider, while APAC leads regional growth at 25% YoY.

    (Chart: Synergy Research Group)

  • Stonepeak x CoreSite = New Colorado Data Center: Stonepeak, a leading alternative investment firm specializing in infrastructure and real assets, and CoreSite are teaming up to build an 18 MW data center in Denver, Colorado.
    Stonepeak will hold 85% of the joint venture, with CoreSite operating the facility. The $250 million project will expand CoreSite's Denver presence. Stonepeak manages approximately $71.2 billion in assets, focusing on defensive, hard-asset businesses globally.

Thank you for reading. Please let us know how we did by replying to this email.

Also, you can support this newsletter by sharing it with a friend or colleague.

-Taylor