Developing a nuclear strategy to power data centers

Microsoft is seeking a “Principal Program Manager for Nuclear Technology” to develop a strategy for powering its own cloud and AI data centers with small nuclear reactors. Other public and private operators of large cloud infrastructure face the same question, as power-hungry AIs add to an already exponentially growing mountain of data – and with it the need for more computing power, more storage and, above all, more electricity. What can companies and individuals do to slow, or even reverse, this hunger for resources?

Microsoft is the first cloud giant to publicly declare a nuclear strategy, aiming to become more independent of fossil fuels and to secure the kind of dense, reliable power it says the future cloud and AI will need. The vacancy specifies that the manager should focus on so-called Small Modular Reactors (SMRs) and develop an energy strategy built around microreactors. SMRs are cheaper to build, easier to site and less risky than conventional reactors, and they emit no CO2 in operation. A single unit can generate up to 35 MW, so four such reactors would probably be enough to power a data center.

This intermediate step seems necessary because two effects reinforce each other. The global energy crisis triggered by the war in Ukraine has exposed dependencies and driven up energy prices. At the same time, the rapid success of ChatGPT and other AIs based on large language models has fueled the appetite for data and computing resources. Nvidia’s data center revenue – the segment covering its AI hardware – rose to $10.32 billion, up 171 percent from the previous year.

Now Microsoft is the first major software vendor to seek a way of powering its growing infrastructure without compromising its own carbon targets. Every major vendor – Apple, Alibaba, AWS, Google, IBM and the rest – will have to answer the same question. None of them will want to miss the AI trend, and all have publicly stated sustainability goals – often driven by their own governments – to achieve.

But private companies with their own large cloud infrastructure face the same dilemma. Their AI is trained on their own data to offer customers intelligent services or software-driven products. Tesla’s big AI project on autonomous driving is perhaps the best known of these. “It’s like ChatGPT, but for cars,” says Dhaval Shroff, a member of the manufacturer’s Autopilot team, describing the approach. For many reasons – protecting intellectual property being the biggest – these projects run on internal resources, keeping the learning AI, and with it the essence of the business, within the company’s own walls.

The exponentially growing energy needs of this new infrastructure run counter to the political objectives of numerous global and European initiatives such as COP26 and the 2019 “European Green Deal”, which aims to make Europe climate neutral by 2050. It is followed by the “European Digital Strategy”, which calls for data centers to be climate neutral by 2030. European Commission Executive Vice-President Margrethe Vestager put it bluntly: “We cannot allow our electricity consumption to increase unchecked.” The International Energy Agency says emissions from data centers worldwide should be at least halved by 2030 – a target set before the sudden expansion of AI began to drive up computing and data volumes. Data center owners must take both challenges seriously.

Mark Molyneux

CTO for EMEA at Cohesity.

Address a cause

Data growth is accelerating again, because an AI simply learns faster the more information it can evaluate. In more than half of all companies, the volume of data is already growing by an average of 50 percent per year. Yet most companies have cluttered their IT infrastructure with data, and on average they do not know what 70 percent of it contains. This unstructured “dark data” holds cat videos, the menu from the last Christmas party, outdated database copies and old research results, all mixed in with data that must be retained for regulatory and commercial purposes.
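To make that 50 percent figure concrete, a short sketch of how quickly such an estate compounds (the 100 TB starting size is a hypothetical value chosen for illustration):

```python
# Compound growth: size after n years = size_0 * growth_rate ** n
start_tb = 100   # hypothetical starting estate of 100 TB
growth = 1.5     # 50 percent growth per year

for years in (1, 3, 5):
    size = start_tb * growth ** years
    print(f"after {years} year(s): {size:.0f} TB")
```

At this rate, a 100 TB estate grows to roughly 759 TB within five years – nearly an eightfold increase.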

This data needs to be cleaned up, and not just to reduce the risk of lawsuits. Anyone who cleans up and disposes of data waste can feed their AI with high-quality content and free up space for new data. To do this, the data must first be indexed and classified by its content and its value to the business. Here too, AI plays a key role, classifying content quickly and accurately.
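As an illustration only – production classification engines use trained models and far richer metadata, not keyword lists – here is a minimal rule-based tagging sketch; the rule names and sample files are hypothetical:

```python
import re

# Hypothetical retention rules; real classifiers use ML models
# trained on document content rather than keyword matching.
RULES = {
    "regulatory": re.compile(r"invoice|contract|gdpr|audit", re.I),
    "research":   re.compile(r"experiment|dataset|results", re.I),
}

def classify(name: str, text: str) -> str:
    """Tag a document by name and content; unmatched files are 'dark'."""
    for label, pattern in RULES.items():
        if pattern.search(name) or pattern.search(text):
            return label
    return "dark"

files = {
    "q3_invoice.pdf": "Invoice for cloud services, Q3.",
    "party_menu.txt": "Christmas party menu: mulled wine, cookies.",
}
labels = {name: classify(name, text) for name, text in files.items()}
# Files tagged 'dark' become candidates for review and deletion.
```

The point of the sketch is the workflow, not the rules: once every object carries a class label, retention and deletion policies can be applied automatically instead of file by file.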

Companies need to consolidate their data on a common platform instead of continuing to operate dozens or even hundreds of separate silos. There, the data can be further reduced using standard techniques such as deduplication and compression; in practice, reductions of up to 96 percent are achievable.
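The mechanics behind such reduction figures can be sketched in a few lines. This is a toy content-addressed store, not any vendor’s implementation – real platforms deduplicate at variable block level and across machines – but it shows why redundant data shrinks so dramatically:

```python
import hashlib
import zlib

def store(chunks: list[bytes]) -> dict[str, bytes]:
    """Deduplicate chunks by content hash, then compress the unique ones."""
    index: dict[str, bytes] = {}
    for chunk in chunks:
        key = hashlib.sha256(chunk).hexdigest()
        if key not in index:          # identical chunks are stored only once
            index[key] = zlib.compress(chunk)
    return index

# Ten identical ~1 KB blocks collapse to a single compressed entry.
blocks = [b"backup block " * 80] * 10
index = store(blocks)

raw_size = sum(len(b) for b in blocks)
stored_size = sum(len(v) for v in index.values())
savings = 1 - stored_size / raw_size  # well above 90% for this input
```

Actual ratios depend heavily on how redundant the data is; backup workloads with many overlapping copies are where figures like 96 percent come from.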

What every individual can do

Each user can help reduce overall energy consumption and slow down data growth, because anyone can search their data in the cloud and delete what is useless. These could be umpteen versions of the same photo taken from a slightly different angle, or videos you once found funny and have never watched since. That cat video, perhaps. Every bit we save in stored data reduces energy consumption. So let’s start cleaning up.
