Sustainability is key for enterprise AI programmes to make a positive difference


AI-based business applications, leveraging algorithms and modelling to turn data into actionable insights, are growing in both maturity and popularity.

Whether these applications are protecting organisations from cyber attack, predicting future consumer behaviour, optimising supply chains, redefining product design or forecasting business risks, enterprise AI is on the brink of becoming mainstream, and will dominate the way organisations make decisions - and revenue - for many years to come. 

Yet while AI offers many benefits to business - and with it, society - it is worth bearing in mind that these models are significant generators of carbon emissions. Indeed, in 2019, researchers at the University of Massachusetts found that the carbon emissions from training a single large AI model can be up to five times greater than the amount produced during the whole lifetime of a car. Or, to put it another way, about the same as 300 round-trip flights between New York and San Francisco.

By its very nature, AI has a voracious appetite for power. It takes a huge amount of computational power to train deep learning applications and run analytics on massive datasets. According to OpenAI researchers, the quantity of computing power used for deep learning research doubles every 3.4 months. All of this computational power equates to markedly high energy consumption. While it's true that some AI applications are playing a significant, positive role in tackling climate change, it is impossible to ignore the negative impact these workloads have on the environment.
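To put that doubling rate in perspective, a rough sketch of the implied growth (the 3.4-month figure is from the source; the function and its name are purely illustrative):

```python
# A 3.4-month doubling time for deep learning compute implies roughly
# 11.5x growth in compute demand every year. Illustrative sketch only.
DOUBLING_MONTHS = 3.4

def growth_factor(months: float) -> float:
    """Compute-demand multiplier after a given number of months,
    assuming a constant 3.4-month doubling time."""
    return 2 ** (months / DOUBLING_MONTHS)

print(round(growth_factor(12), 1))   # roughly 11.5x after one year
print(round(growth_factor(24), 1))   # and over 100x after two
```

This compounding is why energy consumption, rather than hardware cost alone, is becoming the binding constraint on large-scale AI.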

With this in mind, organisations across all sectors need to make a collective effort to use AI without sacrificing the environment, placing as much focus on efficiency as on progress.

Fortunately, there are a host of approaches sustainably-minded organisations can consider to minimise the carbon footprint of their AI initiatives. The first step is to ensure that the infrastructure that powers their innovation is housed in data centres that can efficiently handle the high-density compute involved.

Less than 20 percent of enterprise hardware - and an even smaller proportion of the hardware running AI models - needs to be located close to an organisation's headquarters or other business operations. This means organisations can strategically locate their high-intensity workloads in data centres that are optimised for these workloads and where they can be powered by renewable energy sources. This includes countries such as Iceland, where 100 percent of electricity is generated by hydroelectric and geothermal energy.
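The impact of workload placement can be sketched with back-of-envelope arithmetic: for the same training run, emissions scale directly with the carbon intensity of the local grid. All figures below are illustrative assumptions, not measured values for any particular grid or model:

```python
# Back-of-envelope estimate: a training run's carbon footprint is its
# energy draw multiplied by the grid's carbon intensity. All numbers
# here are hypothetical, chosen only to illustrate the scale difference.

def training_emissions_kg(energy_kwh: float, grid_kg_co2_per_kwh: float) -> float:
    """Estimated CO2 emissions (kg) for a run drawing energy_kwh from a
    grid with the given carbon intensity (kg CO2 per kWh)."""
    return energy_kwh * grid_kg_co2_per_kwh

run_kwh = 100_000          # hypothetical 100 MWh training run

fossil_heavy_grid = 0.5    # assumed coal/gas-dominated grid (kg CO2/kWh)
renewable_grid = 0.01      # assumed near-100% hydro/geothermal grid

print(training_emissions_kg(run_kwh, fossil_heavy_grid))  # 50000.0 kg
print(training_emissions_kg(run_kwh, renewable_grid))     # 1000.0 kg
```

Under these assumed figures, the identical workload emits fifty times less carbon simply by running on a renewable grid - which is the core argument for siting compute strategically.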

There are also environmental advantages that can be gained within the data centre, starting with how the hardware is cooled. On average, cooling IT equipment accounts for around 40 percent of a data centre's total energy consumption, yet there are alternative methods - such as water cooling - that can reduce the carbon cost of preventing hardware from overheating. Such methods are becoming more popular and necessary in data centres as demand grows for ever-increasing compute power. Data centres located in cooler climates, such as the Nordic countries and Iceland, also offer free-air cooling, which can drastically reduce the carbon cost of running AI models.

Balancing sustainability with operational requirements

While CIOs will of course be mindful of their organisations' ESG goals and commitments to reducing carbon emissions, they have to balance these environmental objectives with the need to ensure their infrastructures are resilient, scalable and fit-for-purpose. Selecting a data centre for its green credentials alone is not enough; it must also be able to support the highly specific and growing demands of high-density compute.

A key priority for organisations growing their reliance on AI is that the infrastructure supporting their workloads can scale alongside both their datasets and their overarching business ambitions. These enterprises should house their workloads inside data centres that are optimised to support high-density computing, that provide the flexibility to grow, and that are supported by a team of on-the-ground experts with the skills required to ensure maximum performance at all times.

To learn about how Verne Global’s sustainable data centre solutions can power your next AI innovations, please visit our dedicated AI webpages. 


Verne Global

Verne Global provides sustainable data centre solutions that are specifically designed, engineered and optimised to support high-intensity compute workloads such as AI, machine learning, high performance computing (HPC) and supercomputing.

Located in Keflavik, Iceland, the company's 40-acre data centre campus is powered by 100 percent renewable energy sources. By leveraging Iceland's clean and reliable power grid, as well as a predictable climate that ensures 24/7/365 free cooling, Verne Global enables organisations to cut costs, carbon emissions and energy usage. Furthermore, the company's specialist team of experts provides on-site, around-the-clock support, helping customers continually maximise the performance of their workloads.

Founded in 2012, Verne Global's campus today powers key innovations within some of the world's most demanding industries, including financial services, earth sciences, life sciences, engineering, scientific research and AI. Its customers range from industry giants to pioneering young start-ups, and include BMW, Volkswagen, Earlham Institute, DeepL, Threatmetrix and Wirth Research.
