MIT Researchers Are Searching for Ways AI Data Centers Can Be More Eco-Friendly

Amid the AI boom, the concerns people talk about most are the threats to privacy and employment, as AI may soon be advanced enough to take over jobs. Another issue, however, is being overlooked: how the data centers behind AI are affecting the environment.

MIT Researchers Are Working on a Solution

With so many people using AI models, data centers need far more power to process all those requests, and their carbon footprint is expanding as a result. That footprint comes from electricity consumption, water used for cooling, and other factors.

There are no signs of this slowing down anytime soon. If anything, usage will increase as more AI models and tools are developed. Reports say that cloud computing now contributes up to 3.7% of global greenhouse gas emissions.

The Lincoln Laboratory Supercomputing Center (LLSC) at the Massachusetts Institute of Technology (MIT) saw how data centers are affecting the environment and set out to find ways to decrease that impact.

Vijay Gadepally, the senior LLSC staff member leading the research, says that energy-aware computing is not really a research area because everyone has been holding on to their data, adding, "Somebody has to start, and we're hoping others will follow."

The team is doing this by capping the power draw of GPUs, the processors that run AI models. Through this method, the researchers managed to reduce the energy consumption of an AI model by 12 to 15%, as per Interesting Engineering.

However, capping power means AI models take more time to train. In one experiment, the researchers found that Google's BERT language model saw a two-hour increase in training time when GPU power was capped at 150 watts.
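For readers curious how such a cap works in practice, Nvidia GPUs expose a software-adjustable power limit through the company's NVML management library. The snippet below is a minimal sketch rather than the LLSC team's actual tooling; it assumes the pynvml Python bindings are installed and that the driver allows the limit to be changed, which typically requires administrator privileges.

```python
# Minimal sketch: cap every Nvidia GPU's power draw at 150 W via NVML.
# Assumes the pynvml package is installed; this is an illustration,
# not the LLSC team's actual tooling.
import pynvml

CAP_WATTS = 150  # the limit used in the BERT experiment described above

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        # NVML expresses power limits in milliwatts; clamp the target
        # to the range the hardware actually supports.
        min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
        target_mw = max(min_mw, min(CAP_WATTS * 1000, max_mw))
        pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)  # needs admin rights
        print(f"GPU {i}: power limit set to {target_mw // 1000} W")
finally:
    pynvml.nvmlShutdown()
```

On shared clusters, the same cap is commonly applied from the command line with `nvidia-smi -pl 150`, which relies on the same NVML mechanism under the hood.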

For context, OpenAI uses a lot of processing power for its large language models. Training GPT-3 is estimated to have consumed 1,300 megawatt-hours of electricity across the approximately 10,000 Nvidia GPUs used to train the model.
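Taken at face value, those figures allow a quick back-of-envelope calculation. The even split below is a simplifying assumption for illustration; it ignores cooling and other data center overhead.

```python
# Back-of-envelope using only the figures cited above; assumes the
# 1,300 MWh is split evenly across GPUs and ignores cooling overhead.
total_mwh = 1_300   # estimated electricity to train GPT-3
num_gpus = 10_000   # approximate number of GPUs used

kwh_per_gpu = total_mwh * 1_000 / num_gpus
print(f"~{kwh_per_gpu:.0f} kWh per GPU")  # ~130 kWh per GPU

# What a 12-15% saving from power capping would mean at that scale:
low, high = 0.12 * total_mwh, 0.15 * total_mwh
print(f"~{low:.0f}-{high:.0f} MWh potentially saved")  # ~156-195 MWh
```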

OpenAI's Water Consumption

A study showed that OpenAI needed a lot of water to keep its data centers cool while training ChatGPT. A single person's session with the AI tool is roughly equivalent to pouring out a large bottle of water, and overall, GPT-3 consumed an estimated 700,000 liters of water during training.
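To put the training estimate in more familiar terms, the conversion below uses only the 700,000-liter figure above; the half-liter "large bottle" size is an assumed reference point, not a number from the study.

```python
# Convert the cited 700,000 L training estimate into bottle equivalents.
# The 0.5 L bottle size is an assumption for illustration, not from the study.
training_liters = 700_000
bottle_liters = 0.5

bottles = training_liters / bottle_liters
print(f"~{bottles:,.0f} half-liter bottles")  # ~1,400,000 bottles
```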

It goes without saying that data centers' massive water consumption can harm the environment and divert resources from communities that need the water. Researchers are already looking for ways to make cooling systems more efficient.

Researchers from the University of California, Riverside and the University of Texas at Arlington, authors of the paper "Making AI Less 'Thirsty,'" said that the freshwater needed to train GPT-3 is equivalent to the amount needed to fill a nuclear reactor's cooling tower, as per Gizmodo.

With AI models and tools becoming more advanced, and demand for better AI technology growing, there is a real possibility that AI operations will become unsustainable, which could in turn stunt development.
