Overview of AI’s Environmental Impact
As AI continues to advance and expand, the industry must address the growing demand for electricity and water to power and cool the servers that make this technology possible. A standard DGX system, the gold standard for AI work, draws over 10 kW of power. Big Tech is expected to buy millions of these systems this year, collectively using more power than all of New York City, and with that comes a responsibility to find sustainable ways to manage the energy consumption. To mitigate the environmental impact, researchers and engineers are already developing innovative solutions.
The Growing Environmental Challenges in AI Technology
But it is not just the electricity needed to run these computers. They get hot, really hot, and that heat has to be removed. Cooling typically takes up to twice as much power as the computer itself, so that 10 kW machine can really draw around 30 kW when running. Taken together, these new servers are projected to consume three times more electricity than all of California used in 2022. To reduce this electricity usage, server farms are exploring alternative cooling methods, such as water cooling. While this shifts the resource burden, it also presents an opportunity to develop more efficient and eco-friendly cooling technologies.
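To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The 2x cooling overhead is simply the figure quoted above; real data-centre overheads vary by facility.

```python
# Rough estimate of the total draw of one AI server once cooling is included.
# The 2x cooling overhead is the figure quoted in the text; actual overheads vary.
server_power_kw = 10          # approximate draw of a single DGX-class system
cooling_overhead = 2.0        # cooling assumed to take ~2x the server's own power

total_power_kw = server_power_kw * (1 + cooling_overhead)
print(f"Effective draw per server: {total_power_kw:.0f} kW")   # -> 30 kW
```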
Sustainable Solutions for AI Energy and Water Usage
Water cooling saves electricity and helps cut costs, but it does so by drawing on our precious fresh water instead.
Case Studies: Effective AI Sustainability Techniques
AI is hungry for power, and things are going to get worse. How can we solve this problem? Fortunately, researchers are already pursuing more efficient methods of building and using AI. Four promising techniques are model reuse, ReLoRA, mixture-of-experts (MoE) models, and quantization.
Selecting Technologies for Sustainable AI
Model reuse involves retraining an already trained model for a new purpose, saving time and energy compared to training from scratch. This approach not only conserves resources but also often results in better-performing models. Both Meta (Facebook's parent company) and Mistral AI have been good about releasing models that can be reused.
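As a rough illustration of what model reuse looks like in practice, the sketch below fine-tunes a freely released checkpoint with the Hugging Face transformers library instead of training from scratch. The model name and dataset are illustrative placeholders, not the specific models discussed here.

```python
# Minimal sketch of model reuse: start from a released checkpoint and
# fine-tune it for a new task rather than pretraining a model from scratch.
# The model name and dataset are illustrative placeholders.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from datasets import load_dataset

model_name = "distilbert-base-uncased"        # stand-in for a reusable base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

dataset = load_dataset("imdb")                # stand-in for the new task's data
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, padding="max_length"),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="reused-model", num_train_epochs=1),
    # A small slice is enough to adapt the model -- a tiny fraction of the
    # compute (and energy) that pretraining the base model required.
    train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
)
trainer.train()
```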
ReLoRA and LoRA reduce the number of calculations needed when retraining models for new uses, further saving energy and enabling the use of smaller, less power-hungry computers. This means that instead of relying on large, energy-intensive systems like NVIDIA's DGX, a modest graphics card can often suffice for retraining.
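The sketch below shows the LoRA idea using the peft library: the base model is frozen and only small low-rank adapter matrices are trained, so far fewer gradients need to be computed and stored. The base model and hyperparameters are illustrative, not a recommended setup; ReLoRA extends the same mechanism by periodically merging the adapters back into the base weights and starting new ones.

```python
# Minimal LoRA sketch with the peft library: the base model stays frozen and only
# small low-rank adapter matrices are trained, which fits on a modest GPU.
# Base model and hyperparameters are illustrative.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")   # stand-in base model

lora_config = LoraConfig(
    r=8,                          # rank of the low-rank update matrices
    lora_alpha=16,
    target_modules=["c_attn"],    # GPT-2's attention projection layer
    lora_dropout=0.05,
)
model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
# Prints roughly: trainable params ~0.3M || all params ~124M || trainable% ~0.24
```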
MoE models, such as those recently released by Mistral, use only a fraction of their parameters for any given calculation, resulting in fewer calculations and reduced energy consumption compared with a conventional dense model of similar capability.
Moreover, an MoE model activates only the expert blocks it needs at any moment, much like turning off lights in unused rooms, leading to a 65% reduction in energy usage.
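The toy layer below sketches the routing idea behind MoE models: a small router scores the experts and only the top few run for each input. The dimensions and expert counts are illustrative and far smaller than anything Mistral ships.

```python
# Toy mixture-of-experts layer: a router picks the top-k experts per input,
# so only those "rooms" have their lights on for any given token.
# Dimensions, expert count, and top_k are illustrative.
import torch
import torch.nn as nn

class ToyMoELayer(nn.Module):
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        self.experts = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_experts))
        self.router = nn.Linear(dim, num_experts)
        self.top_k = top_k

    def forward(self, x):                          # x: (batch, dim)
        scores = self.router(x)                    # one score per expert
        weights, chosen = scores.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for b in range(x.size(0)):                 # run only the chosen experts
            for slot in range(self.top_k):
                expert = self.experts[int(chosen[b, slot])]
                out[b] += weights[b, slot] * expert(x[b])
        return out

layer = ToyMoELayer()
print(layer(torch.randn(4, 64)).shape)             # torch.Size([4, 64])
```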
Advantages of Energy-Efficient AI Models
Quantization is an innovative technique that reduces the size of AI models with minimal impact on performance. By quantizing a model, the number of bits required to represent each parameter is reduced. This shrinks the model size, enabling the use of less powerful and more energy-efficient hardware. For instance, a massive 40 billion parameter model would typically require an energy-hungry GPU system like the DGX to run efficiently. But with quantization, this same model can be optimized to run on a low-power consumer GPU, like those found in most laptops. While quantization can slightly reduce model accuracy in some cases, for many practical applications this tradeoff is negligible or unnoticeable. The performance remains excellent while requiring a fraction of the computing resources.
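The snippet below sketches the basic idea with a simple 8-bit scheme: each float32 weight is mapped to an 8-bit integer plus a shared scale factor, cutting storage roughly fourfold. Production toolchains use more elaborate 4-bit, block-wise methods, so treat this as an illustration of the principle rather than a recipe.

```python
# Minimal sketch of post-training quantization: map float32 weights to int8
# with a single per-tensor scale, shrinking storage (and memory traffic) ~4x.
import numpy as np

weights = np.random.randn(4096, 4096).astype(np.float32)   # toy weight matrix

scale = np.abs(weights).max() / 127.0            # map the observed range onto int8
quantized = np.round(weights / scale).astype(np.int8)
dequantized = quantized.astype(np.float32) * scale          # what inference sees

print(f"float32 size:  {weights.nbytes / 1e6:.0f} MB")      # ~67 MB
print(f"int8 size:     {quantized.nbytes / 1e6:.0f} MB")    # ~17 MB
print(f"max abs error: {np.abs(weights - dequantized).max():.4f}")
```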
The Impact of Sustainable Practices in AI on Industry
Overall, quantization provides a way to make AI models much more efficient, compact and eco-friendly, minimizing the hardware requirements and energy consumption. This allows state-of-the-art AI to run on ubiquitous consumer devices while maintaining accuracy where it matters most. Quantization represents an important step towards scalable and sustainable AI.
Current Status of Sustainable AI Developments
By combining these four techniques, we have successfully reused a 47 billion parameter MoE model and retrained it for a client using a server that consumes less than 1 kW of power, completing the process in just 10 hours. Furthermore, the client can run the model on standard Apple Mac computers with energy-efficient Apple silicon M2 chips. At smartR AI, when developing and training new models, such as our generative AI loyal companion SCOTi® AI, we have been privileged to be able to use the supercomputer at EPCC, Edinburgh University, which substantially reduces the time required to train models; we trained one model from scratch in roughly an hour.
Timeline of Advances in AI Sustainability
As AI becomes more prevalent, we all need to start thinking more proactively about its energy and water usage. Research into more efficient training and utilization methods is yielding promising results, but we also need to put these methods into practice: by integrating these techniques into our tool flows, we not only benefit our clients but also contribute to a more sustainable future for our planet.