Artificial Intelligence and Carbon Footprint

To cut it short.
Be green. Try Storykube.
Global warming, climate change and the impact of AI on them
When we talk about global warming and climate change, the first thing that comes to mind is transportation: cars, trucks, ships, trains, and planes all burn fossil fuels. Then there's electricity production, a well-known consumer of coal and natural gas and a major source of greenhouse gas emissions. Industrial areas and production plants of all sorts are big carbon emitters as well, since some of the chemical reactions needed to turn raw materials into goods release carbon dioxide. But other fields have a large carbon footprint too: the commercial and residential sector, farming and agriculture, the food industry, land use and forestry.
We know there's a common goal: to bring net emissions to zero by 2050 and limit average global warming to 1.5 °C. In this catastrophic scenario, artificial intelligence has its own place. AI is rapidly taking on a pivotal role in every aspect of our lives, from research to healthcare, from website chatbots and virtual assistants to streaming platforms, smart roads and smart homes. This use of AI to improve our lives has a cost: AI models consume a tremendous amount of energy, so their impact on global warming and climate change is large and, above all, rarely talked about.
The process of training AI models to spot patterns in data and then deploying them drives the rising carbon footprint of AI and deep learning. That's because every piece of data processed during training corresponds to a neural network performing a long sequence of mathematical operations, and the large datasets these models learn from multiply that computation and the energy it consumes. Training modern AI algorithms therefore requires massive amounts of computational power, and with it vast quantities of electricity, which can make AI training environmentally unsustainable.
Recently, researchers at the University of Massachusetts Amherst conducted a study analyzing several natural language processing (NLP) models to estimate the energy, in kilowatt-hours, required to train them. Converting this energy consumption into approximate carbon emissions and electricity costs, they estimated that the carbon footprint of training a single large language model amounts to around 300,000 kg of carbon dioxide. The paper puts the emissions associated with training a large transformer model, of the kind BERT and GPT-2/3 belong to, at about five times the lifetime emissions of the average US car, including its manufacture.
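To make the conversion concrete, here is a minimal back-of-the-envelope sketch of how energy consumption can be turned into approximate emissions and electricity costs. It is not the researchers' methodology; the GPU count, power draw, training time, grid intensity and electricity price are all assumed placeholder figures.

```python
# Back-of-the-envelope conversion from training energy to CO2 and cost.
# Every input figure below is an illustrative assumption, not data from the study.

def training_footprint(energy_kwh: float,
                       grid_kgco2_per_kwh: float,
                       price_usd_per_kwh: float) -> tuple[float, float]:
    """Return (kg of CO2 emitted, electricity cost in USD) for a training run."""
    co2_kg = energy_kwh * grid_kgco2_per_kwh
    cost_usd = energy_kwh * price_usd_per_kwh
    return co2_kg, cost_usd

# Hypothetical example: a month-long multi-GPU training run.
gpu_count = 8
avg_power_kw_per_gpu = 0.3          # assumed average draw per accelerator
training_hours = 24 * 30            # assumed one month of training
energy_kwh = gpu_count * avg_power_kw_per_gpu * training_hours

co2_kg, cost_usd = training_footprint(
    energy_kwh,
    grid_kgco2_per_kwh=0.4,         # assumed grid carbon intensity
    price_usd_per_kwh=0.12,         # assumed electricity price
)
print(f"{energy_kwh:.0f} kWh -> {co2_kg:.0f} kg CO2, ${cost_usd:.0f} in electricity")
```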
When calculating AI's footprint, the factors to analyze are therefore the algorithm, the processor, the data center and the energy mix. The same factors can also lead to a solution, by going straight to the source of the problem. There's a movement called Green AI that looks for ways to make machine learning cleaner and greener. Let's see how.
The emissions incurred in training a neural network model, as the research mentioned above states, also depend on the location of the training server and the energy grid it uses, the length of the training procedure, and the hardware on which the training takes place. Hardware is also a big producer of heat, raising temperatures in data centers to extremely high levels. These centers therefore require significant energy for cooling, with constant adjustments to air temperature, pressure and humidity. Software firm Vigilent developed an AI-powered platform to control and automate the cooling of servers. The platform is linked to wireless sensors that monitor the data center environment in real time. This technology was applied in a 5 MW, 100,000-square-foot Evoque (AT&T) data center in the Chicago area. Over three months, the results were clear: the data center cut its fan energy consumption from 430 kW to 300 kW, a roughly 30% decrease, and the annual energy savings were estimated at nearly $194,000. Cooling-related carbon emissions dropped by 23%, and only one out of six chillers needed to be operational.
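The Vigilent platform itself is proprietary, so purely as an illustration of the idea of sensor-driven cooling control, here is a toy sketch that uses a simple proportional rule in place of a learned model; the setpoint, power range, gain and simulated sensor readings are all assumptions.

```python
import random
import time

# Toy illustration of sensor-driven cooling control; every number here is an
# assumption made up for this sketch, not data from the Vigilent deployment.
TARGET_INLET_TEMP_C = 24.0
MIN_FAN_KW, MAX_FAN_KW = 100.0, 430.0
GAIN_KW_PER_DEGREE = 25.0  # how aggressively fan power reacts to overshoot

def read_inlet_temps() -> list[float]:
    """Stand-in for the wireless sensors: returns simulated rack inlet temps."""
    return [random.uniform(21.0, 27.0) for _ in range(12)]

def next_fan_power(temps: list[float], current_kw: float) -> float:
    """Simple proportional rule: raise fan power when the hottest rack is above
    the setpoint, lower it when everything is comfortably below."""
    error = max(temps) - TARGET_INLET_TEMP_C
    proposed = current_kw + GAIN_KW_PER_DEGREE * error
    return max(MIN_FAN_KW, min(MAX_FAN_KW, proposed))  # clamp to a safe range

fan_kw = MAX_FAN_KW
for _ in range(10):                      # ten control cycles for the demo
    temps = read_inlet_temps()
    fan_kw = next_fan_power(temps, fan_kw)
    print(f"hottest inlet {max(temps):.1f} °C -> fan power {fan_kw:.0f} kW")
    time.sleep(0.1)                      # a real loop would poll every minute or so
```

A production system like the one described above would replace the proportional rule with a model learned from sensor history, but the control loop structure is the same.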
Regarding algorithms, some are less power-hungry than others, and reusing existing models instead of training new ones from scratch also saves energy.
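As a minimal sketch of model reuse, the following example fine-tunes only a small classification head on top of an existing pretrained model (DistilBERT is used here purely as an example), leaving the pretrained encoder frozen; the model choice, task and hyperparameters are assumptions, and the snippet assumes the transformers and torch packages are installed and that the weights can be downloaded.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Reuse an existing small pretrained model and fine-tune only a tiny head,
# instead of training a large model from scratch.
name = "distilbert-base-uncased"                       # example model choice
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=2)

for param in model.distilbert.parameters():            # freeze the pretrained encoder
    param.requires_grad = False

optimizer = torch.optim.AdamW(
    [p for p in model.parameters() if p.requires_grad], lr=5e-5
)

# Dummy batch standing in for a real dataset.
batch = tokenizer(["a short example sentence", "another one"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([0, 1])

outputs = model(**batch, labels=labels)                # loss is computed internally
outputs.loss.backward()
optimizer.step()

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable:,}")
```

Only the small head's parameters are updated, so each training step touches a fraction of the weights a from-scratch run would.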
But there's another truth in this matter. Storykube's CEO/CTO points out a basic madness out there: the research centers and companies that develop and publish models all aim for sizes of billions of parameters, which triggers a sort of competition among them. It is the same race we saw in the '90s, when the early Pentiums were released and everyone tried to beat the competition megahertz by megahertz. What people don't realize now is that having billions of parameters is not what matters: the size of a model doesn't automatically make it efficient or high-quality. In a few words, bigger doesn't mean better. Rather, you have to optimize the dataset and develop smaller but dedicated models for specific tasks.
GPT-3 is an extremely generic model, with billions of parameters and a dataset that contains a bit of everything (Common Crawl). The number of parameters simply tells you how large the mathematical matrices representing the model are. By itself that means very little: if part of those matrices is not needed for the task we want to accomplish, we are unnecessarily running a large model. That's a waste, in every shade of meaning the word can have.
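To see how little a raw parameter count says about task fit, here is a small sketch comparing the parameter count of a modest, dedicated model with that of a single generically wide block; all layer sizes are made up for illustration.

```python
import torch.nn as nn

def count_params(model: nn.Module) -> int:
    """Parameter count: the total size of the weight matrices in the model."""
    return sum(p.numel() for p in model.parameters())

# A small, dedicated model for a narrow task (sizes are illustrative).
small_task_model = nn.Sequential(
    nn.Embedding(30_000, 128),       # modest vocabulary and embedding size
    nn.Linear(128, 128),
    nn.ReLU(),
    nn.Linear(128, 4),               # e.g. four output classes
)

# A single, much wider generic block, in the spirit of very large language models.
big_generic_block = nn.Sequential(
    nn.Embedding(50_000, 4096),
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 4096),
)

print(f"small dedicated model: {count_params(small_task_model):,} parameters")
print(f"one generic wide block: {count_params(big_generic_block):,} parameters")
```

The wide block dwarfs the dedicated model by orders of magnitude, yet nothing about that count says it will do a specific task any better.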
Returning to the issue of data centers, moving training sessions to a location supplied by renewable sources could be the first step towards a greener AI. As an example, researchers estimated that a training session run in Estonia would produce about 30 times as much carbon as the same session run in Quebec: Estonia relies on oil shale, while Quebec relies primarily on hydroelectricity.
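As a quick illustration of how strongly the grid mix dominates that comparison, the sketch below applies two hypothetical carbon intensities to the same training energy; the figures are assumptions chosen only to reproduce a roughly 30-fold gap, not measured values for Estonia or Quebec.

```python
# Same training run, two grids: emissions scale directly with the carbon
# intensity of the local electricity mix. All numbers are illustrative assumptions.
ENERGY_KWH = 50_000                     # assumed energy for one training session

grid_intensity_kgco2_per_kwh = {
    "fossil-heavy grid": 0.9,           # e.g. oil shale / coal dominated
    "hydro-heavy grid": 0.03,           # e.g. mostly hydroelectric
}

for grid, intensity in grid_intensity_kgco2_per_kwh.items():
    print(f"{grid}: {ENERGY_KWH * intensity:,.0f} kg CO2")

ratio = (grid_intensity_kgco2_per_kwh["fossil-heavy grid"]
         / grid_intensity_kgco2_per_kwh["hydro-heavy grid"])
print(f"ratio between the two grids: {ratio:.0f}x")
```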
How AI can help reduce its own carbon footprint
We already talked about AI-powered platforms that manage cooling systems in data centers. Many other companies are also using AI to monitor their emissions, predict future ones and, armed with that information, make adjustments to reduce them.
So basically, AI can help humanity towards carbon neutrality, but at the same time, it has a pretty big carbon footprint itself. So where should we place it? Right in the middle.
First things first: acknowledging the existence of AI's carbon footprint is already a big step ahead. There are many tools and frameworks to quantify the carbon cost of machine learning models, and plenty of research and studies analyzing the problem and proposing solutions, helping us understand AI's impact on global warming and switch to a sustainable use of artificial intelligence. The next step, of course, is to take action. There's still much we can do to save the spark of green we have left.
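One example of such tooling is the open-source codecarbon package, which estimates the emissions of a block of code from the hardware it runs on and the local grid. Below is a minimal sketch, with a dummy loop standing in for real training; the project name is an arbitrary label.

```python
# Minimal sketch of tracking the carbon cost of a piece of ML code with codecarbon
# (pip install codecarbon). The workload here is a placeholder, not real training.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    total = sum(i * i for i in range(10_000_000))   # stand-in for a training loop
finally:
    emissions_kg = tracker.stop()                   # estimated kg of CO2-equivalent

print(f"estimated emissions for this run: {emissions_kg:.6f} kg CO2eq")
```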
This is exactly what we are doing here at Storykube: we're taking action. We want our proprietary artificial intelligence to have a zero carbon footprint. We're already working on it by relying on AWS servers and their Sustainability in the Cloud project, selecting AWS data centers that use renewable energy, like the ones in Dublin (Ireland) and Virginia (USA). AWS initiatives range from producing energy with solar panels to recycling non-potable water for data center cooling systems.
Moreover, at Storykube we train our AI starting from existing small and medium models that are highly optimized for the tasks we need, instead of training from scratch or relying on huge models with massive datasets.
We value AI and what it can do to help people achieve zero-carbon goals; but for AI to help the rest of the world go green, AI must be green itself. In this looming, devastating scenario, we have found our own little way to make a green difference, hoping to inspire others as well.
Resources:
Andrews, E. L. (2020) "AI's Carbon Footprint Problem", Stanford University Human-Centered Artificial Intelligence, 2 July. Available at: https://hai.stanford.edu/news/ais-carbon-footprint-problem (Accessed: 21 July 2022).
Gandharv, K. (2021) "With A Rush To Create Larger Language Models, Are We Beating Their Purpose", Analytics India Magazine, 19 August. Available at: https://analyticsindiamag.com/with-a-rush-to-create-larger-language-models-are-we-beating-their-purpose/ (Accessed: 21 July 2022).
Goled, S. (2021) "How To Build Smaller, Faster, Better Deep Learning Models", Analytics India Magazine, 29 June. Available at: https://analyticsindiamag.com/how-to-build-smaller-faster-better-deep-learning-models/
Morgan, L. (2021) "AI carbon footprint: Helping and hurting the environment", TechTarget SearchEnterpriseAI, 21 September. Available at: https://www.techtarget.com/searchenterpriseai/feature/AI-carbon-footprint-Helping-and-hurting-the-environment (Accessed: 21 July 2022).
Yao, D. (2022) "Data Center World 2022: Using AI To Cool Data Centers Yields Big Cost Savings", DataCenterKnowledge, 30 May. Available at: https://www.datacenterknowledge.com/power-and-cooling/data-center-world-2022-using-ai-cool-data-centers-yields-big-cost-savings (Accessed: 22 July 2022).