We Need To Talk About An Energy Label For AI

Artificial intelligence (AI) can distinguish a dog from a cat, but the billions of calculations needed to do so demand quite a lot of energy. The human brain performs the same task using only a small fraction of that energy. Could the brain's efficiency inspire us to develop more energy-efficient AI systems?

Our computational power has risen exponentially, enabling the widespread use of artificial intelligence, a technology that relies on processing huge amounts of data to recognize patterns. When we use the recommendation algorithm of our favorite streaming service, we rarely realize the enormous energy consumption behind it. The billions of operations needed to process the data are typically carried out in data centers, and all these computations consume a tremendous amount of electric power. Although data centers invest heavily in renewable energy, a significant part of their power still comes from fossil fuels. The popularity of AI applications clearly has a downside: the ecological cost.

To get a better understanding of the total footprint, we should take two factors into account: training and inference. First, an AI model needs to be trained on a labeled dataset. The ever-growing trend toward bigger datasets for this training phase causes an explosive growth in energy consumption. Researchers from the University of Massachusetts calculated that training a model for natural language processing emits 284 metric tons of carbon dioxide. That is equivalent to the lifetime emissions of five cars, including their manufacture. Some AI models developed by tech giants, whose footprints are not reported in the scientific literature, may emit on an even larger scale.

The training phase is just the beginning of the AI model's life cycle. Once the model is trained, it is ready for the real world: finding meaningful patterns in new data. This process, called inference, consumes even more energy over time. Unlike training, inference is not a one-off event; it happens continuously. For example, every time a voice assistant is asked a question and generates an answer, extra carbon dioxide is released. After roughly a million inference events, the cumulative impact surpasses that of the training phase. This trajectory is unsustainable.
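The break-even point between training and cumulative inference emissions can be sketched with a simple back-of-the-envelope model. The 284-ton training figure comes from the study cited above; the per-inference cost below is a purely illustrative assumption chosen to match the article's "about a million inferences" claim, not a measured value:

```python
# Back-of-the-envelope model of when cumulative inference emissions
# overtake the one-off training emissions of an AI model.
# Only the training figure is from the cited study; the per-inference
# cost is an illustrative assumption.

TRAINING_CO2_KG = 284_000        # 284 metric tons (UMass study figure)
PER_INFERENCE_CO2_KG = 0.284     # assumed emission per inference event

def total_emissions_kg(n_inferences: int) -> float:
    """Total footprint after the training phase plus n inference events."""
    return TRAINING_CO2_KG + n_inferences * PER_INFERENCE_CO2_KG

def break_even_inferences() -> int:
    """Number of inferences at which cumulative inference emissions
    equal the one-off training emissions."""
    return round(TRAINING_CO2_KG / PER_INFERENCE_CO2_KG)

print(break_even_inferences())   # 1_000_000 with these assumed numbers
```

With these assumptions the crossover lands at exactly one million inferences; the real figure depends heavily on model size, hardware, and the energy mix of the data center.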

Source: Forbes
