Artificial intelligence learns to think green

Artificial intelligence is present in almost every area of daily life, from the algorithms that predict the content we want to see based on our past choices to those that help detect diseases in medical images. This branch of computer science, which seeks to give machines human-like capabilities such as learning and reasoning, has countless applications that improve decision making, but its benefits go hand in hand with a huge environmental impact. To put it in perspective, the use of ICT solutions today accounts for between 5% and 9% of worldwide electricity consumption, a figure that could reach 20% by 2030, according to a report published by the European Parliament in May of last year.

In the case of artificial intelligence, both training the algorithms and processing the data run up an ecological bill that is hard to swallow. A group of researchers at the University of Massachusetts Amherst (United States), for example, showed in a 2019 study that training a large model for natural language processing can emit about 284,000 kg of carbon dioxide equivalent, five times more than a car produces over its entire useful life, manufacturing included. It is a problem that, given the accelerating pace of disruption, will only worsen unless these mathematical recipes start to factor in respect for the planet.

This is what is known as green algorithms, a concept that covers both algorithms that are more efficient and consume less energy to reach the same goal, and algorithms whose use helps make other activities more sustainable, such as those used in agriculture to reduce water and fertilizer consumption. They are two facets of the same revolution, still embryonic but, according to experts, already irreversible.

“A lot of work has always gone into making hardware efficient, with components that are smaller and faster and perform more complex tasks, but in recent years it has become clear that how the software is written also affects final consumption,” explains Coral Calero, director of the Green Algorithms area at OdiseIA and professor at the University of Castilla-La Mancha. This new paradigm is gradually taking hold in the technology sector. “Companies that develop software are realizing that green algorithms have to be part of the way they work and are beginning to implement them; it is still incipient, but unstoppable,” says the expert.

The projections for the coming years are overwhelming. By 2025, an estimated 463 exabytes of data will be created every day, and data centers, which are proliferating like mushrooms, are projected to require almost 8,000 terawatt hours of electricity in 2030 in the worst case, and 3,000 terawatt hours in the best case, which underlines the need to minimize the energy resources tied to this technological deployment.

But how can algorithms be made more environmentally friendly? The employers' association DigitalEs, in a recent report devoted to this question, spells it out: the key is to strike a balance between the volume of data needed to train the models, the time spent training them, and the number of iterations used to optimize their parameters. Put like that it all sounds rather abstract, but Coral Calero clarifies the issue with a concrete example.

In the field of machine learning, the optimizers that consume the most energy are the ones that deliver the best precision, so they are usually chosen despite the consequences. “Gaining 2% more accuracy in the algorithm's classification can double the consumption, while going from 85% to 94% accuracy can multiply consumption by 30. If we are classifying fashion items, it may not matter that 15 of them are misclassified, but if we are talking about inputs used to detect a disease, that 94% matters a great deal. Artificial intelligence today always pushes for the maximum, but often that precision is not worth it for the problem we are solving, while from the point of view of energy consumption the jump is enormous,” she warns.
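
To make that arithmetic concrete, here is a minimal Python sketch that turns the figures in Calero's quote into energy estimates. The 10 kWh baseline is an assumption for illustration; the multipliers come straight from the quote.

```python
# Back-of-the-envelope comparison using the figures quoted above.
# BASELINE_KWH is an assumed figure for illustration, not a measurement.

BASELINE_KWH = 10.0  # assumed energy for the 85%-accurate baseline model

# Multipliers taken from the quote: +2 points of accuracy -> 2x the
# consumption; going from 85% to 94% accuracy -> 30x the consumption.
scenarios = {
    "85% accuracy (baseline)": 1,
    "87% accuracy (+2 points)": 2,
    "94% accuracy (+9 points)": 30,
}

for label, factor in scenarios.items():
    kwh = BASELINE_KWH * factor
    print(f"{label}: ~{kwh:.0f} kWh ({factor}x baseline)")
```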

Tamer Davut, partner for Governance, Risk and Compliance at PwC and Chief Digital Officer of the consulting firm's Assurance division, also stresses this point. “To achieve smaller and smaller increments in algorithm improvement you have to consume more and more energy; in other words, the marginal improvement costs more and more from an energy standpoint,” he explains. Several options are being studied to deal with this. “It may turn out that the balance between what you want to improve and what you are spending to improve it does not pay off from a sustainability point of view. In that case, instead of taking a giant sample to chase that marginal gain, a smaller sample is selected that reaches the same objective by optimizing how the algorithm is developed,” the expert comments.
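
A minimal sketch of that “small sample first” idea, assuming a scikit-learn workflow over synthetic data; the subsample size and accuracy target are assumptions for illustration, not anything Davut specifies.

```python
# Sketch of the "small sample first" strategy: train on a cheap subsample
# and only pay for full-scale training if accuracy falls short of target.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=20_000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

TARGET_ACC = 0.90  # assumed accuracy requirement for this problem

# Train first on 10% of the training data.
small = LogisticRegression(max_iter=1000).fit(X_train[:1500], y_train[:1500])
small_acc = small.score(X_test, y_test)

if small_acc >= TARGET_ACC:
    print(f"subsample model suffices: {small_acc:.1%} >= {TARGET_ACC:.0%}")
else:
    # Only now incur the cost of training on the full dataset.
    full = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"scaled up: {small_acc:.1%} -> {full.score(X_test, y_test):.1%}")
```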

Another alternative is to seek efficiency by reusing existing models: “Instead of training an intelligent algorithm from scratch, which is costly in terms of both the learning curve and energy consumption, existing algorithms can be used.” These are steps in the right direction, though still maturing. “Right now the focus is more on developing algorithms that are more reliable than on reducing the carbon footprint of the algorithm itself, but the developer community is becoming aware and academia is producing more studies,” says the expert.
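
One established way to reuse an existing model rather than train from scratch is transfer learning. A minimal PyTorch/torchvision sketch, assuming an image task with 10 classes (the technique, task, and class count are illustrative assumptions; the article names none):

```python
# Sketch of reusing an existing model: load a pretrained ResNet-18 and
# retrain only its final layer instead of training ~11M weights from scratch.
import torch.nn as nn
from torchvision.models import resnet18, ResNet18_Weights

model = resnet18(weights=ResNet18_Weights.DEFAULT)  # reuse learned weights

# Freeze the pretrained backbone so its parameters are not updated.
for param in model.parameters():
    param.requires_grad = False

# Replace only the classification head for our task (10 classes assumed).
model.fc = nn.Linear(model.fc.in_features, 10)

# Only the new head needs gradient updates during training.
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"trainable parameters: {trainable:,}")  # ~5k instead of ~11M
```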

The employers' association DigitalEs also stresses the importance of small-scale tests, so that algorithms are tried and validated on simple equipment before being deployed to the cloud, where the volume of data and the associated cost and consumption are far greater.
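
In practice, that advice often takes the form of a smoke test: run the entire pipeline locally on a tiny slice of the data before paying for a cloud-scale run. A sketch of the pattern; the pipeline() function and every size here are hypothetical stand-ins.

```python
# Sketch of a local small-scale test before cloud deployment.
# pipeline() and all sizes are hypothetical stand-ins for illustration.

def pipeline(data, epochs):
    """Stand-in for a real training pipeline (assumed for illustration)."""
    return {"trained_on": len(data), "epochs": epochs}

full_dataset = list(range(1_000_000))  # stand-in for the real dataset

# 1. Validate the whole pipeline locally on 0.1% of the data, one epoch.
result = pipeline(full_dataset[:1000], epochs=1)
assert result["trained_on"] == 1000, "smoke test failed"
print("smoke test passed; the full run can be launched in the cloud")

# 2. Only after the cheap local run succeeds would the expensive job start:
# pipeline(full_dataset, epochs=50)
```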

For all these proposals to be adopted broadly, the sources consulted believe several obstacles still have to be overcome. “Calculating the carbon footprint of an algorithm is not easy, because it depends on which model is applied, the hardware being used to run it, the energy sources involved... The same algorithm in different environments can have very different energy consumption, so there is complexity both in the calculation and in the comparability,” stresses Davut. This is one of the great challenges still to be tackled, as Sara Hernández, regulation and sustainability consultant at DigitalEs, agrees: “We need real progress in research, and companies need to dedicate resources to measuring exactly how much is consumed, not just by the algorithm itself while running but by its development too. That whole process of creation and use has to be measured to know its impact on energy consumption.”
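
Tooling for exactly this kind of measurement is beginning to appear. One open-source option is the CodeCarbon Python library, which estimates emissions from the machine's power draw and the local grid's energy mix. A minimal sketch, with the training function as a placeholder:

```python
# Sketch of measuring a workload's footprint with the open-source
# CodeCarbon library (pip install codecarbon). train() is a placeholder.
from codecarbon import EmissionsTracker

def train():
    # Placeholder workload standing in for model training.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker()  # estimates from power draw and grid mix
tracker.start()
train()
emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
print(f"estimated footprint: {emissions_kg:.6f} kg CO2eq")
```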

Along these lines, Coral Calero argues that, just as household appliances carry energy labels, algorithms should have a similar certification. “There are people who are already environmentally aware and prefer to spend a little more knowing that the product is efficient. At OdiseIA we want software companies to be able to label their products so that the user, whether an individual or a company, can decide to invest a little more in the software they buy, knowing that it will be efficient in use. That would be the ultimate goal,” she elaborates. Within artificial intelligence, not all techniques have the same impact. Some, with deep learning at the forefront, need more processing power. “These are algorithms that usually work with images and video, data that weighs a lot, so storing and managing them entails high consumption. And in general, in AI as in other software, the more the data moves around, the worse it is for the environment,” Calero recalls.

The growing concern over the ecological footprint of algorithms has led nine European Union countries, Spain among them, to promote strategies that include environment-related AI applications among their priority areas. Spain's Ministry of Economic Affairs and Digital Transformation, through the Secretary of State for Digitalization and Artificial Intelligence, is working on the National Green Algorithms Program, which will be presented soon.

DigitalEs comments that, for now, little is known about the plan, except that it is expected to provide aid (presumably via subsidies) for developing best practices in sustainable AI, which is one of the suggestions the association makes in its report. It argues that this aid should be complemented with measures such as supporting the national supercomputing network (so that algorithmic models can be trained there), training technical staff in sustainability skills, and broad tax incentives for innovation projects aimed at environmental sustainability. “In Spain we are somewhat ahead, as is Europe, which is better positioned than the United States or Canada,” says Coral Calero. There is still a long way to go, but few today doubt that the future of algorithms is written in green.
