May 10, 2022

AI and Sustainability: Friends or Foes?

Artificial intelligence has recently come under fire for being energy-intensive. Read on to discover how the AI sector can reduce its impact on the environment.

As the next frontier of the Internet of Things, AI pushes IoT beyond sensor and actuator functionalities to a realm where automation is driven by large-scale data processing and analytics. However, despite its many positive effects on lives and industries, modern-day AI has a relatively less discussed dark side: its adverse impact on the environment.
 
AI systems can be energy-intensive, and as they become more widespread, their environmental impact is becoming too big to ignore. A study by the University of Massachusetts, Amherst, found that training a single large AI model can emit more than 626,000 pounds of carbon dioxide equivalent, nearly five times the lifetime emissions of the average American car.
 
Moreover, many AI systems run in data centers, whose high energy use takes a significant toll on the environment. For example, Greenpeace East Asia estimates that electricity consumption by Chinese data centers will grow by 289% over the next 15 years, driving substantial carbon emissions because 61% of the country's electricity comes from coal.
 
Thankfully, solutions to power-hungry AI systems exist, and some involve fixing the problem with AI itself. Today's article discusses AI's growing impact on the planet's well-being and how developers, users, and regulators can make it more sustainable.

Why does AI consume so much energy?

Artificial intelligence has advanced by leaps and bounds in recent years, primarily driven by unprecedented improvements in computing power. Unfortunately, this growth has come at a heavy price. While today's AI models can analyze massive datasets to reveal insights that dramatically improve decision-making, training them requires substantial amounts of energy. According to OpenAI, the computational resources needed to produce best-in-class AI models, such as those in the GPT-n series, have doubled every 3.4 months since 2012.
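
To put that doubling rate in perspective, the short, purely illustrative Python snippet below shows how quickly such growth compounds. The 3.4-month figure is the one reported by OpenAI; the time spans used are arbitrary assumptions chosen only for illustration.

# Illustrative only: how a 3.4-month doubling time in training compute compounds over time.
DOUBLING_PERIOD_MONTHS = 3.4  # doubling time reported by OpenAI

def growth_factor(months: float) -> float:
    """Total growth factor in compute after the given number of months."""
    return 2 ** (months / DOUBLING_PERIOD_MONTHS)

for years in (1, 3, 6):
    print(f"After {years} year(s): roughly {growth_factor(years * 12):,.0f}x more compute")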
 
GPT-3, the latest in OpenAI's transformer series, was trained on a dataset containing 45TB of text and has 175 billion parameters, a monumental leap from its predecessor, GPT-2, which had only 1.5 billion. This spike in model and dataset sizes has led to skyrocketing energy consumption and a sharp increase in greenhouse gas emissions.
 
Another factor driving AI's high energy use is the extensive experimentation required to develop a model. Today's deep learning approaches rely primarily on trial and error. Practitioners typically build hundreds of versions of a model during training, experimenting with various neural architectures and hyperparameters before identifying the optimal design. These trial runs add up, consuming even more energy and exacerbating AI's carbon footprint.
 
AI's environmental impact does not stop with training. Once deployed in the real world, AI models consume energy as they process data and make predictions - a process known as inference. For example, the AI underlying a self-driving car must perform inference continuously over its entire lifetime to navigate.

Can AI become more sustainable?

Given the impact of AI on the environment, AI players must work towards making the technology more sustainable. Below are five ways developers and users can improve efficiency during training and inference. 

1. Selecting only the most relevant training data

When training AI models, researchers can select only the datasets relevant to the system's core purpose rather than using a massive web of information. This approach helps reduce design time and the number of trials performed, resulting in considerable energy savings. 

2. Implementing edge computing

Deploying AI models on edge devices like laptops and smartphones can help developers reduce the number of inference requests to traditional cloud services. By minimizing the reliance on cloud servers, edge devices deliver better data security, faster computing, and greater efficiency. They also address the energy usage spikes that occur during busy cloud-traffic hours. 

3. Using energy-efficient hardware

Training and deploying AI models on energy-efficient hardware, such as graphics processing units (GPUs), can also help lower emissions. While the cost of these devices may be higher than traditional central processing units (CPUs), they often offer better performance per watt, which means they require less energy to operate.
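
As a rough, hypothetical illustration of why performance per watt matters more than raw power draw, consider the sketch below. All numbers are made up for the example and are not benchmarks of any real hardware.

# Hypothetical numbers for illustration only - not real benchmarks.
cpu_power_watts = 150   # assumed CPU power draw during training
cpu_hours = 100         # assumed time for the CPU to finish the job
gpu_power_watts = 300   # assumed GPU power draw (twice the CPU's)
gpu_hours = 10          # assumed time for the GPU to finish the same job

cpu_energy_kwh = cpu_power_watts * cpu_hours / 1000   # 15 kWh
gpu_energy_kwh = gpu_power_watts * gpu_hours / 1000   # 3 kWh

print(f"CPU: {cpu_energy_kwh:.0f} kWh vs GPU: {gpu_energy_kwh:.0f} kWh")

In this made-up scenario, the GPU draws twice the power but finishes ten times faster, so the same job consumes only a fifth of the energy overall.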

4. Improving transparency in energy measurements

Data centers that host AI systems need to make the information they collect on their energy usage more readily accessible. Doing so can help researchers better understand the environmental impact of training and deploying AI models to minimize it further. 

5. Assessing energy consumption before training

Before training begins, practitioners can estimate a model's energy requirements and plot expected energy costs against performance gains. Explicitly quantifying this trade-off will help them make more informed decisions about resource allocation, accounting for diminishing returns.
 
Researchers from the Mila AI institute in Quebec have developed a Machine Learning Emissions Calculator that developers can use to estimate the carbon footprints of their models based on factors like hardware, cloud provider, and geographical region.
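
As a minimal back-of-envelope sketch of the kind of estimate such calculators automate, the snippet below multiplies assumed hardware power draw, training time, data-center overhead, and grid carbon intensity. Every input value here is an illustrative assumption, not a measurement or a recommendation.

# Back-of-envelope training-emissions estimate; all inputs are illustrative assumptions.
num_gpus = 8               # assumed number of accelerators
gpu_power_kw = 0.3         # assumed average power draw per GPU, in kilowatts
training_hours = 72        # assumed wall-clock training time
pue = 1.5                  # assumed data-center Power Usage Effectiveness overhead
grid_kg_co2_per_kwh = 0.4  # assumed carbon intensity of the local grid

energy_kwh = num_gpus * gpu_power_kw * training_hours * pue
emissions_kg = energy_kwh * grid_kg_co2_per_kwh

print(f"Estimated energy: {energy_kwh:.0f} kWh")
print(f"Estimated emissions: {emissions_kg:.0f} kg CO2e")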

How does AI support climate action?

Beyond becoming more sustainable itself, AI can also play an essential role in broader sustainability efforts. AI is steadily finding use in IoT projects that support climate action in key focus areas like power grids, agriculture, and transportation.
 
  • In electricity grids, AI-enabled IoT (AIoT) applications are helping balance supply and demand in real time while optimizing energy consumption and storage to lower costs. These solutions are particularly critical in decentralized energy networks, which draw from various sources, such as microgrids, solar panels, and wind farms, and are more sensitive to fluctuations.
 
  • In agriculture, AIoT is finding use in collecting and analyzing farm data for yield predictions, irrigation, and crop monitoring. With AI and IoT devices, farmers can grow more food with fewer resources, increasing efficiency and reducing waste.
 
  • In the transport sector, AI is helping reduce traffic congestion and energy consumption by monitoring and analyzing traffic patterns on roads, railways, and in the skies. AI is also aiding the development of more fuel-efficient vehicles and better planning systems for public transport networks.

So, is AI a friend or foe of the environment?

AI has a substantial carbon footprint today, and if the demand for ever more sophisticated models continues to rise, the impact will soon become much worse. Unless stakeholders can quickly reassess and reform AI strategies, the field could morph into a significant antagonist in the fight against climate change. Some scientists even compare AI to fossil fuels, which contributed extensively to industrialization and development, but at the cost of long-term planetary health.
 
On the bright side, the AI community still has time to take action and reshape its trajectory. In addition to reducing AI's environmental impact through more efficient training and inference, companies and research institutions can bolster sustainability by developing solutions that support climate action. With continued development, AI could soon become one of the planet's best allies in the fight against climate change.
 
If you are working in sustainability, you can boost your efforts tremendously with AI and IoT solutions. Create your free IoT2Market account and start exploring the top AIoT products today.

