The artificial-intelligence (AI) industry is often compared to the oil industry: Once mined and refined, data, like oil, can be a highly lucrative commodity. Now it seems the metaphor may extend even further. Like its fossil-fuel counterpart, the process of deep learning has an outsize environmental impact.
In a new paper, researchers at the University of Massachusetts, Amherst, performed a life cycle assessment for training several common large AI models. They found that the process can emit more than 626,000 lbm of carbon dioxide equivalent—nearly five times the lifetime emissions of the average American car (and that includes manufacture of the car itself).
It’s a jarring quantification of something AI researchers have suspected for a long time. “While probably many of us have thought of this in an abstract, vague level, the figures really show the magnitude of the problem,” said Carlos Gómez-Rodríguez, a computer scientist at the University of A Coruña in Spain, who was not involved in the research. “Neither I nor other researchers I’ve discussed them with thought the environmental impact was that substantial.”
The paper specifically examines the model training process for natural-language processing (NLP), the subfield of AI that focuses on teaching machines to handle human language. In the last two years, the NLP community has reached several noteworthy performance milestones in machine translation, sentence completion, and other standard benchmarking tasks. OpenAI’s infamous GPT-2 model, as one example, excelled at writing convincing fake news articles.
But such advances have required training ever-larger models on sprawling data sets of sentences scraped from the Internet. The approach is computationally expensive—and highly energy intensive.
The researchers looked at four models responsible for the field's biggest leaps in performance: the Transformer, ELMo, BERT, and GPT-2. They trained each on a single GPU for up to a day to measure its power draw, then used the number of training hours listed in each model's original paper to calculate the total energy consumed over the complete training process. That figure was converted into pounds of carbon dioxide equivalent based on the average carbon intensity of the US energy mix, which closely matches that of Amazon's AWS, the largest cloud-services provider.
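The conversion described above can be sketched in a few lines. This is an illustrative calculation only: the power usage effectiveness (PUE) multiplier and the US-grid carbon-intensity figure below are assumed values commonly used in such estimates, not numbers quoted in this article, and the example workload is hypothetical.

```python
# Illustrative sketch: measured power draw x training hours -> lbs CO2e.
# Both constants are assumptions (typical published US figures), not
# values taken from this article.

PUE = 1.58                # assumed data-center power usage effectiveness
LBS_CO2_PER_KWH = 0.954   # assumed average US grid intensity, lbs CO2e/kWh

def training_emissions_lbs(avg_power_watts: float, hours: float) -> float:
    """Estimate CO2-equivalent emissions (lbs) for a training run."""
    energy_kwh = avg_power_watts / 1000 * hours * PUE  # wall energy incl. overhead
    return energy_kwh * LBS_CO2_PER_KWH

# Hypothetical example: a 1.4 kW multi-GPU rig training for 10,000 hours
print(round(training_emissions_lbs(1400, 10000)))  # -> 21102
```

Scaling the per-hour draw measured on one GPU by the published training duration is what lets the authors estimate full training runs they could not reproduce directly.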
HSE Now is a source for news and technical information affecting the health, safety, security, environment, and social responsibility discipline of the upstream oil and gas industry.
©2003-2019 Society of Petroleum Engineers, All Rights Reserved.