On October 2, OpenAI revealed it had closed its highly anticipated funding round, securing $6.6 billion. The deal marked the largest venture capital round to date and brought the company's valuation to $157 billion. But it was just the latest milestone in the seemingly unstoppable development of generative AI.
The Intersection of Two Events
The funding announcement came the same week that the small town of Spruce Pine, NC, was devastated by Hurricane Helene after the storm dropped over two feet of rain on the area. While Spruce Pine may not be a household name, the presence of several pure quartz mines in the town means the storm could significantly disrupt the global supply of microchips and solar panels. One mine has since resumed partial operations, but full production remains on hold.
While the two events may appear to be unrelated, it is noteworthy that microchips, particularly advanced processors like GPUs (graphics processing units) and TPUs (tensor processing units), play a critical role in generative AI by providing the computational power needed for large-scale data processing.
Climate change and its consequences remain a hotly debated issue, although even ChatGPT acknowledges that hurricanes like Helene are generally getting worse and that climate change is a significant factor behind the trend. And the uncontrolled use of generative AI appears to be contributing to the problem.
At the recent ALL IN artificial intelligence conference in Montreal, Sasha Luccioni, a leading researcher and AI & climate lead at Hugging Face, decried the fact that generative AI was being used in internet search.
“I find it particularly disappointing that generative AI is used to search the internet,” she said, according to French news agency AFP. She contrasted the basic function of a search engine, which extracts existing information such as the capital of a country, with the information-generating capabilities of GenAI programs, which makes the latter “much more energy-intensive.”
Related Article: Generative AI Is Pushing the Limits of the Power Grid
Generative AI's Collateral Environmental Harm
One of the persistent complaints about generative AI has been its collateral environmental harm, even if things are getting better, AI researcher and Sphere IT CEO Michael Collins told Reworked.
Generative AI tools such as ChatGPT and Midjourney depend on massive amounts of computing and training databases of billions of data points, he explained. Any use of them is energy-intensive, requiring power-hungry high-end servers. However, Collins flagged other problems, including:
1. Water Usage
In addition to electricity, operating these systems requires large amounts of clean water, both to cool the processors and to generate the electricity itself, he explained. This aspect of AI's footprint has mostly remained on the periphery of the discussion but is central to understanding the full scope of AI's resource consumption.
2. Comparative Energy Use
Collins also noted that a single query to a large language model can consume as much energy as leaving a low-brightness light-emitting diode (LED) bulb on for an hour.
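A rough back-of-envelope calculation shows why that comparison is plausible. The figures below are commonly cited external estimates, not numbers from Collins or the article: roughly 3 watt-hours per ChatGPT-style query (a widely quoted research estimate) versus roughly 0.3 watt-hours for a conventional Google search (Google's own historical figure).

```python
# Back-of-envelope check of the LED comparison.
# All figures are rough, commonly cited estimates, not from the article's sources.
LED_POWER_W = 5.0        # a low-brightness LED bulb draws a few watts
HOURS = 1.0

led_wh = LED_POWER_W * HOURS   # energy used by the bulb in one hour
llm_query_wh = 3.0             # widely cited estimate per LLM chat query
search_query_wh = 0.3          # Google's historical figure per web search

print(f"LED for an hour: {led_wh} Wh")
print(f"One LLM query:   {llm_query_wh} Wh")
print(f"LLM query vs. traditional search: {llm_query_wh / search_query_wh:.0f}x")
```

On these estimates, one LLM query lands in the same order of magnitude as an hour of LED light, and costs roughly ten times a traditional search, which is the gap Luccioni's criticism hinges on.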
“Work on solving these issues is ongoing,” Collins said. “Researchers are currently trying to find methods to further reduce the energy consumption of AI by improving data collection and processing, using better libraries and refining training algorithms.”
The Greater the Complexity, the Greater the Energy Demand
Things are likely to get worse as models become more complex, said Russell Hunter, AI and data science academic lead at Cambridge Advance Online.
The more complex a model, the greater the demand for computational power. And the more we rely on them, the more energy is needed to power them, Hunter told Reworked. Training a single large model can consume as much energy as several hundred households over a year, he continued.
The data centers that support AI operations consume vast amounts of electricity too. While there are efforts to use renewable energy sources, the current reliance on traditional energy sources contributes to a substantial carbon footprint.
Hunter does note the work being done to counter the problem, including ongoing research and development efforts focused on developing algorithms that require less computational power without compromising performance.
Techniques like model pruning, which removes redundant deep neural network parameters; quantization, which stores model weights at lower numerical precision to shrink model size; and even quantum computing could be used to reduce energy consumption. Similarly, Hunter said efficiencies can be introduced on the hardware side through more energy-efficient processors.
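To make the quantization idea concrete, here is a minimal sketch of symmetric post-training int8 quantization using NumPy. It is an illustration of the general technique, not any specific method Hunter described: 32-bit floating-point weights are mapped to 8-bit integers plus a single scale factor, cutting storage (and memory traffic) by 4x at the cost of a small rounding error.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric post-training quantization: float32 weights -> int8 + scale."""
    scale = np.abs(weights).max() / 127.0        # map the largest weight to 127
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float32 weights for inference."""
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.standard_normal((512, 512)).astype(np.float32)  # a toy weight matrix

q, scale = quantize_int8(w)
err = np.abs(dequantize(q, scale) - w).max()

print(f"size reduction: {w.nbytes // q.nbytes}x")  # int8 is 4x smaller than float32
print(f"max rounding error: {err:.6f} (scale = {scale:.6f})")
```

Smaller weights mean less memory movement per inference, which is where much of the energy saving comes from; the worst-case rounding error is half the scale factor, which is usually tolerable for deployed models.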
Related Article: Ready or Not, ESG Platforms Are Coming for the Digital Workplace
One Solution: Use Traditional Search Engines Instead of GenAI
The most immediate solution is to limit the use of generative AI in search, Digital Meld founder and CEO Brad Groux told Reworked, by returning to traditional search approaches both inside and outside the organization.
"Traditional search engines are more energy efficient. Their algorithms, honed over decades, are optimized to deliver search results using less computational power,” Groux said. "Search engines like Google have also taken measures to lower their carbon footprint through energy-efficient data centers, renewable energy investments and the optimization of search algorithms."
While these engines still consume energy, their footprint is considerably smaller than the compute-heavy demands of generative AI systems. From an environmental perspective, then, sticking with traditional web search, especially for simple information retrieval tasks, is a practical and responsible choice, Groux said. Traditional search engines can effectively handle most common queries without the need for energy-intensive AI responses.
Organizations will of course continue to use generative AI, given the productivity gains it creates in areas like generating content, solving multi-step problems or offering personalized experiences — areas traditional search cannot address, he said. Given that, organizations should adopt a more balanced approach, including guiding workers to follow some basic rules of thumb:
- Use traditional search for simple, routine and fact-based queries.
- Use generative AI for complex tasks.
“While the energy consumption of GenAI is a legitimate concern, it’s impractical for organizations to abandon the technology altogether,” Groux said. "Instead, using both traditional web search and GenAI in tandem, depending on the task at hand, can offer a practical and more sustainable path forward."
Blue Goat Cyber's Christian Espinosa disagrees. The founder and CEO said sticking with traditional web search is becoming increasingly impractical for organizations. The sheer volume and variety of data that organizations now deal with, both internally and externally, require more sophisticated tools that can provide deeper insights, better context and more accurate results, he said.
Espinosa recommends a multi-faceted approach to information retrieval and knowledge management that could involve enterprise search solutions, the use of artificial intelligence and machine learning for more intelligent data analysis, and integrating various data sources for a more comprehensive view.
Additionally, he suggests organizations explore natural language processing and semantic search to enhance their ability to find relevant information quickly and efficiently.
“By diversifying their search and information discovery tools, organizations can improve decision-making processes, increase productivity and better capitalize on their data assets,” he said.
The bottom line is that regardless of who is responsible for the environmental impact of generative AI — whether it is the vendors or the users — both sides seem to be more focused on the benefits than the costs. Since we cannot just stop using AI, a more realistic approach is to balance the benefits with positive environmental actions.