Lanon Wee

Generative AI Expansion Causes Concern for Tech Giants

A global race for the next generation of generative artificial intelligence is drawing attention to a crucial but often overlooked environmental issue: Big Tech's growing water footprint. Microsoft and Google, among others, have recently reported sharp increases in their water usage, and researchers point to the rush for AI as a major cause. In April, Shaolei Ren, a researcher at the University of California, Riverside, published a study examining the resources needed to run popular generative AI models such as OpenAI's ChatGPT. Ren and his colleagues found that ChatGPT consumes 500 milliliters of water (close to the amount in a 16-ounce bottle) for every 10 to 50 prompts, depending on where and when the model is deployed. Given the chatbot's enormous number of monthly users, the scale of AI models' potential water demand becomes clear. The study's authors warned that if the growing water footprint of AI models is not properly addressed, it could become a barrier to the sustainable and socially responsible use of AI in the future. OpenAI, the creator of ChatGPT and partly owned by Microsoft, did not respond to a request for comment on the study's findings. "The public is becoming more informed and aware of the water issue," Ren told CNBC by video call.
"If they know that major tech corporations are taking their water away and not giving enough back, it won't be welcomed. I think there will be more conflicts over water usage in the coming years, and companies need to take measures to address this risk." Big Tech relies on data centers to operate, and cooling their many power-hungry servers requires large amounts of water. For Meta, data centers account for the majority of its energy use and greenhouse gas emissions. In July, Uruguayans protested Google's plan to build a data center in the country amid a severe drought; Google maintained that sustainability was at the core of the project. Microsoft disclosed in its latest sustainability report that its water usage surged by more than a third from 2021 to 2022, to 1.7 billion gallons, the equivalent of more than 2,500 Olympic-sized swimming pools. Google reported total water consumption of 5.6 billion gallons across its data centers and offices for 2022, a 21% increase on the prior year. Both companies are working to reduce their water use and aim to be "water positive" by the end of the decade, meaning they will replenish more water than they consume. Notably, those figures were reported before the launch of their respective ChatGPT rivals: Microsoft's Bing Chat and Google Bard could require even more energy and resources to operate, and therefore greater quantities of water, in the near future. "With AI, we're seeing the classic problem with technology in that you have efficiency gains but then you have rebound effects with more energy and more resources being used," said Somya Joshi, head of division: global agendas, climate and systems at the Stockholm Environment Institute.
Joshi also told CNBC that water use is increasing exponentially as companies cool the machines needed for heavy computation and large language models. "So, on one hand, companies are promising to their customers more efficient models… but this comes with a hidden cost when it comes to energy, carbon and water," she added. Microsoft told CNBC that it is investing in research to measure the energy use, water use and carbon footprint of AI, and that it is working to make large systems more efficient. "AI technology has the potential to serve as a strong instrument for improving sustainability solutions. However, to harness its power, it is essential to have an ample clean energy supply that can meet the heightened energy requirements of this emerging innovation," a Microsoft spokesperson told CNBC by email. The company added that it will continue to monitor its emissions, accelerate progress and increase its use of clean energy to power data centers, while purchasing renewable energy and pursuing other efforts toward its sustainability goals of being carbon negative, water positive and zero waste by 2030. A Google spokesperson told CNBC that research shows that while AI computing demand is growing, the energy needed to power the technology is rising "at a much slower rate than many forecasts have predicted." The spokesperson said, "We are using proven approaches to reduce the carbon footprint of workloads by considerable amounts; together these approaches can decrease the energy of training a model by up to 100x and emissions by up to 1000x." They added that Google's data centers are designed, built and operated to maximize efficiency: compared with five years ago, Google now delivers roughly five times as much computing power with the same amount of electrical power.
To support the next generation of significant advances in AI, the spokesperson said, Google's most recent TPU v4 supercomputer has been shown to be one of the most efficient, effective and sustainable ML (machine learning) infrastructure hubs in the world.
