The Unknown Toll Of The AI Takeover

Lois Parshley at The Lever: In early May, Google announced it would be adding artificial intelligence to its search engine. When the new feature rolled out, AI Overviews began appearing at the top of search results, offering summaries whether you wanted them or not. And they came at an invisible cost.

Each time you search for something like “how many rocks should I eat” and Google’s AI “snapshot” tells you “at least one small rock per day,” you’re consuming approximately three watt-hours of electricity, according to Alex de Vries, the founder of Digiconomist, a research company exploring the unintended consequences of digital trends. That’s roughly ten times the energy of a traditional Google search, and about as much as you’d use talking for an hour on a home phone. (Remember those?)
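
For scale, here is a minimal back-of-envelope sketch of that comparison; the 0.3 Wh figure for a conventional search and the 3 W draw for a home phone are assumptions chosen to match the article's ratios, not numbers reported in it.

```python
# Per-query energy comparison (illustrative figures; see note above).
AI_SEARCH_WH = 3.0       # ~3 Wh per AI-assisted search, per de Vries
PLAIN_SEARCH_WH = 0.3    # assumed energy of a conventional Google search
HOME_PHONE_WATTS = 3.0   # assumed draw of a corded home phone while in use

print(f"AI search vs. plain search: {AI_SEARCH_WH / PLAIN_SEARCH_WH:.0f}x")          # ~10x
print(f"Equivalent home-phone talk time: {AI_SEARCH_WH / HOME_PHONE_WATTS:.1f} hours")  # ~1 hour
```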

Collectively, de Vries calculates that adding AI-generated answers to all Google searches could easily consume as much electricity as the country of Ireland.
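
To see how a few watt-hours per query adds up to country scale, here is a rough scaling sketch; the daily search volume and the figure for Ireland's annual consumption are placeholder assumptions, not numbers from the article or from de Vries's analysis.

```python
# Scaling the per-query figure to a national-grid order of magnitude.
# All inputs besides the 3 Wh per-query figure are placeholder assumptions.
WH_PER_AI_SEARCH = 3.0        # per-query cost cited in the article
SEARCHES_PER_DAY = 9e9        # assumed daily Google search volume
IRELAND_TWH_PER_YEAR = 30.0   # assumed annual electricity consumption of Ireland

annual_twh = WH_PER_AI_SEARCH * SEARCHES_PER_DAY * 365 / 1e12
print(f"AI-assisted search: ~{annual_twh:.1f} TWh/yr")
print(f"Ireland (assumed):  ~{IRELAND_TWH_PER_YEAR:.0f} TWh/yr")
```

Even with these placeholders, the total lands in the terawatt-hour range, the same order of magnitude as a small country's annual electricity demand, which is the comparison de Vries is drawing.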

The companies behind these advances are far less willing to share information than their chatbots are to dispense it. Though researchers like de Vries can make educated estimates, the lack of industry transparency makes it surprisingly difficult to put an exact number on just how much power and water AI might use. Yet that demand is soaring as the technology is tacked on to everything from your iPhone’s operating system to how your car insurance company calculates your rates.

While the mass adoption of AI has transformed digital life seemingly overnight, regulation of its very physical impacts has not kept pace. Federal agencies like the U.S. Energy Information Administration, which collects information about industries’ energy use, aren’t tracking the energy demand of the data centers that enable AI, even as their footprint skyrockets.

“We do not have any mandated disclosures on the amount of energy or resources that general AI systems use,” says Merve Hickok, president and research director of the Center for AI and Digital Policy, a nonprofit research organization. When journalists file public records requests for this information, what they get back is usually redacted. This secrecy limits the ability of utilities and regulators to know how these needs are changing.

That’s a problem because data centers are rapidly outgrowing the electric grid while keeping dirty sources of power, like coal plants, operating. Tech companies also have a long track record of arranging for special, discounted rates for their massive power consumption—which means in many cases, ratepayers like you are subsidizing data centers’ undisclosed energy use.

In addition to power, these facilities suck up substantial amounts of water to cool their servers, and they are often located in places where land is cheap, like deserts. Only a few operators report their water usage, even though a fifth of servers “draw water from moderately to highly stressed watersheds.” One paper estimates that within the next several years, data centers’ global demand for water could reach half of the United Kingdom’s annual water use.

Yet even as questions about data centers’ impact on the public grow, companies are tightening restrictions on what they share about their operations. In a written response to The Lever, a Google spokesperson says that since introducing generative AI to its search, associated machine costs have decreased by 80 percent. The spokesperson says that, based on internal data, de Vries’ analysis is “an overestimation and our systems are far more efficient.”

But they declined to provide any further specifics about their energy use, other than noting that predicting the future growth of energy consumption and emissions from AI data centers is challenging.

“We’re actually seeing less and less disclosure,” de Vries says, as companies claim information about models harms their competitive advantage. “In terms of transparency, we’re actually going backward.”

More here.

(This story was originally published by The Lever, an investigative newsroom.)