Dr. Geoffrey Hinton helped Google develop its Artificial Intelligence (AI) capabilities. He quit Google when he began to worry about the unintended consequences of the technology. “I’m afraid of self-censoring,” he told Geoff Bennett and the PBS NewsHour last May. He praises what AI can do for medicine: “Would you rather meet with a doctor who has seen a thousand patients with your condition, or millions?” But he was careful to point out that AI can help doctors, not take responsibility for the care of individual patients.

AI has its uses and may be an indispensable tool in creating an efficient global energy distribution grid. A truly smart grid continually matches supply and demand and integrates solar, wind, and hydro sources of energy, so that utilities can reduce their reliance on dirty, inefficient “peaker” plants that must be fired up to meet peak load for a few hours a day. Add energy storage devices, like batteries or capacitors, to the grid, and AI can help us use that energy as wisely as possible, without interruptions.
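To make that idea concrete, here is a toy sketch in Python, with made-up numbers and hourly time steps, of the decision a smart grid with storage makes over and over: bank surplus renewable energy in a battery, draw on the battery when demand outruns supply, and call on a peaker plant only as a last resort. It illustrates the concept, not any utility’s actual software.

```python
# Toy illustration only: every number here is invented for the example.
# Time steps are one hour, so 1 MW of surplus over one hour = 1 MWh of stored energy.
demand_mw = [50, 45, 60, 90, 120, 80]       # hypothetical hourly demand (MW)
renewables_mw = [70, 80, 65, 60, 40, 30]    # hypothetical solar/wind/hydro output (MW)

battery_mwh = 0.0         # current charge
battery_capacity = 100.0  # assumed storage capacity (MWh)

for hour, (demand, supply) in enumerate(zip(demand_mw, renewables_mw)):
    surplus = supply - demand
    if surplus >= 0:
        # Extra renewable energy: store whatever fits in the battery.
        battery_mwh = min(battery_capacity, battery_mwh + surplus)
        peaker_mw = 0.0
    else:
        # Shortfall: drain the battery first, then cover the rest with a peaker plant.
        from_battery = min(battery_mwh, -surplus)
        battery_mwh -= from_battery
        peaker_mw = -surplus - from_battery
    print(f"Hour {hour}: battery at {battery_mwh:.0f} MWh, peaker supplying {peaker_mw:.0f} MW")
```

In this made-up run the peaker is needed only in the last two hours, after the battery charged up by the morning surplus has been drained; the AI in a real smart grid lies in forecasting those demand and supply curves and scheduling the storage ahead of time.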

But new technology is often a step backward from our goal of using less fossil fuel energy. Technology itself is morally neutral; how we use it is what benefits or diminishes our lives over time. Like a two-edged sword, it cuts both ways, which is why we need regulators with the authority to set realistic goals for emissions reduction.

“Biological intelligence has evolved to use very little power, so we only use 30 watts,” says Hinton. “And we have huge numbers of connections, like 100 trillion connections between neurons. And learning consists of changing the strength of those connections. The digital intelligence we have been creating uses a lot of power, like a megawatt when you're training it.” AI can store far more data than a brain, and it has that data available at its fingertips, so to speak, but it processes it in a much more energy-intensive way than the human brain does.

According to the International Energy Agency (IEA): “AI also uses more energy than other forms of computing – a crucial consideration as the world seeks to build a more efficient energy system. Training a single model uses more electricity than 100 US homes consume in an entire year.”

As a nation trying to minimize its carbon emissions, we’ve faced this troubling relationship between technology and energy before. For example, Steven Gonzalez Monserrate, who works in MIT’s Program in History, Anthropology, and Science, Technology, and Society, studied the environmental cost of Cloud computing and the data centers (that is, server farms) that it depends on. In his report, “The Cloud Is Material: On the Environmental Impacts of Computation and Data Storage,” from 2022 but updated this November, Monserrate writes, “The Cloud now has a greater carbon footprint than the airline industry. A single data center can consume the equivalent electricity of 50-thousand homes.” (See also, “Understanding Data Center Energy Consumption.”)

Another energy hog is the Bitcoin economy. The computing power behind Bitcoin’s calculations can use as much energy as a small country (see “Is Bitcoin An Energy Bomb?”).

The energy to power 100 U.S. homes for a year; the electricity to run 50-thousand homes; the energy of a small country: add this up and you get some serious energy use and emissions from fossil fuels.
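A quick back-of-the-envelope calculation puts those comparisons on a common scale. The sketch below assumes an average US home uses roughly 10,500 kilowatt-hours of electricity a year; that per-home figure is an assumed ballpark, and actual averages vary by source and year.

```python
# Back-of-the-envelope only: the per-home figure is an assumed ballpark,
# not a measured value, and actual averages vary by source and year.
KWH_PER_US_HOME_PER_YEAR = 10_500

train_one_model_kwh = 100 * KWH_PER_US_HOME_PER_YEAR      # IEA: "more than 100 US homes"
one_data_center_kwh = 50_000 * KWH_PER_US_HOME_PER_YEAR   # Monserrate: "50-thousand homes"

print(f"Training one large AI model: roughly {train_one_model_kwh / 1e6:.1f} gigawatt-hours")
print(f"One large data center, per year: roughly {one_data_center_kwh / 1e6:.0f} gigawatt-hours")
```

On those assumptions, a single training run lands around one gigawatt-hour and a single large data center around five hundred gigawatt-hours a year, which is why the source of that electricity matters so much.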

But there are ways to lessen our emissions while still benefiting from the latest technology. Cutting back on the energy use of data centers gets at the problem common to the Cloud, Bitcoin, and AI. Data centers produce a lot of heat and require air conditioning to keep them cool enough to operate efficiently. So how about putting data centers in Nordic countries, which have cold climates much of the year? You need less energy for cooling, and the waste heat can keep the neighborhood toasty warm. Building larger data centers also brings economies of scale and savings on air-conditioning energy.

The companies that run the largest data centers, such as Google, Meta, and Apple, are committed to mitigating the energy use of their technology, especially data centers, by finding ways to use less energy and by investing more in renewable energy as an offset.

What can you do at home? Here are some options from the SCOCO website. Or simply replace burnt-out light bulbs with LEDs, turn off the lights when you are not in the room, replace worn-out appliances with Energy Star certified models, and either unplug electronic devices, like computers, when not in use or invest in a “smart power strip” that automatically cuts off standby power.

Coming up: Big tax benefits and rich rebates for heat pumps and heat pump water heaters. The IRS is actually here to help. So are the White House and the Inflation Reduction Act of 2022. California is still figuring out how to fairly distribute all those federal funds.

 

Photo by Christina @ wocintechchat.com on Unsplash.
