Out Like a Light
Every time I use an AI answer machine like ChatGPT, a lightbulb goes off. I don’t mean that in the sense that I suddenly understand something. I mean that a real lightbulb goes off. AI uses a lot of energy. It has singlehandedly upended the emissions promises of big tech companies like Google and Microsoft. And it will completely change the game when it comes to energy needs and use in every region where the machines are deployed at scale. According to one researcher, “One query to ChatGPT uses approximately as much electricity as could light one light bulb for about 20 minutes…So, you can imagine with millions of people using something like that every day, that adds up to a really large amount of electricity.” AI’s soaring energy use has driven up emissions at Google and Microsoft, making it a major contributor to climate change. Let’s hope we can use AI to discover strategies for becoming more energy efficient and to accelerate the move to new, cleaner forms of electricity.
+ For an example of just how much energy technology can consume, consider a single (albeit massive) installation. Las Vegas’ dystopia-sphere, powered by 150 Nvidia GPUs and drawing up to 28,000,000 watts, is both a testament to the hubris of humanity and an admittedly impressive technical feat. (That was my original tagline for NextDraft.)
+ The latest AI bot to go live is Rufus. It’s Amazon’s shopping bot and it’s now available to anyone.
+ Meanwhile, the debate over what chatbots have the right to inhale continues. AP: Two 80-something journalists tried ChatGPT. Then, they sued to protect the ‘written word.’