Tech companies aren't telling us enough about AI and energy use.
Should you feel guilty about using AI?

Apple just put AI in millions of people's pockets. The company is rolling out what it calls Apple Intelligence this week, bringing some basic text generation and image editing features to iPhone, iPad, and Mac users who opt in. I've been testing these tools through the developer beta version of the software for a couple of months now, and they're pretty mediocre. But this is only the beginning.

Generative AI, once a parlor trick for the tech-obsessed, is fast becoming the main event for major software releases. As Apple pushes its version of the technology, Google is building AI into its Android operating system and forcing everyone to look at AI Overviews at the top of virtually every Google Search. OpenAI and Meta are building their own AI-powered search engines, while the startup Perplexity already has one. Microsoft and Anthropic recently announced new, super-powerful AI agents that can complete complex tasks much like humans would. (Disclosure: Vox Media is one of several publishers that has signed partnership agreements with OpenAI. Our reporting remains editorially independent.)

While some companies have had generative AI products out in the wild for over a year, the arrival of Apple Intelligence marks an inflection point for the mainstreaming of the technology. Apple Intelligence is only available on the latest Apple devices, but over half the phones in the United States are iPhones. As people upgrade, millions more can tap into the new technology. If you're not already using AI, you probably will be soon, whether you like it or not.

"We're getting AI, especially generative AI, shoved down our throats with little to no transparency, and honestly, the opt-out mechanisms are either nonexistent or complicated," said Sasha Luccioni, AI researcher and climate lead at Hugging Face, a platform for sharing AI and machine learning tools.

If that fills you with dread, it's understandable. Maybe you feel bad about participating in the race to build a superintelligent AI nobody asked for. You may feel complicit for using AI models trained on copyrighted material without paying the creators. You probably feel just plain bad about the flood of AI slop that's ruining the internet, even if you did not personally create the slop. Then there are the climate consequences of it all. AI, in its many shapes and forms, requires a lot of energy and water to work. A lot. That might make you feel downright guilty about using AI.

AI's big energy appetite

There's a chance Apple Intelligence is more guilt-free than the other big AI options as far as energy is concerned. Apple says it keeps the processing for certain AI features, like GenMoji and Image Playground, entirely on your device. That means less reliance on energy-intensive data centers.

We don't know exactly how much energy AI uses at these data centers. Using data from a recent Microsoft Research study, Shaolei Ren, an engineering professor at the University of California, Riverside, came up with this: Asking ChatGPT to write two 200-word emails uses roughly the same amount of energy as a Tesla Model 3 would need to drive one mile. Because they generate so much heat, the processors that generated those emails would also require about four half-liter bottles of water to cool down.
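If you want to sanity-check that comparison yourself, here is a minimal back-of-envelope sketch. None of these numbers come from company disclosures: the per-email energy figure is an assumption worked backward from Ren's one-mile comparison, and the car figure assumes the Model 3's EPA-rated efficiency of roughly 0.25 kWh per mile.

```python
# Rough back-of-envelope check of the two-emails-per-Tesla-mile comparison.
# Assumptions (not official figures from OpenAI, Microsoft, or Tesla disclosures):
#   - a Tesla Model 3 uses about 0.25 kWh per mile (EPA-rated ~25 kWh per 100 miles)
#   - Ren's comparison therefore implies roughly 0.125 kWh per 200-word ChatGPT email
#   - cooling for those two emails takes about 2 liters of water (four half-liter bottles)

TESLA_KWH_PER_MILE = 0.25       # approximate EPA-rated efficiency
KWH_PER_EMAIL_ESTIMATE = 0.125  # implied by the one-mile comparison; an assumption
COOLING_LITERS_PER_EMAIL = 1.0  # implied by the four-bottle figure; an assumption

emails = 2
energy_kwh = emails * KWH_PER_EMAIL_ESTIMATE
tesla_miles = energy_kwh / TESLA_KWH_PER_MILE
water_liters = emails * COOLING_LITERS_PER_EMAIL

print(f"{emails} emails ~ {energy_kwh:.2f} kWh ~ {tesla_miles:.1f} Tesla mile(s) of driving")
print(f"Estimated cooling water: ~{water_liters:.0f} liters")
```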
The consequences of such energy profligacy become clearer if you scale up. The amount of electricity used by data centers, where AI processing largely takes place, is predicted to grow by 160 percent by the end of the decade, and carbon dioxide emissions could more than double as a result, according to Goldman Sachs. Meanwhile, the amount of water needed will also spike, so much so that by 2027, AI's thirst could equal half the annual water withdrawal of the United Kingdom.

These are all estimates based on limited data, because the tech companies building AI systems, including Apple, Google, Microsoft, and OpenAI, do not share exactly how much energy or water their models use. "We're just looking at the black box because we have absolutely no idea of the energy consumption for interacting with the large language models," Ren explained. He compared the situation to searching for flights on Google and being able to see the carbon emissions for each leg. "But when it comes to these large language models, there's absolutely none, zero, no information."

The lack of transparency about AI's energy demands also runs counter to these tech companies' sustainability promises. There's good reason to believe that AI is leading directly to those promises being broken. Due to increases in data center energy usage, Google saw its greenhouse gas emissions increase by 48 percent from 2019 to 2023, despite a pledge to cut emissions by 50 percent from its 2019 levels by 2030. The company no longer claims to be carbon neutral. Microsoft similarly saw a 29 percent jump in emissions from 2020 to 2023. While Microsoft has promised to be carbon negative by 2030, it is now openly struggling to make that happen while keeping pace with AI innovation.

What the AI dealers aren't telling us

This is what an arms race looks like. It's worth pointing out here that energy usage started to spike around the time OpenAI knocked the world's socks off with its surprise release of ChatGPT in November 2022. The chatbot became the fastest-growing app ever, capturing a hundred million users in two months and kick-starting the AI gold rush in Silicon Valley. Now, 40 percent of all venture capital money in cloud computing goes to generative AI companies. OpenAI itself announced a $6.6 billion funding round in early October, the largest venture capital round of all time, giving it a $157 billion valuation.

With such staggering amounts of money at play, it's perhaps no surprise that energy efficiency takes a back seat to growth and innovation. Companies like OpenAI want the models that power their AI technology to get bigger so they can get better and outperform competitors. And the bigger the model, the greater the energy demand, at least for now. Over time, it's likely that performance will get more efficient thanks to advances in chip technology, data center cooling, and engineering. "Because the innovation happened so quickly around when ChatGPT burst onto the scene, you would expect, initially, for the efficiency to be at its lowest point," Josh Parker, head of sustainability at chipmaker Nvidia, told me.

Still, the most energy-intensive products are now what companies like OpenAI, Google, and Meta are pushing the hardest. Those include real-time chatbots, voice assistants, and search engines. These features enlist larger models and require more advanced chips working simultaneously to reduce latency, or lag. Put simply, they have to do a lot of hard math problems all at once and very quickly.
That's why a couple of AI-written emails can take as much electricity as driving a Tesla a mile.

Apple, however, seems to present itself as an exception. As part of its promise to protect user privacy, the company says it handles as many Apple Intelligence tasks as it can on your device without sending queries to data centers. That means when you opt in to Apple Intelligence, you download a small generative AI model that can handle pretty simple tasks on your phone. Your iPhone battery, unlike a grid-connected cloud data center, has a limited amount of power, which forces Apple Intelligence to handle these tasks with some efficiency. Maybe on-device AI is the guilt-free version of the future after all.

The problem, of course, is that we don't know exactly how Apple Intelligence works. We don't know which tasks are handled on the device, which are sent to energy-hungry Apple servers, or how much energy it all requires. I asked Apple about this, but the company did not provide specifics. Then again, not providing specifics is a bit of a theme when it comes to big tech companies explaining their AI offerings.

So again, if you're feeling dread or guilt about AI in your life, that's understandable. It is very clear that this technology, in its current state, consumes vast and increasing amounts of energy, contributing to greenhouse gas emissions and worsening human-caused climate change. It is also true that you might not have a choice, as big tech companies make generative AI more foundational to their products. You can opt out of Apple Intelligence or never opt in. But you'll find it's more difficult, if not impossible, to opt out of AI products from Google, Meta, and Microsoft. (If you want to try, here's a helpful guide.)

"I don't think there's a reason to feel guilty," said Luccioni. "But I do think there's a reason, as with climate change in general, to ask for more information, to ask for accountability on behalf of the companies that are selling us this stuff."

If AI is supposed to solve all our problems or destroy us all (or both), it would be nice to know the details. We could ask ChatGPT, but that might be a huge waste of energy.

- Adam Clark Estes, senior technology correspondent

More from Vox:
- Elon Musk says he's giving away $1 million a day to voters. Is that legal? Musk's "lottery" is only available in swing states and seems meant to appeal to potential Trump voters.
- Elon Musk is Trump's biggest booster and patron. Why? Musk's political ambitions, explained.
- A world without passwords is in sight. Thanks to passkeys, you may not need to remember a password ever again.
- There's something off about this year's "fall vibes." Autumn is being eaten by a deluge of AI slop.
- Inside OpenAI's multibillion-dollar gambit to become a for-profit company. OpenAI's transition isn't what you think. The stakes are tens of billions of dollars, and the future of AI.

Listen to This: Why is horror so fun? It makes sense that we run away from scary things. That's a good way to stay alive. But why do some people also love scary things?
Listen to Apple Podcasts.

This is cool: A historic train ride through Taiwan's cypress forest.