Should We Be Polite To ChatGPT?

By Kitty Shepherd-Cross

Our manners could be secretly impacting the planet


From queuing to curtseying, manners are at the forefront of British society – but our obsession with polite language may be quietly harming our planet. Last week, OpenAI CEO Sam Altman revealed on X that users saying please and thank you to ChatGPT has cost the company ‘tens of millions of dollars’ through increased energy demands. So should we be letting go of etiquette when communicating with AI?

Here’s How British Courtesy Is Impacting The Planet

Brits have always been frightfully polite. After learning our first words, the importance of saying please and thank you is quickly drilled into us – and soon afterwards, we’re taught to apologise for everything. Even if someone spills their Americano down your trousers, you must apologise profusely and find a way to blame it on yourself. Then comes a whole arsenal of doublespeak to maintain this politeness.

So, when ChatGPT came into our lives in 2022, becoming everything from our PA to our therapist, nutritionist, doctor and friend, it was no wonder we wanted to grovel at its feet. But while our saccharine social graces may work wonders for human relationships, they don’t translate so well to AI. Human beings thrive on niceties; ChatGPT, on the other hand, runs on tokens, servers and electricity. The upshot is that politeness increases energy consumption and, with it, environmental damage.

How Much Energy Does The Internet Use?

Long before AI exploded into the mainstream, the tech world was already acutely aware of the internet’s significant energy demands. By 2022, the internet accounted for 2.5 percent of global electricity usage.

Despite this, there’s still a common misconception that the internet is intangible, floating somewhere in the ‘cloud’. This confusion likely stems from the term ‘cloud computing’, popularised after former Google CEO Eric Schmidt used it at a Californian tech conference in 2006. The term was originally meant as a metaphor to explain the complex system of shared online storage, however it’s often misinterpreted as meaning that the internet doesn’t physically exist.

In reality, it’s anything but metaphysical. From Instagram accounts to parliamentary records, every digital snippet is stored in vast physical data centres, like Microsoft’s Washington data centre, which is the size of eight football fields. Running the servers inside these facilities is hugely energy-intensive, with some consuming up to 50 times more power than conventional buildings.

Of course, as the internet has grown, so too have these energy demands: the internet now stores an estimated 181 zettabytes of data. To put that into perspective, streaming just one zettabyte of HD video would take around 36 million years, so if you started during the dinosaur era, you’d barely be a quarter of the way through.

While it’s almost impossible to fully grasp the scale of the internet, one thing is clear: storing and powering this much data comes at a serious environmental cost. As of 2025, data centres are responsible for around 300 million metric tons of CO₂ emissions each year – more than the total emissions of over 90 percent of individual nations.


The Addition Of AI

While the internet already has high energy demands, adding AI into the mix is a whole new beast. One ChatGPT query consumes about 2.9 watt hours of energy – roughly 10 times that of a typical Google search. Each individual query is fairly inconsequential, but at scale the numbers add up: with over 1 billion queries submitted to ChatGPT per day in 2024, that translates to over 1,000 GWh of electricity a year – the equivalent annual usage of around 250,000 homes.
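The arithmetic behind those figures can be sketched in a few lines of Python. All the inputs are the article’s own estimates rather than measured values, and the household figure assumes a typical home uses roughly 4,200 kWh a year:

```python
WH_PER_QUERY = 2.9      # estimated energy per ChatGPT query (Wh)
QUERIES_PER_DAY = 1e9   # reported daily query volume in 2024

# Annual energy in watt hours, then converted to gigawatt hours
wh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365
gwh_per_year = wh_per_year / 1e9

# Assumed typical household consumption: ~4,200 kWh a year
homes_equivalent = (wh_per_year / 1e3) / 4200

print(f"{gwh_per_year:,.0f} GWh per year")       # ~1,059 GWh
print(f"~{homes_equivalent:,.0f} homes' worth")  # ~252,000 homes
```

Even with generous rounding, the back-of-the-envelope result lands comfortably above the 1,000 GWh and 250,000-home figures quoted above.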

So while data centres currently use around two percent of global electricity, with the continuing expansion of AI their energy demands are expected to double by 2030.

This comes down to how ChatGPT processes text: it runs on ‘tokens’, small units of text made up of a few characters or punctuation marks, with the average English word coming to around 1.5 tokens. The longer your exchange with ChatGPT, the more tokens it uses: ‘translate this’ takes up around four tokens, whereas ‘Hey, please could you translate this sentence for me?’ uses around 12. And the more tokens used, the greater the computational workload and the energy consumed.

The energy cost per token is tiny on its own (an estimated 0.1 watt hours – the equivalent of keeping an LED light on for 36 seconds), but with billions of extra tokens being processed every day, the added computational load leads to measurable increases in global energy demand.
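To see how the ‘politeness premium’ scales, here is a minimal sketch using the article’s own estimates (the token counts and the 0.1 Wh-per-token figure are rough approximations, not measurements):

```python
WH_PER_TOKEN = 0.1   # article's estimated energy per token (Wh)

tokens_terse = 4     # article's count for 'translate this'
tokens_polite = 12   # article's count for the polite version

extra_tokens = tokens_polite - tokens_terse
extra_wh = extra_tokens * WH_PER_TOKEN   # extra energy per polite request

# Scale the politeness premium to a billion requests a day (Wh -> MWh)
daily_extra_mwh = extra_wh * 1e9 / 1e6

print(f"{extra_tokens} extra tokens, {extra_wh:.1f} Wh per request")
print(f"~{daily_extra_mwh:,.0f} MWh a day at a billion requests")
```

Under these assumptions, eight extra tokens per request adds up to roughly 800 MWh a day across a billion requests – negligible individually, substantial in aggregate.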

A recent survey found 71 percent of British people are intentionally polite to AI, compared to 67 percent of Americans. But with research showing that British people say ‘please’ twice as often as our American counterparts, it seems we could be adding unnecessarily to the environmental impact.

Is AI Bad For The Planet?

The birth of AI has undoubtedly been transformational in our society, from disease diagnosis to personalised education and emotional support. But it is a double-edged sword, as AI is also contributing to rising energy demands. Experts warn that without major improvements in its efficiency, AI could account for four percent of global electricity demand by 2030, with emissions surpassing those of the aviation and shipping industries.

So, while our polite habits may seem inconsequential on a per person basis, on a global scale they can add up to the total energy demands of small cities. Luckily, unlike many contributions to our worsening climate, this is a very solvable issue. Dropping our manners is a small price to pay for emissions reductions.