  • William
  • Blog
  • 6 minutes to read

Saying These Simple Words To ChatGPT Is Costing OpenAI Millions Of Dollars

Growing up, we were all taught to be polite, but when you’re one of the world’s foremost AI companies, pleasantries can be expensive. According to OpenAI CEO Sam Altman, his firm is losing tens of millions of dollars thanks to well-mannered users. It’s no secret that generative AI is an expensive business, but what appears to be less well known is that, at least so far, it’s an unprofitable one. Generative AI models, with the notable exception of China’s DeepSeek, have cost companies like OpenAI and Google billions. OpenAI lost about $5 billion in 2024, despite ending the year with 15.5 million paid subscribers. The company nonetheless closed out a $40 billion funding round at the end of March 2025, more than double the previous record for a private funding round.

AI is an expensive business for several reasons. First, a massive amount of power is required to run the data centers where models are trained and run. One Washington Post estimate found that generating just 100 words uses enough electricity to keep 14 LED lights on for an hour, and consumes over 519 milliliters of water. That’s more than an average water bottle holds. Factor in the cost of supercomputers from companies like NVIDIA, as well as the cost of top-tier AI talent, and you’ve got a business that churns through cash at an alarming pace. And then it turns out that being polite to an AI tacks on millions more in additional costs. Here’s what you need to know.

Please and thank you are expensive courtesies, Sam Altman claims

In a reply on X (formerly Twitter) to someone who wondered how much money OpenAI has spent on electricity as a result of people saying “please” and “thank you” to ChatGPT, Sam Altman spilled the details: “tens of millions of dollars well spent–you never know,” he posted.

To explain why niceties cost OpenAI money, there’s a bit of basic (and heavily simplified) computer science that needs to be unpacked. The words you type into ChatGPT and other AI chatbots are broken into units called tokens, and each token takes computing resources to process and respond to. As noted above, generating as few as 100 words uses quite a bit of power and water, and therefore money. When you’re polite to an AI bot, you’re also likely to get some extra text back acknowledging your courtesy, on top of the usual output. So, in addition to the cost of processing pleasantries like “please” and “thank you,” there’s the extra cost attached to the chatbot saying, “You’re welcome.”
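To get a feel for how a few polite words snowball into real money, here is a rough back-of-the-envelope sketch in Python. The ~4-characters-per-token heuristic, the per-token price, and the 500-million-chats figure are all illustrative assumptions, not OpenAI’s actual numbers; real tokenizers (such as OpenAI’s tiktoken) split text differently.

```python
# Illustrative sketch only: the 4-chars-per-token heuristic, the price,
# and the chat volume below are assumptions, not OpenAI's real figures.

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly one token per 4 characters."""
    return max(1, round(len(text) / 4))

def extra_cost(polite_words: str, acknowledgment: str,
               price_per_million_tokens: float = 10.0) -> float:
    """Dollar cost of the extra polite input plus the extra reply text."""
    extra = estimate_tokens(polite_words) + estimate_tokens(acknowledgment)
    return extra * price_per_million_tokens / 1_000_000

# One polite exchange costs a tiny fraction of a cent...
per_chat = extra_cost("Please. Thank you!", "You're welcome!")
# ...but multiplied across hundreds of millions of chats, it adds up.
across_many = per_chat * 500_000_000
print(f"per chat: ${per_chat:.8f}, across 500M chats: ${across_many:,.2f}")
```

Under these made-up numbers, each courteous exchange costs well under a hundredth of a cent, yet scaled across hundreds of millions of conversations the total lands in five figures per batch, which makes Altman’s “tens of millions” claim easy to believe.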

Being nice is rarely a bad choice, even when talking to a non-sentient AI. However, OpenAI is so expensive to run that Altman has claimed the company loses money on its most premium, $200-a-month ChatGPT Pro subscription. Although the company seems able to hoover up investment capital like Cookie Monster at a Keebler factory, it’s hard to imagine it has any desire to spend more than necessary. So, why did Altman call the extra resources needed to process a kind word money well spent?

Should you be polite to AI?

Altman’s assertion that the money spent processing politeness is “well spent” because “you never know” seems to imply that humanity had better mind its Ps and Qs around AI or else suffer its wrath in the future. Altman has employed similar rhetoric countless times in recent years, which some view as a marketing strategy. What better way to convince investors your product will eventually make them trillions of dollars than to claim it will one day become a supreme AI intelligence? If you believe that’s where AI is headed, you’re likely to conclude that an ownership stake in that intelligence is worth every penny.

But does that mean you shouldn’t be polite to AI? Not necessarily. Research on the influence of prompt politeness on LLM performance from Waseda University, Japan, suggests that talking nicely to a large language model can improve its output, with those who ask politely getting more accurate responses. It’s not as if ChatGPT has feelings, but there are some explanations for this phenomenon.

AI is able to sound so humanlike because it is very good at understanding context, and since it is trained on large datasets composed of text scraped from every possible corner of the web, it understands the context of conversations, both polite and impolite. Think about times you’ve come across an online argument that had degenerated into insults and attacks. Chances are there wasn’t a lot of useful information in that exchange. Contrast this with a polite exchange of ideas where both parties shared their perspectives in good faith, likely leading to better outcomes. ChatGPT has innumerable examples of both types of interactions in its training data, and when you use polite language, it may be more likely to draw upon the latter, leading to better answers.


Source: http://www.slashgear.com/1842415/chat-gpt-please-thank-you-cost-open-ai-millions/

