
Saying These Simple Words To ChatGPT Is Costing OpenAI Millions Of Dollars

Growing up, we were all taught to be polite, but when you’re one of the world’s foremost AI companies, pleasantries can be expensive. According to OpenAI CEO Sam Altman, his firm is losing tens of millions of dollars thanks to well-mannered users. It’s no secret that generative AI is an expensive business, but what appears to be less well known is that, at least so far, it’s an unprofitable one. Generative AI models, with the notable exception of China’s DeepSeek, have cost companies like OpenAI and Google billions. OpenAI lost about $5 billion in 2024, despite ending the year with 15.5 million paid subscribers. The company nonetheless closed out a $40 billion funding round at the end of March 2025, more than double the largest amount any private company had ever raised in a single round before.

AI is an expensive business for several reasons. First, a massive amount of power is required to run the data centers where models are trained and run. One Washington Post estimate found that generating just 100 words uses enough electricity to keep 14 LED bulbs lit for an hour, along with over 519 milliliters of water, which is more than an average water bottle holds. Factor in the cost of supercomputer-class hardware from companies like NVIDIA, as well as the cost of top-tier AI talent, and you’ve got a business that churns through cash at an alarming pace. And then it turns out that being polite to an AI tacks on millions more in additional costs. Here’s what you need to know.
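For a rough sense of scale, here is a back-of-envelope sketch of that electricity figure in Python. The 10-watt bulb wattage is our own assumption for illustration, not a number from the Post’s analysis:

```python
# Back-of-envelope check of the Washington Post-style estimate.
# Assumption (not from the article): a typical LED bulb draws about 10 W.
LED_BULB_WATTS = 10   # assumed wattage per LED bulb
NUM_BULBS = 14        # bulbs kept lit for one hour, per the estimate
HOURS = 1

energy_kwh = LED_BULB_WATTS * NUM_BULBS * HOURS / 1000  # watt-hours -> kWh
print(f"Energy for ~100 generated words: about {energy_kwh:.2f} kWh")
# -> roughly 0.14 kWh per 100-word response, before the cooling water is counted
```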

Please and thank you are expensive courtesies, Sam Altman claims

In a reply on X (formerly Twitter) to someone who wondered how much money OpenAI has spent on electricity as a result of people saying “please” and “thank you” to ChatGPT, Sam Altman spilled the details. “Tens of millions of dollars well spent--you never know,” he posted.

To explain why niceties cost OpenAI money, a bit of basic (and heavily watered-down) computer science needs to be unpacked. Words you type into ChatGPT and other AI chatbots are broken down into units called tokens, and each token takes computing resources to process and respond to. As noted above, generating as few as 100 words uses a fair amount of power and water, and therefore money. When you’re polite to an AI bot, you’re also likely to get some extra text back acknowledging your courtesy, on top of the usual output. So, in addition to the cost of processing pleasantries like “please” and “thank you,” there’s the extra cost of the chatbot saying, “You’re welcome.”
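To see how even small courtesies add tokens, here’s a minimal sketch using OpenAI’s open-source tiktoken tokenizer. The exact tokenizer behind any given ChatGPT model may differ, so treat the counts as illustrative rather than a billing calculation:

```python
# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by several recent OpenAI models;
# the counts below are illustrative only.
enc = tiktoken.get_encoding("cl100k_base")

blunt = "Summarize this article."
polite = "Could you please summarize this article? Thank you!"

for prompt in (blunt, polite):
    tokens = enc.encode(prompt)
    print(f"{len(tokens):>2} tokens: {prompt}")

# The polite version costs a few extra tokens on the way in, and a friendly
# reply ("You're welcome!") adds a few more on the way out. Multiply that by
# hundreds of millions of chats and the bill adds up.
```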

Being nice is rarely a bad choice, even when talking to a non-sentient AI. However, OpenAI is so expensive to run that Altman has claimed the company loses money even on its priciest tier, the $200-a-month ChatGPT Pro subscription. Although the company seems able to hoover up investment capital like Cookie Monster at a Keebler factory, it’s hard to imagine it has any desire to spend more than necessary. So, why did Altman call the extra resources needed to process a kind word money well spent?

Should you be polite to AI?

Altman’s assertion that the money spent processing politeness is “well spent” because “you never know” seems to imply that we had better mind our Ps and Qs around AI or else suffer its wrath in the future. Altman has employed similar rhetoric repeatedly in recent years, which some view as a marketing strategy. What better way to convince investors your product will eventually make them trillions of dollars than to claim it will one day become a supreme AI intelligence? If you believe that’s where AI is headed, you’re likely to conclude that an ownership stake in that intelligence is worth every penny.

But does that mean you shouldn’t be polite to AI? Not necessarily. Research from Waseda University in Japan on how prompt politeness influences LLM performance suggests that talking nicely to a large language model can improve its output, with those who ask politely getting more accurate responses. It’s not as if ChatGPT has feelings, but there are some explanations behind this phenomenon.

AI can sound so humanlike because it is very good at picking up on context. Since it is trained on huge datasets of text scraped from every corner of the web, it has seen the patterns of both polite and impolite conversations. Think about the times you’ve come across an online argument that devolved into insults and attacks. Chances are there wasn’t a lot of useful information in that exchange. Contrast that with a polite exchange of ideas in which both parties shared their perspectives in good faith, which likely led to better outcomes. ChatGPT has innumerable examples of both types of interaction in its training data, and when you use polite language, it may be more likely to draw on the latter, leading to better answers.
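If you want to try this on your own prompts, here’s a minimal sketch using OpenAI’s official Python SDK. The model name and prompts are placeholders for illustration, not the setup used in the Waseda research, and any difference you see is anecdotal:

```python
# pip install openai
# Requires an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

QUESTION = "Explain why the sky is blue in two sentences."
prompts = {
    "blunt": QUESTION,
    "polite": f"Could you please help me? {QUESTION} Thank you!",
}

# Compare the answers side by side; this is an informal experiment,
# not a rigorous replication of the politeness study.
for tone, prompt in prompts.items():
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; swap in whatever you use
        messages=[{"role": "user", "content": prompt}],
    )
    print(f"--- {tone} ---")
    print(response.choices[0].message.content)
```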


Source: http://www.slashgear.com/1842415/chat-gpt-please-thank-you-cost-open-ai-millions/
