
ChatGPT API Cost

The ChatGPT API cost is the per-use fee OpenAI charges developers and site owners to access GPT models programmatically — billed by token, not by the month.

What Is ChatGPT API Cost?

Who this is for: Small business owners or freelancers who want to add AI-powered features to a WordPress site and need to understand what OpenAI charges before committing.

Affiliate disclosure: WPSchool uses affiliate links to recommended products. This glossary entry is educational and contains no affiliate links.


Answer capsule: ChatGPT API cost refers to OpenAI’s usage-based pricing for accessing GPT models via API. As of April 2026, GPT-4.1 costs $2.00 per million input tokens and $8.00 per million output tokens. A typical chatbot reply of 500 words uses roughly 650 tokens, costing under $0.01.


What Is a Token?

A token is roughly 0.75 words of English text — short, common words like “site” count as one token, while longer or rarer words may split into two or more. Every request you send (input) and every reply you receive (output) is metered in tokens.

A 200-word product description you send plus a 300-word AI rewrite you receive equals roughly 650 tokens total. At GPT-4.1 rates, that exchange costs about $0.004 — less than half a cent.
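That arithmetic can be sketched in a few lines. This is a rough estimator, not OpenAI's tokenizer — it assumes the ~0.75 words-per-token rule of thumb and the GPT-4.1 rates quoted below; real token counts will vary by text.

```python
def estimate_tokens(words: int) -> int:
    """Approximate token count from an English word count (~0.75 words/token)."""
    return round(words / 0.75)

def exchange_cost(input_words: int, output_words: int,
                  input_rate: float = 2.00,    # $ per 1M input tokens (GPT-4.1)
                  output_rate: float = 8.00    # $ per 1M output tokens (GPT-4.1)
                  ) -> float:
    """Dollar cost of one request/reply pair at the given per-million rates."""
    return (estimate_tokens(input_words) * input_rate
            + estimate_tokens(output_words) * output_rate) / 1_000_000

# The 200-word prompt plus 300-word rewrite from the example above:
print(round(exchange_cost(200, 300), 4))  # → 0.0037
```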


How Does ChatGPT API Pricing Work?

OpenAI charges separately for input tokens (text you send) and output tokens (text the model returns). Output tokens cost more — typically 3–6× the input rate — because generating text is more compute-intensive than reading it.

Published rates, taken from OpenAI’s pricing page as of April 2026:

Model            Input (per 1M tokens)    Output (per 1M tokens)
GPT-4.1          $2.00                    $8.00
GPT-4.1 mini     $0.40                    $1.60
GPT-4o           $2.50                    $10.00
GPT-3.5 Turbo    $0.50                    $1.50

For most WordPress use cases — chatbots, content helpers, form-response drafters — GPT-4.1 mini delivers near-GPT-4.1 quality at 80% lower cost. We default to it on client sites handling under 10,000 monthly interactions.
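The rate table above can be captured as a small lookup so you can compare models for a given workload. A minimal sketch — the model names and prices simply mirror the table; check the pricing page before relying on them:

```python
# $ per 1M tokens, mirroring the published rates above.
RATES = {
    "gpt-4.1":       {"input": 2.00, "output": 8.00},
    "gpt-4.1-mini":  {"input": 0.40, "output": 1.60},
    "gpt-4o":        {"input": 2.50, "output": 10.00},
    "gpt-3.5-turbo": {"input": 0.50, "output": 1.50},
}

def cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Dollar cost of a workload on the given model."""
    r = RATES[model]
    return (input_tokens * r["input"] + output_tokens * r["output"]) / 1_000_000

# Same workload, two models: mini comes out 80% cheaper on both sides.
workload = (500_000, 500_000)  # input tokens, output tokens
full = cost("gpt-4.1", *workload)
mini = cost("gpt-4.1-mini", *workload)
```

Running the comparison shows the 80% saving directly: the mini cost is one fifth of the full GPT-4.1 cost for an identical token count.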


What Does This Mean for a WordPress Site?

A modest AI chatbot on a small business site handling 500 conversations per month (avg. 300 words in, 400 words out per conversation) consumes roughly 600,000 tokens. At GPT-4.1 mini rates, that is approximately $0.60/month — a cost most site owners will not notice.

Where costs escalate: high-volume content generation, image-generation endpoints (billed separately), and embedding large documents. In our testing with an ecommerce client running automated product descriptions across 2,000 SKUs, a single generation run cost $4.20 using GPT-4.1 mini.
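To budget a chatbot like the one above, you can run the same token math over a month of traffic. This sketch uses the ~0.75 words-per-token rule and GPT-4.1 mini rates; it lands slightly under the ~$0.60 figure above because it counts only the visible words — system prompts and formatting overhead add real tokens on top.

```python
WORDS_PER_TOKEN = 0.75
MINI_INPUT_RATE = 0.40    # $ per 1M input tokens (GPT-4.1 mini)
MINI_OUTPUT_RATE = 1.60   # $ per 1M output tokens (GPT-4.1 mini)

def monthly_cost(conversations: int, words_in: int, words_out: int) -> float:
    """Estimated monthly bill for a chatbot, from average words per conversation."""
    tokens_in = conversations * words_in / WORDS_PER_TOKEN
    tokens_out = conversations * words_out / WORDS_PER_TOKEN
    return (tokens_in * MINI_INPUT_RATE + tokens_out * MINI_OUTPUT_RATE) / 1_000_000

# 500 conversations/month, ~300 words in and ~400 words out each:
print(f"${monthly_cost(500, 300, 400):.2f}")  # → $0.51
```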


Do You Need a Paid ChatGPT Subscription to Use the API?

No. The ChatGPT consumer subscription (ChatGPT Plus at $20/month) is a separate product from the API. API access requires an OpenAI platform account with billing enabled — you pay only for what you use, with no monthly minimum.
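For a sense of what "billing enabled, pay per use" looks like in practice, here is a sketch of an authenticated request using only the Python standard library. It assumes the standard chat completions endpoint and an OPENAI_API_KEY environment variable; the request is built but deliberately not sent, since sending it consumes billable tokens.

```python
import json
import os
import urllib.request

def build_chat_request(prompt: str, model: str = "gpt-4.1-mini") -> urllib.request.Request:
    """Build (but do not send) an authenticated chat completions request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        "https://api.openai.com/v1/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            # The API key lives on your OpenAI platform account, not in ChatGPT Plus.
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
        },
    )

req = build_chat_request("Summarize our shipping policy in two sentences.")
# urllib.request.urlopen(req) would send it — and you pay only for the tokens used.
```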


Related Terms

  • Token — the unit OpenAI uses to measure text length for billing
  • API key — the credential your WordPress plugin uses to authenticate with OpenAI
  • Rate limit — the maximum requests-per-minute OpenAI allows on your account tier
  • Prompt engineering — writing efficient inputs to reduce token consumption
  • OpenAI Playground — a browser tool for testing API calls and estimating token usage before writing code


Last verified: April 2026