OpenAI wants to keep its leading position in the generative AI space, and to that end, the company has announced new features for its models along with price reductions.
The developers of ChatGPT took to the OpenAI blog to announce updated versions of gpt-3.5-turbo and GPT-4. As OpenAI explains in its blog post, both models are getting a feature called function calling, which allows developers to build chatbots that hand off user requests to external tools such as ChatGPT plugins.
An example would be converting "Email Anya to see if she wants to get coffee next Friday" into a call such as "send_email(to: string, body: string)". Another would be turning "What's the weather like in Boston?" into "get_current_weather(location: string, unit: 'celsius' | 'fahrenheit')". The feature also works for converting natural language into API calls or database queries: the prompt "Who are my top ten customers this month?" could become "get_customers_by_revenue(start_date: string, end_date: string, limit: int)".
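In practice, the developer describes each function to the model as a JSON Schema object, and the model responds with the function name and a JSON string of arguments, which the application then executes itself. The sketch below illustrates that flow with OpenAI's own get_current_weather example; the schema mirrors the shape used in the announcement, while the stub weather function, the dispatcher, and the hard-coded model reply are illustrative assumptions, not real API output.

```python
import json

# Tool schema in the JSON Schema shape the function calling feature
# expects. The function mirrors OpenAI's get_current_weather example
# and stands in for any real tool.
WEATHER_FUNCTION = {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g. Boston, MA",
            },
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location"],
    },
}


def get_current_weather(location: str, unit: str = "fahrenheit") -> dict:
    """Stub standing in for a real weather API call."""
    return {"location": location, "unit": unit, "temperature": 72}


def dispatch(function_call: dict) -> dict:
    """Route a model-returned function_call payload to local code.

    The model returns the function name plus its arguments serialized
    as a JSON string, so the arguments must be parsed before calling.
    """
    handlers = {"get_current_weather": get_current_weather}
    args = json.loads(function_call["arguments"])
    return handlers[function_call["name"]](**args)


# Hypothetical shape of the function_call field a chat completion
# might return for "What's the weather like in Boston?"
reply = {
    "name": "get_current_weather",
    "arguments": '{"location": "Boston, MA", "unit": "fahrenheit"}',
}
print(dispatch(reply))
```

The key point is that the model never runs code itself: it only emits a structured request, and the application stays in control of what actually gets executed.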
On pricing, OpenAI writes: "We continue to make our systems more efficient and are passing those savings on to developers, effective today."
text-embedding-ada-002 is our most popular embeddings model. Today we're reducing the cost by 75% to $0.0001 per 1K tokens.
gpt-3.5-turbo is our most popular chat model and powers ChatGPT for millions of users. Today we're reducing the cost of gpt-3.5-turbo's input tokens by 25%. Developers can now use this model for just $0.0015 per 1K input tokens and $0.002 per 1K output tokens, which equates to roughly 700 pages per dollar.
gpt-3.5-turbo-16k, which offers four times the context length of gpt-3.5-turbo, will be priced at $0.003 per 1K input tokens and $0.004 per 1K output tokens.
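The "roughly 700 pages per dollar" figure can be sanity-checked with quick arithmetic. Assuming around 750 tokens per page (a common rule of thumb, not a number from the announcement):

```python
# Back-of-the-envelope check of the "roughly 700 pages per dollar"
# claim for gpt-3.5-turbo. TOKENS_PER_PAGE is an assumption (a common
# rule of thumb), not a figure from the announcement.
TOKENS_PER_PAGE = 750

INPUT_PRICE_PER_1K = 0.0015   # dollars per 1K input tokens
OUTPUT_PRICE_PER_1K = 0.002   # dollars per 1K output tokens


def pages_per_dollar(price_per_1k_tokens: float) -> float:
    """How many ~750-token pages one dollar buys at a given token price."""
    tokens_per_dollar = 1_000 / price_per_1k_tokens
    return tokens_per_dollar / TOKENS_PER_PAGE


print(round(pages_per_dollar(INPUT_PRICE_PER_1K)))   # input-only: 889
print(round(pages_per_dollar(OUTPUT_PRICE_PER_1K)))  # output-only: 667
```

The quoted ~700 pages per dollar sits between these input-only and output-only bounds, consistent with a workload that mixes both token types.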