HUGE ChatGPT Upgrade: 16K Context Length - Lower Prices + Testing
👊 Become a member:
https://www.youtube.com/c/AllAboutAI/join
Get a FREE 45+ ChatGPT Prompts PDF here:
📧 Join the newsletter:
https://www.allabtai.com/newsletter/
🌐 My website:
https://www.allabtai.com
Today, we’re following up with some exciting updates:
new function calling capability in the Chat Completions API
updated and more steerable versions of gpt-4 and gpt-3.5-turbo
new 16k context version of gpt-3.5-turbo (vs the standard 4k version)
75% cost reduction on our state-of-the-art embeddings model
25% cost reduction on input tokens for gpt-3.5-turbo
announcing the deprecation timeline for the gpt-3.5-turbo-0301 and gpt-4-0314 models
New models
GPT-4
gpt-4-0613 includes an updated and improved model with function calling.
gpt-4-32k-0613 includes the same improvements as gpt-4-0613, along with an extended context length for better comprehension of larger texts.
With these updates, we’ll be inviting many more people from the waitlist to try GPT-4 over the coming weeks, with the intent to remove the waitlist entirely with this model. Thank you to everyone who has been patiently waiting, we are excited to see what you build with GPT-4!
GPT-3.5 Turbo
gpt-3.5-turbo-0613 includes the same function calling as GPT-4, as well as more reliable steerability via the system message — two features that allow developers to guide the model's responses more effectively.
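To make the function-calling feature concrete, here is a minimal sketch of what such a request and response look like. The `get_current_weather` function and its schema are illustrative assumptions, not part of the announcement; with the openai Python SDK, the request dict below would be passed to `openai.ChatCompletion.create` (no API call is made in this sketch).

```python
import json

# Hypothetical function schema the model may choose to call.
# The name and parameters here are illustrative only.
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    }
]

# Shape of a Chat Completions request that enables function calling.
request = {
    "model": "gpt-3.5-turbo-0613",
    "messages": [{"role": "user", "content": "What's the weather in Oslo?"}],
    "functions": functions,
    "function_call": "auto",  # let the model decide whether to call
}

# When the model decides to call a function, the assistant message
# carries a function_call whose arguments are JSON-encoded, e.g.:
sample_response_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"city": "Oslo"}',
    },
}

# The developer parses the arguments and runs the real function.
args = json.loads(sample_response_message["function_call"]["arguments"])
print(args["city"])  # -> Oslo
```

Note that the model only emits the name and arguments of the function; actually executing it (and optionally feeding the result back in a follow-up message) is up to your code.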
gpt-3.5-turbo-16k offers 4 times the context length of gpt-3.5-turbo at twice the price: $0.003 per 1K input tokens and $0.004 per 1K output tokens. 16k context means the model can now support ~20 pages of text in a single request.
00:00 ChatGPT 16K Token Context Length