An API to put ChatGPT in any app

OpenAI opens access to ChatGPT via an API and the new Chat Markup Language format. Developers can also run their queries on dedicated instances in Azure.

The dissemination of texts produced by ChatGPT promises to accelerate. Developers can now integrate ChatGPT features into their apps via an API made available by OpenAI, as the firm already did in November for image generation. OpenAI specifies that requests must be formulated in Chat Markup Language (ChatML), a structured format it is introducing at the same time. ChatML attaches metadata to each message (for now, the author) alongside its content (for now, plain text), and OpenAI hints at additional capabilities to come. Requests via the API currently cost $1 per 500,000 tokens, the word fragments in which the model counts text.
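Concretely, a call could look like the sketch below. It is a minimal illustration assuming the openai Python package in its original chat interface and the gpt-3.5-turbo model name; those details are assumptions for illustration, not taken from OpenAI's announcement. The role and content fields of each message correspond to ChatML's author metadata and text content.

# Minimal sketch of a ChatGPT API call (assumes the openai Python package
# and the gpt-3.5-turbo model name).
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder, not a real key

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        # Each message carries ChatML-style metadata (the author/role)
        # and content (the text itself).
        {"role": "system", "content": "You are a helpful assistant embedded in a mail app."},
        {"role": "user", "content": "Draft a short reply declining the meeting."},
    ],
)

print(response["choices"][0]["message"]["content"])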

While the API has what it takes to make the number of apps integrating ChatGPT explode, that growth could be slowed by the app stores. According to the Wall Street Journal, Apple has just demanded that the latest update of BlueMail, an app that uses the chatbot, be restricted to users over 17, to the chagrin of its publisher, Blix. The Bing application for iOS, which also integrates the chatbot, is likewise reserved for adults.

Dedicated instances

OpenAI is also announcing the availability of dedicated instances on Azure for developers who do not want their API requests handled on shared infrastructure. They no longer pay per token, but for the period during which a portion of the infrastructure is reserved for their requests. Developers thereby gain better control over the load and can optimize the available compute. According to OpenAI, this model can become economically advantageous for developers processing more than roughly 450 million tokens per day.
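For a rough sense of that threshold, a back-of-the-envelope comparison using only the figures quoted above ($1 per 500,000 tokens on the shared API and OpenAI's stated 450-million-token daily volume) is sketched below. The dedicated-instance prices themselves are not detailed in the announcement, so the result only shows what the equivalent shared-API usage would cost.

# Back-of-the-envelope check using only the figures quoted in this article.
PRICE_PER_TOKEN = 1 / 500_000      # $1 per 500,000 tokens on the shared API
DAILY_TOKENS = 450_000_000         # OpenAI's stated break-even volume

daily_cost = DAILY_TOKENS * PRICE_PER_TOKEN
print(f"Shared-API cost at the break-even volume: ${daily_cost:,.0f} per day")
# Roughly $900 per day; a dedicated instance would need to undercut that to pay off.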
