At a downtown event space in San Francisco on Monday, OpenAI, the creator of ChatGPT and GPT-4, held its first-ever developer conference, Dev Day. The company unveiled GPTs, a way to easily assemble custom versions of ChatGPT; a store to find and purchase custom GPTs; an “Assistant” API to make it easier for developers to call specific functions from their applications; and many more features and upgrades.
The only swag so far is a bunch of nifty pins with the OpenAI logo and labels such as “Engineering,” “Research,” and “Go to market.” Another set of pins displays personal pronouns.
First, Sam Altman took the stage and recapped the various milestones: ChatGPT a year ago, followed by GPT-4, which is “still the most powerful model”.
The company disclosed that more than 2 million developers are building on its APIs “for a wide range of use cases,” and that 92% of Fortune 500 companies use its products. ChatGPT itself has about 100 million weekly active users, the company said.
Altman then went into a rapid-fire itemization of the many innovations being announced. The various announcements received healthy rounds of applause — it’s an upbeat, enthusiastic crowd.
Altman brought a special guest on stage: Satya Nadella, CEO of Microsoft. Altman jokingly asked Nadella, “How is Microsoft thinking about the partnership?” That drew laughter from both Nadella and the audience.
“You guys have built something magical,” said Nadella. The partnership has “dramatically changed” the “shape” of Microsoft’s Azure cloud computing service, said Nadella. “Our job is to make the best system so you can build the best models,” added Nadella.
“The first thing we have been doing in partnership with you is building the system,” said Nadella. “We want to build our copilot as developers on OpenAI API.”
“A couple of things are going to be very, very key for us,” said Nadella. “We intend fully to commit ourselves deeply to make sure you have not only the best models but also the best compute,” said Nadella. “Our mission is to empower every individual.”
“I always think of Microsoft as a platform company, a developer company, and a partner company,” said Nadella. “The systems that are needed as you aggressively push forward on your roadmap require us to be on top of our game.” He added that the shared mission of the two companies is to empower every person and every organization on the planet to achieve more.
In response to Nadella, Altman remarked, “I’m excited for us to build AGI together,” meaning, artificial general intelligence, the notion of computers that can match human thought capabilities.
The key product and technology news includes:
- GPTs: custom versions of ChatGPT that OpenAI says “anyone can easily build” for specific tasks. The company is offering two initial custom GPTs, Canva and Zapier AI, for the popular design app and the workflow-automation software, respectively. The company plans to offer additional GPTs;
- GPT Store: Later in November, OpenAI will open the GPT Store, for obtaining the GPTs others have built, where developers can earn money for their creations;
- Copyright Shield, a program under which OpenAI will absorb the cost of defending customers against copyright claims;
- Fine-tuning service for GPT-4 for developers;
- Custom Models program for enterprises, a team of OpenAI researchers who will work with “selected organizations” to “train custom GPT-4 to their specific domain”;
- A new user interface for ChatGPT, a simple, dark background with the OpenAI logo, and the phrase, “How can I help you today?” The new user interface will make it easier to juggle between ChatGPT and DALL-E, the image-creation program from OpenAI, the company said;
- GPT-4’s knowledge of world events gets updated to April 2023, a big step past the program’s previous cutoff of September 2021. ChatGPT also gains the ability to search PDFs and other documents;
- A four-fold expansion of GPT-4’s “context window,” the amount of input it can take into account when formulating an answer, from 32,000 to 128,000 tokens, in a new “Turbo” version of the program. (For more on the various features of GPT models, see OpenAI’s Web site.);
- GPT-4 Turbo can now accept images as part of the prompt, and can generate “human-quality speech” as its output;
- Assistant API: a function-calling mechanism that makes it easier for developers to plug specific “assistant” functions into their apps, such as a “natural language-based data analysis app, a coding assistant, an AI-powered vacation planner, a voice-controlled DJ, a smart visual canvas”;
- A new “seed” parameter makes GPT return “reproducible outputs” “most of the time”;
- A new version of GPT-3.5 Turbo that gains greater function handling and JSON handling;
- Price cuts for GPT-4 Turbo and GPT-3.5 Turbo, measured per input and output tokens, along with a doubling of the “tokens per minute” rate that can be used.
Altman onstage demonstrated GPTs, writing from scratch a program called Startup Mentor, a program to give advice to entrepreneurs. He showed off uploading a file of a talk he had given, as an example of input via external sources. The program is built to answer questions such as, “What are the three things to look for when hiring people for a startup?”
Said Altman of custom models, “We won’t be able to do this with many companies to start, and it won’t be cheap.”
The Copyright Shield program, said Altman, “means that we will step in to defend our customers” in the instance of litigation, “and absorb the cost.”
OpenAI says a key element of the Assistant API is “persistent threads,” which “allow developers to avoid re-sending the entire conversation history with every new message and work around context window constraints.”
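The payoff of persistent threads can be shown with a toy sketch (this is not the real API, just an illustration of the payload-size difference; all names and values are our own):

```python
# Toy illustration of why persistent threads matter: without them, a
# client must re-send the whole conversation on every turn, so request
# payloads grow with history; with a server-side thread, each request
# carries only the thread id and the newest message.

history = []
stateless_payload_sizes = []
for turn in ["hi", "tell me more", "thanks"]:
    history.append({"role": "user", "content": turn})
    # stateless style: the full history rides along on every request
    stateless_payload_sizes.append(len(history))

# threaded style: a constant-size payload regardless of history length
threaded_payload = {"thread_id": "thread_abc123", "message": "thanks"}

print(stateless_payload_sizes)  # grows each turn: [1, 2, 3]
print(len(threaded_payload))    # stays constant: 2
```

The thread, stored server-side, is what lets the developer sidestep both the re-sending and the context-window bookkeeping OpenAI describes.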
The new GPT-4 Turbo can be accessed immediately in preview form, OpenAI said, by passing the model name gpt-4-1106-preview to the OpenAI API. A stable version is planned for release “in the coming weeks.”
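As a sketch, a request for the preview model might be shaped like the following. Only the model name gpt-4-1106-preview comes from OpenAI’s announcement; the surrounding structure follows the standard Chat Completions request format, and the prompt text is invented:

```python
# Sketch of the parameters one might pass to the Chat Completions
# endpoint to try the GPT-4 Turbo preview. The model name is from
# OpenAI's announcement; everything else is illustrative.

request = {
    "model": "gpt-4-1106-preview",  # the preview model named by OpenAI
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize Dev Day in one sentence."},
    ],
}

print(request["model"])  # gpt-4-1106-preview
```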
GPT-4 Turbo does a better job, the company said, of following specific instructions, such as to output a response in XML. It also gains a capability for responses in JSON via a new parameter, “response_format.”
The new seed parameter for GPT-4 is a beta feature that “is useful for use cases such as replaying requests for debugging, writing more comprehensive unit tests and generally having a higher degree of control over the model behavior,” the company said.
The announced price cuts mean that GPT-4 Turbo, for example, now costs a penny per thousand input tokens, versus three cents before, and three cents per thousand output tokens, versus six cents before, which the company bills as “three times cheaper” and “two times cheaper,” respectively.
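The arithmetic behind those figures can be checked with a small helper. The rates are the per-thousand-token prices cited above; the function and variable names are our own:

```python
# Hypothetical helper illustrating the GPT-4 Turbo price cut:
# new rates of $0.01 per 1,000 input tokens and $0.03 per 1,000 output
# tokens, versus the prior $0.03 and $0.06.

def request_cost(input_tokens: int, output_tokens: int,
                 in_rate: float, out_rate: float) -> float:
    """Cost in dollars for one request, with rates in $ per 1,000 tokens."""
    return (input_tokens / 1000) * in_rate + (output_tokens / 1000) * out_rate

# A request with 10,000 input tokens and 1,000 output tokens:
old_cost = request_cost(10_000, 1_000, in_rate=0.03, out_rate=0.06)
new_cost = request_cost(10_000, 1_000, in_rate=0.01, out_rate=0.03)

print(f"old: ${old_cost:.2f}, new: ${new_cost:.2f}")  # old: $0.36, new: $0.13
```

For input-heavy workloads the new rates land close to the advertised 3x savings; output-heavy requests trend toward 2x.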