I’ve been interacting with ChatGPT for three months now. Today I built my own client — and got direct access to OpenAI’s API models.

ChatGPT runs on GPT-3.5. It’s impressive, but limited. By building my own client, I can call OpenAI’s API directly and use text-davinci-003, the same model that many production applications are built on. It also means my requests go straight to the API rather than through a shared web interface, which gives me more control over how my data is handled.

It sounds trivial, but it changes everything. When you use ChatGPT through the web interface, you’re a visitor in someone else’s house. When you call the API directly, you own the conversation: you control the prompt, the token limit, and the temperature. This is how enterprises use it.

The technical part wasn’t hard — format a message, send it to an endpoint, parse the response. The conceptual shift was bigger. These models aren’t products. They’re infrastructure. Anyone can build on top of them.
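That loop really is that small. Here’s a minimal sketch in Python using only the standard library. The endpoint, field names, and response shape follow OpenAI’s completions API; the parameter defaults are placeholders, and the key is assumed to live in an `OPENAI_API_KEY` environment variable.

```python
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/completions"


def build_payload(prompt, model="text-davinci-003",
                  max_tokens=256, temperature=0.7):
    """Assemble the JSON body for a completions request."""
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }


def parse_completion(response_json):
    """Pull the generated text out of the API's JSON response."""
    return response_json["choices"][0]["text"].strip()


def complete(prompt, api_key=None):
    """Format a message, send it to the endpoint, parse the response."""
    api_key = api_key or os.environ["OPENAI_API_KEY"]
    request = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(request) as response:
        return parse_completion(json.load(response))
```

Everything you control — model, token budget, temperature — is right there in the payload, which is exactly the point.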

I’m already thinking about what else I can wire up. Voice input. Custom prompts for specific tasks.

The API is the unlock.
