When will training be available on 3.5?

From what I have learned, it appears fine-tuning for my account is only available on GPT-3 base models and the /v1/completions endpoint.

Any call to my fine-tuned models on the /v1/chat/completions endpoint is denied.

I also see a difference in the request syntax between the two endpoints.

So I am wondering: when will I be able to fine-tune a 3.5 base model and use that trained model on the /v1/chat/completions endpoint?
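For anyone comparing the two, here is a minimal sketch of the differing request bodies (the model names are illustrative examples, and the bodies are built as plain dicts rather than sent to the API):

```python
import json

# Legacy /v1/completions endpoint: a single "prompt" string.
completions_body = {
    "model": "davinci-002",          # example completions-style model
    "prompt": "Say hello to the user.",
    "max_tokens": 50,
}

# /v1/chat/completions endpoint: a list of role-tagged "messages".
chat_body = {
    "model": "gpt-3.5-turbo",        # example chat model
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello to the user."},
    ],
    "max_tokens": 50,
}

print(json.dumps(completions_body, indent=2))
print(json.dumps(chat_body, indent=2))
```

The structural difference is exactly the `prompt` string versus the `messages` list, which is why a fine-tune built for one endpoint cannot simply be called on the other.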

Also, what is the status of web search and local iPhone file reading and writing?

Right now those seem like the two most powerful tools for automating anything.

Cheers to the team's outstanding job so far; I'm just trying to catch up with you.

I was investigating how Bing Chat was able to pull off local weather, and apparently they have an index file, updated from Bing search, that the model uses. I see there is a chatPC to gain access to files on a Mac. So iPhone file access would be popular for gaining real-time info and exporting work to be done by servers.

Seems Azure has gotten a head start on this web-search ability; I hope OpenAI will include it in 4.0, with read/write file access locally on an iPhone.

OpenAI wrote in a recent blog post that they hope to have fine-tuning using gpt-3.5-turbo and gpt-4 as base models available by the end of the year:

We are working on safely enabling fine-tuning for GPT-4 and GPT-3.5 Turbo and expect this feature to be available later this year.


That will be nice :+1:. The system and assistant roles are powerful in setting a stage for the user. Having a coherent model base with a training file could make a bot business-ready. External file access could help in automating tasks and provide near real-time information, like Bing Chat has achieved.

I think you have some misconceptions. What do you mean by “index file they update”? The AI of Bing Chat is simply prompted to “search” when you ask for the weather (it does of course use geolocation in the search, not pictured in the screenshot):

And to inform the AI, they don’t go crazy and allow the AI to request whatever URLs it wants (like Google does); it has a limited scope, but the API interface can also display function returns for images based on what the AI has decided to search for.

(Bing continues to tweak the functions I depicted; for example, the image creator can revise the previous image.)

These are all capabilities similar to those that developers can implement themselves.
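To make the pattern concrete, here is a sketch of that kind of layer: the model emits a structured decision to call a function, your own code runs it, and the result is handed back. The `get_weather` function, its arguments, and the stubbed data are all hypothetical stand-ins, and the model's decision is simulated rather than fetched from the API:

```python
import json

def get_weather(city: str) -> dict:
    # Hypothetical helper; a real app would call a weather service here.
    return {"city": city, "forecast": "rain", "high_f": 54}

# Limited scope: only functions you register can ever be invoked.
AVAILABLE_FUNCTIONS = {"get_weather": get_weather}

# Simulated shape of a model's function-call decision:
model_decision = {
    "name": "get_weather",
    "arguments": json.dumps({"city": "Seattle"}),
}

fn = AVAILABLE_FUNCTIONS[model_decision["name"]]
result = fn(**json.loads(model_decision["arguments"]))
print(result)
```

The key point is that the AI never fetches anything itself; the developer's code between the model and the outside world does all the retrieval.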


Interesting. I was simply asking Bing how it was able to give me a weather forecast when OpenAI cannot. Bing replied that it keeps an offline index file that is updated using Bing search. So while my chatbot cannot tell me my weather, Bing can. What I don’t know is whether this is available to Azure users of their OpenAI implementation. I don’t see myself implementing this on my chatbot until I can figure out a way to seed it, perhaps with the system input?

You have what is called an “AI hallucination” there.

I was not able to get Bing Chat to make up such a farcical scenario, even when I tried in “creative mode”, asked about Bing (which it would know as the search engine), or led it on.


And no: you make a function yourself that can retrieve the weather or other such information from web sources, maybe a specific weather site with an API.
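A minimal sketch of such a self-made weather lookup. The endpoint URL and the response shape below are hypothetical stand-ins for whatever weather provider you pick, and the HTTP call is stubbed with a sample JSON string so the sketch stays self-contained:

```python
import json

def build_weather_url(lat: float, lon: float) -> str:
    # Hypothetical endpoint; substitute your provider's real URL and key.
    return f"https://api.example-weather.test/forecast?lat={lat}&lon={lon}"

def parse_forecast(raw_json: str) -> str:
    # Turn the provider's JSON into a short string the model can use.
    data = json.loads(raw_json)
    return f"{data['summary']}, high {data['high_c']}C"

# Stubbed response standing in for a live HTTP call:
sample = json.dumps({"summary": "Light rain", "high_c": 12})
print(build_weather_url(47.61, -122.33))
print(parse_forecast(sample))
```

In a real bot, the parsed string would be inserted into the conversation (for example as a function result or system message) so the model can answer from current data.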

Just ask Bing to show you the weather for your location, then ask how it did that when OpenAI cannot. What you shall learn is that Bing has implemented an offline index file from Bing search, used for near real-time information. That is something OpenAI does not offer, unless the Plus subscription web search option will do this :woman_shrugging: I have to wait until I get paid to find out. The system or assistant inputs almost let us do it, but not for large files I suspect.

I get paid next week and am buying a Plus subscription.

You are asking a question that it interprets as asking how Bing, and search engines in general, work. It is not answering specifically about the AI capabilities and implementation, which I detailed above.

My point is that near real-time searches are now available via a plugin Bing implemented, available to Plus OpenAI accounts. I want to stay with OpenAI and not join Azure. I want to stay within what is released by OpenAI and not use any custom models; that is just what my experience tells me to do, and it avoids a lot of noise. Your examples may well be what Bing developers have adopted :woman_shrugging: it was a puzzle palace when I worked there for Bing. I am only interested in building a site to make money :moneybag:. Not interested in deep diving, bro; I am retired from those gigs. Now I want to build fun stuff people like, base it on the shareware model that has never died, and make it easy for users to take my code and get it up and running in an hour on THEIR own web site.

To go in a roundabout fashion back to your original question: there is no need to “train” an AI engine.

If you were previously able to make and use an engine such as davinci on the completions endpoint for a chatbot, there are some changes you need to make in your program to the format of the messages you use on the “chat” endpoint, besides just selecting a chat model such as “gpt-3.5-turbo”. Improperly formatted messages will return an error.

You can read about these changes in the chat completions section of the API guide, along with how a software programmer can write their own functions to extend the capability of the AI.

(this is knowledge newer than ChatGPT can answer about)
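As a concrete sketch of the reformatting, here is a small helper that wraps an old completions-style prompt into the chat message list. The helper name and the default system text are my own illustration; the three core role names are the ones the chat endpoint accepts:

```python
# Core roles accepted by the chat endpoint for ordinary messages.
VALID_ROLES = {"system", "user", "assistant"}

def to_chat_messages(prompt: str,
                     system: str = "You are a helpful assistant.") -> list:
    """Wrap a legacy single-string prompt into chat-format messages."""
    messages = [
        {"role": "system", "content": system},
        {"role": "user", "content": prompt},
    ]
    # Messages with unknown roles would be rejected by the endpoint.
    for m in messages:
        assert m["role"] in VALID_ROLES, f"invalid role: {m['role']}"
    return messages

print(to_chat_messages("What is the capital of France?"))
```

The old `prompt` parameter effectively becomes the `user` message, with the framing that used to live at the top of the prompt moved into the `system` message.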

If you subscribe to ChatGPT Plus for $20/mo, “Search with Bing” is currently disabled due to concerns about some website publishers' rights; however, there are many plugins that can be installed, and among them, yes, some can tell you about today's weather as well as Google can.


Very helpful information; good to know the search is disabled. Perhaps they will figure out how to get around the publisher-rights issue, though it is a big one and I suspect they will not bust through it soon. Microsoft has an entire building of lawyers in Bellevue; I used to get off in front of it every day. I suspect they are not scared :sunglasses:. OpenAI, a smaller company, needs to be careful.

Yeah, I am done training 3.0 models; I get it now. The chat endpoint gives me the system and assistant inputs, which seem more than enough for most purposes. Detailing a business, however, will require 3.5 capabilities and training, and being a useful, say, Alaskan Airlines online agent will need file access like what chatPC offers on the Mac. Putting all that together, automation work will begin in a BIG way across ALL business websites, in my view.

The revolution is clearly in the starting gates; the horses are busting to get loose upon the websites :sunglasses:. I think these existing business chat bots are lame :unamused:. When web developers realize how to train a 3.5 GPT model and make outside calls to read and post, every site on the web will adopt it, wait and see.

But right now it makes no sense to advertise what is available, as it cannot yet be used to provide real-time online agent capability, though I see that coming soon.

I guess this discussion has convinced me to buy an Azure OpenAI package; they offer web search, which is vital for my chat bot offerings.

No, this is not built into the model; it is built as a layer on top of the model.
They use a model very similar in capability to the one you get from the API.
Any additional improvements are developments they provide via an interposer and transformation layer on the query/response.


It makes more sense for Bing to do it that way. So I am just itching to get into Azure to see what they permit users to use.

No, they do not offer such a thing.

They’ve got the same OpenAI models (and actually no function support unless you are using GPT-4).

I’m sorry my answering of multi-faceted questions has made no impact.

Azure does not add any additional APIs on top of OpenAI.
They don’t even give you GPT-4 yet (although hopefully that changes soon)


Interesting. They are promising GPT-4 and search with a subscription. Mmmmmm, I am going to wait and see what my 20 dollars buys me next week with a Plus subscription, and whether the rumors are true about web search and GPT-4. I really, really, really don't want another monthly bill :sunglasses: I want to stay 100% OpenAI code.

The 20 dollars is for the consumer-facing “chat” product, which is totally different from the paid-for API access.
The chat product has wrappers and web services in front of it that are not available for API usage.
