Cannot import name 'GPT4LMHeadModel' from 'transformers'

I use the same key for gpt-4, which is working fine, but when I do

from transformers import GPT4LMHeadModel, GPT4Tokenizer

I get this cannot-import error.

I’ve installed openai and transformers (from the GitHub repo); importing GPT2LMHeadModel works.

Any ideas?

Hi!

Why are you doing that? transformers is for running models from Hugging Face.

While gpt2 is on Hugging Face, GPT-3, 3.5, and 4 aren’t! You’ll probably want to use the openai lib.

from openai import OpenAI
client = OpenAI()  # reads OPENAI_API_KEY from the environment

completion = client.chat.completions.create(
  model="gpt-4-0125-preview",
  messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"}
  ]
)

print(completion.choices[0].message)

https://platform.openai.com/docs/api-reference/chat


I don’t want to call the API; I want to train/use a customized GPT-4 model. And I saw others (via a Google search) using this same code.

I can confirm that will not work.


Interesting. Can you share some of these confusing results?

https://www.tome01.com/fine-tuning-openai-gpt-4-deep-dive-into-the-process-and-python-implementation


Thanks for sharing.
Google is getting overwhelmed by AI-generated content, and this “thing” is apparently one of the many culprits.

In short: GPT-4 fine-tuning is currently in a limited beta and only available to users with a track record of fine-tuning the older models on the OpenAI platform.
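Even where fine-tuning is available, it goes through OpenAI’s API, not transformers. A minimal sketch of the training-data format (the commented job-creation calls assume the current openai Python client; the model name and file handling are illustrative, and availability depends on your account):

```python
import json

# One training example in OpenAI's chat fine-tuning JSONL format:
# each line of the training file is a JSON object with a "messages" list.
example = {
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
        {"role": "assistant", "content": "Paris."},
    ]
}

# Write the example as one line of a JSONL training file
with open("train.jsonl", "w") as f:
    f.write(json.dumps(example) + "\n")

# The job itself is then created through the API, never through transformers
# (illustrative only -- requires an API key and fine-tuning access):
# uploaded = client.files.create(file=open("train.jsonl", "rb"), purpose="fine-tune")
# client.fine_tuning.jobs.create(training_file=uploaded.id, model="gpt-3.5-turbo")
```

The resulting model is hosted by OpenAI and called through the same chat completions endpoint; you never download weights to load locally.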

The following post is from a real Data Scientist:
https://mangeshkendre01.medium.com/artificial-intelligence-insights-701a53a0f3da

Looks more like an SEO experiment to me :thinking:
