Share Codex experiments

Just wrote a web scraper for my own blog using Codex.

The generated code worked well after changing just a couple of lines.

It even used the prompts as comments on the relevant code.

Some observations: adding prompts about global variables early on helps.
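
To give an idea of what I mean, here is a rough sketch of the prompt plus the kind of code Codex produced (the URL, selector, and variable names below are placeholders, not my actual blog):

"""
Scrape all post titles and links from my blog and save them to a CSV file.
Use the global variables BASE_URL and OUTPUT_FILE defined below.
"""
import csv
import requests
from bs4 import BeautifulSoup

BASE_URL = "https://example.com/blog"   # placeholder, not the real blog URL
OUTPUT_FILE = "posts.csv"

def scrape_posts():
    # Fetch the index page and pull out every post title and link
    html = requests.get(BASE_URL).text
    soup = BeautifulSoup(html, "html.parser")
    # The CSS selector depends entirely on the blog's HTML; this is just an example
    rows = [(a.get_text(strip=True), a["href"]) for a in soup.select("article h2 a")]
    with open(OUTPUT_FILE, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["title", "url"])
        writer.writerows(rows)

if __name__ == "__main__":
    scrape_posts()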

8 Likes

I tried codebase exploration with Codex /answers and it seems to work well: Codebase Exploration with Codex /answers.
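
For anyone who wants to try the same thing, the call is roughly this shape. This is only a sketch based on the alpha /answers endpoint in the old Python SDK; the file paths, question, and parameter values are placeholders:

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# Load a couple of source files from the codebase to ask questions about
# (paths are illustrative only)
docs = [open(path).read() for path in ["app/db.py", "app/config.py"]]

response = openai.Answer.create(
    search_model="ada",
    model="davinci-codex",
    question="Where is the database connection configured?",
    documents=docs,
    examples_context="def add(a, b):\n    return a + b",
    examples=[["What does add() return?", "The sum of its two arguments."]],
    max_tokens=100,
    stop=["\n\n"],
)

print(response["answers"][0])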

2 Likes

Nice one, thanks for sharing.

1 Like

This is great. Thank you, Harish!

Find a sentiment analysis model in @huggingface, create a @gradio app using Codex and test it out in 30 seconds. // by Omar Sanseviero @osanseviero :+1:

It works! But first:
pip install transformers
pip install gradio
;)

4 Likes

Did you use the JavaScript Codex or the Playground? I tried "Find a sentiment analysis model in @huggingface, create a @gradio app using Codex and test it out all in 30 seconds." but got an error in the browser (Firefox) console.

import os
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

response = openai.Completion.create(
  engine="davinci-codex",
  prompt="\"\"\"\ncreate an app to classify a tweet sentiment using distilbert-base-uncased-finetuned-sst-2-english\n\"\"\"\nimport gradio as gr\nimport transformers\nfrom transformers import DistilBertTokenizer\n\ntokenizer = DistilBertTokenizer.from_pretrained(\"distilbert-base-uncased-finetuned-sst-2-english\")\nmodel = transformers.DistilBertForSequenceClassification.from_pretrained(\"distilbert-base-uncased-finetuned-sst-2-english\")\n\ndef classify_sentiment(text):\n    input_ids = tokenizer.encode(text, return_tensors=\"pt\")\n    outputs = model(input_ids)\n    predictions = outputs[0].argmax().item()\n    if predictions == 0:\n        return \"negative\"\n    else:\n        return \"positive\"\n\niface = gr.Interface(classify_sentiment, \"textbox\", \"label\")\nif __name__ == \"__main__\":\n    iface.launch()",
  temperature=0,
  max_tokens=662,
  top_p=1,
  frequency_penalty=0,
  presence_penalty=0
)
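
If it helps: the generated continuation comes back in the completion text, which you can print and paste into a file:

print(response["choices"][0]["text"])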

2 Likes

Thanks, that is a much longer prompt than I had anticipated.

It's a short prompt + output.
Prompt:

"""
create an app to classify a tweet sentiment using distilbert-base-uncased-finetuned-sst-2-english
"""
import gradio as gr
import transformers

I appreciate this. I verified it works fine after doing pip install on gradio and transformers; I used Jupyter. If we were to draw a prompt lesson from this success, is it: start by importing the necessary packages for context? Just doing "Create an app to classify a tweet sentiment" is not sufficient (I obtained a list of things to do rather than code). But by making two things explicit, 1) "distilbert-base-uncased-finetuned-sst-2-english" (the Hugging Face model) as the model, and 2) the two Python packages/modules that are necessary, things work fine. I think we need to develop heuristics for prompt engineering.

2 Likes

Thanks for sharing my experiment. As mentioned, it requires running pip install first. Note that if the two libraries are already installed in your environment and you are in a Jupyter notebook, you won't need to install them again.

Also, as mentioned, you need to be explicit about the exact model and include at least the gradio import.

Another prompt I tried was:

"""
create a gradio app to classify a tweet sentiment using distilbert-base-uncased-finetuned-sst-2-english
"""

This also generates an acceptable output:

"""
create a gradio app to classify a tweet sentiment using distilbert-base-uncased-finetuned-sst-2-english
"""

import gradio as gr
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased-finetuned-sst-2-english")

def classify_sst2(text):
    encoding = tokenizer.encode_plus(text, return_tensors="pt")
    outputs = model(**encoding, return_dict=True)
    predictions = outputs.logits.detach().numpy()
    return predictions.argmax()

iface = gr.Interface(classify_sst2, "textbox", "label")
iface.launch()
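
One small tweak worth considering on top of the generated code (not something Codex produced): map the argmax index back to a readable label via the model config, so the Gradio label shows NEGATIVE/POSITIVE instead of 0/1:

def classify_sst2(text):
    encoding = tokenizer.encode_plus(text, return_tensors="pt")
    outputs = model(**encoding, return_dict=True)
    predictions = outputs.logits.detach().numpy()
    # id2label comes from the model's config, e.g. {0: "NEGATIVE", 1: "POSITIVE"}
    return model.config.id2label[int(predictions.argmax())]
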
2 Likes