AI Integration in Jupyter Notebooks

Project Jupyter has released Jupyter AI, a new extension that lets users seamlessly harness the power of generative AI within Jupyter environments. It introduces the %%ai magic command, transforming Jupyter notebooks into reproducible AI platforms compatible with various interfaces like JupyterLab, Google Colab, and VSCode.

Here’s a short overview of its features:

  • The %%ai Magic Command:
    Loads generative AI into any notebook, letting you send a cell’s contents to a model as a prompt. It’s designed to work across IPython-compatible platforms.

  • Conversational Element:
    JupyterLab gets a native chat UI for interacting with an AI model conversationally.

  • Support for Various Models:
    The tool supports a range of AI providers, including AI21, Anthropic, Cohere, Hugging Face, OpenAI, SageMaker, and more.
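As a rough sketch of what a session looks like (the API key value and the prompt below are placeholders of my own, not from the docs), you load the magics extension, set your provider’s API key, and then prompt a model from a cell:

```python
# Cell 1: load the Jupyter AI magics extension
%load_ext jupyter_ai_magics

# Cell 2: the provider's API key is read from an environment variable
# (placeholder value shown here)
%env OPENAI_API_KEY=sk-placeholder

# Cell 3: the cell body is sent as a prompt to the chosen model,
# identified by a provider:model ID
%%ai openai-chat:gpt-3.5-turbo
Explain the difference between a list and a tuple in Python.
```

The model IDs accepted by %%ai are the provider:model strings from the %ai list output further down.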

For those interested in exploring further:

I’d love to hear your thoughts. Has anyone here experimented with it? If so, how was your experience? For those who haven’t, do you think this could be a useful addition to your toolkit?

Disclaimer: I want to clarify that I have no affiliation with Jupyter AI or any related products. I just found it interesting and thought it might be worth a discussion here.


For anyone looking to get started, there’s a whole directory of example notebooks in the GitHub repository here:

Here’s the full list of supported models from the %ai list command:

| Provider | Environment variable | Set? | Models |
| --- | --- | --- | --- |
| ai21 | `AI21_API_KEY` | :white_check_mark: | ai21:j1-large, ai21:j1-grande, ai21:j1-jumbo, ai21:j1-grande-instruct, ai21:j2-large, ai21:j2-grande, ai21:j2-jumbo, ai21:j2-grande-instruct, ai21:j2-jumbo-instruct |
| anthropic | `ANTHROPIC_API_KEY` | :white_check_mark: | anthropic:claude-v1, anthropic:claude-v1.0, anthropic:claude-v1.2, anthropic:claude-instant-v1, anthropic:claude-instant-v1.0 |
| cohere | `COHERE_API_KEY` | :white_check_mark: | cohere:medium, cohere:xlarge |
| huggingface_hub | `HUGGINGFACEHUB_API_TOKEN` | :white_check_mark: | See huggingface models for a list of models. Pass a model’s repository ID as the model ID; for example, huggingface_hub:ExampleOwner/example-model. |
| openai | `OPENAI_API_KEY` | :white_check_mark: | openai:text-davinci-003, openai:text-davinci-002, openai:text-curie-001, openai:text-babbage-001, openai:text-ada-001, openai:davinci, openai:curie, openai:babbage, openai:ada |
| openai-chat | `OPENAI_API_KEY` | :white_check_mark: | openai-chat:gpt-4, openai-chat:gpt-4-0314, openai-chat:gpt-4-32k, openai-chat:gpt-4-32k-0314, openai-chat:gpt-3.5-turbo, openai-chat:gpt-3.5-turbo-0301 |
| openai-chat-new | `OPENAI_API_KEY` | :white_check_mark: | openai-chat-new:gpt-4, openai-chat-new:gpt-4-0314, openai-chat-new:gpt-4-32k, openai-chat-new:gpt-4-32k-0314, openai-chat-new:gpt-3.5-turbo, openai-chat-new:gpt-3.5-turbo-0301 |
| sagemaker-endpoint | Not applicable. | N/A | Specify an endpoint name as the model ID. In addition, you must include the --region_name, --request_schema, and the --response_path arguments. For more information, see the documentation about SageMaker endpoints deployment and about using magic commands with SageMaker endpoints. |
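For the SageMaker case, those three required arguments combine into a cell along these lines (the endpoint name, region, request schema, and response path below are placeholders for your own deployment, not values from the docs):

```python
# Send a prompt to a self-hosted SageMaker endpoint; the endpoint name,
# region, schema, and response path are all placeholder values
%%ai sagemaker-endpoint:my-endpoint-name --region_name=us-east-1 --request_schema={"prompt":"<prompt>"} --response_path=generated_text
Summarize the pros and cons of notebook-based workflows.
```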

Very cool. Thanks for sharing. I wish I had more time sometimes haha


I’ve finally had some time to play around with this, and in the meantime they’ve updated it quite a bit. It now supports LangChain and vector databases. So far, it seems very easy to use.

Here are a few pictures from the GitHub repository: