Project Jupyter has unveiled Jupyter AI, a new extension that lets users seamlessly harness the power of generative AI within Jupyter environments. This tool introduces the %%ai magic command, turning Jupyter notebooks into reproducible generative AI platforms that work across interfaces like JupyterLab, Google Colab, and VSCode.
Here’s a short overview of its features:
The %%ai Magic Command:
Loading the extension lets you invoke AI models directly from notebook cells, and it’s designed to be compatible across different platforms.
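As a rough sketch of what usage looks like (the model ID here is just one example from the list further down; you’d run `%ai list` to see what’s available in your own setup, and the relevant API key has to be configured):

```python
# Cell 1: load the magics extension that ships with Jupyter AI
%load_ext jupyter_ai_magics
```

```python
# Cell 2: prefix a prompt with %%ai and a provider:model ID.
# The rest of the cell body is sent to the model as the prompt.
%%ai openai-chat:gpt-3.5-turbo
Explain the difference between a list and a tuple in Python.
```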
A Native Chat UI:
There’s a chat interface built into JupyterLab that lets you interact with an AI model conversationally.
Support for Various Models:
The tool claims compatibility with a range of AI providers like AI21, Anthropic, Cohere, Hugging Face, OpenAI, SageMaker, and more.
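Each provider needs its own credentials; as far as I can tell, these are read from environment variables (e.g. OPENAI_API_KEY for OpenAI — the value below is just a placeholder):

```python
# Set the provider's API key in the notebook environment before using %%ai.
# %env is the standard IPython magic for setting environment variables.
%env OPENAI_API_KEY=sk-...
```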
For those interested in exploring further:
I’d love to hear your thoughts. Has anyone here experimented with it? If so, how was your experience? For those who haven’t, do you think this could be a useful addition to your toolkit?
Disclaimer: I want to clarify that I have no affiliation with Jupyter AI or any related products. I just found it interesting and thought it might be worth a discussion here.
For anyone looking to get started, there’s a whole directory of example notebooks in the GitHub repository here:
Here’s the full list of supported models from the %ai list command:
- ai21: ai21:j1-large, ai21:j1-grande, ai21:j1-jumbo, ai21:j1-grande-instruct, ai21:j2-large, ai21:j2-grande, ai21:j2-jumbo, ai21:j2-grande-instruct, ai21:j2-jumbo-instruct
- anthropic: anthropic:claude-v1, anthropic:claude-v1.0, anthropic:claude-v1.2, anthropic:claude-instant-v1, anthropic:claude-instant-v1.0
- huggingface_hub: See Hugging Face models for a list of models. Pass a model’s repository ID as the model ID; for example, huggingface_hub:ExampleOwner/example-model.
- openai: openai:text-davinci-003, openai:text-davinci-002, openai:text-curie-001, openai:text-babbage-001, openai:text-ada-001, openai:davinci, openai:curie, openai:babbage, openai:ada
- openai-chat: openai-chat:gpt-4, openai-chat:gpt-4-0314, openai-chat:gpt-4-32k, openai-chat:gpt-4-32k-0314, openai-chat:gpt-3.5-turbo, openai-chat:gpt-3.5-turbo-0301
- openai-chat-new: openai-chat-new:gpt-4, openai-chat-new:gpt-4-0314, openai-chat-new:gpt-4-32k, openai-chat-new:gpt-4-32k-0314, openai-chat-new:gpt-3.5-turbo, openai-chat-new:gpt-3.5-turbo-0301
- sagemaker-endpoint: Specify an endpoint name as the model ID. In addition, you must include the --region_name, --request_schema, and --response_path arguments. For more information, see the documentation about SageMaker endpoint deployment and about using magic commands with SageMaker endpoints.
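To illustrate the SageMaker case, an invocation would look roughly like this. The endpoint name, region, request schema, and response path below are all placeholders — they depend entirely on how your own model is deployed, so check the Jupyter AI docs for the exact values your endpoint needs:

```python
# Placeholder endpoint name and arguments; substitute values matching
# your own SageMaker deployment.
%%ai sagemaker-endpoint:my-endpoint-name --region_name=us-east-1 --request_schema={"prompt":"<prompt>"} --response_path=generated_texts
Summarize the results in this notebook.
```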
Very cool. Thanks for sharing. I wish I had more time sometimes haha
I’ve finally had some time to play around with this, and in the meantime it’s been updated quite a bit. It now supports LangChain and vector databases. So far, it seems very easy to use.
Here are a few pictures from the GitHub repo: