If you like this repo, star it on GitHub — it helps a lot!
Overview • Usage • Run the samples • Next steps • Resources
Try out generative AI models right in your browser for free using this playground!
Using GitHub Codespaces and Ollama, you'll be able to run SLMs (Small Language Models) such as Phi-3 directly in your browser, without having to install anything.
This project is designed to be opened in GitHub Codespaces, which provides a pre-configured environment to run the code and AI models. Follow these steps to get started:
Click on the "Codespaces: Open" button:
Once the Codespace is loaded, it should have Ollama pre-installed as well as the OpenAI Node SDK.
Ask Ollama to run the SLM of your choice. For example, to run the Phi-3 model:
ollama run phi3
Loading the model will take a few seconds. Once you see >>> in the output, you can send a message to the model directly from that prompt. For example:
>>> Write a haiku about a furry kitten
After several seconds, you should see a response stream in from the model.
When you're done, close the session with the model by typing /bye and pressing Enter.
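Beyond the interactive CLI, the same local model can also be queried from code using the OpenAI Node SDK that comes pre-installed in the Codespace. Here's a minimal sketch, assuming Ollama is listening on its default port 11434 (where it exposes an OpenAI-compatible API under /v1) and that you've pulled the phi3 model as shown above:

```typescript
import OpenAI from "openai";

// Point the OpenAI SDK at the local Ollama server (OpenAI-compatible API).
// Assumptions: Ollama runs on its default port 11434 and phi3 has been pulled.
const openai = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // Ollama ignores the key, but the SDK requires a non-empty value
});

const response = await openai.chat.completions.create({
  model: "phi3",
  messages: [{ role: "user", content: "Write a haiku about a furry kitten" }],
});

console.log(response.choices[0].message.content);
```

You can save this as a .ts file and run it with tsx, the same way the samples described below are run.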
Open the file ollama.ipynb in the editor and follow the instructions.
Tip
While you're following the instructions in the interactive notebook, you can run each code cell by clicking the "Execute cell" button that appears next to it.
In the samples folder of this repository, you'll find examples of how to use generative AI models with the OpenAI Node.js SDK. You can run a sample by executing the following command in the terminal, completing the path with the file you want to run:
tsx samples/
Alternatively, you can open a sample file in the editor and run it directly by clicking the "Run" button at the top of the editor.
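The samples all build on the OpenAI Node.js SDK. As an illustration of what such code can look like (not the exact contents of any particular sample), here is a hedged sketch of a streaming chat completion against the local Ollama server, under the same assumptions as the earlier snippet:

```typescript
import OpenAI from "openai";

// Streaming sketch against the local Ollama server.
// Assumptions: default port 11434, OpenAI-compatible API, phi3 model pulled.
const openai = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama",
});

const stream = await openai.chat.completions.create({
  model: "phi3",
  messages: [{ role: "user", content: "Write a haiku about a furry kitten" }],
  stream: true,
});

// Print tokens to the terminal as they arrive.
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```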
Important
Some samples require you to start the Azure OpenAI emulator first. You can do so by running the following command in a terminal, and keep it running while you run the samples:
ollamazure -d
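The emulator exposes an Azure OpenAI-compatible endpoint backed by your local Ollama models, so samples written for Azure OpenAI can run against it. As an illustration only — the endpoint URL, API version, and deployment name below are assumptions, so check the ollamazure output for the actual values — such a sample might look like this:

```typescript
import { AzureOpenAI } from "openai";

// Sketch of a chat completion against the local Azure OpenAI emulator.
// Assumptions: ollamazure listening on http://localhost:4041, any API key
// accepted, and a deployment named "phi3" mapped to the local Ollama model.
const client = new AzureOpenAI({
  endpoint: "http://localhost:4041",
  apiKey: "dummy-key",
  apiVersion: "2024-02-01",
  deployment: "phi3",
});

const result = await client.chat.completions.create({
  model: "phi3",
  messages: [{ role: "user", content: "Say hello in one short sentence" }],
});

console.log(result.choices[0].message.content);
```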
Once you're comfortable with this playground, you can explore more advanced topics and tutorials:
Generative AI for beginners [course]: a complete guide to learn about generative AI concepts and usage.
Phi-3 Cookbook [tutorials, samples]: hands-on examples for working with the Phi-3 model.
When you're ready to explore how you can deploy generative AI using Azure, check out these resources:
Quickstart: Get started using GPT-35-Turbo and GPT-4 with Azure OpenAI Service [tutorial]: a tutorial to get started with Azure OpenAI Service.
Build a serverless AI chat with RAG using LangChain.js [sample]: a next step tutorial to build an AI chatbot using Retrieval-Augmented Generation and LangChain.js.
Here are some additional resources to help you learn more about generative AI:
Awesome Generative AI [links]: a curated list of resources about generative AI.
Fundamentals of Responsible Generative AI [course]: a training module to learn about the responsible use of generative AI.
Azure AI Studio [tool]: a web portal to create, train, deploy and experiment with AI models.
Ollama Python Playground
Ollama C# Playground