Phi-3 Cookbook: Hands-On Examples with Microsoft's Phi-3 Models
Phi is a family of open AI models developed by Microsoft. Phi models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and the next size up across a variety of language, reasoning, coding, and math benchmarks. The Phi-3 family includes mini, small, medium, and vision versions, trained with different parameter counts to serve various application scenarios. For more detailed information about Microsoft's Phi family, please visit the Welcome to the Phi Family page.
Follow these steps:
- Fork the Repository: Click on the "Fork" button at the top-right corner of this page.
- Clone the Repository:
```bash
git clone https://github.com/microsoft/Phi-3CookBook.git
```
Table of Contents
- Introduction
  - Setting up your environment(✅)
  - Welcome to the Phi Family(✅)
  - Understanding Key Technologies(✅)
  - AI Safety for Phi Models(✅)
  - Phi-3 Hardware Support(✅)
  - Phi-3 Models & Availability across platforms(✅)
  - Using Guidance-ai and Phi(✅)
  - GitHub Marketplace Models
  - Azure AI Model Catalog
- Quick Start
  - Using Phi-3 in GitHub Model Catalog(✅)
  - Using Phi-3 in Hugging Face(✅)
  - Using Phi-3 with OpenAI SDK(✅)
  - Using Phi-3 with HTTP Requests(✅)
  - Using Phi-3 in Azure AI Studio(✅)
  - Using Phi-3 Model Inference with Azure MaaS or MaaP(✅)
  - Using Phi-3 with the Azure Inference API with GitHub and Azure AI
  - Deploying Phi-3 models as serverless APIs in Azure AI Studio(✅)
  - Using Phi-3 in Ollama(✅)
  - Using Phi-3 in LM Studio(✅)
  - Using Phi-3 in AI Toolkit VSCode(✅)
  - Using Phi-3 and LiteLLM(✅)
- Inference Phi-3
  - Inference Phi-3 in iOS(✅)
  - Inference Phi-3.5 in Android(✅)
  - Inference Phi-3 in Jetson(✅)
  - Inference Phi-3 in AI PC(✅)
  - Inference Phi-3 with Apple MLX Framework(✅)
  - Inference Phi-3 in Local Server(✅)
  - Inference Phi-3 in Remote Server using AI Toolkit(✅)
  - Inference Phi-3 with Rust(✅)
  - Inference Phi-3-Vision Locally(✅)
  - Inference Phi-3 with Kaito AKS, Azure Containers (official support)(✅)
  - Inference Your Fine-tuned ONNX Runtime Model(✅)
- Fine-tuning Phi-3
  - Downloading & Creating Sample Data Set(✅)
  - Fine-tuning Scenarios(✅)
  - Fine-tuning vs RAG(✅)
  - Fine-tuning: Let Phi-3 become an industry expert(✅)
  - Fine-tuning Phi-3 with AI Toolkit for VS Code(✅)
  - Fine-tuning Phi-3 with Azure Machine Learning Service(✅)
  - Fine-tuning Phi-3 with LoRA(✅)
  - Fine-tuning Phi-3 with QLoRA(✅)
  - Fine-tuning Phi-3 with Azure AI Studio(✅)
  - Fine-tuning Phi-3 with Azure ML CLI/SDK(✅)
  - Fine-tuning with Microsoft Olive(✅)
  - Fine-tuning Phi-3-vision with Weights and Biases(✅)
  - Fine-tuning Phi-3 with Apple MLX Framework(✅)
  - Fine-tuning Phi-3-vision (official support)(✅)
  - Fine-Tuning Phi-3 with Kaito AKS, Azure Containers (official support)(✅)
  - Fine-Tuning Phi-3 and 3.5 Vision(✅)
- Evaluating Phi-3
  - Introduction to Responsible AI(✅)
  - Introduction to Promptflow(✅)
  - Introduction to Azure AI Studio for evaluation(✅)
- E2E Samples for Phi-3-mini
  - Introduction to End to End Samples(✅)
  - Prepare your industry data(✅)
  - Use Microsoft Olive to architect your projects(✅)
  - Local Chatbot on Android with Phi-3, ONNX Runtime Mobile and ONNX Runtime Generate API(✅)
  - Hugging Face Space WebGPU and Phi-3-mini Demo - Phi-3-mini gives you a private (and powerful) chatbot experience you can try out(✅)
  - Local Chatbot in the browser using Phi-3, ONNX Runtime Web and WebGPU(✅)
  - OpenVINO Chat(✅)
  - Multi Model - Interactive Phi-3-mini and OpenAI Whisper(✅)
  - MLFlow - Building a wrapper and using Phi-3 with MLFlow(✅)
  - Model Optimization - How to optimize the Phi-3-mini model for ONNX Runtime Web with Olive(✅)
  - WinUI3 App with Phi-3 mini-4k-instruct-onnx(✅)
  - WinUI3 Multi Model AI Powered Notes App Sample(✅)
  - Fine-tune and Integrate custom Phi-3 models with Prompt flow(✅)
  - Fine-tune and Integrate custom Phi-3 models with Prompt flow in Azure AI Studio(✅)
  - Evaluate the Fine-tuned Phi-3 / Phi-3.5 Model in Azure AI Studio Focusing on Microsoft's Responsible AI Principles(✅)
  - Phi-3.5-mini-instruct language prediction sample (Chinese/English)(✅)
- E2E Samples for Phi-3-vision
  - Phi-3-vision - Image text to text(✅)
  - Phi-3-vision - ONNX(✅)
  - Phi-3-vision CLIP Embedding(✅)
  - DEMO: Phi-3 Recycling(✅)
  - Phi-3-vision - Visual language assistant with Phi-3-Vision and OpenVINO(✅)
  - Phi-3 Vision Nvidia NIM(✅)
  - Phi-3 Vision OpenVINO(✅)
  - Phi-3.5 Vision multi-frame or multi-image sample(✅)
- E2E Samples for Phi-3.5-MoE
  - Phi-3.5 Mixture of Experts Models (MoEs) Social Media Sample(✅)
  - Building a Retrieval-Augmented Generation (RAG) Pipeline with NVIDIA NIM Phi-3 MoE, Azure AI Search, and LlamaIndex(✅)
- Labs and workshop samples for Phi-3
  - C# .NET Labs(✅)
  - Build your own Visual Studio Code GitHub Copilot Chat with Microsoft Phi-3 Family(✅)
  - Local WebGPU Phi-3 Mini RAG Chatbot Samples with Local RAG File(✅)
  - Phi-3 ONNX Tutorial(✅)
  - Phi-3-vision ONNX Tutorial(✅)
  - Run the Phi-3 models with the ONNX Runtime generate() API(✅)
  - Phi-3 ONNX Multi Model LLM Chat UI - a chat demo(✅)
  - C# Hello Phi-3 ONNX example(✅)
  - C# API Phi-3 ONNX example to support Phi-3-Vision(✅)
  - Run C# Phi-3 samples in a Codespace(✅)
  - Using Phi-3 with Promptflow and Azure AI Search(✅)
  - Windows AI-PC APIs with Windows Copilot Library
- Learning Phi-3.5
  - What's new in the Phi-3.5 Family(✅)
  - Quantizing the Phi-3.5 Family(✅)
  - Quantizing Phi-3.5 using llama.cpp(✅)
  - Quantizing Phi-3.5 using Generative AI extensions for onnxruntime(✅)
  - Quantizing Phi-3.5 using Intel OpenVINO(✅)
  - Quantizing Phi-3.5 using Apple MLX Framework(✅)
  - Phi-3.5 Application Samples
    - Phi-3.5-Instruct WebGPU RAG Chatbot(✅)
    - Create your own Visual Studio Code Chat Copilot Agent with Phi-3.5 by GitHub Models(✅)
    - Using Windows GPU to create a Prompt flow solution with Phi-3.5-Instruct ONNX(✅)
    - Using Microsoft Phi-3.5 tflite to create an Android app(✅)
Using Phi-3 Models
Phi-3 on Azure AI Studio
You can learn how to use Microsoft Phi-3 and how to build E2E solutions on your different hardware devices. To experience Phi-3 for yourself, start by playing with the model and customizing it for your scenarios using Azure AI Studio and the Azure AI Model Catalog. You can learn more at Getting Started with Azure AI Studio.
Playground
Each model has a dedicated playground, the Azure AI Playground, where you can test the model.
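If you deploy a Phi-3 model as a serverless API from the Azure AI Model Catalog, you can call it from Python with the `azure-ai-inference` package. The following is a minimal sketch; the environment variable names `AZURE_INFERENCE_ENDPOINT` and `AZURE_INFERENCE_KEY` are placeholders for your own deployment's endpoint URL and key.

```python
# Minimal sketch: chat completion against a serverless Phi-3 endpoint deployed from Azure AI Studio.
# Assumes `pip install azure-ai-inference` and that the two environment variables below
# (placeholder names) point at your own deployment.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["AZURE_INFERENCE_KEY"]),
)

response = client.complete(
    messages=[
        SystemMessage(content="You are a helpful assistant."),
        UserMessage(content="Explain what a small language model is in one sentence."),
    ],
    max_tokens=256,
    temperature=0.7,
)

print(response.choices[0].message.content)
```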
Phi-3 on GitHub Models
You can learn how to use Microsoft Phi-3 and how to build E2E solutions on your different hardware devices. To experience Phi-3 for yourself, start by playing with the model and customizing it for your scenarios using the GitHub Model Catalog. You can learn more at Getting Started with GitHub Model Catalog.
Playground
Each model has a dedicated playground to test the model.
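GitHub Models serves Phi-3 behind an Azure-hosted inference endpoint, so the same `azure-ai-inference` client can be used with a GitHub personal access token. The endpoint and the model id `Phi-3.5-mini-instruct` below are assumptions; verify them against the model card in the GitHub Models catalog.

```python
# Minimal sketch: calling Phi-3.5 through GitHub Models with a GitHub personal access token.
# The endpoint and the model id "Phi-3.5-mini-instruct" are assumptions; check the GitHub
# Models playground for the exact values before use.
import os

from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://models.inference.ai.azure.com",
    credential=AzureKeyCredential(os.environ["GITHUB_TOKEN"]),
)

response = client.complete(
    model="Phi-3.5-mini-instruct",
    messages=[UserMessage(content="Give me three use cases for a small language model.")],
    max_tokens=200,
)

print(response.choices[0].message.content)
```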
Phi-3 on Hugging Face
You can also find the models on Hugging Face.
Playground
Hugging Chat playground
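For local experimentation, a short `transformers` sketch along these lines loads `microsoft/Phi-3-mini-4k-instruct` from the Hugging Face Hub. It assumes a recent transformers release (with chat-style pipeline inputs) plus `accelerate`, and will be slow on CPU.

```python
# Minimal sketch: running microsoft/Phi-3-mini-4k-instruct locally with Hugging Face transformers.
# Assumes a recent transformers release and accelerate; a GPU is strongly recommended.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "microsoft/Phi-3-mini-4k-instruct"

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # pick the best dtype for the available hardware
    device_map="auto",       # place layers on GPU(s) if present
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

messages = [
    {"role": "user", "content": "Summarize what the Phi-3 model family is in two sentences."},
]

output = pipe(messages, max_new_tokens=128, do_sample=False)
# With chat-style input, generated_text is the conversation including the new assistant turn.
print(output[0]["generated_text"][-1]["content"])
```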
Multi-Language Support
Note:
These translations were automatically generated using the open-source co-op-translator and may contain errors or inaccuracies. For critical information, it is recommended to refer to the original or consult a professional human translation. If you'd like to add or update a translation, please refer to the co-op-translator repository, where you can easily contribute using simple commands.
| Language | Code | Link to Translated README | Last Updated |
|----------|------|---------------------------|--------------|
| Chinese (Simplified) | zh | Chinese Translation | 2024-10-04 |
| Chinese (Traditional) | tw | Chinese Translation | 2024-10-04 |
| French | fr | French Translation | 2024-10-04 |
| Japanese | ja | Japanese Translation | 2024-10-04 |
| Korean | ko | Korean Translation | 2024-10-04 |
| Spanish | es | Spanish Translation | 2024-10-04 |
Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines.
Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.