Microsoft Phi-3 on Ollama and Azure AI’s MaaS
Redefine What’s Possible with SLMs
6 min read · May 27, 2024
Articles in the Ollama Series
- How to Execute LLMs Model Remotely at No Cost Using Google Colab T4 GPU | by Korkrid Kyle Akepanidtaworn | Medium
- Running Microsoft Phi-2 on Ollama and LlamaIndex Using an NVIDIA Tesla T4 GPU | by Korkrid Kyle Akepanidtaworn | Mar, 2024 | Medium
Introduction
Microsoft’s unveiling of the Phi-3 family at Microsoft Build 2024 underscores the rapid pace of AI innovation. The leap since my Phi-2 article just two months ago highlights how quickly this space moves. Phi-3 models redefine the capabilities of small language models, delivering strong performance at low cost for resource-constrained environments. The Phi-3 family will be available in the Azure AI Model Catalog and on Ollama, and includes the following models:
- Phi-3-mini — Contains 3.8 billion parameters.
- Phi-3-small — Contains 7 billion parameters.
- Phi-3-medium — Contains 14 billion parameters.
- Phi-3-vision — Supports general visual reasoning tasks and chart/graph/table reasoning.
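Once a Phi-3 model is pulled locally (e.g. with `ollama pull phi3`, where `phi3` is Ollama's tag for Phi-3-mini), it can be queried through the Ollama server's REST API. Below is a minimal sketch, assuming Ollama is installed and running on its default port 11434; the function name `ask_phi3` is my own, not part of any library.

```python
import json
import urllib.request

def ask_phi3(prompt, host="http://localhost:11434"):
    """Send a prompt to a locally running Ollama server and
    return the model's full response text."""
    payload = json.dumps({
        "model": "phi3",   # Ollama's tag for Phi-3-mini
        "prompt": prompt,
        "stream": False,   # return one JSON object instead of a token stream
    }).encode()
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Swapping `"phi3"` for another tag (e.g. a medium or vision variant, if published on Ollama) is all that changes when trying a different member of the family.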