Microsoft Cloud IT Pro Podcast

Episode 397 – Local LLMs: Why Every Microsoft 365 & Azure Pro Should Explore Them
Your support makes this show possible! Please consider becoming a premium member for access to live shows and more. Check out our membership options.
Show Notes
- Ollama
- Running LLMs Locally: A Beginner’s Guide to Using Ollama
- open-webui/open-webui
- LM Studio
- LM Studio Model Catalog
- Why do people like Ollama more than LM Studio?
- A Starter Guide for Playing with Your Own Local AI!
- host ALL your AI locally
- Run your own AI (but private)