Open Chat Playground (OCP) is a web UI that can connect to virtually any LLM on any platform.
Saraf AI is a fully local assistant built with Next.js and Docker. It connects seamlessly to LLMs via Docker Model Runner using Docker Compose.
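The Docker Compose wiring these projects describe can be sketched with Compose's top-level `models` element, which attaches a Docker Model Runner model to a service. The service name, image, and model tag below are illustrative assumptions, not taken from any repo above:

```yaml
# Minimal sketch: a service consuming a Docker Model Runner model
# via Compose's `models` element. Names and the model tag are assumptions.
services:
  chat-app:
    image: my-chat-app:latest   # hypothetical application image
    models:
      - llm                     # Compose injects the model's endpoint for the app
models:
  llm:
    model: ai/smollm2           # any model tag from Docker Hub's ai/ namespace
```

Running `docker compose up` then starts the model alongside the application, with no cloud API involved.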
A flexible, extensible AI agent backend built with NestJS—designed for running local, open-source LLMs (Llama, Gemma, Qwen, DeepSeek, etc.) via Docker Model Runner. Real-time streaming, Redis messaging, web search, and Postgres memory out of the box. No cloud APIs required!
This project demonstrates how to configure Spring AI to interact with Ollama and Docker Model Runner.
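Because Docker Model Runner exposes an OpenAI-compatible API, Spring AI can typically reach it through its OpenAI client properties. A minimal sketch, assuming Model Runner's default host-side TCP port (12434) and an illustrative model tag:

```properties
# application.properties — a sketch, not the configuration of the repo above.
# The base URL, port, and model tag are assumptions; adjust to your setup.
spring.ai.openai.base-url=http://localhost:12434/engines
spring.ai.openai.api-key=none
spring.ai.openai.chat.options.model=ai/llama3.2
```

Swapping the base URL for an Ollama or cloud endpoint is the only change needed to switch backends, which is the portability point such demos make.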
Demo of Docker Model Runner in both development and production environments.
This provides sample code that uses Microsoft.Extensions.AI for locally running LLMs through Ollama, Hugging Face, Docker Model Runner, and Foundry Local.
A streamlined chat application that leverages Docker Model Runner to serve Large Language Models (LLMs) through a modern Streamlit interface. This project demonstrates containerized LLM deployment with a user-friendly web interface.
Maven plugin for AI security scanning that uses local LLMs to detect secrets, API keys, and passwords in your code.