
plasmo

Here are 133 public repositories matching this topic...

Ollama Client – Chat with Local LLMs Inside Your Browser. A lightweight, privacy-first Chrome extension for chatting with locally hosted Ollama models such as LLaMA 2, Mistral, and CodeLLaMA. Supports streaming, stop/regenerate, and easy model switching, all without cloud APIs or data leaks.

  • Updated Dec 20, 2025
  • TypeScript
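The entry above describes streaming responses from a locally running Ollama server. As a rough illustration of how such an extension might talk to Ollama, here is a minimal TypeScript sketch (not the repository's actual code) that streams a chat reply from Ollama's /api/chat endpoint on the default localhost port and supports cancellation via an AbortSignal, which is how a "stop" control could be wired up. The model name "llama2" is an assumption.

```typescript
// Minimal sketch, assuming Ollama listens on http://localhost:11434
// and a model (here "llama2") has already been pulled locally.

type ChatMessage = { role: "user" | "assistant" | "system"; content: string };

async function streamChat(
  messages: ChatMessage[],
  model = "llama2",        // assumed model name for illustration
  signal?: AbortSignal     // lets a "stop" button abort generation
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, messages, stream: true }),
    signal,
  });
  if (!res.ok || !res.body) {
    throw new Error(`Ollama request failed: ${res.status}`);
  }

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let reply = "";
  let buffered = "";

  // Ollama streams newline-delimited JSON objects, each carrying a chunk
  // of the assistant message in chunk.message.content.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line);
      reply += chunk.message?.content ?? "";
    }
  }
  return reply;
}

// Usage: an abortable request, as a stop/regenerate control might issue.
const controller = new AbortController();
streamChat([{ role: "user", content: "Hello!" }], "llama2", controller.signal)
  .then((text) => console.log(text))
  .catch((err) => console.error(err));
```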
