Multi-tenant fine-tuning for local LLMs with Tinker-compatible API
Updated Feb 14, 2026 - Python
Delta: LLM conversation branching
Playground for learning by doing
A lightweight, self-contained Python project for running local LLM personalities with minimal dependencies. It uses TinyLlama-1.1B-Chat-v1.0 with llama-cpp-python for inference and Rich for a user-friendly console chat interface. This is an expansion of Tiny-Local-llm that lets you select one of three basic personalities.
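A personality-driven chat like this usually reduces to a small prompt-assembly layer in front of the model. A minimal sketch, assuming hypothetical personality names and prompts (not taken from the repo), of selecting one of three personalities and building the messages list that llama-cpp-python's chat API consumes:

```python
# Hypothetical sketch: personality selection for a llama-cpp-python chat loop.
# Personality names and system prompts are illustrative, not from the repo.
PERSONALITIES = {
    "pirate": "You are a salty pirate. Answer every question in pirate speak.",
    "scholar": "You are a meticulous scholar. Answer precisely, noting caveats.",
    "comedian": "You are a stand-up comedian. Keep answers short and funny.",
}

def build_messages(personality: str, user_text: str) -> list[dict]:
    """Build the messages list a chat-completion call consumes,
    falling back to the 'scholar' persona for unknown names."""
    system = PERSONALITIES.get(personality, PERSONALITIES["scholar"])
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user_text},
    ]

# With a real GGUF model on disk, wiring this up would look roughly like:
# from llama_cpp import Llama
# llm = Llama(model_path="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf")
# reply = llm.create_chat_completion(messages=build_messages("pirate", "Hello?"))
```

Keeping prompt assembly separate from inference makes the personality set easy to extend without touching the model-loading code.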
A terminal-based tool for building flexible AI workflows anywhere. Process documents, create pipelines, and manage context from the command line.
Chrome extension to summarize and chat with any web page using a local LLM (vLLM) — your data never leaves your machine.
On-device autonomous research and content writing using open-source LLMs and CrewAI.
(Experiment) Predefined set of instructions for local agents governing LLM usage and selection
An entirely offline, privacy-centric voice assistant that leverages lightweight local AI for speech-to-text (Vosk), large language model processing (GGUF via Llama.cpp), and text-to-speech (Kokoro), offering seamless, low-latency, and secure voice interactions directly from your machine.
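The pipeline this description names is a three-stage chain: speech-to-text, LLM processing, text-to-speech. A minimal sketch of that orchestration, assuming stand-in callables rather than the real Vosk, llama.cpp, and Kokoro APIs:

```python
# Hypothetical sketch of an offline voice-assistant pipeline:
# STT (e.g. Vosk) -> LLM (e.g. llama.cpp via GGUF) -> TTS (e.g. Kokoro).
# The callables are stand-ins; the real libraries' APIs are not shown here.
from dataclasses import dataclass
from typing import Callable

@dataclass
class VoicePipeline:
    transcribe: Callable[[bytes], str]   # audio in -> recognized text
    generate: Callable[[str], str]       # prompt text -> model reply
    synthesize: Callable[[str], bytes]   # reply text -> audio out

    def handle(self, audio_in: bytes) -> bytes:
        """Run one turn of the assistant: STT, then LLM, then TTS."""
        text = self.transcribe(audio_in)   # speech-to-text stage
        reply = self.generate(text)        # language-model stage
        return self.synthesize(reply)      # text-to-speech stage
```

Injecting the three stages as callables keeps the pipeline testable with fakes and lets each backend be swapped independently.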
Fully local autonomous AI research agent using Ollama with tool-based web search and reasoning.
A collection of LLM projects of all types, from basic to advanced.
An open-source agentic RAG solution for seamless local vector-store retrieval and real-time web search. It automatically decides whether to query your internal vector store or search the live web for the most relevant information.
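The core of such a system is the routing decision itself. A minimal sketch of one plausible heuristic (the freshness keywords and score threshold are assumptions, not the repo's actual logic): prefer the web for time-sensitive queries, the vector store when it matches strongly, and fall back to the web otherwise:

```python
# Hypothetical routing heuristic for an agentic RAG system. The keyword
# list and threshold are illustrative assumptions, not the repo's logic.
FRESHNESS_HINTS = ("today", "latest", "current", "news", "price")

def choose_route(query: str, vector_hit_score: float,
                 threshold: float = 0.75) -> str:
    """Return 'vector_store' or 'web_search' for a given query,
    given the best similarity score from the local index."""
    q = query.lower()
    if any(hint in q for hint in FRESHNESS_HINTS):
        return "web_search"      # time-sensitive: the index is likely stale
    if vector_hit_score >= threshold:
        return "vector_store"    # strong local match: answer from the index
    return "web_search"          # weak local match: search the live web
```

Real systems often replace this heuristic with an LLM-as-router prompt, but a cheap rule like this is a common first pass before invoking the model.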
Universal local AI agent for querying any MCP-enabled data source using Ollama - vaults, databases, emissions data, and more. 100% offline, 100% sovereign.
An Autonomous AI System for Generating Humorous & Viral Tweets using Open-Source LLMs