# local-llms

Here are 14 public repositories matching this topic...

A lightweight, self-contained Python project for running local LLM personalities with minimal dependencies. The system uses TinyLlama-1.1B-Chat-v1.0 with llama-cpp-python for inference and Rich for a user-friendly console chat interface. It is an expansion of Tiny-Local-llm that lets you select one of three basic personalities.

  • Updated Feb 5, 2026
  • Python
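The personality-selection flow described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the personality names, message format, and GGUF filename are assumptions.

```python
# Sketch of a personality layer for a local llama-cpp-python chat.
# Personality names and the system-prompt wording are illustrative
# assumptions, not taken from the repository.

PERSONALITIES = {
    "helpful": "You are a friendly, helpful assistant.",
    "pirate": "You answer every question in the voice of a pirate.",
    "terse": "You reply in one short sentence.",
}


def build_chat_messages(personality: str, user_text: str) -> list[dict]:
    """Return a chat-completion message list seeding the chosen personality."""
    if personality not in PERSONALITIES:
        raise ValueError(f"unknown personality: {personality}")
    return [
        {"role": "system", "content": PERSONALITIES[personality]},
        {"role": "user", "content": user_text},
    ]


# With llama-cpp-python installed and a GGUF model on disk, inference
# would look roughly like this (model path is a hypothetical example):
#
#   from llama_cpp import Llama
#   llm = Llama(model_path="tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf")
#   out = llm.create_chat_completion(
#       messages=build_chat_messages("pirate", "What is a local LLM?"))
#   print(out["choices"][0]["message"]["content"])
```

Keeping the personality table as plain data makes it easy to add a fourth personality without touching the inference code.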

An open-source agentic RAG solution for seamless local vector-store retrieval and real-time web search. It automatically decides whether to query your internal vector store or search the live web for the most relevant information.

  • Updated Jan 16, 2026
  • Jupyter Notebook
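The core routing step, deciding between the vector store and the live web, can be sketched with a simple recency heuristic. This is an assumption for illustration only; an agentic RAG system like the one above would typically delegate this decision to an LLM or embedding-similarity check rather than keywords.

```python
# Hedged sketch of a query router: recency-sensitive questions go to
# live web search, everything else to the local vector store.
# The cue list and function name are illustrative assumptions.
from typing import Literal

RECENCY_CUES = ("today", "latest", "current", "breaking", "news")


def route_query(query: str) -> Literal["web_search", "vector_store"]:
    """Pick a retrieval backend for the query."""
    q = query.lower()
    if any(cue in q for cue in RECENCY_CUES):
        return "web_search"
    return "vector_store"


# Example routing decisions:
#   route_query("What is the latest GPU pricing?")  -> "web_search"
#   route_query("Summarize our onboarding handbook") -> "vector_store"
```

Whatever mechanism makes the decision, keeping it behind a single function with a two-value return type makes the agent's control flow easy to test in isolation.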
