# Ollama Chainlet

A Python web application that provides a chainlet interface for interacting with local Ollama models.
## Features

- Web-based chat interface for Ollama models
- Model selection from available Ollama models
- Conversation history management
- Simple chainlet framework for extensibility
## Requirements

- Python 3.8+
- Ollama installed and running locally
- Internet connection for CDN-hosted web assets (Bootstrap, etc.)
## Installation

1. Clone this repository:

        git clone <repository-url>
        cd ollama-chainlet

2. Run the setup script to create a virtual environment and install dependencies:

        chmod +x setup.sh
        ./setup.sh

3. Ensure Ollama is running locally.
## Usage

1. Activate the virtual environment:

        source venv/bin/activate

2. Start the application:

        python app.py

3. Open your web browser and navigate to:

        http://localhost:5000

4. Select a model from the dropdown and start chatting!
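If the model dropdown comes up empty, first confirm that Ollama is reachable. Ollama's HTTP API listens on port 11434 by default and lists installed models at `GET /api/tags`. The standalone helper below is a quick connectivity check — its name is illustrative and not part of this app's code:

```python
import json
from urllib import request, error

def list_ollama_models(base_url="http://localhost:11434"):
    """Return names of locally installed Ollama models, or None if the
    server is unreachable (e.g. Ollama is not running)."""
    try:
        with request.urlopen(f"{base_url}/api/tags", timeout=3) as resp:
            data = json.load(resp)
        # The /api/tags response has a top-level "models" list.
        return [m["name"] for m in data.get("models", [])]
    except (error.URLError, OSError, ValueError):
        return None
```

If this returns `None`, start Ollama (`ollama serve`) before launching the app; if it returns an empty list, pull a model first (e.g. `ollama pull llama3`).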
## Project Structure

    ollama-chainlet/
    ├── app.py              # Main Flask application
    ├── static/             # Static assets
    │   ├── css/
    │   │   └── style.css   # Styling
    │   └── js/
    │       └── main.js     # Frontend logic
    ├── templates/
    │   └── index.html      # Main interface
    ├── chainlet/
    │   ├── __init__.py
    │   ├── core.py         # Core chainlet functionality
    │   └── ollama.py       # Ollama integration
    ├── requirements.txt    # Dependencies
    ├── .gitignore          # Git ignore file
    └── README.md           # Documentation
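The actual chainlet API lives in `chainlet/core.py` and `chainlet/ollama.py` and is not reproduced here. As a rough sketch of what a minimal composable-chainlet abstraction might look like (all names below are hypothetical, not this app's real classes):

```python
class Chainlet:
    """Minimal composable processing step: each chainlet transforms an
    input value, and `|` chains two chainlets into a pipeline."""

    def __init__(self, fn):
        self.fn = fn

    def run(self, value):
        return self.fn(value)

    def __or__(self, other):
        # Compose: (a | b).run(x) == b.run(a.run(x))
        return Chainlet(lambda value: other.run(self.run(value)))


# Example pipeline: normalize a prompt, then wrap it as a chat message.
strip_prompt = Chainlet(lambda text: text.strip())
to_messages = Chainlet(lambda text: [{"role": "user", "content": text}])

pipeline = strip_prompt | to_messages
```

In this sketch, extending the app means writing a new step as a plain function, wrapping it in a `Chainlet`, and splicing it into the pipeline with `|`.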
## License

MIT