LLMClient is a simple Python library for interacting with Large Language Models (LLMs). It is tailored to my own needs, but if you find it useful, you can use it too.
For now, it supports two types of LLMs:
- Ollama - https://ollama.com/
- WebChat - with a defined endpoint (it is tested with Ollama's webchat interface at http://localhost:11434/api/chat)
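Since the WebChat variant talks to Ollama's HTTP chat endpoint, here is a hedged sketch of the request body such a client would send. The payload fields (model, messages, stream, options) follow Ollama's documented /api/chat API; the actual send is commented out so the sketch runs without a server, and the exact payload LLMClient builds internally may differ.

```python
import json

# Request body for Ollama's /api/chat endpoint (per Ollama's API docs).
payload = {
    "model": "gemma3:12b",
    "messages": [
        {"role": "user", "content": "What is the meaning of life and everything?"}
    ],
    "stream": False,  # ask for one complete response instead of a token stream
    "options": {"temperature": 1},
}

body = json.dumps(payload)
print(body)

# To actually send it (requires a running Ollama server):
# import requests
# r = requests.post("http://localhost:11434/api/chat", data=body)
# print(r.json()["message"]["content"])
```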
Both support setting a desired model, a temperature and, optionally, a JSON output format. You can also set a specific system prompt (see llm_client_test_json.py for an example).
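When the JSON output format is enabled, the response content is a JSON string that can be parsed directly. A minimal sketch of that parsing step (the sample response string below is made up for illustration; a real response depends on the model and system prompt):

```python
import json

# A made-up example of what a JSON-format LLM response might look like.
sample_response = '{"answer": 42, "source": "Deep Thought"}'

# Turn the JSON string returned by the model into a Python dict.
data = json.loads(sample_response)
print(data["answer"])
```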
See the examples in the llm_client_test_*.py files.
The simplest example is:
from llm_client.LLMClientOllama import LLMClientOllama
llm_client_ollama = LLMClientOllama()
llm_client_ollama.set_model("gemma3:12b")
llm_client_ollama.set_temperature(1)
response, llm_role = llm_client_ollama.call_llm("What is the meaning of life and everything?")
print(f"Response from {llm_role}: {response}")

This project uses the following dependencies:
- requests - for making HTTP requests
- json - for working with JSON data
- ollama - for working with the Ollama API, https://github.com/ollama/ollama-python
This project is licensed under the MIT License - see the LICENSE file for details.