Search before asking
- I searched in the issues and found nothing similar.
Description
Motivation
Flink Agents currently supports Ollama, OpenAI, Anthropic (direct), and Azure AI as chat model providers, and Ollama/OpenAI for embeddings. There is no integration for Amazon Bedrock, which is the primary LLM gateway for AWS customers.
Proposed Changes
Add two new integration modules:
- Chat model (`integrations/chat-models/bedrock/`): uses the Bedrock Converse API with native tool calling support. SigV4 authentication via `DefaultCredentialsProvider`. Supports all Bedrock models accessible through the Converse API (Claude, Llama, Mistral, Titan, etc.). A rough sketch of the Converse call is shown after this list.
- Embedding model (`integrations/embedding-models/bedrock/`): uses Titan Text Embeddings V2 via `InvokeModel`. Batch `embed(List<String>)` parallelizes requests over a configurable thread pool (`embed_concurrency` parameter, default 4). A sketch of the InvokeModel fan-out also follows the list.
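A minimal sketch of the Converse call the chat model module would wrap, using the AWS SDK for Java v2 (`BedrockRuntimeClient`). The region and model ID below are illustrative placeholders, not part of this proposal, and tool definitions are only hinted at in a comment:

```java
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.model.ContentBlock;
import software.amazon.awssdk.services.bedrockruntime.model.ConversationRole;
import software.amazon.awssdk.services.bedrockruntime.model.ConverseRequest;
import software.amazon.awssdk.services.bedrockruntime.model.ConverseResponse;
import software.amazon.awssdk.services.bedrockruntime.model.Message;

public class BedrockConverseSketch {
    public static void main(String[] args) {
        // SigV4 credentials resolved from the default chain (env vars, profile, IAM role, ...).
        BedrockRuntimeClient client = BedrockRuntimeClient.builder()
                .region(Region.US_EAST_1) // illustrative region
                .credentialsProvider(DefaultCredentialsProvider.create())
                .build();

        Message userMessage = Message.builder()
                .role(ConversationRole.USER)
                .content(ContentBlock.fromText("What is Apache Flink?"))
                .build();

        ConverseRequest request = ConverseRequest.builder()
                .modelId("anthropic.claude-3-haiku-20240307-v1:0") // any Converse-capable model ID
                .messages(userMessage)
                // Tool calling would attach a ToolConfiguration via .toolConfig(...) here.
                .build();

        ConverseResponse response = client.converse(request);
        System.out.println(response.output().message().content().get(0).text());
    }
}
```

Because Converse normalizes the request/response shape (including `stopReason` and tool-use content blocks) across providers, a single integration can cover every Converse-capable model without per-provider payload formats.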
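And a sketch of the embedding path, assuming Titan Text Embeddings V2 takes one `inputText` per `InvokeModel` call and that the batch fan-out is a plain fixed-size thread pool sized by `embed_concurrency`. JSON handling is kept to raw strings here to stay dependency-free; the real implementation would parse the `embedding` float array from the response body:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.bedrockruntime.BedrockRuntimeClient;
import software.amazon.awssdk.services.bedrockruntime.model.InvokeModelRequest;
import software.amazon.awssdk.services.bedrockruntime.model.InvokeModelResponse;

public class BedrockTitanEmbeddingSketch {

    private static final String MODEL_ID = "amazon.titan-embed-text-v2:0";

    private final BedrockRuntimeClient client = BedrockRuntimeClient.builder()
            .region(Region.US_EAST_1) // illustrative region
            .credentialsProvider(DefaultCredentialsProvider.create())
            .build();

    // Fan-out pool mirroring the proposed embed_concurrency parameter (default 4).
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    /** Embeds a single text; returns the raw JSON response for brevity. */
    public String embedOne(String text) {
        String body = "{\"inputText\": \"" + text.replace("\"", "\\\"") + "\"}";
        InvokeModelResponse response = client.invokeModel(InvokeModelRequest.builder()
                .modelId(MODEL_ID)
                .contentType("application/json")
                .accept("application/json")
                .body(SdkBytes.fromUtf8String(body))
                .build());
        // A real implementation would parse the "embedding" array out of this JSON.
        return response.body().asUtf8String();
    }

    /** Batch embed: one InvokeModel call per input, parallelized across the pool. */
    public List<String> embed(List<String> texts) throws Exception {
        List<Future<String>> futures = new ArrayList<>();
        for (String text : texts) {
            futures.add(pool.submit(() -> embedOne(text)));
        }
        List<String> results = new ArrayList<>();
        for (Future<String> future : futures) {
            results.add(future.get()); // collects in input order
        }
        return results;
    }
}
```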
I have a working implementation with unit tests ready to submit as a PR.
Are you willing to submit a PR?
- I'm willing to submit a PR!