
[Feature] Add Amazon Bedrock integration for chat models and embedding models #523

@avichaym


Search before asking

  • I searched in the issues and found nothing similar.

Description

Motivation

Flink Agents currently supports Ollama, OpenAI, Anthropic (direct), and Azure AI as chat model providers, and Ollama/OpenAI for embeddings. There is no integration for Amazon Bedrock, which is the primary LLM gateway for AWS customers.

Proposed Changes

Add two new integration modules:

  • Chat model (integrations/chat-models/bedrock/) — Uses the Bedrock Converse API with native tool-calling support. SigV4 auth via DefaultCredentialsProvider. Supports all Bedrock models accessible through the Converse API (Claude, Llama, Mistral, Titan, etc.).

  • Embedding model (integrations/embedding-models/bedrock/) — Uses Titan Text Embeddings V2 via InvokeModel. The batch embed(List<String>) call parallelizes requests across a configurable thread pool (embed_concurrency parameter, default 4).
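For reference, a Converse API request with tool calling takes roughly the shape below (the model ID goes in the request path, POST /model/{modelId}/converse; the tool spec here is purely illustrative):

```json
{
  "messages": [
    { "role": "user", "content": [ { "text": "What is the weather in Berlin?" } ] }
  ],
  "inferenceConfig": { "maxTokens": 512, "temperature": 0.5 },
  "toolConfig": {
    "tools": [
      {
        "toolSpec": {
          "name": "get_weather",
          "description": "Look up current weather for a city",
          "inputSchema": {
            "json": {
              "type": "object",
              "properties": { "city": { "type": "string" } },
              "required": ["city"]
            }
          }
        }
      }
    ]
  }
}
```

The same request shape works across all Converse-capable models, which is why a single chat-model integration can cover Claude, Llama, Mistral, and Titan without per-provider request builders.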
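The batch-embedding concurrency could be sketched as follows. This is a minimal illustration of the proposed embed_concurrency behavior, not the actual implementation: embedOne is a deterministic stand-in for the real InvokeModel call to Titan Text Embeddings V2, and the class/method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class BedrockEmbeddingSketch {

    // Stand-in for a single InvokeModel call to Titan Text Embeddings V2.
    // A real implementation would send {"inputText": text} to the model
    // and parse the "embedding" array from the JSON response.
    static float[] embedOne(String text) {
        return new float[] { text.length(), 0f, 0f, 0f };
    }

    // Batch embed: submit one task per input to a bounded pool sized by
    // the (proposed) embed_concurrency parameter, then collect results
    // in input order.
    static List<float[]> embedAll(List<String> texts, int concurrency) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(concurrency);
        try {
            List<Future<float[]>> futures = new ArrayList<>();
            for (String t : texts) {
                futures.add(pool.submit(() -> embedOne(t)));
            }
            List<float[]> out = new ArrayList<>(texts.size());
            for (Future<float[]> f : futures) {
                out.add(f.get()); // blocking get preserves input order
            }
            return out;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<float[]> vectors = embedAll(List.of("flink", "agents", "bedrock"), 4);
        System.out.println("embedded " + vectors.size() + " inputs");
    }
}
```

Collecting futures in submission order keeps the output vector list aligned with the input strings regardless of which thread finishes first.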

I have a working implementation with unit tests ready to submit as a PR.

Are you willing to submit a PR?

  • I'm willing to submit a PR!

Metadata


Labels

feature — [Issue Type] New features or improvements to existing features.
priority/major — Default priority of the PR or issue.
