
feat: batch enqueue, delivery batching, and smart batching modes#3

Merged
vieiralucas merged 1 commit into main from feat/26.6-batch-smart-batching
Mar 24, 2026

Conversation


@vieiralucas vieiralucas commented Mar 24, 2026

Summary

  • Add batchEnqueue(List<EnqueueMessage>) method for explicit batch enqueue via BatchEnqueue RPC
  • Add transparent delivery batching in consume stream -- unpacks ConsumeResponse.messages (repeated field) with singular message fallback for backward compatibility
  • Add BatchMode with three modes: AUTO (opportunistic, default), LINGER (timer-based), DISABLED
  • enqueue() now routes through background batcher by default (zero config, zero latency penalty at low load)
  • Single-item optimization: 1 message uses regular Enqueue RPC (preserves exact error types like QueueNotFoundException), 2+ messages use BatchEnqueue
  • close() drains pending batched messages before disconnecting
  • Update proto to include BatchEnqueue RPC and ConsumeResponse.messages repeated field
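The AUTO mode and the single-item routing rule above can be sketched as a minimal, self-contained batcher. This is an illustration of the described behavior, not the client's actual internals; all class and method names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of opportunistic (AUTO) batching: the background sender drains
// whatever has accumulated instead of waiting on a timer, so a lone
// message at low load incurs no added latency, while bursts coalesce.
class OpportunisticBatcher {
    private final BlockingQueue<String> pending = new ArrayBlockingQueue<>(1024);
    private final int maxBatch;

    OpportunisticBatcher(int maxBatch) {
        this.maxBatch = maxBatch;
    }

    // Called by enqueue(): hand the message to the background batcher.
    void submit(String message) throws InterruptedException {
        pending.put(message);
    }

    // Called by the sender loop: take everything currently queued,
    // up to maxBatch, without blocking for more.
    List<String> nextBatch() {
        List<String> batch = new ArrayList<>();
        pending.drainTo(batch, maxBatch);
        return batch;
    }

    // Mirrors the single-item optimization: one message goes over the
    // regular Enqueue RPC (preserving exact error types such as
    // QueueNotFoundException); two or more use BatchEnqueue.
    static String rpcFor(int batchSize) {
        return batchSize <= 1 ? "Enqueue" : "BatchEnqueue";
    }
}
```

Under this sketch, a burst of submissions between sender wakeups becomes one BatchEnqueue call, while a quiet period yields single-message Enqueue calls with unchanged error semantics.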

Test plan

  • BatchModeTest -- 7 unit tests for mode configuration and validation
  • BatchEnqueueResultTest -- 4 unit tests for success/error result types
  • BuilderTest -- 4 new tests for batch mode builder options (11 total)
  • BatchClientTest -- 8 integration tests: explicit batch, mixed queue results, auto batching, linger batching, disabled batching, multi-message consume, error propagation through batcher, default mode verification
  • All 34 tests pass (TlsAuthClientTest failure is pre-existing)

Generated with Claude Code


Summary by cubic

Adds explicit batch enqueue, client-side smart batching for enqueue(), and delivery batching for consume(). Default mode is AUTO, which coalesces messages under load without adding latency at low load.

  • New Features

    • Added batchEnqueue(List<EnqueueMessage>) with per-message BatchEnqueueResult.
    • Added delivery batching: consume() now unpacks ConsumeResponse.messages with singular fallback.
    • Introduced BatchMode: AUTO (default), LINGER, DISABLED.
    • enqueue() routes through a background batcher by default; 1 message uses Enqueue, 2+ use BatchEnqueue.
    • close() now drains pending batched messages.
    • Updated proto with BatchEnqueue RPC and batched ConsumeResponse.messages.
  • Migration

    • Requires a server that implements BatchEnqueue to benefit from batching; use withBatchMode(BatchMode.disabled()) to stay compatible with older servers.
    • No changes needed for consumers; the client transparently handles batched and single-message responses.
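The consumer-side compatibility rule can be shown as a small sketch: prefer the repeated `messages` field when a new server populates it, and fall back to the singular `message` field for older servers. The types here are illustrative stand-ins for the generated proto classes, not the client's real code.

```java
import java.util.Collections;
import java.util.List;

// Sketch of the backward-compatible unpack step in the consume stream.
class ConsumeUnpack {
    // repeatedMessages stands in for ConsumeResponse.messages (repeated),
    // singularMessage for the legacy singular ConsumeResponse.message.
    static List<String> unpack(List<String> repeatedMessages, String singularMessage) {
        if (repeatedMessages != null && !repeatedMessages.isEmpty()) {
            return repeatedMessages; // new server: batched delivery
        }
        if (singularMessage != null) {
            return Collections.singletonList(singularMessage); // older server
        }
        return Collections.emptyList(); // nothing delivered in this response
    }
}
```

Because the fallback is handled inside the client, application code iterating over consumed messages is unaffected by which server version it talks to.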

Written for commit d8f1f73. Summary will update on new commits.

add explicit batchEnqueue() method, transparent delivery batching in
consume stream (unpacks repeated messages field with singular fallback),
and three BatchMode options: AUTO (opportunistic, default), LINGER
(timer-based), and DISABLED. enqueue() now routes through the batcher
by default. single-item optimization uses regular Enqueue RPC for exact
error semantics. close() drains pending messages before disconnecting.
@vieiralucas vieiralucas merged commit 612c858 into main Mar 24, 2026
2 checks passed

@cubic-dev-ai cubic-dev-ai bot left a comment


No issues found across 10 files

