Filtering
Example:
root@agent:/usr/src/app# agent pipeline create -a
Choose source config (kafka) [kafka]:
...
Filter condition []: "status" == "success" && !("info/type" matches "[a-z]+")
Created pipeline Kafka
This condition sends only records where the property "status" equals "success" and the nested property "info/type" contains no lowercase characters in its value.
Filtering conditions can consist of multiple expressions. A single expression has one property name, one comparison operator, and a value to compare against, written in this order:
"property_name" <comparison operator> "value"
The property name and value must be enclosed in double quotes. If you have nested JSON, you can specify the full path to the property with the / symbol, like this: level1/level2/property
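As an illustration of the path syntax, a nested property lookup could be sketched in Python like this (an assumed helper for explanation only, not the agent's implementation):

```python
# Sketch: resolve a nested property path written with "/" separators,
# e.g. "info/type" -> record["info"]["type"].
def get_property(record: dict, path: str):
    value = record
    for key in path.split("/"):
        if not isinstance(value, dict) or key not in value:
            return None  # property missing somewhere along the path
        value = value[key]
    return value

record = {"status": "success", "info": {"type": "ERR", "level": "top"}}
print(get_property(record, "info/type"))  # ERR
print(get_property(record, "info/missing"))  # None
```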
Supported comparison operators:
- == - equals
- != - not equals
- contains - property value contains the given substring
- startsWith - property value starts with the given string
- endsWith - property value ends with the given string
- matches - property value matches the given regular expression
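The semantics of these operators can be sketched in Python as follows (an illustrative mapping, not the agent's actual evaluator; the `OPERATORS` name is assumed):

```python
import re

# Sketch: each comparison operator as a predicate over
# (property value, comparison value) pairs.
OPERATORS = {
    "==":         lambda prop, val: prop == val,
    "!=":         lambda prop, val: prop != val,
    "contains":   lambda prop, val: val in prop,
    "startsWith": lambda prop, val: prop.startswith(val),
    "endsWith":   lambda prop, val: prop.endswith(val),
    "matches":    lambda prop, val: re.search(val, prop) is not None,
}

print(OPERATORS["matches"]("ERR-42", "[0-9]+"))    # True
print(OPERATORS["startsWith"]("success", "succ"))  # True
print(OPERATORS["!="]("success", "failure"))       # True
```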
To combine multiple expressions, use the conjunction operators && (and) and || (or). For example:
"status" == "success" && ("info/type" matches "[a-z]+" || "info/level" contains "top")
To negate an expression, wrap it in parentheses and put ! at the beginning:
"status" == "success" && !("info/type" matches "[a-z]+")
Here's an example of configuring filtering via a file:
[{
"pipeline_id": "my_pipeline",
"source": "my_source",
...
"filter": {
"condition": "'status' == 'success' && !('info/type' matches '[a-z]+')"
}
}]
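Note that inside the JSON file the condition uses single quotes, since the condition string itself is wrapped in JSON double quotes. A quick way to check that such a file is well-formed (the inline `config_text` stands in for a real file):

```python
import json

# Sketch: parse a pipeline config like the one above and read back
# the filter condition.
config_text = '''[{
    "pipeline_id": "my_pipeline",
    "source": "my_source",
    "filter": {
        "condition": "'status' == 'success' && !('info/type' matches '[a-z]+')"
    }
}]'''

pipelines = json.loads(config_text)
print(pipelines[0]["filter"]["condition"])
```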