1) Address NB issue #1283 (overlapping label name "name" in both parent and child labels when adding Pulsar adapter metrics).

2) Add named scenarios for NB Pulsar and NB Kafka adapters.
This commit is contained in:
yabinmeng
2023-06-28 16:32:55 -05:00
parent a264291c04
commit f7d7fd56c7
28 changed files with 125 additions and 46 deletions


@@ -1,22 +1,33 @@
-# Overview
+---
+weight: 0
+title: S4J
+---
+- [1. Overview](#1-overview)
+  - [1.1. Example NB Yaml](#11-example-nb-yaml)
+- [2. Usage](#2-usage)
+  - [2.1. NB Kafka adapter specific CLI parameters](#21-nb-kafka-adapter-specific-cli-parameters)
+---
-This NB Kafka adapter allows publishing messages to or consuming messages from
+# 1. Overview
+The NB Kafka adapter allows publishing messages to or consuming messages from
 * a Kafka cluster, or
 * a Pulsar cluster with [S4K](https://github.com/datastax/starlight-for-kafka) or [KoP](https://github.com/streamnative/kop) Kafka Protocol handler for Pulsar.
 At high level, this adapter supports the following Kafka functionalities
 * Publishing messages to one Kafka topic with sync. or async. message-send acknowledgements (from brokers)
-* Subscribing messages from one or multiple Kafka topics with sync. or async. message-recv acknowlegements (to brokers) (aka, message commits)
+* Subscribing messages from one or multiple Kafka topics with sync. or async. message-recv acknowledgements (to brokers) (aka, message commits)
   * auto message commit
   * manual message commit with a configurable number of message commits in one batch
 * Kafka Transaction support
-## Example NB Yaml
-* [kafka_producer.yaml](./kafka_producer.yaml)
-* [kafka_consumer.yaml](./kafka_consumer.yaml)
+## 1.1. Example NB Yaml
+* [kafka_producer.yaml](scenarios/kafka_producer.yaml)
+* [kafka_consumer.yaml](scenarios/kafka_consumer.yaml)
-# Usage
+# 2. Usage
 ```bash
 ## Kafka Producer
@@ -26,7 +37,7 @@ $ <nb_cmd> run driver=kafka -vv cycles=100 threads=2 num_clnt=2 yaml=kafka_produ
 $ <nb_cmd> run driver=kafka -vv cycles=100 threads=4 num_clnt=2 num_cons_grp=2 yaml=kafka_producer.yaml config=kafka_config.properties bootstrap_server=PLAINTEXT://localhost:9092
 ```
-## NB Kafka adapter specific CLI parameters
+## 2.1. NB Kafka adapter specific CLI parameters
 * `num_clnt`: the number of Kafka clients to publish messages to or to receive messages from
   * For producer workload, this is the number of the producer threads to publish messages to the same topic
@@ -40,8 +51,6 @@ $ <nb_cmd> run driver=kafka -vv cycles=100 threads=4 num_clnt=2 num_cons_grp=2 y
 * `num_cons_grp`: the number of consumer groups
   * Only relevant for consumer workload
 For the Kafka NB adapter, Document level parameters can only be statically bound; and currently, the following Document level configuration parameters are supported:
 * `async_api` (boolean):
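For context on where such a Document level parameter lives: it is set once in the workload yaml's `params` block and applies to every op in the file, rather than being bound per cycle. A minimal sketch (the value shown is illustrative, not a recommended default):

```yaml
# Document level (statically bound) parameter for the NB Kafka adapter;
# applies to all ops in this workload file
params:
  async_api: "true"
```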


@@ -0,0 +1,34 @@
+scenarios:
+  msg_pub: run driver=kafka cycles=1000 threads=2 num_clnt=2 config=../conf/kafka_config.properties bootstrap_server=localhost:9092
+  msg_sub: run driver=kafka cycles=1000 threads=4 num_clnt=2 num_cons_grp=2 config=../conf/s4j_config.properties bootstrap_server=localhost:9092
+
+bindings:
+  mykey: Mod(5); ToString(); Prefix("key-")
+  mytext_val: AlphaNumericString(30)
+  random_text_val1: AlphaNumericString(10)
+  random_text_val2: AlphaNumericString(20)
+
+# document level parameters that apply to all Pulsar client types:
+params:
+  async_api: "true"
+
+blocks:
+  msg_pub:
+    ops:
+      op1:
+        MessageProduce: "persistent://nbtest/default/s4ktest"
+        txn_batch_num: 1
+        msg_header: |
+          {
+            "header-1": "{random_text_val1}",
+            "header-2": "{random_text_val2}"
+          }
+        msg_key: "{mykey}"
+        msg_body: "{mytext_val}"
+  msg_sub:
+    ops:
+      op1:
+        MessageConsume: "persistent://nbtest/default/s4ktest"
+        msg_poll_interval: "10"
+        manual_commit_batch_num: "0"
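The `mykey` binding in the file above chains `Mod(5); ToString(); Prefix("key-")`: the cycle number is reduced modulo 5, stringified, and prefixed. A minimal shell sketch of that semantics (an illustration only, not NB's actual binding machinery):

```shell
# sketch of the mykey binding chain: Mod(5) -> ToString -> Prefix("key-")
cycle=7                        # an example cycle number
mykey="key-$(( cycle % 5 ))"   # 7 mod 5 = 2, so mykey is "key-2"
echo "${mykey}"
```

Because `Mod(5)` yields only five distinct values, messages repeatedly reuse a small key space, which is convenient for exercising key-based partition routing. The `scenarios` block at the top is what this commit adds: it lets the workloads be launched by name instead of repeating every CLI parameter.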


@@ -31,6 +31,6 @@ java -jar nb5/target/nb5.jar \
 threads=1 \
 num_clnt=1 \
 num_cons_grp=1 \
-yaml="${SCRIPT_DIR}/kafka_consumer.yaml" \
-config="${SCRIPT_DIR}/kafka_config.properties" \
+yaml="${SCRIPT_DIR}/scenarios/kafka_consumer.yaml" \
+config="${SCRIPT_DIR}/conf/kafka_config.properties" \
 bootstrap_server=PLAINTEXT://localhost:9092


@@ -31,8 +31,8 @@ while [[ 1 -eq 1 ]]; do
 cycles="${CYCLES}" \
 threads=1 \
 num_clnt=1 \
-yaml="${SCRIPT_DIR}/kafka_producer.yaml" \
-config="${SCRIPT_DIR}/kafka_config.properties" \
+yaml="${SCRIPT_DIR}/scenarios/kafka_producer.yaml" \
+config="${SCRIPT_DIR}/conf/kafka_config.properties" \
 bootstrap_server=PLAINTEXT://localhost:9092
 sleep 10
 done
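The enclosing `while [[ 1 -eq 1 ]]` loop in this script republishes forever with a 10-second pause between rounds. For CI-style runs, a bounded variant can be handier; a sketch, where `MAX_ROUNDS` is a made-up knob and the actual `nb5` producer invocation from the script is elided:

```shell
# bounded variant of the publish loop: stop after a fixed number of rounds
MAX_ROUNDS=3
round=0
while [ "${round}" -lt "${MAX_ROUNDS}" ]; do
  round=$((round + 1))
  echo "starting publish round ${round}"
  # ... the java -jar nb5 producer command from the script goes here ...
done
```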