# Metrics Labeling
All metrics flowing from NoSQLBench should come with a useful set of labels which
are presented in a self-consistent manner. These labels serve to identify a given metric
not only within a given study or deployment, but across time, with macro-level identifiers.
Identifiers which are nominal for the study or deployment should also be included in the
annotations, which can be queried later to find the original set of related metrics.
# Naming Context
In order to simplify the naming methods, all metric instruments are created through
a helper type called ActivityMetrics. (This name might change.)
It contains factory methods for all the metric types you may use within the NoSQLBench runtime.
Each factory method takes an NBLabeledElement as its first parameter, which provides the
naming context for the _thing to which the metric pertains_, *separate* from the actual
metric family name. The metric family name is provided as its own parameter. This means
that the factory methods have, injected at the construction site, all the identifying
labels needed by the metric for reporting to the metrics collector.
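As a minimal sketch of this factory shape (the types and method names below are simplified
stand-ins for NBLabeledElement, NicerTimer, and the ActivityMetrics factories, not the
actual NoSQLBench API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative stand-in for NBLabeledElement: anything that can report its labels.
interface NBLabeledElement {
    Map<String, String> getLabels();
}

// Illustrative stand-in for a labeled metric instrument like NicerTimer.
class NicerTimer {
    final String metricFamilyName;
    final Map<String, String> labels;

    NicerTimer(String metricFamilyName, Map<String, String> labels) {
        this.metricFamilyName = metricFamilyName;
        this.labels = labels;
    }
}

class ActivityMetricsSketch {
    // The labeled element comes first and supplies the identifying labels;
    // the metric family name is passed separately.
    static NicerTimer timer(NBLabeledElement parent, String metricFamilyName) {
        Map<String, String> labels = new LinkedHashMap<>(parent.getLabels());
        return new NicerTimer(metricFamilyName, labels);
    }
}
```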
However, the appropriate set of labels which should be provided might vary by caller, as sometimes
the caller is an Activity, sometimes an OpDispenser within an activity, sometimes a user script,
etc.
This section describes the different caller (instrumented element, AKA NBLabeledElement)
contexts and what labels are expected to be provided for each. Each level is considered
a nested layer below some other element, which implicitly includes all labeling data from
above.
# Labeling Contexts
- NoSQLBench Process
- "appname": "nosqlbench"
- Scenario Context (calling as Scenario)
- IFF Named Scenario Mode:
- "workload": "..." # from the file
- "scenario": "..." # from the scenario name
- "usermode": "named_scenario"
- IFF Run Mode:
- "workload": "..." # from the file
- "scenario": "..." # from the (auto) scenario name
- "usermode": "adhoc_activity"
- Activity Context (calling as Activity)
- includes above labels
  - IFF Named Scenario Mode:
- "step": "..."
- "alias": "${workload}_${scenario}_${step}"
- ELSE
- "alias": "..." # just the activity alias
- Op Template Context (calling as OpDispenser)
- includes above labels
- "op": "<name of the parsed op>"
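For illustration, an op-level metric emitted in named scenario mode would carry the
cumulative label set from all of the contexts above. The metric family name and label
values in this sample are hypothetical, not actual NoSQLBench output:

```
# A hypothetical timer count in Prometheus exposition format, with labels
# accumulated from the process, scenario, activity, and op template contexts.
op_latency_seconds_count{appname="nosqlbench",workload="baselines",scenario="default",usermode="named_scenario",step="main",alias="baselines_default_main",op="read-op"} 1500
```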
# Additional Data
In the future it would be nice to include both the driver adapter name and the space name.
# Caller and Callee Semantics
When constructing child elements, or _owned_ fields, the calling convention is to provide
_this_ element as the labeled object.
When returning labels as a labeled object, the convention is to return the labels from
the labeled parent object with the name of _this_ object appended to the end of the
label set.
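The two conventions above can be sketched as follows. This is an illustrative model only,
assuming a hypothetical `getLabels()` accessor and a generic `Element` type rather than
the actual NBLabeledElement API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical model of the caller/callee labeling convention.
interface Labeled {
    Map<String, String> getLabels();
}

class Element implements Labeled {
    private final Labeled parent;   // the owning element, passed in at construction
    private final String labelName; // the label key this element contributes
    private final String name;      // the label value this element contributes

    // When constructing a child (owned) element, the caller provides
    // itself as the labeled parent object.
    Element(Labeled parent, String labelName, String name) {
        this.parent = parent;
        this.labelName = labelName;
        this.name = name;
    }

    // When returning labels, start from the parent's labels and append
    // this element's own name at the end of the label set.
    @Override
    public Map<String, String> getLabels() {
        Map<String, String> labels = (parent == null)
            ? new LinkedHashMap<>()
            : new LinkedHashMap<>(parent.getLabels());
        labels.put(labelName, name);
        return labels;
    }
}
```

Using an insertion-ordered map preserves the nesting order, so parent labels always
appear before the labels contributed by the element itself.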