merge main into http_finish and fixup conflicts and http APIs

This commit is contained in:
Jonathan Shook
2020-09-15 20:33:31 -05:00
381 changed files with 83170 additions and 8326 deletions

View File

@@ -19,6 +19,9 @@ A clear and concise description of what you expected to happen.
**Additional context**
Add any other context about the problem here.
OS: Windows, Linux (distribution), macOS ...
environment: k8s, docker, ...
version info (`./nb --version` or `java -jar nb.jar --version`)
**Screenshots, if applicable**
If applicable, add screenshots to help explain your problem.

View File

@@ -9,10 +9,10 @@ jobs:
release:
runs-on: ubuntu-18.04
steps:
- name: checkout repo
uses: actions/checkout@v2
- name: setup java
uses: actions/setup-java@v1
with:
@@ -100,6 +100,20 @@ jobs:
MAVEN_REPO_SERVER_USERNAME: ${{ secrets.MVN_REPO_PRIVATE_REPO_USER }}
MAVEN_REPO_SERVER_PASSWORD: ${{ secrets.MVN_REPO_PRIVATE_REPO_PASSWORD }}
- name: bundle integration test logs
run: |
pwd
ls -l
mkdir -p itlogs/nb
cp -R nb/logs itlogs/nb
- name: upload integration test logs
uses: actions/upload-artifact@v1
with:
name: itlogs
path: itlogs
- name: perform release
run: scripts/release-perform.sh
continue-on-error: true
@@ -121,7 +135,8 @@ jobs:
run: |
pwd
ls -l
mkdir staging && cp nb/target/nb.jar nb/target/nb staging
mkdir staging
cp nb/target/nb.jar nb/target/nb staging
- name: upload artifacts
uses: actions/upload-artifact@v1

1
.gitignore vendored
View File

@@ -1,4 +1,5 @@
.run/**
workspaces/**
workshop/**
local/**
metrics/**

View File

@@ -1,7 +0,0 @@
language: java
jdk:
- openjdk12
cache:
directories:
- .autoconf
- $HOME/.m2

View File

@@ -1,18 +1,35 @@
NoSQLBench is an ambitious project. It aims to solve long-standing problems in distributed systems
testing. There are *many* ways you can contribute! Please take a moment to review this document
in order to make the contribution process easy and effective for everyone involved.
## Code of Conduct
This project follows a [CODE_OF_CONDUCT](CODE_OF_CONDUCT.md).
Please read through it at least once if you are going to contribute to our endeavor.
This project follows a [CODE_OF_CONDUCT](CODE_OF_CONDUCT.md). Please read
through it at least once if you are going to contribute to NoSQLBench.
## Licensing
All source code in this repository is licensed exclusively under
[Apache License, Version 2.0](http://www.apache.org/licenses/LICENSE-2.0).
## Ways to Contribute
## Jumping Right In
Some quick how-to docs have been written for some of the subject-matter
areas in NoSQLBench. If you need an onramp that is not listed here, let us
know!
[I am a developer and I want to contribute a driver.](devdocs/contributing_drivers.md)
[I am a user and I want to improve the documentation.](devdocs/improving_documentation.md)
[I am a user and I want to contribute built-in scenarios.](devdocs/contributing_builtins.md)
[I am a UI developer and I want to improve the NoSQLBench UI (NBUI)](devdocs/developint_nbui.md)
## Contribution Ideas
There are lots of ways to contribute to the project. Some ideas on how to
get started are described here.
### Feedback
@@ -22,23 +39,23 @@ All source code in this repository is licensed exclusively under
This can help us understand better what conditions cause the issue to occur. If you want to help
address a non-trivial issue that has been reported, you can follow the steps that the original user
reported. Updating issues with such findings is immensely helpful to the maintainers.
### Pull Requests
All contributions to this project are made in the form of pull requests, including:
- Easy Picks - Any issue labeled `easy pick` is a good first issue for new contributors. These are tagged
by the maintainers specifically to provide a gentle on-ramp for those joining our endeavor.
- Fixing bugs - Bug fixes are awesome! If you submit a pull request with a bug fix, we will review it
and provide feedback if refinements are needed. Otherwise we'll merge it in for the next release.
- Testing Improvements - Often, a good test is the best form of documentation. Test coverage can always
be improved. Writing tests is also a great way to get familiar with the project up close.
- Writing documentation - Our users need good documentation. Good documentation is, however, a community
effort. If you have used NoSQLBench and want to improve the journey for others, good documentation
with examples is really helpful.
- Writing web apps - The doc system that is bundled with NoSQLBench is a web application. This is just
the first web app, but we expect there will be more. If you are interested in helping with this part,
@@ -49,41 +66,54 @@ All contributions to this project are made in the form of pull requests, includi
with internal architecture, there is work to be done. This is on the more advanced end of contribution,
and requires some study of the codebase. If you take the time to understand the code a bit before submitting
pull requests, then the maintainers will take the time to help you make your pull requests better.
### Maintainers
We are looking for more community ownership in this project. That means that we will be supportive of
new project contributors who wish to build a sense of trust with the maintainers.
- Reviewing Pull Requests - If you are a maintainer on this project, then you may be called on to review
pull requests. Unless requested directly, pull requests will be allocated to reviewers automatically.
- Enforcing Conduct - You may be called upon as a maintainer of this project to enforce the code of conduct
as it is written.
- Developing Maintainers - It is important that there be a solid core of project maintainers who can depend
on one another and whom the project user base can depend on to govern the project fairly and equitably.
As a project maintainer, you will be expected to help guide contributors as they learn about the project.
You may be responsible for choosing maintainers or voting on maintainer membership.
- Reviewing Pull Requests - If you are a maintainer on this project, then
you may be called on to review pull requests. Unless requested directly,
pull requests will be allocated to reviewers automatically.
- Enforcing Conduct - You may be called upon as a maintainer of this
project to enforce the code of conduct as it is written.
- Developing Maintainers - It is important that there be a solid core of
project maintainers who can depend on one another and whom the project
user base can depend on to govern the project fairly and equitably. As a
project maintainer, you will be expected to help guide contributors as
they learn about the project. You may be responsible for choosing
maintainers or voting on maintainer membership.
- Documenting APIs - The developer docs are pretty lean right now. We need
some examples of building activity types, adapting statement templates
and so on. This level of contribution requires an intimate awareness of
how the core engine works, but it is also some of the most valuable work
that you can do in terms of expanding the nosqlbench ecosystem.
Contributions are welcome here!
- Documenting APIs - The developer docs are pretty lean right now. We need some examples of building
activity types, adapting statement templates and so on. This level of contribution requires an intimate
awareness of how the core engine works, but it is also some of the most valuable work that you can
do in terms of expanding the nosqlbench ecosystem. Contributions are welcome here!
## Outreach
Help us get the word out on NoSQLBench. It is a newly opened project in its current form, and we
are eager to get it into the hands of users who need it.
- Writing blog posts - A blog post that helps others see the good in NoSQLBench is a service to our community.
This is even better when it comes from a new user who has seen the merit of the tooling. We appreciate any
help.
- Helping other users - Helping new users get to a productive state is a great way to build bridges in the
community. The more community advocates we have helping each other the better!
- Writing blog posts - A blog post that helps others see the good in
NoSQLBench is a service to our community. This is even better when it
comes from a new user who has seen the merit of the tooling. We
appreciate any help.
- Helping other users - Helping new users get to a productive state is a
great way to build bridges in the community. The more community
advocates we have helping each other the better!
- Community Development - The NoSQLBench project endeavors to build a
strong, inclusive, and robust support system around users and
contributors alike. This takes on many forms. It is essential that we
keep looking for ways to connect the NoSQLBench community, doing more of
what works and less of what doesn't. If you want to help with community
development, please join our slack channel and raise your hand!

View File

@@ -6,6 +6,8 @@
[Get it Here](DOWNLOADS.md)
[Contribute to NoSQLBench](CONTRIBUTING.md)
[Read the Docs](http://docs.nosqlbench.io/)
## What is NoSQLBench?

View File

@@ -1,35 +1,4 @@
- a689b4c9 initial NBUI docs
- cd450c17 Fix typo
- b723924e #159 Possible NPE bug in URL resolver
- c5a98420 start of op grammar
- d083483f function adapter handles LongToIntFunction
- b3eecb4d clarify naming of statement fields
- a0dc30a6 clarify default type of parameter
- 5ca319e7 #169 fix template preprocessor regression
- e7daf97d #168 fix parameter types for free params on statements
- 2f083001 Allow easy load balancer configuration for CQL driver #173
- 73464c43 fix scenario parsing bug when named arg contains assignments
- 105c59d3 Bump log4j.version from 2.13.0 to 2.13.3 in /mvn-defaults
- 8725cc40 Fix MongoAction incorrect error message
- 7f2c2e31 mongodb: Update yaml
- 46a3bb62 (HEAD -> main) improve bindings-collections
- a69ae0ba add optional bindings filter to stdout
- 91ea3c71 document stdout bindings filter
- 8992dd12 add collection bindings example
- eb470df3 switch list functions over to better function adapter
- e8f76ddf move set functions over to better function adapter
- 7ee17265 improve function adapter coverage
- fdeacc86 (origin/main) add test for new functional type
- 658d400f add missing support for DoubleToLongFunction
- 878c8896 bring Set functions up to parity with List Functions
- 6099ac3e add function and generic specific conversion support
- 9899a15f collection functions blurb
- 9a42aacd deprecate old list funcs
- 6a22d411 make list function descriptions follow a pattern
- 7461f0e8 add http test yamls
- 9aaf2991 bump graalvm to 20.1.0
- 2d9512c2 start error handling dev guide
- 102f60ac restore support for adhoc activities in docker metrics
- 3841693b (HEAD -> main) fix integration tests after alias renaming
- 138251cb (origin/main) dashboard fix release
- 102f60ac restore support for adhoc activities in docker metrics
- 13240cd8 (HEAD -> main) phase 1 of openapi testing endpoints
- 35abcc03 provide JSON friendly optimo result
- f22a0462 correct mispelling of parametrized
- 832064c8 correct mispelling of parametrized

1
category.txt Normal file
View File

@@ -0,0 +1 @@
c1

View File

@@ -0,0 +1,35 @@
# Contributing a Driver
Drivers in NoSQLBench are how you get the core machinery to speak a
particular protocol or syntax. At a high level, a NB driver is responsible
for mapping a data structure into an executable operation that can be
called by NoSQLBench.
Internally, the name `ActivityType` is used to name different high level
drivers within NoSQLBench. This should avoid confusion with other driver
terms.
Drivers in NoSQLBench are separate maven modules that get added to the
main nb.jar artifact (and thus the AppImage binary). For now, all these
drivers live in-tree, but we may start allowing these to be packaged as
separate jar files. Let us know if this would help your integration
efforts.
## Start with Examples
There are a few activity types which can be used as templates. It is
recommended that you study the stdout activity type as your first example
of how to use the runtime API. The HTTP driver is also fairly new and thus
uses the simpler paths in the API to construct operations. Both of these
are recommended as starting points for new developers.
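
As a very rough illustration of the shape of a driver (the interface and method names below are
stand-ins invented for this sketch, not the actual runtime API; the stdout driver shows the real
thing), a NB driver boils down to a named entry point plus a way to turn each cycle into an
executable operation:

```java
// Hypothetical sketch only: the real driver SPI lives in the NB engine modules.
// These stand-in types are defined locally so the example compiles on its own.

interface Op {
    void execute();                       // one executable operation
}

interface DriverSketch {
    String getName();                     // the name users select with driver=<name>
    Op bindOp(long cycle);                // map a cycle number to an executable operation
}

final class HelloDriver implements DriverSketch {
    @Override
    public String getName() {
        return "hello";
    }

    @Override
    public Op bindOp(long cycle) {
        // In a real driver, an op template plus bindings for this cycle
        // would be turned into a native operation here.
        return () -> System.out.println("op for cycle " + cycle);
    }

    public static void main(String[] args) {
        DriverSketch driver = new HelloDriver();
        for (long cycle = 0; cycle < 3; cycle++) {
            driver.bindOp(cycle).execute();
        }
    }
}
```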
## Consult the Dev Guide
The developers guide is not complete, but it does call out some of the
features that any well-built NB driver should have. If you are one of the
early builders of NB drivers, please help us improve the dev guide as you
find things you wish you had known before, or ways of getting started.
The developer's guide is a work in progress. It lives under
devdocs/devguide in the root of the project.

View File

@@ -2,9 +2,10 @@
This is a work in progress...
This is the document to read if you want to know if your NoSQLBench driver is complete.
Within this document, the phrase `conformant` will be taken to mean that a driver or feature
is implemented according to the design intent and standards of the NoSQLBench driver API standards.
is implemented according to the design intent and standards of the NoSQLBench driver API.
While it may be possible to partially implement a driver for basic use, following the guidelines
in this document will ensure that contributed drivers for NoSQLBench work in a familiar and
@@ -29,29 +30,112 @@ A conformant driver should use the standard method of creating an operational se
that a driver simply has to provide a function to map an OpTemplate to a more ready-to-use form that
is specific to the low level driver in question.
## Terms
- NB Driver - The NoSQLBench level driver, the code that this document
refers to.
- Native driver - An underlying driver which is provided by a vendor or
project.
## Metrics
At a minimum, a conformant driver should provide the following metrics:
- **bind** (timer) - A timer around the code that prepares an executable form of a statement
- **execute** (timer) - A timer around the code that submits work to a native driver
- **result** (timer) - A timer around the code that awaits and processes results from a native driver. This
timer should be included around all operations, successful ones and errors too.
- **result-success** (timer) - A timer around the code that awaits and processes results from a native driver.
This timer should only be updated for successful operations.
- **errorcounts-...** (counters)- Each uniquely named exception or error type that is known to the native driver
should be counted.
- **tries** (histogram) - The number of tries for a given operation. This number is incremented before each
execution of a native operation, and when the result timer is updated, this value should be updated
as well (for all operations). This includes errored operations.
- bind (timer) - A timer around the code that prepares an executable form
of a statement.
- execute (timer) - A timer around the code that submits work to a native
driver. This is the section of code which enqueues an operation to
complete, but not the part that waits for a response. If a given driver
doesn't have the ability to hand off a request to an underlying driver
asynchronously, then do not include this metric.
- result (timer) - A timer around the code that awaits and processes
results from a native driver. This timer should be included around all
operations, successful ones and errors too. The timer should start
immediately when the operation is submitted to the native driver, which
is immediately after the bind timer above is stopped for non-blocking
APIs, or immediately before an operation is submitted to the native
driver API for all others.
- result-success (timer) - A timer around the code that awaits and
processes results from a native driver. This timer should only be
updated for successful operations. The same timer values should be used
as those on the result timer, but they should be applied only in the
case of no exceptions during the operation's execution.
- errorcounts-... (counters) - Each uniquely named exception or error type
that is known to the native driver should be counted.
- tries (histogram) - The number of tries for a given operation. This
number is incremented before each execution of a native operation, and
when the result timer is updated, this value should be updated as well
(for all operations). This includes errored operations.
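
To make the timing boundaries concrete, here is a minimal sketch of how these metrics might be
wired around a single operation. It uses a plain Dropwizard-style `MetricRegistry` for
illustration; the metric names match the list above, but the registry and wiring are assumptions,
not the actual NB metrics helpers:

```java
import com.codahale.metrics.Histogram;
import com.codahale.metrics.MetricRegistry;
import com.codahale.metrics.Timer;

import java.util.concurrent.TimeUnit;

public class DriverMetricsSketch {

    private final MetricRegistry registry = new MetricRegistry();
    private final Timer bindTimer = registry.timer("bind");
    private final Timer executeTimer = registry.timer("execute");
    private final Timer resultTimer = registry.timer("result");
    private final Timer resultSuccessTimer = registry.timer("result-success");
    private final Histogram triesHistogram = registry.histogram("tries");

    /** Run one operation for the given cycle, updating the standard metrics. */
    public void runCycle(long cycle) {
        Runnable op;
        try (Timer.Context ignored = bindTimer.time()) {
            op = () -> { /* a bound, executable operation for this cycle */ };
        }

        int tries = 0;
        Timer.Context resultTime = resultTimer.time();    // result covers success and error
        try {
            tries++;
            try (Timer.Context ignored = executeTimer.time()) {
                op.run();                                 // submit the operation (shown synchronously for brevity)
            }
            // Only successful operations update result-success, using the same elapsed time.
            resultSuccessTimer.update(resultTime.stop(), TimeUnit.NANOSECONDS);
        } catch (RuntimeException e) {
            resultTime.stop();
            registry.counter("errorcounts." + e.getClass().getSimpleName()).inc();
        } finally {
            triesHistogram.update(tries);                 // updated whenever the result timer is updated
        }
    }
}
```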
## Error Handling
Users often want to control what level of sensitivity their tests have to errors. Testing requirements
vary from the basic "shutdown the test when any error occurs" to the more advanced "tell me when the
error rate exceeds some threshold", and so on.
Users often want to control what level of sensitivity their tests have to
errors. Testing requirements vary from the basic "shutdown the test when
any error occurs" to the more advanced "tell me when the error rate
exceeds some threshold", and so on. The essential point here is that
without flexibility in error handling, users may not be able to do
reasonable testing for their requirements.
Configurable error handling is essential.
Until the error handling subsystem is put in place, these types of error
handling levels are suggested:
1. stop
2. warn
3. retry
4. histogram
5. count
6. ignore
This serves as an error handling stack, where the user chooses the entry
point. From the user-selected entry point, all of the remaining actions
are taken until the end if possible.
### stop
If an exception occurs, and the user has selected the `stop` error
handler, then the activity should be stopped. This is triggered by
allowing the NB driver to propagate a runtime exception up the stack.
Since this error handler interrupts flow of an activity, no further error
handling is done.
### warn
If an exception occurs and the user has selected the `warn` error handler,
then the exception should be logged at WARN level.
The next error handler `retry` should also be called.
### retry
If an exception occurs and the user has selected the `retry` error
handler, **AND** the exception represents a type of error which could
reasonably be expected to be non-persistent, then the operation should be
re-submitted after incrementing the tries metric.
Whether or not the operation is retried, the next error handler
`histogram` should also be called.
### histogram
If an exception occurs and the user has selected the `histogram` error
handler, the error should be recorded with the helper class
`ExceptionHistoMetrics`. This adds metrics under the `errorhistos` label
under the activity's name.
The next error handler `count` should also be called.
### count
If an exception occurs and the user has selected the `count` error
handler, then the error should be counted with the helper class
`ExceptionCountMetrics`. This adds metrics under the `errorcounts` label
under the activity's name.
The next exception handler `ignore` should also be called, but this is
simply a named 'no-op' which is generally the last fall-through case in a
switch statement.
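
As a sketch of how that entry-point-with-fall-through behavior reads in code (the handler names
mirror the list above; the logging, retry hook, and metric callbacks here are illustrative
placeholders, not the actual NB implementation):

```java
import java.util.function.Consumer;
import java.util.logging.Logger;

/** Illustrative error handling stack: the user chooses an entry point, and every
 *  handler below that point also runs, ending at the no-op `ignore` case. */
public class ErrorHandlerStackSketch {

    public enum Handler { STOP, WARN, RETRY, HISTOGRAM, COUNT, IGNORE }

    private static final Logger log = Logger.getLogger("errors");

    public static void handle(Handler entryPoint, RuntimeException error, Runnable retryOp,
                              Consumer<String> errorHisto, Consumer<String> errorCounter) {
        switch (entryPoint) {                              // intentional fall-through
            case STOP:
                throw error;                               // propagate to stop the activity
            case WARN:
                log.warning("operation failed: " + error);
                // fall through
            case RETRY:
                if (isRetryable(error)) {
                    retryOp.run();                         // re-submit (a real driver also bumps the tries metric)
                }
                // fall through
            case HISTOGRAM:
                errorHisto.accept(error.getClass().getSimpleName());
                // fall through
            case COUNT:
                errorCounter.accept(error.getClass().getSimpleName());
                // fall through
            case IGNORE:
            default:
                break;                                     // named no-op, the last fall-through case
        }
    }

    private static boolean isRetryable(RuntimeException e) {
        return e instanceof IllegalStateException;         // stand-in for a driver-specific check
    }
}
```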
TBD
@@ -64,13 +148,30 @@ TBD
TBD
### Parameter naming
Parameters should be formatted as snake_case by default. Hyphens or camel
case often cause issues when using mixed media such as command lines and
yaml formats. Snake case is a simple common denominator which works across
all these forms with little risk of ambiguity when parsing or documenting
how parameters are set apart from other syntax.
## Documentation
Each activity is required to have a set of markdown documentation in its resource directory.
The name of the driver should also be used as the name of the documentation for that driver.
Each activity is required to have a set of markdown documentation in its
resource directory. The name of the driver should also be used as the name
of the documentation for that driver.
Additional documentation can be added beyond this file. However, all documentation for a given driver
must start with the driver's name and a hyphen.
Additional documentation can be added beyond this file. However, all
documentation for a given driver must start with the driver's name and a
hyphen.
If a driver wants to include topics, the convention is to mention these
other topics within the driver's main help. Any markdown file which is
included in the resources of a driver module will be viewable by users
with the help command `nb help <name>`. For example, if a driver module
contains `../src/main/resources/mydriver-specials.md`, then a user would
be able to find this help by running `nb help mydriver-specials`.
These sources of documentation can be wired into the main NoSQLBench documentation system with a set
of content descriptors.
@@ -79,9 +180,121 @@ of content descriptors.
Conformant driver implementations should come with one or more examples of a workload under the
activities directory path.
Useful driver implementations should come with one or more examples of
workloads under the activities directory path. These examples should
employ the "named scenarios" format as described in the main docs. By
including named scenarios in the yaml format, these named scenarios then
become available to users when they look for scenarios to call with the
`--list-scenarios` command.
## Examples
To include such a scenario, simply add a working yaml with a scenarios
section to the root of your module under the
`src/main/resources/activities` directory.
## Included Examples
Useful driver implementations should come with a set of examples under the
examples directory path which demonstrate useful patterns, bindings, or
statement forms.
Users can find these examples in the same way as they can find the named
scenarios above with the only difference being their location. By
convention, the `src/main/resources/examples` directory is where
these are located.
The format is the same as for named scenarios, because the examples *are*
named scenarios. Users can find these by using the `--include=examples`
option in addition to the `--list-scenarios` command.
## Testing and Docs
Complete driver implementations should also come with a set of examples under the examples
directory path.
Unit testing within the NB code base is necessary in many places, but not
in others. Use your judgement about when *not* to add unit testing, but
default to adding it when the choice seems subjective. A treatise on when and how
to choose appropriate unit testing won't fit here, but suffice it to say
that you can always ask the project maintainers for help on this if you
need it.
Non-trivial code in pull requests without any form of quality checks or
testing will not be merged until or unless the project maintainers are
satisfied that there is little risk of user impact. Experimental features
clearly labeled as such will be given more wiggle room here, but the label
will not be removable unless/until a degree of robustness is proven in
some testing layer.
### Testing Futures
In the future, the integration testing and the docs system are intended to
become part of one whole. Particularly, docs should provide executable
examples which can also be used to explain how NB or drivers work. Until
this is done, use the guidelines above.
## Usage of the Op Template
The operation which will be executed in a driver should be derivable from
the YAML as provided by a user. Thankfully, NoSQLBench goes to great
lengths to make this easy for both the user and the driver
developer. In particular, NB presents the user with a set of formatting
options in the YAML which are highly flexible in terms of syntax and
structure. On the other side, it presents the driver developer with a
service interface which contains all the input from the user as a complete
data structure.
This means that the driver developer needs to make it clear how different
forms of content from the YAML will map into an operation. Fundamentally,
a driver is responsible for mapping the fully realized data structure of
an `op template` into an operation which NoSQLBench can execute.
In some protocols or syntaxes, the phrase _the statement_ makes sense, as
it is the normative way to describe what an operation does. This is true,
for example, with CQL. In CQL you have a statement which provides a very
clear indication of what a user is expecting to happen. At a more abstract
level, and more correctly, the content that a user puts into the YAML is a
`statement template`, and more generally within NoSQLBench, an `operation
template`. This is simply called an `op template` going forward, but as a
driver developer, you should know that these are simply different levels
of detail around the same basic idea: an executable operation derived from
a templating format.
Since there are different ways to employ the op template, a few examples
are provided here. As a driver developer, you should make sure that your
primary docs include examples like these for users. Good NB driver docs
will make it clear to users how their op templates map to executable
operations.
### op template: statement form
In this form, the op template is provided as a map, which is also an
element of the statements array. The convention here is that the values
for _the statement name_ and _the statement_ are taken as the first
key and value. Otherwise, the special properties `name` and `stmt` are
explicitly recognized.
```text
statements:
- aname: the syntax of the statement with binding {b1}
tags:
tag1: tagvalue1
params:
param1: paramvalue1
freeparamfoo: freevaluebar
bindings:
b1: NumberNameToString()
```
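
For illustration only, here is how that first-key/first-value convention could be read out of the
parsed map (the real mapping is handled for you by the OpTemplate API; this sketch just shows the
convention):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Demonstrates the naming convention for the statement form of an op template. */
public class OpTemplateConventionSketch {

    static String[] nameAndStmt(Map<String, String> opMap) {
        // Explicit `name` and `stmt` keys win if both are present ...
        if (opMap.containsKey("name") && opMap.containsKey("stmt")) {
            return new String[]{opMap.get("name"), opMap.get("stmt")};
        }
        // ... otherwise the first key/value pair is taken as the name and the statement.
        Map.Entry<String, String> first = opMap.entrySet().iterator().next();
        return new String[]{first.getKey(), first.getValue()};
    }

    public static void main(String[] args) {
        Map<String, String> op = new LinkedHashMap<>();
        op.put("aname", "the syntax of the statement with binding {b1}");
        String[] parsed = nameAndStmt(op);
        System.out.println("name=" + parsed[0] + ", stmt=" + parsed[1]);
    }
}
```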
### op template: map form
```text
statements:
- cmd_type:
```
Structural variations and conventions.
## Handling secrets
Reading passwords ...

120
devdocs/devguide/modules.md Normal file
View File

@@ -0,0 +1,120 @@
# NoSQLBench Module Dependencies
```plantuml
digraph Test {
compound=true
rankdir=LR;
node[shape=none]
node[shape="box"]
subgraph cluster1 {
label="drivers"
driver_diag[label="driver-diag"];
driver_stdout[label="driver-stdout"];
driver_tcp[label="driver-tcp"];
driver_http[label="driver-http"];
driver_cql_shaded[label="driver-cql-shaded"];
driver_cqlverify[label="driver-cqlverify"];
driver_web[label="driver-web"];
driver_kafka[label="driver-kafka"];
driver_mongodb[label="driver-mongodb"];
driver_jmx[label="driver-jmx"];
}
driver_diag -> engine_api [ltail="cluster1"];
subgraph cluster0 {
label="engine"
engine_core[label="engine-core"];
engine_extensions[label="engine-extensions"];
engine_docs[label="engine-docs"];
engine_cli[label="engine-cli"];
engine_rest[label="engine-rest"];
engine_docker[label="engine-docker"];
engine_api[label="engine-api"];
docsys[label="docsys"];
}
engine_api -> drivers_api;
// subgraph cluster2 {
// label="APIs"
nb_api[label="nb-api"];
nb_annotations[label="nb-annotations",tooltip="sdf"];
nb_api -> nb_annotations;
//}
subgraph cluster3 {
label="virtdata-userlibs"
virtdata_lib_curves4[label="virtdata-lib-curves4"];
virtdata_lib_realer[label="virtdata-lib-realer"];
virtdata_lib_random[label="virtdata-lib-random"];
virtdata_realdata[label="virtdata-realdata"];
virtdata_lib_basics[label="virtdata-lib-basics"];
virtdata_api[label="virtdata-api"];
virtdata_lang[label="virtdata-lang"];
}
docsys -> nb_api;
engine_api -> nb_api;
engine_api -> nb_annotations;
engine_api -> virtdata_lib_basics[lhead="cluster3"];
engine_core -> engine_api [ltail="cluster0"];
engine_docs -> docsys;
engine_cli -> engine_core;
engine_cli -> engine_docker;
engine_rest -> engine_cli;
/**
nb[label="nb"];
nb -> driver_web;
nb -> driver_kafka;
nb -> driver_stdout;
nb -> driver_diag;
nb -> driver_tcp;
nb -> driver_http;
nb -> driver_jmx;
nb -> driver_cql_shaded;
nb -> driver_cqlverify;
nb -> driver_cql_mongodb;
nb -> engine_rest;
nb -> engine_cli;
nb -> engine_docs;
nb -> engine_core;
nb -> engine_extensions;
**/
driver_tcp -> driver_stdout;
driver_cqlverify -> driver_cql_shaded;
driver_kafka -> driver_stdout;
virtdata_api -> nb_api;
virtdata_api -> virtdata_lang;
virtdata_lib_basics -> virtdata_api [ltail=cluster3];
virtdata_lib_random -> virtdata_lib_basics
virtdata_lib_curves4 -> virtdata_lib_basics;
/**
mvndefaults[label="mvn-defaults"];
mvndefaults -> TESTDEPS;
**/
virtdata_lib_realer -> virtdata_lib_basics;
/**
virtdata_userlibs -> virtdata_realdata;
virtdata_userlibs -> virtdata_lib_realer;
virtdata_userlibs -> virtdata_lib_random;
virtdata_userlibs -> virtdata_lib_basics;
virtdata_userlibs -> virtdata_lib_curves4;
virtdata_userlibs -> docsys;
**/
}
```

View File

@@ -1,48 +1,59 @@
# Bundled Docs
# NBDocs - NoSQLBench Docs
In order to keep the structure of NoSQLBench modular enough to allow for easy extension by contributors, yet cohesive in
how it presents documentation and features to users, it is necessary to provide internal services which aggregate
content by subject matter into a consumable whole that can be used by the documentation system.
__THIS IS A WORK IN PROGRESS__
# MarkdownDocs Service
In order to keep the structure of NoSQLBench modular enough to allow for easy extension by
contributors, yet cohesive in how it presents documentation and features to users, it is necessary
to provide internal services which aggregate content by subject matter into a consumable whole that
can be used by the documentation system.
## MarkdownDocs Service
The primary markdown service that is meant to be consumed by the documentation system is known simply as
MarkdownDocs
Static methods on this class will provide all of the markdown content in pre-baked and organized form. The markdown
service is responsible for reading all the raw markdown sources and organizing their content into a single cohesive
structure. MarkdownDocs finds all content that is provided by individual MarkdownProvider services, as described below.
Static methods on this class will provide all of the markdown content in pre-baked and organized
form. The markdown service is responsible for reading all the raw markdown sources and organizing
their content into a single cohesive structure. MarkdownDocs finds all content that is provided by
individual MarkdownProvider services, as described below.
All of the rules for how raw markdown content is to be combined are owned by the MarkdownDocs service.
All of the rules for how raw markdown content is to be combined are owned by the MarkdownDocs
service.
The MarkdownDocs service relies on SPI published services which provide raw markdown sources as described below.
The MarkdownDocs service relies on SPI published services which provide raw markdown sources as
described below.
# RawMarkdownSource Services
## RawMarkdownSource Services
The `RawMarkdownSource` service is responsible for bundling the raw markdown for a path within a NoSQLBench module. Each
module that wishes to publish markdown docs to users must provide one or more RawMarkdownSource services via SPI. This
is most easily done with a `@Service(RawMarkdownSource.class)` annotation.
The `RawMarkdownSource` service is responsible for bundling the raw markdown for a path within a
NoSQLBench module. Each module that wishes to publish markdown docs to users must provide one or
more RawMarkdownSource services via SPI. This is most easily done with a
`@Service(RawMarkdownSource.class)` annotation.
## RawMarkdownSource endpoints provide Content
Each instance of a RawMarkdownSource service provides all of the individual markdown files it finds indirectly as
io.nosqlbench.nb.api.content.Content, which allows the internal file content to be read appropriately regardless of
whether it comes from a classpath resource stream, a file on disk, or even a dynamic source like function metadata.
Each instance of a RawMarkdownSource service provides all of the individual markdown files it finds
indirectly as io.nosqlbench.nb.api.content.Content, which allows the internal file content to be
read appropriately regardless of whether it comes from a classpath resource stream, a file on disk,
or even a dynamic source like function metadata.
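
Loosely, a module-level provider might look something like the sketch below. The stand-in
interface and its method are assumptions made purely for illustration; the real
`RawMarkdownSource` interface in the nb-api module is the authoritative contract:

```java
import java.util.List;

// Stand-in for illustration only; the real RawMarkdownSource interface returns
// Content instances and may have a different shape.
interface RawMarkdownSourceSketch {
    List<String> getMarkdownResourcePaths();
}

// In a real module this class would be annotated with @Service(RawMarkdownSource.class)
// so that it is discoverable over SPI.
final class MyDriverMarkdownSource implements RawMarkdownSourceSketch {
    @Override
    public List<String> getMarkdownResourcePaths() {
        return List.of("docs/mydriver.md", "docs/mydriver-specials.md");
    }
}
```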
## RawMarkdownSources Aggregator
A service aggregator called RawMarkdownSources provides easy access to all raw markdown sources provided by all
published instances of the service.
A service aggregator called RawMarkdownSources provides easy access to all raw markdown sources
provided by all published instances of the service.
# Front Matter Interpretation
## Front Matter Interpretation
There is a set of rules observed by MarkdownDocs for repacking markdown for structured display. These rules are largely
driven by front matter.
There is a set of rules observed by MarkdownDocs for repacking markdown for structured display.
These rules are largely driven by front matter.
## Doc Scope
### Doc Scope
There are three doc scopes that can be added to source markdown via the `scopes` front matter.
The `scope` front matter property determines where content should be visible. This is useful for
providing documentation about a topic or concept that can be pulled into multiple places. There are
three doc scopes that can be added to source markdown via the `scopes` front matter. This is a
multi-valued property.
* `cli` - The source content should be included for command-line searching and viewing.
* `web` - The source content should be included for static web documentation.
@@ -51,49 +62,165 @@ There are three doc scopes that can be added to source markdown via the `scopes`
If no scopes are provided, then a special scope `any` is assigned to source content.
__THIS IS A WORK IN PROGRESS__
**Examples**
## Topic Names
```yaml
---
scopes: all
---
```
The `topic` property determines
```yaml
---
scopes: cli,web
---
```
1. Front matter may be sanity checked for unrecognized properties.
2. All front matter that is considered is required to have at least one topic value.
3. Topic values which contain `, ` or `; ` patterns are auto-split into multiple topics.
4. Topics can be hierarchical. Topics in the form of `cat1/cat2/topicfoo` are considered nested topics, with the
containing layer being considered a category. The right most word is considered the basic topic name. This means that
in the above topic name, `cat1` is a topic category containing the `cat2` topic category, which contains the topic
`topicfoo`.
5. *Topic Expansion* - A topic entry which starts with a caret `^`, contains either of '.*', '.+', or ends with a `$` is
considered a wildcard topic. It will be treated as a topic pattern which will be compared to known topics. When it
matches another topic, the matched topic is added to the virtualized topic list of the owning item.
6. `aggregations` are used to physically aggregate content from matching topics onto a markdown source:
1. Each aggregation is a pattern that is tested against all topics after topic expansion.
2. When a source item is matched to an aggregation,
3. wildcards, except that they
cause all matching topics to be aggregated onto the body of the owning markdown source.
4. All topics (after topic expansion) are presented in order determined by weight. Aggregations are indicated with an `aggregation` property.
Aggregations are split on commas and semicolons as above, and are always considered patterns for matching. Thus, an
aggregation with none of the regex indicators above will only match topics with the same literal pattern.
### Topic Names
7. Front matter will be provided with topical aggregations included, with the following conditions:
* aggregations properties are elided from the repacked view. Instead, an `included` header is added which lists all
of the included topics.
The `topic` front-matter property determines the association between a documentation fragment and the
ways that a user might name or search for it. All content within NBDocs falls within one or more
nested topics. That is, raw content could be homed under a simple topic name, or it could be homed
under a topic which has another topic above it.
1. All front matter that is considered is required to have at least one topic value.
2. Topic values which contain `, ` or `; ` patterns are auto-split into multiple topics.
3. Topics can be hierarchical. Topics in the form of `cat1/cat2/topicfoo` are considered nested
topics. Topics which contain other topics are called topic categories, but they are also topics.
4. Topics can be literal values or they can be patterns which match other topics.
## Composite Markdown
**examples**
When aggregations occur, the resulting markdown that is produced is simply a composite of all of the included markdown
sources. The front matter of the including markdown source becomes the first element, and all other included sources are added
after this. The front matter of the including markdown becomes the representative front matter for the composite
markdown.
```yaml
---
topics: cli, parameters
---
```
## Indexing Data
### Topic Aggregation
Indexing data should be provided in two forms:
Topics can be placeholders for matching other topics. When a topic name starts with a caret `^`,
contains either `.*` or `.+`, or ends with a `$`, it is called a topic pattern.
1. The basic metadata index which includes topics, titles, and other basic info and logical path info. This view is used
to build menus for traversal and other simple views of topics as needed for direct presence check, or lookup.
2. An FTS index which includes a basic word index with stemming and other concerns pre-baked. This view is used as a
cache-friendly searchable index into the above metadata.
The patterns are regular expressions; the specific markers above are simply what is used to detect
that a topic value is a pattern.
This allows for content to be included in other places. When source content is marked with a topic
pattern, any other source content that is matched by that pattern is included in views of its raw
content. Further, the topics which are matched are added to an `included` property in the matching
content's front matter. The matched content is not affected in any other way and is still visible
under the matched topic names.
It is considered an error to create a circular reference between different topics. This condition is
checked explicitly by the topic mapping logic.
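
A small sketch of the pattern-detection rule described above (illustrative only):

```java
/** Topics that carry regex markers are treated as topic patterns rather than literal names. */
public class TopicPatternSketch {

    /** A topic is a pattern if it starts with '^', contains ".*" or ".+", or ends with '$'. */
    static boolean isTopicPattern(String topic) {
        return topic.startsWith("^")
            || topic.contains(".*")
            || topic.contains(".+")
            || topic.endsWith("$");
    }

    public static void main(String[] args) {
        System.out.println(isTopicPattern("^cli$"));       // true: a topic pattern
        System.out.println(isTopicPattern("parameters"));  // false: a literal topic
    }
}
```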
**examples**
```yaml
---
title: owning topic
topics: ^cli$, foobarbaz
---
# Main Content
main content
```
```yaml
---
title: included topic
topics: cli, parameters
---
## Included content
included content
```
Taking the two source documents above, the markdown loading system would present the following
document which globs the second one onto the first:
```yaml
---
title: owning topic
topics: cli, foobarbaz, parameters
---
# Main Content
main content
## Included Content
included content
```
Within an aggregate source like the above, all included sections will be ordered according to the
weight front-matter property first, and then by the title, and then by the name of the interior
top-level heading of the source item.
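
That ordering could be expressed as a simple comparator, for example (a sketch assuming simple
accessors on a source item; the real type lives in the docs system):

```java
import java.util.Comparator;

/** Illustrative ordering for included sections: weight, then title, then first heading. */
public class IncludedSectionOrderSketch {

    // Hypothetical view of a markdown source item, for illustration only.
    record SourceItem(int weight, String title, String firstHeading) {}

    static final Comparator<SourceItem> INCLUDED_ORDER =
        Comparator.comparingInt(SourceItem::weight)
                  .thenComparing(SourceItem::title)
                  .thenComparing(SourceItem::firstHeading);
}
```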
## Logical Structure
According to the rules and mechanisms above, it is possible to organize all the provided content
into a clean and consistent structure for search and presentation.
The example below represents a schematic of what content sources will be provided in every document
after loading and processing:
```
---
title: <String>
scope: <Set<String>>
topics: <Set<String>>
included: <Set<String>>
weight: <Number>
---
<optional content>
```
Specifically:
- **title** will be provided as a string, even if it is empty.
- **scope** will be provided as a set of strings, possibly an empty set.
- **topics** will be provided as a set of strings, possibly an empty set.
- **included** will be provided as a set of strings, possibly an empty set.
- **weight** will be provided as a number, possibly 0.
Headings and content may or may not be provided, depending on how the content is aggregated and what
topics are matched for content aggregation.
## Repackaged Artifacts
The documentation aggregator system may be asked to store repackaged forms. These are suggested:
1. A metadata.json which includes the basic metadata of all the documentation entries, including:
1. module path of logical document
2. title
3. topics
4. included
5. weight
2. A manifest.json which includes:
1. module path of logical document
2. title of document
3. heading structure of document
3. A topics.json which includes the topic structure of all topics and headings
4. A full-text search index.
With these artifacts provided as services, clients can be efficient in discovering content and
making it searchable for users.
## Topic Mapping Logic
Topics may be transitive. There is no clear rationale for avoiding this, and at least one good
reason to allow it. The method for validating and resolving topic aggregations, where there may be
layers of dependencies, is explained here:
__BEING REWRITTEN__
1. All content sources are put into a linked list in no particular order.
2. The list is traversed from head to tail repeatedly.
3. When the head of the list is an aggregating source, and all of the matching elements are
non-aggregating, then the element is converted to a non-aggregating element.
4. If the head of the list is a non-aggregating source, it is moved to the tail.
5. When all of the elements of the list are non-aggregating, the mapping is complete.
6. When the

View File

@@ -0,0 +1,90 @@
---
title: Docs Design Guide
weight: 10
topics: Docs, Design Guides
---
# Docs Design Guide
This is an example of a well-formed markdown doc source for NBDocs. Everything in this doc is
structured according to the headings. Headings at level one are meant to be descriptive of the whole
document, thus there should only be one level-one heading above. There should always be a top-level
heading within each raw document. As the paragraph under the top-level heading, this text is
meant to provide top-level information about the contents of this document.
The top level heading does not necessarily mean that the heading is a level-one heading. In this
case it is, but the document is allowed to be hoisted to any heading level between 1 and 6 as long
as the internal structure of the doc is hierarchical and properly nested.
The first heading in a document will be taken as the main section label.
Since there is generally only one top-level section in this design approach, there is no way to have
multiple stanzas of top-level content throughout the document _at this level_. This is intentional
and part of keeping content structure in a hierarchical form. If you need to enumerate concepts or
sections with headings at the same level, that need can easily be fulfilled by using multiple
sections at lower levels. Conceptually, if you have a challenge fitting concepts into this
structure, go back to the concept map view of things and decide where they fit before continuing on.
Because the words "Docs Design Guide" are included above in a heading, they will be matched
in searches for topics.
## Basic Conventions
This is the first inner sub-section of the document. The heading here is simple. Since it is
indented further than the heading of the section before it, it is considered a subsection of that.
Level two headings are often rendered slightly smaller than the level one headings, with the same
type of visual document partitioning as level one headings, like horizontal dividers.
Names of headings should be descriptive and contain no special characters.
A convention of four spaces (and no tabs) is preferred throughout. This provides a familiar layout
for users who are viewing the raw documentation in monospace.
Line lengths are expected to be hard wrapped at 100 characters. This means that users will be able
to see the text rendered cleanly in monospace form even if they have to widen their terminals
*slightly*, but it also means we're not fighting with 80 character limits that no longer make sense.
It is strongly advised that any editor use the auto hard-wrapping and formatting settings of their
editor tools for this.
### Formatting Details
This is the third section within the document, but it is the first subsection of the second section.
Headings at this level are generally visually interior to those at the second level, so you can use
this level to mark a sequence of things that are local to a topic more visually.
**emphasized item**
Emphasized items like the above can be included as shown on the left to demark items which are
nominal in a local sense, but which are considered _only_ content of some other well-known globally
obvious topic or concept. That means you intend for users to find these local topics by way of
finding the other more global name, but you don't want to cloud their view of these fine details
unless they specifically ask for that kind of search.
1. Lists of things can be indented and elaborated on with self-similar indentation.
This is another paragraph of the enumerated section. It can have its own interior details.
1. This is a list within a list.
- items in this list can have bullet points.
- items in this list can have bullet points.
```text
All fenced code is expected to be annotated with a language specifier as above.
```
For example, consider this JSON data:
```json5
{
"description": "This is a block of json, and it might be syntax highlighted."
}
```
When you want to block indent details, simply use the four space rule.
This will stand out to users as an example, hopefully to explain in detail
what is being described above and/or below this content. Generally, it is
formatted as an inline block with monospace rendering and a clearly different
background.

View File

@@ -0,0 +1,35 @@
digraph {
newrank=true
compound=true
node [fontsize = 8,shape = record]
rankdir = LR;
subgraph cluster0 {
rankdir = LR;
step0[shape=none]
node [fontsize = 8, shape = record]
A0 [label="A|topic:a,(b)"]
B0 [label="B|topic:b,(c)"]
C0 [label="C|topic:c"]
}
subgraph cluster1 {
node [fontsize = 8, shape = record]
step1[shape=none]
a1 [label="a",shape=oval]
A1 -> a1 [label="topic of"]
A1 -> b1 [label="topic of"]
expr_b1 -> b1 [label="match"]
expr_b1[label="(b)",shape=oval]
B1 [label="match"]
b1 [label="b", shape=oval]
B1 -> b1 [label="topic of"]
A1 [label="A"]
B1 [label="B|topic:a,(b)"]
C1 [label="C|topic:b,(c)"]
}
step0 -> step1[ltail=cluster0,lhead=cluster1]
}

View File

@@ -0,0 +1,41 @@
digraph {
node [fontsize = 8,shape = record]
rankdir = LR;
subgraph clusterB {
label = "after topic mapping"
node [fontsize = 8,shape = record]
rankdir = LR;
ap; bp; cp; dp; ep;
ap [label = "A|topics: all-topics|included: cli,time,temp"]
ap -> {bp; cp; dp; ep}
bp [label = "B|topics:|included: cli"]
cp [label = "C|topics:cli"]
bp -> cp
dp [label = "D|topics:temp"]
ep [label = "E|topics:time"]
}
subgraph clusterA {
label = "before topic mapping"
node [fontsize = 8,shape = record]
rankdir = LR;
a; b; c; d; e;
a [label = "A|topics: .*,all-topics"]
a -> {b; c; d; e}
b [label = "B|topics:(cli)"]
c [label = "C|topics:cli"]
b -> c
d [label = "D|topics:temp"]
e [label = "E|topics:time"]
}
}

View File

@@ -12,4 +12,7 @@ These are doc sites that have examples of good docs.
- [Javascript.info](https://javascript.info/strict-mode)
- [TiKV docs - explore the tabs](https://tikv.org/docs/3.0/concepts/overview/)
- [rocket](https://rocket.rs/v0.4/overview/)
- [stripe](https://stripe.com/docs/payments/integration-builder)
- [tokio.rs](https://tokio.rs/)
- [amazon ion](http://amzn.github.io/ion-docs/guides/cookbook.html)

View File

@@ -0,0 +1,8 @@
graph {
{ rank=same;
operations -- dataset;
}
dataset -- datashape;
datashape -- operations;
}

View File

@@ -0,0 +1,46 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Generated by graphviz version 2.40.1 (20161225.0304)
-->
<!-- Title: %3 Pages: 1 -->
<svg width="192pt" height="116pt"
viewBox="0.00 0.00 191.54 116.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 112)">
<title>%3</title>
<polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-112 187.5427,-112 187.5427,4 -4,4"/>
<!-- operations -->
<g id="node1" class="node">
<title>operations</title>
<ellipse fill="none" stroke="#000000" cx="47.4458" cy="-90" rx="47.3916" ry="18"/>
<text text-anchor="middle" x="47.4458" y="-86.3" font-family="Times,serif" font-size="14.00" fill="#000000">operations</text>
</g>
<!-- dataset -->
<g id="node2" class="node">
<title>dataset</title>
<ellipse fill="none" stroke="#000000" cx="148.4458" cy="-90" rx="35.194" ry="18"/>
<text text-anchor="middle" x="148.4458" y="-86.3" font-family="Times,serif" font-size="14.00" fill="#000000">dataset</text>
</g>
<!-- operations&#45;&#45;dataset -->
<g id="edge1" class="edge">
<title>operations&#45;&#45;dataset</title>
<path fill="none" stroke="#000000" d="M95.1841,-90C101.1483,-90 107.1125,-90 113.0767,-90"/>
</g>
<!-- datashape -->
<g id="node3" class="node">
<title>datashape</title>
<ellipse fill="none" stroke="#000000" cx="97.4458" cy="-18" rx="45.4919" ry="18"/>
<text text-anchor="middle" x="97.4458" y="-14.3" font-family="Times,serif" font-size="14.00" fill="#000000">datashape</text>
</g>
<!-- dataset&#45;&#45;datashape -->
<g id="edge2" class="edge">
<title>dataset&#45;&#45;datashape</title>
<path fill="none" stroke="#000000" d="M136.3595,-72.937C128.3475,-61.626 117.8658,-46.8282 109.7996,-35.4407"/>
</g>
<!-- datashape&#45;&#45;operations -->
<g id="edge3" class="edge">
<title>datashape&#45;&#45;operations</title>
<path fill="none" stroke="#000000" d="M85.2803,-35.5182C77.4648,-46.7727 67.3528,-61.3339 59.5494,-72.5708"/>
</g>
</g>
</svg>


View File

@@ -0,0 +1,25 @@
digraph {
{ rank = same; data_shape; op_shape; }
{ rank = same; data_types; data_samples; }
system -> op_log;
op_log -> op_syntax_samples;
{ rank = same; op_syntax_samples; op_field_samples; }
op_syntax_samples -> op_shape;
op_log -> op_field_samples;
op_field_samples -> data_samples;
schema -> data_types;
schema -> op_shape;
data_shape -> workload;
op_shape -> workload;
exported_data -> data_samples;
interactive_sampling -> data_samples;
interactive_sampling -> schema;
dataset -> exported_data;
dataset -> interactive_sampling;
data_types -> data_shape;
data_samples -> data_shape;
workload [label = "synthesized\nworkload"]
}

View File

@@ -0,0 +1,42 @@
digraph ws {
node [shape = none]
label = "Workload Synthesis Data Flow"
edge [fontsize = 8]
{
rank = min;
app [label = "application"];
analyzer;
nosqlbench;
}
subgraph clusterf {
rank = same;
label="";
appops [label = "ops in\nflight"];
oplog [label="full query\nlog"];
workload [label="Synthesized\nWorkload"];
test_ops[label="ops in\nflight"];
}
{
rank = same;
system[label="Capture\nTarget"];
test_system[label="Test\nTarget"];
}
app -> appops[label="normal\noperation"];
appops -> system;
system -> oplog;
analyzer [rank = min]
oplog -> analyzer
analyzer -> workload [label="pattern\nanalysis"];
workload -> nosqlbench;
nosqlbench -> test_ops [label = "run\nscenario"];
test_ops -> test_system;
}

View File

@@ -0,0 +1,210 @@
# Workload Synthesis
This document describes the background and essential details for understanding workload
synthesis as a potential NoSQLBench capability. Here, workload synthesis means constructing
a NoSQLBench workload based on op templates which is representative of some recorded
or implied workload as informed by schema, recorded operations, or data set. The goal is
to construct a template-based workload description that aligns to application access patterns
as closely as the provided data sources allow.
With the release of Apache Cassandra 4.0 imminent, the full query log capability it provides
offers a starting point for advanced workload characterization. However, FQL is only a starting
point for serious testing at scale. The reasons for this can be discussed in detail elsewhere,
but the main reason is that the test apparatus has inertia due to the weight of the data logs
and the operational cost of piping bulk data around. Mechanical sympathy suggests a different
representation that will maintain headroom in testing apparatus, which directly translates to
simplicity in setup as well as accuracy in results.
Further, the capabilities needed to do workload synthesis are not unique to CQL. This is a general
type of approach that can be used for multiple drivers.
# Getting there from here
There are several operational scenarios and possibilities to consider. These can be thought of as
incremental goals toward getting full workload synthesis into nosqlbench:
1) Schema-driven Workload - Suppose a workload with no operations visibility
This means taking an existing schema as the basis for some supposed set of operations. There are
many possibilities to consider in terms of mapping schema to operations, but this plan only considers
the most basic possible operations which can exercise a schema:
- write data to a database given a schema - Construct insert operations using very basic and default
value generation for the known row structure
- read data from a database, given a schema - Construct read operations using very basic and default
value generation for the known identifier structure
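
As a sketch of what "very basic and default value generation" might produce for the write case
(the keyspace, table, and column names are made up for illustration, and the `{...}` placeholders
are NoSQLBench binding references):

```java
import java.util.List;

/** Illustrative only: turning a known row structure into a templated insert operation. */
public class SchemaDrivenOpSketch {

    static String insertTemplate(String keyspace, String table, List<String> columns) {
        String cols = String.join(",", columns);
        String binds = columns.stream()
            .map(c -> "{" + c + "}")               // placeholders filled by default bindings
            .reduce((a, b) -> a + "," + b)
            .orElse("");
        return "insert into " + keyspace + "." + table + " (" + cols + ") values (" + binds + ")";
    }

    public static void main(String[] args) {
        System.out.println(insertTemplate("ks1", "users", List.of("id", "name", "ts")));
        // insert into ks1.users (id,name,ts) values ({id},{name},{ts})
    }
}
```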
Users should be allowed to specify which keyspaces and/or tables should be included.
Users will be allowed to specify the relative cardinality of values used at each level of identifier,
and the type of value nesting.
The source of the schema will be presumed to be a live system, with the workload generation being done
on-the-fly, but with an option to save the workload description instead of running it. In the case of
running the workload, the option to save it will be allowed additionally.
The option to provide the schema as a local file (a database dump of describe keyspace, for example)
should be provided as an enhancement.
**PROS**
This method can allow users to get started testing quickly on the data model that they've chosen
with *nearly zero* effort.
**CONS**
This method only knows how to assume some valid operations from the user's provided schema. This means
that the data model will be accurate, but it doesn't know how to construct data that is representative
of the production data. It doesn't have the ability to emulate more sophisticated operational
patterns or even inter-operational access patterns. It doesn't know how to construct inter-entity
relationships.
In essence, this is a bare-bones getting started method for users who just want
to exercise a specific data model irrespective of other important factors. It should only be used
for very basic testing or as a quick getting started workflow for more accurate workload definitions.
As such, this is still a serious improvement to the user workflow for getting started.
2) Raw Replay - Replay raw operations with no schema visibility
This is simply the ability to take a raw data file which includes operations and play it back via
some protocol. This is a protocol-specific mechanism, since it requires integration with the
driver level tools of a given protocol. As such it will only be available (to start) on a per-driver
basis. For the purposes of this plan, assume this means "CQL".
This level of workload generation may depend on the first "schema-driven workload" as in many cases,
users must have access to DDL statements before running tests with DML statements. In some scenarios,
this may not be required as the testing may be done in-situ against a system that is already populated.
Further, it should be allowed that a user provide a local schema or point their test client at an existing
system to gather the schema needed to replay from raw data in the absence of a schema.
**PROS**
This allows users to start sending operations to a target system that are facsimile copies of operations
previously observed, with *some* effort.
**CONS**
Capturing logs and moving them around is an operational pain. This brings the challenge of managing
big data directly into the workflow of testing at speed and scale.
Specifically: Unless users can adapt the testing apparatus to scale better (in-situ) than the
system under test, they will get invalid results. There is a basic principle at play here that
requires a testing instrument to have more headroom than the thing being tested in order to avoid
simply testing the test instrument itself. This is more difficult than users are generally prepared
to deal with in practice, and navigating it successfully requires more diligence and investment
than just accepting the tools as-is. This means that the results are often unreliable as the
perceived cost of doing it "right" is too high. This doesn't have to be the case.
The raw statement replay is not able to take advantage of optimizations that applications of
scale depend on for efficiency, such as prepared statements.
Raw statement replay may depend on native-API level access to the source format. Particularly
in the FQL form from Apache Cassandra 4.*, the native form depends on internal buffering formats
which have no typed-ness or marshalling support for external consumers. Export formats can be
provided, but the current built-in export format is textual and extremely inefficient for use
by off-board tools.
The amount of data captured is the amount of data available for replay.
The data captured may not be representative across a whole system unless it is sampled across
all clients and connections.
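As referenced above, the replay loop itself is conceptually simple. The sketch below streams a captured
op log to a protocol-specific executor; the `Consumer<String>` executor is a hypothetical stand-in for a
driver-level call (which is exactly what makes replay protocol-specific), not an existing interface.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.function.Consumer;

public class NaiveReplay {

    /**
     * Stream a captured op log (one statement per line) to a protocol-specific executor.
     * Comment lines and blank lines are skipped; everything else is handed over verbatim.
     */
    static void replay(Path opLog, Consumer<String> executor) throws IOException {
        try (var lines = Files.lines(opLog)) {
            lines.map(String::trim)
                 .filter(line -> !line.isEmpty() && !line.startsWith("--"))
                 .forEach(executor);
        }
    }

    public static void main(String[] args) throws IOException {
        // With no real target system wired in, just print what would be sent.
        replay(Path.of(args[0]), statement -> System.out.println("would execute: " + statement));
    }
}
```

Even in this trivial form, the caveats above still apply: the log must exist, must be moved to where the
client runs, and must be large enough to be representative.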
3) Workload Synthesis - Combine schema and operations visibility into a representative workload.
This builds on the availability of schema details and operation logs to create a workload which
is reasonably accurate to the workload that was observed in terms of data used in operations as
well as relative frequency of operations. Incremental pattern analysis can be used to increase
the level of awareness about obvious patterns as the toolkit evolves.
The degree of realism in the emulated data set and operational patterns depends on a degree of deep
analysis which will start out decidedly simple: relative rates of identifiable statement patterns, and
simple statistical shaping of fields in operations. (A minimal sketch of this kind of pattern tally
appears after the trade-offs below.)
**PROS**
This method allows users to achieve, with *some* effort, a highly representative workload that exactly
reproduces the statement forms in their application, with a representative mix of operations and a
representative set of data. Once this workload is synthesized, they can take it as a much more accurate
starting point for experimentation. Changes from this point do not depend on big data, but on a simple
recipe and description that can be changed in a text file and immediately used again with different
test conditions.
This method allows the test client to run at full speed, using the more efficient and extremely
portable procedural data generation methods of current state-of-the-art testing methods in nb.
This workload description also serves as a fingerprint of the shape of data in recorded operations.
**CONS**
This requires an additional step of analysis and workload characterization. For the raw data collection,
the same challenges associated with raw replay apply. This can be mitigated by allowing
users to run the workload analysis tool on system nodes where the data resides locally, and then
use the synthesized workload on the client with no need to move data around.
As explained in the _Operations and Data_ section, this is an operation-centric approach to
workload analysis. While only a minor caveat, this distinction may still be important with respect
to dataset-centric approaches.
**PRO & CON**
The realism of the test depends directly on the quality of the analysis used to synthesize the
workload, which will start with simple rules and then improve over time.
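The pattern tally mentioned above can be sketched in its simplest useful form as follows. This is a
minimal illustration, not the analyzer design itself: literals are normalized out of logged statements
so that similar statements collapse to one pattern, and each pattern's relative rate is computed.

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class NaivePatternTally {

    /** Replace quoted strings and numbers with placeholders so similar statements collapse together. */
    static String normalize(String statement) {
        return statement
            .replaceAll("'[^']*'", "?")    // string literals
            .replaceAll("\\b\\d+\\b", "?") // numeric literals
            .trim()
            .toLowerCase();
    }

    /** Tally the relative frequency of each normalized pattern in an op log. */
    static Map<String, Double> relativeRates(List<String> opLog) {
        Map<String, Integer> counts = new LinkedHashMap<>();
        for (String op : opLog) {
            counts.merge(normalize(op), 1, Integer::sum);
        }
        Map<String, Double> rates = new LinkedHashMap<>();
        counts.forEach((pattern, count) -> rates.put(pattern, (double) count / opLog.size()));
        return rates;
    }

    public static void main(String[] args) {
        List<String> opLog = List.of(
            "select * from app.users where id=1",
            "select * from app.users where id=2",
            "insert into app.users (id, name) values (3, 'ann')"
        );
        relativeRates(opLog).forEach((pattern, rate) ->
            System.out.printf("%.2f  %s%n", rate, pattern));
    }
}
```

Real analysis would go further (parameter distributions, per-pattern data shaping), but even this level of
summarization is enough to drive a first synthesized workload.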
## Workflows
To break these down, we'll start with the full synthesis view using the following "toolchain" diagram:
![workload_synthesis](workload_synthesis.svg)
In this view, the tools are on the top row, the data (in-flight and otherwise) in the middle,
and the target systems on the bottom. This diagram shows how the data flows between tools
and how it is manipulated at each step.
## Operations and Data
Data in fields of operations and data within a dataset are related, but not in a way that allows a user to fully understand one through the lens of the other, except in special cases.
This section contrasts, in simple terms, different levels of connectedness between operations
and the data that results from them.
First, let's start with an "affine" example. Assume you have a data set which was built from additive operations such as (pseudo-code):
    for all I in 0..1000
      for all names in A B C D E
        insert row (I, name), values ...
This is an incremental process wherein the data of the iterators will map exactly to the
data in the dataset. Now take an example like this:
    for all I in 0..1000
      for all names in A B C D E
        insert row ((I mod 37), name), values ...
In this case, all we've done is reduce the cardinality of the effective row identifier. The operations
themselves are not limited to 37 unique forms, yet they now land on only 37 × 5 = 185 distinct rows. As a
mathematician, you could still work out the resulting data set. As a DBA, you would never want to be
required to do so.
Let's take it to the next level.
    for all I in 0..1000
      for all names in A B C D E
        insert row ((now() mod 37), name), values...
In this case, we've introduced a form of indeterminacy which seems to make it impossible to predict
the resulting state of the dataset. And this is actually a trivial case compared to what happens in
practice as soon as you start using UUIDs, for example.
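The difference can be shown with a small sketch contrasting a purely cycle-derived row identifier, which
is reproducible from the cycle range alone, with a wall-clock-derived one, which is not. The method names
here are illustrative assumptions only, not part of any existing API.

```java
public class RowIdDeterminism {

    /** Purely a function of the cycle: replaying cycles 0..n reproduces exactly the same row ids. */
    static long affineRowId(long cycle) {
        return cycle % 37;
    }

    /** Depends on wall-clock time at execution (the cycle parameter is ignored, which is the problem). */
    static long clockRowId(long cycle) {
        return System.currentTimeMillis() % 37;
    }

    public static void main(String[] args) throws InterruptedException {
        for (long cycle = 0; cycle < 3; cycle++) {
            System.out.printf("cycle %d -> affine %d, clock %d%n",
                cycle, affineRowId(cycle), clockRowId(cycle));
            Thread.sleep(5); // the clock-derived id drifts even within a single run
        }
        // Re-running this program reproduces the affine column exactly, while the clock column
        // differs: the resulting dataset can no longer be derived from the op stream alone.
    }
}
```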
Attempts have been made to restore the simplicity of using sequences as identifiers in distributed
systems, yet no current implementation seems to have a solid solution without self-defeating
trade-offs elsewhere. Thus, we have to accept that the relationship between operations and dataset is
_complicated_ in practice. This is merely one example of how that relationship gets weakened.
The point of explaining this at this fundamental level of detail is to make it clear that we need to
treat the data of operations and the data of datasets as independent types of data.
To be precise, the data used within operations will be called **op data**. In contrast, the term
**dataset** will be taken to mean data as it resides within storage, distinct from op data.

View File

@@ -0,0 +1,124 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE svg PUBLIC "-//W3C//DTD SVG 1.1//EN"
"http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd">
<!-- Generated by graphviz version 2.40.1 (20161225.0304)
-->
<!-- Title: ws Pages: 1 -->
<svg width="340pt" height="281pt"
viewBox="0.00 0.00 340.00 281.00" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink">
<g id="graph0" class="graph" transform="scale(1 1) rotate(0) translate(4 277)">
<title>ws</title>
<polygon fill="#ffffff" stroke="transparent" points="-4,4 -4,-277 336,-277 336,4 -4,4"/>
<text text-anchor="middle" x="166" y="-7.8" font-family="Times,serif" font-size="14.00" fill="#000000">Workload Synthesis Data Flow</text>
<g id="clust2" class="cluster">
<title>clusterf</title>
<polygon fill="none" stroke="#000000" points="8,-116 8,-193 324,-193 324,-116 8,-116"/>
<text text-anchor="middle" x="166" y="-177.8" font-family="Times,serif" font-size="14.00" fill="#000000">data</text>
</g>
<!-- app -->
<g id="node1" class="node">
<title>app</title>
<text text-anchor="middle" x="43" y="-251.3" font-family="Times,serif" font-size="14.00" fill="#000000">application</text>
</g>
<!-- appops -->
<g id="node4" class="node">
<title>appops</title>
<text text-anchor="middle" x="43" y="-146.8" font-family="Times,serif" font-size="14.00" fill="#000000">ops in</text>
<text text-anchor="middle" x="43" y="-131.8" font-family="Times,serif" font-size="14.00" fill="#000000">flight</text>
</g>
<!-- app&#45;&gt;appops -->
<g id="edge1" class="edge">
<title>app&#45;&gt;appops</title>
<path fill="none" stroke="#000000" d="M43,-236.5055C43,-219.1257 43,-192.8262 43,-172.5411"/>
<polygon fill="#000000" stroke="#000000" points="46.5001,-172.2734 43,-162.2734 39.5001,-172.2734 46.5001,-172.2734"/>
<text text-anchor="middle" x="58.5" y="-212.6" font-family="Times,serif" font-size="8.00" fill="#000000">normal</text>
<text text-anchor="middle" x="58.5" y="-203.6" font-family="Times,serif" font-size="8.00" fill="#000000">operation</text>
</g>
<!-- analyzer -->
<g id="node2" class="node">
<title>analyzer</title>
<text text-anchor="middle" x="162" y="-251.3" font-family="Times,serif" font-size="14.00" fill="#000000">analyzer</text>
</g>
<!-- workload -->
<g id="node6" class="node">
<title>workload</title>
<text text-anchor="middle" x="203" y="-146.8" font-family="Times,serif" font-size="14.00" fill="#000000">Synthesized</text>
<text text-anchor="middle" x="203" y="-131.8" font-family="Times,serif" font-size="14.00" fill="#000000">Workload</text>
</g>
<!-- analyzer&#45;&gt;workload -->
<g id="edge5" class="edge">
<title>analyzer&#45;&gt;workload</title>
<path fill="none" stroke="#000000" d="M168.7703,-236.5055C175.2205,-218.8855 185.0269,-192.0974 192.4926,-171.7032"/>
<polygon fill="#000000" stroke="#000000" points="195.7936,-172.8672 195.9446,-162.2734 189.2202,-170.4608 195.7936,-172.8672"/>
</g>
<!-- nosqlbench -->
<g id="node3" class="node">
<title>nosqlbench</title>
<text text-anchor="middle" x="258" y="-251.3" font-family="Times,serif" font-size="14.00" fill="#000000">nosqlbench</text>
</g>
<!-- test_ops -->
<g id="node7" class="node">
<title>test_ops</title>
<text text-anchor="middle" x="289" y="-146.8" font-family="Times,serif" font-size="14.00" fill="#000000">ops in</text>
<text text-anchor="middle" x="289" y="-131.8" font-family="Times,serif" font-size="14.00" fill="#000000">flight</text>
</g>
<!-- nosqlbench&#45;&gt;test_ops -->
<g id="edge7" class="edge">
<title>nosqlbench&#45;&gt;test_ops</title>
<path fill="none" stroke="#000000" d="M263.119,-236.5055C267.9738,-218.9656 275.3432,-192.3407 280.9783,-171.9817"/>
<polygon fill="#000000" stroke="#000000" points="284.3709,-172.8447 283.6654,-162.2734 277.6246,-170.9774 284.3709,-172.8447"/>
<text text-anchor="middle" x="286" y="-212.6" font-family="Times,serif" font-size="8.00" fill="#000000">run</text>
<text text-anchor="middle" x="286" y="-203.6" font-family="Times,serif" font-size="8.00" fill="#000000">scenario</text>
</g>
<!-- system -->
<g id="node8" class="node">
<title>system</title>
<text text-anchor="middle" x="75" y="-45.8" font-family="Times,serif" font-size="14.00" fill="#000000">Capture</text>
<text text-anchor="middle" x="75" y="-30.8" font-family="Times,serif" font-size="14.00" fill="#000000">Target</text>
</g>
<!-- appops&#45;&gt;system -->
<g id="edge2" class="edge">
<title>appops&#45;&gt;system</title>
<path fill="none" stroke="#000000" d="M49.1699,-123.5262C53.8926,-108.6204 60.4807,-87.8266 65.8213,-70.9703"/>
<polygon fill="#000000" stroke="#000000" points="69.2737,-71.6616 68.9575,-61.0715 62.6006,-69.5473 69.2737,-71.6616"/>
</g>
<!-- oplog -->
<g id="node5" class="node">
<title>oplog</title>
<text text-anchor="middle" x="116" y="-139.3" font-family="Times,serif" font-size="14.00" fill="#000000">OpLog</text>
</g>
<!-- oplog&#45;&gt;analyzer -->
<g id="edge4" class="edge">
<title>oplog&#45;&gt;analyzer</title>
<path fill="none" stroke="#000000" d="M123.4546,-161.1504C130.7726,-178.9681 142.0375,-206.3958 150.5028,-227.0069"/>
<polygon fill="#000000" stroke="#000000" points="147.3672,-228.585 154.404,-236.5055 153.8424,-225.9255 147.3672,-228.585"/>
</g>
<!-- workload&#45;&gt;nosqlbench -->
<g id="edge6" class="edge">
<title>workload&#45;&gt;nosqlbench</title>
<path fill="none" stroke="#000000" d="M212.4646,-162.2734C221.2873,-180.2395 234.5404,-207.2276 244.4771,-227.4625"/>
<polygon fill="#000000" stroke="#000000" points="241.3683,-229.0722 248.9179,-236.5055 247.6515,-225.9866 241.3683,-229.0722"/>
</g>
<!-- test_system -->
<g id="node9" class="node">
<title>test_system</title>
<text text-anchor="middle" x="289" y="-45.8" font-family="Times,serif" font-size="14.00" fill="#000000">Test</text>
<text text-anchor="middle" x="289" y="-30.8" font-family="Times,serif" font-size="14.00" fill="#000000">Target</text>
</g>
<!-- test_ops&#45;&gt;test_system -->
<g id="edge8" class="edge">
<title>test_ops&#45;&gt;test_system</title>
<path fill="none" stroke="#000000" d="M289,-123.5262C289,-108.761 289,-88.2184 289,-71.4484"/>
<polygon fill="#000000" stroke="#000000" points="292.5001,-71.0715 289,-61.0715 285.5001,-71.0715 292.5001,-71.0715"/>
</g>
<!-- system&#45;&gt;oplog -->
<g id="edge3" class="edge">
<title>system&#45;&gt;oplog</title>
<path fill="none" stroke="#000000" d="M82.8249,-61.276C89.05,-76.6111 97.8417,-98.2686 104.8106,-115.4359"/>
<polygon fill="#000000" stroke="#000000" points="101.6657,-116.994 108.67,-124.9432 108.1516,-114.3611 101.6657,-116.994"/>
<text text-anchor="middle" x="110.5" y="-99.6" font-family="Times,serif" font-size="8.00" fill="#000000">full</text>
<text text-anchor="middle" x="110.5" y="-90.6" font-family="Times,serif" font-size="8.00" fill="#000000">query</text>
<text text-anchor="middle" x="110.5" y="-81.6" font-family="Times,serif" font-size="8.00" fill="#000000">log</text>
</g>
</g>
</svg>


View File

@@ -9,7 +9,7 @@
<parent>
<artifactId>mvn-defaults</artifactId>
<groupId>io.nosqlbench</groupId>
<version>3.12.128-SNAPSHOT</version>
<version>3.12.145-SNAPSHOT</version>
<relativePath>../mvn-defaults</relativePath>
</parent>
@@ -18,7 +18,7 @@
<dependency>
<groupId>io.nosqlbench</groupId>
<artifactId>nb-api</artifactId>
<version>3.12.128-SNAPSHOT</version>
<version>3.12.145-SNAPSHOT</version>
</dependency>
<dependency>
@@ -26,20 +26,6 @@
<artifactId>snakeyaml</artifactId>
</dependency>
<!-- test -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.assertj</groupId>
<artifactId>assertj-core</artifactId>
<scope>test</scope>
</dependency>
<!-- jetty -->
<dependency>
@@ -112,7 +98,7 @@
<dependency>
<groupId>io.nosqlbench</groupId>
<artifactId>virtdata-api</artifactId>
<version>3.12.128-SNAPSHOT</version>
<version>3.12.145-SNAPSHOT</version>
</dependency>
</dependencies>

View File

@@ -1,6 +1,6 @@
package io.nosqlbench.docsys.api;
import io.nosqlbench.docsys.core.DocsysMarkdownEndpoint;
import io.nosqlbench.docsys.endpoints.DocsysMarkdownEndpoint;
/**
* At runtime, any instances of this service will be used to find

View File

@@ -4,11 +4,21 @@ package io.nosqlbench.docsys.api;
* Any class which is annotated with <pre>{@code @Service(WebServiceObject.class)}</pre>
* will be found and loaded as a service.
*
* For the methods used to configure these objects, consult
* <A href="https://eclipse-ee4j.github.io/jersey.github.io/documentation/latest/jaxrs-resources.html#d0e2040">jax-rs resources documentation</A>
* For the methods used to configure these objects, consult the
* references below:
*
* and
* <A href="https://jax-rs.github.io/apidocs/2.1/">Jax-RS API Docs</A>
* @see <A href="https://eclipse-ee4j.github.io/jersey.github
* .io/documentation/latest/jaxrs-resources.html#d0e2040">Jersey jax-rs
* resources documentation</A>
*
* @see <A href="https://repo1.maven.org/maven2/org/glassfish/jersey/jersey-documentation/2.5
* .1/jersey-documentation-2.5.1-user-guide.pdf">Jersey 2.5.1 user guide</a>
*
* @see <A href="https://github.com/jax-rs/spec/blob/master/spec
* .pdf">JAX-RS: Java™ API for RESTful Web Services Version 2.1
* Proposed Final Draft June 9, 2017</A>
**
* @see <A href="https://jax-rs.github.io/apidocs/2.1/">Jax-RS API Docs</A>
*/
public interface WebServiceObject {
}

View File

@@ -1,5 +1,6 @@
package io.nosqlbench.docsys.core;
import io.nosqlbench.docsys.endpoints.DocsysMarkdownEndpoint;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
@@ -11,8 +12,7 @@ import java.nio.file.StandardOpenOption;
import java.util.Arrays;
public class DocServerApp {
public final static String APPNAME_DOCSERVER = "docserver";
private static Logger logger = LogManager.getLogger(DocServerApp.class);
private static final Logger logger = LogManager.getLogger(DocServerApp.class);
// static {
// // defer to an extant logger context if it is there, otherwise
@@ -78,36 +78,19 @@ public class DocServerApp {
String[] markdownFileArray = markdownList.split("\n");
for (String markdownFile : markdownFileArray) {
Path relativePath = dirpath.resolve(Path.of("services/docs/markdown", markdownFile));
Path relativePath = dirpath.resolve(Path.of("services/docs", markdownFile));
logger.info("Creating " + relativePath.toString());
String markdown = dds.getFileByPath(markdownFile);
Path path = dds.findPath(markdownFile);
// String markdown = dds.getFileByPath(markdownFile);
// Files.writeString(relativePath, markdown, OVERWRITE);
Files.createDirectories(relativePath.getParent());
Files.writeString(relativePath, markdown, OVERWRITE);
Files.write(relativePath,Files.readAllBytes(path),OVERWRITE);
}
}
// private static void configureDocServerLogging(LoggerContext context) {
// JoranConfigurator jc = new JoranConfigurator();
// jc.setContext(context);
// context.reset();
// context.putProperty("application-name", APPNAME_DOCSERVER);
// InputStream is = DocServerApp.class.getClassLoader().getResourceAsStream("logback-docsys.xml");
// if (is != null) {
// try {
// jc.doConfigure(is);
// } catch (JoranException e) {
// System.err.println("error initializing logging system: " + e.getMessage());
// throw new RuntimeException(e);
// }
// } else {
// throw new RuntimeException("No logging context was provided, and " +
// "logback-docsys.xml could not be loaded from the classpath.");
// }
// }
private static void runServer(String[] serverArgs) {
DocServer server = new DocServer();
NBWebServer server = new NBWebServer();
for (int i = 0; i < serverArgs.length; i++) {
String arg = serverArgs[i];
if (arg.matches(".*://.*")) {
@@ -122,7 +105,21 @@ public class DocServerApp {
} else if (arg.matches("\\d+")) {
server.withPort(Integer.parseInt(arg));
} else if (arg.matches("--public")) {
server.withHost("0.0.0.0");
int nextidx = i+1;
String net_addr = "0.0.0.0";
if (
serverArgs.length>nextidx+1 &&
serverArgs[nextidx].matches("[0-9]+\\.[0-9]+\\.[0-9]+\\.[0-9]+")
) {
i++;
net_addr = serverArgs[nextidx];
}
logger.info("running public server on interface with address " + net_addr);
server.withHost(net_addr);
} else if (arg.matches("--workspaces")) {
String workspaces_root = serverArgs[i + 1];
logger.info("Setting workspace directory to workspace_dir");
server.withContextParam("workspaces_root", workspaces_root);
}
}
//
@@ -131,14 +128,14 @@ public class DocServerApp {
private static void showHelp(String... helpArgs) {
System.out.println(
"Usage: " + APPNAME_DOCSERVER + " " +
" [url] " +
" [path]... " + "\n" +
"\n" +
"If [url] is provided, then the scheme, address and port are all taken from it.\n" +
"Any additional paths are served from the filesystem, in addition to the internal ones.\n" +
"\n" +
"For now, only http:// is supported."
"Usage: appserver " +
" [url] " +
" [path]... " + "\n" +
"\n" +
"If [url] is provided, then the scheme, address and port are all taken from it.\n" +
"Any additional paths are served from the filesystem, in addition to the internal ones.\n" +
"\n" +
"For now, only http:// is supported."
);
}

View File

@@ -41,9 +41,9 @@ import java.util.stream.Collectors;
/**
* For examples, see <a href="https://git.eclipse.org/c/jetty/org.eclipse.jetty.project.git/tree/examples/embedded/src/main/java/org/eclipse/jetty/embedded/">embedded examples</a>
*/
public class DocServer implements Runnable {
public class NBWebServer implements Runnable {
private final static Logger logger = LogManager.getLogger(DocServer.class);
private final static Logger logger = LogManager.getLogger(NBWebServer.class);
private final List<Path> basePaths = new ArrayList<>();
private final List<Class> servletClasses = new ArrayList<>();
@@ -55,17 +55,29 @@ public class DocServer implements Runnable {
private String bindHost = "localhost";
private int bindPort = 12345;
public DocServer withHost(String bindHost) {
private final Map<String,Object> contextParams = new LinkedHashMap<>();
public NBWebServer withContextParams(Map<String,Object> cp) {
this.contextParams.putAll(cp);
return this;
}
public NBWebServer withContextParam(String name, Object object) {
this.contextParams.put(name, object);
return this;
}
public NBWebServer withHost(String bindHost) {
this.bindHost = bindHost;
return this;
}
public DocServer withPort(int bindPort) {
public NBWebServer withPort(int bindPort) {
this.bindPort = bindPort;
return this;
}
public DocServer withURL(String urlSpec) {
public NBWebServer withURL(String urlSpec) {
try {
URL url = new URL(urlSpec);
this.bindPort = url.getPort();
@@ -81,7 +93,7 @@ public class DocServer implements Runnable {
return this;
}
public DocServer withScheme(String scheme) {
public NBWebServer withScheme(String scheme) {
this.bindScheme = scheme;
return this;
}
@@ -129,7 +141,7 @@ public class DocServer implements Runnable {
return servletHolder;
}
public DocServer addPaths(Path... paths) {
public NBWebServer addPaths(Path... paths) {
for (Path path : paths) {
try {
path.getFileSystem().provider().checkAccess(path, AccessMode.READ);
@@ -216,6 +228,7 @@ public class DocServer implements Runnable {
ResourceConfig rc = new ResourceConfig();
rc.addProperties(contextParams);
rc.property("server", this);
ServletContainer container = new ServletContainer(rc);
@@ -305,7 +318,7 @@ public class DocServer implements Runnable {
server.join();
} catch (Exception e) {
logger.error(e.getMessage(), e);
logger.error("error while starting doc server", e);
e.printStackTrace(System.out);
System.exit(2);
}

View File

@@ -1,5 +1,6 @@
package io.nosqlbench.docsys.core;
package io.nosqlbench.docsys.endpoints;
import io.nosqlbench.docsys.core.NBWebServer;
import io.nosqlbench.nb.annotations.Service;
import io.nosqlbench.docsys.api.WebServiceObject;
import org.apache.logging.log4j.Logger;
@@ -17,6 +18,7 @@ import javax.ws.rs.core.MediaType;
@Singleton
@Path("_")
public class DocServerStatusEndpoint implements WebServiceObject {
private final static Logger logger =
LogManager.getLogger(DocServerStatusEndpoint.class);
@@ -26,10 +28,10 @@ public class DocServerStatusEndpoint implements WebServiceObject {
private String name;
@GET
@Path("stats")
@Path("status")
@Produces(MediaType.APPLICATION_JSON)
public String getStats() {
DocServer s = (DocServer) config.getProperty("server");
NBWebServer s = (NBWebServer) config.getProperty("server");
return s.toString();
}

View File

@@ -1,17 +1,19 @@
package io.nosqlbench.docsys.core;
package io.nosqlbench.docsys.endpoints;
import io.nosqlbench.nb.annotations.Service;
import io.nosqlbench.docsys.api.DocsNameSpace;
import io.nosqlbench.docsys.api.Docs;
import io.nosqlbench.docsys.api.DocsBinder;
import io.nosqlbench.docsys.api.DocsNameSpace;
import io.nosqlbench.docsys.api.WebServiceObject;
import org.apache.logging.log4j.Logger;
import io.nosqlbench.docsys.core.DocsysPathLoader;
import io.nosqlbench.docsys.core.PathWalker;
import io.nosqlbench.nb.annotations.Service;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import javax.inject.Singleton;
import javax.ws.rs.*;
import javax.ws.rs.core.MediaType;
import java.io.IOException;
import javax.ws.rs.core.Response;
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
@@ -27,7 +29,7 @@ public class DocsysMarkdownEndpoint implements WebServiceObject {
private DocsBinder enabled;
private DocsBinder disabled;
private AtomicLong version = new AtomicLong(System.nanoTime());
private final AtomicLong version = new AtomicLong(System.nanoTime());
private final Set<String> enables = new HashSet<>();
@GET
@@ -126,25 +128,51 @@ public class DocsysMarkdownEndpoint implements WebServiceObject {
return list;
}
@GET
@Path("file")
@Produces(MediaType.TEXT_PLAIN)
public String getFileByPath(@QueryParam("path") String pathspec) {
return getFile(pathspec);
}
// @GET
// @Path("file")
// @Produces(MediaType.TEXT_PLAIN)
// public String getFileByPath(@QueryParam("path") String pathspec) {
// return getFile(pathspec);
// }
//
/**
* @param pathspec the path as known to the manifest
* @return The contents of a file
*/
@GET
@Produces(MediaType.TEXT_PLAIN)
@Path(value = "markdown/{pathspec:.*}")
public String getFileInPath(@PathParam("pathspec") String pathspec) {
return getFile(pathspec);
@Path(value = "{pathspec:.*}")
public Response getFileInPath(@PathParam("pathspec") String pathspec) {
try {
java.nio.file.Path path = findPath(pathspec);
String contentType = Files.probeContentType(path);
MediaType mediaType = MediaType.valueOf(contentType);
return Response.ok(path.toFile(),mediaType).build();
} catch (Exception e) {
return Response.serverError().entity(e.getMessage()).build();
}
}
private String getFile(String pathspec) {
// private String getFile(String pathspec) {
// pathspec = URLDecoder.decode(pathspec, StandardCharsets.UTF_8);
// for (java.nio.file.Path path : enabled.getPaths()) {
// java.nio.file.Path resolved = path.resolve(pathspec);
// if (Files.isDirectory(resolved)) {
// throw new RuntimeException("Path is a directory: '" + pathspec + "'");
// }
// if (Files.exists(resolved)) {
// try {
// String content = Files.readString(resolved, StandardCharsets.UTF_8);
// return content;
// } catch (IOException e) {
// throw new RuntimeException(e);
// }
// }
// }
// throw new RuntimeException("Unable to find any valid file at '" + pathspec + "'");
// }
public java.nio.file.Path findPath(String pathspec) {
pathspec = URLDecoder.decode(pathspec, StandardCharsets.UTF_8);
for (java.nio.file.Path path : enabled.getPaths()) {
java.nio.file.Path resolved = path.resolve(pathspec);
@@ -152,15 +180,11 @@ public class DocsysMarkdownEndpoint implements WebServiceObject {
throw new RuntimeException("Path is a directory: '" + pathspec + "'");
}
if (Files.exists(resolved)) {
try {
String content = Files.readString(resolved, StandardCharsets.UTF_8);
return content;
} catch (IOException e) {
throw new RuntimeException(e);
}
return resolved;
}
}
throw new RuntimeException("Unable to find any valid file at '" + pathspec + "'");
}
private void init(boolean reload) {

View File

@@ -2,5 +2,6 @@ local/**
node_modules
npm-debug.log
.nuxt
_nuxt/**
dist
.idea

View File

@@ -0,0 +1,479 @@
grammar CQL3;
// workaround for:
// https://github.com/antlr/antlr4/issues/118
//random_wrapper
// : statements EOF
// ;
statements
: statement ( ';'+ statement )* ';'+
;
statement
: drop_keyspace_stmt
| create_keyspace_stmt
| alter_keyspace_stmt
| use_stmt
| create_table_stmt
| alter_table_stmt
| drop_table_stmt
| truncate_table_stmt
| create_index_stmt
| drop_index_stmt
| insert_stmt
| update_stmt
| delete_stmt
| batch_stmt
;
dml_statements
: dml_statement (';'+ dml_statement)* ';'+
;
dml_statement
: insert_stmt
| update_stmt
| delete_stmt
;
create_keyspace_stmt
: K_CREATE K_KEYSPACE if_not_exists? keyspace_name K_WITH properties
;
alter_keyspace_stmt
: K_ALTER K_KEYSPACE keyspace_name K_WITH properties
;
drop_keyspace_stmt
: K_DROP K_KEYSPACE if_exists? keyspace_name
;
use_stmt
: K_USE keyspace_name
;
create_table_stmt
: K_CREATE (K_TABLE | K_COLUMNFAMILY) if_not_exists? table_name column_definitions (K_WITH table_options)?
;
alter_table_stmt
: K_ALTER (K_TABLE | K_COLUMNFAMILY) table_name alter_table_instruction
;
alter_table_instruction
: K_ALTER column_name K_TYPE column_type
| K_ADD column_name column_type
| K_DROP column_name
| K_WITH table_options
;
drop_table_stmt
: K_DROP K_TABLE if_exists? table_name
;
truncate_table_stmt
: K_TRUNCATE table_name
;
create_index_stmt
: K_CREATE (K_CUSTOM)? K_INDEX if_not_exists? index_name? K_ON table_name '(' column_name ')'
(K_USING index_class (K_WITH index_options)?)?
;
drop_index_stmt
: K_DROP K_INDEX if_exists? index_name
;
insert_stmt
: K_INSERT K_INTO table_name column_names K_VALUES column_values if_not_exists? upsert_options?
;
column_names
: '(' column_name (',' column_name)* ')'
;
column_values
: '(' term (',' term)* ')'
;
upsert_options
: K_USING upsert_option (K_AND upsert_option)*
;
upsert_option
: K_TIMESTAMP INTEGER
| K_TTL INTEGER
;
index_name
: IDENTIFIER
;
index_class
: STRING
;
index_options
: K_OPTIONS '=' map
;
update_stmt
: K_UPDATE table_name upsert_options? K_SET update_assignments K_WHERE where_clause update_conditions?
;
update_assignments
: update_assignment (',' update_assignment)*
;
update_assignment
: column_name '=' term
| column_name '=' column_name ('+' | '-') (INTEGER | set | list)
| column_name '=' column_name '+' map
| column_name '[' term ']' '=' term
;
update_conditions
: K_IF update_condition (K_AND update_condition)*
;
update_condition
: IDENTIFIER '=' term
| IDENTIFIER '[' term ']' '=' term
;
where_clause
: relation (K_AND relation)*
;
relation
: column_name '=' term
| column_name K_IN '(' (term (',' term)*)? ')'
| column_name K_IN '?'
;
delete_stmt
: K_DELETE delete_selections? K_FROM table_name
(K_USING K_TIMESTAMP INTEGER)?
K_WHERE where_clause
delete_conditions?
;
delete_conditions
: K_IF ( K_EXISTS
| (delete_condition (K_AND delete_condition)*))
;
delete_condition
: IDENTIFIER ('[' term ']')? '=' term
;
delete_selections
: delete_selection (',' delete_selection)*
;
delete_selection
: IDENTIFIER ('[' term ']')?
;
batch_stmt
: K_BEGIN (K_UNLOGGED | K_COUNTER)? K_BATCH batch_options? dml_statements K_APPLY K_BATCH
;
batch_options
: K_USING batch_option (K_AND batch_option)*
;
batch_option
: K_TIMESTAMP INTEGER
;
table_name
: (keyspace_name '.')? table_name_noks
;
table_name_noks
: IDENTIFIER
;
column_name
: IDENTIFIER
;
table_options
: table_option (K_AND table_option)*
;
table_option
: property
| K_COMPACT K_STORAGE
| K_CLUSTERING K_ORDER K_BY IDENTIFIER
| K_CLUSTERING K_ORDER K_BY '('IDENTIFIER asc_or_desc')'
;
asc_or_desc
: K_ASC
| K_DESC
;
column_definitions
: '(' column_definition (',' column_definition)* ')'
;
column_definition
: column_name column_type (K_STATIC)? (K_PRIMARY K_KEY)?
| K_PRIMARY K_KEY primary_key
;
column_type
: data_type
;
primary_key
: '(' partition_key (',' clustering_column)* ')'
;
partition_key
: column_name
| '(' column_name (',' column_name)* ')'
;
clustering_column
: column_name
;
keyspace_name
: IDENTIFIER
;
if_not_exists
: K_IF K_NOT K_EXISTS
;
if_exists
: K_IF K_EXISTS
;
constant
: STRING
| INTEGER
| FLOAT
| bool
| UUID
| BLOB
;
variable
: '?'
| ':' IDENTIFIER
;
term
: constant
| collection
| variable
| function
;
collection
: map
| set
| list
;
map
: '{' (term ':' term (',' term ':' term)*)? '}'
;
set
: '{' (term (',' term)*)? '}'
;
list
: '[' (term (',' term)*)? ']'
;
function
: IDENTIFIER '(' (term (',' term)*)? ')'
;
properties
: property (K_AND property)*
;
property
: property_name '=' property_value
;
property_name
: IDENTIFIER
;
property_value
: IDENTIFIER
| constant
| map
;
data_type
: native_type
| collection_type
| STRING
;
native_type
: 'ascii'
| 'bigint'
| 'blob'
| 'boolean'
| 'counter'
| 'decimal'
| 'double'
| 'float'
| 'inet'
| 'int'
| 'text'
| 'tinyint'
| 'timestamp'
| 'timeuuid'
| 'uuid'
| 'varchar'
| 'varint'
;
collection_type
: 'list' '<' native_type '>'
| 'set' '<' native_type '>'
| 'map' '<' native_type ',' native_type '>'
;
bool
: K_TRUE
| K_FALSE
;
K_ADD: A D D;
K_ALTER: A L T E R;
K_AND: A N D;
K_APPLY: A P P L Y;
K_BATCH: B A T C H;
K_BEGIN: B E G I N;
K_CLUSTERING: C L U S T E R I N G;
K_ASC: A S C;
K_DESC: D E S C;
K_COLUMNFAMILY: C O L U M N F A M I L Y;
K_COMPACT: C O M P A C T;
K_COUNTER: C O U N T E R;
K_CREATE: C R E A T E;
K_CUSTOM: C U S T O M;
K_DELETE: D E L E T E;
K_DROP: D R O P;
K_EXISTS: E X I S T S;
K_FALSE: F A L S E;
K_FROM: F R O M;
K_IF: I F;
K_IN: I N;
K_INDEX: I N D E X;
K_INSERT: I N S E R T;
K_INTO: I N T O;
K_KEY: K E Y;
K_KEYSPACE: K E Y S P A C E;
K_NOT: N O T;
K_ON: O N;
K_OPTIONS: O P T I O N S;
K_ORDER: O R D E R;
K_BY: B Y;
K_PRIMARY: P R I M A R Y;
K_SELECT: S E L E C T;
K_SET: S E T;
K_STATIC: S T A T I C;
K_STORAGE: S T O R A G E;
K_TABLE: T A B L E;
K_TIMESTAMP: T I M E S T A M P;
K_TRUE: T R U E;
K_TRUNCATE: T R U N C A T E;
K_TTL: T T L;
K_TYPE: T Y P E;
K_UNLOGGED: U N L O G G E D;
K_UPDATE: U P D A T E;
K_USE: U S E;
K_USING: U S I N G;
K_VALUES: V A L U E S;
K_WHERE: W H E R E;
K_WITH: W I T H;
IDENTIFIER
: [a-zA-Z0-9_] [a-zA-Z0-9_]*
;
STRING
: '\'' IDENTIFIER '\''
;
INTEGER
: '-'? DIGIT+
;
FLOAT
: '-'? DIGIT+ ( '.' DIGIT* )? ( E [+-]? DIGIT+ )?
| 'NaN'
| 'Infinity'
;
UUID
: HEX HEX HEX HEX HEX HEX HEX HEX '-' HEX HEX HEX HEX '-' HEX HEX HEX HEX '-' HEX HEX HEX HEX '-' HEX HEX HEX HEX HEX HEX HEX HEX HEX HEX HEX HEX
;
BLOB
: '0' X (HEX)+
;
fragment HEX : [0-9a-fA-F];
fragment DIGIT : [0-9];
fragment A : [aA];
fragment B : [bB];
fragment C : [cC];
fragment D : [dD];
fragment E : [eE];
fragment F : [fF];
fragment G : [gG];
fragment H : [hH];
fragment I : [iI];
fragment J : [jJ];
fragment K : [kK];
fragment L : [lL];
fragment M : [mM];
fragment N : [nN];
fragment O : [oO];
fragment P : [pP];
fragment Q : [qQ];
fragment R : [rR];
fragment S : [sS];
fragment T : [tT];
fragment U : [uU];
fragment V : [vV];
fragment W : [wW];
fragment X : [xX];
fragment Y : [yY];
fragment Z : [zZ];
SINGLE_LINE_COMMENT
: ('--'|'//') ~[\r\n]* -> channel(HIDDEN)
;
MULTILINE_COMMENT
: '/*' .*? ( '*/' | EOF ) -> channel(HIDDEN)
;
WS
: [ \t\r\n] -> channel(HIDDEN)
;

View File

@@ -0,0 +1,130 @@
T__0=1
T__1=2
T__2=3
T__3=4
T__4=5
T__5=6
T__6=7
T__7=8
T__8=9
T__9=10
T__10=11
T__11=12
T__12=13
T__13=14
T__14=15
T__15=16
T__16=17
T__17=18
T__18=19
T__19=20
T__20=21
T__21=22
T__22=23
T__23=24
T__24=25
T__25=26
T__26=27
T__27=28
T__28=29
T__29=30
T__30=31
T__31=32
T__32=33
T__33=34
T__34=35
T__35=36
K_ADD=37
K_ALTER=38
K_AND=39
K_APPLY=40
K_BATCH=41
K_BEGIN=42
K_CLUSTERING=43
K_ASC=44
K_DESC=45
K_COLUMNFAMILY=46
K_COMPACT=47
K_COUNTER=48
K_CREATE=49
K_CUSTOM=50
K_DELETE=51
K_DROP=52
K_EXISTS=53
K_FALSE=54
K_FROM=55
K_IF=56
K_IN=57
K_INDEX=58
K_INSERT=59
K_INTO=60
K_KEY=61
K_KEYSPACE=62
K_NOT=63
K_ON=64
K_OPTIONS=65
K_ORDER=66
K_BY=67
K_PRIMARY=68
K_SELECT=69
K_SET=70
K_STATIC=71
K_STORAGE=72
K_TABLE=73
K_TIMESTAMP=74
K_TRUE=75
K_TRUNCATE=76
K_TTL=77
K_TYPE=78
K_UNLOGGED=79
K_UPDATE=80
K_USE=81
K_USING=82
K_VALUES=83
K_WHERE=84
K_WITH=85
IDENTIFIER=86
STRING=87
INTEGER=88
FLOAT=89
UUID=90
BLOB=91
SINGLE_LINE_COMMENT=92
MULTILINE_COMMENT=93
WS=94
';'=1
'('=2
')'=3
','=4
'='=5
'+'=6
'-'=7
'['=8
']'=9
'?'=10
'.'=11
':'=12
'{'=13
'}'=14
'ascii'=15
'bigint'=16
'blob'=17
'boolean'=18
'counter'=19
'decimal'=20
'double'=21
'float'=22
'inet'=23
'int'=24
'text'=25
'tinyint'=26
'timestamp'=27
'timeuuid'=28
'uuid'=29
'varchar'=30
'varint'=31
'list'=32
'<'=33
'>'=34
'set'=35
'map'=36

View File

@@ -0,0 +1,497 @@
// Generated from CQL3.g4 by ANTLR 4.5
// jshint ignore: start
var antlr4 = require('antlr4/index');
var serializedATN = ["\3\u0430\ud6d1\u8206\uad2d\u4417\uaef1\u8d80\uaadd",
"\2`\u0389\b\1\4\2\t\2\4\3\t\3\4\4\t\4\4\5\t\5\4\6\t\6\4\7\t\7\4\b\t",
"\b\4\t\t\t\4\n\t\n\4\13\t\13\4\f\t\f\4\r\t\r\4\16\t\16\4\17\t\17\4\20",
"\t\20\4\21\t\21\4\22\t\22\4\23\t\23\4\24\t\24\4\25\t\25\4\26\t\26\4",
"\27\t\27\4\30\t\30\4\31\t\31\4\32\t\32\4\33\t\33\4\34\t\34\4\35\t\35",
"\4\36\t\36\4\37\t\37\4 \t \4!\t!\4\"\t\"\4#\t#\4$\t$\4%\t%\4&\t&\4\'",
"\t\'\4(\t(\4)\t)\4*\t*\4+\t+\4,\t,\4-\t-\4.\t.\4/\t/\4\60\t\60\4\61",
"\t\61\4\62\t\62\4\63\t\63\4\64\t\64\4\65\t\65\4\66\t\66\4\67\t\67\4",
"8\t8\49\t9\4:\t:\4;\t;\4<\t<\4=\t=\4>\t>\4?\t?\4@\t@\4A\tA\4B\tB\4C",
"\tC\4D\tD\4E\tE\4F\tF\4G\tG\4H\tH\4I\tI\4J\tJ\4K\tK\4L\tL\4M\tM\4N\t",
"N\4O\tO\4P\tP\4Q\tQ\4R\tR\4S\tS\4T\tT\4U\tU\4V\tV\4W\tW\4X\tX\4Y\tY",
"\4Z\tZ\4[\t[\4\\\t\\\4]\t]\4^\t^\4_\t_\4`\t`\4a\ta\4b\tb\4c\tc\4d\t",
"d\4e\te\4f\tf\4g\tg\4h\th\4i\ti\4j\tj\4k\tk\4l\tl\4m\tm\4n\tn\4o\to",
"\4p\tp\4q\tq\4r\tr\4s\ts\4t\tt\4u\tu\4v\tv\4w\tw\4x\tx\4y\ty\4z\tz\4",
"{\t{\3\2\3\2\3\3\3\3\3\4\3\4\3\5\3\5\3\6\3\6\3\7\3\7\3\b\3\b\3\t\3\t",
"\3\n\3\n\3\13\3\13\3\f\3\f\3\r\3\r\3\16\3\16\3\17\3\17\3\20\3\20\3\20",
"\3\20\3\20\3\20\3\21\3\21\3\21\3\21\3\21\3\21\3\21\3\22\3\22\3\22\3",
"\22\3\22\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\23\3\24\3\24\3\24\3\24",
"\3\24\3\24\3\24\3\24\3\25\3\25\3\25\3\25\3\25\3\25\3\25\3\25\3\26\3",
"\26\3\26\3\26\3\26\3\26\3\26\3\27\3\27\3\27\3\27\3\27\3\27\3\30\3\30",
"\3\30\3\30\3\30\3\31\3\31\3\31\3\31\3\32\3\32\3\32\3\32\3\32\3\33\3",
"\33\3\33\3\33\3\33\3\33\3\33\3\33\3\34\3\34\3\34\3\34\3\34\3\34\3\34",
"\3\34\3\34\3\34\3\35\3\35\3\35\3\35\3\35\3\35\3\35\3\35\3\35\3\36\3",
"\36\3\36\3\36\3\36\3\37\3\37\3\37\3\37\3\37\3\37\3\37\3\37\3 \3 \3 ",
"\3 \3 \3 \3 \3!\3!\3!\3!\3!\3\"\3\"\3#\3#\3$\3$\3$\3$\3%\3%\3%\3%\3",
"&\3&\3&\3&\3\'\3\'\3\'\3\'\3\'\3\'\3(\3(\3(\3(\3)\3)\3)\3)\3)\3)\3*",
"\3*\3*\3*\3*\3*\3+\3+\3+\3+\3+\3+\3,\3,\3,\3,\3,\3,\3,\3,\3,\3,\3,\3",
"-\3-\3-\3-\3.\3.\3.\3.\3.\3/\3/\3/\3/\3/\3/\3/\3/\3/\3/\3/\3/\3/\3\60",
"\3\60\3\60\3\60\3\60\3\60\3\60\3\60\3\61\3\61\3\61\3\61\3\61\3\61\3",
"\61\3\61\3\62\3\62\3\62\3\62\3\62\3\62\3\62\3\63\3\63\3\63\3\63\3\63",
"\3\63\3\63\3\64\3\64\3\64\3\64\3\64\3\64\3\64\3\65\3\65\3\65\3\65\3",
"\65\3\66\3\66\3\66\3\66\3\66\3\66\3\66\3\67\3\67\3\67\3\67\3\67\3\67",
"\38\38\38\38\38\39\39\39\3:\3:\3:\3;\3;\3;\3;\3;\3;\3<\3<\3<\3<\3<\3",
"<\3<\3=\3=\3=\3=\3=\3>\3>\3>\3>\3?\3?\3?\3?\3?\3?\3?\3?\3?\3@\3@\3@",
"\3@\3A\3A\3A\3B\3B\3B\3B\3B\3B\3B\3B\3C\3C\3C\3C\3C\3C\3D\3D\3D\3E\3",
"E\3E\3E\3E\3E\3E\3E\3F\3F\3F\3F\3F\3F\3F\3G\3G\3G\3G\3H\3H\3H\3H\3H",
"\3H\3H\3I\3I\3I\3I\3I\3I\3I\3I\3J\3J\3J\3J\3J\3J\3K\3K\3K\3K\3K\3K\3",
"K\3K\3K\3K\3L\3L\3L\3L\3L\3M\3M\3M\3M\3M\3M\3M\3M\3M\3N\3N\3N\3N\3O",
"\3O\3O\3O\3O\3P\3P\3P\3P\3P\3P\3P\3P\3P\3Q\3Q\3Q\3Q\3Q\3Q\3Q\3R\3R\3",
"R\3R\3S\3S\3S\3S\3S\3S\3T\3T\3T\3T\3T\3T\3T\3U\3U\3U\3U\3U\3U\3V\3V",
"\3V\3V\3V\3W\3W\7W\u02ca\nW\fW\16W\u02cd\13W\3X\3X\3X\3X\3Y\5Y\u02d4",
"\nY\3Y\6Y\u02d7\nY\rY\16Y\u02d8\3Z\5Z\u02dc\nZ\3Z\6Z\u02df\nZ\rZ\16",
"Z\u02e0\3Z\3Z\7Z\u02e5\nZ\fZ\16Z\u02e8\13Z\5Z\u02ea\nZ\3Z\3Z\5Z\u02ee",
"\nZ\3Z\6Z\u02f1\nZ\rZ\16Z\u02f2\5Z\u02f5\nZ\3Z\3Z\3Z\3Z\3Z\3Z\3Z\3Z",
"\3Z\3Z\3Z\5Z\u0302\nZ\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3",
"[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3[\3\\",
"\3\\\3\\\6\\\u032c\n\\\r\\\16\\\u032d\3]\3]\3^\3^\3_\3_\3`\3`\3a\3a",
"\3b\3b\3c\3c\3d\3d\3e\3e\3f\3f\3g\3g\3h\3h\3i\3i\3j\3j\3k\3k\3l\3l\3",
"m\3m\3n\3n\3o\3o\3p\3p\3q\3q\3r\3r\3s\3s\3t\3t\3u\3u\3v\3v\3w\3w\3x",
"\3x\3y\3y\3y\3y\5y\u036c\ny\3y\7y\u036f\ny\fy\16y\u0372\13y\3y\3y\3",
"z\3z\3z\3z\7z\u037a\nz\fz\16z\u037d\13z\3z\3z\3z\5z\u0382\nz\3z\3z\3",
"{\3{\3{\3{\3\u037b\2|\3\3\5\4\7\5\t\6\13\7\r\b\17\t\21\n\23\13\25\f",
"\27\r\31\16\33\17\35\20\37\21!\22#\23%\24\'\25)\26+\27-\30/\31\61\32",
"\63\33\65\34\67\359\36;\37= ?!A\"C#E$G%I&K\'M(O)Q*S+U,W-Y.[/]\60_\61",
"a\62c\63e\64g\65i\66k\67m8o9q:s;u<w=y>{?}@\177A\u0081B\u0083C\u0085",
"D\u0087E\u0089F\u008bG\u008dH\u008fI\u0091J\u0093K\u0095L\u0097M\u0099",
"N\u009bO\u009dP\u009fQ\u00a1R\u00a3S\u00a5T\u00a7U\u00a9V\u00abW\u00ad",
"X\u00afY\u00b1Z\u00b3[\u00b5\\\u00b7]\u00b9\2\u00bb\2\u00bd\2\u00bf",
"\2\u00c1\2\u00c3\2\u00c5\2\u00c7\2\u00c9\2\u00cb\2\u00cd\2\u00cf\2\u00d1",
"\2\u00d3\2\u00d5\2\u00d7\2\u00d9\2\u00db\2\u00dd\2\u00df\2\u00e1\2\u00e3",
"\2\u00e5\2\u00e7\2\u00e9\2\u00eb\2\u00ed\2\u00ef\2\u00f1^\u00f3_\u00f5",
"`\3\2\"\6\2\62;C\\aac|\4\2--//\5\2\62;CHch\3\2\62;\4\2CCcc\4\2DDdd\4",
"\2EEee\4\2FFff\4\2GGgg\4\2HHhh\4\2IIii\4\2JJjj\4\2KKkk\4\2LLll\4\2M",
"Mmm\4\2NNnn\4\2OOoo\4\2PPpp\4\2QQqq\4\2RRrr\4\2SSss\4\2TTtt\4\2UUuu",
"\4\2VVvv\4\2WWww\4\2XXxx\4\2YYyy\4\2ZZzz\4\2[[{{\4\2\\\\||\4\2\f\f\17",
"\17\5\2\13\f\17\17\"\"\u037d\2\3\3\2\2\2\2\5\3\2\2\2\2\7\3\2\2\2\2\t",
"\3\2\2\2\2\13\3\2\2\2\2\r\3\2\2\2\2\17\3\2\2\2\2\21\3\2\2\2\2\23\3\2",
"\2\2\2\25\3\2\2\2\2\27\3\2\2\2\2\31\3\2\2\2\2\33\3\2\2\2\2\35\3\2\2",
"\2\2\37\3\2\2\2\2!\3\2\2\2\2#\3\2\2\2\2%\3\2\2\2\2\'\3\2\2\2\2)\3\2",
"\2\2\2+\3\2\2\2\2-\3\2\2\2\2/\3\2\2\2\2\61\3\2\2\2\2\63\3\2\2\2\2\65",
"\3\2\2\2\2\67\3\2\2\2\29\3\2\2\2\2;\3\2\2\2\2=\3\2\2\2\2?\3\2\2\2\2",
"A\3\2\2\2\2C\3\2\2\2\2E\3\2\2\2\2G\3\2\2\2\2I\3\2\2\2\2K\3\2\2\2\2M",
"\3\2\2\2\2O\3\2\2\2\2Q\3\2\2\2\2S\3\2\2\2\2U\3\2\2\2\2W\3\2\2\2\2Y\3",
"\2\2\2\2[\3\2\2\2\2]\3\2\2\2\2_\3\2\2\2\2a\3\2\2\2\2c\3\2\2\2\2e\3\2",
"\2\2\2g\3\2\2\2\2i\3\2\2\2\2k\3\2\2\2\2m\3\2\2\2\2o\3\2\2\2\2q\3\2\2",
"\2\2s\3\2\2\2\2u\3\2\2\2\2w\3\2\2\2\2y\3\2\2\2\2{\3\2\2\2\2}\3\2\2\2",
"\2\177\3\2\2\2\2\u0081\3\2\2\2\2\u0083\3\2\2\2\2\u0085\3\2\2\2\2\u0087",
"\3\2\2\2\2\u0089\3\2\2\2\2\u008b\3\2\2\2\2\u008d\3\2\2\2\2\u008f\3\2",
"\2\2\2\u0091\3\2\2\2\2\u0093\3\2\2\2\2\u0095\3\2\2\2\2\u0097\3\2\2\2",
"\2\u0099\3\2\2\2\2\u009b\3\2\2\2\2\u009d\3\2\2\2\2\u009f\3\2\2\2\2\u00a1",
"\3\2\2\2\2\u00a3\3\2\2\2\2\u00a5\3\2\2\2\2\u00a7\3\2\2\2\2\u00a9\3\2",
"\2\2\2\u00ab\3\2\2\2\2\u00ad\3\2\2\2\2\u00af\3\2\2\2\2\u00b1\3\2\2\2",
"\2\u00b3\3\2\2\2\2\u00b5\3\2\2\2\2\u00b7\3\2\2\2\2\u00f1\3\2\2\2\2\u00f3",
"\3\2\2\2\2\u00f5\3\2\2\2\3\u00f7\3\2\2\2\5\u00f9\3\2\2\2\7\u00fb\3\2",
"\2\2\t\u00fd\3\2\2\2\13\u00ff\3\2\2\2\r\u0101\3\2\2\2\17\u0103\3\2\2",
"\2\21\u0105\3\2\2\2\23\u0107\3\2\2\2\25\u0109\3\2\2\2\27\u010b\3\2\2",
"\2\31\u010d\3\2\2\2\33\u010f\3\2\2\2\35\u0111\3\2\2\2\37\u0113\3\2\2",
"\2!\u0119\3\2\2\2#\u0120\3\2\2\2%\u0125\3\2\2\2\'\u012d\3\2\2\2)\u0135",
"\3\2\2\2+\u013d\3\2\2\2-\u0144\3\2\2\2/\u014a\3\2\2\2\61\u014f\3\2\2",
"\2\63\u0153\3\2\2\2\65\u0158\3\2\2\2\67\u0160\3\2\2\29\u016a\3\2\2\2",
";\u0173\3\2\2\2=\u0178\3\2\2\2?\u0180\3\2\2\2A\u0187\3\2\2\2C\u018c",
"\3\2\2\2E\u018e\3\2\2\2G\u0190\3\2\2\2I\u0194\3\2\2\2K\u0198\3\2\2\2",
"M\u019c\3\2\2\2O\u01a2\3\2\2\2Q\u01a6\3\2\2\2S\u01ac\3\2\2\2U\u01b2",
"\3\2\2\2W\u01b8\3\2\2\2Y\u01c3\3\2\2\2[\u01c7\3\2\2\2]\u01cc\3\2\2\2",
"_\u01d9\3\2\2\2a\u01e1\3\2\2\2c\u01e9\3\2\2\2e\u01f0\3\2\2\2g\u01f7",
"\3\2\2\2i\u01fe\3\2\2\2k\u0203\3\2\2\2m\u020a\3\2\2\2o\u0210\3\2\2\2",
"q\u0215\3\2\2\2s\u0218\3\2\2\2u\u021b\3\2\2\2w\u0221\3\2\2\2y\u0228",
"\3\2\2\2{\u022d\3\2\2\2}\u0231\3\2\2\2\177\u023a\3\2\2\2\u0081\u023e",
"\3\2\2\2\u0083\u0241\3\2\2\2\u0085\u0249\3\2\2\2\u0087\u024f\3\2\2\2",
"\u0089\u0252\3\2\2\2\u008b\u025a\3\2\2\2\u008d\u0261\3\2\2\2\u008f\u0265",
"\3\2\2\2\u0091\u026c\3\2\2\2\u0093\u0274\3\2\2\2\u0095\u027a\3\2\2\2",
"\u0097\u0284\3\2\2\2\u0099\u0289\3\2\2\2\u009b\u0292\3\2\2\2\u009d\u0296",
"\3\2\2\2\u009f\u029b\3\2\2\2\u00a1\u02a4\3\2\2\2\u00a3\u02ab\3\2\2\2",
"\u00a5\u02af\3\2\2\2\u00a7\u02b5\3\2\2\2\u00a9\u02bc\3\2\2\2\u00ab\u02c2",
"\3\2\2\2\u00ad\u02c7\3\2\2\2\u00af\u02ce\3\2\2\2\u00b1\u02d3\3\2\2\2",
"\u00b3\u0301\3\2\2\2\u00b5\u0303\3\2\2\2\u00b7\u0328\3\2\2\2\u00b9\u032f",
"\3\2\2\2\u00bb\u0331\3\2\2\2\u00bd\u0333\3\2\2\2\u00bf\u0335\3\2\2\2",
"\u00c1\u0337\3\2\2\2\u00c3\u0339\3\2\2\2\u00c5\u033b\3\2\2\2\u00c7\u033d",
"\3\2\2\2\u00c9\u033f\3\2\2\2\u00cb\u0341\3\2\2\2\u00cd\u0343\3\2\2\2",
"\u00cf\u0345\3\2\2\2\u00d1\u0347\3\2\2\2\u00d3\u0349\3\2\2\2\u00d5\u034b",
"\3\2\2\2\u00d7\u034d\3\2\2\2\u00d9\u034f\3\2\2\2\u00db\u0351\3\2\2\2",
"\u00dd\u0353\3\2\2\2\u00df\u0355\3\2\2\2\u00e1\u0357\3\2\2\2\u00e3\u0359",
"\3\2\2\2\u00e5\u035b\3\2\2\2\u00e7\u035d\3\2\2\2\u00e9\u035f\3\2\2\2",
"\u00eb\u0361\3\2\2\2\u00ed\u0363\3\2\2\2\u00ef\u0365\3\2\2\2\u00f1\u036b",
"\3\2\2\2\u00f3\u0375\3\2\2\2\u00f5\u0385\3\2\2\2\u00f7\u00f8\7=\2\2",
"\u00f8\4\3\2\2\2\u00f9\u00fa\7*\2\2\u00fa\6\3\2\2\2\u00fb\u00fc\7+\2",
"\2\u00fc\b\3\2\2\2\u00fd\u00fe\7.\2\2\u00fe\n\3\2\2\2\u00ff\u0100\7",
"?\2\2\u0100\f\3\2\2\2\u0101\u0102\7-\2\2\u0102\16\3\2\2\2\u0103\u0104",
"\7/\2\2\u0104\20\3\2\2\2\u0105\u0106\7]\2\2\u0106\22\3\2\2\2\u0107\u0108",
"\7_\2\2\u0108\24\3\2\2\2\u0109\u010a\7A\2\2\u010a\26\3\2\2\2\u010b\u010c",
"\7\60\2\2\u010c\30\3\2\2\2\u010d\u010e\7<\2\2\u010e\32\3\2\2\2\u010f",
"\u0110\7}\2\2\u0110\34\3\2\2\2\u0111\u0112\7\177\2\2\u0112\36\3\2\2",
"\2\u0113\u0114\7c\2\2\u0114\u0115\7u\2\2\u0115\u0116\7e\2\2\u0116\u0117",
"\7k\2\2\u0117\u0118\7k\2\2\u0118 \3\2\2\2\u0119\u011a\7d\2\2\u011a\u011b",
"\7k\2\2\u011b\u011c\7i\2\2\u011c\u011d\7k\2\2\u011d\u011e\7p\2\2\u011e",
"\u011f\7v\2\2\u011f\"\3\2\2\2\u0120\u0121\7d\2\2\u0121\u0122\7n\2\2",
"\u0122\u0123\7q\2\2\u0123\u0124\7d\2\2\u0124$\3\2\2\2\u0125\u0126\7",
"d\2\2\u0126\u0127\7q\2\2\u0127\u0128\7q\2\2\u0128\u0129\7n\2\2\u0129",
"\u012a\7g\2\2\u012a\u012b\7c\2\2\u012b\u012c\7p\2\2\u012c&\3\2\2\2\u012d",
"\u012e\7e\2\2\u012e\u012f\7q\2\2\u012f\u0130\7w\2\2\u0130\u0131\7p\2",
"\2\u0131\u0132\7v\2\2\u0132\u0133\7g\2\2\u0133\u0134\7t\2\2\u0134(\3",
"\2\2\2\u0135\u0136\7f\2\2\u0136\u0137\7g\2\2\u0137\u0138\7e\2\2\u0138",
"\u0139\7k\2\2\u0139\u013a\7o\2\2\u013a\u013b\7c\2\2\u013b\u013c\7n\2",
"\2\u013c*\3\2\2\2\u013d\u013e\7f\2\2\u013e\u013f\7q\2\2\u013f\u0140",
"\7w\2\2\u0140\u0141\7d\2\2\u0141\u0142\7n\2\2\u0142\u0143\7g\2\2\u0143",
",\3\2\2\2\u0144\u0145\7h\2\2\u0145\u0146\7n\2\2\u0146\u0147\7q\2\2\u0147",
"\u0148\7c\2\2\u0148\u0149\7v\2\2\u0149.\3\2\2\2\u014a\u014b\7k\2\2\u014b",
"\u014c\7p\2\2\u014c\u014d\7g\2\2\u014d\u014e\7v\2\2\u014e\60\3\2\2\2",
"\u014f\u0150\7k\2\2\u0150\u0151\7p\2\2\u0151\u0152\7v\2\2\u0152\62\3",
"\2\2\2\u0153\u0154\7v\2\2\u0154\u0155\7g\2\2\u0155\u0156\7z\2\2\u0156",
"\u0157\7v\2\2\u0157\64\3\2\2\2\u0158\u0159\7v\2\2\u0159\u015a\7k\2\2",
"\u015a\u015b\7p\2\2\u015b\u015c\7{\2\2\u015c\u015d\7k\2\2\u015d\u015e",
"\7p\2\2\u015e\u015f\7v\2\2\u015f\66\3\2\2\2\u0160\u0161\7v\2\2\u0161",
"\u0162\7k\2\2\u0162\u0163\7o\2\2\u0163\u0164\7g\2\2\u0164\u0165\7u\2",
"\2\u0165\u0166\7v\2\2\u0166\u0167\7c\2\2\u0167\u0168\7o\2\2\u0168\u0169",
"\7r\2\2\u01698\3\2\2\2\u016a\u016b\7v\2\2\u016b\u016c\7k\2\2\u016c\u016d",
"\7o\2\2\u016d\u016e\7g\2\2\u016e\u016f\7w\2\2\u016f\u0170\7w\2\2\u0170",
"\u0171\7k\2\2\u0171\u0172\7f\2\2\u0172:\3\2\2\2\u0173\u0174\7w\2\2\u0174",
"\u0175\7w\2\2\u0175\u0176\7k\2\2\u0176\u0177\7f\2\2\u0177<\3\2\2\2\u0178",
"\u0179\7x\2\2\u0179\u017a\7c\2\2\u017a\u017b\7t\2\2\u017b\u017c\7e\2",
"\2\u017c\u017d\7j\2\2\u017d\u017e\7c\2\2\u017e\u017f\7t\2\2\u017f>\3",
"\2\2\2\u0180\u0181\7x\2\2\u0181\u0182\7c\2\2\u0182\u0183\7t\2\2\u0183",
"\u0184\7k\2\2\u0184\u0185\7p\2\2\u0185\u0186\7v\2\2\u0186@\3\2\2\2\u0187",
"\u0188\7n\2\2\u0188\u0189\7k\2\2\u0189\u018a\7u\2\2\u018a\u018b\7v\2",
"\2\u018bB\3\2\2\2\u018c\u018d\7>\2\2\u018dD\3\2\2\2\u018e\u018f\7@\2",
"\2\u018fF\3\2\2\2\u0190\u0191\7u\2\2\u0191\u0192\7g\2\2\u0192\u0193",
"\7v\2\2\u0193H\3\2\2\2\u0194\u0195\7o\2\2\u0195\u0196\7c\2\2\u0196\u0197",
"\7r\2\2\u0197J\3\2\2\2\u0198\u0199\5\u00bd_\2\u0199\u019a\5\u00c3b\2",
"\u019a\u019b\5\u00c3b\2\u019bL\3\2\2\2\u019c\u019d\5\u00bd_\2\u019d",
"\u019e\5\u00d3j\2\u019e\u019f\5\u00e3r\2\u019f\u01a0\5\u00c5c\2\u01a0",
"\u01a1\5\u00dfp\2\u01a1N\3\2\2\2\u01a2\u01a3\5\u00bd_\2\u01a3\u01a4",
"\5\u00d7l\2\u01a4\u01a5\5\u00c3b\2\u01a5P\3\2\2\2\u01a6\u01a7\5\u00bd",
"_\2\u01a7\u01a8\5\u00dbn\2\u01a8\u01a9\5\u00dbn\2\u01a9\u01aa\5\u00d3",
"j\2\u01aa\u01ab\5\u00edw\2\u01abR\3\2\2\2\u01ac\u01ad\5\u00bf`\2\u01ad",
"\u01ae\5\u00bd_\2\u01ae\u01af\5\u00e3r\2\u01af\u01b0\5\u00c1a\2\u01b0",
"\u01b1\5\u00cbf\2\u01b1T\3\2\2\2\u01b2\u01b3\5\u00bf`\2\u01b3\u01b4",
"\5\u00c5c\2\u01b4\u01b5\5\u00c9e\2\u01b5\u01b6\5\u00cdg\2\u01b6\u01b7",
"\5\u00d7l\2\u01b7V\3\2\2\2\u01b8\u01b9\5\u00c1a\2\u01b9\u01ba\5\u00d3",
"j\2\u01ba\u01bb\5\u00e5s\2\u01bb\u01bc\5\u00e1q\2\u01bc\u01bd\5\u00e3",
"r\2\u01bd\u01be\5\u00c5c\2\u01be\u01bf\5\u00dfp\2\u01bf\u01c0\5\u00cd",
"g\2\u01c0\u01c1\5\u00d7l\2\u01c1\u01c2\5\u00c9e\2\u01c2X\3\2\2\2\u01c3",
"\u01c4\5\u00bd_\2\u01c4\u01c5\5\u00e1q\2\u01c5\u01c6\5\u00c1a\2\u01c6",
"Z\3\2\2\2\u01c7\u01c8\5\u00c3b\2\u01c8\u01c9\5\u00c5c\2\u01c9\u01ca",
"\5\u00e1q\2\u01ca\u01cb\5\u00c1a\2\u01cb\\\3\2\2\2\u01cc\u01cd\5\u00c1",
"a\2\u01cd\u01ce\5\u00d9m\2\u01ce\u01cf\5\u00d3j\2\u01cf\u01d0\5\u00e5",
"s\2\u01d0\u01d1\5\u00d5k\2\u01d1\u01d2\5\u00d7l\2\u01d2\u01d3\5\u00c7",
"d\2\u01d3\u01d4\5\u00bd_\2\u01d4\u01d5\5\u00d5k\2\u01d5\u01d6\5\u00cd",
"g\2\u01d6\u01d7\5\u00d3j\2\u01d7\u01d8\5\u00edw\2\u01d8^\3\2\2\2\u01d9",
"\u01da\5\u00c1a\2\u01da\u01db\5\u00d9m\2\u01db\u01dc\5\u00d5k\2\u01dc",
"\u01dd\5\u00dbn\2\u01dd\u01de\5\u00bd_\2\u01de\u01df\5\u00c1a\2\u01df",
"\u01e0\5\u00e3r\2\u01e0`\3\2\2\2\u01e1\u01e2\5\u00c1a\2\u01e2\u01e3",
"\5\u00d9m\2\u01e3\u01e4\5\u00e5s\2\u01e4\u01e5\5\u00d7l\2\u01e5\u01e6",
"\5\u00e3r\2\u01e6\u01e7\5\u00c5c\2\u01e7\u01e8\5\u00dfp\2\u01e8b\3\2",
"\2\2\u01e9\u01ea\5\u00c1a\2\u01ea\u01eb\5\u00dfp\2\u01eb\u01ec\5\u00c5",
"c\2\u01ec\u01ed\5\u00bd_\2\u01ed\u01ee\5\u00e3r\2\u01ee\u01ef\5\u00c5",
"c\2\u01efd\3\2\2\2\u01f0\u01f1\5\u00c1a\2\u01f1\u01f2\5\u00e5s\2\u01f2",
"\u01f3\5\u00e1q\2\u01f3\u01f4\5\u00e3r\2\u01f4\u01f5\5\u00d9m\2\u01f5",
"\u01f6\5\u00d5k\2\u01f6f\3\2\2\2\u01f7\u01f8\5\u00c3b\2\u01f8\u01f9",
"\5\u00c5c\2\u01f9\u01fa\5\u00d3j\2\u01fa\u01fb\5\u00c5c\2\u01fb\u01fc",
"\5\u00e3r\2\u01fc\u01fd\5\u00c5c\2\u01fdh\3\2\2\2\u01fe\u01ff\5\u00c3",
"b\2\u01ff\u0200\5\u00dfp\2\u0200\u0201\5\u00d9m\2\u0201\u0202\5\u00db",
"n\2\u0202j\3\2\2\2\u0203\u0204\5\u00c5c\2\u0204\u0205\5\u00ebv\2\u0205",
"\u0206\5\u00cdg\2\u0206\u0207\5\u00e1q\2\u0207\u0208\5\u00e3r\2\u0208",
"\u0209\5\u00e1q\2\u0209l\3\2\2\2\u020a\u020b\5\u00c7d\2\u020b\u020c",
"\5\u00bd_\2\u020c\u020d\5\u00d3j\2\u020d\u020e\5\u00e1q\2\u020e\u020f",
"\5\u00c5c\2\u020fn\3\2\2\2\u0210\u0211\5\u00c7d\2\u0211\u0212\5\u00df",
"p\2\u0212\u0213\5\u00d9m\2\u0213\u0214\5\u00d5k\2\u0214p\3\2\2\2\u0215",
"\u0216\5\u00cdg\2\u0216\u0217\5\u00c7d\2\u0217r\3\2\2\2\u0218\u0219",
"\5\u00cdg\2\u0219\u021a\5\u00d7l\2\u021at\3\2\2\2\u021b\u021c\5\u00cd",
"g\2\u021c\u021d\5\u00d7l\2\u021d\u021e\5\u00c3b\2\u021e\u021f\5\u00c5",
"c\2\u021f\u0220\5\u00ebv\2\u0220v\3\2\2\2\u0221\u0222\5\u00cdg\2\u0222",
"\u0223\5\u00d7l\2\u0223\u0224\5\u00e1q\2\u0224\u0225\5\u00c5c\2\u0225",
"\u0226\5\u00dfp\2\u0226\u0227\5\u00e3r\2\u0227x\3\2\2\2\u0228\u0229",
"\5\u00cdg\2\u0229\u022a\5\u00d7l\2\u022a\u022b\5\u00e3r\2\u022b\u022c",
"\5\u00d9m\2\u022cz\3\2\2\2\u022d\u022e\5\u00d1i\2\u022e\u022f\5\u00c5",
"c\2\u022f\u0230\5\u00edw\2\u0230|\3\2\2\2\u0231\u0232\5\u00d1i\2\u0232",
"\u0233\5\u00c5c\2\u0233\u0234\5\u00edw\2\u0234\u0235\5\u00e1q\2\u0235",
"\u0236\5\u00dbn\2\u0236\u0237\5\u00bd_\2\u0237\u0238\5\u00c1a\2\u0238",
"\u0239\5\u00c5c\2\u0239~\3\2\2\2\u023a\u023b\5\u00d7l\2\u023b\u023c",
"\5\u00d9m\2\u023c\u023d\5\u00e3r\2\u023d\u0080\3\2\2\2\u023e\u023f\5",
"\u00d9m\2\u023f\u0240\5\u00d7l\2\u0240\u0082\3\2\2\2\u0241\u0242\5\u00d9",
"m\2\u0242\u0243\5\u00dbn\2\u0243\u0244\5\u00e3r\2\u0244\u0245\5\u00cd",
"g\2\u0245\u0246\5\u00d9m\2\u0246\u0247\5\u00d7l\2\u0247\u0248\5\u00e1",
"q\2\u0248\u0084\3\2\2\2\u0249\u024a\5\u00d9m\2\u024a\u024b\5\u00dfp",
"\2\u024b\u024c\5\u00c3b\2\u024c\u024d\5\u00c5c\2\u024d\u024e\5\u00df",
"p\2\u024e\u0086\3\2\2\2\u024f\u0250\5\u00bf`\2\u0250\u0251\5\u00edw",
"\2\u0251\u0088\3\2\2\2\u0252\u0253\5\u00dbn\2\u0253\u0254\5\u00dfp\2",
"\u0254\u0255\5\u00cdg\2\u0255\u0256\5\u00d5k\2\u0256\u0257\5\u00bd_",
"\2\u0257\u0258\5\u00dfp\2\u0258\u0259\5\u00edw\2\u0259\u008a\3\2\2\2",
"\u025a\u025b\5\u00e1q\2\u025b\u025c\5\u00c5c\2\u025c\u025d\5\u00d3j",
"\2\u025d\u025e\5\u00c5c\2\u025e\u025f\5\u00c1a\2\u025f\u0260\5\u00e3",
"r\2\u0260\u008c\3\2\2\2\u0261\u0262\5\u00e1q\2\u0262\u0263\5\u00c5c",
"\2\u0263\u0264\5\u00e3r\2\u0264\u008e\3\2\2\2\u0265\u0266\5\u00e1q\2",
"\u0266\u0267\5\u00e3r\2\u0267\u0268\5\u00bd_\2\u0268\u0269\5\u00e3r",
"\2\u0269\u026a\5\u00cdg\2\u026a\u026b\5\u00c1a\2\u026b\u0090\3\2\2\2",
"\u026c\u026d\5\u00e1q\2\u026d\u026e\5\u00e3r\2\u026e\u026f\5\u00d9m",
"\2\u026f\u0270\5\u00dfp\2\u0270\u0271\5\u00bd_\2\u0271\u0272\5\u00c9",
"e\2\u0272\u0273\5\u00c5c\2\u0273\u0092\3\2\2\2\u0274\u0275\5\u00e3r",
"\2\u0275\u0276\5\u00bd_\2\u0276\u0277\5\u00bf`\2\u0277\u0278\5\u00d3",
"j\2\u0278\u0279\5\u00c5c\2\u0279\u0094\3\2\2\2\u027a\u027b\5\u00e3r",
"\2\u027b\u027c\5\u00cdg\2\u027c\u027d\5\u00d5k\2\u027d\u027e\5\u00c5",
"c\2\u027e\u027f\5\u00e1q\2\u027f\u0280\5\u00e3r\2\u0280\u0281\5\u00bd",
"_\2\u0281\u0282\5\u00d5k\2\u0282\u0283\5\u00dbn\2\u0283\u0096\3\2\2",
"\2\u0284\u0285\5\u00e3r\2\u0285\u0286\5\u00dfp\2\u0286\u0287\5\u00e5",
"s\2\u0287\u0288\5\u00c5c\2\u0288\u0098\3\2\2\2\u0289\u028a\5\u00e3r",
"\2\u028a\u028b\5\u00dfp\2\u028b\u028c\5\u00e5s\2\u028c\u028d\5\u00d7",
"l\2\u028d\u028e\5\u00c1a\2\u028e\u028f\5\u00bd_\2\u028f\u0290\5\u00e3",
"r\2\u0290\u0291\5\u00c5c\2\u0291\u009a\3\2\2\2\u0292\u0293\5\u00e3r",
"\2\u0293\u0294\5\u00e3r\2\u0294\u0295\5\u00d3j\2\u0295\u009c\3\2\2\2",
"\u0296\u0297\5\u00e3r\2\u0297\u0298\5\u00edw\2\u0298\u0299\5\u00dbn",
"\2\u0299\u029a\5\u00c5c\2\u029a\u009e\3\2\2\2\u029b\u029c\5\u00e5s\2",
"\u029c\u029d\5\u00d7l\2\u029d\u029e\5\u00d3j\2\u029e\u029f\5\u00d9m",
"\2\u029f\u02a0\5\u00c9e\2\u02a0\u02a1\5\u00c9e\2\u02a1\u02a2\5\u00c5",
"c\2\u02a2\u02a3\5\u00c3b\2\u02a3\u00a0\3\2\2\2\u02a4\u02a5\5\u00e5s",
"\2\u02a5\u02a6\5\u00dbn\2\u02a6\u02a7\5\u00c3b\2\u02a7\u02a8\5\u00bd",
"_\2\u02a8\u02a9\5\u00e3r\2\u02a9\u02aa\5\u00c5c\2\u02aa\u00a2\3\2\2",
"\2\u02ab\u02ac\5\u00e5s\2\u02ac\u02ad\5\u00e1q\2\u02ad\u02ae\5\u00c5",
"c\2\u02ae\u00a4\3\2\2\2\u02af\u02b0\5\u00e5s\2\u02b0\u02b1\5\u00e1q",
"\2\u02b1\u02b2\5\u00cdg\2\u02b2\u02b3\5\u00d7l\2\u02b3\u02b4\5\u00c9",
"e\2\u02b4\u00a6\3\2\2\2\u02b5\u02b6\5\u00e7t\2\u02b6\u02b7\5\u00bd_",
"\2\u02b7\u02b8\5\u00d3j\2\u02b8\u02b9\5\u00e5s\2\u02b9\u02ba\5\u00c5",
"c\2\u02ba\u02bb\5\u00e1q\2\u02bb\u00a8\3\2\2\2\u02bc\u02bd\5\u00e9u",
"\2\u02bd\u02be\5\u00cbf\2\u02be\u02bf\5\u00c5c\2\u02bf\u02c0\5\u00df",
"p\2\u02c0\u02c1\5\u00c5c\2\u02c1\u00aa\3\2\2\2\u02c2\u02c3\5\u00e9u",
"\2\u02c3\u02c4\5\u00cdg\2\u02c4\u02c5\5\u00e3r\2\u02c5\u02c6\5\u00cb",
"f\2\u02c6\u00ac\3\2\2\2\u02c7\u02cb\t\2\2\2\u02c8\u02ca\t\2\2\2\u02c9",
"\u02c8\3\2\2\2\u02ca\u02cd\3\2\2\2\u02cb\u02c9\3\2\2\2\u02cb\u02cc\3",
"\2\2\2\u02cc\u00ae\3\2\2\2\u02cd\u02cb\3\2\2\2\u02ce\u02cf\7)\2\2\u02cf",
"\u02d0\5\u00adW\2\u02d0\u02d1\7)\2\2\u02d1\u00b0\3\2\2\2\u02d2\u02d4",
"\7/\2\2\u02d3\u02d2\3\2\2\2\u02d3\u02d4\3\2\2\2\u02d4\u02d6\3\2\2\2",
"\u02d5\u02d7\5\u00bb^\2\u02d6\u02d5\3\2\2\2\u02d7\u02d8\3\2\2\2\u02d8",
"\u02d6\3\2\2\2\u02d8\u02d9\3\2\2\2\u02d9\u00b2\3\2\2\2\u02da\u02dc\7",
"/\2\2\u02db\u02da\3\2\2\2\u02db\u02dc\3\2\2\2\u02dc\u02de\3\2\2\2\u02dd",
"\u02df\5\u00bb^\2\u02de\u02dd\3\2\2\2\u02df\u02e0\3\2\2\2\u02e0\u02de",
"\3\2\2\2\u02e0\u02e1\3\2\2\2\u02e1\u02e9\3\2\2\2\u02e2\u02e6\7\60\2",
"\2\u02e3\u02e5\5\u00bb^\2\u02e4\u02e3\3\2\2\2\u02e5\u02e8\3\2\2\2\u02e6",
"\u02e4\3\2\2\2\u02e6\u02e7\3\2\2\2\u02e7\u02ea\3\2\2\2\u02e8\u02e6\3",
"\2\2\2\u02e9\u02e2\3\2\2\2\u02e9\u02ea\3\2\2\2\u02ea\u02f4\3\2\2\2\u02eb",
"\u02ed\5\u00c5c\2\u02ec\u02ee\t\3\2\2\u02ed\u02ec\3\2\2\2\u02ed\u02ee",
"\3\2\2\2\u02ee\u02f0\3\2\2\2\u02ef\u02f1\5\u00bb^\2\u02f0\u02ef\3\2",
"\2\2\u02f1\u02f2\3\2\2\2\u02f2\u02f0\3\2\2\2\u02f2\u02f3\3\2\2\2\u02f3",
"\u02f5\3\2\2\2\u02f4\u02eb\3\2\2\2\u02f4\u02f5\3\2\2\2\u02f5\u0302\3",
"\2\2\2\u02f6\u02f7\7P\2\2\u02f7\u02f8\7c\2\2\u02f8\u0302\7P\2\2\u02f9",
"\u02fa\7K\2\2\u02fa\u02fb\7p\2\2\u02fb\u02fc\7h\2\2\u02fc\u02fd\7k\2",
"\2\u02fd\u02fe\7p\2\2\u02fe\u02ff\7k\2\2\u02ff\u0300\7v\2\2\u0300\u0302",
"\7{\2\2\u0301\u02db\3\2\2\2\u0301\u02f6\3\2\2\2\u0301\u02f9\3\2\2\2",
"\u0302\u00b4\3\2\2\2\u0303\u0304\5\u00b9]\2\u0304\u0305\5\u00b9]\2\u0305",
"\u0306\5\u00b9]\2\u0306\u0307\5\u00b9]\2\u0307\u0308\5\u00b9]\2\u0308",
"\u0309\5\u00b9]\2\u0309\u030a\5\u00b9]\2\u030a\u030b\5\u00b9]\2\u030b",
"\u030c\7/\2\2\u030c\u030d\5\u00b9]\2\u030d\u030e\5\u00b9]\2\u030e\u030f",
"\5\u00b9]\2\u030f\u0310\5\u00b9]\2\u0310\u0311\7/\2\2\u0311\u0312\5",
"\u00b9]\2\u0312\u0313\5\u00b9]\2\u0313\u0314\5\u00b9]\2\u0314\u0315",
"\5\u00b9]\2\u0315\u0316\7/\2\2\u0316\u0317\5\u00b9]\2\u0317\u0318\5",
"\u00b9]\2\u0318\u0319\5\u00b9]\2\u0319\u031a\5\u00b9]\2\u031a\u031b",
"\7/\2\2\u031b\u031c\5\u00b9]\2\u031c\u031d\5\u00b9]\2\u031d\u031e\5",
"\u00b9]\2\u031e\u031f\5\u00b9]\2\u031f\u0320\5\u00b9]\2\u0320\u0321",
"\5\u00b9]\2\u0321\u0322\5\u00b9]\2\u0322\u0323\5\u00b9]\2\u0323\u0324",
"\5\u00b9]\2\u0324\u0325\5\u00b9]\2\u0325\u0326\5\u00b9]\2\u0326\u0327",
"\5\u00b9]\2\u0327\u00b6\3\2\2\2\u0328\u0329\7\62\2\2\u0329\u032b\5\u00eb",
"v\2\u032a\u032c\5\u00b9]\2\u032b\u032a\3\2\2\2\u032c\u032d\3\2\2\2\u032d",
"\u032b\3\2\2\2\u032d\u032e\3\2\2\2\u032e\u00b8\3\2\2\2\u032f\u0330\t",
"\4\2\2\u0330\u00ba\3\2\2\2\u0331\u0332\t\5\2\2\u0332\u00bc\3\2\2\2\u0333",
"\u0334\t\6\2\2\u0334\u00be\3\2\2\2\u0335\u0336\t\7\2\2\u0336\u00c0\3",
"\2\2\2\u0337\u0338\t\b\2\2\u0338\u00c2\3\2\2\2\u0339\u033a\t\t\2\2\u033a",
"\u00c4\3\2\2\2\u033b\u033c\t\n\2\2\u033c\u00c6\3\2\2\2\u033d\u033e\t",
"\13\2\2\u033e\u00c8\3\2\2\2\u033f\u0340\t\f\2\2\u0340\u00ca\3\2\2\2",
"\u0341\u0342\t\r\2\2\u0342\u00cc\3\2\2\2\u0343\u0344\t\16\2\2\u0344",
"\u00ce\3\2\2\2\u0345\u0346\t\17\2\2\u0346\u00d0\3\2\2\2\u0347\u0348",
"\t\20\2\2\u0348\u00d2\3\2\2\2\u0349\u034a\t\21\2\2\u034a\u00d4\3\2\2",
"\2\u034b\u034c\t\22\2\2\u034c\u00d6\3\2\2\2\u034d\u034e\t\23\2\2\u034e",
"\u00d8\3\2\2\2\u034f\u0350\t\24\2\2\u0350\u00da\3\2\2\2\u0351\u0352",
"\t\25\2\2\u0352\u00dc\3\2\2\2\u0353\u0354\t\26\2\2\u0354\u00de\3\2\2",
"\2\u0355\u0356\t\27\2\2\u0356\u00e0\3\2\2\2\u0357\u0358\t\30\2\2\u0358",
"\u00e2\3\2\2\2\u0359\u035a\t\31\2\2\u035a\u00e4\3\2\2\2\u035b\u035c",
"\t\32\2\2\u035c\u00e6\3\2\2\2\u035d\u035e\t\33\2\2\u035e\u00e8\3\2\2",
"\2\u035f\u0360\t\34\2\2\u0360\u00ea\3\2\2\2\u0361\u0362\t\35\2\2\u0362",
"\u00ec\3\2\2\2\u0363\u0364\t\36\2\2\u0364\u00ee\3\2\2\2\u0365\u0366",
"\t\37\2\2\u0366\u00f0\3\2\2\2\u0367\u0368\7/\2\2\u0368\u036c\7/\2\2",
"\u0369\u036a\7\61\2\2\u036a\u036c\7\61\2\2\u036b\u0367\3\2\2\2\u036b",
"\u0369\3\2\2\2\u036c\u0370\3\2\2\2\u036d\u036f\n \2\2\u036e\u036d\3",
"\2\2\2\u036f\u0372\3\2\2\2\u0370\u036e\3\2\2\2\u0370\u0371\3\2\2\2\u0371",
"\u0373\3\2\2\2\u0372\u0370\3\2\2\2\u0373\u0374\by\2\2\u0374\u00f2\3",
"\2\2\2\u0375\u0376\7\61\2\2\u0376\u0377\7,\2\2\u0377\u037b\3\2\2\2\u0378",
"\u037a\13\2\2\2\u0379\u0378\3\2\2\2\u037a\u037d\3\2\2\2\u037b\u037c",
"\3\2\2\2\u037b\u0379\3\2\2\2\u037c\u0381\3\2\2\2\u037d\u037b\3\2\2\2",
"\u037e\u037f\7,\2\2\u037f\u0382\7\61\2\2\u0380\u0382\7\2\2\3\u0381\u037e",
"\3\2\2\2\u0381\u0380\3\2\2\2\u0382\u0383\3\2\2\2\u0383\u0384\bz\2\2",
"\u0384\u00f4\3\2\2\2\u0385\u0386\t!\2\2\u0386\u0387\3\2\2\2\u0387\u0388",
"\b{\2\2\u0388\u00f6\3\2\2\2\23\2\u02cb\u02d3\u02d8\u02db\u02e0\u02e6",
"\u02e9\u02ed\u02f2\u02f4\u0301\u032d\u036b\u0370\u037b\u0381\3\2\3\2"].join("");
var atn = new antlr4.atn.ATNDeserializer().deserialize(serializedATN);
var decisionsToDFA = atn.decisionToState.map( function(ds, index) { return new antlr4.dfa.DFA(ds, index); });
function CQL3Lexer(input) {
antlr4.Lexer.call(this, input);
this._interp = new antlr4.atn.LexerATNSimulator(this, atn, decisionsToDFA, new antlr4.PredictionContextCache());
return this;
}
CQL3Lexer.prototype = Object.create(antlr4.Lexer.prototype);
CQL3Lexer.prototype.constructor = CQL3Lexer;
CQL3Lexer.EOF = antlr4.Token.EOF;
CQL3Lexer.T__0 = 1;
CQL3Lexer.T__1 = 2;
CQL3Lexer.T__2 = 3;
CQL3Lexer.T__3 = 4;
CQL3Lexer.T__4 = 5;
CQL3Lexer.T__5 = 6;
CQL3Lexer.T__6 = 7;
CQL3Lexer.T__7 = 8;
CQL3Lexer.T__8 = 9;
CQL3Lexer.T__9 = 10;
CQL3Lexer.T__10 = 11;
CQL3Lexer.T__11 = 12;
CQL3Lexer.T__12 = 13;
CQL3Lexer.T__13 = 14;
CQL3Lexer.T__14 = 15;
CQL3Lexer.T__15 = 16;
CQL3Lexer.T__16 = 17;
CQL3Lexer.T__17 = 18;
CQL3Lexer.T__18 = 19;
CQL3Lexer.T__19 = 20;
CQL3Lexer.T__20 = 21;
CQL3Lexer.T__21 = 22;
CQL3Lexer.T__22 = 23;
CQL3Lexer.T__23 = 24;
CQL3Lexer.T__24 = 25;
CQL3Lexer.T__25 = 26;
CQL3Lexer.T__26 = 27;
CQL3Lexer.T__27 = 28;
CQL3Lexer.T__28 = 29;
CQL3Lexer.T__29 = 30;
CQL3Lexer.T__30 = 31;
CQL3Lexer.T__31 = 32;
CQL3Lexer.T__32 = 33;
CQL3Lexer.T__33 = 34;
CQL3Lexer.T__34 = 35;
CQL3Lexer.T__35 = 36;
CQL3Lexer.K_ADD = 37;
CQL3Lexer.K_ALTER = 38;
CQL3Lexer.K_AND = 39;
CQL3Lexer.K_APPLY = 40;
CQL3Lexer.K_BATCH = 41;
CQL3Lexer.K_BEGIN = 42;
CQL3Lexer.K_CLUSTERING = 43;
CQL3Lexer.K_ASC = 44;
CQL3Lexer.K_DESC = 45;
CQL3Lexer.K_COLUMNFAMILY = 46;
CQL3Lexer.K_COMPACT = 47;
CQL3Lexer.K_COUNTER = 48;
CQL3Lexer.K_CREATE = 49;
CQL3Lexer.K_CUSTOM = 50;
CQL3Lexer.K_DELETE = 51;
CQL3Lexer.K_DROP = 52;
CQL3Lexer.K_EXISTS = 53;
CQL3Lexer.K_FALSE = 54;
CQL3Lexer.K_FROM = 55;
CQL3Lexer.K_IF = 56;
CQL3Lexer.K_IN = 57;
CQL3Lexer.K_INDEX = 58;
CQL3Lexer.K_INSERT = 59;
CQL3Lexer.K_INTO = 60;
CQL3Lexer.K_KEY = 61;
CQL3Lexer.K_KEYSPACE = 62;
CQL3Lexer.K_NOT = 63;
CQL3Lexer.K_ON = 64;
CQL3Lexer.K_OPTIONS = 65;
CQL3Lexer.K_ORDER = 66;
CQL3Lexer.K_BY = 67;
CQL3Lexer.K_PRIMARY = 68;
CQL3Lexer.K_SELECT = 69;
CQL3Lexer.K_SET = 70;
CQL3Lexer.K_STATIC = 71;
CQL3Lexer.K_STORAGE = 72;
CQL3Lexer.K_TABLE = 73;
CQL3Lexer.K_TIMESTAMP = 74;
CQL3Lexer.K_TRUE = 75;
CQL3Lexer.K_TRUNCATE = 76;
CQL3Lexer.K_TTL = 77;
CQL3Lexer.K_TYPE = 78;
CQL3Lexer.K_UNLOGGED = 79;
CQL3Lexer.K_UPDATE = 80;
CQL3Lexer.K_USE = 81;
CQL3Lexer.K_USING = 82;
CQL3Lexer.K_VALUES = 83;
CQL3Lexer.K_WHERE = 84;
CQL3Lexer.K_WITH = 85;
CQL3Lexer.IDENTIFIER = 86;
CQL3Lexer.STRING = 87;
CQL3Lexer.INTEGER = 88;
CQL3Lexer.FLOAT = 89;
CQL3Lexer.UUID = 90;
CQL3Lexer.BLOB = 91;
CQL3Lexer.SINGLE_LINE_COMMENT = 92;
CQL3Lexer.MULTILINE_COMMENT = 93;
CQL3Lexer.WS = 94;
CQL3Lexer.modeNames = [ "DEFAULT_MODE" ];
CQL3Lexer.literalNames = [ 'null', "';'", "'('", "')'", "','", "'='", "'+'",
"'-'", "'['", "']'", "'?'", "'.'", "':'", "'{'",
"'}'", "'ascii'", "'bigint'", "'blob'", "'boolean'",
"'counter'", "'decimal'", "'double'", "'float'",
"'inet'", "'int'", "'text'", "'tinyint'", "'timestamp'",
"'timeuuid'", "'uuid'", "'varchar'", "'varint'",
"'list'", "'<'", "'>'", "'set'", "'map'" ];
CQL3Lexer.symbolicNames = [ 'null', 'null', 'null', 'null', 'null', 'null',
'null', 'null', 'null', 'null', 'null', 'null',
'null', 'null', 'null', 'null', 'null', 'null',
'null', 'null', 'null', 'null', 'null', 'null',
'null', 'null', 'null', 'null', 'null', 'null',
'null', 'null', 'null', 'null', 'null', 'null',
'null', "K_ADD", "K_ALTER", "K_AND", "K_APPLY",
"K_BATCH", "K_BEGIN", "K_CLUSTERING", "K_ASC",
"K_DESC", "K_COLUMNFAMILY", "K_COMPACT", "K_COUNTER",
"K_CREATE", "K_CUSTOM", "K_DELETE", "K_DROP",
"K_EXISTS", "K_FALSE", "K_FROM", "K_IF", "K_IN",
"K_INDEX", "K_INSERT", "K_INTO", "K_KEY", "K_KEYSPACE",
"K_NOT", "K_ON", "K_OPTIONS", "K_ORDER", "K_BY",
"K_PRIMARY", "K_SELECT", "K_SET", "K_STATIC",
"K_STORAGE", "K_TABLE", "K_TIMESTAMP", "K_TRUE",
"K_TRUNCATE", "K_TTL", "K_TYPE", "K_UNLOGGED",
"K_UPDATE", "K_USE", "K_USING", "K_VALUES",
"K_WHERE", "K_WITH", "IDENTIFIER", "STRING",
"INTEGER", "FLOAT", "UUID", "BLOB", "SINGLE_LINE_COMMENT",
"MULTILINE_COMMENT", "WS" ];
CQL3Lexer.ruleNames = [ "T__0", "T__1", "T__2", "T__3", "T__4", "T__5",
"T__6", "T__7", "T__8", "T__9", "T__10", "T__11",
"T__12", "T__13", "T__14", "T__15", "T__16", "T__17",
"T__18", "T__19", "T__20", "T__21", "T__22", "T__23",
"T__24", "T__25", "T__26", "T__27", "T__28", "T__29",
"T__30", "T__31", "T__32", "T__33", "T__34", "T__35",
"K_ADD", "K_ALTER", "K_AND", "K_APPLY", "K_BATCH",
"K_BEGIN", "K_CLUSTERING", "K_ASC", "K_DESC", "K_COLUMNFAMILY",
"K_COMPACT", "K_COUNTER", "K_CREATE", "K_CUSTOM",
"K_DELETE", "K_DROP", "K_EXISTS", "K_FALSE", "K_FROM",
"K_IF", "K_IN", "K_INDEX", "K_INSERT", "K_INTO",
"K_KEY", "K_KEYSPACE", "K_NOT", "K_ON", "K_OPTIONS",
"K_ORDER", "K_BY", "K_PRIMARY", "K_SELECT", "K_SET",
"K_STATIC", "K_STORAGE", "K_TABLE", "K_TIMESTAMP",
"K_TRUE", "K_TRUNCATE", "K_TTL", "K_TYPE", "K_UNLOGGED",
"K_UPDATE", "K_USE", "K_USING", "K_VALUES", "K_WHERE",
"K_WITH", "IDENTIFIER", "STRING", "INTEGER", "FLOAT",
"UUID", "BLOB", "HEX", "DIGIT", "A", "B", "C", "D",
"E", "F", "G", "H", "I", "J", "K", "L", "M", "N",
"O", "P", "Q", "R", "S", "T", "U", "V", "W", "X",
"Y", "Z", "SINGLE_LINE_COMMENT", "MULTILINE_COMMENT",
"WS" ];
CQL3Lexer.grammarFileName = "CQL3.g4";
exports.CQL3Lexer = CQL3Lexer;

View File

@@ -0,0 +1,130 @@
T__0=1
T__1=2
T__2=3
T__3=4
T__4=5
T__5=6
T__6=7
T__7=8
T__8=9
T__9=10
T__10=11
T__11=12
T__12=13
T__13=14
T__14=15
T__15=16
T__16=17
T__17=18
T__18=19
T__19=20
T__20=21
T__21=22
T__22=23
T__23=24
T__24=25
T__25=26
T__26=27
T__27=28
T__28=29
T__29=30
T__30=31
T__31=32
T__32=33
T__33=34
T__34=35
T__35=36
K_ADD=37
K_ALTER=38
K_AND=39
K_APPLY=40
K_BATCH=41
K_BEGIN=42
K_CLUSTERING=43
K_ASC=44
K_DESC=45
K_COLUMNFAMILY=46
K_COMPACT=47
K_COUNTER=48
K_CREATE=49
K_CUSTOM=50
K_DELETE=51
K_DROP=52
K_EXISTS=53
K_FALSE=54
K_FROM=55
K_IF=56
K_IN=57
K_INDEX=58
K_INSERT=59
K_INTO=60
K_KEY=61
K_KEYSPACE=62
K_NOT=63
K_ON=64
K_OPTIONS=65
K_ORDER=66
K_BY=67
K_PRIMARY=68
K_SELECT=69
K_SET=70
K_STATIC=71
K_STORAGE=72
K_TABLE=73
K_TIMESTAMP=74
K_TRUE=75
K_TRUNCATE=76
K_TTL=77
K_TYPE=78
K_UNLOGGED=79
K_UPDATE=80
K_USE=81
K_USING=82
K_VALUES=83
K_WHERE=84
K_WITH=85
IDENTIFIER=86
STRING=87
INTEGER=88
FLOAT=89
UUID=90
BLOB=91
SINGLE_LINE_COMMENT=92
MULTILINE_COMMENT=93
WS=94
';'=1
'('=2
')'=3
','=4
'='=5
'+'=6
'-'=7
'['=8
']'=9
'?'=10
'.'=11
':'=12
'{'=13
'}'=14
'ascii'=15
'bigint'=16
'blob'=17
'boolean'=18
'counter'=19
'decimal'=20
'double'=21
'float'=22
'inet'=23
'int'=24
'text'=25
'tinyint'=26
'timestamp'=27
'timeuuid'=28
'uuid'=29
'varchar'=30
'varint'=31
'list'=32
'<'=33
'>'=34
'set'=35
'map'=36

View File

@@ -0,0 +1,636 @@
// Generated from CQL3.g4 by ANTLR 4.5
// jshint ignore: start
var antlr4 = require('antlr4/index');
// This class defines a complete listener for a parse tree produced by CQL3Parser.
function CQL3Listener() {
antlr4.tree.ParseTreeListener.call(this);
return this;
}
CQL3Listener.prototype = Object.create(antlr4.tree.ParseTreeListener.prototype);
CQL3Listener.prototype.constructor = CQL3Listener;
// Enter a parse tree produced by CQL3Parser#statements.
CQL3Listener.prototype.enterStatements = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#statements.
CQL3Listener.prototype.exitStatements = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#statement.
CQL3Listener.prototype.enterStatement = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#statement.
CQL3Listener.prototype.exitStatement = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#dml_statements.
CQL3Listener.prototype.enterDml_statements = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#dml_statements.
CQL3Listener.prototype.exitDml_statements = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#dml_statement.
CQL3Listener.prototype.enterDml_statement = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#dml_statement.
CQL3Listener.prototype.exitDml_statement = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#create_keyspace_stmt.
CQL3Listener.prototype.enterCreate_keyspace_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#create_keyspace_stmt.
CQL3Listener.prototype.exitCreate_keyspace_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#alter_keyspace_stmt.
CQL3Listener.prototype.enterAlter_keyspace_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#alter_keyspace_stmt.
CQL3Listener.prototype.exitAlter_keyspace_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#drop_keyspace_stmt.
CQL3Listener.prototype.enterDrop_keyspace_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#drop_keyspace_stmt.
CQL3Listener.prototype.exitDrop_keyspace_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#use_stmt.
CQL3Listener.prototype.enterUse_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#use_stmt.
CQL3Listener.prototype.exitUse_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#create_table_stmt.
CQL3Listener.prototype.enterCreate_table_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#create_table_stmt.
CQL3Listener.prototype.exitCreate_table_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#alter_table_stmt.
CQL3Listener.prototype.enterAlter_table_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#alter_table_stmt.
CQL3Listener.prototype.exitAlter_table_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#alter_table_instruction.
CQL3Listener.prototype.enterAlter_table_instruction = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#alter_table_instruction.
CQL3Listener.prototype.exitAlter_table_instruction = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#drop_table_stmt.
CQL3Listener.prototype.enterDrop_table_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#drop_table_stmt.
CQL3Listener.prototype.exitDrop_table_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#truncate_table_stmt.
CQL3Listener.prototype.enterTruncate_table_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#truncate_table_stmt.
CQL3Listener.prototype.exitTruncate_table_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#create_index_stmt.
CQL3Listener.prototype.enterCreate_index_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#create_index_stmt.
CQL3Listener.prototype.exitCreate_index_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#drop_index_stmt.
CQL3Listener.prototype.enterDrop_index_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#drop_index_stmt.
CQL3Listener.prototype.exitDrop_index_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#insert_stmt.
CQL3Listener.prototype.enterInsert_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#insert_stmt.
CQL3Listener.prototype.exitInsert_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#column_names.
CQL3Listener.prototype.enterColumn_names = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#column_names.
CQL3Listener.prototype.exitColumn_names = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#column_values.
CQL3Listener.prototype.enterColumn_values = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#column_values.
CQL3Listener.prototype.exitColumn_values = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#upsert_options.
CQL3Listener.prototype.enterUpsert_options = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#upsert_options.
CQL3Listener.prototype.exitUpsert_options = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#upsert_option.
CQL3Listener.prototype.enterUpsert_option = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#upsert_option.
CQL3Listener.prototype.exitUpsert_option = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#index_name.
CQL3Listener.prototype.enterIndex_name = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#index_name.
CQL3Listener.prototype.exitIndex_name = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#index_class.
CQL3Listener.prototype.enterIndex_class = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#index_class.
CQL3Listener.prototype.exitIndex_class = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#index_options.
CQL3Listener.prototype.enterIndex_options = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#index_options.
CQL3Listener.prototype.exitIndex_options = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#update_stmt.
CQL3Listener.prototype.enterUpdate_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#update_stmt.
CQL3Listener.prototype.exitUpdate_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#update_assignments.
CQL3Listener.prototype.enterUpdate_assignments = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#update_assignments.
CQL3Listener.prototype.exitUpdate_assignments = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#update_assignment.
CQL3Listener.prototype.enterUpdate_assignment = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#update_assignment.
CQL3Listener.prototype.exitUpdate_assignment = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#update_conditions.
CQL3Listener.prototype.enterUpdate_conditions = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#update_conditions.
CQL3Listener.prototype.exitUpdate_conditions = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#update_condition.
CQL3Listener.prototype.enterUpdate_condition = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#update_condition.
CQL3Listener.prototype.exitUpdate_condition = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#where_clause.
CQL3Listener.prototype.enterWhere_clause = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#where_clause.
CQL3Listener.prototype.exitWhere_clause = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#relation.
CQL3Listener.prototype.enterRelation = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#relation.
CQL3Listener.prototype.exitRelation = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#delete_stmt.
CQL3Listener.prototype.enterDelete_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#delete_stmt.
CQL3Listener.prototype.exitDelete_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#delete_conditions.
CQL3Listener.prototype.enterDelete_conditions = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#delete_conditions.
CQL3Listener.prototype.exitDelete_conditions = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#delete_condition.
CQL3Listener.prototype.enterDelete_condition = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#delete_condition.
CQL3Listener.prototype.exitDelete_condition = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#delete_selections.
CQL3Listener.prototype.enterDelete_selections = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#delete_selections.
CQL3Listener.prototype.exitDelete_selections = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#delete_selection.
CQL3Listener.prototype.enterDelete_selection = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#delete_selection.
CQL3Listener.prototype.exitDelete_selection = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#batch_stmt.
CQL3Listener.prototype.enterBatch_stmt = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#batch_stmt.
CQL3Listener.prototype.exitBatch_stmt = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#batch_options.
CQL3Listener.prototype.enterBatch_options = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#batch_options.
CQL3Listener.prototype.exitBatch_options = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#batch_option.
CQL3Listener.prototype.enterBatch_option = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#batch_option.
CQL3Listener.prototype.exitBatch_option = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#table_name.
CQL3Listener.prototype.enterTable_name = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#table_name.
CQL3Listener.prototype.exitTable_name = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#table_name_noks.
CQL3Listener.prototype.enterTable_name_noks = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#table_name_noks.
CQL3Listener.prototype.exitTable_name_noks = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#column_name.
CQL3Listener.prototype.enterColumn_name = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#column_name.
CQL3Listener.prototype.exitColumn_name = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#table_options.
CQL3Listener.prototype.enterTable_options = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#table_options.
CQL3Listener.prototype.exitTable_options = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#table_option.
CQL3Listener.prototype.enterTable_option = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#table_option.
CQL3Listener.prototype.exitTable_option = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#asc_or_desc.
CQL3Listener.prototype.enterAsc_or_desc = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#asc_or_desc.
CQL3Listener.prototype.exitAsc_or_desc = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#column_definitions.
CQL3Listener.prototype.enterColumn_definitions = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#column_definitions.
CQL3Listener.prototype.exitColumn_definitions = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#column_definition.
CQL3Listener.prototype.enterColumn_definition = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#column_definition.
CQL3Listener.prototype.exitColumn_definition = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#column_type.
CQL3Listener.prototype.enterColumn_type = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#column_type.
CQL3Listener.prototype.exitColumn_type = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#primary_key.
CQL3Listener.prototype.enterPrimary_key = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#primary_key.
CQL3Listener.prototype.exitPrimary_key = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#partition_key.
CQL3Listener.prototype.enterPartition_key = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#partition_key.
CQL3Listener.prototype.exitPartition_key = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#clustering_column.
CQL3Listener.prototype.enterClustering_column = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#clustering_column.
CQL3Listener.prototype.exitClustering_column = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#keyspace_name.
CQL3Listener.prototype.enterKeyspace_name = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#keyspace_name.
CQL3Listener.prototype.exitKeyspace_name = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#if_not_exists.
CQL3Listener.prototype.enterIf_not_exists = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#if_not_exists.
CQL3Listener.prototype.exitIf_not_exists = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#if_exists.
CQL3Listener.prototype.enterIf_exists = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#if_exists.
CQL3Listener.prototype.exitIf_exists = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#constant.
CQL3Listener.prototype.enterConstant = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#constant.
CQL3Listener.prototype.exitConstant = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#variable.
CQL3Listener.prototype.enterVariable = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#variable.
CQL3Listener.prototype.exitVariable = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#term.
CQL3Listener.prototype.enterTerm = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#term.
CQL3Listener.prototype.exitTerm = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#collection.
CQL3Listener.prototype.enterCollection = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#collection.
CQL3Listener.prototype.exitCollection = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#map.
CQL3Listener.prototype.enterMap = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#map.
CQL3Listener.prototype.exitMap = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#set.
CQL3Listener.prototype.enterSet = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#set.
CQL3Listener.prototype.exitSet = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#list.
CQL3Listener.prototype.enterList = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#list.
CQL3Listener.prototype.exitList = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#function.
CQL3Listener.prototype.enterFunction = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#function.
CQL3Listener.prototype.exitFunction = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#properties.
CQL3Listener.prototype.enterProperties = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#properties.
CQL3Listener.prototype.exitProperties = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#property.
CQL3Listener.prototype.enterProperty = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#property.
CQL3Listener.prototype.exitProperty = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#property_name.
CQL3Listener.prototype.enterProperty_name = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#property_name.
CQL3Listener.prototype.exitProperty_name = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#property_value.
CQL3Listener.prototype.enterProperty_value = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#property_value.
CQL3Listener.prototype.exitProperty_value = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#data_type.
CQL3Listener.prototype.enterData_type = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#data_type.
CQL3Listener.prototype.exitData_type = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#native_type.
CQL3Listener.prototype.enterNative_type = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#native_type.
CQL3Listener.prototype.exitNative_type = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#collection_type.
CQL3Listener.prototype.enterCollection_type = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#collection_type.
CQL3Listener.prototype.exitCollection_type = function(ctx) {
};
// Enter a parse tree produced by CQL3Parser#bool.
CQL3Listener.prototype.enterBool = function(ctx) {
};
// Exit a parse tree produced by CQL3Parser#bool.
CQL3Listener.prototype.exitBool = function(ctx) {
};
exports.CQL3Listener = CQL3Listener;

File diff suppressed because it is too large

View File

@@ -0,0 +1,361 @@
// Generated from CQL3.g4 by ANTLR 4.5
// jshint ignore: start
var antlr4 = require('antlr4/index');
// This class defines a complete generic visitor for a parse tree produced by CQL3Parser.
function CQL3Visitor() {
antlr4.tree.ParseTreeVisitor.call(this);
return this;
}
CQL3Visitor.prototype = Object.create(antlr4.tree.ParseTreeVisitor.prototype);
CQL3Visitor.prototype.constructor = CQL3Visitor;
// Visit a parse tree produced by CQL3Parser#statements.
CQL3Visitor.prototype.visitStatements = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#statement.
CQL3Visitor.prototype.visitStatement = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#dml_statements.
CQL3Visitor.prototype.visitDml_statements = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#dml_statement.
CQL3Visitor.prototype.visitDml_statement = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#create_keyspace_stmt.
CQL3Visitor.prototype.visitCreate_keyspace_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#alter_keyspace_stmt.
CQL3Visitor.prototype.visitAlter_keyspace_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#drop_keyspace_stmt.
CQL3Visitor.prototype.visitDrop_keyspace_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#use_stmt.
CQL3Visitor.prototype.visitUse_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#create_table_stmt.
CQL3Visitor.prototype.visitCreate_table_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#alter_table_stmt.
CQL3Visitor.prototype.visitAlter_table_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#alter_table_instruction.
CQL3Visitor.prototype.visitAlter_table_instruction = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#drop_table_stmt.
CQL3Visitor.prototype.visitDrop_table_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#truncate_table_stmt.
CQL3Visitor.prototype.visitTruncate_table_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#create_index_stmt.
CQL3Visitor.prototype.visitCreate_index_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#drop_index_stmt.
CQL3Visitor.prototype.visitDrop_index_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#insert_stmt.
CQL3Visitor.prototype.visitInsert_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#column_names.
CQL3Visitor.prototype.visitColumn_names = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#column_values.
CQL3Visitor.prototype.visitColumn_values = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#upsert_options.
CQL3Visitor.prototype.visitUpsert_options = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#upsert_option.
CQL3Visitor.prototype.visitUpsert_option = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#index_name.
CQL3Visitor.prototype.visitIndex_name = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#index_class.
CQL3Visitor.prototype.visitIndex_class = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#index_options.
CQL3Visitor.prototype.visitIndex_options = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#update_stmt.
CQL3Visitor.prototype.visitUpdate_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#update_assignments.
CQL3Visitor.prototype.visitUpdate_assignments = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#update_assignment.
CQL3Visitor.prototype.visitUpdate_assignment = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#update_conditions.
CQL3Visitor.prototype.visitUpdate_conditions = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#update_condition.
CQL3Visitor.prototype.visitUpdate_condition = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#where_clause.
CQL3Visitor.prototype.visitWhere_clause = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#relation.
CQL3Visitor.prototype.visitRelation = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#delete_stmt.
CQL3Visitor.prototype.visitDelete_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#delete_conditions.
CQL3Visitor.prototype.visitDelete_conditions = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#delete_condition.
CQL3Visitor.prototype.visitDelete_condition = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#delete_selections.
CQL3Visitor.prototype.visitDelete_selections = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#delete_selection.
CQL3Visitor.prototype.visitDelete_selection = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#batch_stmt.
CQL3Visitor.prototype.visitBatch_stmt = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#batch_options.
CQL3Visitor.prototype.visitBatch_options = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#batch_option.
CQL3Visitor.prototype.visitBatch_option = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#table_name.
CQL3Visitor.prototype.visitTable_name = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#table_name_noks.
CQL3Visitor.prototype.visitTable_name_noks = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#column_name.
CQL3Visitor.prototype.visitColumn_name = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#table_options.
CQL3Visitor.prototype.visitTable_options = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#table_option.
CQL3Visitor.prototype.visitTable_option = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#asc_or_desc.
CQL3Visitor.prototype.visitAsc_or_desc = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#column_definitions.
CQL3Visitor.prototype.visitColumn_definitions = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#column_definition.
CQL3Visitor.prototype.visitColumn_definition = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#column_type.
CQL3Visitor.prototype.visitColumn_type = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#primary_key.
CQL3Visitor.prototype.visitPrimary_key = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#partition_key.
CQL3Visitor.prototype.visitPartition_key = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#clustering_column.
CQL3Visitor.prototype.visitClustering_column = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#keyspace_name.
CQL3Visitor.prototype.visitKeyspace_name = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#if_not_exists.
CQL3Visitor.prototype.visitIf_not_exists = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#if_exists.
CQL3Visitor.prototype.visitIf_exists = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#constant.
CQL3Visitor.prototype.visitConstant = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#variable.
CQL3Visitor.prototype.visitVariable = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#term.
CQL3Visitor.prototype.visitTerm = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#collection.
CQL3Visitor.prototype.visitCollection = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#map.
CQL3Visitor.prototype.visitMap = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#set.
CQL3Visitor.prototype.visitSet = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#list.
CQL3Visitor.prototype.visitList = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#function.
CQL3Visitor.prototype.visitFunction = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#properties.
CQL3Visitor.prototype.visitProperties = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#property.
CQL3Visitor.prototype.visitProperty = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#property_name.
CQL3Visitor.prototype.visitProperty_name = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#property_value.
CQL3Visitor.prototype.visitProperty_value = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#data_type.
CQL3Visitor.prototype.visitData_type = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#native_type.
CQL3Visitor.prototype.visitNative_type = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#collection_type.
CQL3Visitor.prototype.visitCollection_type = function(ctx) {
};
// Visit a parse tree produced by CQL3Parser#bool.
CQL3Visitor.prototype.visitBool = function(ctx) {
};
exports.CQL3Visitor = CQL3Visitor;

View File

@@ -0,0 +1,248 @@
lexer grammar CqlLexer;
LR_BRACKET: '(';
RR_BRACKET: ')';
LC_BRACKET: '{';
RC_BRACKET: '}';
LS_BRACKET: '[';
RS_BRACKET: ']';
COMMA: ',';
SEMI: ';';
COLON: ':';
DQUOTE: '"';
SQUOTE: '\'';
SPACE: [ \t\r\n]+ -> channel(HIDDEN);
SPEC_CQL_COMMENT: '/*!' .+? '*/' -> channel(HIDDEN);
COMMENT_INPUT: '/*' .*? '*/' -> channel(HIDDEN);
LINE_COMMENT: (
('-- ' | '#' | '//') ~[\r\n]* ('\r'? '\n' | EOF)
| '--' ('\r'? '\n' | EOF)
) -> channel(HIDDEN);
DOT: '.';
STAR: '*';
DIVIDE: '/';
MODULE: '%';
PLUS: '+';
MINUSMINUS: '--';
MINUS: '-';
// Keywords
K_ADD: A D D | 'ADD';
K_AGGREGATE: A G G R E G A T E | 'AGGREGATE';
K_ALL: A L L | 'ALL';
K_ALL_ROLES: A L L R O L E S | 'ALL ROLES';
K_ALL_KEYSPACES: A L L K E Y S P A C E S | 'ALL KEYSPACES';
K_ALL_FUNCTIONS: A L L F U N C T I O N S | 'ALL FUNCTIONS';
K_ALLOW: A L L O W | 'ALLOW';
K_ALTER: A L T E R | 'ALTER';
K_AND: A N D | 'AND';
K_ANY: A N Y | 'ANY';
K_APPLY: A P P L Y | 'APPLY';
K_AS: A S | 'AS';
K_ASC: A S C | 'ASC';
K_AUTHORIZE: A U T H O R I Z E | 'AUTHORIZE';
K_BATCH: B A T C H | 'BATCH';
K_BEGIN: B E G I N | 'BEGIN';
K_BY: B Y | 'BY';
K_CALLED: C A L L E D | 'CALLED';
K_CLUSTERING: C L U S T E R I N G | 'CLUSTERING';
K_COLUMNFAMILY: C O L U M N F A M I L Y | 'COLUMNFAMILY';
K_COMPACT: C O M P A C T | 'COMPACT';
K_CONSISTENCY: C O N S I S T E N C Y | 'CONSISTENCY';
K_CONTAINS: C O N T A I N S | 'CONTAINS';
K_CREATE: C R E A T E | 'CREATE';
K_CUSTOM: C U S T O M | 'CUSTOM';
K_DELETE: D E L E T E | 'DELETE';
K_DESC: D E S C | 'DESC';
K_DESCRIBE: D E S C R I B E | 'DESCRIBE';
K_DISTINCT: D I S T I N C T | 'DISTINCT';
K_DROP: D R O P | 'DROP';
K_DURABLE_WRITES: D U R A B L E '_' W R I T E S | 'DURABLE_WRITES';
K_EACH_QUORUM: E A C H '_' Q U O R U M | 'EACH_QUORUM';
K_ENTRIES: E N T R I E S | 'ENTRIES';
K_EXECUTE: E X E C U T E | 'EXECUTE';
K_EXISTS: E X I S T S | 'EXISTS';
K_FALSE: F A L S E | 'FALSE';
K_FILTERING: F I L T E R I N G | 'FILTERING';
K_FINALFUNC: F I N A L F U N C | 'FINALFUNC';
K_FROM: F R O M | 'FROM';
K_FULL: F U L L | 'FULL';
K_FUNCTION: F U N C T I O N | 'FUNCTION';
K_GRANT: G R A N T | 'GRANT';
K_IF: I F | 'IF';
K_IN: I N | 'IN';
K_INDEX: I N D E X | 'INDEX';
K_INFINITY: I N F I N I T Y | 'INFINITY';
K_INITCOND: I N I T C O N D | 'INITCOND';
K_INPUT: I N P U T | 'INPUT';
K_INSERT: I N S E R T | 'INSERT';
K_INTO: I N T O | 'INTO';
K_IS: I S | 'IS';
K_KEY: K E Y | 'KEY';
K_KEYS: K E Y S | 'KEYS';
K_KEYSPACE: K E Y S P A C E | 'KEYSPACE';
K_LANGUAGE: L A N G U A G E | 'LANGUAGE';
K_LEVEL: L E V E L | 'LEVEL';
K_LIMIT: L I M I T | 'LIMIT';
K_LOCAL_ONE: L O C A L '_' O N E | 'LOCAL_ONE';
K_LOCAL_QUORUM: L O C A L '_' Q U O R U M | 'LOCAL_QUORUM';
K_LOGGED: L O G G E D | 'LOGGED';
K_LOGIN: L O G I N | 'LOGIN';
K_MATERIALIZED: M A T E R I A L I Z E D | 'MATERIALIZED';
K_MODIFY: M O D I F Y | 'MODIFY';
K_NAN: N A N | 'NAN';
K_NORECURSIVE: N O R E C U R S I V E | 'NORECURSIVE';
K_NOSUPERUSER: N O S U P E R U S E R | 'NOSUPERUSER';
K_NOT: N O T | 'NOT';
K_NULL: N U L L | 'NULL';
K_OF: O F | 'OF';
K_ON: O N | 'ON';
K_ONE: O N E | 'ONE';
K_OPTIONS: O P T I O N S | 'OPTIONS';
K_OR: O R | 'OR';
K_ORDER: O R D E R | 'ORDER';
K_PARTITION: P A R T I T I O N | 'PARTITION';
K_PASSWORD: P A S S W O R D | 'PASSWORD';
K_PER: P E R | 'PER';
K_PERMISSION: P E R M I S S I O N | 'PERMISSION';
K_PERMISSIONS: P E R M I S S I O N S | 'PERMISSIONS';
K_PRIMARY: P R I M A R Y | 'PRIMARY';
K_QUORUM: Q U O R U M | 'QUORUM';
K_RENAME: R E N A M E | 'RENAME';
K_REPLACE: R E P L A C E | 'REPLACE';
K_REPLICATION: R E P L I C A T I O N | 'REPLICATION';
K_RETURNS: R E T U R N S | 'RETURNS';
K_REVOKE: R E V O K E | 'REVOKE';
K_SCHEMA: S C H E M A | 'SCHEMA';
K_SELECT: S E L E C T | 'SELECT';
K_SET: S E T | 'SET';
K_SFUNC: S F U N C | 'SFUNC';
K_STATIC: S T A T I C | 'STATIC';
K_STORAGE: S T O R A G E | 'STORAGE';
K_STYPE: S T Y P E | 'STYPE';
K_SUPERUSER: S U P E R U S E R | 'SUPERUSER';
K_TABLE: T A B L E | 'TABLE';
K_THREE: T H R E E | 'THREE';
K_TIMESTAMP: T I M E S T A M P | 'TIMESTAMP';
K_TO: T O | 'TO';
K_TOKEN: T O K E N | 'TOKEN';
K_TRIGGER: T R I G G E R | 'TRIGGER';
K_TRUE: T R U E | 'TRUE';
K_TRUNCATE: T R U N C A T E | 'TRUNCATE';
K_TTL: T T L | 'TTL';
K_TWO: T W O | 'TWO';
K_TYPE: T Y P E | 'TYPE';
K_UNLOGGED: U N L O G G E D | 'UNLOGGED';
K_UPDATE: U P D A T E | 'UPDATE';
K_USE: U S E | 'USE';
K_USING: U S I N G | 'USING';
K_VALUES: V A L U E S | 'VALUES';
K_VIEW: V I E W | 'VIEW';
K_WHERE: W H E R E | 'WHERE';
K_WITH: W I T H | 'WITH';
K_WRITETIME: W R I T E T I M E | 'WRITETIME';
K_ASCII: A S C I I | 'ASCII';
K_BIGINT: B I G I N T | 'BIGINT';
K_BLOB: B L O B | 'BLOB';
K_BOOLEAN: B O O L E A N | 'BOOLEAN';
K_COUNTER: C O U N T E R | 'COUNTER';
K_DATE: D A T E | 'DATE';
K_DECIMAL: D E C I M A L | 'DECIMAL';
K_DOUBLE: D O U B L E | 'DOUBLE';
K_FLOAT: F L O A T | 'FLOAT';
K_FROZEN: F R O Z E N | 'FROZEN';
K_INET: I N E T | 'INET';
K_INT: I N T | 'INT';
K_LIST: L I S T;
K_ROLES: R O L E S;
K_ROLE: R O L E;
K_MAP: M A P | 'MAP';
K_SMALLINT: S M A L L I N T | 'SMALLINT';
K_TEXT: T E X T | 'TEXT';
K_TIMEUUID: T I M E U U I D | 'TIMEUUID';
K_TIME: T I M E | 'TIME';
K_TINYINT: T I N Y I N T | 'TINYINT';
K_TUPLE: T U P L E | 'TUPLE';
K_UUID: U U I D | 'UUID';
K_VARCHAR: V A R C H A R | 'VARCHAR';
K_VARINT: V A R I N T | 'VARINT';
K_USERS: U S E R S | 'USERS';
K_USER: U S E R | 'USER';
fragment A : [aA]; // match either an 'a' or 'A'
fragment B : [bB];
fragment C : [cC];
fragment D : [dD];
fragment E : [eE];
fragment F : [fF];
fragment G : [gG];
fragment H : [hH];
fragment I : [iI];
fragment J : [jJ];
fragment K : [kK];
fragment L : [lL];
fragment M : [mM];
fragment N : [nN];
fragment O : [oO];
fragment P : [pP];
fragment Q : [qQ];
fragment R : [rR];
fragment S : [sS];
fragment T : [tT];
fragment U : [uU];
fragment V : [vV];
fragment W : [wW];
fragment X : [xX];
fragment Y : [yY];
fragment Z : [zZ];
fragment CODE_BLOCK_DELIMITER: '$$';
fragment CODE_BLOCK_FRAG: '$$' ( ~'$' | ( '$' ~('$') ))* '$$';
// CODE_BLOCK_FRAG: '$$' .*? '$$';
fragment HEX_4DIGIT: [0-9a-fA-F][0-9a-fA-F][0-9a-fA-F][0-9a-fA-F];
fragment OBJECT_NAME_FRAG: [a-zA-Z][A-Za-z0-9_$]*;
fragment SQUOTA_STRING: '\'' ('\\' . | '\'\'' | ~('\'' | '\\'))* '\'';
CODE_BLOCK: CODE_BLOCK_FRAG;
STRING_LITERAL: SQUOTA_STRING;
DECIMAL_LITERAL: DEC_DIGIT+;
FLOAT_LITERAL: (MINUS)? [0-9]+(DOT [0-9]+)?;
HEXADECIMAL_LITERAL
: 'X' '\'' (HEX_DIGIT HEX_DIGIT)+ '\''
| '0X' HEX_DIGIT+;
REAL_LITERAL
: (DEC_DIGIT+)? '.' DEC_DIGIT+
| DEC_DIGIT+ '.' EXPONENT_NUM_PART
| (DEC_DIGIT+)? '.' (DEC_DIGIT+ EXPONENT_NUM_PART)
| DEC_DIGIT+ EXPONENT_NUM_PART;
OBJECT_NAME: OBJECT_NAME_FRAG;
UUID: HEX_4DIGIT HEX_4DIGIT '-' HEX_4DIGIT '-' HEX_4DIGIT '-' HEX_4DIGIT '-' HEX_4DIGIT HEX_4DIGIT HEX_4DIGIT;
fragment HEX_DIGIT: [0-9A-F];
fragment DEC_DIGIT: [0-9];
fragment EXPONENT_NUM_PART: 'E' '-'? DEC_DIGIT+;
fragment OPERATOR_EQ_FRAG: '=';
fragment OPERATOR_LT_FRAG: '<';
fragment OPERATOR_GT_FRAG: '>';
fragment OPERATOR_GTE_FRAG: '>=';
fragment OPERATOR_LTE_FRAG: '<=';
OPERATOR_EQ: OPERATOR_EQ_FRAG;
OPERATOR_LT: OPERATOR_LT_FRAG;
OPERATOR_GT: OPERATOR_GT_FRAG;
OPERATOR_LTE: OPERATOR_LTE_FRAG;
OPERATOR_GTE: OPERATOR_GTE_FRAG;

File diff suppressed because it is too large

View File

@@ -0,0 +1,194 @@
LR_BRACKET=1
RR_BRACKET=2
LC_BRACKET=3
RC_BRACKET=4
LS_BRACKET=5
RS_BRACKET=6
COMMA=7
SEMI=8
COLON=9
DQUOTE=10
SQUOTE=11
SPACE=12
SPEC_CQL_COMMENT=13
COMMENT_INPUT=14
LINE_COMMENT=15
DOT=16
STAR=17
DIVIDE=18
MODULE=19
PLUS=20
MINUSMINUS=21
MINUS=22
K_ADD=23
K_AGGREGATE=24
K_ALL=25
K_ALL_ROLES=26
K_ALL_KEYSPACES=27
K_ALL_FUNCTIONS=28
K_ALLOW=29
K_ALTER=30
K_AND=31
K_ANY=32
K_APPLY=33
K_AS=34
K_ASC=35
K_AUTHORIZE=36
K_BATCH=37
K_BEGIN=38
K_BY=39
K_CALLED=40
K_CLUSTERING=41
K_COLUMNFAMILY=42
K_COMPACT=43
K_CONSISTENCY=44
K_CONTAINS=45
K_CREATE=46
K_CUSTOM=47
K_DELETE=48
K_DESC=49
K_DESCRIBE=50
K_DISTINCT=51
K_DROP=52
K_DURABLE_WRITES=53
K_EACH_QUORUM=54
K_ENTRIES=55
K_EXECUTE=56
K_EXISTS=57
K_FALSE=58
K_FILTERING=59
K_FINALFUNC=60
K_FROM=61
K_FULL=62
K_FUNCTION=63
K_GRANT=64
K_IF=65
K_IN=66
K_INDEX=67
K_INFINITY=68
K_INITCOND=69
K_INPUT=70
K_INSERT=71
K_INTO=72
K_IS=73
K_KEY=74
K_KEYS=75
K_KEYSPACE=76
K_LANGUAGE=77
K_LEVEL=78
K_LIMIT=79
K_LOCAL_ONE=80
K_LOCAL_QUORUM=81
K_LOGGED=82
K_LOGIN=83
K_MATERIALIZED=84
K_MODIFY=85
K_NAN=86
K_NORECURSIVE=87
K_NOSUPERUSER=88
K_NOT=89
K_NULL=90
K_OF=91
K_ON=92
K_ONE=93
K_OPTIONS=94
K_OR=95
K_ORDER=96
K_PARTITION=97
K_PASSWORD=98
K_PER=99
K_PERMISSION=100
K_PERMISSIONS=101
K_PRIMARY=102
K_QUORUM=103
K_RENAME=104
K_REPLACE=105
K_REPLICATION=106
K_RETURNS=107
K_REVOKE=108
K_SCHEMA=109
K_SELECT=110
K_SET=111
K_SFUNC=112
K_STATIC=113
K_STORAGE=114
K_STYPE=115
K_SUPERUSER=116
K_TABLE=117
K_THREE=118
K_TIMESTAMP=119
K_TO=120
K_TOKEN=121
K_TRIGGER=122
K_TRUE=123
K_TRUNCATE=124
K_TTL=125
K_TWO=126
K_TYPE=127
K_UNLOGGED=128
K_UPDATE=129
K_USE=130
K_USING=131
K_VALUES=132
K_VIEW=133
K_WHERE=134
K_WITH=135
K_WRITETIME=136
K_ASCII=137
K_BIGINT=138
K_BLOB=139
K_BOOLEAN=140
K_COUNTER=141
K_DATE=142
K_DECIMAL=143
K_DOUBLE=144
K_FLOAT=145
K_FROZEN=146
K_INET=147
K_INT=148
K_LIST=149
K_ROLES=150
K_ROLE=151
K_MAP=152
K_SMALLINT=153
K_TEXT=154
K_TIMEUUID=155
K_TIME=156
K_TINYINT=157
K_TUPLE=158
K_UUID=159
K_VARCHAR=160
K_VARINT=161
K_USERS=162
K_USER=163
CODE_BLOCK=164
STRING_LITERAL=165
DECIMAL_LITERAL=166
FLOAT_LITERAL=167
HEXADECIMAL_LITERAL=168
REAL_LITERAL=169
OBJECT_NAME=170
UUID=171
OPERATOR_EQ=172
OPERATOR_LT=173
OPERATOR_GT=174
OPERATOR_LTE=175
OPERATOR_GTE=176
'('=1
')'=2
'{'=3
'}'=4
'['=5
']'=6
','=7
';'=8
':'=9
'"'=10
'\''=11
'.'=16
'*'=17
'/'=18
'%'=19
'+'=20
'--'=21
'-'=22

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -0,0 +1,194 @@
LR_BRACKET=1
RR_BRACKET=2
LC_BRACKET=3
RC_BRACKET=4
LS_BRACKET=5
RS_BRACKET=6
COMMA=7
SEMI=8
COLON=9
DQUOTE=10
SQUOTE=11
SPACE=12
SPEC_CQL_COMMENT=13
COMMENT_INPUT=14
LINE_COMMENT=15
DOT=16
STAR=17
DIVIDE=18
MODULE=19
PLUS=20
MINUSMINUS=21
MINUS=22
K_ADD=23
K_AGGREGATE=24
K_ALL=25
K_ALL_ROLES=26
K_ALL_KEYSPACES=27
K_ALL_FUNCTIONS=28
K_ALLOW=29
K_ALTER=30
K_AND=31
K_ANY=32
K_APPLY=33
K_AS=34
K_ASC=35
K_AUTHORIZE=36
K_BATCH=37
K_BEGIN=38
K_BY=39
K_CALLED=40
K_CLUSTERING=41
K_COLUMNFAMILY=42
K_COMPACT=43
K_CONSISTENCY=44
K_CONTAINS=45
K_CREATE=46
K_CUSTOM=47
K_DELETE=48
K_DESC=49
K_DESCRIBE=50
K_DISTINCT=51
K_DROP=52
K_DURABLE_WRITES=53
K_EACH_QUORUM=54
K_ENTRIES=55
K_EXECUTE=56
K_EXISTS=57
K_FALSE=58
K_FILTERING=59
K_FINALFUNC=60
K_FROM=61
K_FULL=62
K_FUNCTION=63
K_GRANT=64
K_IF=65
K_IN=66
K_INDEX=67
K_INFINITY=68
K_INITCOND=69
K_INPUT=70
K_INSERT=71
K_INTO=72
K_IS=73
K_KEY=74
K_KEYS=75
K_KEYSPACE=76
K_LANGUAGE=77
K_LEVEL=78
K_LIMIT=79
K_LOCAL_ONE=80
K_LOCAL_QUORUM=81
K_LOGGED=82
K_LOGIN=83
K_MATERIALIZED=84
K_MODIFY=85
K_NAN=86
K_NORECURSIVE=87
K_NOSUPERUSER=88
K_NOT=89
K_NULL=90
K_OF=91
K_ON=92
K_ONE=93
K_OPTIONS=94
K_OR=95
K_ORDER=96
K_PARTITION=97
K_PASSWORD=98
K_PER=99
K_PERMISSION=100
K_PERMISSIONS=101
K_PRIMARY=102
K_QUORUM=103
K_RENAME=104
K_REPLACE=105
K_REPLICATION=106
K_RETURNS=107
K_REVOKE=108
K_SCHEMA=109
K_SELECT=110
K_SET=111
K_SFUNC=112
K_STATIC=113
K_STORAGE=114
K_STYPE=115
K_SUPERUSER=116
K_TABLE=117
K_THREE=118
K_TIMESTAMP=119
K_TO=120
K_TOKEN=121
K_TRIGGER=122
K_TRUE=123
K_TRUNCATE=124
K_TTL=125
K_TWO=126
K_TYPE=127
K_UNLOGGED=128
K_UPDATE=129
K_USE=130
K_USING=131
K_VALUES=132
K_VIEW=133
K_WHERE=134
K_WITH=135
K_WRITETIME=136
K_ASCII=137
K_BIGINT=138
K_BLOB=139
K_BOOLEAN=140
K_COUNTER=141
K_DATE=142
K_DECIMAL=143
K_DOUBLE=144
K_FLOAT=145
K_FROZEN=146
K_INET=147
K_INT=148
K_LIST=149
K_ROLES=150
K_ROLE=151
K_MAP=152
K_SMALLINT=153
K_TEXT=154
K_TIMEUUID=155
K_TIME=156
K_TINYINT=157
K_TUPLE=158
K_UUID=159
K_VARCHAR=160
K_VARINT=161
K_USERS=162
K_USER=163
CODE_BLOCK=164
STRING_LITERAL=165
DECIMAL_LITERAL=166
FLOAT_LITERAL=167
HEXADECIMAL_LITERAL=168
REAL_LITERAL=169
OBJECT_NAME=170
UUID=171
OPERATOR_EQ=172
OPERATOR_LT=173
OPERATOR_GT=174
OPERATOR_LTE=175
OPERATOR_GTE=176
'('=1
')'=2
'{'=3
'}'=4
'['=5
']'=6
','=7
';'=8
':'=9
'"'=10
'\''=11
'.'=16
'*'=17
'/'=18
'%'=19
'+'=20
'--'=21
'-'=22

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@@ -0,0 +1,23 @@
bindings:
  partition: Div(<<partitions:10>>) -> long
  cluster: Mod(<<clusters:10>>) -> long
  cycle: Identity() -> long
  asciival: NumberNameToString()
  varcharval: NumberNameToString()
  textval: NumberNameToString()
  bigintval: Identity() -> long
  blobval: HashedToByteBuffer(1024)
  booleanval: ModuloToBoolean()
  decimalval: ModuloToBigDecimal(9223372036854775807L)
  doubleval: Pareto(1.161,5.0) -> double
  floatval: compose Normal(15.0,3.0); ToFloat();
  inetval: ToInetAddress()
  intval: ToInt()
  timestampval: ToDate()
  uuidval: ToEpochTimeUUID('2017-01-01 23:59:59')
  timeuuidval: ToFinestTimeUUID('2017-12-31 23:59:59',123,456) # tuuid node data 123 and tuuid clock data 456
  varintval: ModuloToBigInt(50000) -> BigInteger
  tinyintval: LongToByte()
  smallintval: LongToShort()
  dateval: LongToLocalDateDays()
  timeval: Identity()

View File

@@ -0,0 +1,86 @@
description: |
  Use case for

scenarios:
  default:
    - run driver=cql tags==phase:schema threads==1 cycles==UNDEF
    - run driver=cql tags==phase:rampup cycles===TEMPLATE(rampup-cycles,10000000) threads=1000
    - run driver=cql tags==phase:main cycles===TEMPLATE(main-cycles,10000000) threads=1000
  main:
    - run driver=cql tags==phase:main cycles===TEMPLATE(main-cycles,10000000) threads=1000

bindings:
  version: Hash(); Identity() -> long

blocks:
  - tags:
      phase: schema
    params:
      prepared: false
    statements:
      - create-table: |
          CREATE TABLE IF NOT EXISTS <<keyspace:test>>.a (
            a text,
            PRIMARY KEY (a)
          );
  - tags:
      phase: rampup
    params:
      cl: <<write_cl:LOCAL_QUORUM>>
      instrument: true
    statements:
      - insert-rampup: |
          insert into <<keyspace:test>>.a
          (
            a
          )
          VALUES
          (
            {a}
          )
        idempotent: true
  - tags:
      phase: main
      type: write
    params:
      ratio: <<write_ratio:1>>
      cl: <<write_cl:LOCAL_QUORUM>>
      instrument: true
    statements:
      - insert-main: |
          insert into <<keyspace:test>>.a
          (
            a
          )
          VALUES
          (
            {a}
          )
  - tags:
      phase: main
      type: read
    params:
      ratio: <<read_partition_ratio:1>>
      cl: <<read_cl:LOCAL_QUORUM>>
      instrument: true
    statements:
      - read-partition: |
          select * from <<keyspace:test>>.a
          where
          a = {a}
          limit <<limit:1000>>;
  - tags:
      phase: main
      type: read
    params:
      ratio: <<read_row_ratio:1>>
      cl: <<read_cl:LOCAL_QUORUM>>
      instrument: true
    statements:
      - read-row: |
          select * from <<keyspace:test>>.a
          where
          a= {a};

View File

@@ -0,0 +1,32 @@
<template>
<div class="d-flex align-center ma-2 pa-1 text-center">
<v-btn to="/ui/build/" title="Build a workload from a schema">Build</v-btn>
<v-btn to="/ui/run/" title="Run a workload">Run</v-btn>
<v-btn to="/ui/watch/" title="Watch workload status">Watch</v-btn>
<v-btn to="/docs/" title="Documentation">Docs</v-btn>
<v-btn
title="Give us your feedback!"
href="https://github.com/nosqlbench/nosqlbench/wiki/Submitting-Feedback">
<v-icon>mdi-lightbulb-on-outline</v-icon>
</v-btn>
</div>
</template>
<script>
export default {
name: 'app-selector',
data() {
let data = {
empty: [],
apps: ['build', 'run', 'status', 'workspaces'],
thisapp: 'build'
};
return data;
}
}
</script>
<style>
</style>

View File

@@ -1,133 +1,148 @@
<template>
<v-navigation-drawer v-model="drawer" :permanent="lockmenu" @transitionend="setDrawer" app class="secondary">
<v-navigation-drawer app
v-model="isDrawerOpen"
:permanent="isMenuLocked"
@transitionend="toggleDrawerOpened"
:title="drawerTitle">
<div class="menu">
<!-- active_category: {{active_category}} active_topic: {{active_topic}}-->
<div class="menu">
<!-- Use active_category and active_topic to select inactive -->
<!-- active_category: {{ active_category.name }} active_topic: {{ active_topic.name }}-->
<!-- Use active_category and active_topic to select inactive -->
<v-list nav dense>
<v-list nav dense>
<!-- KEEP OPEN -->
<v-list-item v-if="$vuetify.breakpoint.mdAndDown">
<v-list-item-action>
<v-switch inset dark color="accent" v-model="lockmenu" label="keep open"
@change="setLockMenu"></v-switch>
</v-list-item-action>
</v-list-item>
<!-- KEEP OPEN -->
<v-list-item>
<!-- <v-list-item v-if="$vuetify.breakpoint.mdAndDown">-->
<v-list-item-action>
<v-switch inset v-model="isMenuLocked" label="keep open"
@change="toggleMenuLocked"></v-switch>
</v-list-item-action>
</v-list-item>
<!-- by category -->
<v-list-group v-for="(category,c) in categories" :key="c"
:value="active_category === category.category" active-class="isactive">
<!-- <nuxt-link :to="{path: category.category}">foo</nuxt-link>-->
<!-- <router-link :to="{path: category.category+'.html'}">-->
<template v-slot:activator>
<v-list-item-content>
<v-list-item-title @click="$nuxt.$router.push({path: category.category})">
{{category.categoryName}}
</v-list-item-title>
</v-list-item-content>
</template>
<!-- </router-link>-->
<v-list-item v-for="(doc, i) in category.docs" :key="i" link :to="doc.filename">
<!-- <router-link :to="{ name: 'docs-slug', params: {lockmenu:lockmenu}}">-->
<!-- <router-link :to="{ path: doc.filename}">-->
<v-list-item-title @click="$nuxt.$router.push({path: doc.filename})">{{doc.attributes.title}}</v-list-item-title>
<!-- </router-link>-->
</v-list-item>
<!-- by category -->
<v-list-group v-for="(category,c) in categories" :key="c"
:value="active_category.title === category.title" active-class="isactive">
<template v-slot:activator>
<v-list-item-content>
<v-list-item-title link :to="category.path">{{ category.title }}</v-list-item-title>
</v-list-item-content>
</template>
</v-list-group>
</v-list>
<!-- by topic -->
<v-list-item v-for="(topic, i) in category.topics" :key="i" link :to="topic.path">
<v-list-item-title>{{ topic.title }}</v-list-item-title>
</v-list-item>
</div>
</v-navigation-drawer>
</v-list-group>
</v-list>
</div>
</v-navigation-drawer>
</template>
<script>
export default {
name: 'DocsMenu',
props: {
categories: Array,
active_category: String,
active_topic: String
},
data(context) {
let lockmenu = this.$store.state.docs.isDrawerPinned;
if (lockmenu == null) {
lockmenu = false
}
console.log("data says context is: " + context);
// console.log("g says" + this.$getters.docs.drawerState);
return {
lockmenu: lockmenu,
drawer: null
}
},
methods: {
setLockMenu() {
this.$store.commit('docs/setMenuLock', this.lockmenu);
},
setDrawer() {
this.$store.commit('docs/setDrawer', this.drawer);
}
},
created() {
this.setDrawer();
this.$store.subscribe((mutation, state) => {
console.log("mutation type " + mutation.type);
if (mutation.type === 'docs/toggleDrawerState') {
this.drawer = this.$store.state.docs.isDrawerOpen;
} else if (mutation.type === 'docs/setMenuLock') {
this.lockmenu = this.$store.state.docs.isDrawerPinned;
}
});
}
// computed: {
// drawerState() {
// console.log("getting drawerState...");
// return this.$store.getters.drawerState;
// }
// },
export default {
name: 'DocsMenu',
props: {
categories: Array,
active_category: Object,
active_topic: Object
},
data() {
let drawer = null;
return {
drawer
}
},
computed: {
drawerTitle: {
get() {
return "category=" + this.active_category.name
+"\ntopic=" + this.active_topic.name
}
},
isMenuLocked: {
get() {
return this.$store.getters["docs/getIsMenuLocked"]
}
},
isDrawerOpen: {
get() {
return this.$store.getters["docs/getIsDrawerOpen"]
},
set(val) {
this.$store.dispatch("docs/setIsDrawerOpen", val)
}
}
},
methods: {
clickCategory(category) {
this.$store.dispatch("docs/setActiveCategory", category)
this.$nuxt.$router.push({path: category.path})
},
clickTopic(category, topic) {
this.$store.dispatch("docs/setActiveCategory", category)
this.$store.dispatch("docs/setActiveTopic", topic)
this.$nuxt.$router.push({path: topic.path})
},
toggleDrawerOpened() {
this.drawer = !this.drawer;
},
toggleMenuLocked() {
this.$store.dispatch("docs/setIsMenuLocked", !this.$store.getters["docs/getIsMenuLocked"])
},
setIsDrawerOpen() {
this.$store.commit('docs/setIsDrawerOpen', this.drawer);
}
},
async created() {
await this.$store.dispatch("docs/loadCategories");
}
}
</script>
<style>
.v-list-item {
color: #FFFFFF;
}
/*.v-list-item {*/
/* color: #FFFFFF;*/
/*}*/
.v-list-item--title {
color: #FFFFFF;
}
/*.v-list-item--title {*/
/* color: #FFFFFF;*/
/*}*/
.v-list-item--active {
color: #FFFFFF !important;
}
.v-list-item--disabled {
color: #DDDDDD !important;
}
.v-list-item--active {
/*border: 1px black;*/
background-color: #EEEEEE;
/*color: #FFFFFF !important;*/
}
div.theme--light.v-list-item:not(.v-list-item--active):not(.v-list-item--disabled) {
color: #DDDDDD !important;
}
a.theme--light.v-list-item:not(.v-list-item--active):not(.v-list-item--disabled) {
color: #DDDDDD !important;
}
/*.v-list-item--disabled {*/
/* color: #DDDDDD !important;*/
/*}*/
.nuxt-link-exact-active {
/*color: #52c41a;*/
background-color: #7F828B;
color: #52c41a;
/*color: #000000;*/
}
/*div.theme--light.v-list-item:not(.v-list-item--active):not(.v-list-item--disabled) {*/
/* color: #DDDDDD !important;*/
/*}*/
/*a.theme--light.v-list-item:not(.v-list-item--active):not(.v-list-item--disabled) {*/
/* color: #DDDDDD !important;*/
/*}*/
.isactive {
background-color: #7F828B;
color: #52c41a;
}
/*.nuxt-link-exact-active {*/
/* !*color: #52c41a;*!*/
/* background-color: #7F828B;*/
/* color: #52c41a;*/
/* !*color: #000000;*!*/
/*}*/
.router-link-active {
background-color: #FFFFFF;
color: #FFFFFF;
/*.isactive {*/
/* background-color: #7F828B;*/
/* color: #52c41a;*/
/*}*/
}
/*.router-link-active {*/
/* background-color: #FFFFFF;*/
/* color: #FFFFFF;*/
/*}*/
</style>

View File

@@ -0,0 +1,25 @@
<template>
<v-app-bar app fluid>
<!-- <v-app-bar app dark fluid dense flat>-->
<v-toolbar-title><slot></slot></v-toolbar-title>
<v-spacer></v-spacer>
<v-toolbar-items>
<app-selector></app-selector>
<workspace-selector></workspace-selector>
</v-toolbar-items>
</v-app-bar>
</template>
<script>
import AppSelector from "@/components/AppSelector";
import WorkspaceSelector from "@/components/WorkspaceSelector";
export default {
name: "MainAppBar",
components: {WorkspaceSelector, AppSelector}
}
</script>
<style scoped>
</style>

View File

@@ -1,149 +1,197 @@
<template>
<div>
<markdown-it-vue class="md-body" :content="mdcontent" :options="mdoptions"/>
</div>
<!-- <v-main fluid class="d-block">-->
<!-- <markdown-it-vue class="d-block" :content="mdcontent" :options="mdoptions"/>-->
<markdown-it-vue class="md-body" ref="myMarkdownItVue" :content="mdcontent" :options="options"/>
<!-- </v-main>-->
</template>
<script>
import MarkdownItVue from 'markdown-it-vue'
// import 'markdown-it-vue/dist/markdown-it-vue.css'
// var hljs = require('highlight.js');
import MarkdownItVue from 'markdown-it-vue'
// https://www.npmjs.com/package/markdown-it-replace-link
// https://www.npmjs.com/package/markdown-it-relativelink
// import 'markdown-it-link-attributes'
// import 'markdown-it-relativelink'
// let mireli = require('markdown-it-relativelink')({prefix: 'http://example.com/'})
// let msa = require('markdown-it-smartarrows');
// this.use(msa);
import 'markdown-it-replace-link'
export default {
name: "MarkdownVue",
components: {MarkdownItVue},
mounted() {
console.log("mounted");
},
let mireplink = require('markdown-it-replace-link')
props: {
mdcontent: String,
mdoptions: Object
export default {
name: "MarkdownVue",
components: {MarkdownItVue},
mounted() {
let mmd = this.$refs.myMarkdownItVue;
mmd.use(mireplink);
// mdoptions: {
// type: Object,
// default: {
// markdownIt: {
// linkify: true
// },
// linkAttributes: {
// attrs: {
// target: '_blank',
// rel: 'noopener'
// }
// }
// }
// }
}
// mmd.use(mirl)
// console.log("typeof(mmd):" + typeof (mmd))
},
// data() {
// let options = {
// markdownIt: {
// linkify: true
// },
// linkAttributes: {
// attrs: {
// target: '_blank',
// rel: 'noopener'
// }
// },
// replaceLink: function(link, env) {
// return "LINK:" + link;
// }
// }
// return {options}
// },
props: {
mdcontent: String,
options: {
testing: "one two three"
}
}
}
</script>
<style>
code.hljs::before,code:before {
content: '';
}
.v-application code:before,.v-application code:after {
content: '';
}
.v-application code {
background-color: #E3FFE3;
/*background-color: #FBE5E1;*/
padding: unset;
/*padding: 10px;*/
color: unset;
}
code::before {
content: '';
}
.markdown-body pre {
/*background-color: #52c41a;*/
padding: 10px 10px 10px 0;
}
pre {
padding: 5px;
margin: 0px 0px 10px 5px;
}
ul {
padding-bottom: 10px;
}
div.markdown-body p {
padding: 10px 0 0 0;
}
code:after {
content: '';
}
code::after {
content: '';
}
.hljs code {
background-color: #E3FFE3;
padding: unset;
color: unset;
/*padding: 100px;*/
}
.md-body {
padding-left: 10px;
justify-content: left;
align-items: left;
text-align: left;
vertical-align: top;
alignment: top;
text-align: top;
webkit-box-align: initial;
}
pre {
background-color: #E3FFE3;
padding: unset;
color: unset;
}
.markdown-it-vue-alter-info {
border: 1px solid #91d5ff;
background-color: #e6f7ff;
}
.markdown-body {
padding: 15px;
}
.markdown-it-vue-alert-icon-info {
color: #1890ff;
}
/* code.hljs::before,code:before {*/
/* content: '';*/
/* }*/
/* .v-application code:before,.v-application code:after {*/
/* content: unset;*/
/* }*/
.markdown-it-vue-alter-success {
border: 1px solid #b7eb8f;
background-color: #f6ffed;
}
/* code::before {*/
/* content: '';*/
/* }*/
/* code:after {*/
/* content: '';*/
/* }*/
.markdown-it-vue-alert-icon-success {
color: #52c41a;
}
/* pre {*/
/* padding: 5px;*/
/* margin: 0px 0px 10px 5px;*/
/* }*/
/* ul {*/
/* padding-bottom: 10px;*/
/* }*/
.markdown-it-vue-alter-error {
border: 1px solid #f5222d;
background-color: #fff1f0;
}
/* code::after {*/
/* content: '';*/
/* }*/
.markdown-it-vue-alert-icon-error {
color: #f5222d;
}
/* .md-body {*/
/* padding-left: 10px;*/
/* justify-content: left;*/
/* align-items: left;*/
/* text-align: left;*/
/* vertical-align: top;*/
/* alignment: top;*/
/* text-align: top;*/
/* webkit-box-align: initial;*/
/* }*/
.markdown-it-vue-alter-warning {
border: 1px solid #ffe58f;
background-color: #fffbe6;
}
.markdown-it-vue-alter-info {
border: 1px solid #91d5ff;
background-color: #e6f7ff;
}
.markdown-it-vue-alert-icon-warning {
color: #faad14;
}
.markdown-it-vue-alert-icon-info {
color: #1890ff;
}
.markdown-it-vue-alter {
border-radius: 0;
border: 0;
margin-bottom: 0;
display: inline-flex;
font-family: 'Chinese Quote', -apple-system, BlinkMacSystemFont, 'Segoe UI',
'PingFang SC', 'Hiragino Sans GB', 'Microsoft YaHei', 'Helvetica Neue',
Helvetica, Arial, sans-serif, 'Apple Color Emoji', 'Segoe UI Emoji',
'Segoe UI Symbol';
font-size: 14px;
font-variant: tabular-nums;
line-height: 1.5;
color: rgba(0, 0, 0, 0.65);
box-sizing: border-box;
padding: 0;
list-style: none;
position: relative;
padding: 8px 15px 8px 37px;
border-radius: 4px;
width: 100%;
margin-bottom: 16px;
}
.markdown-it-vue-alter-success {
border: 1px solid #b7eb8f;
background-color: #f6ffed;
}
.markdown-it-vue-alter p {
margin-bottom: 2px;
}
.markdown-it-vue-alert-icon-success {
color: #52c41a;
}
.markdown-it-vue-alert-icon {
top: 11.5px;
left: 16px;
position: absolute;
}
.markdown-it-vue-alter-error {
border: 1px solid #f5222d;
background-color: #fff1f0;
}
.markdown-it-vue-alert-icon-error {
color: #f5222d;
}
.markdown-it-vue-alter-warning {
border: 1px solid #ffe58f;
background-color: #fffbe6;
}
.markdown-it-vue-alert-icon-warning {
color: #faad14;
}
.markdown-it-vue-alter {
border-radius: 0;
border: 0;
margin-bottom: 0;
display: inline-flex;
font-family: 'Chinese Quote', -apple-system, BlinkMacSystemFont, 'Segoe UI',
'PingFang SC', 'Hiragino Sans GB', 'Microsoft YaHei', 'Helvetica Neue',
Helvetica, Arial, sans-serif, 'Apple Color Emoji', 'Segoe UI Emoji',
'Segoe UI Symbol';
font-size: 14px;
font-variant: tabular-nums;
line-height: 1.5;
color: rgba(0, 0, 0, 0.65);
box-sizing: border-box;
padding: 0;
list-style: none;
position: relative;
padding: 8px 15px 8px 37px;
border-radius: 4px;
width: 100%;
margin-bottom: 16px;
}
.markdown-it-vue-alter p {
margin-bottom: 2px;
}
.markdown-it-vue-alert-icon {
top: 11.5px;
left: 16px;
position: absolute;
}
</style>

View File

@@ -0,0 +1,94 @@
<template>
<!-- <v-container fluid class="d-flex pa-3">-->
<v-row>
<v-col cols="12">
<v-text-field dense
full-width
label="Name of new workspace"
v-if="mode==='adding'"
v-model="new_workspace"
ref="new_workspace_input"
hint="workspace name"
@blur="commitWorkspace(new_workspace)"
@keydown.enter="commitWorkspace(new_workspace)"
@keydown.esc="cancelWorkspace()"
></v-text-field>
<!-- label="workspace"-->
<v-select dense outlined
full-width
hide-details="true"
hint="current workspace"
v-if="mode==='showing'"
v-model="workspace"
:items="workspaces"
item-text="name"
item-value="name"
prepend-inner-icon="mdi-folder"
title="active workspace"
>
<template v-slot:append-item>
<v-list-item>
<v-btn link @click="addWorkspace()">+ Add Workspace</v-btn>
<v-spacer></v-spacer>
<v-btn to="/ui/workspaces">Manage</v-btn>
</v-list-item>
</template>
</v-select>
</v-col>
</v-row>
<!-- </v-container>-->
</template>
<script>
export default {
name: 'workspace-selector',
data() {
let mode = "showing";
let new_workspace = "";
return {mode, new_workspace}
},
computed: {
workspace: {
get() {
return this.$store.getters["workspaces/getWorkspace"]
},
set(val) {
this.$store.dispatch("workspaces/setWorkspace", val)
}
},
workspaces: {
get() {
return this.$store.getters["workspaces/getWorkspaces"]
},
set(val) {
this.$store.dispatch("workspaces/setWorkspaces", val)
}
}
},
methods: {
cancelWorkspace: function () {
this.mode = "showing";
this.new_workspace = "";
},
addWorkspace: function () {
this.mode = "adding";
// this.$refs.new_workspace_input.focus();
this.$nextTick(() => {
this.$refs.new_workspace_input.focus();
});
},
commitWorkspace: function ({$store}) {
// console.log("commit:" + JSON.stringify(this.new_workspace));
this.$store.dispatch("workspaces/activateWorkspace", this.new_workspace);
this.new_workspace = "";
this.mode = "showing";
}
},
created() {
console.log("created component...");
this.$store.dispatch('workspaces/initWorkspaces', "selector load");
}
}
</script>
<style>
</style>
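
For context, a skeletal sketch of the workspaces store module this selector assumes; only the getter and action names are taken from the component above, and the bodies are placeholders rather than the real implementation.

// store/workspaces.js -- shape inferred from the selector above; bodies are placeholders.
export const state = () => ({
  workspace: 'default',
  workspaces: []
})

export const getters = {
  getWorkspace: state => state.workspace,
  getWorkspaces: state => state.workspaces
}

export const mutations = {
  setWorkspace(state, name) { state.workspace = name },
  setWorkspaces(state, list) { state.workspaces = list }
}

export const actions = {
  setWorkspace({commit}, name) { commit('setWorkspace', name) },
  setWorkspaces({commit}, list) { commit('setWorkspaces', list) },
  initWorkspaces({commit}) { /* placeholder: load the workspace list from the services endpoint */ },
  activateWorkspace({commit}, name) { commit('setWorkspace', name) /* placeholder: create it if missing */ }
}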

View File

@@ -0,0 +1,51 @@
/**
* Path scheme:
* /docs/cat1/t1/t2/t3.md
* _____ docs prefix
* / delimiter
* ____ category name
* / delimiter
* ___________ topic path, including separators
* ________ topic name, without extension
*
* /docs/cat1/index.md summary doc for cat1
* ______________________ topic path
* __________ category path
*/
export default {
getCategory(route, categories) {
let active_category = categories[0];
if (!route.path) {
active_category = categories[0];
} else {
let parts = route.path.split(/\/|%2F/)
if (parts[1]!=="docs" || parts.length===1) {
throw "invalid path for docs: '" + route.path + "' parts[0]=" + parts[0] + " parts=" + JSON.stringify(parts,null,2)
}
if (parts.length>2) {
let found = categories.find(x => x.name === parts[2]);
if (found) {
active_category = found;
}
}
}
return active_category;
},
getTopic(route, categories, active_category) {
let active_topic = active_category.summary_topic;
if (!route.path) {
active_topic = active_category.topics[0]
} else {
let found_topic = active_category.topics.find(x => {return x.path === route.path})
if (found_topic) {
active_topic = found_topic;
}
}
return active_topic;
}
}
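
A minimal usage sketch for the two helpers above, assuming a Nuxt-style route object and a categories array shaped like the one the docs page builds (name, topics, summary_topic); the sample data is illustrative only.

import docpaths from '@/js/docpaths.js'

// Illustrative data; the real categories come from the docs store.
const categories = [
  {
    name: 'getting_started',
    summary_topic: {path: '/docs/getting_started/index.md', content: '# Getting Started'},
    topics: [
      {path: '/docs/getting_started/install.md', content: '# Install'}
    ]
  }
]

const route = {path: '/docs/getting_started/install.md'}

const category = docpaths.getCategory(route, categories) // matched on parts[2] === 'getting_started'
const topic = docpaths.getTopic(route, categories, category) // matched on topic.path === route.path
console.log(topic.path) // /docs/getting_started/install.md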

View File

@@ -0,0 +1,42 @@
/**
* Path scheme:
* /docs/cat1/t1/t2/t3.md
* _____ docs prefix
* / delimiter
* ____ category name
* / delimiter
* ___________ topic path, including separators
* ________ topic name, without extension
*
* /docs/cat1/index.md summary doc for cat1
* ______________________ topic path
* __________ category path
*/
export default {
url(document, context, path) {
console.log("isDev=" + context.isDev)
console.log("document.location=" + JSON.stringify(document.location, null, 2))
let origin = document.location.origin;
let base = origin;
if (origin.includes(":3003")) {
base = origin.replace(":3003", ":12345")
}
console.log("base=" + base);
let url = base + path;
console.log("url=" + url);
return url;
},
localize(body, baseurl) {
// console.log("localize([body],baseurl='" + baseurl + "'")
// const regex = /\[([^/][^]]+)]/;
// let remaining = body;
//
// while (remaining.)
//
return body;
}
}
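
A quick sketch of how the url() helper above resolves a services path, assuming the UI dev server runs on :3003 and the NoSQLBench services on :12345 (the ports the replacement implies). The document and context objects are mocked here, and the import path is an assumption since the file name is not shown.

// '@/js/urls.js' is an assumed path for the helper above; adjust to the actual file name.
import urls from '@/js/urls.js'

// Mocked browser document and Nuxt context, for illustration only.
const doc = {location: {origin: 'http://localhost:3003'}}
const ctx = {isDev: true}

// A dev origin on :3003 is remapped to the services port :12345.
urls.url(doc, ctx, '/services/docs/markdown.csv')
// -> 'http://localhost:12345/services/docs/markdown.csv'

// Any other origin passes through unchanged.
urls.url({location: {origin: 'https://example.com'}}, {isDev: false}, '/services/docs/markdown.csv')
// -> 'https://example.com/services/docs/markdown.csv'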

View File

@@ -3,53 +3,57 @@
<nuxt />
</div>
</template>
<script>
export default {
name: "DefaultLayout"
}
</script>
<style>
html {
font-family: 'Source Sans Pro', -apple-system, BlinkMacSystemFont, 'Segoe UI',
Roboto, 'Helvetica Neue', Arial, sans-serif;
font-size: 16px;
word-spacing: 1px;
-ms-text-size-adjust: 100%;
-webkit-text-size-adjust: 100%;
-moz-osx-font-smoothing: grayscale;
-webkit-font-smoothing: antialiased;
box-sizing: border-box;
}
/*html {*/
/* font-family: 'Source Sans Pro', -apple-system, BlinkMacSystemFont, 'Segoe UI',*/
/* Roboto, 'Helvetica Neue', Arial, sans-serif;*/
/* font-size: 16px;*/
/* word-spacing: 1px;*/
/* -ms-text-size-adjust: 100%;*/
/* -webkit-text-size-adjust: 100%;*/
/* -moz-osx-font-smoothing: grayscale;*/
/* -webkit-font-smoothing: antialiased;*/
/* box-sizing: border-box;*/
/*}*/
*,
*:before,
*:after {
box-sizing: border-box;
margin: 0;
}
/**,*/
/**:before,*/
/**:after {*/
/* box-sizing: border-box;*/
/* margin: 0;*/
/*}*/
.button--purple {
display: inline-block;
border-radius: 4px;
border: 1px solid #0C1439;
color: #0C1439;
text-decoration: none;
padding: 10px 30px;
}
/*.button--purple {*/
/* display: inline-block;*/
/* border-radius: 4px;*/
/* border: 1px solid #0C1439;*/
/* color: #0C1439;*/
/* text-decoration: none;*/
/* padding: 10px 30px;*/
/*}*/
.button--purple:hover {
color: #fff;
background-color: #0C1439;
}
/*.button--purple:hover {*/
/* color: #fff;*/
/* background-color: #0C1439;*/
/*}*/
.button--grey {
display: inline-block;
border-radius: 4px;
border: 1px solid #35495e;
color: #35495e;
text-decoration: none;
padding: 10px 30px;
margin-left: 15px;
}
/*.button--grey {*/
/* display: inline-block;*/
/* border-radius: 4px;*/
/* border: 1px solid #35495e;*/
/* color: #35495e;*/
/* text-decoration: none;*/
/* padding: 10px 30px;*/
/* margin-left: 15px;*/
/*}*/
.button--grey:hover {
color: #fff;
background-color: #35495e;
}
/*.button--grey:hover {*/
/* color: #fff;*/
/* background-color: #35495e;*/
/*}*/
</style>

View File

@@ -1,5 +1,5 @@
<template>
<v-app dark>
<v-app>
<h1 v-if="error.statusCode === 404">
{{ pageNotFound }}
</h1>

View File

@@ -1,188 +0,0 @@
// asyncData in multiple mixins seems to be broken, or worse, working as designed
export default {
async asyncData(context) {
function fetchStatusHandler(response) {
if (response.status === 200) {
return response;
} else {
throw new Error(response.statusText);
}
}
if (context.req) {
console.log("avoiding server-side async");
return;
}
let baseurl = document.location.href.split('/').slice(0,3).join('/');
if (context.isDev && baseurl.includes(":3000")) {
console.log("Dev mode: remapping 3000 to 12345 for split dev environment.");
baseurl = baseurl.replace("3000","12345");
}
let services = baseurl + "/services";
console.log("async loading get_categories data: context: " + context);
var fm = require('front-matter');
let paths = await fetch(services+"/docs/markdown.csv")
.then(res => {
return res.text()
})
.then(body => {
return body.split("\n")
})
.catch(err => {
console.log("error:" + err)
});
let imports = [];
let promises = [];
for (let index in paths) {
let key = paths[index];
if (key == null || key == "") {
continue
}
const [, name] = key.match(/(.+)\.md$/);
let detailName = key.split("/").filter(x => x.includes(".md"))[0];
detailName = detailName.substr(0, detailName.length - 3);
let categories = key.split("/").filter(x => !x.includes("."))
//const mdMeta = resolve(key);
promises.push(fetch(services + "/docs/markdown/" + key)
.then(res => res.text())
.then(body => {
return {
"rawMD": body,
"detailName": detailName,
"categories": categories,
"name": name
}
}));
}
var mdData = await Promise.all(
promises
);
for(var data of mdData){
let rawMD = data.rawMD;
var mdMeta = fm(rawMD);
if (mdMeta.attributes == null || mdMeta.attributes.title == null) {
mdMeta.attributes.title = data.detailName;
}
if (typeof mdMeta.attributes.weight === 'undefined') {
mdMeta.attributes.weight = 0;
}
mdMeta.categories = data.categories;
mdMeta.filename = encodeURIComponent(data.name);
//console.log("mdMeta:" + JSON.stringify(mdMeta));
imports.push(mdMeta);
}
const categorySet = new Set();
imports.forEach(x => {
categorySet.add(x.categories.toString())
});
const categories = Array.from(categorySet).map(category => {
let docs = imports.filter(x => x.categories.toString() === category);
let summarydoc = docs.find(x => x.filename.endsWith('index'));
docs = docs.filter(x=>!x.filename.endsWith('index'));
docs.forEach(d => delete d.body);
docs.sort((a, b) => a.attributes.weight - b.attributes.weight);
let weight = summarydoc ? summarydoc.attributes.weight : 0;
let categoryName = summarydoc ? ( summarydoc.attributes.title ? summarydoc.attributes.title : category ) : category;
return {category, categoryName, docs, summarydoc, weight}
}
).sort((c1,c2) => c1.weight - c2.weight);
let active_category='';
let active_category_name='';
let active_topic='';
// IFF no category was active, then make the first category active.
if (!context.params.slug) {
console.log("params.slug was not defined");
active_category=categories[0].category;
active_category_name=categories[0].categoryName;
if (categories[0].summarydoc == null && categories[0].docs.length>0) {
active_topic = categories[0].docs[0].filename;
}
} else {
let parts = context.params.slug.split("/",2);
active_category=parts[0];
console.log("==> params.slug[" + context.params.slug + "] active_category[" + active_category + "]");
active_topic = parts.length>1 ? parts[1] : null;
}
if (active_topic !== null && active_topic.endsWith(".html")) {
active_topic = active_topic.substr(0,active_topic.length-5);
}
if (active_topic !== null && active_topic.endsWith(".md")) {
active_topic = active_topic.substr(0,active_topic.length-3);
}
if (active_category !== null && active_category.endsWith(".html")) {
active_category = active_category.substr(0,active_category.length-5);
}
if (active_category !== null && active_category.endsWith(".md")) {
active_category = active_category.substr(0,active_category.length-3);
}
let foundCategory = categories.find(c => c.category === active_category);
if (foundCategory != undefined){
active_category_name = categories.find(c => c.category === active_category).categoryName;
}
console.log("==> active category[" + active_category + "] topic["+ active_topic +"]");
// At this point, we have an active category or even a topic.
// We're all in on loading markdown, but which one?
let docname = active_category;
if (active_topic) {
docname += '/' + active_topic + '.md';
} else {
docname += '/' + 'index.md';
}
console.log("docname: " + docname);
var fm = require('front-matter');
let docbody = "";
let mdPath = services + '/docs/markdown/' + docname;
let rawMD = await fetch(services + "/docs/markdown/" + docname)
.then(fetchStatusHandler)
.then(res => res.text())
.then(body => docbody = body)
.catch(function(error) {
console.log(error);
});;
var markdown = fm(rawMD);
// console.log("markdown_body:\n" + markdown.body);
let mydata = {
markdown_attr: markdown.attributes,
markdown_body: markdown.body,
categories: categories,
active_category: active_category,
active_category_name : active_category_name,
active_topic: active_topic
};
return mydata;
}
}

View File

@@ -1,52 +1,52 @@
// asyncData in multiple mixins seems to be broken, or worse, working as designed
export default {
async asyncData(context) {
// if (context.req) {
// console.log("avoiding server-side async");
// return;
// }
let baseurl = document.location.href.split('/').slice(0,3).join('/');
if (context.isDev && baseurl.includes(":3000")) {
console.log("Dev mode: remapping 3000 to 12345 for split dev environment.");
baseurl = baseurl.replace("3000","12345");
}
let services = baseurl + "/services";
// console.log("async loading get_categories data: context: " + context);
var fm = require('front-matter');
let namespaces_endpoint = services + "/docs/namespaces";
let namespaces = await context.$axios.$get(namespaces_endpoint);
// let namespaces = await fetch(services+"/docs/namespaces")
// .then(response => {
// return response.json()
// })
// .catch(err => {
// console.log("error:" + err)
// });
const collated = Array();
for (let ena in namespaces) {
for (let ns in namespaces[ena]) {
collated.push({
namespace: ns,
show: (ena==="enabled"),
paths: namespaces[ena]
});
console.log ("ns:"+ ns + ", ena: " + ena);
}
// namespaces[ena].forEach(e => {e.isEnabled = (ena === "enabled")});
// collated=collated.concat(namespaces[ena])
}
let result={namespaces: collated};
console.log("namespaces result:"+JSON.stringify(result));
return result;
}
}
// // asyncData in multiple mixins seems to be broken, or worse, working as designed
// export default {
// async asyncData(context) {
//
// // if (context.req) {
// // console.log("avoiding server-side async");
// // return;
// // }
//
// let baseurl = document.location.href.split('/').slice(0,3).join('/');
//
// if (context.isDev && baseurl.includes(":3000")) {
// console.log("Dev mode: remapping 3000 to 12345 for split dev environment.");
// baseurl = baseurl.replace("3000","12345");
// }
//
// let services = baseurl + "/services";
//
// // console.log("async loading get_categories data: context: " + context);
// var fm = require('front-matter');
//
// let namespaces_endpoint = services + "/docs/namespaces";
//
// let namespaces = await context.$axios.$get(namespaces_endpoint);
// // let namespaces = await fetch(services+"/docs/namespaces")
// // .then(response => {
// // return response.json()
// // })
// // .catch(err => {
// // console.log("error:" + err)
// // });
//
// const collated = Array();
// for (let ena in namespaces) {
// for (let ns in namespaces[ena]) {
// collated.push({
// namespace: ns,
// show: (ena==="enabled"),
// paths: namespaces[ena]
// });
// console.log ("ns:"+ ns + ", ena: " + ena);
// }
// // namespaces[ena].forEach(e => {e.isEnabled = (ena === "enabled")});
// // collated=collated.concat(namespaces[ena])
// }
//
// let result={namespaces: collated};
// // console.log("namespaces result:"+JSON.stringify(result));
// return result;
// }
//
// }
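
As a standalone sketch, the collation step in the new asyncData above reduces to the loop below; the payload shape for /services/docs/namespaces (groups keyed by enabled/disabled, each mapping namespace names to their doc paths) is an assumption based on how the loop reads it.

// Assumed shape of the /services/docs/namespaces response.
const namespaces = {
  enabled: {'drivers-cql': ['cql/cql.md'], 'core': ['core/index.md']},
  disabled: {'drivers-mongodb': ['mongodb/mongodb.md']}
}

// Same collation as in asyncData(): one entry per namespace, flagged by its group.
const collated = []
for (const ena in namespaces) {
  for (const ns in namespaces[ena]) {
    collated.push({namespace: ns, show: ena === 'enabled', paths: namespaces[ena]})
  }
}

console.log(collated.map(c => `${c.namespace}:${c.show}`))
// [ 'drivers-cql:true', 'core:true', 'drivers-mongodb:false' ]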

View File

@@ -46,30 +46,33 @@ export default {
'@nuxtjs/axios'
],
axios: {
baseURL: "http://localhost:12345/",
port: 12345,
__browserBaseURL: 'http://localhost:12345/services/',
baseUrl: '/services/',
progress: true
},
/*
** vuetify module configuration
** https://github.com/nuxt-community/vuetify-module
*/
vuetify: {
theme: {
dark: false,
themes: {
light: {
primary: '#51DDBD',
secondary: '#2D4ADE',
accent: '#FA7D2B',
// primary: '#1976D2',
// secondary: '#424242',
// accent: '#82B1FF',
error: '#FF5252',
info: '#2196F3',
success: '#4CAF50',
warning: '#FFC107'
}
}
}
// theme: {
// dark: false,
// themes: {
// light: {
// primary: '#51DDBD',
// secondary: '#2D4ADE',
// accent: '#FA7D2B',
// // primary: '#1976D2',
// // secondary: '#424242',
// // accent: '#82B1FF',
// error: '#FF5252',
// info: '#2196F3',
// success: '#4CAF50',
// warning: '#FFC107'
// }
// }
// }
},
router: {
mode: 'hash'
@@ -96,6 +99,19 @@ export default {
** Build configuration
*/
build: {
html: {
minify: {
collapseBooleanAttributes: false,
decodeEntities: false,
minifyCSS: false,
minifyJS: false,
processConditionalComments: false,
removeEmptyAttributes: false,
removeRedundantAttributes: false,
trimCustomFragments: false,
useShortDoctype: false
}
},
// analyze: {
// analyzerMode: 'static'
// },
@@ -107,6 +123,17 @@ export default {
*/
extend(config, ctx) {
config.devtool = ctx.isClient ? 'eval-source-map' : 'inline-source-map'
config.module.rules.push({
test: /.g4/, loader: 'antlr4-webpack-loader'
})
config.module.rules.push({
test: /\.ya?ml$/,
use: 'js-yaml-loader',
})
config.node = {
fs: 'empty'
}
config.optimization.minimize = false;
}
}
, generate: {

View File

@@ -1,9 +1,10 @@
import colors from 'vuetify/es5/util/colors'
var glob = require('glob');
var path = require('path');
// var glob = require('glob');
// var path = require('path');
export default {
// target: 'static',
mode: 'spa',
/*
** Headers of the page
@@ -46,31 +47,34 @@ export default {
'@nuxtjs/axios'
],
axios: {
port: 12345
// port: 12345,
// browserBaseURL: 'http://localhost:12345/services/',
// baseURL: '/services/',
// browserBaseURL: '/services/',
progress: true
},
/*
** vuetify module configuration
** https://github.com/nuxt-community/vuetify-module
*/
vuetify: {
theme: {
dark: false,
themes: {
light: {
primary: '#51DDBD',
secondary: '#2D4ADE',
accent: '#FA7D2B',
// primary: '#1976D2',
// secondary: '#424242',
// accent: '#82B1FF',
error: '#FF5252',
info: '#2196F3',
success: '#4CAF50',
warning: '#FFC107'
}
}
}
// theme: {
// dark: false,
// themes: {
// light: {
// primary: '#51DDBD',
// secondary: '#2D4ADE',
// accent: '#FA7D2B',
// // primary: '#1976D2',
// // secondary: '#424242',
// // accent: '#82B1FF',
// error: '#FF5252',
// info: '#2196F3',
// success: '#4CAF50',
// warning: '#FFC107'
// }
// }
// }
},
router: {
mode: 'hash'
@@ -97,37 +101,63 @@ export default {
** Build configuration
*/
build: {
html: {
minify: {
collapseBooleanAttributes: false,
decodeEntities: false,
minifyCSS: false,
minifyJS: false,
processConditionalComments: false,
removeEmptyAttributes: false,
removeRedundantAttributes: false,
trimCustomFragments: false,
useShortDoctype: false
}
},
// analyze: {
// analyzerMode: 'static'
// },
cssSourceMap: true,
extractCSS: false,
// parallel: true,
optimization: {
minimize: false
},
/*
** You can extend webpack config here
*/
extend(config, ctx) {
config.devtool = ctx.isClient ? 'eval-source-map' : 'inline-source-map'
config.module.rules.push({
test: /.g4/, loader: 'antlr4-webpack-loader'
})
config.module.rules.push({
test: /\.ya?ml$/,
use: 'js-yaml-loader',
})
config.node = {
fs: 'empty'
}
config.optimization.minimize = false;
}
}
, generate: {
routes: dynamicRoutes
}
}
// , generate: {
// routes: dynamicRoutes
// }
var dynamicRoutes = getDynamicPaths({
'/docs': 'docs/*.md',
'/#/docs': '/#/docs/*.md'
});
function getDynamicPaths(urlFilepathTable) {
return [].concat(
...Object.keys(urlFilepathTable).map(url => {
var filepathGlob = urlFilepathTable[url];
return glob
.sync(filepathGlob, {cwd: 'content'})
.map(filepath => `${url}/${path.basename(filepath, '.md')}`);
})
);
}
// var dynamicRoutes = getDynamicPaths({
// '/docs': 'docs/*.md',
// '/#/docs': '/#/docs/*.md'
// });
//
// function getDynamicPaths(urlFilepathTable) {
// return [].concat(
// ...Object.keys(urlFilepathTable).map(url => {
// var filepathGlob = urlFilepathTable[url];
// return glob
// .sync(filepathGlob, {cwd: 'content'})
// .map(filepath => `${url}/${path.basename(filepath, '.md')}`);
// })
// );
// }
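
The js-yaml-loader rule added in extend() above lets YAML assets be imported directly as JavaScript objects, which is how the workload builder pulls in assets/default.yaml and assets/basictypes.yaml. A minimal sketch, with an illustrative file name and contents:

// assets/example.yaml (hypothetical file):
//   bindings:
//     textval: example-recipe

import example from 'assets/example.yaml' // parsed at build time by js-yaml-loader

// The YAML document arrives as a plain nested object/array structure.
console.log(Object.keys(example.bindings)) // [ 'textval' ]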

View File

@@ -1,129 +0,0 @@
var glob = require('glob');
var path = require('path');
export default {
node: {
fs: 'empty',
dgram: 'empty'
},
mode: 'spa',
/*
** Headers of the page
*/
head: {
title: process.env.npm_package_name || '',
meta: [
{ charset: 'utf-8' },
{ name: 'viewport', content: 'width=device-width, initial-scale=1' },
{ hid: 'description', name: 'description', content: process.env.npm_package_description || '' }
],
link: [
{ rel: 'icon', type: 'image/x-icon', href: '/favicon.ico' }
]
},
/*
** Customize the progress-bar color
*/
loading: { color: '#fff' },
/*
** Global CSS
*/
css: [
],
/*
** Plugins to load before mounting the App
*/
plugins: [
],
/*
** Nuxt.js dev-modules
*/
buildModules: [
// Simple usage
'@nuxtjs/vuetify'
],
/*
** Nuxt.js modules
*/
modules: [
],
/*
** Build configuration
*/
build: {
babel: {
presets({ envName }) {
const envTargets = {
client: { "chrome": "78" },
server: { node: "current" },
};
return [
[
"@nuxt/babel-preset-app",
{
targets: envTargets[envName]
}
]
]
}
},
html: {
minify: {
minifyJS: false,
minifyCSS: false,
collapseBooleanAttributes: false,
decodeEntities: false,
processConditionalComments: false,
removeEmptyAttributes: false,
removeRedundantAttributes: false,
trimCustomFragments: false,
useShortDoctype: false
}
},
optimization: {
minimize: false
}
,extend(config,ctx) {
config.devtool = ctx.isClient ? 'eval-source-map' : 'inline-source-map'
}
// , extend(config,ctx) {
// config.devtool = ctx.isClient ? 'eval-source-map' : 'inline-source-map';
// }
},
vuetify: {
theme: {
dark: true,
themes:{
dark:{
primary: '#FF7D2B',
secondary: '#0C153A',
accent: '#FF7D2B',
}
}
}
},
generate: {
routes: dynamicRoutes
}
}
var dynamicRoutes = getDynamicPaths({
'/docs': 'docs/*.md'
});
/* https://github.com/jake-101/bael-template */
function getDynamicPaths(urlFilepathTable) {
return [].concat(
...Object.keys(urlFilepathTable).map(url => {
var filepathGlob = urlFilepathTable[url];
return glob
.sync(filepathGlob, { cwd: 'content' })
.map(filepath => `${url}/${path.basename(filepath, '.md')}`);
})
);
}

File diff suppressed because it is too large

View File

@@ -5,18 +5,37 @@
"author": "Sebastian Estevez & Jonathan Shook",
"private": true,
"scripts": {
"dev": "nuxt -c nuxt.config.dev.js",
"olddev": "nuxt -c nuxt.config.dev.js --port 3003",
"dev": "nuxt --port 3003",
"build": "nuxt build",
"start": "nuxt start",
"generate": "nuxt generate"
"generate": "nuxt generate",
"antlr": "cd antlr && node ../node_modules/antlr4-cli/bin/antlr4.js -Dlanguage=JavaScript -Werror -visitor -listener CQL3.g4"
},
"dependencies": {
"@nuxtjs/axios": "^5.9.5",
"@nuxtjs/vuetify": "^1.11.0",
"front-matter": "^3.1.0",
"markdown-it-smartarrows": "^1.0.1",
"markdown-it-vue": "^1.0.11",
"@nuxtjs/axios": "^5.12.2",
"@nuxtjs/vuetify": "^2.0.0-beta.2",
"@nuxt/webpack": "^2.14.4",
"@nuxtjs/babel-preset-app": "^0.8.0",
"vue": "^2.6.12",
"vue-template-compiler": "^2.6.12",
"front-matter": "^4.0.2",
"antlr4": "^4.8.0",
"antlr4-cli": "^4.5.3",
"file-saver": "^2.0.2",
"js-yaml": "^3.14.0",
"js-yaml-loader": "^1.2.2",
"node-fetch": "^2.6.0",
"nuxt": "^2.12.0"
"source-map-resolve": "^0.6.0",
"eslint": "^7.7.0",
"chokidar": "^3.4.2",
"npm": "^6.14.8",
"nuxt": "^2.14.4",
"webpack": "^4.44.1",
"markdown-it-vue": "^1.1.3",
"markdown-it-smartarrows": "^1.0.1",
"markdown-it-relativelink": "^0.2.0",
"markdown-it-replace-link": "^1.1.0",
"markdown-it-highlightjs": "^3.2.0"
}
}

View File

@@ -0,0 +1,137 @@
<template>
<v-app>
<docs-menu
:active_category="active_category"
:active_topic="active_topic"
:categories="categories"
@change="menuChanged()"
@select="menuChanged()"></docs-menu>
<v-app-bar app collapse-on-scroll dense>
<v-app-bar-nav-icon @click.stop="toggleDrawer"/>
<v-toolbar-title>{{ toolbarTitle }}</v-toolbar-title>
<v-toolbar-items>
</v-toolbar-items>
</v-app-bar>
<v-main>
<markdown-vue :mdcontent="markdown_body"/>
</v-main>
<v-footer app>
<span>&copy; 2020</span>
</v-footer>
</v-app>
</template>
<script>
import DocsMenu from '~/components/DocsMenu.vue'
import MarkdownVue from "~/components/MarkdownVue";
import docpaths from "@/js/docpaths.js"
export default {
data() {
return {
markdown_body: "testing",
active_category: null,
active_topic: null
}
},
components: {
DocsMenu, MarkdownVue
},
computed: {
categories: {
get() {
return this.$store.getters["docs/getCategories"]
}
},
toolbarTitle: {
get() {
if (this.active_category) {
return this.active_category.title
}
return "NoSQLBench Docs"
}
},
// markdown_body: {
// get() {
// return this.$store.getters["docs/getActiveMarkdownContent"]
// }
// },
active_category: {
get() {
return this.$store.getters["docs/getActiveCategory"]
},
async set(val) {
await this.$store.dispatch("docs/setCategories", val)
}
},
active_topic: {
get() {
return this.$store.getters["docs/getActiveTopic"]
},
async set(val) {
await this.$store.dispatch("docs/setActiveTopic")
}
}
},
async asyncData({params, route, store}) {
await store.dispatch("docs/loadCategories")
let categories = await store.getters["docs/getCategories"]
let active_category =docpaths.getCategory(route,categories);
let active_topic = docpaths.getTopic(route,categories, active_category);
return {
active_category,
active_topic,
markdown_body: active_topic.content
}
},
methods: {
async toggleDrawer() {
await this.$store.dispatch("docs/setIsDrawerOpen", this.$store.getters["docs/getIsDrawerOpen"])
},
menuChanged(evt) {
console.log("menu changed:" + JSON.stringify(evt, null, 2))
this.$forceUpdate()
}
}
}
</script>
<style>
.container {
min-height: 60vh;
display: flex;
justify-content: flex-start;
align-items: flex-start;
text-align: start;
margin: 0 auto 0 15px;
}
.title {
font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,
'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
display: block;
font-weight: 300;
font-size: 100px;
color: #35495e;
letter-spacing: 1px;
}
.subtitle {
font-weight: 300;
font-size: 42px;
color: #526488;
word-spacing: 5px;
padding-bottom: 15px;
}
.links {
padding-top: 15px;
}
</style>

View File

@@ -1,93 +0,0 @@
<template>
<v-app>
<docs-menu v-model="isDrawerOpen"
:categories="categories"
:active_category="active_category"
:active_category_name="active_category_name"
:active_topic="active_topic"/>
<v-app-bar app dark color="secondary" collapse-on-scroll dense >
<v-app-bar-nav-icon @click.stop="toggleDrawer"/>
<v-toolbar-title>NoSQLBench Docs</v-toolbar-title>
<v-spacer></v-spacer>
<v-toolbar-items>
</v-toolbar-items>
</v-app-bar>
<v-content>
<v-container>
<v-row align="stretch">
<div>{{testdata}}</div>
<div class="Doc">
<div class="doc-title">
<h1></h1>
</div>
<div>
<markdown-vue class="md-body content" :mdcontent="markdown_body"/>
</div>
</div>
</v-row>
</v-container>
</v-content>
<v-footer app dark color="secondary">
<span>&copy; 2020</span>
</v-footer>
</v-app>
</template>
<script>
import get_data from '~/mixins/get_data.js';
import DocsMenu from '~/components/DocsMenu.vue'
import MarkdownVue from "~/components/MarkdownVue";
export default {
mixins: [get_data],
components: {
DocsMenu, MarkdownVue
},
computed: {
isDrawerOpen() {
return this.$store.state.docs.isDrawerOpen;
},
isDrawerOpen2() {
return this.$store.getters.drawerState;
}
},
methods: {
toggleDrawer() {
this.$store.commit('docs/toggleDrawerState');
}
},
data(context) {
console.log("data context.params:" + JSON.stringify(context.params));
console.log("data context.route:" + JSON.stringify(context.route));
console.log("data context.query:" + JSON.stringify(context.query));
return {
testdata: this.$store.state.docs.example,
categories_list: [],
markdown_body: '',
active_topic: null,
active_category: null,
options: function () {
return {
markdownIt: {
linkify: true
},
linkAttributes: {
attrs: {
target: '_blank',
rel: 'noopener'
}
}
}
}
}
}
}
</script>

View File

@@ -1,125 +0,0 @@
<template>
<v-app>
<docs-menu v-model="isDrawerOpen"
:categories="categories"
:active_category="active_category"
:active_topic="active_topic"/>
<v-app-bar app dark color="secondary">
<v-app-bar-nav-icon color="primary" @click.stop="toggleDrawer"/>
<v-toolbar-title>NoSQLBench Docs</v-toolbar-title>
<v-spacer></v-spacer>
<v-toolbar-items>
<v-btn text href="https://github.com/nosqlbench/nosqlbench/wiki/Submitting-Feedback">SUBMIT FEEDBACK</v-btn>
</v-toolbar-items>
</v-app-bar>
<v-content>
<v-container>
<v-row align="stretch">
<div>{{testdata}}</div>
<div class="Doc">
<div class="doc-title">
<h1></h1>
</div>
<div>
<markdown-vue class="md-body content" :mdcontent="markdown_body"/>
</div>
</div>
</v-row>
</v-container>
</v-content>
<v-footer app dark color="secondary">
<span>&copy; 2020</span>
</v-footer>
</v-app>
</template>
<script>
import get_data from '~/mixins/get_data.js';
import DocsMenu from '~/components/DocsMenu.vue'
import MarkdownVue from "~/components/MarkdownVue";
export default {
mixins: [get_data],
components: {
DocsMenu, MarkdownVue
},
computed: {
isDrawerOpen() {
return this.$store.state.docs.isDrawerOpen;
},
isDrawerOpen2() {
return this.$store.getters.drawerState;
}
},
methods: {
toggleDrawer() {
this.$store.commit('docs/toggleDrawerState');
}
},
data(context) {
console.log("data context.params:" + JSON.stringify(context.params));
console.log("data context.route:" + JSON.stringify(context.route));
console.log("data context.query:" + JSON.stringify(context.query));
return {
testdata: this.$store.state.docs.example,
categories_list: [],
markdown_body: '',
active_topic: null,
active_category: null,
options: function () {
return {
markdownIt: {
linkify: true
},
linkAttributes: {
attrs: {
target: '_blank',
rel: 'noopener'
}
}
}
}
}
}
}
</script>
<style>
.container {
margin: 0 auto;
min-height: 60vh;
display: flex;
justify-content: center;
align-items: center;
text-align: center;
}
.title {
font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,
'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
display: block;
font-weight: 300;
font-size: 100px;
color: #35495e;
letter-spacing: 1px;
}
.subtitle {
font-weight: 300;
font-size: 42px;
color: #526488;
word-spacing: 5px;
padding-bottom: 15px;
}
.links {
padding-top: 15px;
}
</style>

View File

@@ -1,9 +1,9 @@
<template>
<v-app>
<v-main>
<div>
<v-list v-model="namespaces">
<v-list-item v-for="(namespace,idx) in namespaces" :key="idx" :title="namespace.namespace">
<div display="inline">ns:{{namespace.namespace + ": " + namespace.show}}</div>
<div>ns:{{namespace.namespace + ": " + namespace.show}}</div>
<v-list-item-title>{{namespace.namespace}}</v-list-item-title>
<v-switch v-model="namespace.show"></v-switch>
<!-- <v-list-item-action>-->
@@ -28,15 +28,15 @@
<!-- <v-list-item><v-label :v-bind="ns"/></v-list-item>-->
<!-- {{ns}}-->
<!-- </v-list>-->
</v-app>
</v-main>
</template>
<script>
import get_namespaces from '~/mixins/get_namespaces.js';
// import get_namespaces from '@/mixins/get_namespaces.js';
export default {
name: "namespaces",
mixins: [get_namespaces],
// mixins: [get_namespaces],
data(context) {
// return {
// namespaces: []

View File

@@ -1,9 +1,9 @@
<template>
<!-- https://github.com/Microsoft/vscode-recipes/tree/master/vuejs-cli -->
<div class="container">
<div class="container" style="justify-content: center; align-content: center; display: flex; width: 100%; margin-left: unset">
<!-- <recursive-menu/>-->
<div>
<logo/>
<div style="justify-content: center" align="middle">
<logo align="middle" style="justify-content: center; align-content: center"></logo>
<h1 class="title">nosqlbench</h1>
<h2 class="subtitle">open source, pluggable, nosql benchmarking suite</h2>
<div class="links">
@@ -25,34 +25,34 @@
</script>
<style>
.container {
margin: 0 auto;
min-height: 60vh;
display: flex;
justify-content: center;
align-items: center;
text-align: center;
}
/*.container {*/
/* margin: 0 auto;*/
/* min-height: 60vh;*/
/* display: flex;*/
/* justify-content: center;*/
/* align-items: center;*/
/* text-align: center;*/
/*}*/
.title {
font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,
'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
display: block;
font-weight: 300;
font-size: 100px;
color: #35495e;
letter-spacing: 1px;
}
/*.title {*/
/* font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,*/
/* 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;*/
/* display: block;*/
/* font-weight: 300;*/
/* font-size: 100px;*/
/* color: #35495e;*/
/* letter-spacing: 1px;*/
/*}*/
.subtitle {
font-weight: 300;
font-size: 42px;
color: #526488;
word-spacing: 5px;
padding-bottom: 15px;
}
/*.subtitle {*/
/* font-weight: 300;*/
/* font-size: 42px;*/
/* color: #526488;*/
/* word-spacing: 5px;*/
/* padding-bottom: 15px;*/
/*}*/
.links {
padding-top: 15px;
}
/*.links {*/
/* padding-top: 15px;*/
/*}*/
</style>

View File

@@ -0,0 +1,311 @@
<template>
<v-app>
<main-app-bar>NoSQLBench - Workload Builder</main-app-bar>
<v-layout>
<v-main>
<v-container fluid>
<v-layout row>
<v-flex>
<v-card>
<v-card-title>
Workload details
</v-card-title>
<v-col
cols="12"
sm="6"
md="10"
lg="10"
>
<v-text-field
outlined
label="Workload name"
v-model="workloadName"
></v-text-field>
<v-textarea
outlined
label="Create Table Statement"
v-model="createTableDef"
v-on:blur="parseStatement()"
></v-textarea>
</v-col>
<v-col cols="12">
<v-btn :title="save_title" v-if="parseSuccess" v-on:click="saveWorkloadToWorkspace()">{{ save_button }}</v-btn>
<v-btn :title="dl_title" v-if="parseSuccess" v-on:click="downloadWorkload()">{{ dl_button }}</v-btn>
</v-col>
</v-card>
</v-flex>
</v-layout>
</v-container>
</v-main>
</v-layout>
<v-footer app>
<span>&copy; 2020</span>
</v-footer>
</v-app>
</template>
<script>
import antlr4 from "antlr4";
import {saveAs} from "file-saver";
import yamlDumper from "js-yaml";
import CQL3Parser from '@/antlr/CQL3Parser.js';
import CQL3Lexer from '@/antlr/CQL3Lexer.js';
import defaultYaml from 'assets/default.yaml';
import basictypes from 'assets/basictypes.yaml';
import WorkspaceSelector from "@/components/WorkspaceSelector";
import AppSelector from "@/components/AppSelector";
import MainAppBar from "@/components/MainAppBar";
export default {
components: {
MainAppBar,
AppSelector,
WorkspaceSelector
},
data(context) {
let data = {
enabled: false,
createTableDef: "",
workloadName: "",
parseSuccess: false,
blob: null,
};
return data;
},
computed: {
save_button: function () {
return "Save to workspace '" + this.$store.getters["workspaces/getWorkspace"] + "'";
},
dl_button: function () {
return "Download as " + this.filename;
},
dl_title: function () {
return "Click to download the workload as '" + this.filename + "'";
},
filename: function () {
return this.workloadName + ".yaml";
},
save_title: function () {
return "Click to save this workload in the '" + this.workspace + "' workspace, or change the workspace in the app bar first.\n"
},
workspace: function () {
return this.$store.getters["workspaces/getWorkspace"]
},
enabled: function () {
return this.$store.getters["service_status/getEndpoints"]
}
},
methods: {
async parseStatement() {
console.log(this.$data.createTableDef);
const input = this.$data.createTableDef;
const chars = new antlr4.InputStream(input);
const lexer = new CQL3Lexer.CQL3Lexer(chars);
lexer.strictMode = false; // do not use js strictMode
const tokens = new antlr4.CommonTokenStream(lexer);
const parser = new CQL3Parser.CQL3Parser(tokens);
const context = parser.create_table_stmt();
try {
const keyspaceName = context.table_name().keyspace_name().getChild(0).getText()
const tableName = context.table_name().table_name_noks().getChild(0).getText()
const columnDefinitions = context.column_definitions().column_definition();
let columns = [];
let partitionKeys = [];
let clusteringKeys = [];
columnDefinitions.forEach(columnDef => {
if (columnDef.column_name() != null) {
columns.push({
"name": columnDef.column_name().getText(),
"type": columnDef.column_type().getText()
})
} else {
const primaryKeyContext = columnDef.primary_key()
if (primaryKeyContext.partition_key() != null) {
const partitionKeysContext = primaryKeyContext.partition_key().column_name();
partitionKeysContext.map((partitionKey, i) => {
const partitionKeyName = partitionKey.getText()
const col = {
"name": partitionKeyName,
"type": columns.filter(x => x.name == partitionKeyName)[0].type
}
partitionKeys.push(col)
})
}
if (primaryKeyContext.clustering_column().length != 0) {
const clusteringKeysContext = primaryKeyContext.clustering_column();
clusteringKeysContext.map((clusteringKey, i) => {
const clusteringKeyName = clusteringKey.getText()
const col = {
"name": clusteringKeyName,
"type": columns.filter(x => x.name == clusteringKeyName)[0].type
}
clusteringKeys.push(col)
})
}
}
})
columns = columns.filter(col => {
return partitionKeys.filter(pk => pk.name == col.name).length == 0 && clusteringKeys.filter(cc => cc.name == col.name).length == 0
})
const allColumns = [].concat(columns, partitionKeys, clusteringKeys)
this.$data.tableName = tableName;
this.$data.keyspaceName = keyspaceName;
this.$data.columns = columns;
this.$data.clusteringKeys = clusteringKeys;
this.$data.partitionKeys = partitionKeys;
this.$data.allColumns = allColumns;
console.log(this.$data)
console.log(defaultYaml)
// schema and bindings
let createTableStatement = "CREATE TABLE IF NOT EXISTS <<keyspace:" + keyspaceName + ">>." + tableName + " (\n";
console.log(basictypes)
defaultYaml.bindings = {}
allColumns.forEach(column => {
let recipe = basictypes.bindings[column.type + "val"];
if (recipe == undefined) {
const chars = new antlr4.InputStream(column.type);
const lexer = new CQL3Lexer.CQL3Lexer(chars);
lexer.strictMode = false; // do not use js strictMode
const tokens = new antlr4.CommonTokenStream(lexer);
const parser = new CQL3Parser.CQL3Parser(tokens);
const typeContext = parser.column_type();
const collectionTypeContext = typeContext.data_type().collection_type();
const collectionType = collectionTypeContext.children[0].getText();
if (collectionType.toLowerCase() == "set") {
const type = collectionTypeContext.children[2].getText();
recipe = "Set(HashRange(1,<<set-count-" + column.name + ":5>>)," + basictypes.bindings[type + "val"] + ") -> java.util.Set"
} else if (collectionType.toLowerCase() == "list") {
const type = collectionTypeContext.children[2].getText();
recipe = "List(HashRange(1,<<list-count-" + column.name + ":5>>)," + basictypes.bindings[type + "val"] + ") -> java.util.List"
} else if (collectionType.toLowerCase() == "map") {
const type1 = collectionTypeContext.children[2].getText();
const type2 = collectionTypeContext.children[4].getText();
recipe = "Map(HashRange(1,<<map-count-" + column.name + ":5>>)," + basictypes.bindings[type1 + "val"] + "," + basictypes.bindings[type2 + "val"] + ") -> java.util.Map"
} else {
alert("Could not generate recipe for type: " + column.type + " for column: " + column.name)
}
}
defaultYaml.bindings[column.name] = recipe
createTableStatement = createTableStatement + column.name + " " + column.type + ",\n";
})
let pk = "PRIMARY KEY (("
pk = pk + partitionKeys.map(x => x.name).reduce((x, acc) => acc = acc + "," + x)
pk = pk + ")"
if (clusteringKeys.length > 0) {
pk = pk + "," + clusteringKeys.map(x => x.name).reduce((x, acc) => acc = acc + "," + x)
}
pk = pk + ")"
createTableStatement = createTableStatement + pk + "\n);"
defaultYaml.blocks[0].statements[0] = {"create-table": createTableStatement}
//rampup
let insertStatement = "INSERT INTO <<keyspace:" + keyspaceName + ">>." + tableName + " (\n";
insertStatement = insertStatement + allColumns.map(x => x.name).reduce((x, acc) => acc = acc + ",\n" + x) + "\n) VALUES (\n";
insertStatement = insertStatement + allColumns.map(x => "{" + x.name + "}").reduce((x, acc) => acc = acc + ",\n" + x) + "\n);"
defaultYaml.blocks[1].statements[0] = {"insert-rampup": insertStatement}
//main-write
defaultYaml.blocks[2].statements[0] = {"insert-main": insertStatement}
//main-read-partition
let readPartitionStatement = "SELECT * from <<keyspace:" + keyspaceName + ">>." + tableName + " WHERE ";
readPartitionStatement = readPartitionStatement + partitionKeys.map(x => x.name + "={" + x.name + "}").reduce((x, acc) => acc = acc + " AND " + x);
let readRowStatement = readPartitionStatement + ";";
if (clusteringKeys.length > 0) {
readPartitionStatement = readPartitionStatement + " AND " + clusteringKeys.map(x => x.name + "={" + x.name + "}").reduce((x, acc) => acc = acc + " AND " + x);
}
readPartitionStatement = readPartitionStatement + ";";
defaultYaml.blocks[3].statements[0] = {"read-partition": readPartitionStatement}
//main-read-row
defaultYaml.blocks[4].statements[0] = {"read-row": readRowStatement}
defaultYaml.description = this.$data.workloadName
const yamlOutputText = yamlDumper.dump(defaultYaml)
this.blob = new Blob([yamlOutputText], {type: "text/plain;charset=utf-8"});
this.parseSuccess = true;
} catch (e) {
console.log("blur, invalid create table def")
console.log(e)
}
},
downloadWorkload() {
saveAs(this.blob, this.$data.filename);
},
saveWorkloadToWorkspace() {
this.$store.dispatch("workspaces/putFile",{
workspace: this.workspace,
filename: this.filename,
content: this.blob
})
}
},
created() {
this.$store.dispatch('service_status/loadEndpoints')
}
}
</script>
<style>
/*.container {*/
/* margin: 0 auto;*/
/* display: flex;*/
/* justify-content: center;*/
/* align-items: center;*/
/* text-align: center;*/
/*}*/
/*.title {*/
/* font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,*/
/* 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;*/
/* display: block;*/
/* font-weight: 300;*/
/* font-size: 100px;*/
/* color: #35495e;*/
/* letter-spacing: 1px;*/
/*}*/
/*.subtitle {*/
/* font-weight: 300;*/
/* font-size: 42px;*/
/* color: #526488;*/
/* word-spacing: 5px;*/
/* padding-bottom: 15px;*/
/*}*/
/*.links {*/
/* padding-top: 15px;*/
/*}*/
</style>
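
The core of parseStatement() above is the ANTLR wiring; pulled out of the component, it reduces to the sketch below, using the same generated CQL3Lexer/CQL3Parser modules and the same parse-tree accessors (primary-key extraction is omitted here for brevity).

import antlr4 from 'antlr4'
import CQL3Lexer from '@/antlr/CQL3Lexer.js'
import CQL3Parser from '@/antlr/CQL3Parser.js'

// Parse a CREATE TABLE statement and return keyspace, table, and plain column definitions.
function parseCreateTable(ddl) {
  const chars = new antlr4.InputStream(ddl)
  const lexer = new CQL3Lexer.CQL3Lexer(chars)
  lexer.strictMode = false // same setting the component uses
  const tokens = new antlr4.CommonTokenStream(lexer)
  const parser = new CQL3Parser.CQL3Parser(tokens)
  const ctx = parser.create_table_stmt()

  return {
    keyspace: ctx.table_name().keyspace_name().getChild(0).getText(),
    table: ctx.table_name().table_name_noks().getChild(0).getText(),
    columns: ctx.column_definitions().column_definition()
      .filter(def => def.column_name() != null)
      .map(def => ({name: def.column_name().getText(), type: def.column_type().getText()}))
  }
}

// parseCreateTable('CREATE TABLE ks.users (id uuid, name text, PRIMARY KEY ((id)));')
// -> { keyspace: 'ks', table: 'users', columns: [{name: 'id', ...}, {name: 'name', ...}] }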

View File

@@ -1,159 +1,16 @@
<template>
<v-app>
<v-app-bar app dark color="secondary">
<v-toolbar-title>NoSQLBench</v-toolbar-title>
<v-spacer></v-spacer>
<v-toolbar-items>
<v-btn text href="https://github.com/nosqlbench/nosqlbench/wiki/Submitting-Feedback">SUBMIT FEEDBACK</v-btn>
</v-toolbar-items>
</v-app-bar>
<v-layout
justify-center
align-center>
<v-content>
<v-container fluid v-if="enabled">
<v-select
:items="workloadNames"
v-model="workloadName"
chips
v-on:change="getTemplates();"
label="Workload"
></v-select>
</v-container>
<v-container fluid>
<v-row
v-if="templates"
>
<v-col
cols="12"
sm="6"
md="10"
lg="10"
>
<v-card>
<v-card-title>
{{ workloadName }}
</v-card-title>
<v-col
v-for="(item, j) in Object.keys(templates)"
:key="item.command"
cols="12"
sm="6"
md="10"
lg="10"
>
<v-text-field
v-model="templates[item]"
:label="item"
>{{ item.name }}</v-text-field>
</v-col>
<v-col cols="12">
<v-btn v-if="workloadName" v-on:click="runWorkload()">Run Workload</v-btn>
</v-col>
</v-card>
</v-col>
</v-row>
</v-container>
</v-content>
</v-layout>
<v-footer app dark color="secondary">
<span>&copy; 2020</span>
</v-footer>
</v-app>
<v-app>
<main-app-bar></main-app-bar>
</v-app>
</template>
<script>
import get_data from '~/mixins/get_data.js';
export default {
mixins: [get_data],
components: {
},
computed: {
},
methods: {
async getTemplates() {
const data = await this.$axios.$get('/services/nb/parameters?workloadName=' + this.workloadName)
if (!data.err) {
this.$data.templates = data;
}
},
},
data(context) {
let data = {
workloadNames: [],
enabled: false,
workloadName: null,
templates: null,
};
return data;
},
async asyncData({ $axios, store }) {
let enabled = await $axios.$get("/services/nb/enabled")
.then(res => {
return res
})
.catch((e) => {
console.log("back-end not found");
})
let workloadNames = await $axios.$get("/services/nb/workloads")
.then(res => {
return res
})
.catch((e) => {
console.log("back-end not found");
})
return {
enabled: enabled,
workloadNames: workloadNames,
}
},
}
import MainAppBar from "~/components/MainAppBar";
export default {
name: "ui-index",
components: {MainAppBar}
}
</script>
<style>
.container {
margin: 0 auto;
display: flex;
justify-content: center;
align-items: center;
text-align: center;
}
.title {
font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,
'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;
display: block;
font-weight: 300;
font-size: 100px;
color: #35495e;
letter-spacing: 1px;
}
.subtitle {
font-weight: 300;
font-size: 42px;
color: #526488;
word-spacing: 5px;
padding-bottom: 15px;
}
.links {
padding-top: 15px;
}
</style>

View File

@@ -0,0 +1,322 @@
<template>
<v-app>
<main-app-bar>NoSQLBench - Workload Executor</main-app-bar>
<v-container class="d-flex justify-center">
<v-main>
<v-row fluid class="d-flex">
<v-col cols="12" fluid class="d-flex justify-space-between justify-center">
<!-- <v-btn-toggle max="1" v-model="toggle_workspaces" @change="validateAndSearch()">-->
<!-- <v-btn :disabled="this.toggle_builtins===undefined && this.toggle_workspaces===0">-->
<!-- <v-container fluid class="d-flex">-->
<!-- <v-icon title="'include ' + workspace">mdi-folder-star</v-icon>-->
<!-- <div class="ma-2">workspace '{{ workspace }}'</div>-->
<!-- <v-icon v-if="this.toggle_workspaces===0">mdi-check</v-icon>-->
<!-- </v-container>-->
<!-- </v-btn>-->
<!-- <v-btn :disabled="this.toggle_builtins===undefined && this.toggle_workspaces===1">-->
<!-- <v-container fluid class="d-flex">-->
<!-- <v-icon title="search workspaces">mdi-folder-star-multiple</v-icon>-->
<!-- <div class="ma-2">all workspaces</div>-->
<!-- <v-icon v-if="this.toggle_workspaces===1">mdi-check</v-icon>-->
<!-- </v-container>-->
<!-- </v-btn>-->
<!-- </v-btn-toggle>-->
<v-btn-toggle v-model="toggle_builtins" @change="validateAndSearch()">
<v-btn :disabled="this.toggle_workspaces===undefined">
<v-container fluid class="d-flex">
<v-icon title="include built-in workloads">mdi-folder-open</v-icon>
<div class="ma-2">bundled</div>
<v-icon v-if="this.toggle_builtins===0">mdi-check</v-icon>
</v-container>
</v-btn>
</v-btn-toggle>
</v-col>
</v-row>
<v-row fluid v-if="workloads!==undefined">
<!-- :item-text="workspace"-->
<!-- :item-value="workloadName"-->
<!-- v-model="workloadName"-->
<v-select
:items="availableWorkloads"
item-text="workloadName"
item-value="workloadName"
v-model="selected"
v-on:change="loadTemplates()"
label="Workload"
></v-select>
</v-row>
<!-- TEMPLATES -->
<v-main justify-start align-start class="d-inline-block pa-4 ma-10">
<v-row no-gutters v-if="templates">
<v-card v-for="(item, j) in Object.keys(templateparams)"
:key="item"
class="ma-4 pa-4"
>
<!-- <v-card-title>{{item}}</v-card-title>-->
<v-card-title class="ma-0 pa-0">{{ item }}</v-card-title>
<v-text-field hide-details v-model="templateparams[item]" align="center"></v-text-field>
<!-- <v-card-title>{{ this.workloadName }}</v-card-title>-->
<!-- <v-row v-for="(item, j) in Object.keys(templates)" :key="item.command">-->
<!-- <v-text-field v-model="templates[item]" :label="item">{{ item.name }}></v-text-field>-->
<!-- </v-row>-->
</v-card>
</v-row>
<!-- <v-row v-if="templates">-->
<!-- <v-col>-->
<!-- <v-card>-->
<!-- <v-card-title>{{ this.workloadName }}</v-card-title>-->
<!-- <v-row v-for="(item, j) in Object.keys(templates)" :key="item.command">-->
<!-- <v-text-field v-model="templates[item]" :label="item">{{ item.name }}></v-text-field>-->
<!-- </v-row>-->
<!-- </v-card>-->
<!-- </v-col>-->
<!-- </v-row>-->
<v-row>
<v-col></v-col>
</v-row>
<v-row>
<v-col cols="12">
<v-btn :title="runtitle" v-if="this.selected" v-on:click="runWorkload()">{{ runin }}</v-btn>
</v-col>
</v-row>
</v-main>
</v-main>
</v-container>
<v-footer app>
<span>&copy; 2020</span>
</v-footer>
</v-app>
</template>
<script>
import WorkspaceSelector from "@/components/WorkspaceSelector";
import AppSelector from "@/components/AppSelector";
import MainAppBar from "@/components/MainAppBar";
export default {
name: 'app-run',
components: {
AppSelector,
WorkspaceSelector,
MainAppBar
},
data(context) {
let data = {
extraparams: {},
templateparams: {},
availableWorkloads: [],
enabled: false,
selected: null,
toggle_builtins: false,
toggle_workspaces: 0,
workspace_names: ['current', 'all'],
sample: {
"workspace": "default",
"yamlPath": "test1.yaml",
"scenarioNames": [
"default",
"main"
],
"templates": {
"keyspace": "a",
"main-cycles": "10000000",
"rampup-cycles": "10000000",
"read_cl": "LOCAL_QUORUM",
"read_partition_ratio": "1",
"read_row_ratio": "1",
"write_cl": "LOCAL_QUORUM",
"write_ratio": "1"
},
"description": "test1",
"workloadName": "test1"
},
};
return data;
},
computed: {
searchin: function () {
let searchin = Array();
if (this.toggle_workspaces === 0) {
searchin.push(this.workspace);
} else if (this.toggle_workspaces === 1) {
console.log("workspaces typeof: '" + typeof (this.workspaces) + "'");
this.workspaces.forEach(w => {
searchin.push(w.name);
})
}
if (this.toggle_builtins === 0) {
searchin.push("builtins");
}
let joined = searchin.join(",");
console.log("joined:'" + joined + "'")
return joined;
},
runin: function () {
return "Run Workload";
},
runtitle: function () {
return "Run this workload in workspace [" + this.workspace + "].\n"
},
workspace: function () {
return this.$store.getters["workspaces/getWorkspace"]
},
workspaces: function () {
return this.$store.getters["workspaces/getWorkspaces"]
},
workloads: function () {
return this.$store.getters["workloads/getWorkloads"];
},
templates: function () {
return this.$store.getters["workloads/getTemplates"];
}
// ,
// workloadNames: function () {
// for (const [key, value] of Object.entries(this.workloads)) {
// console.log("key=[" + key + "] value=[" + value + "]");
// }
// }
},
watch: {
toggle_builtins: function (val) {
this.validateAndSearch();
},
toggle_workspaces: function (val) {
this.validateAndSearch();
},
templates: function (val) {
console.log("templates property changed");
if (val === undefined) {
this.templateparams = undefined;
} else {
this.templateparams = {};
Object.keys(val).forEach(k => {
console.log("k:" + k + " = " + val[k])
if (!this.templateparams[k]) {
this.templateparams[k] = val[k];
}
})
}
}
},
created() {
this.validateAndSearch();
this.$store.subscribe((mutation, state) => {
if (mutation.type === 'workloads/setWorkloads') {
// console.log("detected update to workloads:" + JSON.stringify(this.workloads));
this.availableWorkloads = state.workloads.workloads;
} else if (mutation.type === 'workspaces/setWorkspace') {
// console.log("detected update to workspace:" + JSON.stringify(this.workspace));
this.validateAndSearch();
}
// else if (mutation.type === 'workloads/setTemplates') {
// console.log("detected update to templates:" + JSON.stringify(this.workspace));
// Object.keys(this.templates).forEach(t => {
// console.log("template:" + t)
// if (this.templateparams[t]) {
// this.templateparams[t] = this.templates[t];
// console.log("added '" + this.templates[t])
// }
// })
// }
});
},
methods: {
validateAndSearch() {
let params = {
searchin: this.searchin,
reason: "search params changed"
}
this.$store.dispatch("workloads/fetchWorkloads", params);
},
loadTemplates() {
let params = {
workload: this.selected,
searchin: this.searchin,
reason: "workload selected"
}
this.$store.dispatch("workloads/fetchTemplates", params);
let templates = this.$store.getters["workloads/getTemplates"]
console.log("templates?" + JSON.stringify(templates, null, 2))
// Object.keys(this.templates).forEach(t => {
// console.log("t:" + t)
// // if (!this.templateparams[t]) {
// // this.templateparams[t]=this.templates[t];
// // }
// })
},
runWorkload() {
console.log("running workload '" + this.selected + "'")
// let workload = this.availableWorkloads.find(w => w.workloadName === this.selected);
let commands=[];
commands.push(this.selected);
Object.keys(this.templates).forEach(k => {
if (this.templateparams && this.templateparams[k]!==this.templates[k]) {
commands.push(k+"="+this.templateparams[k])
}
})
Object.keys(this.extraparams).forEach(k => {
commands.push(k+"="+this.extraparams[k])
})
let erq = {
scenario_name: this.selected + "_DATESTAMP",
workspace: this.workspace,
commands
}
console.log("submitting:" + JSON.stringify(erq));
this.$store.dispatch("scenarios/runScenario", erq);
}
}
}
</script>
<style>
/*.container {*/
/* margin: 0 auto;*/
/* display: flex;*/
/* justify-content: center;*/
/* align-items: center;*/
/* text-align: center;*/
/*}*/
/*.title {*/
/* font-family: 'Quicksand', 'Source Sans Pro', -apple-system, BlinkMacSystemFont,*/
/* 'Segoe UI', Roboto, 'Helvetica Neue', Arial, sans-serif;*/
/* display: block;*/
/* font-weight: 300;*/
/* font-size: 100px;*/
/* color: #35495e;*/
/* letter-spacing: 1px;*/
/*}*/
/*.subtitle {*/
/* font-weight: 300;*/
/* font-size: 42px;*/
/* color: #526488;*/
/* word-spacing: 5px;*/
/* padding-bottom: 15px;*/
/*}*/
/*.links {*/
/* padding-top: 15px;*/
/*}*/
</style>
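
For reference, the scenario request that runWorkload() above dispatches, written out with the sample workload the page carries in its data; the overridden parameter value is illustrative, and the _DATESTAMP suffix is appended literally as in the code.

// Built by runWorkload() when 'test1' is selected and the user changed read_cl only.
const erq = {
  scenario_name: 'test1_DATESTAMP',
  workspace: 'default',
  commands: [
    'test1',            // the selected workload
    'read_cl=LOCAL_ONE' // only template parameters that differ from their defaults are appended
  ]
}
// this.$store.dispatch('scenarios/runScenario', erq)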

View File

@@ -0,0 +1,226 @@
<template>
<v-app>
<main-app-bar>NoSQLBench - Execution Status</main-app-bar>
<v-main justify-start align-start class="d-inline-block pa-4 ma-10">
<div class="row no-gutters">
<!-- SCENARIO CARD -->
<v-card min-width="300" max-width="300" max-height="400" raised elevation="3"
v-for="(scenario,w) in scenarios" :key="w"
class="pa-4 ma-4"
:title="JSON.stringify(scenario,null,2)"
:color="scenario.state==='Running' ? 'accent' : 'default'">
<v-row align-content="start" align="start">
<v-card-title>{{ scenario.scenario_name }}</v-card-title>
<v-card-subtitle>{{ scenario.state }}</v-card-subtitle>
<!-- <v-icon>mdi-magnify</v-icon>-->
</v-row>
<!-- Scenario Controls -->
<v-row>
<v-btn
v-if="scenario.state==='Running'"
@click="stop_scenario(scenario)"
:title="'Stop scenario ' + scenario.scenario_name"
><v-icon>mdi-stop</v-icon>
</v-btn>
<!-- <v-btn v-if="scenario.state!=='Running'">-->
<!-- <v-icon>mdi-play</v-icon>-->
<!-- </v-btn>-->
<v-btn v-if="scenario.state==='Finished' || scenario.state==='Errored'"
:title="'Purge scenario ' + scenario.scenario_name"
@click="purge_scenario(scenario)"
><v-icon>mdi-delete</v-icon>
</v-btn>
<v-btn v-if="scenario.state==='Scheduled'"
:title="'cancel scenario ' + scenario.scenario_name"
@click="purge_scenario(scenario)"
><v-icon>mdi-cancel</v-icon></v-btn>
</v-row>
<v-divider></v-divider>
<!-- Activities -->
<!-- <v-card-subtitle>activities:</v-card-subtitle>-->
<v-row v-for="(progress,p) in scenario.progress" :key="p">
<v-card min-width="290" :title="JSON.stringify(progress,null,2)">
<v-card-subtitle class="pa-2 ma-2">{{ progress.name }}</v-card-subtitle>
<v-progress-linear v-if="progress.completed"
:title="(progress.completed*100.0).toFixed(4)"
:value="progress_of(progress.completed)"
:color="colorof(progress.state)"
height="10"
></v-progress-linear>
<v-card-text v-if="progress.state==='Running'">
{{ (progress.eta_millis / 60000).toFixed(2) + " minutes remaining" }}
</v-card-text>
<v-card-text v-if="scenario.result && scenario.result.error">
{{ JSON.stringify(scenario.result.error, null, 2) }}
</v-card-text>
</v-card>
</v-row>
</v-card>
</div>
</v-main>
<!-- FOOTER -->
<v-footer :color="this.paused ? 'secondary' : 'unset'" app>
<v-row align="middle">
<v-progress-circular color="#AAAAAA"
width="1"
:title="'refresh in '+(this.beats_per_refresh-this.beats_in_refresh) + 's (' + this.beats_per_suffix + ')'"
:value="(this.beats_in_refresh/this.beats_per_refresh)*100"
@click="do_refresh"
>{{ beats_per_refresh - beats_in_refresh }}
</v-progress-circular>
<v-icon v-if="!this.paused" @click="pause()">mdi-pause</v-icon>
<v-icon v-if="this.paused" @click="unpause()">mdi-play</v-icon>
<v-icon @click="scale_up">mdi-plus</v-icon>
<v-icon @click="scale_down">mdi-minus</v-icon>
</v-row>
</v-footer>
</v-app>
</template>
<script>
import WorkspaceSelector from "@/components/WorkspaceSelector";
import {mapActions, mapGetters, mapMutations} from "vuex";
import AppSelector from "@/components/AppSelector";
import MainAppBar from "@/components/MainAppBar";
export default {
name: "workspaces.vue",
components: {
MainAppBar,
AppSelector,
WorkspaceSelector
},
data(context) {
let data = {
beat: false,
beats_per_refresh: 10,
beats_in_refresh: 0,
beats_per_suffix: '10s',
scales: [
[1, '1s'], [5, '5s'], [10, '10s'], [20, '20s'], [60, '60s'], [300, '5m'], [600, '10m'], [1800, '15m']
],
beat_tick: 1000,
heart_is_beating: false,
paused: false,
};
return data;
},
computed: {
scenarios: {
get() {
return this.$store.getters["scenarios/getScenarios"]
}
},
},
methods: {
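// Map an activity state to a Vuetify color for its progress bar.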
colorof(state) {
if (state==='Running') {
return "success";
}
if (state==='Errored') {
return "error"
}
return "blue";
},
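// Step the refresh interval up or down through the `scales` table of [seconds, label] pairs.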
scale_up() {
let idx = this.scales.findIndex(s => s[0] === this.beats_per_refresh);
idx = Math.min(idx + 1, this.scales.length - 1);
console.log("idx:"+idx)
this.beats_per_refresh=this.scales[idx][0];
this.beats_per_suffix = this.scales[idx][1];
},
scale_down() {
let idx = this.scales.findIndex(s => s[0] === this.beats_per_refresh);
idx = Math.max(idx - 1, 0);
console.log("idx:"+idx)
this.beats_per_refresh=this.scales[idx][0];
this.beats_per_suffix = this.scales[idx][1];
},
stop_scenario(scenario) {
console.log("stopping scenario: " + scenario.scenario_name);
this.$store.dispatch("scenarios/stopScenario", scenario.scenario_name)
this.do_refresh();
},
purge_scenario(scenario) {
console.log("purging scenario: " + scenario.scenario_name);
this.$store.dispatch("scenarios/deleteScenario", scenario.scenario_name);
this.do_refresh();
},
pause() {
this.paused = true;
},
unpause() {
this.paused = false;
this.heartbeat();
},
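// Ticks once per beat_tick ms and re-schedules itself; triggers a scenario list
// refresh every beats_per_refresh beats, and stops ticking while paused.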
heartbeat() {
this.heart_is_beating = false;
if (this.paused) {
return;
}
let seconds = new Date().getSeconds();
// console.log("seconds:" + seconds);
this.beat = (seconds % 2) === 0;
this.beats_in_refresh++;
if (this.beats_in_refresh >= this.beats_per_refresh) {
this.do_refresh();
}
if (!this.heart_is_beating) {
this.heart_is_beating = true;
setTimeout(this.heartbeat, this.beat_tick);
}
},
do_refresh() {
this.beats_in_refresh = 0;
this.$store.dispatch("scenarios/loadScenarios", "timer refresh")
},
progress_of(completion) {
if (isNaN(completion)) {
return undefined;
}
let progress = (completion * 100.0).toFixed(2);
return progress;
},
is_running(scenario) {
return scenario.progress.find(x => {
return x.state === "Running"
});
},
abbrev(name) {
return name;
}
},
created() {
console.log("created component...");
this.$store.dispatch("scenarios/loadScenarios", "watch panel load");
},
mounted() {
setTimeout(this.heartbeat, 1000);
}
}
</script>
<style>
</style>

View File

@@ -0,0 +1,129 @@
<template>
<v-app>
<main-app-bar></main-app-bar>
<!-- <workspace-selector @changed="seenChange(workspace)"></workspace-selector>-->
<!-- <v-btn title="start a new workspace" @click="startNewWorkspace()">-->
<!-- <v-icon>mdi-folder-plus-outline</v-icon>-->
<!-- </v-btn>-->
<!-- <v-btn icon title="upload a workspace zip file">-->
<!-- <v-icon>mdi-folder-upload</v-icon>-->
<!-- </v-btn>-->
<!-- <v-btn icon title="download workspaces.zip">-->
<!-- <v-icon>mdi-briefcase-download</v-icon>-->
<!-- </v-btn>-->
<!-- <v-btn icon title="upload workspaces.zip">-->
<!-- <v-icon>mdi-briefcase-upload</v-icon>-->
<!-- </v-btn>-->
<v-content justify-start align-start class="d-inline-block pa-4 ma-10">
<div class="row no-gutters">
<v-card min-width="300" max-width="300" max-height="400" raised elevation="5" v-for="(cardspace,w) in workspaces" :key="w"
class="pa-4 ma-4">
<v-row>
<v-card-title title="workspace name">{{ cardspace.name }}</v-card-title>
<v-icon v-if="workspace === cardspace.name">mdi-check-bold</v-icon>
</v-row>
<v-card-subtitle title="last change">{{ abbrev(cardspace.summary.last_changed_filename) }}</v-card-subtitle>
<v-divider></v-divider>
<v-list align-start>
<v-simple-table>
<tbody>
<tr>
<td>Bytes</td>
<td>{{ cardspace.summary.total_bytes }}</td>
</tr>
<tr>
<td>Files</td>
<td>{{ cardspace.summary.total_files }}</td>
</tr>
</tbody>
</v-simple-table>
<v-divider></v-divider>
<v-list-item>
<v-btn title="view details of workspace">
<v-icon>mdi-magnify</v-icon>
</v-btn>
<v-spacer></v-spacer>
<v-btn title="download zipped workspace">
<v-icon>mdi-folder-download</v-icon>
</v-btn>
<v-spacer></v-spacer>
<v-btn title="purge workspace">
<v-icon @click="purgeWorkspace(cardspace.name)">mdi-trash-can</v-icon>
</v-btn>
</v-list-item>
</v-list>
</v-card>
</div>
</v-content>
</v-app>
</template>
<script>
import WorkspaceSelector from "@/components/WorkspaceSelector";
import {mapActions, mapGetters, mapMutations} from "vuex";
import AppSelector from "@/components/AppSelector";
import MainAppBar from "@/components/MainAppBar";
export default {
name: "workspaces.vue",
components: {
MainAppBar,
AppSelector,
WorkspaceSelector
},
data(context) {
let data = {};
return data;
},
computed: {
workspace: {
get() {
return this.$store.getters["workspaces/getWorkspace"]
},
set(val) {
this.$store.dispatch("workspaces/setWorkspace", val)
}
},
workspaces: {
get() {
return this.$store.getters["workspaces/getWorkspaces"]
},
set(val) {
this.$store.dispatch("workspaces/setWorkspaces", val)
}
},
},
methods: {
abbrev(name) {
return name;
},
purgeWorkspace: function (ws) {
console.log("purging " + ws);
this.$store.dispatch('workspaces/purgeWorkspace', ws);
// this.$store.dispatch("workspaces/setWorkspace")
this.$forceUpdate();
},
},
created() {
console.log("created component...");
this.$store.dispatch('workspaces/initWorkspaces', "workspace panel load");
}
}
</script>
<style>
</style>

View File

@@ -0,0 +1,86 @@
<template>
<v-app>
<main-app-bar></main-app-bar>
<!-- <workspace-selector @changed="seenChange(workspace)"></workspace-selector>-->
<!-- <v-btn title="start a new workspace" @click="startNewWorkspace()">-->
<!-- <v-icon>mdi-folder-plus-outline</v-icon>-->
<!-- </v-btn>-->
<!-- <v-btn icon title="upload a workspace zip file">-->
<!-- <v-icon>mdi-folder-upload</v-icon>-->
<!-- </v-btn>-->
<!-- <v-btn icon title="download workspaces.zip">-->
<!-- <v-icon>mdi-briefcase-download</v-icon>-->
<!-- </v-btn>-->
<!-- <v-btn icon title="upload workspaces.zip">-->
<!-- <v-icon>mdi-briefcase-upload</v-icon>-->
<!-- </v-btn>-->
<v-content justify-start align-start class="d-inline-block pa-4 ma-10">
<div class="row no-gutters">
</div>
</v-content>
</v-app>
</template>
<script>
import WorkspaceSelector from "~/components/WorkspaceSelector";
import {mapActions, mapGetters, mapMutations} from "vuex";
import AppSelector from "@/components/AppSelector";
import MainAppBar from "@/components/MainAppBar";
export default {
name: "workspaces.vue",
components: {
MainAppBar,
AppSelector,
WorkspaceSelector
},
data(context) {
let data = {};
return data;
},
computed: {
workspace: {
get() {
return this.$store.getters["workspaces/getWorkspace"]
},
set(val) {
this.$store.dispatch("workspaces/setWorkspace", val)
}
},
workspaces: {
get() {
return this.$store.getters["workspaces/getWorkspaces"]
},
set(val) {
this.$store.dispatch("workspaces/setWorkspaces", val)
}
},
},
methods: {
abbrev(name) {
return name;
},
purgeWorkspace: function (ws) {
console.log("purging " + ws);
this.$store.dispatch('workspaces/purgeWorkspace', ws);
// this.$store.dispatch("workspaces/setWorkspace")
this.$forceUpdate();
},
},
created() {
console.log("created component...");
this.$store.dispatch('workspaces/initWorkspaces', "workspace panel load");
}
}
</script>
<style>
</style>

View File

@@ -1,31 +1,254 @@
import endpoints from "@/js/endpoints";
// https://www.mikestreety.co.uk/blog/vue-js-using-localstorage-with-the-vuex-store
/**
 * Shape of the docs category data assembled by loadCategories():
 *
 * categories: [
 *   {
 *     path,
 *     name,
 *     title,
 *     weight,
 *     content,
 *     summary_topic: {
 *       name, path, basename, categories, weight, title, content
 *     },
 *     topics: [
 *       { name, path, basename, categories, weight, title, content }
 *     ]
 *   }
 * ]
 *
 * @returns {{isDrawerOpen: boolean, isMenuLocked: boolean, active_category: null, active_topic: null, categories: []}}
 */
export const state = () => ({
  categories: [],
  active_category: null,
  active_topic: null,
  isDrawerOpen: true,
  isMenuLocked: false
});
export const getters = {
  getCategories: (state, getters) => {
    return state.categories;
  },
  getActiveCategory: (state, getters) => {
    return state.active_category;
  },
  getActiveTopic: (state, getters) => {
    return state.active_topic;
  },
  getIsMenuLocked: (state, getters) => {
    return state.isMenuLocked;
  },
  getIsDrawerOpen: (state, getters) => {
    return state.isDrawerOpen;
  },
  getActiveMarkdownContent: (state, getters) => {
    if (state.active_category === null) {
      throw "unable to load active markdown for undefined category";
    }
    if (state.active_topic === null) {
      throw "unable to load active markdown for undefined topic";
    }
    return state.active_topic.content;
  }
};
export const mutations = {
setActiveCategory(state, active_category) {
state.active_category = active_category;
},
setActiveTopic(state, active_topic) {
state.active_topic = active_topic;
},
setCategories(state, categories) {
state.categories = categories;
},
toggleDrawerState(state, newDrawerState) {
if (state.isMenuLocked) {
return;
}
state.isDrawerOpen = !state.isDrawerOpen;
},
setIsDrawerOpen(state, newDrawerState) {
if (state.isMenuLocked) {
return;
}
state.isDrawerOpen = newDrawerState;
},
setIsMenuLocked(state, newLockState) {
state.isMenuLocked = newLockState;
}
};
export const actions = {
async setActiveCategory({commit, state, dispatch}, active_category) {
await commit("setActiveCategory", active_category)
},
async setActiveTopic({commit, state, dispatch}, active_topic) {
await commit("setActiveTopic", active_topic);
},
async setIsMenuLocked({commit, state, dispatch}, isMenuLocked) {
await commit("setIsMenuLocked", isMenuLocked);
},
async setIsDrawerOpen({commit, state, dispatch}, isDrawerOpen) {
await commit("setIsDrawerOpen", isDrawerOpen)
},
async setCategories({commit, state, dispatch, context}, categories) {
await commit("setCategories", categories)
},
async loadCategories(context) {
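// Pipeline: fetch the markdown.csv manifest, fetch each listed markdown file,
// parse its front-matter, then group the topics into weighted categories by path.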
// let location = document.location;
// console.log("location:" + location);
let commit = context.commit;
let state = context.state;
let dispatch = context.dispatch;
if (state.categories === null || state.categories.length === 0) {
let fm = require('front-matter');
const category_data = await this.$axios.get(endpoints.url(document, context, "/services/docs/markdown.csv"))
.then(manifest => {
// console.log("typeof(manifest):" + typeof (manifest))
// console.log("manifest:" + JSON.stringify(manifest, null, 2))
return manifest.data.split("\n").filter(x => {
return x !== null && x.length > 0
})
})
.catch((e) => {
console.log("error getting data:" + e);
throw e;
})
.then(async lines => {
let val = await Promise.all(lines.map(line => {
let url = "/docs" + "/" + line;
// console.log("url:"+url)
return this.$axios.get(endpoints.url(document, context, "/services/docs/" + line))
.then(res => {
// console.log("typeof(res):" + typeof(res))
return {
path: line,
content: res.data
};
})
})).then(r => {
// console.log("r:" + JSON.stringify(r,null,2))
return r
});
return val;
// let mapof =Object.fromEntries(val)
// console.log("mapof:" + JSON.stringify(mapof, null, 2))
// return mapof;
})
.catch((e) => {
console.log("error getting entries:" + e);
throw e;
})
.then(fetched => {
return fetched.map(entry => {
// console.log("entry:" + JSON.stringify(entry,null,2))
let [, name] = entry.path.match(/(.+)\.md$/);
let basename = entry.path.split("/").find(x => x.includes(".md"))
let categories = entry.path.split("/").filter(x => !x.includes("."))
let mdMeta = fm(entry.content);
let weight = ((mdMeta.attributes.weight) ? mdMeta.attributes.weight : 0)
let title = ((mdMeta.attributes.title) ? mdMeta.attributes.title : basename)
let path = "/docs/" + entry.path
let baseurl = endpoints.url(document, context, "/services/docs/" + path);
console.log("baseurl for doc:" + baseurl);
let body = endpoints.localize(mdMeta.body, baseurl)
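// The localized body computed above is not used below; the returned topic keeps the raw mdMeta.body as its content.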
// console.log("path:" + entry.path)
return {
name,
path,
basename,
categories,
weight,
title,
content: mdMeta.body
}
})
}
)
.catch((e) => {
console.log("error parsing entries:" + e);
throw e;
})
.then(alltopics => {
// console.log("input:" + JSON.stringify(input, null, 2))
let categorySet = new Set();
alltopics.forEach(x => {
x.categories.forEach(y => {
categorySet.add(y);
})
})
return Array.from(categorySet).map(name => {
// console.log("category:" + JSON.stringify(category, null, 2))
let topics = alltopics.filter(x => x.categories.toString() === name);
// console.log("docs_in_category = " + JSON.stringify(docs_in_category, null, 2))
let summary_topic = topics.find(x => x.path.endsWith('index.md'));
let weight = summary_topic ? summary_topic.weight : 0;
let title = summary_topic ? (summary_topic.title ? summary_topic.title : name) : name;
let content = summary_topic ? (summary_topic.content ? summary_topic.content: "") : "";
topics = topics.filter(x => !x.path.endsWith('index.md'));
topics.sort((a, b) => a.weight - b.weight);
let path = "/docs/" + name;
let entry = {
path,
name,
title,
weight,
content,
topics,
summary_topic
}
// console.log("entry=> " + entry);
return entry;
}).sort((c1, c2) => c1.weight - c2.weight);
})
.catch((e) => {
console.error("error in loadCategories:" + e);
})
await dispatch("setCategories", category_data)
}
if (state.active_category===null) {
commit("setActiveCategory", state.categories[0]);
}
if (state.active_topic===null) {
commit("setActiveTopic", state.active_category.topics[0]);
}
// console.log("typeof(result):" + typeof (docinfo))
// console.log("result:" + JSON.stringify(docinfo, null, 2))
}
}

View File

@@ -1,5 +0,0 @@
export const state = () => ({
})
export const getters = {}

View File

@@ -0,0 +1,61 @@
// https://www.mikestreety.co.uk/blog/vue-js-using-localstorage-with-the-vuex-store
import {mapGetters} from "vuex";
import endpoints from "@/js/endpoints";
export const state = () => ({
scenarios: []
});
export const getters = {
getScenarios: (state, getters) => {
return state.scenarios;
}
}
export const mutations = {
setScenarios(state, scenarios) {
state.scenarios = scenarios;
}
}
export const actions = {
async loadScenarios(context, reason) {
console.log("loading scenarios because '" + reason + "'")
await this.$axios.$get(endpoints.url(document, context, "/services/executor/scenarios/"))
.then(res => {
// console.log("axios/vuex scenarios async get:" + JSON.stringify(res));
// console.log("committing setScenarios:" + JSON.stringify(res));
context.commit('setScenarios', res)
})
.catch((e) => {
console.error("axios/nuxt scenarios async error:", e);
})
},
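// Submit a scenario definition (name, workspace, CLI-style command list) to the executor service.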
async runScenario(context, scenario_config) {
await this.$axios.post(endpoints.url(document, context, "/services/executor/cli"), scenario_config)
.then(res => {
console.log("execute scenarios response:" + res)
}
)
.catch((e) => {
console.error("axios/nuxt cli error: " + e)
})
},
async stopScenario(context, scenario_name) {
await this.$axios.$post(endpoints.url(document, context, "/services/executor/stop/" + scenario_name))
.then()
.catch((e) => {
console.error("axios/nuxt scenario stop error:", e);
})
await context.dispatch("loadScenarios")
},
async deleteScenario(context, scenario_name) {
await this.$axios.$delete(endpoints.url(document, context, "/services/executor/scenario/" + scenario_name))
.then()
.catch((e) => {
console.error("axios/nuxt scenario stop error:", e);
})
await context.dispatch("loadScenarios")
}
};

View File

@@ -0,0 +1,33 @@
// https://www.mikestreety.co.uk/blog/vue-js-using-localstorage-with-the-vuex-store
import {mapGetters} from "vuex";
import endpoints from "@/js/endpoints";
export const state = () => ({
endpoints: {},
enabled: false
});
export const getters = {
getEndpoints: (state, getters) => {
return state.endpoints;
}
}
export const mutations = {
setEndpoints(state, endpoints) {
state.endpoints = endpoints;
}
}
export const actions = {
async loadEndpoints(context, reason) {
console.log("loading endpoint status because '" + reason + "'")
await this.$axios.get(endpoints.url(document, context, "/services/status"))
.then(res => {
context.commit('setEndpoints', res)
})
.catch((e) => {
console.error("axios/nuxt status async error:" + e);
})
}
};

View File

@@ -0,0 +1,88 @@
// https://www.mikestreety.co.uk/blog/vue-js-using-localstorage-with-the-vuex-store
import {mapGetters} from "vuex";
import endpoints from "@/js/endpoints";
export const state = () => ({
workloads: [],
templates: {},
searchin: ''
});
export const getters = {
getWorkloads: (state, getters) => {
return state.workloads;
},
getSearchin: (state, getters) => {
return state.searchin;
},
getTemplates: (state, getters) => {
return state.templates;
}
}
export const mutations = {
setWorkloads(state, workloads) {
state.workloads = workloads;
},
setSearchin(state, searchin) {
state.searchin = searchin;
},
setTemplates(state, templates) {
state.templates = templates;
}
};
export const actions = {
async setWorkloads(context, val) {
// console.log("committing setWorkloads:" + JSON.stringify(val));
context.commit('setWorkloads', val);
},
async setTemplates(context, val) {
// console.log("commiting setTemplates:" + JSON.stringify(val));
context.commit("setTemplates", val);
},
async setSearchin(context, val) {
// console.log("committing setsearchin:" + JSON.stringify(val));
context.commit('setSearchin', val);
},
async fetchWorkloads(context, params) {
let reason = params.reason;
let searchin = params.searchin;
if (reason === undefined || searchin === undefined) {
throw "Unable to fetch workloads without a reason or searchin: " + JSON.stringify(params);
}
// console.log("fetching workloads because '" + reason + "'")
context.commit("setTemplates", undefined);
this.$axios.$get(endpoints.url(document, context, "/services/workloads/?searchin=" + searchin))
.then(res => {
// console.log("axios/vuex workloads async get:" + JSON.stringify(res));
context.commit("setWorkloads", res);
})
.catch((e) => {
console.error("axios/nuxt workloads async error:", e);
})
},
async fetchTemplates(context, params) {
let reason = params.reason;
let workload = params.workload;
let searchin = params.searchin;
if (reason === undefined || workload === undefined || searchin === undefined) {
throw "Unable to fetch templates for workload without a {reason,workload,searchin}: " + JSON.stringify(params);
}
console.log("fetching templates for '" + workload + "' because '" + reason + "'")
this.$axios.$get(endpoints.url(document, context, "/services/workloads/parameters?workloadName=" + workload + "&" + "searchin=" + searchin))
.then(res => {
// console.log("axios/vuex templates async get:" + JSON.stringify(res));
context.dispatch("setTemplates", res)
.then(r => {
console.log("setTemplates result:" + JSON.stringify(r, null, 2))
});
})
.catch((e) => {
console.error("axios/nuxt templates async error:", e);
})
}
};

View File

@@ -0,0 +1,106 @@
// https://www.mikestreety.co.uk/blog/vue-js-using-localstorage-with-the-vuex-store
import {mapGetters} from "vuex";
import endpoints from "@/js/endpoints";
export const state = () => ({
workspace: 'default',
workspaces: [],
fileview: []
});
export const getters = {
getWorkspace: (state, getters) => {
return state.workspace;
},
getWorkspaces: (state, getters) => {
return state.workspaces;
},
getFileview: (state, getters) => {
return state.fileview;
}
// ...mapGetters(['workspace','workspaces'])
}
export const mutations = {
setWorkspace(state, workspace) {
state.workspace = workspace;
},
setWorkspaces(state, workspaces) {
state.workspaces = workspaces;
},
setFileview(state, fileview) {
state.fileview = fileview;
}
};
export const actions = {
async setWorkspace(context, val) {
// console.log("committing setWorkspace:" + JSON.stringify(val));
context.commit('setWorkspace', val);
},
async setWorkspaces(context, val) {
// console.log("committing setWorkspaces:" + JSON.stringify(val));
context.commit('setWorkspaces', val);
},
async initWorkspaces(context, reason) {
// console.log("initializing workspaces because '" + reason + "'")
this.$axios.$get(endpoints.url(document, context, "/services/workspaces/"))
.then(res => {
// console.log("axios/vuex workspaces async get:" + JSON.stringify(res));
// console.log("committing setWorkspaces:" + JSON.stringify(res));
context.commit('setWorkspaces', res)
})
.catch((e) => {
console.error("axios/nuxt workspaces async error:", e);
})
},
async putFile(context, params) {
let to_workspace = params.workspace;
let to_filename = params.filename;
let to_content = params.content;
if (!to_workspace || !to_filename || !to_content) {
throw("Unable to save file to workspace without params having workspace, filename, content");
}
const result = await this.$axios.$post(endpoints.url(document, context, "/services/workspaces/" + to_workspace + "/" + to_filename), to_content)
.then(res => {
console.log("axios/vuex workspace put:" + JSON.stringify(res));
return res;
})
.catch((e) => {
console.error("axios/vuex workspace put:", e)
});
},
async activateWorkspace(context, workspace) {
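// Fetch the named workspace from the service, refresh the workspace list, then mark it as the active one.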
const fresh_workspace = await this.$axios.$get(endpoints.url(document, context, "/services/workspaces/" + workspace))
.then(res => {
// console.log("axios/vuex workspace async get:" + JSON.stringify(res))
return res;
})
.catch((e) => {
console.error("axios/nuxt getWorkspace async error:", e)
})
await context.dispatch('initWorkspaces', "workspace '" + workspace + "' added");
// await dispatch.initWorkspaces({commit, state, dispatch}, "workspace '" + workspace + "' added")
// await this.$store.dispatch("workspaces/initWorkspaces", "workspace '" + workspace + "' added")
// await this.initWorkspaces({commit}, "workspace added");
context.commit('setWorkspace', fresh_workspace.name)
return workspace;
},
async purgeWorkspace(context, workspace) {
await this.$axios.$delete(endpoints.url(document, context, "/services/workspaces/" + workspace))
.then(res => {
console.log("purged workspace:" + res)
return res;
})
.catch((e) => {
console.error("axios/nuxt purgeWorkspace error:", e)
})
const found = this.state.workspaces.workspaces.find(w => w.name === workspace);
if (!found) {
console.log("setting active workspace to 'default' since the previous workspace '" + workspace + "' is not found")
await context.dispatch('activateWorkspace', "default");
}
await context.dispatch('initWorkspaces', "workspace '" + workspace + "' purged");
}
};

File diff suppressed because it is too large

View File

@@ -1,9 +1,135 @@
<!DOCTYPE html>
<html>
<head>
<title>NoSQLBench</title>
<meta data-n-head="1" charset="utf-8">
<meta data-n-head="1" name="viewport"
content="width=device-width,initial-scale=1">
<meta data-n-head="1" data-hid="description" name="description"
content="Docs App for NoSQLBench">
<link data-n-head="1" rel="icon" type="image/x-icon"
href="/favicon.ico">
<link data-n-head="1" rel="stylesheet" type="text/css"
href="https://fonts.googleapis.com/css?family=Roboto:100,300,400,500,700,900&amp;display=swap">
<link data-n-head="1" rel="stylesheet" type="text/css"
href="https://cdn.jsdelivr.net/npm/@mdi/font@latest/css/materialdesignicons.min.css">
<link rel="preload" href="/_nuxt/runtime.c50e81a.js" as="script">
<link rel="preload" href="/_nuxt/vendors/commons.0d00bcd.js"
as="script">
<link rel="preload" href="/_nuxt/app.4c18aa8.js" as="script">
</head>
<body>
<div id="__nuxt"><style>#nuxt-loading{visibility:hidden;opacity:0;position:absolute;left:0;right:0;top:0;bottom:0;display:flex;justify-content:center;align-items:center;flex-direction:column;animation:nuxtLoadingIn 10s ease;-webkit-animation:nuxtLoadingIn 10s ease;animation-fill-mode:forwards;overflow:hidden}@keyframes nuxtLoadingIn{0%{visibility:hidden;opacity:0}20%{visibility:visible;opacity:0}100%{visibility:visible;opacity:1}}@-webkit-keyframes nuxtLoadingIn{0%{visibility:hidden;opacity:0}20%{visibility:visible;opacity:0}100%{visibility:visible;opacity:1}}#nuxt-loading>div,#nuxt-loading>div:after{border-radius:50%;width:5rem;height:5rem}#nuxt-loading>div{font-size:10px;position:relative;text-indent:-9999em;border:.5rem solid #f5f5f5;border-left:.5rem solid #fff;-webkit-transform:translateZ(0);-ms-transform:translateZ(0);transform:translateZ(0);-webkit-animation:nuxtLoading 1.1s infinite linear;animation:nuxtLoading 1.1s infinite linear}#nuxt-loading.error>div{border-left:.5rem solid #ff4500;animation-duration:5s}@-webkit-keyframes nuxtLoading{0%{-webkit-transform:rotate(0);transform:rotate(0)}100%{-webkit-transform:rotate(360deg);transform:rotate(360deg)}}@keyframes nuxtLoading{0%{-webkit-transform:rotate(0);transform:rotate(0)}100%{-webkit-transform:rotate(360deg);transform:rotate(360deg)}}</style><script>window.addEventListener("error",function(){var e=document.getElementById("nuxt-loading");e&&(e.className+=" error")})</script><div id="nuxt-loading" aria-live="polite" role="status"><div>Loading...</div></div></div>
<script type="text/javascript" src="/_nuxt/7c52b47a9d09e9501071.js"></script><script type="text/javascript" src="/_nuxt/2db30ff69d2689005afd.js"></script><script type="text/javascript" src="/_nuxt/c01ee420d65f0f86e261.js"></script><script type="text/javascript" src="/_nuxt/fa799dd3c7503934188d.js"></script></body>
<div id="__nuxt">
<style>#nuxt-loading {
background: white;
visibility: hidden;
opacity: 0;
position: absolute;
left: 0;
right: 0;
top: 0;
bottom: 0;
display: flex;
justify-content: center;
align-items: center;
flex-direction: column;
animation: nuxtLoadingIn 10s ease;
-webkit-animation: nuxtLoadingIn 10s ease;
animation-fill-mode: forwards;
overflow: hidden;
}
@keyframes nuxtLoadingIn {
0% {
visibility: hidden;
opacity: 0;
}
20% {
visibility: visible;
opacity: 0;
}
100% {
visibility: visible;
opacity: 1;
}
}
@-webkit-keyframes nuxtLoadingIn {
0% {
visibility: hidden;
opacity: 0;
}
20% {
visibility: visible;
opacity: 0;
}
100% {
visibility: visible;
opacity: 1;
}
}
#nuxt-loading > div, #nuxt-loading > div:after {
border-radius: 50%;
width: 5rem;
height: 5rem;
}
#nuxt-loading > div {
font-size: 10px;
position: relative;
text-indent: -9999em;
border: .5rem solid #F5F5F5;
border-left: .5rem solid #fff;
-webkit-transform: translateZ(0);
-ms-transform: translateZ(0);
transform: translateZ(0);
-webkit-animation: nuxtLoading 1.1s infinite linear;
animation: nuxtLoading 1.1s infinite linear;
}
#nuxt-loading.error > div {
border-left: .5rem solid #ff4500;
animation-duration: 5s;
}
@-webkit-keyframes nuxtLoading {
0% {
-webkit-transform: rotate(0deg);
transform: rotate(0deg);
}
100% {
-webkit-transform: rotate(360deg);
transform: rotate(360deg);
}
}
@keyframes nuxtLoading {
0% {
-webkit-transform: rotate(0deg);
transform: rotate(0deg);
}
100% {
-webkit-transform: rotate(360deg);
transform: rotate(360deg);
}
}</style>
<script>window.addEventListener('error', function () {
var e = document.getElementById('nuxt-loading');
if (e) {
e.className += ' error';
}
});</script>
<div id="nuxt-loading" aria-live="polite" role="status">
<div>Loading...</div>
</div>
</div>
<script>window.__NUXT__ = {
config: {},
staticAssetsBase: void 0
}</script>
<script src="/_nuxt/runtime.c50e81a.js"></script>
<script src="/_nuxt/vendors/commons.0d00bcd.js"></script>
<script src="/_nuxt/app.4c18aa8.js"></script>
</body>
</html>

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -1 +0,0 @@
!function(e){function r(data){for(var r,n,c=data[0],l=data[1],d=data[2],i=0,h=[];i<c.length;i++)n=c[i],Object.prototype.hasOwnProperty.call(o,n)&&o[n]&&h.push(o[n][0]),o[n]=0;for(r in l)Object.prototype.hasOwnProperty.call(l,r)&&(e[r]=l[r]);for(v&&v(data);h.length;)h.shift()();return f.push.apply(f,d||[]),t()}function t(){for(var e,i=0;i<f.length;i++){for(var r=f[i],t=!0,n=1;n<r.length;n++){var l=r[n];0!==o[l]&&(t=!1)}t&&(f.splice(i--,1),e=c(c.s=r[0]))}return e}var n={},o={11:0},f=[];function c(r){if(n[r])return n[r].exports;var t=n[r]={i:r,l:!1,exports:{}};return e[r].call(t.exports,t,t.exports,c),t.l=!0,t.exports}c.e=function(e){var r=[],t=o[e];if(0!==t)if(t)r.push(t[2]);else{var n=new Promise((function(r,n){t=o[e]=[r,n]}));r.push(t[2]=n);var f,script=document.createElement("script");script.charset="utf-8",script.timeout=120,c.nc&&script.setAttribute("nonce",c.nc),script.src=function(e){return c.p+""+{0:"52a7cce22511daf04466",1:"407f8405125899b013f8",2:"74955e3a4ded093cd4a2",3:"a243bd228956b1ed8d5f",6:"8702cf295bbfbfa997dc",7:"c8bbbc322f8f1fe65a27",8:"5dbfbf0f0448fbb70c73",9:"3484794277c7edf13895",10:"51b41e8ba4499fa7c407",13:"e7c9fbb1e91c35f241b3"}[e]+".js"}(e);var l=new Error;f=function(r){script.onerror=script.onload=null,clearTimeout(d);var t=o[e];if(0!==t){if(t){var n=r&&("load"===r.type?"missing":r.type),f=r&&r.target&&r.target.src;l.message="Loading chunk "+e+" failed.\n("+n+": "+f+")",l.name="ChunkLoadError",l.type=n,l.request=f,t[1](l)}o[e]=void 0}};var d=setTimeout((function(){f({type:"timeout",target:script})}),12e4);script.onerror=script.onload=f,document.head.appendChild(script)}return Promise.all(r)},c.m=e,c.c=n,c.d=function(e,r,t){c.o(e,r)||Object.defineProperty(e,r,{enumerable:!0,get:t})},c.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},c.t=function(e,r){if(1&r&&(e=c(e)),8&r)return e;if(4&r&&"object"==typeof e&&e&&e.__esModule)return e;var t=Object.create(null);if(c.r(t),Object.defineProperty(t,"default",{enumerable:!0,value:e}),2&r&&"string"!=typeof e)for(var n in e)c.d(t,n,function(r){return e[r]}.bind(null,n));return t},c.n=function(e){var r=e&&e.__esModule?function(){return e.default}:function(){return e};return c.d(r,"a",r),r},c.o=function(object,e){return Object.prototype.hasOwnProperty.call(object,e)},c.p="/_nuxt/",c.oe=function(e){throw console.error(e),e};var l=window.webpackJsonp=window.webpackJsonp||[],d=l.push.bind(l);l.push=r,l=l.slice();for(var i=0;i<l.length;i++)r(l[i]);var v=d;t()}([]);

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

Some files were not shown because too many files have changed in this diff