// grafana/public/app/plugins/datasource/testdata/runStreams.ts

import { defaults } from 'lodash';
import { Observable } from 'rxjs';
import {
DataQueryRequest,
DataQueryResponse,
FieldType,
CircularDataFrame,
CSVReader,
Field,
LoadingState,
} from '@grafana/data';
import { TestDataQuery, StreamingQuery } from './types';
import { getRandomLine } from './LogIpsum';
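
// Defaults applied to streaming queries:
// - speed: delay between emitted rows, in milliseconds
// - spread: step size of the random-walk signal (each step is within ±spread/2)
// - noise: how much each Min/Max band can widen per step
// - bands: number of Min/Max field pairs added around the signal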
export const defaultQuery: StreamingQuery = {
type: 'signal',
speed: 250, // ms
spread: 3.5,
noise: 2.2,
bands: 1,
};
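
/**
 * Entry point for the streaming test-data scenarios. Merges the query with the
 * defaults above and dispatches to the matching stream implementation.
 *
 * Illustrative usage (req is a hypothetical DataQueryRequest built by the caller):
 *
 *   runStream(req.targets[0], req).subscribe(rsp => {
 *     console.log(rsp.key, rsp.data);
 *   });
 */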
export function runStream(target: TestDataQuery, req: DataQueryRequest<TestDataQuery>): Observable<DataQueryResponse> {
const query = defaults(target.stream, defaultQuery);
if ('signal' === query.type) {
return runSignalStream(target, query, req);
}
if ('logs' === query.type) {
return runLogsStream(target, query, req);
}
if ('fetch' === query.type) {
return runFetchStream(target, query, req);
}
throw new Error(`Unknown Stream Type: ${query.type}`);
}
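
/**
 * Streams a random-walk "signal" series. The circular buffer is pre-filled with
 * maxDataPoints rows, then a new row (plus optional Min/Max bands) is appended
 * and emitted every `speed` milliseconds until the subscriber unsubscribes.
 */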
export function runSignalStream(
target: TestDataQuery,
query: StreamingQuery,
req: DataQueryRequest<TestDataQuery>
): Observable<DataQueryResponse> {
return new Observable<DataQueryResponse>(subscriber => {
const streamId = `signal-${req.panelId}-${target.refId}`;
const maxDataPoints = req.maxDataPoints || 1000;
const data = new CircularDataFrame({
append: 'tail',
capacity: maxDataPoints,
});
data.refId = target.refId;
data.name = target.alias || 'Signal ' + target.refId;
data.addField({ name: 'time', type: FieldType.time });
data.addField({ name: 'value', type: FieldType.number });
const { spread, speed, bands, noise } = query;
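// Add one Min/Max field pair per band; names get a numeric suffix when there is more than one band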
for (let i = 0; i < bands; i++) {
const suffix = bands > 1 ? ` ${i + 1}` : '';
data.addField({ name: 'Min' + suffix, type: FieldType.number });
data.addField({ name: 'Max' + suffix, type: FieldType.number });
}
let value = Math.random() * 100;
let timeoutId: any = null;
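// Random walk: each row drifts from the previous value by at most ±spread/2,
// and each band widens the Min/Max envelope by up to `noise` on each side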
const addNextRow = (time: number) => {
value += (Math.random() - 0.5) * spread;
let idx = 0;
data.fields[idx++].values.add(time);
data.fields[idx++].values.add(value);
let min = value;
let max = value;
for (let i = 0; i < bands; i++) {
min = min - Math.random() * noise;
max = max + Math.random() * noise;
data.fields[idx++].values.add(min);
data.fields[idx++].values.add(max);
}
};
// Fill the buffer on init
let time = Date.now() - maxDataPoints * speed;
for (let i = 0; i < maxDataPoints; i++) {
addNextRow(time);
time += speed;
}
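// Append a row stamped with the current time, emit the whole (circular) frame,
// then schedule the next tick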
const pushNextEvent = () => {
addNextRow(Date.now());
subscriber.next({
data: [data],
key: streamId,
});
timeoutId = setTimeout(pushNextEvent, speed);
};
// Send first event in 5ms
setTimeout(pushNextEvent, 5);
return () => {
console.log('unsubscribing from stream ' + streamId);
clearTimeout(timeoutId);
};
});
}
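
/**
 * Streams randomly generated log lines, one new line every `speed`
 * milliseconds, into a circular buffer of maxDataPoints rows.
 */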
export function runLogsStream(
target: TestDataQuery,
query: StreamingQuery,
req: DataQueryRequest<TestDataQuery>
): Observable<DataQueryResponse> {
return new Observable<DataQueryResponse>(subscriber => {
const streamId = `logs-${req.panelId}-${target.refId}`;
const maxDataPoints = req.maxDataPoints || 1000;
const data = new CircularDataFrame({
append: 'tail',
capacity: maxDataPoints,
});
data.refId = target.refId;
data.name = target.alias || 'Logs ' + target.refId;
data.addField({ name: 'time', type: FieldType.time });
data.addField({ name: 'line', type: FieldType.string });
const { speed } = query;
let timeoutId: any = null;
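// Append a timestamp and a random log line by field name, emit the frame, then schedule the next tick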
const pushNextEvent = () => {
data.values.time.add(Date.now());
data.values.line.add(getRandomLine());
subscriber.next({
data: [data],
key: streamId,
});
timeoutId = setTimeout(pushNextEvent, speed);
};
// Send first event in 5ms
setTimeout(pushNextEvent, 5);
return () => {
console.log('unsubscribing from stream ' + streamId);
clearTimeout(timeoutId);
};
});
}
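
/**
 * Fetches CSV from `query.url` and streams the parsed rows into a circular
 * buffer, emitting an update for every chunk read from the response body.
 */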
export function runFetchStream(
target: TestDataQuery,
query: StreamingQuery,
req: DataQueryRequest<TestDataQuery>
): Observable<DataQueryResponse> {
return new Observable<DataQueryResponse>(subscriber => {
const streamId = `fetch-${req.panelId}-${target.refId}`;
const maxDataPoints = req.maxDataPoints || 1000;
let data = new CircularDataFrame({
append: 'tail',
capacity: maxDataPoints,
});
data.refId = target.refId;
data.name = target.alias || 'Fetch ' + target.refId;
let reader: ReadableStreamReader<Uint8Array>;
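// The CSVReader parses incoming text chunks: onHeader (re)builds the frame's fields
// and onRow appends the parsed values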
const csv = new CSVReader({
callback: {
onHeader: (fields: Field[]) => {
// Clear any existing fields
if (data.fields.length) {
data = new CircularDataFrame({
append: 'tail',
capacity: maxDataPoints,
});
data.refId = target.refId;
data.name = 'Fetch ' + target.refId;
}
for (const field of fields) {
data.addField(field);
}
},
onRow: (row: any[]) => {
data.add(row);
},
},
});
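// Decode each chunk, feed it to the CSV reader (which fills `data` via the
// callbacks above), emit the current frame, and keep reading until done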
const processChunk = (value: ReadableStreamReadResult<Uint8Array>): any => {
if (value.value) {
const text = new TextDecoder().decode(value.value);
csv.readCSV(text);
}
subscriber.next({
data: [data],
key: streamId,
state: value.done ? LoadingState.Done : LoadingState.Streaming,
});
if (value.done) {
console.log('Finished stream');
subscriber.complete(); // necessary?
return;
}
return reader.read().then(processChunk);
};
fetch(new Request(query.url)).then(response => {
if (!response.body) {
subscriber.error(new Error('Response has no body to stream: ' + query.url));
return;
}
reader = response.body.getReader();
reader.read().then(processChunk);
});
return () => {
// Cancel fetch?
console.log('unsubscribing from stream ' + streamId);
};
});
}